Showing posts with label 32" LCD. Show all posts

Thursday, April 30, 2015

How to Set Up Dual CCTV Monitors


A dual-monitor setup uses two physical display devices to increase the viewing space of a single computer. Both Microsoft Windows and Mac OS X now support dual- and multiple-monitor configurations.

Setting up dual monitors is easy. However, it requires the user to add a second video card, or to install a single video card that supports dual heads (two separate physical outputs). The following instructions are for those with only one video card installed in their computers:
Make sure that the computer is working fine and that it can support more than one video card. Boot in safe mode to confirm that only one video adapter and one monitor are shown in Device Manager.
Next, turn off the system and install the second video card. Once installed, connect the second monitor.
If the installation is done correctly, the computer should boot the same way as always, though the second monitor will remain dark until it is configured. Unless the OS already bundles the correct drivers, drivers for the second video card may have to be installed.
Configuring Dual Monitors under Microsoft Windows
Check Device Manager. To do this, right-click My Computer, then choose Properties > Hardware > Device Manager. There should now be two entries under Display adapters.

To configure the second monitor, right-click the desktop and choose Properties. Go to the Settings tab where there are two boxes, one bigger than the other. These represent the two monitors plugged into the computer. Click the second window marked “2” and change the second monitor’s resolution and color depth. Make sure that the display parameters chosen are within the monitors’ limits.

Do not forget to check the “Extend my Windows desktop onto this monitor” option. If this is done properly, the two monitor icons in the dialog should appear at comparable sizes. Click OK and the second monitor should start working.
Troubleshooting Dual Monitor Setups
If the above instructions were followed but the second monitor is still not working, check if the operating system supports both monitors.

Also, check the kind of video card that was installed. If the computer has only one AGP slot and a PCI or ISA slot has to be used for the second video card, change the BIOS setting so that the PCI display adapter initializes before the AGP adapter.
Set up Dual Monitors Using a Splitter
In order to connect two monitors to the same computer, a VGA or DVI-D splitter can also be used, depending on the computer hardware’s specifications. A VGA splitter connects to a computer via a male-to-female VGA cable end. It then splits the analog signal into two identical copies, without compromising the quality of either, and directs each copy to its own VGA output, allowing a single VGA port to feed two separate VGA-based monitors. Note that a passive splitter duplicates the same picture on both monitors; to run an independent desktop on each screen, the computer itself must provide two separate video outputs. A user who wishes to connect more than two monitors should use a splitter with more cable ends.
How to Display Separate Applications on Each Monitor
Once a second monitor is connected, it may simply mirror the primary display (or remain blank) until the user makes the necessary changes in Windows. In order to display separate applications on each monitor:
1) Click the Start Menu and open the Control Panel.
2) Open the “Appearance and Personalization” category and select “Adjust screen resolution.”
3) Select the “Multiple Displays” menu and choose “Extend these displays.” This will activate the secondary monitor.

4) Launch the programs to be accessed and drag them to the secondary monitor. This is done by simply dragging the program window to the side of the primary monitor, causing the program to appear on the secondary monitor.
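The “Extend these displays” option stitches both screens into one large virtual desktop. A minimal Python sketch of the arithmetic behind a side-by-side extended layout (the monitor sizes here are illustrative assumptions, not values read from any system):

```python
def extended_desktop(primary, secondary):
    """Each monitor is a (width, height) tuple in pixels.

    In a side-by-side extended layout, the second monitor's workspace is
    appended to the right of the primary, so the virtual desktop grows
    horizontally while the tallest monitor sets the overall height.
    """
    width = primary[0] + secondary[0]       # widths add up
    height = max(primary[1], secondary[1])  # tallest monitor wins
    return (width, height)

# Two 1,920 x 1,080 monitors side by side form a 3,840 x 1,080 desktop:
print(extended_desktop((1920, 1080), (1920, 1080)))  # (3840, 1080)
```

Dragging a window past the right edge of the primary screen simply moves it into the right-hand portion of this single virtual desktop, which is why it appears on the secondary monitor.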

Saturday, April 4, 2015

1080i vs. 1080p: The Difference

1080i vs. 1080p: What's the Difference?

Progressive (1080p) video is considered better than interlaced (1080i), but it's not always clear why; here's what's actually happening on your TV screen.
Today's HDTVs can display beautiful, 1,920-by-1,080-pixel video, but the actual quality of what you're viewing depends on the source material. A lot of the time, you're not seeing exactly 1080p. In fact, most TVs today have two modes with similar names: 1080i and 1080p. Both have the same screen resolution, so what's the difference between the two? Here are five things you need to know:

1080i video is "interlaced." 1080i video appears to refresh 60 times per second, but that's a bit deceptive, because it's actually broadcast at 30 full frames per second. Each frame is split into two fields: one pass draws the even-numbered scan lines (1,920 by 540 pixels) and the next draws the odd-numbered lines (also 1,920 by 540), for 60 fields per second in total. The process by which this occurs is called interlacing. It contributes to a sense of motion and reduces perceived flicker.
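The numbers above can be checked directly: splitting each 1,080-line frame into two 540-line fields turns 30 frames per second into 60 fields per second. A small illustrative sketch:

```python
WIDTH, HEIGHT = 1920, 1080
FRAMES_PER_SECOND = 30          # 1080i broadcast frame rate

# Each frame is split into an even field and an odd field.
field_lines = HEIGHT // 2                  # 540 scan lines per field
fields_per_second = FRAMES_PER_SECOND * 2  # 60 field "passes" per second
pixels_per_field = WIDTH * field_lines     # 1,920 x 540 pixels per pass

print(field_lines, fields_per_second, pixels_per_field)
# 540 540-line fields, drawn 60 times per second
```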

1080p video is called "progressive scan." In this format, 1,920-by-1,080-pixel high-definition movies are progressively drawn line after line, so they're not interlaced. On paper, that may not seem like a huge deal. But in the real world, what you end up seeing looks sharper and more defined than 1080i, particularly during scenes with a lot of fast motion.
Sometimes 1080p is termed "Full HD" or "True HD," to distinguish it from 1080i or 720p video. Blu-ray discs contain 1080p video at 24 frames per second, and then, using a method known as 3:2 pulldown, display it at 30 frames per second on screen.
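The 3:2 pulldown mentioned above maps film's 24 frames per second onto video's 30 by holding alternate film frames for three fields, then two. A sketch of that bookkeeping (a simplified model of the pattern, not a video-processing implementation):

```python
def pulldown_fields(film_frames):
    """3:2 pulldown: alternate film frames are held for 3 fields, then 2."""
    fields = 0
    for i in range(film_frames):
        fields += 3 if i % 2 == 0 else 2
    return fields

# One second of film: 24 frames -> 60 fields -> 30 interlaced video frames.
fields = pulldown_fields(24)
print(fields, fields // 2)  # 60 fields, 30 frames
```

Because the pattern repeats a field from every other film frame, motion is not perfectly even, which is one reason videophiles prefer displays that can show 24 fps material natively.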
Data compression can confuse the issue. Sometimes cable companies will deliver a 1080i picture, but then compress the data significantly in order to use less bandwidth. The result can mean smeared details or pixelated color gradations in certain scenes. It's still technically HD, and still looks better than standard-definition cable, but it's not as good as it could be.
This also happens with 1080p streaming Internet video, but in that case, it's usually dependent on the speed of your data connection. In fact, Blu-ray is currently the only practical format for watching lots of pure 1080p content. Even the latest Apple TV, which supports 1080p streaming, does so in a compressed format that loses a bit of quality (although it still looks quite good).
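Compression matters because raw HD video is enormous. A rough back-of-the-envelope comparison of uncompressed 1080i and 1080p data rates (24 bits per pixel and 60 Hz output are assumptions for illustration, not figures from any broadcast standard):

```python
def raw_bitrate_mbps(width, height, rate, bits_per_pixel=24):
    """Uncompressed data rate in megabits per second."""
    return width * height * rate * bits_per_pixel / 1e6

# 1080i delivers 60 half-height fields per second;
# 1080p at 60 Hz delivers 60 full frames per second.
rate_1080i = raw_bitrate_mbps(1920, 540, 60)   # ~1,493 Mbps
rate_1080p = raw_bitrate_mbps(1920, 1080, 60)  # ~2,986 Mbps, exactly double
print(round(rate_1080i), round(rate_1080p))
```

Either figure dwarfs a typical broadband connection, which is why every delivery channel, including cable, Blu-ray, and streaming, compresses the signal; the formats differ only in how aggressively.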

Both formats look similar on smaller TVs. As a general rule, you need a larger TV to notice the difference between 1080i and 1080p. Depending on your eyesight, you can probably pick up the difference on a 32-inch LCD if you're particular about it. But most consumers don't really see a marked difference until at least a 42-inch screen, if not larger. In fact, many people are perfectly happy with 720p HDTV sets even at larger sizes; we recently named one, the 51" Samsung PN51E490B4F, a Best Choice for budget large-screen HDTVs.

1080p isn't even the best anymore. Technology never stands still, of course. Five years from now, you'll probably just want Ultra High Definition (aka 4K) video instead. (For a closer look at 4K video, check out What is Ultra HD?) But for now, if you're a videophile who appreciates a sharper picture, 1080p is definitely the way to go—both in HDTV capability and in the source material you're viewing.