Thursday, April 30, 2015

How to Set Up Dual Monitors


A dual-monitor setup uses two physical display devices to increase the viewing space of a single computer. Microsoft Windows and Mac OS X both support dual- and multiple-monitor configurations.

Setting up dual monitors is easy. However, it requires either adding a second video card or installing a dual-head video card with two separate physical outputs. The following instructions are for computers with only one video card installed:
1) Make sure that the computer is working fine and can support more than one video card. Boot into Safe Mode to confirm that only one video adapter and one monitor appear in Device Manager.
2) Turn off the system and install the second video card. Once it is installed, connect the second monitor.
3) If the installation was done correctly, the computer should boot the same way as always and the second monitor should remain dark at first. Unless the operating system already bundles the correct drivers, drivers for the second video card may have to be installed.
Configuring Dual Monitors under Microsoft Windows
Check Device Manager. To do this, right-click My Computer and choose Properties > Hardware > Device Manager. There should be two entries under Display Adapters, one for each video card.

To configure the second monitor, right-click the desktop and choose Properties. Go to the Settings tab, where two boxes appear, one bigger than the other. These represent the two monitors plugged into the computer. Click the box marked “2” and set the second monitor’s resolution and color depth. Make sure that the display parameters chosen are within the monitor’s limits.

Do not forget to check the “Extend my Windows desktop onto this monitor” option. If this is done properly, each monitor will show its own portion of the desktop. Click OK, and the second monitor should work properly.
Troubleshooting Dual Monitor Setups
If the above instructions were followed but the second monitor is still not working, check if the operating system supports both monitors.

Also, check what kind of video card was installed. If the computer has only one AGP slot and a PCI or ISA slot must be used for the second video card, change the BIOS setting so that the PCI adapter initializes before the AGP display adapter.
Set up Dual Monitors Using a Splitter
In order to connect two monitors to the same computer, a VGA or DVI splitter can also be used, depending on the computer hardware’s specifications. A VGA splitter connects to the computer via a male VGA cable end, splits the analog signal into two copies without compromising the quality of either, and directs each copy to its own VGA output, allowing a single VGA port to drive two separate VGA-based monitors. Note that a simple passive splitter clones the same image to both screens; showing different content on each monitor requires two independent video outputs. To connect more than two monitors, use a splitter with more cable ends.
How to Display Separate Applications on Each Monitor
Even with a second monitor connected, it may remain completely blank until the user makes the necessary changes in Windows. (A passive VGA splitter can only mirror the same image to both screens; to display separate applications, each monitor must be connected to its own video output.) In order to display separate applications on each monitor:
1) Click the Start Menu and open the Control Panel.
2) Open the “Appearance and Personalization” category and select “Adjust screen resolution.”
3) Select the “Multiple Displays” menu and choose “Extend these displays.” This will activate the secondary monitor.

4) Launch the programs to be accessed and drag them to the secondary monitor. This is done by simply dragging the program window to the side of the primary monitor, causing the program to appear on the secondary monitor.
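Once the desktop is extended, Windows treats the two screens as one large working area. As a rough sketch (a hypothetical helper written for illustration, not a Windows API), the combined geometry of a horizontally extended desktop works like this:

```python
# Illustrative sketch only: how an "extended" desktop combines two
# monitors placed side by side into one working area.

def extended_desktop(primary, secondary):
    """Return (width, height) of a desktop extended horizontally.

    primary, secondary: (width, height) tuples for each monitor.
    Widths add together; the height is that of the taller screen.
    """
    pw, ph = primary
    sw, sh = secondary
    return (pw + sw, max(ph, sh))

# Two 1920x1080 monitors give a 3840x1080 working area.
print(extended_desktop((1920, 1080), (1920, 1080)))  # (3840, 1080)
```

Dragging a window past the shared edge simply moves it across this one large coordinate space, which is why step 4 above works.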

Friday, April 10, 2015

Convert VGA to TV


Converting VGA Video Signals
A VGA signal can be converted to a usable TV format such as composite video with special cables. These cables usually have a VGA connector as well as a stereo plug that fits into a normal headphone or sound-output jack, and they carry the video and sound to the other end, which terminates in standard RCA connectors: a yellow video cable, a red right-channel audio cable and a white left-channel audio cable. These cables come in various lengths, and some have “female” ends for devices that connect into the converter cable.
The other option for converting a VGA signal for a TV is special video converter hardware. These are input boxes that convert the video data directly into the needed TV signal, and they may offer several different options for connecting to a television, such as RCA output, HDMI output or others, depending on the hardware. Several manufacturers make these converter boxes, which can connect a PC or other VGA-equipped device to a TV screen.
Advantages of VGA Conversion
There are many reasons why converting a VGA signal directly to a TV signal can be an advantage, since the video can then be put to a variety of uses. The following are benefits of converting VGA signals to TV signals:
Easy Presentations – Some companies may not want to spend money on a projector and additional hardware for presentations. With a VGA to TV converter, a notebook computer with a VGA port can show a Microsoft PowerPoint presentation or presentation videos without a projector or other device. This can be very effective and much cheaper, since it uses materials already on hand such as a television, and it can also replace obsolete hardware such as videocassette players.

Watching Downloaded Video Media – Some people enjoy using popular services such as iTunes or their equivalents to purchase and play videos, movies and other media. Watching these videos on a home entertainment system is much more enjoyable than being confined to a computer monitor; some laptop/notebook PCs have tiny screens that make watching this type of media difficult. A television is also more convenient, since you can relax in the comfort of your living room or bedroom.

Having a Larger Display Option – Some people simply want a larger screen without paying a large amount for a new monitor. Using a television as a monitor is a smart option, especially if the television is fairly new but does not have a VGA input. Gaming on a larger screen is also more enjoyable for people whose video card can render high-end graphics that deserve a bigger display.

Temporary Backup Monitor – Disaster can strike at any moment, and a monitor can break and become useless. If there is no money to purchase a new monitor right away, a VGA to TV converter can provide a temporary monitor. The TV can also serve as a larger secondary display: many people want a dual-monitor setup without buying a second monitor, and if the video card supports a second output alongside the original connection, this is possible through a VGA to TV converter.
Disadvantages to VGA Conversion
Small Resolution Size – Since the converted output has to fit what the television can display, it is often restricted to smaller screen resolutions. The common output resolution is 640 x 480 pixels, even when the video card itself can display larger sizes; older televisions will only be able to display these smaller sizes. Modern televisions may already have a VGA input, so checking whether one is included is a must.
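To put that resolution limit in perspective, a quick back-of-the-envelope calculation (plain arithmetic, no real API) shows how few pixels a 640 x 480 output carries compared with a modern 1,920 x 1,080 frame:

```python
# Simple arithmetic sketch: fraction of source pixels that survive
# when output is limited to a smaller resolution.

def pixel_fraction(out_res, src_res):
    """Ratio of output pixel count to source pixel count."""
    ow, oh = out_res
    sw, sh = src_res
    return (ow * oh) / (sw * sh)

# 640x480 carries under 15% of the pixels of a 1920x1080 frame.
print(round(pixel_fraction((640, 480), (1920, 1080)), 3))  # 0.148
```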
Reduction in Color Output – The VGA to TV converters will most likely reduce the amount of colors which can be displayed on the television, especially on older television sets. This is simply because the conversion of the signals is not perfect and results in some loss in quality.

Slow Video Response – Some VGA to TV converters have low-bandwidth output and can only pass a limited number of frames per second. The fewer the frames, the choppier the video feed will look. Screen flicker and, in some cases, artifacts may appear in the feed. This is not unusual in VGA to TV converters designed as a simple bridge between video formats; better converters take extra steps to preserve video quality, reducing this unsightly side effect.

Video Deformity – The converted output may not match the television’s geometry, leaving the picture stretched or squashed in height or width. This may or may not be adjustable on the television and can look awkward for some uses. The output may not work with every television; depending on the brand, some TVs may have issues while others do not. Higher-quality televisions can be expected to give better results with VGA to TV converters.

Manual Sound Connection – Some VGA to TV converters are video only. This means you must find a way to connect the sound on your own, purchase an additional audio cable to feed into the television, or simply make do with the speakers on your PC or notebook. This can be inconvenient and may leave the setup feeling incomplete.

Additional Power Consumption – Some VGA to TV devices will require additional power to run and will take up additional power sockets or consume batteries to operate. This may not be the best option for some people who already have several devices consuming power. This may also not be as portable as needed for some applications. Either way, consuming more power will always equal spending more money over time. Rechargeable batteries also consume electrical energy when they are recharged.

Saturday, April 4, 2015

1080i vs. 1080p: The Difference


Progressive (1080p) video is considered better than interlaced (1080i), but it's not always clear why; here's what's actually happening on your TV screen.
Today's HDTVs can display beautiful 1,920-by-1,080-pixel video, but the actual quality of what you're viewing depends on the source material. A lot of the time, you're not seeing exactly 1080p. In fact, most TVs today have two modes with similar names: 1080i and 1080p. Both have the same screen resolution, so what's the difference between the two? Here are five things you need to know:

1080i video is "interlaced." 1080i video appears to play back at 60 frames per second, but that's a bit deceptive: it's actually broadcast at 30 full frames per second, delivered as 60 fields. The TV draws each frame in two passes: the first pass is a 1,920-by-540 field containing the even scan lines, and the second is a 1,920-by-540 field containing the odd scan lines. The process by which this occurs is called interlacing. It contributes to a sense of motion and reduces perceived flicker.
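Interlacing is easy to see in code. This illustrative sketch splits a 1,080-line frame into its two 540-line fields, just as the two passes described above do:

```python
# Sketch of interlacing: a 1080-line frame is split into two 540-line
# fields, one holding the even scan lines and one holding the odd ones.

def split_into_fields(frame_lines):
    """Split a list of scan lines into (even_field, odd_field)."""
    even = frame_lines[0::2]   # lines 0, 2, 4, ...
    odd = frame_lines[1::2]    # lines 1, 3, 5, ...
    return even, odd

frame = list(range(1080))                # stand-in for 1,080 scan lines
even_field, odd_field = split_into_fields(frame)
print(len(even_field), len(odd_field))   # 540 540
```

The TV displays the even field, then the odd field, and your eye fuses the two half-resolution passes into one full frame.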

1080p video is called "progressive scan." In this format, 1,920-by-1,080-pixel high-definition movies are progressively drawn line after line, so they're not interlaced. On paper, that may not seem like a huge deal. But in the real world, what you end up seeing looks sharper and more defined than 1080i, particularly during scenes with a lot of fast motion.
Sometimes 1080p is termed "Full HD" or "True HD," to distinguish it from 1080i or 720p video. Blu-ray discs contain 1080p video at 24 frames per second, which players convert, using a method known as 3:2 pulldown, to 30 frames (60 fields) per second on screen.
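The 3:2 pulldown cadence itself is simple: film frames are shown alternately for three fields and for two, so 24 film frames fill exactly 60 video fields each second. A minimal sketch:

```python
# Sketch of 3:2 pulldown: 24 film frames per second become 60 video
# fields per second by repeating frames alternately 3 times and 2 times.

def three_two_pulldown(frames):
    """Expand a list of film frames into the 3:2 field sequence."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2   # 3 fields, then 2, then 3, ...
        fields.extend([frame] * repeats)
    return fields

one_second_of_film = list(range(24))       # 24 film frames
fields = three_two_pulldown(one_second_of_film)
print(len(fields))  # 60 fields -> 30 interlaced frames per second
```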
Data compression can confuse the issue. Sometimes cable companies will deliver a 1080i picture, but then compress the data significantly in order to use up less bandwidth. The result can mean smeared details or pixelated color gradations in certain scenes. It's still technically HD, and still looks better than standard-definition cable, but it's not as good as it could be.
This also happens with 1080p streaming Internet video, but in that case, it's usually dependent on the speed of your data connection. In fact, Blu-ray is currently the only practical format for watching lots of pure 1080p content. Even the latest Apple TV, which supports 1080p streaming, does so in a compressed format that loses a bit of quality (although it still looks quite good).

Both formats look similar on smaller TVs. As a general rule, you need a larger TV to notice the difference between 1080i and 1080p. Depending on your eyesight, you may be able to pick up the difference on a 32-inch LCD if you look closely. But most consumers don't see a marked difference until at least a 42-inch screen, if not larger. In fact, many people are perfectly happy with 720p HDTV sets even at larger sizes; we recently named one, the 51-inch Samsung PN51E490B4F, our Best Choice for budget large-screen HDTVs.

1080p isn't even the best anymore. Technology never stands still, of course. Five years from now, you'll probably just want Ultra High Definition (aka 4K) video instead. (For a closer look at 4K video, check out What is Ultra HD?) But for now, if you're a videophile who appreciates a sharper picture, 1080p is definitely the way to go—both in HDTV capability and in the source material you're viewing.