quote:
Originally posted by CookieRevised
quote:
Originally posted by qune
it's also the cheapest and fool-proof way in most cases...
Cheapest? Nope... it also isn't the cheapest to manufacture. Fool-proof? How so? The only 'fool-proof' thing about it is that you can only plug in a certain type of monitor, which isn't really what I'd call fool-proof (rather a "damn, now I also need a more expensive monitor which supports DVI")...
You can't do anything wrong by plugging monitor A into port B or whatever if both ports are of the same type. It's not like a monitor can only be used as a primary or secondary monitor; it simply displays whatever is being output on the port it is plugged into... So this has nothing to do with being fool-proof (as in preventing you from doing something wrong).
The reason such cards have 1 VGA and 1 DVI port is compatibility, not fool-proofing. New, modern (and slightly more expensive) monitors have a DVI port instead of a VGA port. To serve both types of customers (the ones with "older" VGA monitors and the ones with modern DVI monitors), manufacturers include both types of ports on their cards. The ability to use them for dual monitors is only an added bonus.
That's also why there are graphics cards with both types of plugs but without dual-monitor support; it's either DVI or VGA, and not always both...
If I post my motherboard info and everything tomorrow, could you explain to me what I need to do? [I will also post my BIOS information]