mariush, 12th October 2017, 19:36
The software most likely uses only the processor to encode the video; the video card would not be involved. Even when software does use the video card, it doesn't use it the way a game would: it accesses a dedicated part of the GPU designed from the start only for encoding and decoding video (Intel calls theirs Quick Sync, NVIDIA's is NVENC). Basically, you could even play games while some software uses that separate, isolated part of the GPU to encode video, with minimal slowdown.
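
If you want to check whether that dedicated block works on your machine, ffmpeg can target it directly. A minimal sketch, assuming an ffmpeg build compiled with these encoders and placeholder file names:

Code:
ffmpeg -i input.mp4 -c:v h264_nvenc out_nvenc.mp4   (NVIDIA NVENC)
ffmpeg -i input.mp4 -c:v h264_qsv out_qsv.mp4       (Intel Quick Sync)

If a command fails with "Unknown encoder", that build or hardware doesn't support it; if it works, CPU usage stays low because the fixed-function block does the heavy lifting.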

Also, any damage to the integrated graphics in the CPU would not show up as waves on the screen. Wavy interference like that is typical of an analog (VGA) signal picking up noise, which is why it matters whether your connection is VGA or digital.

As for how you can tell whether it's VGA or HDMI: just look at the cable. (A software way to check is sketched after the list below.)

HDMI cables look like this: https://www.amazon.com/AmazonBasics-...rds=hdmi+cable (flat, wide shape, small connectors)
VGA cables look like this: https://www.amazon.com/Cable-Matters...ords=vga+cable (the connectors are often blue, but the shape is what matters: a D shape with round pins)
DVI cables look like this: https://www.amazon.com/StarTech-com-...ords=dvi+cable (some have more pins inside, some may be gold plated, but again the shape is what matters: rectangular, bulky connectors, usually larger than VGA connectors)

DisplayPort looks like this: https://www.amazon.com/Cable-Rankie-...playport+cable (sort of like HDMI, but more blocky and thicker)
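
If you can't easily get at the back of the PC, the operating system can also tell you which connector is in use. A minimal sketch for Linux, assuming the xrandr utility is available (output names such as HDMI-1 or VGA-1 vary by driver):

Code:
xrandr --query | grep " connected"

This prints one line per active output, e.g. "HDMI-1 connected 1920x1080+0+0 ...", so the prefix tells you the connector type.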