Question:
Camcorders: What is the difference between "resolution" and "quality"?
Sean M
2010-02-24 18:15:58 UTC
I know that resolution means "the number of pixels per square inch on a computer-generated display; the greater the resolution, the better the picture"...
But then what is quality? (I know SF = superfine, F = fine, and N = normal), but what does that setting actually do?
HERE IS A PICTURE TO SHOW WHERE THIS QUESTION CAME FROM... http://i937.photobucket.com/albums/ad213/xmfredrickson/untitled.jpg
Four answers:
Glenn
2010-02-24 19:41:52 UTC
Resolution refers more to size differences while maintaining the same quality. A screen resolution would be 1024 x 768; that number alone does not tell you the quality. Quality is, as it sounds, how good the image looks. Higher quality increases file size because more pixel information is kept for a given space, and the denser that information, the better the image.
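To put rough numbers on the file-size point, here is a small Python sketch. The compression ratios used for the N/F/SF settings are purely illustrative assumptions, not actual camera figures:

```python
# Rough illustration of how resolution (pixel dimensions) and quality
# (compression level) each affect file size. The compression ratios
# below are illustrative assumptions, not camera specifications.

def compressed_size_mb(width, height, compression_ratio):
    """Approximate compressed size of one frame in megabytes."""
    raw_bytes = width * height * 3          # 24-bit color: 3 bytes per pixel
    return raw_bytes / compression_ratio / 1e6

for label, (w, h) in {"1024 x 768": (1024, 768), "1920 x 1080": (1920, 1080)}.items():
    for quality, ratio in {"N (normal)": 20, "F (fine)": 10, "SF (superfine)": 5}.items():
        print(f"{label} at {quality}: ~{compressed_size_mb(w, h, ratio):.2f} MB per frame")
```

Bigger resolution and higher quality both push the file size up, but for different reasons: one adds pixels, the other keeps more of the detail in each pixel.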
lare
2010-02-25 09:25:23 UTC
Since you ask this in the camcorder section, you need to understand that video is not static. The purpose of video is to portray constantly changing imagery.

In photographs, there are only two real considerations for quality: (1) the pixel count, and (2) the bit depth or "gray scale." To be photo-realistic, you need a gray scale of no less than 8 bits (256 levels of brightness), or 24-bit color depth (8 bits each for R, G and B), plus a density of around 64 pixels per inch of display (not to be confused with printer dots per inch). A printer requires roughly an 8x8 array of dots (not pixels) to portray just one pixel with continuous tone, so a print resolution of 300 dpi is usually specified as the minimum for photo work.
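The arithmetic behind those numbers, as a quick Python check (the 6x4 inch print size is just an example):

```python
# Bit depth -> brightness levels per channel, and pixel count needed
# for a print at 300 dpi. The print size is an example value.

bit_depth = 8
levels = 2 ** bit_depth                     # 256 brightness levels per channel
print(f"{bit_depth}-bit channel -> {levels} levels, "
      f"{levels ** 3:,} possible 24-bit colors")

print_dpi = 300
print_w_in, print_h_in = 6, 4               # a 6x4 inch print (example)
px_w, px_h = print_w_in * print_dpi, print_h_in * print_dpi
print(f'{print_w_in}x{print_h_in}" print at {print_dpi} dpi needs '
      f"{px_w} x {px_h} pixels ({px_w * px_h / 1e6:.1f} MP)")
```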

In video, both of these aspects apply, plus the quality of motion. A video at 1 frame per second would not convince you of smooth motion. Cartoons use 12 fps to portray motion, and the sometimes comic result is a bonus. NTSC analog television displays 60 fields per second, which gives a very lifelike portrayal of even the most active sports. The data rate of NTSC television, expressed in digital terms, is about 270 Mbps. To standardize digital television sets, the displayed picture is usually 1920x1080 or 1280x720 pixels.
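Here is a rough Python calculation of uncompressed data rates for those picture sizes, assuming 8-bit 4:2:2 sampling (16 bits per pixel on average) and ignoring blanking intervals; the 270 Mbps studio rate quoted above also carries blanking and 10-bit samples, which is why these figures come out lower:

```python
# Back-of-the-envelope uncompressed video data rates.
# Assumes 8-bit 4:2:2 sampling (16 bits/pixel) and counts only the
# active picture, so these are rough figures, not broadcast specs.

def raw_rate_mbps(width, height, fps, bits_per_pixel=16):
    return width * height * fps * bits_per_pixel / 1e6

formats = {
    "NTSC SD  (720 x 480, ~30 fps)":   (720, 480, 30),
    "720p HD  (1280 x 720, 60 fps)":   (1280, 720, 60),
    "1080i HD (1920 x 1080, ~30 fps)": (1920, 1080, 30),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: ~{raw_rate_mbps(w, h, fps):,.0f} Mbps uncompressed")
```

Even standard definition works out to well over 100 Mbps uncompressed, which is why every practical camcorder compresses.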

A photo camera can record an image in uncompressed RAW or BMP format, but the storage requirements for even a modest pixel count are enormous, so most cameras compress the data using JPEG. JPEG has quality settings, but with the advances in high-density memory cards it is practical to stay with the higher settings only, and the resulting compression is nearly lossless.
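A quick Python illustration of that storage argument, assuming a 12-megapixel camera, a 2 GB card and a typical ~10:1 JPEG ratio (all example values, not specs):

```python
# How many shots fit on a card, uncompressed vs. JPEG.
# The 12 MP sensor, 2 GB card and 10:1 JPEG ratio are assumed examples.

card_gb = 2
megapixels = 12
raw_mb = megapixels * 3             # 3 bytes/pixel, ~1e6 pixels per MP -> MB
jpeg_mb = raw_mb / 10               # assumed ~10:1 high-quality JPEG

print(f"Uncompressed: ~{raw_mb:.0f} MB/shot -> "
      f"about {card_gb * 1024 / raw_mb:.0f} shots on a {card_gb} GB card")
print(f"JPEG ~10:1:   ~{jpeg_mb:.1f} MB/shot -> "
      f"about {card_gb * 1024 / jpeg_mb:.0f} shots on a {card_gb} GB card")
```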

With video, the need for compression is even greater. A 2 GB memory card would hold only about a minute of uncompressed standard-definition video, or 15 seconds of HD, assuming it could even accept data that fast (it can't). A camcorder is designed around how fast its storage medium can take data; for miniDV tape that is 25 Mbps. Through a combination of intraframe compression (similar to JPEG) and color-resolution reduction, DV fits within that rate without reducing the NTSC rate of 60 fields per second.

Internet video, DVD and flash memory cards are not so blessed: their data rates top out at around 6 Mbps, so they also need temporal (interframe) compression. Codecs like the H.264 used for HD store a complete, independently coded frame only a few times per second and rebuild the frames in between from motion prediction, so playback still shows a normal frame rate. So for file-based digital video, the quality settings refer to the data rate, and more compression means more compromises. Some cameras can record above the roughly 6 Mbps "standard" quality, so F and SF typically correspond to something like 12 and 18 Mbps. The higher the setting, the less macroblocking and the fewer visible motion artifacts.
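To see that the quality setting is really a data-rate setting, here is a quick Python calculation of recording time per card; the 6/12/18 Mbps figures are the ones above, and the 2 GB card is just an example:

```python
# Recording time at each quality setting (data rate) on a memory card.
# The 6/12/18 Mbps figures follow the answer above; the card size is an example.

card_gb = 2
card_megabits = card_gb * 8 * 1024      # capacity in megabits (1 GB = 1024 MB here)

for setting, mbps in {"N  (~6 Mbps)": 6, "F  (~12 Mbps)": 12, "SF (~18 Mbps)": 18}.items():
    minutes = card_megabits / mbps / 60
    print(f"{setting}: about {minutes:.0f} minutes on a {card_gb} GB card")
```

Doubling or tripling the data rate cuts the recording time by the same factor, which is exactly the trade-off the N/F/SF menu is asking you to make.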
?
2016-08-05 03:54:10 UTC
I don't believe that is correct.
?
2016-09-13 03:53:48 UTC
Thanks everyone for answering.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.