You see color on things around you because light shines into your eye, is received by the cones and rods of your retina, and is converted into electrochemical signals that your brain then processes. A digital camera detects color because light shines on sensors that are sensitive to red, green, and blue; the number of sensors determines the highest resolution the camera can capture. A traditional film-based camera records an image onto chemically treated plastic. A digital camera instead records the red, green, and blue intensities of each pixel in a numerical file – both the color and the position of every pixel are defined by numbers. A number of different file types are used to compress the data so the file takes up a minimal amount of computer memory. To display the image, the computer reads the red intensity value for a particular pixel from the file and lights the red component of that pixel on the screen at that intensity. It does the same thing for the other two primary colors (green and blue) for every pixel in the image.
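The idea that an image is just numbers for color and position can be sketched in a few lines of code. This is a minimal illustration, not how any particular camera or file format actually lays out its data: it assumes a tiny made-up image stored as one (red, green, blue) triple per pixel, with each channel an intensity from 0 to 255.

```python
# A minimal sketch of digital image storage: each pixel is a trio of
# red, green, and blue intensities (0-255), kept in a flat list in
# row-major order (left to right, top to bottom).

WIDTH, HEIGHT = 2, 2  # a tiny hypothetical 2x2 image for illustration

pixels = [
    (255, 0, 0), (0, 255, 0),    # top row: pure red, pure green
    (0, 0, 255), (255, 255, 0),  # bottom row: pure blue, yellow
]

def pixel_at(x, y):
    """Look up the stored color for the pixel at column x, row y."""
    return pixels[y * WIDTH + x]

# "Displaying" the image means reading back each channel value and
# driving the screen's red, green, and blue elements at that intensity.
for y in range(HEIGHT):
    for x in range(WIDTH):
        r, g, b = pixel_at(x, y)
        print(f"pixel ({x},{y}): red={r} green={g} blue={b}")
```

Real image files add compression and metadata on top of this, but underneath they all come down to numeric color values tied to pixel positions, just as described above.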
When we talk about seeing only one color, we are not referring to the condition known as color blindness. People who are colorblind have difficulty distinguishing between certain colors, for example red and green, but they are still capable of perceiving light of both colors. What we are talking about is what it would be like if you could only see the small range of light wavelengths called “red”. You would be unable to perceive the rest of the spectrum, including orange, yellow, green, blue, and violet.