display - How are colors filled in a bit, i.e. either 1 or 0?

2014-07
  • RaHuL

    This question is just out of curiosity, and I tried searching for it, but most sites speak about binary representation and the 256 combinations in 8 bits, which I already know. I do know that RGB has (255, 255, 255) combinations. But just consider R with its 255 values from black to white: how does the computer generate a red color just out of electricity? Or is it the color filled in the tubes of the screen or monitor used for displaying? Also, if RGB takes (255, 255, 255) combinations, does that mean RGB takes 3 bytes per color combination? Thanks in advance for solving the query.

  • Answers
  • Majenko

    Internally, the computer stores each pixel as a triplet of RGB values, each typically in the range 0 to 255. These are 8-bit values, which means that each is represented by a combination of eight 1s and 0s (as you already know). Because there are 3 colours, and each is represented by 8 bits, that results in a total of 24 bits of colour (hence 24-bit images), or 3 bytes per pixel.
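
    As a minimal illustration (hypothetical Python, not part of the original answer), those 3 bytes could be packed into and unpacked from a single 24-bit value like this:

        # Hypothetical sketch: pack an 8-bit-per-channel RGB triplet into a
        # single 24-bit value (3 bytes), and unpack it again.
        def pack_rgb(r, g, b):
            return (r << 16) | (g << 8) | b      # layout: 0xRRGGBB

        def unpack_rgb(pixel):
            return (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF

        red = pack_rgb(255, 0, 0)
        print(hex(red), unpack_rgb(red))         # 0xff0000 (255, 0, 0)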

    As the computer displays the image to the screen it scans through the display area one pixel at a time, and, depending on the display technology used, sends that information to the screen to be displayed.

    For digital display systems (DVI, HDMI, etc.) that binary data is sent to the screen for interpretation. For analogue systems (VGA) the interpretation is done by the computer itself and the results are sent to the display.

    The interpretation is basically converting each of the 3 bytes into a voltage. This is done by a device called a DAC, or Digital to Analogue Convertor. It takes the binary value which represents 0 to 255 and outputs a discrete voltage for each different value. So, for instance, a binary value of 0 could output a voltage of 0V. A binary value of 255 could output a voltage of 5V. A binary value of 93 would therefore output a voltage of 1.823529412V.
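
    A minimal sketch of that conversion (hypothetical Python; the 0-5 V range is just the example range used above):

        # Hypothetical sketch of the DAC step: map an 8-bit value (0-255)
        # linearly onto a 0-5 V output range.
        def dac_voltage(value, v_max=5.0):
            return value / 255 * v_max

        print(dac_voltage(0))     # 0.0 V
        print(dac_voltage(255))   # 5.0 V
        print(dac_voltage(93))    # ~1.8235 V, matching the figure above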

    There are usually three DAC modules, one for each colour, and the resultant voltages are used by the display to control the brightness of that specific pixel. In an old CRT it is used to set the intensity of the electron beam. In a TFT it is used to set the opacity of a subpixel (of which there are usually 3 - one red, one green and one blue).

    It is common for TFT screens not to be able to display the full 0-255 range of each colour in 8-bit detail, so you often get fewer actual colours displayed. 256K colours, or 18-bit, is common on TFTs, whereas CRTs are capable of displaying the full 24-bit range. It is also possible (if you pay tens of thousands of dollars) to get "HDRI" monitors (High Dynamic Range Imagery) which are capable of displaying more than the normal 18- or 24-bit colour detail, but these are very rare and only used for specific high-end jobs.
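
    To make the 18-bit figure concrete, here is a small sketch (hypothetical Python, assuming simple truncation to 6 bits per channel):

        # Hypothetical sketch: an 18-bit panel keeps roughly 6 bits per
        # channel, so each 8-bit value is reduced to one of 64 levels.
        def to_6_bit(value):
            return value >> 2                # 0-255 -> 0-63

        print(256 ** 3)                      # 16777216 colours at 24-bit
        print(64 ** 3)                       # 262144 colours ("256K") at 18-bit
        print(to_6_bit(200))                 # 50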


  • Related Question

    How do I configure my monitor/operating system for optimal color display?
  • Blixt

    I'm a web developer, and for a hobby I take photos and edit them on my computer. I've found that sometimes I miss stuff on my own monitor that is apparent on other monitors (aliased edges that didn't appear on my screen, for example).

    What tools/methods can I use to get the optimal settings for my computer?

    Also, is it common for VGA (DE-15) cables to show less color detail than a DVI cable? I recently got a new monitor at work and it turned out that I couldn't see the difference between white and any color with RGB values above 230 or so until I switched to a DVI cable.


  • Related Answers
  • Jeromy Irvine

    I find the Lagom LCD monitor test pages to be very useful for this. It's the first site I visit after hooking up a new monitor.

    I've also noticed the same difference in VGA vs. DVI, but I only have personal observation to back that up, no actual data. Perhaps somebody more knowledgeable about the tech could give us a definitive answer.

  • Manuel Ferreria

    I got myself a Spyder2 Express for $120 (shipped to another country, so it must be much less in the States), and it has been a life-changing experience.

    Before calibrating my monitor, I was uncertain whether my pictures (I am also a photographer) would be printed or shared just as I see them. After sharing some and being told that they were too dark and had a yellow color cast, I got fed up and bought the Spyder2 Express.

    The process is really simple: just stick the device to the screen, run the calibration software, and it automagically creates a color profile according to the colors measured.

    After that, I loaded up some pictures on a white background, and boy it was white. No cast whatsoever.

    After having suffered from not having the correct color calibration on my monitor, I will never edit again without making sure I have the best possible settings on the monitor. NVIDIA tweaking or Adobe Gamma can only go so far; it is not an exact science and depends very much on the eye of the calibrator. Using a device takes your eye out of the equation and is much more accurate.

  • pgs

    If you can get your hands on some monitor calibration hardware then that is the way to go.

    Otherwise get an ICC profile from your monitor's manufacturer (if they provide one) and load it into Windows. See the Color Management app in Vista's Control Panel. You can download an XP color management tool from Microsoft.

  • spoulson

    You need to calibrate your display. Tekzilla had a great episode on how to calibrate an HDTV and the same principles apply. You'll use the calibration tools available for your video card.

    This basically consists of using test patterns that help you decide where to set the color levels, brightness/contrast, sharpness, etc.

  • Stefan Thyberg

    The people at Digital Photography Review, a VERY serious photo site, used to use products from Datacolor to calibrate their monitors, but there are probably simpler ways.