Your images may be better than you see!

Pete Nolan Aug 23, 2008

  1. Pete Nolan

    Pete Nolan TrainBoard Supporter

    Until October 2007, I usually worked at home on a Mac G5 with an expensive Apple Cinema Display, which until recently was the "gold standard" for correct color and brightness rendition. I worked on images in blissful ignorance of what might happen to them on different computers. I did a ton of image work, and it always came back from a printer or video producer spot-on.

    Since October I've had to work on site, on a "standard" PC package, which is not shabby but is oriented toward program management rather than images. For the first time today--it was very slow--I pulled up some of my TrainBoard images.

    All I can say is--------YUCK! They looked horrible: wrong colors, washed out, muddy in spots, too contrasty in others.

    It blew me away. I raise the issue not to start a Mac vs. PC war (the latest PC monitors are just as good as the Cinema Displays), but to warn folks that, without a good display, it's really hard to see how GOOD your images may be!
     
  2. SteamDonkey74

    SteamDonkey74 TrainBoard Supporter

    It's the same with web pages. Sometimes people spend all sorts of time coding a web page on one system and never check it against other ones. Contrast between similar colors may look great on your monitor or on my admittedly old-school but very sharp Trinitron, but those colors will mush together on some cheapie.

    Your photos look good here, Pete, on my nearly five-year-old Mac mini running OS X 10.3.9 with a roughly 12-year-old Sony Trinitron Multiscan 17sfII monitor.
     
  3. Hytec

    Hytec TrainBoard Member

    Pete, your observation is correct, but directed at the wrong culprit. The culprit usually is the firmware and/or software controlling the display device. The phosphors of CRTs and the crystals of LCDs are pretty consistent in their color fidelity. Unfortunately, there is minimal or no consistency within the firmware and software development fraternity, and there are so many developers with "better ideas".

    Firmware developers are usually engineers whose goal is efficiency and the best response time for their drivers and devices, with little or no concern for precise color reproduction. Software developers, on the other hand, are more concerned with giving their customers the most flexibility to create graphic "works of art", e.g. Photoshop, where the user is able to (must?) define the best monitor color balance for their application and/or system. Or they aim to develop the best "middle of the road" application for the least cost, e.g. Paint Shop, where, in the effort to satisfy the largest population, they satisfy the least.

    As you said, your home system has been set up over time to satisfy your critical artistic tastes, whereas the "on-site" system is oriented toward management applications, where I assume the displays are set up to present information at the least cost rather than to render works of art.
     
    Last edited by a moderator: Oct 21, 2008
  4. Pete Nolan

    Pete Nolan TrainBoard Supporter

    Hytec,

    Very astute observations! I've spent many hours trying to calibrate my Samsung SyncMaster to my Apple Cinema display with little success.

    I think the basic components fall into expensive and less expensive. On the expensive side, I can usually calibrate exactly. But when I have an expensive vs. inexpensive situation between displays (i.e., mine for now), it gets really dicey. I've worked with a digital printer for 10 years now, and know what I see on my calibrated Apple Cinema display is what will be printed. And my published articles have been spot on. I have no such confidence about my secondary screen at home, or my screens at work.
     
  5. Powersteamguy1790

    Powersteamguy1790 Permanently dispatched

    Pete:

    Posted images will always be different when viewed on different monitors. Every monitor has to be finely calibrated to give an ideal reproduction of a photo.

    Most standard monitors aren't calibrated that finely, so these variations occur.
     
  6. Pete Nolan

    Pete Nolan TrainBoard Supporter

    So I've learned! I've always paid a premium for a good main monitor. I've found that a premium monitor doesn't drift over time, while a standard monitor drifts even during a single day and drifts even more over the long run, usually toward the muddy side.
     
  7. SteamDonkey74

    SteamDonkey74 TrainBoard Supporter

    I have subscribed to roughly the same philosophy. The last time I purchased a monitor was 1996. I am still using it - a 17" Trinitron. I have had other monitors (cheap-os that my wife had already when we met) run all over the place with color and then die ignominiously in the middle of a project, and I keep coming back to this one.

    I have a 19" Trinitron I got in exchange for a book that I need to pull up and hook up and try. It was a surplus monitor my cousin had.

    Some day I will make the leap and get a good flat screen, but I have been less than impressed with the color reproduction on the lower priced ones. I am waiting for today's high end to come down a bit in price, or for someone to toss a used Apple Cinema Display or similar my way in exchange, perhaps, for some books out of my store. (This sort of thing happens now and then.)
     
  8. BarstowRick

    BarstowRick TrainBoard Supporter

    Pete, your images come across my screen perfecto-ah-mundo. Looking good here.

    Regarding my images: I know something is wrong with them. I thought it was the confounded, cheap, almost box-like digital camera I shoot with. Most likely it's a combination of an older PC, monitor, and camera.

    I have no idea how my images are coming across on your monitor/screen. They look good here, at least in my limited photo-editing program--as good as they can, considering what I have to work with. In my opinion, the pictures I take with my digital camera are substandard when compared to the quality of my 35mm Minolta. Film and processing costs prohibit use of that fine piece of equipment, which takes all the fun out of a railfan shoot.

    I've spotted a digital camera I would like to purchase. It has the same features as my Minolta. I just have to save up those quarters, dimes and nickels. By the time I save up, I won't need it anymore. How about that.
     
    Last edited by a moderator: Oct 27, 2008
  9. Leo Bicknell

    Leo Bicknell TrainBoard Member

    There actually is a war here, but it's not the one you're thinking about. :)

    Your monitor has a number of color calibration settings, which should be no surprise. What should also be no surprise is that there is no one standard; rather, like everything else in life, there are many standards. Lastly, the thing that should surprise you the least is that Apple and Microsoft have chosen two different calibrations.

    The magical difference in this case is "Gamma". Here's a decent summary, although there are dozens across the web: Proper Gamma (Brightness) for Cross Platform WWW Images

    Basically, PC monitors tend to be shipped at a Gamma of around 2.2, and many PC video card drivers have no facility to change it! Macs ship with a Gamma of around 1.8, and System Preferences->Displays->Color->Calibrate will let you calibrate to any Gamma you want and save and manage multiple profiles. Some high-end workstations use even lower Gammas; SGI was famous for a 1.4 Gamma setting.
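
    (To make those numbers concrete: here is a minimal sketch, not from Leo's post--Python is just assumed for illustration--of how the same pixel value reads on displays with different Gammas. Display luminance is roughly (value/255) raised to the Gamma.)

        # Approximate relative luminance (0.0-1.0) a display produces for an 8-bit value
        def displayed_luminance(pixel_value, gamma):
            return (pixel_value / 255.0) ** gamma

        for gamma in (1.4, 1.8, 2.2):
            print(f"gamma {gamma}: mid-gray 128 displays at {displayed_luminance(128, gamma):.2f}")

        # Roughly 0.38 at 1.4, 0.29 at 1.8, and 0.22 at 2.2 -- the same file looks
        # noticeably darker as display Gamma rises, which is why an image tuned on a
        # 1.8 Mac can look dark and overly contrasty on a 2.2 PC.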

    Generally speaking, most programs use the OS settings and are ignorant of Gamma. There are some exceptions: Photoshop, for example, knows the native OS Gamma and can compensate for it. Here's a page with the info: Photoshop Tip - Windows vs Mac Monitor Gamma | PhotoshopSupport.com

    Of course the really frustrating thing is you can't control how the end user will see your work. Their monitor might have a Gamma of 1.4, 1.8, 2.2, or even other values. Your only defense is to keep your entire image in a fairly "safe" range, but of course that eliminates the most dramatic images.

    Last but not least, before you get too upset with any computer, check the monitor settings. Monitors are shipped to look good under the poor lighting at Best Buy, and those aren't the right settings for home at all. If you have a Mac, perform the calibration steps on the control panel above. I recommend doing it twice: once at a Gamma of 1.8 for normal work, then create a second profile at Gamma 2.2 for "PC compatibility". If you have a PC, well, I don't know how to tell you to calibrate it other than to buy expensive stuff.

    If you're really serious about pictures and/or video you need a calibrator. It's a little mouse-like USB dongle you hang right on the front of the monitor. The software then cycles through test colors and checks them against the sensor's readings, calibrating your monitor precisely. This software usually offers additional target profiles like sRGB, Adobe RGB, Pantone, and others as well.

    Bottom line, if you're trying to be professional you need to always check your work on both Gamma 1.8 and Gamma 2.2, or really on a Mac and on a PC. This is doubly true for images with extreme contrast.

    Don't computers make life fun! :p
     
  10. Leo Bicknell

    Leo Bicknell TrainBoard Member

  11. BarstowRick

    BarstowRick TrainBoard Supporter

    Leo,

    I discovered the "Gamma" feature early on, as I attempted to improve my images. The nice thing about it is that I can lighten up the background in a picture. The downside is that if the flash hits and bounces back from the very front of the subject I'm focused on, it bleaches that spot out. Not to mention that the picture can turn grainy.
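
    (Rick doesn't say which program he uses; purely as a generic illustration, a gamma lift like he describes can be done with the free Python Pillow library. The file names here are made up, and the same trade-off applies: push it too far and highlights clip while shadow noise turns into visible grain.)

        from PIL import Image

        def apply_gamma(img, gamma):
            # gamma > 1.0 lightens midtones and shadows; < 1.0 darkens them
            return img.point(lambda v: int(255 * (v / 255.0) ** (1.0 / gamma) + 0.5))

        img = Image.open("dark_background.jpg").convert("RGB")
        apply_gamma(img, 1.4).save("lightened.jpg")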

    Aw shucks, I just need to go out and get a better camera, computer, monitor and the real Photoshop. Everything in time.
     
  12. bkloss

    bkloss TrainBoard Supporter

    Being a "PC Guy" I can truly attest to the broad range of colors, shades, and tones... not only as displayed on one's monitor but also when the product is printed. I use a product called "Spyder2Pro" (there's a newer version now) to calibrate my monitors, and I use "Qimage" to match what is displayed on the screen with what is then printed on a high-quality Epson inkjet.
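
    (Neither Spyder2Pro nor Qimage is scripted like this; just to make the underlying step concrete, here is a rough Python/Pillow sketch of what such tools automate -- converting an image from one ICC profile to another. The profile and image file names are hypothetical.)

        from PIL import Image, ImageCms

        img = Image.open("photo.jpg").convert("RGB")

        # Hypothetical profiles: one from a monitor calibrator, one for a printer/paper combination.
        monitor_profile = ImageCms.getOpenProfile("my_monitor.icc")
        printer_profile = ImageCms.getOpenProfile("epson_premium_glossy.icc")

        # Convert the pixel data from the monitor's color space to the printer's,
        # so what was seen on the calibrated screen is what comes off the inkjet.
        transform = ImageCms.buildTransformFromOpenProfiles(
            monitor_profile, printer_profile, "RGB", "RGB")
        converted = ImageCms.applyTransform(img, transform)
        converted.save("for_printing.jpg")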

    The difference is utterly amazing! The problem is always going to be differences between monitors--calibrated versus uncalibrated--and also the size of the monitors used in the application.

    Brian
     
  13. Pete Nolan

    Pete Nolan TrainBoard Supporter

    Leo,

    I'm thoroughly conversant in gamma, but didn't want to get into the technical stuff. Both my Apple and PC monitors are at 1.8, but the results are very different, even using the Apple utilities. Actually I used my main printer's fancy colorimeter to calibrate the main display. I was extremely close using Apple's utility, but now I'm spot-on. I've had trouble with only one publisher, who professed ignorance of the whole process.

    I think you'll agree that this is a fairly arcane and even byzantine process of no importance to most computer users. Having worked in the printing and computer industries for more than 40 years, it's second nature to me. I send my color profiles to publications/printers along with my images, and have had only one bad result in thousands of images over the years. We've never figured out that one! But it wasn't an important image, so we just let it go with a shrug. It probably had a peculiar color distribution that Xerox's printer misread for some reason. It was just a little muddy, so we didn't investigate it much.
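
    (Pete presumably hands his .icc files over separately; for what it's worth, a profile can also travel embedded in the image file itself. A minimal Pillow sketch, with made-up file names, of checking for and embedding a profile so the recipient's software knows how the colors were meant to look:)

        from PIL import Image

        img = Image.open("article_photo.jpg")

        # Check whether a profile is already embedded in the file.
        print("has embedded profile:", img.info.get("icc_profile") is not None)

        # Embed the working-space profile (hypothetical .icc file) when saving.
        with open("my_working_space.icc", "rb") as f:
            profile_bytes = f.read()
        img.save("article_photo_tagged.jpg", icc_profile=profile_bytes)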

    I think my whole point in this thread is that, if you want your pictures to look their best, you should spend a little time learning about calibrating your monitor(s).

    BTW, I can calibrate my old monster 23" Viewsonic CRT to my LCD Cinema Display. I just don't want this 95-lb monitor on my desktop!
     
  14. Leo Bicknell

    Leo Bicknell TrainBoard Member

    The monitor situation is a decided mess. Beyond gamma, fluorescent- and LED-backlit LCD panels have fairly different tones to them, and then there's the whole glossy-versus-matte discussion as well. Plus of course most home users seem to think 99% brightness and 99% saturation make the best images. *sigh*

    Don't even get me started if you want your monitor to match your printer.

    The industry can do better, that much I know. I think all monitors should come with a USB color calibrator...
     
