I’ve been into photography for years, and this is an issue that keeps coming up and discouraging me. If someone could help me resolve it, I’d be eternally grateful.
Basically, I understand the concept of calibrating monitors, but every time I actually calibrate mine it makes the screen look unusably awful, and it throws off images that already printed well and looked good posted online.
This all started ten years ago, and the pattern has repeated every 1 to 2 years since…
Ten years ago, I would take a RAW photo on my camera and transfer it to my MacBook Pro (yes, I know you shouldn’t edit and print from a laptop, but it’s all I had at the time). The RAW, unedited image looked identical going from the camera into Lightroom. I’d edit the photo and post it online, and it looked good on my iPhone, on Facebook, and on other people’s phones and computers. I even printed a couple of photos and they looked pretty good. I’m now looking at a photo I edited at that time on my uncalibrated MBP, and it looks very close to how it looks on my iPhone - same Lightroom as 10 years ago.
At the time, I figured it was important to calibrate my monitor, but when I did, it just destroyed the screen on the MacBook. It didn’t even look close to natural and turned everything muddy brown. Now, I understand maybe I was just used to seeing the incorrect, uncalibrated version, but I have an image that proves the uncalibrated screen printed just fine and looked great on a screen. The calibrated screen, however, looked too awful to continue using, so I deleted the profile and kept editing the way I had been.
Again, over the next ten years I’ve repeated this process over and over. The calibrated screen just looks too bad to deal with, and it makes the images I worked so hard on - which look good on other screens - look terrible.
So tonight, now on a PC with a BenQ gaming monitor that claims 100% sRGB accuracy, I decided to calibrate again because I really, really want to get into printing my images - but the same thing happened. All my images, which look great on my iPhone and match my uncalibrated screen about 90%, now look awful.
What am I doing wrong? I do like to game on this same screen, but I’ve always just decreased the screen’s default color saturation and contrast to match how the images look on my iPhone, which matches Lightroom pretty closely.
Also, the uncalibrated screen I’m currently using looks identical to how the raw images look in camera, but the calibrated screen looks nowhere near close.
I’m once again discouraged and giving up on trying to print but I’d love to figure out what I’m doing wrong.
It seems I have to choose: either edit and view my images on an uncalibrated screen, where they’ll look better on other screens, or calibrate my screen so they maybe print more accurately but won’t look the same when posted online.
If there is someone out there who wants to make some money, PM me and I will pay you $50 for your time if you can help me figure out this problem.
What tools do you use for calibration?
Thank you so much!!
I am using a Spyder pro c
Gamma: 2.2
White point: 6500K
Brightness: 120 cd/m²
I’ve been calibrating monitors for a similar period, probably, and I also started with a Spyder. These display calibrators age quite fast in theory - you should replace them every 2-3 years. I’ve just started using spectrophotometers instead, but I’m lucky to have them at work; they are much more expensive to buy and have their own shortcomings. Since you’ve had the same experience from the beginning, this probably isn’t the issue. Also, older Spyders don’t work with wide-gamut monitors - not sure if your monitors are wide gamut. And monitors with TN panels don’t calibrate too well, although I’ve tried some and there was an improvement.
Overall, after long use of an uncalibrated monitor, the calibrated result always looks dull and sometimes has a tint - our brains just need to adjust. But the result speaks for itself. I work in print and have access to colour proofing equipment - a calibrated screen looks really close to a calibrated print. But! It mostly differs from phones and other screens. Also, not all printers have colour-accurate equipment. By calibrating you are reducing deviations on your side, but there are a lot of weak links left in the chain. So depending on your needs, visually matching other screens or your print (if you can’t access a colour-accurate printing service) might work better than calibration. Although I prefer to have my screen as the starting point, which I can more or less rely on to be accurate.
Not sure if all this novel is useful at all :) Just wanted to share my experience.
First of all, your BenQ gaming monitor is going to have shit colors - it doesn’t matter that it says 100% sRGB. The panel is made for quick response time, not color accuracy.
Second of all, calibrating a monitor needs to be done with an external device. Something that can read monitor values from the outside. It doesn’t matter how accurate the software side is if the hardware side is lacking.
Now, as for your sentiment: 10 years ago, more or less every consumer device had a shit display. Today there’s much more variation among high-end consumer devices - you have phones and TVs with OLED screens, iPads with mini-LED, phones with normal LED screens, etc. Some of these displays have HDR capabilities, some don’t.
The monitor you’re using is probably way worse than the display you had in your macbook pro.
Color accurate monitors for PCs are expensive and they still need a calibration with an external device since their out of factory performance isn’t the best.
On the other hand, you have Apple products - love them or hate them, there’s a reason they’re an industry standard. A $1.5k Studio Display will give you much better real-world performance than any monitor in that price range, straight out of the box.
YouTubers like to compare stats on paper, but they never mention the insane panel-to-panel variation from one unit to another.
Two years ago at work we ordered ten 32-inch screens (from a well-known brand), and no two of them were calibrated the same out of the factory.
But, from my perception, when I calibrate the monitor (again, using a Spyder Pro) the colors look “worse” to me. Now, I’m guessing this is because I’ve grown accustomed to viewing shit colors, whereas the monitor is actually showing me the true calibrated colors that are actually in my photo files? Am I understanding this correctly?
Can’t thank you enough for your advice
You can use the best calibration tool on the market - if the panel cannot physically produce the result, it won’t matter.
After you’ve calibrated the monitor, you need to check the delta E; that will show you how accurate the colors the monitor is displaying are vs. what they’re supposed to be.
Check some Linus Tech Tips videos on monitors and you’ll see that even after calibration, some monitors just don’t have a good delta E.
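To make the delta E idea concrete, here’s a rough sketch of the simplest formula (CIE76) - calibration tools usually report the more sophisticated dE2000, and the sample Lab values below are made up for illustration:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta E: Euclidean distance in Lab space.
    Roughly, dE below ~2 is barely perceptible."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target = (50.0, 20.0, -10.0)    # what the patch should be
measured = (51.0, 22.5, -11.0)  # what the probe read off the screen
print(round(delta_e_76(target, measured), 2))  # → 2.87
```

A report with many patches above dE 3 after calibration means the panel itself can’t hit the target, which is the point being made here.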
I’ll give you an example: I have an Alienware 240Hz gaming monitor - it’s a 600-700 euro monitor. I would never try to get accurate colors out of it.
It’s not made for that. It’s made to refresh as fast as possible so I get a competitive edge in games.
You can’t ask a Ferrari to go off-roading.
As you have pointed out, every device has a different screen, and images will look different from one to another. Mobile devices and PCs are set up to show highly saturated, high-contrast images - as someone else said, “punchy” - and people have been conditioned to want this look on their screens.
If you are only using your photography on screens, I don’t think you should bother with calibration. You’d be editing an image on a calibrated monitor and displaying it on a variety of other, uncalibrated screens - it won’t look right.
However, if you are going to print your photos, calibration is critical. Printed photos do not look like the punchy screens. In this case you are calibrating your monitor to show you what the image will look like in print. Even then, there are usually tweaks you need to do to get the screen to match the final print, but starting with a calibrated monitor gets you 90% there.
Maybe it makes sense to do the first round of edits on the uncalibrated profile to make sure it looks good on a screen, and then only calibrate when I go to print? I guess virtual copies make this easy to do.
This is why most pros use Macs. Most Apple screens are pretty good without any manual adjustment.
A lot of useful things have already been said here, but I think there’s still something missing. Colour management has two components: calibrating devices AND profiling them so the operating system (and thus any software that uses colour management) knows what it’s dealing with.
To calibrate means to adjust, e.g., a monitor to match certain target values - specifically a given colour space, gamma curve and brightness. After doing this you can say “Now my monitor complies with sRGB at a brightness of 120.” However, your software doesn’t know that unless you create an ICM profile for your calibrated device, install it in your operating system and link it to that device. For profiling you can use the same tools you already have for calibration.
Another thing I’d like to add is that 120 cd/m² brightness is “correct”, but it’s significantly darker than what people are used to looking at. For me personally it’s too dark, so I go for a brightness of 160. That also works well if you keep in mind that physical prints will be darker than that.
Thanks so much. I am going to do a set of test prints from the calibrated monitor. I made virtual copies of some images and will do a pre- and post-calibration test to see how they look.
If they still come out too dark, I will try that 160 setting as you suggested.
thanks!!!
Screen calibration IS ONLY useful if you are printing photos, especially in a professional setting. Otherwise there’s no need to calibrate.
Yes, in general calibrated screens have a yellowish tint. Edit in a neutral-colored room with no color casts or reflections, load the profile that matches the paper stock, and regularly do proof prints to see if the result matches exactly what you want the print to look like on paper.
That is not true. By calibrating your screen you are setting a standard benchmark for what colors your images are “supposed” to look like across all displays. In OP’s case, his $300 monitor is physically unable to display truly accurate colors no matter how many times he calibrates it. But to say calibrating your monitor only matters if you print is just ignorant. Plus, you would have to calibrate the printer to get truly accurate colors and prints. Printers and monitors use different color profiles and different ways of producing color; it takes time, effort and machinery to match print to screen 1:1. None of it is a one-and-done process.
IMHO, a calibrated monitor is only useful if you are printing your photos. This way you will get consistent results from your prints. Otherwise, you are just processing photos and videos on a calibrated monitor for everyone else in the world to view on uncalibrated monitors and cell phones, and the benefits are lost in translation.
I ran into this issue doing real estate photography. Sure, my monitor looks great, but my realtors and the realty websites just don’t see it the same way because their screens aren’t calibrated.
In addition to what others have said, you’ve linked two pretty low-end consumer-model monitors. Probably not worth calibrating for any purpose.
Pro colorists use monitors in the tens of thousands of dollars. They cost that much for a reason. It’s not brightness or product life or functionality. It’s color accuracy.
You’re bringing a $250 Amazon product to the table and expecting it to do anything even close to a $25,000 reference monitor? Not gonna happen. Plus you’re probably missing other critical intermediary hardware.
Easiest realistic solution for you: get an M2 MacBook Pro, and it will absolutely be close enough for your use case right out of the box. How do I know this’ll be good enough for you? If it wouldn’t be, you’d already know - which means you wouldn’t be asking how to perfectly calibrate a $250 gaming monitor.
Long story short, you are using a gaming monitor for professional-grade work. Think of putting a $5,000 lens on a disposable camera and complaining about the quality.
Have you also calibrated your printer?
Seems your calibration tool is off.

> Again, over the next ten years I’ve repeated this process over and over. The calibrated screen just looks too bad to deal with and it makes my images that I worked so hard on, and look good on other screens, look terrible.
If you calibrate a MacBook and it looks drastically different, you’ve done something wrong.
> What am I doing wrong? I do like to game on this same screen but I’ve always just decreased the screens default color saturation and contrast to match how the images look on my iphone, which matches Lightroom pretty closely.
Who knows? There are so many steps, and you don’t share any of them, nor any comparisons.
> Also, the uncalibrated screen I am currently using looks identical to how the raw images look in camera but the calibrated screen looks nowhere near close.
That’s impossible, and it shows you don’t understand how RAW works - what you see on the camera’s screen is a processed JPEG preview, not the raw data itself.
> If there is someone out there who wants to make some money, PM and I will pay you 50$ for your time if you can help me figure out this problem.
A professional screen calibration service costs a lot more than that.
Today, in 2023, monitor calibration is NOWHERE NEAR as important as it was 10 years ago. Almost everything has factory calibration now, to the point where the minor out-of-box color differences on any decent-quality monitor are not going to affect the quality of your photo edits.
I print quite often myself. I’ve never calibrated my monitor, and it’s a 7-year-old monitor too. My images come out how they look on screen.
When you calibrate, you calibrate for print. Hence it’s going to look super dull compared to the super bright and saturated images on your monitor or phone.
So for gaming, I’d use a different preset on your monitor.
What about the Mac photography screen setting?
So, my background is in digital visual effects and animation production for motion pictures, and I have experience with designing and implementing end-to-end color processes across entire studios.
As multiple people here have pointed out, calibrating your monitor, meaning adjusting its settings to match some standard, has to be one element of an end-to-end process to achieve anything useful.
There are a whole lot of color transformations that happen between capturing your image and putting it on paper, on film, or on a viewer’s screen.
- Your camera translates real-world intensities and combinations of many wavelengths into the spatial and color information stored in the raw file - a translation that necessarily throws away a ton of information.
- Your raw file usually contains information from the camera that defines how its data is to be mapped to some kind of display-friendly standard, and your image editing or conversion software (often Photoshop or Lightroom) reads and applies this.
- The photo editing software converts that raw image into a color space that it uses for its own internal representation.
- When it’s displayed on the screen, another transformation occurs from the internal color space of the photo editing software to the output encoding space. (note: monitor calibration can, but doesn’t always, result in generation of a profile that can control this step.)
- Your monitor takes images in the output color space and converts that to light intensity (note: adjusting this is a major purpose of monitor calibration.)
- Your photo editing color space also has similar transformations to the color encoding of your output device, if you are printing your images to paper or film.
- Finally, the output device itself has a transformation from its encoded space to the actual colors that end up on paper or film.
If you’re not controlling (or at least using consistent settings for) these steps, you’re essentially in an uncalibrated environment, where the steps you don’t control can do just about anything.
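To make one link of that chain concrete: the display-encoding step for sRGB is a fixed piecewise curve defined in the sRGB spec. A minimal sketch (function names and the 0.5 sample value are just illustration):

```python
def linear_to_srgb(x):
    """Encode linear light (0-1) to an sRGB signal value (sRGB OETF)."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1 / 2.4)) - 0.055

def srgb_to_linear(v):
    """Inverse: decode an sRGB signal value back to linear light."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

mid = linear_to_srgb(0.5)             # linear 0.5 encodes to ~0.74
print(round(mid, 4))                  # → 0.7354
print(round(srgb_to_linear(mid), 4))  # round-trips back → 0.5
```

Every display and every piece of software in the chain applies (or assumes) curves like this; when two steps disagree about which curve is in play, you get exactly the washed-out or muddy results described above.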
Photoshop’s controls for managing this process are on the View -> Proof Setup submenu, and exactly how to approach it and how to use those controls is way beyond what I can give you in a Reddit post.
But if you’re in an uncalibrated environment and want results that seem pretty much like what you’re used to, you can probably calibrate your monitor to sRGB, set any monitor settings to sRGB (this is at least possible on the PD2500Q), and set your proof setup in Photoshop to “Internet Standard RGB (sRGB)”. Yes, there are other ways to do things, but if you’re only hitting a couple of steps in the above chain, you’re likely to get results that range from slightly odd to very much not what you want.
I would mostly agree, however…
Calibrating to sRGB is a pretty bad idea for photographic work. In the motion picture and VFX world you would calibrate to Rec. 709 or whatever standard you’re targeting, but that is mostly down to how those color management pipelines work - you don’t get ICC-based color management there, and there aren’t that many different outputs.
For photographic work you can have a ton of different outputs, as every printer, ink and paper combination has its own color space. For that purpose, it’s best to keep the display at its native gamut, as a wider gamut allows you to see more of the colors you’re working with.
A good and simple strategy is to target a D65 white point and 2.2 gamma. That is, adjust the RGB “gain” in the display and find the gamma setting that is closest to 2.2. Don’t write anything to the video card gamma table - that will just lead to banding (which you’ll get anyway, so best to minimize it).
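For anyone wondering how close “gamma 2.2” is to the actual sRGB curve, here’s a quick comparison sketch (the sample linear values are arbitrary; both formulas are standard):

```python
def gamma22_encode(x):
    """Pure power-law gamma 2.2."""
    return x ** (1 / 2.2)

def srgb_encode(x):
    """The sRGB piecewise curve (linear toe near black)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

for lin in (0.001, 0.18, 0.5):
    print(lin, round(gamma22_encode(lin), 3), round(srgb_encode(lin), 3))
```

The two track each other closely in the midtones (0.459 vs 0.461 at linear 0.18) but diverge noticeably in deep shadows (0.043 vs 0.013 at linear 0.001), which is why “calibrated to 2.2” and “assumed sRGB” are close but not identical.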
Then, you simply profile the display and make sure to install the profile in your OS. This will take care of things in (ICC) color managed applications. Meaning, output transforms will be handled on the fly to match the display.
For non-color-managed applications, well… it’s probably easiest to try to avoid them. The Windows UI will look oversaturated, and games don’t support ICC color management. There are ways to use LUTs for this if it bothers you. In fact, you might actually want to get madVR and use a LUT for your video player if you like to watch videos on your computer. Most web browsers work fine if you stick to gamma 2.2. With Firefox you can enable color management and plug in your display profile.
Anyhow, as for proofing… there’s nothing inherently wrong with it, but I find it unnecessary for web delivery. You could use it as a quick preview of what happens to your image after color space conversion. I rarely bother.
Start with a raw conversion to a large working color space, and make sure Camera Raw (or whatever raw converter you use) is set to 16 bits. ProPhoto RGB is good.
Make your edits and then convert to sRGB, or whatever you want (Edit -> Convert to Profile in Photoshop). If you’re targeting print, proof to the printer’s profile. Don’t convert - the printer software will handle the conversion.
Thank you for your insight! Since sRGB is a standard that incorporates a D65 white point and 2.2 gamma, it sounds like your main concern is that calibration not try to apply a hardware LUT to get the gamut to match?
Yes, that would be one concern. As long as you stay in color managed applications there’s no need to limit the gamut, and that larger gamut can be useful when working with photos (or the rare occasion when some content is in a larger gamut).
Limiting the gamut would only make sense in non-color managed applications. However, a lot of applications are color managed (important to know which ones though) and in cases where you want accuracy you can use a LUT to map the source color space to your display. For example if you work with video in Resolve or somesuch, or for video playback or games.
I reckon switching between calibrations and profiles is a bigger pain than trying to stick to color-managed apps and using LUTs where desired or necessary.
It should also be mentioned that the sRGB mode in most consumer displays is utter trash. They tend to be locked down with the brightness set way too high, as well as poorly calibrated.
Of course that’s not an issue on displays that support hardware calibration.
There’s a third option as well, but it’s a bit more complicated to set up.
You could calibrate and profile the display (in this case you want to generate corrections for the vcgt), then create a synthetic profile with your display’s white point and primaries, as well as your target gamma. Create a 3DLUT with the synthetic profile as the source and your display profile as the destination.
If you want to limit the gamut, you can use your display’s white point and Rec. 709 primaries for the synth profile.
Plug the synth profile into your OS color management. Then load the 3DLUT with DWMLut.
This is sort of like doing hardware calibration, but in software… or something like that. It’s an option for displays that lack hardware calibration capabilities.
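As a toy illustration of what a 3DLUT is doing under the hood: precompute a cube of corrections once, then map every pixel through it. The grid size here is typical, but the “transform” is a made-up per-channel tweak standing in for a real synthetic-profile-to-display-profile conversion, and real tools interpolate between grid points rather than snapping to the nearest one:

```python
N = 17  # typical 3DLUT grid size (17x17x17)

def transform(r, g, b):
    # stand-in for the real "synthetic profile -> display profile" mapping
    return (r ** 1.1, g, b ** 0.95)

# build the cube once: every grid point stores its corrected color
lut = {}
for i in range(N):
    for j in range(N):
        for k in range(N):
            lut[(i, j, k)] = transform(i / (N - 1), j / (N - 1), k / (N - 1))

def apply_lut(r, g, b):
    # nearest-neighbor lookup; real LUTs use trilinear/tetrahedral interpolation
    idx = tuple(round(c * (N - 1)) for c in (r, g, b))
    return lut[idx]

print(len(lut))  # → 4913 precomputed entries (17**3)
```

Loading a cube like this into something like DWMLut means the correction is applied to everything on screen, which is why it behaves like hardware calibration done in software.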
I use this approach myself. It is best with fairly well behaved displays that are not too far off of your target.
My setup targets P3-D65 with gamma 2.4. The display is close to the P3 gamut, so it works. I have a second LUT for Rec. 709 that I can switch to for grading video.
This way I’m pretty well set for switching between apps like Resolve, Blender and Photoshop. Basically, I can live in gamma 2.4 when I want to and ICC color managed apps only have to make a simple gamma 2.4 to gamma 2.2 transform. This also reduces banding.
I’m not sure I would recommend this for OP as I think he needs to get the basics of color management down first.