I’ve been into photography for years and this is an issue that keeps coming up and discouraging me. If someone could help me resolve this, I’d be eternally grateful

Basically, I understand the concept of calibrating monitors, but every time I actually calibrate mine it only makes my monitor look unusably awful and kind of ruins images that already looked good when posted online and in print.

This all started ten years ago (and again, this pattern has repeated every 1 to 2 years for the past ten years)…

Ten years ago, I would take a RAW photo on my camera and transfer it to my MacBook Pro (yes, I know you shouldn’t edit and print from a laptop, but it’s all I had at the time). The RAW, unedited image looked identical in Lightroom to how it looked on the camera. I’d edit the photo, post it online, and it would look good on my iPhone, on Facebook, and on other people’s phones and computers. I even printed a couple of photos and they looked pretty good. I am now looking at a photo I edited back then on my uncalibrated MBP, and it looks very close to how it looks on my iPhone, the same Lightroom edit from 10 years ago.

At the time, I figured it was important to calibrate my monitor, but when I did it just destroyed the screen on the MacBook. It didn’t even look close to natural and turned everything muddy brown. Now, I understand maybe I was just used to seeing the incorrect, uncalibrated version, but I have an image that proves the uncalibrated screen printed just fine and looked great on a screen. However, the calibrated screen looked too awful to continue using, so I deleted the profile and kept editing the way I had been.

Again, over the next ten years I’ve repeated this process over and over. The calibrated screen just looks too bad to deal with, and it makes the images I worked so hard on, which look good on other screens, look terrible.

So tonight, now on a PC with a BenQ gaming monitor that is 100% sRGB accurate, I decided to calibrate again because I really, really want to get into printing my images, but the same thing happened. All my images, which look great on my iPhone and match my uncalibrated screen to about 90%, now look awful.

What am I doing wrong? I do like to game on this same screen, but I’ve always just decreased the screen’s default color saturation and contrast to match how the images look on my iPhone, which matches Lightroom pretty closely.

Also, the uncalibrated screen I am currently using looks identical to how the raw images look in camera, but the calibrated screen doesn’t look anywhere close.

I’m once again discouraged and giving up on trying to print but I’d love to figure out what I’m doing wrong.

It seems that I have to choose: either edit and view my images on an uncalibrated screen, where they look better on screens, or calibrate my screen so that maybe they print more accurately, but then they won’t look the same when posted online.

If there is someone out there who wants to make some money, PM me and I will pay you $50 for your time if you can help me figure out this problem.

  • Vetusiratus@alien.top · 11 months ago

    Yes, that would be one concern. As long as you stay in color managed applications there’s no need to limit the gamut, and that larger gamut can be useful when working with photos (or the rare occasion when some content is in a larger gamut).

    Limiting the gamut would only make sense in non-color managed applications. However, a lot of applications are color managed (important to know which ones though) and in cases where you want accuracy you can use a LUT to map the source color space to your display. For example if you work with video in Resolve or somesuch, or for video playback or games.

    I reckon switching between calibrations and profiles is a bigger pain than trying to stick to color managed apps, and using LUTs where desired or necessary.

    It should also be mentioned that the sRGB mode in most consumer displays is utter trash. They tend to be locked down, with the brightness set way too high, and poorly calibrated on top of that.

    Of course that’s not an issue on displays that support hardware calibration.

    There’s a third option as well, but it’s a bit more complicated to setup.

    You could calibrate and profile the display (in this case you want to generate corrections for vcgt), then create a synthetic profile with your display’s white point and primaries, as well as your target gamma. Create a 3DLUT with the synthetic profile as source and your display profile as destination.

    If you want to limit the gamut you can use your display’s white point and rec. 709 primaries for the synth profile.
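    For reference, the RGB-to-XYZ matrix such a synthetic profile encodes follows from the xy chromaticities of the primaries and white point by standard colorimetry. A minimal sketch, using rec. 709 primaries with a D65 white as the example (function name is illustrative, not from any profiling tool):

    ```python
    import numpy as np

    def rgb_to_xyz_matrix(r, g, b, white):
        """Build the 3x3 RGB->XYZ matrix from xy chromaticities."""
        def xyz(xy):
            # Convert an xy chromaticity to XYZ with Y normalized to 1.
            x, y = xy
            return np.array([x / y, 1.0, (1.0 - x - y) / y])
        P = np.column_stack([xyz(r), xyz(g), xyz(b)])  # unscaled primary columns
        W = xyz(white)                                 # white point XYZ
        s = np.linalg.solve(P, W)                      # scale so R=G=B=1 maps to white
        return P * s

    # rec. 709 primaries, D65 white:
    M = rgb_to_xyz_matrix((0.64, 0.33), (0.30, 0.60), (0.15, 0.06),
                          (0.3127, 0.3290))
    ```

    For a wide-gamut synth profile you would plug in your display’s measured primaries instead.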

    Plug the synth profile into your OS color management. Then load the 3DLUT with DWMLut.

    This is sort of like doing hardware calibration, but in software… or something like that. It’s an option for displays that lack hardware calibration capabilities.
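    To make the 3DLUT part concrete, here is a toy sketch of what such a .cube file contains: one corrected RGB triple per grid point of input RGB. The “correction” here is just a per-channel gamma 2.2 to 2.4 re-encode (a real LUT built from a synthetic-source/display-destination profile pair also corrects gamut and grey balance); the layout follows the IRIDAS/Resolve .cube convention, with the red index varying fastest:

    ```python
    def write_cube(path, size=17, src_gamma=2.2, dst_gamma=2.4):
        """Write a toy 3D LUT that re-encodes gamma src_gamma content for a
        gamma dst_gamma display."""
        with open(path, "w") as f:
            f.write(f"LUT_3D_SIZE {size}\n")
            for b in range(size):
                for g in range(size):
                    for r in range(size):   # red varies fastest
                        out = [(c / (size - 1)) ** (src_gamma / dst_gamma)
                               for c in (r, g, b)]
                        f.write("{:.6f} {:.6f} {:.6f}\n".format(*out))

    write_cube("gamma_fix.cube")
    ```

    In practice you would let your profiling software (DisplayCAL, for example) generate the LUT from the two profiles rather than compute it by hand.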

    I use this approach myself. It is best with fairly well behaved displays that are not too far off of your target.

    My setup is such that I target P3-D65 with gamma 2.4. The display is close to P3 gamut so it works. I have a second LUT for rec. 709 that I can switch to for grading video.

    This way I’m pretty well set for switching between apps like Resolve, Blender and Photoshop. Basically, I can live in gamma 2.4 when I want to and ICC color managed apps only have to make a simple gamma 2.4 to gamma 2.2 transform. This also reduces banding.
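    That gamma 2.4 to gamma 2.2 transform is just a per-channel decode and re-encode on normalized 0..1 values; a real CMS works through the profiles, but the math reduces to something like this sketch:

    ```python
    def gamma_to_gamma(v, src=2.2, dst=2.4):
        """Re-encode a gamma-encoded channel value: decode with the source
        gamma, re-encode with the destination gamma."""
        linear = v ** src           # decode to linear light
        return linear ** (1 / dst)  # re-encode for the display

    # A gamma-2.2 mid-gray re-encoded for a gamma-2.4 display comes out
    # slightly brighter in code value, so it displays at the same luminance.
    mid = gamma_to_gamma(0.5)
    ```

    Because the combined exponent (2.2/2.4) is close to 1, the transform is gentle, which is part of why it introduces so little banding.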

    I’m not sure I would recommend this for OP as I think he needs to get the basics of color management down first.