I’ve been into photography for years and this is an issue that keeps coming up and discouraging me. If someone could help me resolve this, I’d be eternally grateful
Basically, I understand the concept of calibrating monitors, but every time I actually calibrate mine, it makes the screen look unusably awful and throws off images that already looked good in print and when posted online.
This all started ten years ago (and this pattern has repeated every 1 to 2 years since)…
Ten years ago, I would take a RAW photo on my camera and transfer it to my MacBook Pro (yes, I know you shouldn't edit and print from a laptop, but it's all I had at the time). The unedited RAW image looked identical in Lightroom to how it looked on the camera. I would edit the photo and post it online, and it looked good on my iPhone, on Facebook, and on other people's phones and computers. I even printed a couple of photos and they looked pretty good. I am now looking at a photo I edited at that time on the uncalibrated MBP, and it looks very close to how it looks on my iPhone, the same Lightroom edit from ten years ago.
At the time, I figured it was important to calibrate my monitor, but when I did, it just destroyed the screen on the MacBook. It didn't even look close to natural and turned everything muddy brown. Now, I understand maybe I was just used to seeing the incorrect, uncalibrated version, but I have an image that proves the uncalibrated screen printed just fine and looked great on screen. However, the calibrated screen looked too awful to continue using, so I deleted the profile and kept editing the way I had been.
Again, over the past ten years I've repeated this process over and over. The calibrated screen just looks too bad to deal with, and it makes the images I worked so hard on, which look good on other screens, look terrible.
So tonight, now using a PC and a BenQ gaming monitor advertised as 100% sRGB accurate, I decided to calibrate again because I really, really want to get into printing my images, but the same thing happened. All my images, which look great on my iPhone and match my uncalibrated screen to about 90%, now look awful.
What am I doing wrong? I do like to game on this same screen, but I've always just decreased the screen's default color saturation and contrast to match how the images look on my iPhone, which matches Lightroom pretty closely.
Also, the uncalibrated screen I'm currently using looks identical to how the RAW images look in camera, but the calibrated screen doesn't come anywhere near close.
I’m once again discouraged and giving up on trying to print but I’d love to figure out what I’m doing wrong.
It seems I have to choose: either edit and view my images on an uncalibrated screen, where they will look better on other screens, or calibrate my screen so they maybe print more accurately, but then they won't look the same when posted online.
If there is someone out there who wants to make some money, PM me and I will pay you $50 for your time if you can help me figure out this problem.
First of all, your BenQ gaming monitor is going to have shit colors; it doesn't matter that it says 100% sRGB. The panel is made for quick response time, not color accuracy.
Second of all, calibrating a monitor needs to be done with an external device, something that can read the monitor's output from the outside. It doesn't matter how accurate the software side is if the hardware side is lacking.
Now, as for your sentiment: 10 years ago, more or less every consumer device had a shit display. Today you have much more variation among high-end consumer devices: you have phones and TVs with OLED screens, iPads with mini-LED, phones with normal LED screens, etc. Some of these displays have HDR capabilities, some don't.
The monitor you’re using is probably way worse than the display you had in your macbook pro.
Color-accurate monitors for PCs are expensive, and they still need calibration with an external device, since their out-of-factory performance isn't the best.
On the other hand, you have Apple products: love them or hate them, there's a reason they're an industry standard. A $1.5k Studio Display will give you much better real-world performance straight out of the box than any monitor in that price range.
YouTubers like to compare stats on paper, but they never mention the insane unit-to-unit variation from one panel to another.
Two years ago at work we ordered ten 32-inch screens (from a well-known brand); no two of them were calibrated the same out of the factory.
But, to my perception, when I calibrate the monitor (again, using a Spyder Pro) the colors look "worse" to me. Now, I'm guessing this is because I've grown accustomed to viewing shit colors, whereas the calibrated monitor is actually showing me the true colors that are in my photo files? Am I understanding this correctly?
Can't thank you enough for your advice
You can use the best calibration tool on the market; if the panel can't physically produce the result, it won't matter.
After you've calibrated the monitor, you need to check the delta E; that will show you how accurate the colors the monitor is displaying are versus what they're supposed to be.
Check some Linus Tech Tips videos on monitors and you'll see that even after calibration, some monitors just don't have a good delta E.
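Calibration software usually reports delta E for you, but for a concrete sense of what the number means, here's a minimal sketch (my own illustration, not from any particular calibration tool) of the classic CIE76 formula: it's just the Euclidean distance between the color the monitor was asked to show and the color a colorimeter actually measured, both in CIELAB.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta E: straight-line distance between two CIELAB colors.
    Rough guide: dE below ~1 is imperceptible, below ~2-3 is fine for
    photo work, above ~5 is clearly visible."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical patch: the L*, a*, b* the profile asked for vs. what
# a colorimeter read back from the panel
target = (50.0, 20.0, -10.0)
measured = (51.0, 22.0, -9.0)
print(round(delta_e_76(target, measured), 2))  # ≈ 2.45
```

Newer formulas like CIEDE2000 weight the lightness and chroma terms to track perception better, but CIE76 is enough to read a review chart: an average delta E under about 2 across the test patches is what you want from a photo-editing monitor.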
I'll give you an example: I have an Alienware 240 Hz gaming monitor; it's a 600-700 euro monitor. I would never try to get accurate colors out of it.
It’s not made for that. It’s made to refresh as fast as possible so i get a competitive edge in games.
You can't ask a Ferrari to do off-roading.