MacOS vs Windows on cheaper monitors
No arguments here please. Just wondered if anyone’s tried both OSes on lower end monitors.
I got a cheap (AOC) 4k or possibly UHD monitor from work, and it was fine on macOS, but everything looks much sharper and the contrast etc. is better on Windows. I had that replaced with another cheap one, definitely branded UHD, Samsung this time. It too looks better on Windows: fine lines are sharper and there’s something weird about the contrast. The whites seem too intense somehow, making white-on-black contrast worse even though the brightness is turned down. I’ve tried the calibration wizard on macOS, following someone on YT who’d done it; I even downloaded their profile and it was really bad.
I’m just curious to see if anyone else has experienced this? macOS looks fine on the built-in display of course, which is a higher quality Retina job; it seems that macOS looks much worse on cheaper displays than Windows does.
Posted 1 month ago
Probably bog all to do with the OS and all to do with the GPU?
Either that or “Apple kit doesn’t play nicely with non-Apple kit, who knew”? Driver issue maybe?
Sorry, I’ve no experience of the issue, but do you just mean the OS, or is the quality difference true for all apps run under the particular OS? If it’s the former it makes some sense (maybe) that macOS text and icons etc. are optimised for better quality displays while Windows might be more forgiving, though I’d expect them to be much the same on a UHD Samsung.
Macs are optimised to use either standard 1:1 pixel scaling, or 4:1 – that’s what their ‘Retina’ displays are.
Unless you’re using 1:1 pixel scaling, Macs will render an image of your desktop with exactly 4x the pixels, then downsample it to fit your screen resolution. If your display is 4x the ‘looks like’ size then it’ll look lovely and sharp. E.g. a 4k monitor will provide an incredibly sharp-looking ‘looks like’ 1080p image, since a 4k resolution (3840×2160) has 4 times the pixels of a 1080p monitor (1920×1080).
If your display is less than 4k resolution, but still more than the 1:1 size of your desktop, Macs don’t render this in the same way as Windows. I believe Windows is said to be better with these ‘in between’ resolutions (e.g. QHD, which is neither 1:1 nor ‘Retina’ standard on Macs).
It could be that? I believe that’s still the way Macs do display scaling. Have a play with the display resolutions and any scaling options there are.
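The scaling behaviour described above can be sketched in a few lines. This is an illustration only – `render_buffer` is a made-up helper, not an Apple API – but it shows why ‘looks like 1080p’ on a 4k panel is pin sharp while in-between modes aren’t:

```python
# Sketch of macOS 'Retina' scaling as described above: the desktop is rendered
# at 2x the 'looks like' size in each dimension (4x the pixels), then the
# result is downsampled to the panel's native resolution.

def render_buffer(looks_like, panel):
    """Return the off-screen buffer size macOS renders, and whether it
    maps 1:1 onto the panel (no downsampling needed)."""
    w, h = looks_like
    buf = (w * 2, h * 2)        # 2x per axis = 4x the pixels
    exact = buf == panel        # exact match -> pin sharp
    return buf, exact

# 'looks like 1080p' on a 4k (3840x2160) panel: buffer matches the panel
print(render_buffer((1920, 1080), (3840, 2160)))   # ((3840, 2160), True)

# 'looks like 1440p' on the same panel: a 5120x2880 buffer has to be squeezed
# into 3840x2160 by a non-integer factor, so the image goes slightly soft
print(render_buffer((2560, 1440), (3840, 2160)))   # ((5120, 2880), False)
```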
On PC, when changing graphics card from Nvidia to AMD, I noticed a difference – not really better or worse, just subtly different somehow.
I’m not into photo editing or anything like that, so absolute colour accuracy isn’t a concern of mine; I just fiddle with the monitor controls until it’s what looks best to me.
One thought – do your monitors have HDR? The type of HDR it has may play a part, and I dunno how Macs handle HDR, but they may handle it differently to Windows.
Really sounds like the mac wasn’t using the display at its native resolution. I’ve had a monitor which macOS doesn’t work on properly because the “this is what I support” data from the monitor is just broken. Windows had drivers which detected the monitor and overrode the data.
(genuinely something I debugged at work – the monitor’s EDID neglected to mention the actual panel resolution).
If you can plug the Mac back in, compare the resolution that macOS says it is using (System Preferences -> Display) with the actual resolution of the screen. If it’s wrong, and you can’t just move a slider to fix it, then you need to drop in a fixed EDID. If the monitor is fairly common this is usually just a google away.
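If you do end up inspecting an EDID by hand, the native resolution lives in the first detailed timing descriptor. A rough sketch, assuming the EDID 1.3 byte layout (the `edid` blob here is fabricated for illustration; a real one comes from the monitor itself):

```python
# Hypothetical sketch: pull the native resolution out of an EDID blob's first
# detailed timing descriptor (bytes 54-71 in the EDID 1.3 layout).

def native_resolution(edid: bytes) -> tuple:
    d = edid[54:72]                      # first detailed timing descriptor
    h = d[2] | ((d[4] & 0xF0) << 4)      # horizontal active pixels (low byte + high nibble)
    v = d[5] | ((d[7] & 0xF0) << 4)      # vertical active lines (low byte + high nibble)
    return h, v

# Fabricated 128-byte EDID whose timing descriptor encodes 3840x2160
edid = bytearray(128)
edid[54:62] = bytes([0x04, 0x74, 0x00, 0x00, 0xF0, 0x70, 0x00, 0x80])
print(native_resolution(bytes(edid)))    # (3840, 2160)
```

If this doesn’t match what the panel actually is, that’s the broken-EDID situation described above.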
If it’s wrong, and you can’t just move a slider to fix it, then you need to drop in a fixed EDID.
I’ve never heard of an EDID, but NB there are some resolution options that macOS thinks are non-native, so it hides them. You have to hold alt (‘option’) and click the scaled button to see all the available options.
Pretty sure it’s using native resolution, but it’s not just the sharpness, it’s the contrast. I can barely see the grey and white stripes in Finder. And things just look off and difficult to look at somehow. The Samsung is definitely HDR – this could be part of it. During the calibration procedure it says to adjust these sliders till you can’t see the thing or the background, and I can always see the thing and the background no matter where the sliders are.
The scaling is set to one up from the smallest scaling, whatever that means.
I did use an HD monitor for a while; this was clearly far grainier on Mac than on Windows – edges of fonts were pretty pixelated.
How are you connecting the monitor to your Mac? It’s fairly likely you don’t have 4k support, so the monitor is scaling the input. If you hold command or something stupid whilst doing a little dance and clicking on a certain button in the macOS settings (Scaled, maybe?), it’ll show you the actual resolutions. No idea why hiding them was a good idea.
AFAIK not all Macs work with HDR.
It’s not a resolution issue. It is definitely using 4k resolution. It’s something to do with the dynamic range or something weird.
It’s a 2019 MBP 13 inch.
The results of the calibration do make it look darker and more contrasty – most of my display is white thanks to the macOS UI (and no, I don’t want dark mode), so the calibration stuff helps – but it loses contrast at the low end. I’m using Slack, and with the calibration profile the sidebar in the window looks black when it’s actually dark red, as can be seen on the built-in display and when using the default profile. I’ve just found a generic SAMSUNG colour profile in the list though, and it’s actually not that bad.
Although the scaling might be part of it too – I switched from the smallest scaling option (native 4k) to the one larger, and it all went a bit worse.
I have a 2018 Mac mini, running two 4k monitors (27″ and 24″) at ‘default for display’ settings. Absolutely no issues. Pin sharp.
For connection I’m using Thunderbolt/USB-C to DisplayPort leads. IME avoid HDMI connections.
A very comprehensive article on Macs and monitors….
@molgrips – any of this help?
A very comprehensive article on Macs and monitors….
I love that an article about monitors is presented in black type on a searingly bright yellow background that makes it pretty much unreadable. Is it some sort of obscure IT person joke?
I love that an article about monitors is presented in black type on a searingly bright yellow background that makes it pretty much unreadable. Is it some sort of obscure IT person joke?
Have to confess I hadn’t really noticed that.
I was in awe of the amount of detail I never realised existed in rendering fonts on a monitor.
I love that an article about monitors is presented in black type on a searingly bright yellow background that makes it pretty much unreadable.
Funnily enough it’s quite readable for me on both Mac and Windows on this monitor 🙂
Good article though, much of it I knew but hadn’t really thought about in this context. Going to try turning off the font smoothing – but changing the scaling isn’t really an option since I need the screen space.
I’m kicking myself for not going for the 32″ monitor, or trying to (it was through work). I’d probably have been able to use the native res without scaling.
Mac expects RGB. The Samsung is maybe BGR. Turn the monitor upside down, rotate the display 180 degrees in the settings, and see if it comes out as clear as RGB.
Windows gamma defaults to a different setting to OS X. Remove your calibration and set the gamma to 2.2 and see if it looks better (or change the monitor’s gamma setting, if it has one, to 1.8).
edit – at least this used to be the case … haven’t owned a Mac for a few years now
defaults -currentHost write -g AppleFontSmoothing -int 0   # font smoothing off
defaults -currentHost write -g AppleFontSmoothing -int 1   # lightest smoothing level
On Retina screens, Macs no longer use sub-pixel anti-aliasing. At less-than-Retina DPI, sub-pixel anti-aliasing may be a benefit, but that depends on the RGB subpixel layout (see above). Browsers may implement some font anti-aliasing independent of the OS, giving mixed results. Bugger.