
Retina Display and Nearest Neighbor scaling



Hey,

 

I'm developing an application that frequently handles low-resolution bitmaps. I often want to display a very small image (say, 12x12 pixels) on a much larger surface (say, 480x480) so the user can see every individual pixel. I use [context setImageInterpolation:NSImageInterpolationNone] to disable bilinear/bicubic filtering, which works great on normal displays. In HiDPI mode, however, OS X first scales the 12x12 image to 24x24 using what appears to be bilinear filtering, and only then scales 24x24 to 960x960 using nearest neighbor. This is an unwanted result: I get four blurry "pixels" instead of one sharp one. How can I stop OS X from filtering my bitmaps? I attached an image explaining the issue.
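For context, my drawing code is roughly the following sketch (pixelImage, the rect sizes, and the view subclass are placeholders, not my actual code):

```objc
// Sketch: draw a tiny bitmap magnified with interpolation disabled,
// so each source pixel renders as a crisp square block.
- (void)drawRect:(NSRect)dirtyRect {
    NSGraphicsContext *context = [NSGraphicsContext currentContext];
    [context saveGraphicsState];
    // Disable bilinear/bicubic filtering for this drawing pass.
    [context setImageInterpolation:NSImageInterpolationNone];
    [self.pixelImage drawInRect:NSMakeRect(0, 0, 480, 480) // 12x12 -> 480x480
                       fromRect:NSZeroRect                 // entire source image
                      operation:NSCompositeSourceOver
                       fraction:1.0];
    [context restoreGraphicsState];
}
```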

 

Thanks in advance!



Apparently the problem comes from strange (undocumented?) behavior in lockFocus. When you call lockFocus on an NSImage created with initWithCGImage:size: and the size parameter doesn't match the image's actual size, lockFocus itself rescales the image to that size, ignoring my NSImageInterpolationNone setting. It turns out initWithCGImage:size: expects the size in logical pixels (points), not physical pixels (which is what I was passing: the actual pixel count of the image). To solve it, I simply divided the size by NSScreen.mainScreen.backingScaleFactor.
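In code, the fix looks roughly like this (cgImage stands in for however you obtained your CGImageRef):

```objc
// Sketch of the fix: pass the size in points (logical pixels), i.e. the
// pixel dimensions divided by the backing scale factor, so that lockFocus
// does not silently rescale the bitmap on a Retina display.
CGFloat scale = NSScreen.mainScreen.backingScaleFactor; // 2.0 on Retina
NSSize pointSize = NSMakeSize(CGImageGetWidth(cgImage) / scale,
                              CGImageGetHeight(cgImage) / scale);
NSImage *image = [[NSImage alloc] initWithCGImage:cgImage size:pointSize];
```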
