The truth behind kCGImageSourceShouldCache
This year I was in San Francisco for WWDC, a good place to ask Apple's engineers questions about specific technologies, or to report bugs directly, especially problems in the documentation, because filing Radars for those just doesn't change a thing.
Anyway, I've been working quite a lot with ImageIO, and one thing that bugged me was an inconsistency between the documentation and the header file about the kCGImageSourceShouldCache flag.
See for yourself; here is the extract from the documentation:
Whether the image should be cached in a decoded form. The value of this key must be a
CFBoolean value. The default value is kCFBooleanTrue. This key can be provided in the options dictionary that you can pass to the functions
Pretty straightforward, except that the header file says:
Specifies whether the image should be cached in a decoded form. The value of this key must be a CFBooleanRef; the default value is kCFBooleanFalse.
Yeah right, the default values are not the same! I personally always assumed the header was right, and that the default value was false.
So I went to the ImageIO lab (which, by the way, was open for only two hours!) and was pointed to an engineer who was really good and explained to me what the real values are.
Actually, the default values are:
- False for 32-bit OS X, and 32-bit iOS.
- True for 64-bit OS X.
Which makes sense: with the much bigger address space on 64-bit, keeping decoded images cached in memory is a lot less of a problem.
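Given that the default depends on the architecture, the safest move is to not rely on it at all and pass the flag explicitly. Here is a minimal Swift sketch of what that might look like; the file path is hypothetical, and whether you want true or false depends on your memory-versus-speed trade-off:

```swift
import Foundation
import ImageIO
import CoreGraphics

// Hypothetical input file; substitute your own image URL.
let url = URL(fileURLWithPath: "/tmp/example.png") as CFURL

// Pass kCGImageSourceShouldCache explicitly instead of relying on the
// platform-dependent default (false on 32-bit, true on 64-bit OS X).
let options: [CFString: Any] = [kCGImageSourceShouldCache: false]

if let source = CGImageSourceCreateWithURL(url, nil),
   let image = CGImageSourceCreateImageAtIndex(source, 0, options as CFDictionary) {
    // The decoded bitmap will not be cached by ImageIO with the flag set to false.
    print("Decoded image: \(image.width)x\(image.height)")
}
```

The same options dictionary can be passed to CGImageSourceCopyPropertiesAtIndex and friends, so one explicit setting keeps the behavior consistent across architectures.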
So anyway, he took note of this, and I hope they will update the documentation to be more specific in the near future. Yeah, let's dream.