Search for scaling an image and Google will return several answers, 90% of which are roughly the same: a category on UIImage that returns the modified (scaled) image using UIGraphicsBeginImageContext and family. However, this is a naive implementation that won't work reliably on threads other than the main one.
Here is the general shape of the code most often given as the answer to image-scaling questions (this particular piece is from stackoverflow.com, but there's more out there like it):
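The commonly posted snippet looks roughly like this (a reconstruction of the usual pattern, not the exact Stack Overflow answer; the method name is illustrative):

```objc
// The typical UIGraphicsBeginImageContext-based resize found in answers
// online. It works, but only when called from the main thread.
- (UIImage *)imageByScalingImage:(UIImage *)image toSize:(CGSize)newSize
{
    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0.0, 0.0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
```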
This code works fine when used with caution. Any image operation is CPU intensive and will take some time. Repeat this code over and over (like in a naive table view implementation while scrolling) and you'll start to notice the penalty. That's why you usually don't want to perform image operations on the main thread. The code above seems to be the most popular answer to image-resizing questions, yet it's not safe to use on background threads. Apple's documentation for UIGraphicsBeginImageContext is explicit about this:
You should call this function from the main thread of your application only.
Somehow this never gets mentioned in the examples I find online. Is no one out there interested in batch-processing large numbers of images? My advice to anyone who is: start using the CGBitmapContextCreate family of functions to create offscreen image contexts, which also work on threads other than the main one.
The way I use this is to create an NSOperation subclass that does the heavy-duty processing in the background on an NSOperationQueue. When it finishes, the operation notifies its delegate with the new UIImage object.
My version is listed below (shortened for brevity). Notice the use of CGBitmapContextCreate to set up the offscreen context:
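In outline, such an operation looks roughly like this (a sketch of the approach, not the original listing; ImageScaleOperation, its delegate protocol, and ARC-style properties are my assumptions):

```objc
#import <UIKit/UIKit.h>

// Hypothetical delegate protocol: the operation hands back the scaled image.
@protocol ImageScaleOperationDelegate <NSObject>
- (void)imageScaleOperationDidFinish:(UIImage *)scaledImage;
@end

@interface ImageScaleOperation : NSOperation
@property (nonatomic, strong) UIImage *sourceImage;
@property (nonatomic, assign) CGSize targetSize;
@property (nonatomic, weak) id<ImageScaleOperationDelegate> delegate;
@end

@implementation ImageScaleOperation

- (void)main
{
    if (self.isCancelled) return;

    size_t width  = (size_t)self.targetSize.width;
    size_t height = (size_t)self.targetSize.height;

    // Offscreen bitmap context via CGBitmapContextCreate: unlike
    // UIGraphicsBeginImageContext, this is safe off the main thread.
    // Passing NULL data and 0 bytesPerRow lets Core Graphics allocate
    // and compute them for us.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(NULL, width, height,
                                                 8, 0, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);
    if (context == NULL) return;

    CGContextSetInterpolationQuality(context, kCGInterpolationHigh);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height),
                       self.sourceImage.CGImage);

    CGImageRef scaledRef = CGBitmapContextCreateImage(context);
    CGContextRelease(context);

    UIImage *scaled = [UIImage imageWithCGImage:scaledRef];
    CGImageRelease(scaledRef);

    // Hand the result back to the delegate on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.delegate imageScaleOperationDidFinish:scaled];
    });
}

@end
```

Queueing the work is then just a matter of configuring an instance and calling `[queue addOperation:operation]` on an NSOperationQueue you own.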