So I've written in the past about how nice WebP is. I'm currently working on a personal project that'll incorporate around 260 full-screen retina images on the iPad - lots of pictures! I want them to be WebP to keep the download size sensible. I tried a CocoaPod (iOS-WebP) that purported to decode WebP for me, but it didn't work. So here's my solution, which is simpler and works:

Header file:

//
//  UIImage+WebP.h
//
//  Created by Chris Hulbert on 9/02/2014.
//  Copyright (c) 2014 Chris Hulbert. All rights reserved.
//
//  This gives you methods for sync and async webp image loading.

#import <UIKit/UIKit.h>

typedef void(^WebpImageCompletionBlock)(UIImage *image);

@interface UIImage (WebP)

/// Synchronously loads a webp image. This can be called from any thread.
+ (instancetype)webpImageNamed:(NSString *)name;

/// Async decoding of a webp image on a background thread. Callback is on main thread.
+ (void)webpDecodeImageNamed:(NSString *)name completion:(WebpImageCompletionBlock)completion;

@end

Implementation file:

//
//  UIImage+WebP.m
//
//  Created by Chris Hulbert on 9/02/2014.
//  Copyright (c) 2014 Chris Hulbert. All rights reserved.
//
//  This gives you methods for sync and async webp image loading.

#import "UIImage+WebP.h"

#import <WebP/decode.h>

@implementation UIImage (WebP)

+ (instancetype)webpImageNamed:(NSString *)name {
    // Load the source data, bailing out if the file is missing.
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"webp"];
    NSData *data = [NSData dataWithContentsOfFile:path];
    if (!data) return nil;

    // WebP-decode it. Returns malloc'd ARGB pixel data, or NULL on failure.
    int width = 0, height = 0;
    uint8_t *webpDecoded = WebPDecodeARGB(data.bytes, data.length, &width, &height);
    if (!webpDecoded) return nil;

    // Release our reference to the source data, since it's no longer needed.
    data = nil;

    // Use the webpDecoded data to back a CG bitmap context. Note that
    // kCGImageAlphaNoneSkipFirst ignores the alpha channel, so transparent
    // webp images will render against an opaque background.
    CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(webpDecoded, width, height, 8, width * 4, space, (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);
    CGColorSpaceRelease(space);

    // Convert it to a CGImage.
    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    // Free the CG context and decoded data.
    CGContextRelease(context);
    free(webpDecoded);

    // Convert the CGImage to a UIImage. Scale is 2 because these are retina
    // assets; use [UIScreen mainScreen].scale if you'd rather match the device.
    UIImage *uiImage = [UIImage imageWithCGImage:cgImage scale:2 orientation:UIImageOrientationUp];
    CGImageRelease(cgImage);

    return uiImage;
}

+ (void)webpDecodeImageNamed:(NSString *)name completion:(WebpImageCompletionBlock)completion {
    // Make the queue.
    static dispatch_queue_t queue;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        queue = dispatch_queue_create("com.splinter.webpdecoder", DISPATCH_QUEUE_CONCURRENT);
    });

    // Decode it in the background.
    dispatch_async(queue, ^{
        UIImage *image = [self webpImageNamed:name];

        // Call back on the main thread, guarding against a nil completion block.
        dispatch_async(dispatch_get_main_queue(), ^{
            if (completion) completion(image);
        });
        });
    });
}

@end
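To use it, call the async variant and set the image when the callback fires. A minimal sketch - the image name `page1` and the `imageView` outlet are just placeholders for whatever's in your app:

```objc
// Somewhere in a view controller, assuming a bundled 'page1.webp'
// and a UIImageView property named imageView:
[UIImage webpDecodeImageNamed:@"page1" completion:^(UIImage *image) {
    self.imageView.image = image; // Delivered on the main thread.
}];

// Or synchronously, if you're already off the main thread:
UIImage *image = [UIImage webpImageNamed:@"page1"];
```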

You'll need to include Google's 'libwebp' CocoaPod for this to work.
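That means adding something like the following to your Podfile (the platform version here is just an example - use whatever your project targets):

```ruby
# Podfile
platform :ios, '7.0'
pod 'libwebp'
```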

Thanks for reading! And if you want to get in touch, I'd love to hear from you: chris.hulbert at gmail.

Chris Hulbert

(Comp Sci, Hons - UTS)

iOS Developer in Sydney.

I have worked at places such as Google, Cochlear, News Corp, Fox Sports, NineMSN, FetchTV, Woolworths, and Westpac, among others. If you're looking for a good iOS developer, drop me a line!
