How to detect when an iOS app crashes

Great post on Stack Overflow about detecting when an app is killed due to an exception, a signal, etc.

Most crashes can be caught using a 3-fold approach:

  1. applicationWillTerminate
  2. exception handler
  3. signal handler

This translates into using:

// installs HandleExceptions as the Uncaught Exception Handler
NSSetUncaughtExceptionHandler(&HandleExceptions);
// create the signal action structure
struct sigaction newSignalAction;
// initialize the signal action structure
memset(&newSignalAction, 0, sizeof(newSignalAction));
// set SignalHandler as the handler in the signal action structure
newSignalAction.sa_handler = &SignalHandler;
// set SignalHandler as the handler for SIGABRT, SIGILL and SIGBUS
sigaction(SIGABRT, &newSignalAction, NULL);
sigaction(SIGILL, &newSignalAction, NULL);
sigaction(SIGBUS, &newSignalAction, NULL);


- (void)applicationWillTerminate:(UIApplication *)application {
    // Save application data on normal termination
}

void HandleExceptions(NSException *exception) {
    DebugLog(@"This is where we save the application data during an exception");
    // Save application data on crash
}

void SignalHandler(int sig) {
    DebugLog(@"This is where we save the application data during a signal");
    // Save application data on crash
}

My own answer to the same question did not go as far as suggesting intercepting signals, but that is fundamental.

As I said, really great tip!


Hitchhiker’s guide to MKStoreKit

In-App Purchase is one of those great features of the iOS ecosystem that I wish were easier to understand and implement in my apps. In the end it is just machinery that will not really add value to my apps, and it would be great if the Apple motto “it just works” could be applied to StoreKit as well.

A very good read to start is this tutorial on Ray Wenderlich’s blog, which explains all the steps required, from setting up your in-app purchase items in iTunes Connect to implementing MKStoreKit in your app.

Actually, so far I have found that the most convenient way to add In-App Purchase support to my apps is through Mugunth Kumar’s MKStoreKit. It makes things much easier and almost straightforward. On the other hand, MKStoreKit is presented by its author in a series of posts on his blog that are a bit sparse and somewhat fail to give a quick overview of how you are supposed to make things work.

In this post, I am going to summarise the steps required to integrate MKStoreKit into your app for non-consumable, non-renewable items. I will assume that all the App Store paraphernalia has already been dealt with; but if you are just starting out with In-App Purchases, maybe you could read the first part of the aforementioned tutorial.

So, getting to the meat of it, what you need to do to set up and use MKStoreKit in your app is:

  1. in MKStoreKitConfigs.h, define macros for all of your items, e.g.:

    #define kKlimtPictureSetId @"org.freescapes.jigsaw.klimt.pictureset"
    #define kKlimtAltPictureSetId @"org.freescapes.jigsaw.klimt.altpictureset"

  2. create a MKStoreKitConfigs.plist file where you list all of your items; this could look like the example shown in the picture below.


  3. in your app delegate, call:

    [MKStoreManager sharedManager];

    in order to initialise MKStoreKit and give it time to retrieve info from the App Store while the app is launching;
  4. whenever you want to check whether a feature has been bought, call:

    [MKStoreManager isFeaturePurchased:kKlimtPictureSetId]

  5. when the user buys a feature, call:

    [[MKStoreManager sharedManager] buyFeature:kKlimtPictureSetId];

  6. to implement the required “restore purchases” feature, call:

    [[MKStoreManager sharedManager] restorePreviousTransactionsOnComplete:^{
        [self handlePurchaseSuccess:nil];
    } onError:^(NSError *error) {
        [self handlePurchaseFailure:error];
    }];

That is all there is to it! Really straightforward and nice.
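Putting the calls above together, a purchase action in a view controller might look like the following sketch. Note that buyKlimtPictureSet: and unlockKlimtPictures are hypothetical names of your own code, not part of MKStoreKit; only the MKStoreManager calls come from the steps above.

```objc
// Hypothetical button action tying the MKStoreKit calls together.
- (IBAction)buyKlimtPictureSet:(id)sender {
    // Step 4: check whether the feature was already bought
    if ([MKStoreManager isFeaturePurchased:kKlimtPictureSetId]) {
        [self unlockKlimtPictures]; // hypothetical helper in your app
        return;
    }
    // Step 5: kick off the StoreKit purchase flow for this feature id
    [[MKStoreManager sharedManager] buyFeature:kKlimtPictureSetId];
}
```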


Core Graphics Image Interpolation Performance

Recently, I have done quite a bit of Core Graphics programming and discovered that one key point in ensuring that an app performs well is choosing the right CGContext interpolation level. Interpolation, according to Wikipedia, “is a method of constructing new data points within the range of a discrete set of known data points.” Interpolating is what happens when you scale an image up, but also when you scale it down. In both cases, interpolation has a tremendous impact on the result of the scaling.

Core Graphics CGContext allows 4 different levels of interpolation: kCGInterpolationNone, kCGInterpolationLow, kCGInterpolationMedium, and kCGInterpolationHigh. Their effect is easy to describe: kCGInterpolationNone will give you the most jagged result of all; kCGInterpolationHigh will give you the smoothest. What is less clear from the outside is what impact interpolation will have on your app’s performance.
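Setting the interpolation quality is a one-liner on the context. Here is a minimal sketch of scaling a UIImage with an explicit quality; sourceImage and targetSize are assumed to exist in your code:

```objc
// Sketch: scale sourceImage into targetSize with medium interpolation.
UIGraphicsBeginImageContextWithOptions(targetSize, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetInterpolationQuality(ctx, kCGInterpolationMedium);
[sourceImage drawInRect:CGRectMake(0.0, 0.0, targetSize.width, targetSize.height)];
UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```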

So, I put up a benchmark test and here are the results.

The test

In my benchmark test, I draw a set of 6 different 649×649 b&w bitmap images to build a set of 12 different jigsaw puzzle tiles. The bitmap images are scaled down by a factor of 5.0 (i.e., to 129.8×129.8) to make them fit on a 320×480 display (at retina display scale).

The outcomes

Run times were measured using Instruments, considering only the time spent inside the CGContextDrawImage function, so as to remove all effects not related to the interpolation itself.

The device used was a 4th gen iPod touch.

– kCGInterpolationNone: 545 msec

– kCGInterpolationLow: 969 msec

– kCGInterpolationMedium: 1690 msec

– kCGInterpolationHigh: 2694 msec

As you can see, there is a big gap between no interpolation and high interpolation: 2694 msec vs 545 msec, almost a 5× slowdown (roughly a 400% increase). The question is whether this time buys anything, so let’s take the visual result into account.

Visual Outcome

As to the visual outcome, if you compare High or Medium to Low or None, the difference is staggering. The difference between None and Low is also noticeable, IMO, while the difference between High and Medium is not particularly visible in this particular test.

Given the clear gap in run times, in my particular case I ended up using `kCGInterpolationMedium` as the best trade-off between quality and speed.


Blur effect for Cocos2D textures on retina displays

I have recently needed to blur a Cocos2D texture to simulate an “out-of-view-field” effect. As often happens, it had already been implemented, under the name of AWTextureFilter, by Manuel Martínez-Almeda, who made it available on GitHub.

Integrating it into my app was really easy, but it turned out that there was a problem with @2x textures on retina displays, both on the iPhone and on the new iPad. So I fixed this and published a fork for anyone who would like to use it.

 Update 27 May:

Due to the lengthy calculations that AWTextureFilter performs, it can be pretty slow and block your UI for up to a couple of seconds. Lengthy calculations scream for background execution, but the stock version of AWTextureFilter will not work correctly when executed on a secondary thread, because it eventually calls into the OpenGL layer.
So I made some changes to the blur method, and it is now possible to create a blurred texture off the main thread without blocking the UI. I hope you enjoy it.
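The usual GCD pattern for this kind of work looks like the following sketch. The blur call itself is elided, since the exact AWTextureFilter API depends on the version you use, and updateSpriteWithTexture: is a hypothetical method of your own:

```objc
// Sketch: run the lengthy blur off the main thread, then hop back to update the UI.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // ... compute the blurred texture here with AWTextureFilter ...
    dispatch_async(dispatch_get_main_queue(), ^{
        // Touch Cocos2D nodes only on the main thread, e.g.:
        // [self updateSpriteWithTexture:blurredTexture];
    });
});
```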