How to detect when an iOS app crashes

Great post on StackOverflow about detecting when an app is killed due to an exception, a signal, etc.

Most crashes can be caught using a three-fold approach:

  1. applicationWillTerminate:
  2. uncaught exception handler
  3. signal handler

This translates into installing the handlers when the app starts (e.g., in the app delegate's launch method) and saving application data in each of the three hooks:

// install HandleExceptions as the uncaught exception handler
NSSetUncaughtExceptionHandler(&HandleExceptions);

// create and initialize the signal action structure
struct sigaction newSignalAction;
memset(&newSignalAction, 0, sizeof(newSignalAction));

// set SignalHandler as the handler in the signal action structure
newSignalAction.sa_handler = &SignalHandler;

// install SignalHandler as the handler for SIGABRT, SIGILL and SIGBUS
sigaction(SIGABRT, &newSignalAction, NULL);
sigaction(SIGILL, &newSignalAction, NULL);
sigaction(SIGBUS, &newSignalAction, NULL);


- (void)applicationWillTerminate:(UIApplication *)application {
    // save application data on normal termination
}

void HandleExceptions(NSException *exception) {
    DebugLog(@"This is where we save the application data during an exception");
    // save application data on crash
}

void SignalHandler(int sig) {
    DebugLog(@"This is where we save the application data during a signal");
    // save application data on crash
}

My own answer to the same question did not go as far as suggesting intercepting signals, but that is fundamental.

As I said, really great tip!


Online resources to learn Apple’s Swift Language

A collection of resources to start with the new Swift language.


Apple introduces iOS 8 SDK and Xcode 6

Another post for InfoQ: “At its 2014 Worldwide Developer Conference, Apple announced its new mobile operating system, iOS 8, alongside new SDKs and development tools. New Apple software includes over 4000 new APIs, including all new frameworks such as HealthKit, HomeKit, and CloudKit, and enhancements to the platform gaming capabilities.”

Read it all.


How Instruments can be used to fix a graphics performance issue

Lately, I have been investigating an issue that showed up in one of my customers' apps.

My customer’s app is a sort of PDF viewer that also allows adding annotations. The annotations are not stored in the PDF itself; instead, they are managed in a custom way and drawn on top of the PDF in a dedicated CATiledLayer-based view.

The issue was that after a couple of zoom-in/out operations, or after moving from one PDF to another, CPU usage jumped to 100%, even though no apparent operation was ongoing. This severely hampered the overall experience of the app, since practically all graphics operations became extremely slow while the app was stuck in that state. Curiously, other kinds of operations, e.g., downloading a file, were not slowed down significantly.

The issue had quite a trivial cause, due to some “bad” programming (meaning that an obvious rule was not respected), but the interesting part is how I came to understand what was going on.

Instruments was the main tool that came to the rescue, as you can imagine. The picture at the left shows the CPU Profiler tool output. You can see how the overall CPU usage goes to 100% at some point and stays there. The fundamental bits of information one can get from this output are the following:

  • there was something going wrong in the cleanup phase of a pthread lifecycle; knowing that the CATiledLayer used for optimised drawing uses threads, this was a hint that something was not handled correctly in the drawing phase; a CATiledLayer bug was hard to believe in, but still a possibility;
  • furthermore, while the program was running, the “self” field showed a great many calls being made to practically all symbols under “pthread_dst_cleanup”, and those calls would not halt for any reason;
  • among the calls being made repetitively, my attention was caught by those to FreeContextStack/PopContext/GetContextStack.

The last point was the key to understanding that something in the handling of the Core Graphics context stack was not being done correctly. So I set out to investigate the custom drawing code, and indeed what I found was a couple of unbalanced calls to UIGraphicsPushContext and UIGraphicsPopContext. Fixing this removed the CPU utilisation issue.
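The rule that was being violated is simple: every UIGraphicsPushContext must be matched by exactly one UIGraphicsPopContext, on every code path. A sketch of the balanced pattern follows; the method name and the drawing calls inside it are hypothetical, not my customer's actual code:

```objc
- (void)drawAnnotationsInContext:(CGContextRef)context {
    UIGraphicsPushContext(context);
    @try {
        // hypothetical annotation drawing using UIKit calls
        [[UIColor redColor] setStroke];
        [self.annotationPath stroke]; // assumed UIBezierPath property
    }
    @finally {
        // runs even if drawing throws, keeping the context stack balanced
        UIGraphicsPopContext();
    }
}
```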

As I said, the issue was caused by incorrect programming, but nevertheless catching it was an interesting experience.


What’s new in iOS 7 User Interface – Part 2: Deference

In a previous post, I began describing what changes the introduction of iOS 7 brought to iOS UI/UX dimension. In that post, I listed 4 main principles shaping the idea of iOS flat-UI:

    • Clarity

Here I will try and clarify what a “deferent” UI should be. Again in the New Oxford American Dictionary, I found deference defined as “humble submission and respect”. As Apple applies this concept to iOS, the subject of deference is the User Interface itself, while the object of deference is user content. This means that the User Interface should not get in the way of user content; UI should not be prominent over content. Rather, it should exalt user content.

An example of deference given by Apple can be found in its Calendar app. Specifically, as you can see in the image below, look at the search bar. In the iOS 6 Calendar app, the search bar reduced the space available for user content. In iOS 7, the search bar as such disappears and is replaced by a magnifier icon; when you tap on it, the search field appears inside the navigation bar. The navigation bar itself changes its content to adapt to the new context by displaying two buttons: Today and Done.

Another example of deference is provided by the new Notes app. It must be said that the old Notes app was surely one of the worst in the Apple pack. Here again we find the trick of the search bar disappearing, thus giving more space to content. But comparing the two screenshots, it becomes apparent that in the new Notes app content is king, while in the old one it was overshadowed by several UI elements: the typeface used for notes; the strong colors of both the background and the text; grid lines; and so on.

Looking at the Notes app, it is interesting to note that flat UI in iOS parlance does not mean “no texture”. Indeed, the Notes app features a “realistic” (Apple's wording) textured white background. It seems that what really matters is that “realistic” UI artifacts be “deferent”, as happens with the background in the Notes app.

Finally, a great example of deference, i.e., content over UI, is found in the new Weather app. As you can see in the comparison below, gone is the card-like appearance, whose only effect was some clutter and less usable space. Instead, we find a big background image representing the current conditions and big centered lettering showing the current temperature; the larger available space allows a new, more explicit textual representation of the current weather status and one more hour in the detailed hourly forecast.

I hope I have made deference a bit easier to understand as a basic principle of the iOS 7 UI. In a future post, I will examine the next principle: depth.


Hitchhiker’s guide to MKStoreKit

In-App Purchase is one of those great features of the iOS ecosystem that I wish were easier to understand and implement in my apps. In the end it is just machinery that will not really add value to my apps, and it would be great if the Apple motto “it just works” could be applied to StoreKit as well.

A very good read to start with is this tutorial on Ray Wenderlich’s blog, which explains all the steps required, from setting up your in-app purchase items in iTunes Connect to implementing MKStoreKit in your app.

Actually, so far I have found that the most convenient way to add In-App Purchase support to my apps is through Mugunth Kumar’s MKStoreKit. It makes things much easier and almost straightforward. On the other hand, MKStoreKit is presented by its author in a series of blog posts that are a bit sparse and somehow fail to give a quick view of the way you are supposed to make things work.

In this post, I am going to summarise the steps required to integrate MKStoreKit into your app for non-consumable, non-renewable items. I will assume that all the App Store paraphernalia has already been dealt with; if you are just now starting to use In-App Purchases, you might read the first part of the aforementioned tutorial.

So, going to the meat of the thing, what you need to do to set up and use MKStoreKit in your app is:

  1. in MKStoreConfigs.h, define macros for all of your items, e.g.:

    #define kKlimtPictureSetId @"org.freescapes.jigsaw.klimt.pictureset"
    #define kKlimtAltPictureSetId @"org.freescapes.jigsaw.klimt.altpictureset"

  2. create a MKStoreKitConfigs.plist file where you list all of your items; this could look like the picture below.


  3. in your app delegate, call:

    [MKStoreManager sharedManager];

    in order to initialize MKStoreKit and give it time to retrieve info from the App Store while the app is initialising;
  4. whenever you want to check whether a feature has been bought, call:

    [MKStoreManager isFeaturePurchased:kKlimtPictureSetId]

  5. when the user buys some feature, call:

    [[MKStoreManager sharedManager] buyFeature:kKlimtPictureSetId];

  6. to implement the required “restore purchases” feature, call:

    [[MKStoreManager sharedManager] restorePreviousTransactionsOnComplete:^{
        [self handlePurchaseSuccess:nil];
    } onError:^(NSError *error) {
        [self handlePurchaseFailure:error];
    }];

That is all there is to it! Really straightforward and nice.


Core Graphics Image Interpolation Performance

Recently, I have been doing quite a bit of Core Graphics programming and discovered that one key point in ensuring that an app performs well is choosing the right CGContext interpolation level. Interpolation, according to Wikipedia, “is a method of constructing new data points within the range of a discrete set of known data points.” Interpolation is what happens when you scale an image up, but also when you scale it down. In both cases, it has a tremendous impact on the result of the scaling.

Core Graphics' CGContext allows 4 different levels of interpolation: kCGInterpolationNone, kCGInterpolationLow, kCGInterpolationMedium, and kCGInterpolationHigh. Their effect is pretty trivial to describe: kCGInterpolationNone will give you the most jagged result of all; kCGInterpolationHigh will give you the smoothest. What is less clear from the outside is the impact interpolation will have on your app's performance.
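Selecting the level is a single call on the context before drawing. A minimal sketch, assuming `context` is a valid CGContextRef and `image` a CGImageRef being scaled down (the sizes here just mirror the benchmark below):

```objc
// choose the speed/quality trade-off before drawing
CGContextSetInterpolationQuality(context, kCGInterpolationMedium);

// the chosen interpolation quality applies to this scaled draw
CGContextDrawImage(context, CGRectMake(0.0, 0.0, 129.8, 129.8), image);
```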

So, I put together a benchmark test, and here are the results.

The test

In my benchmark test, I draw a set of 6 different 649×649 b&w bitmap images to build a set of 12 different jigsaw puzzle tiles. The bitmap images are scaled down by a factor of 5.0 (i.e., to 129.8×129.8) to make them fit on a 320×480 display (at retina display scale).

The outcomes

Run times were measured using Instruments, considering only the time spent inside the CGContextDrawImage function, so as to remove all effects not related to the interpolation itself.

The device used was a 4th gen iPod touch.

– kCGInterpolationLow: 969 msec

– kCGInterpolationMedium: 1690 msec

– kCGInterpolationHigh: 2694 msec

– kCGInterpolationNone: 545 msec

As you can see, there is a big factor between no interpolation and high interpolation (2694 vs 545 msec, almost a 5× increase). The question is whether this time is well spent, so let’s look at the visual result.

Visual Outcome

As to the visual outcome, if you compare High or Medium with Low or None, the difference is staggering. Also noticeable, IMO, is the difference between None and Low, while the difference between High and Medium is not particularly visible in this particular test.

Weighing the visual results against the clear difference in run times, in my particular case I ended up using `kCGInterpolationMedium`.






App Store: it’s a hard life

It may come as a bit of a shock to many, but the truth was already known to most independent iOS developers: the App Store is a really hard environment to live in. This has lately been confirmed by new figures released by the mobile analytics firm Adeven, which speak of almost 400,000 “zombie” apps (i.e., 80% of all apps): apps that are seldom, if ever, downloaded. Here is what TUAW says about it:

[only] a few companies with a lot of experience, brand recognition and marketing money are able to catapult their products up into the Top 25, where they’re usually profitable as long as they can sit there.

This is definitely true, so when I found this list of beautiful iOS app web sites, I could not resist checking with the Xyologic stats engine how much a beautiful site can help an app.

Well, I will leave the detailed check to you, since it would be rather unkind to highlight the not-so-good results of some nice and well-made apps into which fellow independent developers put a lot of effort and time. But it seems pretty obvious that, with several of them ranking below 10k downloads overall, a nice app web site is not the most important thing to have.


Playing a secondary theme in Cocos2D

Cocos2D offers CocosDenshion, an easy-to-use framework for working with audio.

CocosDenshion allows you to play background music with some effects on top of it. This is clearly aimed at games, and it usually works really well. The differences between background music and effects can be summarized as follows:

  1. background music is thought of as potentially long, so it is handled as a long audio stream;
  2. background music is thought of as continuous, so it is played in exclusive mode;
  3. effects are thought of as smaller in size, so they are loaded into memory entirely;
  4. effects are by their very nature meant to be played many times, so they are cached in memory for reuse;
  5. effects can be mixed with background music, so they do not “steal” the audio subsystem.
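In code, this split maps to two distinct SimpleAudioEngine entry points (the file names here are placeholders):

```objc
// background music: long, streamed, played in exclusive mode
[[SimpleAudioEngine sharedEngine] playBackgroundMusic:@"theme.mp3" loop:YES];

// effect: short, preloaded whole into memory, mixed on top of the music
[[SimpleAudioEngine sharedEngine] preloadEffect:@"jump.caf"];
[[SimpleAudioEngine sharedEngine] playEffect:@"jump.caf"];
```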

This works well as long as your effects are pretty small in size; otherwise your memory requirements will quickly grow. On iOS this is a no-no, since your app has very little memory to run with. Another case where CocosDenshion falls short is when you want your effects to be played continuously in a loop.

Say, for example, that you have a main background theme that is played in a loop, and then a secondary theme that you would like to be also played in a loop when your character enters some given state.

Such a scenario led me to create a small category on CocosDenshion's SimpleAudioEngine which defines four methods:

-(void) playForegroundMusic:(NSString*)filePath loop:(BOOL)loop;
-(void) stopForegroundMusic;
-(void) pauseForegroundMusic;
-(void) resumeForegroundMusic;

playForegroundMusic will simply play your secondary theme on top of your background music without claiming exclusive access to the audio subsystem. You can find it on my GitHub together with the rest of my Cocos2D snippets.
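For reference, here is a sketch of how such a category could be implemented on top of AVAudioPlayer; this is a simplified version, not the exact code from the repository, and the `ForegroundMusic` category name and the omitted error handling are assumptions:

```objc
#import <AVFoundation/AVFoundation.h>
#import "SimpleAudioEngine.h"

// keeps the secondary theme alive while it plays
static AVAudioPlayer *foregroundPlayer = nil;

@implementation SimpleAudioEngine (ForegroundMusic)

- (void)playForegroundMusic:(NSString *)filePath loop:(BOOL)loop {
    NSURL *url = [NSURL fileURLWithPath:filePath];
    foregroundPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
    // -1 loops forever; AVAudioPlayer mixes with the CocosDenshion background stream
    foregroundPlayer.numberOfLoops = loop ? -1 : 0;
    [foregroundPlayer play];
}

- (void)stopForegroundMusic {
    [foregroundPlayer stop];
    foregroundPlayer = nil;
}

- (void)pauseForegroundMusic {
    [foregroundPlayer pause];
}

- (void)resumeForegroundMusic {
    [foregroundPlayer play];
}

@end
```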



A bug in the Dropbox app for iOS?

Try this:

  1. launch your Dropbox app on the iPhone/iPad and log in;
  2. go to your computer and browse to the Dropbox settings to change your account password;
  3. go back to your iPhone/iPad and… surprise, you are still allowed to browse through your documents…

So, if you lose your iPhone, there is no need to rush to change your account password to protect your files from undesired access: it will do nothing.

The only protection you have is the 4-digit passcode that you can set in the Dropbox app.

Is this enough security for sensitive information?

I suspect this issue is common to many systems using OAuth or similar long-lived access token mechanisms. But why should it be hard to invalidate all tokens associated with an account when the account password is changed?