How to detect when an iOS app crashes

There is a great post on Stack Overflow about detecting when an app is killed due to an exception, a signal, etc.

Most crashes can be caught using a 3-fold approach:

  1. applicationWillTerminate:
  2. an uncaught exception handler
  3. signal handlers

This translates into code along these lines:

// e.g. at launch time, in your app delegate:
// installs HandleExceptions as the Uncaught Exception Handler
NSSetUncaughtExceptionHandler(&HandleExceptions);
// create the signal action structure
struct sigaction newSignalAction;
// initialize the signal action structure
memset(&newSignalAction, 0, sizeof(newSignalAction));
// set SignalHandler as the handler in the signal action structure
newSignalAction.sa_handler = &SignalHandler;
// set SignalHandler as the handler for SIGABRT, SIGILL and SIGBUS
sigaction(SIGABRT, &newSignalAction, NULL);
sigaction(SIGILL, &newSignalAction, NULL);
sigaction(SIGBUS, &newSignalAction, NULL);

- (void)applicationWillTerminate:(UIApplication *)application {
    // Save application data on termination
}

void HandleExceptions(NSException *exception) {
    DebugLog(@"This is where we save the application data during an exception");
    // Save application data on crash
}

void SignalHandler(int sig) {
    DebugLog(@"This is where we save the application data during a signal");
    // Save application data on crash
}

My own answer to the same question did not go as far as suggesting intercepting signals, but that is fundamental.

As I said, really great tip!


Apple introduces iOS 8 SDK and Xcode 6

Another post for InfoQ: “At its 2014 Worldwide Developer Conference, Apple announced its new mobile operating system, iOS 8, alongside new SDKs and development tools. New Apple software includes over 4000 new APIs, including all new frameworks such as HealthKit, HomeKit, and CloudKit, and enhancements to the platform’s gaming capabilities.”

Read it all.


My latest contribution to InfoQ

In the last couple of months, I have been contributing to InfoQ as a News Editor for the Mobile topic. Although this might not look like the most elegant behaviour, I will link to some of my writing for them here. Here is my latest one:

Android 4.1.1 Vulnerable to Reverse Heartbleed

Google announced last week that Android 4.1.1 is susceptible to the Heartbleed OpenSSL bug. While Android 4.1.1 is, according to Google, the only Android version vulnerable to Heartbleed, it remains in use in millions of smartphones and tablets. Android 4.1.1 devices have been shown to leak significant amounts of data…


What’s new in iOS 7 User Interface – 1

The new iOS 7 has radically changed the way the user interface is conceived on iOS. Indeed, the change has been so radical that over the past few months many not-so-positive comments and criticisms could be found on the web. Now that iOS 7 has officially shipped, this is starting to change, and many voices are expressing their true enthusiasm. The rush to update to iOS 7 does not seem to stop either, so it is already clear that iOS 7 will be a huge success in adoption.

In any case, apart from any criticism that one may have regarding how iOS 7 looks and feels, it is clear that it marks the beginning of a new era for mobile interfaces. After several weeks of use and learning through WWDC videos, I am going to summarize here what, as far as I understand, gives iOS 7 its definite character and makes it a radical change.

About being flat

You can hear a lot of talk about “flat UI” surrounding iOS 7, and in a sense iOS 7 is flat; but it is not just that. If the change brought by iOS 7 were just going beyond skeuomorphism, it would not have been so radical a change, as the plethora of flat UI apps already existing for iOS 6 (and Android) can prove. As an example, take the new Evernote release for iOS 7:

evernote for ios 7

It is surely a flat UI app, but does this seem like an iOS 7 app? Certainly not. Still, reviews of Evernote have been raving, and all of them (that I have read, that is) have praised Evernote for being a great iOS 7 app. To me, it looks more like an Android app. Even the most basic tenet of iOS 7 interface design – which is: content before all else – is not really respected, as the Evernote home screen shows. But this post is not just about Evernote, and if you look at this link, you will find several more examples of iOS 7 apps that would feel at home on an Android phone (alongside others that are pretty good).

Basic Principles

So, let’s start with the basics. If you want to design a great iOS 7 UI, you will need to take four principles into consideration:

    • clarity
    • deference
    • depth
    • detail

It is not just about being flat: iOS 7 is about clarity, deference, depth, and detail.

    Clarity is a seemingly simple yet complex word. Clarity can be associated with the following qualities (as they are registered in the New Oxford American Dictionary):

    • coherence
    • intelligibility
    • easy to see or hear
    • sharpness
    • transparency
    • purity

    All of those qualities are there in the apps made by Apple for iOS 7, and you can clearly see throughout the whole iOS 7 redesign how those qualities are really a strong mark of the iOS 7 user experience.

    Coherence: coherence is attained through several means: e.g., the now famous Ive icon grid; a new color palette; a completely redesigned icon set; etc. Even the new system icons have been designed in a way that makes them look good with the new system font!

    Intelligibility is the quality of something that can be understood. This is evident in the effort to give user content the most important role to play (see also the concept of deference), and even more, to present user content so that the most important information is the most evident. A great example of that is the weather app.

    Intelligibility also has to do with the clear separation between user content and active elements, the UI controls. And with the choice of using text for buttons: nothing comes close to text when it comes to intelligibility, of course. It is worth noting that the main point behind the decision to use purely textual buttons, i.e. to remove buttons’ borders, is intelligibility: no borders means more space available for useful information.

    Intelligibility also has to do with another key concept in the iOS 7 redesign: that of context, and of always being in context. This is so crucial that I will come back to it later (or in a later post).

    Being easy to see or hear may appear to be something close to being intelligible, and indeed it is, but it carries a more specific meaning with it. This is particularly visible in the new iOS 7 feature that allows the user to change the system font size. It is attained through the use of text styles in place of the older font face qualifications (e.g., bold face): now we have headers, titles, and so on. And this is made possible by an overall redesign of the typography system in iOS 7.

    Sharpness: A noteworthy feature of iOS 7’s new typography is its adaptive rendering of type: at smaller sizes, a typeface is rendered with a kind of boldening correction; at larger sizes, it is rendered with a lightening correction. This makes for a great, sharp effect which is possibly not so easy to notice. Have a look at this picture, where the central column represents how a normal, regular typeface is rendered on iOS 7, compared to how the same typeface is rendered in plain and bold styles. This should be a clear example of sharpness.

    As to transparency, this is one of the main tenets of iOS 7. Just look at the new Control Center, or at the navigation bar and toolbars and the way they let you see through them a blurred image of your content. This is not just for fun: it gives a completely different look to the UI, which somehow “adapts” to user content and changes so that it appears more in tune with it. Another great example of coherence, by the way. But the main tenet behind the use of transparency is context, as the Control Center case makes clear.

    Now, I will not make any attempt at defining or describing “purity”, the last quality I have listed for clarity. But is it not true that the iOS 7 home screen looks really pure, with all those whites and pastel colors?

    Now this post has become long enough, so I will move on to the rest of the principles behind the iOS 7 redesign in a new post.

    Enjoy for now the clarity brought by iOS 7.


    Core Graphics Image Interpolation Performance

    Recently, I have done quite a bit of Core Graphics programming and discovered that one key point in ensuring that an app performs well is choosing the right CGContext interpolation level. Interpolation, according to Wikipedia, “is a method of constructing new data points within the range of a discrete set of known data points.” Interpolating is what is done when you scale up an image, but also when you scale it down. In both cases, interpolation has a tremendous impact on the result of the scaling.

    Core Graphics CGContext allows 4 different levels of interpolation: kCGInterpolationNone, kCGInterpolationLow, kCGInterpolationMedium, and kCGInterpolationHigh. Their effect is pretty easy to describe: kCGInterpolationNone will give you the most jagged result of all; kCGInterpolationHigh will give you the smoothest. What is less clear from the outside is the impact interpolation will have on your app’s performance.

    So, I put up a benchmark test and here are the results.

    The test

    In my benchmark test, I draw a set of 6 different 649×649 b&w bitmap images to build a set of 12 different jigsaw puzzle tiles. The bitmap images are scaled down by a factor of 5.0 (i.e., to 129.8×129.8) to make them fit on a 320×480 display (at retina display scale).

    The outcomes

    Run times were measured using Instruments, considering only the time spent inside the CGContextDrawImage function, so as to remove all effects not related to the interpolation itself.

    The device used was a 4th gen iPod touch.

    – kCGInterpolationNone: 545 msec

    – kCGInterpolationLow: 969 msec

    – kCGInterpolationMedium: 1690 msec

    – kCGInterpolationHigh: 2694 msec

    As you can see, there is a big factor between no interpolation and high interpolation: High takes almost 5 times as long as None. The question arises whether this time is spent for some good sake or not, so let’s take the visual result into account.

    Visual Outcome

    As to the visual outcome, if you compare High or Medium to Low or None, the difference is staggering. Even the difference between None and Low is noticeable, IMO; while the difference between High and Medium is not particularly noticeable in this particular test.

    What is clear is the difference in run times, so in my particular case I ended up using `kCGInterpolationMedium`.



    Using iOS SDK 5 with Xcode 3

    I have lately been using a rather old MacBook Pro (Intel Core Duo 2 2.4 GHz, 4GB RAM) for development and found that it is pretty impractical to use Xcode 4 regularly on it, due to Xcode 4’s slow performance and huge memory footprint. So I looked for a way to run good old Xcode 3, which is fine in most respects and works reliably. It turned out not to be so difficult in the end, so I list here the steps I took to make it work: you simply need to copy a bunch of files from a newer Xcode 4.3 distribution to the proper Xcode 3 location.

    • ensure you are using Xcode 3.2.6 and make sure it is not running before executing the steps below;
    • copy the iOS 5 SDK files from a newer Xcode distribution (e.g., Xcode 4.3):
      sudo cp -a /PATH_TO_LATEST_XCODE/
    • copy the version.plist file (important: back up the original one before overwriting it!):
      sudo cp /PATH_TO_LATEST_XCODE/ /Developer/Platforms/iPhoneOS.platform/
    • copy the device support files for iOS 5:
      sudo cp -a /PATH_TO_LATEST_XCODE/*/Developer/Platforms/iPhoneOS.platform/DeviceSupport/

    If you want to ever go back to using the old original SDK distributed with Xcode 3, simply switch the version.plist file mentioned above with the backup copy.

    In any case, I would not suggest using this setup for building your app for submission to Apple. For that, I prefer using a later Xcode version, with a newer compiler and all the regular stuff.



    Video of The Odyssey for the iPad

    I think I have mentioned a few times a collateral project I worked on in the second half of 2011: The Odyssey for the iPad. (The iPhone was not considered from the start, due to its smaller size.) The project was born as a serious attempt at making an artistic and fiercely poetic version of the Odyssey, based on original drawings and literary research by Joma, a Barcelona-based artist, and on original music by Xavier Maristany, a Barcelona-based musician.

    In the end, we decided to cancel the project due to lack of funding (and the Odyssey is a huge work; we would have needed funding for at least 2 years’ work), but I think that the outcome, although only a prototype (hence of prototype-level quality and refinement), is worth describing here.

    The basic idea is illustrating the Odyssey through a selection of its main stories, events, and characters. To give an idea of the size of this work, suffice it to say that we were planning on having about 350 drawings. Each drawing was conceived as a mixture of images and the Greek text associated with the image, and the really clever part was the way in which the text and the drawing fit together.

    Here you can see an image from the prototype and understand what I mean.

    Each scene was meant to have some kind of animation, graphical effect, interaction, and music. By rotating the device, you could get a more traditional book-like view of the Odyssey, based on its text.

    So, with all this in mind, I built a framework to:

    1. display and browse a set of PDF files (each file was meant to cover one of the 24 Odyssey books) containing the base layout (header, footer, text, default image);
    2. enrich the PDF view by displaying on top of it an animation (conceived as a sequence of PNGs), or an OpenGL scene, or a Cocos2D scene, or a video;
    3. play background music associated with a PDF file, plus a specific musical comment for each given page.

    Apart from the framework, we worked out a couple of scenes as a proof of concept, and this also gave me the chance to integrate into the prototype some cool Cocos2D animation effects and an OpenGL 3D panoramic view.

    Since the app is just a prototype and cannot be made available through the App Store, I recently made a video of it. It is not professionally crafted, but it gives you an idea of what our Odyssey for the iPad could look like. If there are any interested investors out there…

    Watch a trailer of The Odyssey for iPad here.



    User Experience @ CES

    Among the many reports from CES 2012, this one is really interesting to me because it focuses on User Experience. Indeed, how boring all the hype around device specs is, when really all that matters is the experience that a device grants you! But this is just a reminder of how difficult it is to change mindset, forgetting about details no one really cares about and focusing on those details that really provide a benefit in terms of user experience and usability.

    The trends that CES discloses for this year are: 3D, touch, visual retrieval, and voice control as the frontiers of UI for 2012. This all makes sense and is somewhat confirmed by the recent patent Apple disclosed about a “three-dimensional (“3D”) display environment for mobile device”, not to mention the recent introduction of Siri, with its lights and shadows.

    Now, those (3D, touch, voice, visual) are just words that appeal to our minds, but they are still far from ensuring a User Experience that is pleasant and effective (as the Siri case itself is there to show). Touch technologies existed well before the iPhone made its appearance, yet it was only with the iPhone’s introduction that they got to the level of refinement, coherence, and completeness that made them turbo-boom in the market.

    Ideas are OK; what matters most is how they are made into real things.


    Quick iPhone app for offline traveling

    Offline traveling may well be a neologism. The fact is that I am about to set off on a 2-week trip to the South of Italy. Now, I will bring my iPod touch with me, but it will be of limited use, since it has no 3G capability, and anyway 3G roaming is expensive; furthermore, it is not always available as soon as you get away from the beaten track.

    Truly sad. But I did not want to do without it and the help it can offer when you are looking for your car route, for a hotel nearby, or for information about the place you are going to visit. An iPod/iPhone is a great tool when you are traveling, with all the information that is available on the web. So I decided to quickly hack a small app packaging:

    • information from the local Tourist Offices about places, events, and accommodations;
    • the Wikipedia wealth of historical, geographical, and tourist information;
    • a route map.

    The main requirement was that all of this could work offline. It was fun: the hardest part was downloading the Wikipedia portals about Basilicata and Puglia without also downloading the whole of Wikipedia, and the most exciting part was integrating an offline map.

    wget-ting Wikipedia

    Downloading Wikipedia for offline viewing is not an easy task, due to its ramifications and to the lack of any kind of URL-based structuring of its information. So you eventually end up downloading much more information than you really want. There are some tricks you can use to improve the filtering of Wikipedia pages; specifically, you can skip all “index.php?” pages and all “File:” pages. I found that the following wget command worked pretty well for me:

    wget -R "*index.php*.*,*File*.*" -D "localhost,," --random-wait -E -r -p -e robots=off -U mozilla -l1 --convert-links -H -nd -nc http://...

    It produced 3000 files, including pictures, perfectly browsable offline. You can see that I was interested just in the information available on the main portal pages and on their immediate offspring. Including the “File:” pages as well would have produced almost 20 thousand files.

    Route Map

    I found a very good guide about embedding an offline map into your iPhone app at this site.

    Getting “” to download a map from OpenStreetMap was a breeze, as was converting the downloaded tiles by means of map2sqlite.

    What came next, i.e., compiling route-me and linking it to my app, was also a snap, but because the above page is a bit outdated, the process felt a bit clumsy. Things are actually easier than what you can read there, so in the end I had no problem at all. It seems that the latest “route-me” from GitHub already includes all the changes you are asked to make on the gisnotes page, except for commenting out the assert in the “setTileSource” method in the “RMMapContents.m” file.

    Integrating the “WebMap” view is also very easy if you have a look at the RouteMeSampleMapDBOffline sample.



    The Sundays’ Programmers

    Lazy Saturday, today. I have just had a look at Stack Overflow; you know, just out of that sort of conditioned reflex we all know; and I actually found two gems of questions. Look at them in the two pictures below.

    To me, the inability to explain what you tried to do with a minimum of clarity, so that anyone else can understand anything, is the clearest sign that you do not have a clue about what you are doing. But, hey, it’s Saturday; this must be the Sunday Programmer in action!