Make apps for a global audience: a new approach to empowering NSAttributedString

When working on an app that is released worldwide, one of the common topics is how to style text based on the user's language.

At WWDC 2018, Apple engineers gave an awesome talk, Creating Apps for a Global Audience, during which they provided countless interesting suggestions about managing different languages in apps.

So, what’s the problem?

When we want to apply different styles to our text-based controls, we usually use NSAttributedString. By doing so, we can easily apply different fonts or text colours to parts of our text. You can find tons of articles about this on the internet.

Instead, what we want to achieve is the dynamic application of different styles to the UI based on the language itself (to be precise, on the Unicode script of the current text), like:

As you can see from these images, the descriptions include Arabic, Japanese and Latin characters. When we want to apply different character-based fonts, we have two options:

  1. Buy a font which includes both (or several) languages and so different sets of possible characters (glyphs);
  2. Use Apple’s way of specifying a font fallback through Core Text (the cascade list), which is not fully documented: if a glyph is not in your app's font, there is always a way to show a glyph from another font;

However, both solutions have the same problem: they don't help when the selected font already includes different character sets. And neither solution is powerful enough: what Apple doesn't provide is an easy way to apply all the styles we need (not only different fonts) to UIKit controls, based on the characters populating the UI.
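The second option above can be sketched with UIFontDescriptor's cascade list. A minimal sketch follows; the font names are assumptions, so substitute the fonts actually bundled with your app:

```swift
import UIKit

// A sketch of option 2: a cascade list on a UIFontDescriptor.
// The font names here are assumptions; use the fonts bundled with your app.
let arabicFallback = UIFontDescriptor(fontAttributes: [.name: "GeezaPro"])

let descriptor = UIFont(name: "HelveticaNeue", size: 15)!
    .fontDescriptor
    .addingAttributes([.cascadeList: [arabicFallback]])

// Glyphs missing from Helvetica Neue fall back to Geeza Pro.
let label = UILabel()
label.font = UIFont(descriptor: descriptor, size: 15)
```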

Our approach

To understand the approach we implemented, we need to focus on how Unicode.Scalar works.

From unicode.org:

Q: How many languages are covered by Unicode?

A: It’s hard to say, because Unicode encodes scripts for languages, rather than languages per se. Many scripts (especially the Latin script) are used to write a large number of languages. The easiest answer is that Unicode covers all of the languages that can be written in the following scripts: Latin, Greek, Cyrillic, Armenian, Hebrew, Arabic, Syriac, Thaana, Devanagari, Bengali, Gurmukhi, Oriya, Tamil, Telugu, Kannada, Malayalam, Sinhala, Thai, Lao, Tibetan, Myanmar, Georgian, Hangul, Ethiopic, Cherokee, Canadian Aboriginal Syllabics, Khmer, Mongolian, Han (Japanese, Chinese, Korean ideographs), Hiragana, Katakana, and Yi.

So what the Unicode standard defines is the unique set of characters that we can use to write each language in the world.

Our approach to text styling is to focus on the Unicode scalars instead of the language itself. For instance, it would be nice to have a way to say:

I want to apply font size 10 and a blue colour to Greek characters, and font size 15 and a green colour to Latin characters

right?
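The building block is already there: every Unicode.Scalar carries a code point we can match against a script's ranges. A minimal sketch (only the basic Latin and Greek blocks are covered here; a real implementation would cover every block of each script):

```swift
// Map each Unicode scalar to a script by its code-point range.
// Only the basic Latin and Greek blocks are covered in this sketch.
enum Script { case latin, greek, other }

func script(of scalar: Unicode.Scalar) -> Script {
    switch scalar.value {
    case 0x0041...0x005A, 0x0061...0x007A: return .latin
    case 0x0370...0x03FF: return .greek
    default: return .other
    }
}

let text = "Size 15 for me, Ελλάδα"
// Grouping consecutive scalars by script would give us the ranges to style.
let scripts = text.unicodeScalars.map(script(of:))
```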

Introducing CascadeKit

We have a solution for that, and it has been implemented in our CascadeKit library.

Let’s say we have the following text:

That's a mix of Latin, Greek and Russian characters. Let's say we want to apply specific styles only to the Greek and Russian (Cyrillic) scripts:

  1. Blue background colour, font Helvetica Neue, size 15 to Greek characters;
  2. Red background colour, white text colour, font Courier, size 18 to Russian characters;

Well, this is easily achievable with the CascadeKit NSMutableAttributedString extension:
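A sketch of what the call looks like; the exact signature is an assumption, so please check the CascadeKit repo for the real API:

```swift
import UIKit
import CascadeKit

let text = NSMutableAttributedString(string: "Hello Ελλάδα Россия")

// A sketch of the extension usage: the exact signature is an assumption,
// please check the CascadeKit repo for the real API.
text.addAttributes(for: [.greek, .cyrillic]) { range in
    // Each emitted range contains only Greek or Cyrillic characters.
    text.addAttribute(.backgroundColor, value: UIColor.blue, range: range)
}
```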

What this new extension does is:

  1. It asks to apply specific styles only to two scripts, Greek and Cyrillic (the script Russian text is written in);
  2. The addAttributes callback “emits” the N detected ranges of characters that are populated with Greek or Cyrillic characters;
  3. Inside the callback, we apply our styles.

And that’s it. Regardless of my style in picking colours, the result is

Cool right ? 🙂


Please check out CascadeKit repo here and let us know what you think.

Any feedback is greatly appreciated!🙏🏼


kudos to Daniele Bogo for the massive contribution to this library🙏🏼

This post was originally published on Medium but, since Medium isn't accessible from every country in the world, I've copied the article here.

Continue Reading

StorageKit: What, How and What’s next

StorageKit?

We tried to explain what StorageKit is here:

StorageKit is a framework that reduces the complexity of managing a persistent layer. You can easily manage your favorite persistent framework (Core Data / Realm at the moment), accessing them through a high-level interface.
Our mission is keeping the persistence layer isolated as much as possible from the client codebase. In this way, you can just focus on developing your app. Moreover, you can migrate to another persistent framework easily, keeping the same interface: StorageKit will do almost everything for you.

The idea is to remove the persistence complexity and let you focus on what you have to develop.

You can instantiate your favourite storage in this way

and then you can use the storage to perform CRUD operations on the persistence layer through its main context:
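A sketch of both steps, instantiation and CRUD; the API names below are recalled from the StorageKit README and may not be exact, so check the repo for the real interface:

```swift
import StorageKit

// Instantiate the storage you prefer (Core Data here).
// API names are recalled from the StorageKit README and may not be exact.
let storage = StorageKit.addStorage(type: .CoreData)

// Then perform CRUD operations through the storage's main context.
let context = storage.mainContext
let user: User? = try? context.create()
user?.name = "John"
```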

That’s it! 🎉

Continue Reading

Back to the fundamentals: Sorting algorithms in Swift (from scratch!)

Recently I decided to re-read the amazing Cormen et al. book, Introduction to Algorithms, and it threw me back to my university days.

So I wanted to write this post for my own knowledge, but also because questions like “What's the fastest sorting algorithm?” or “How do you write bubble sort?” come up all over the web and during iOS interviews. So maybe this post can help you prepare for your next interview 🙂

The topics covered in this article are:

Big O Notation

The first topic to cover before learning about sorting algorithms is the concept behind the Big O Notation, or before this, the concept of asymptotic growth of an algorithm:

[…] That is, we are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound. Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

In this definition taken from the Cormen Book, there are two underlying concepts:

  1. asymptotic growth is something we use to compare algorithms, so we can properly answer questions like “What's the fastest sorting algorithm?” by comparing growth rates;
  2. it's a measure of the performance of an algorithm as a function of the size of its input;

With this definition in mind, we can skip any further formal definition and refer to the Big O Notation through sentences like these:

  1. “This step is O(1)” means that, regardless of the input size, the step takes a constant amount of time;
  2. “This cycle is O(N), where N is the input size” means that the running time of the cycle grows linearly with the input size; if one cycle is O(N) and another on the same input is O(N²), then the first is asymptotically faster than the second;
  3. If an algorithm is O(2*N) or O(N+N), we just say it's O(N): constants don't matter here!

Generally speaking, we can refer to asymptotic growth in different scenarios, but we usually refer to the worst-case time complexity: this is the Big O Notation we will use! (Hint: complexity can be expressed in terms of time or space.)

Let’s now move to talk about the most famous sorting algorithms 🤓
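As a warm-up for the interview question mentioned above, here is bubble sort from scratch in Swift, annotated with its complexity:

```swift
// Bubble sort: repeatedly swap adjacent out-of-order elements.
// Worst-case time complexity: O(N²); space complexity: O(1).
func bubbleSort<T: Comparable>(_ input: [T]) -> [T] {
    var array = input
    guard array.count > 1 else { return array }
    for i in 0..<array.count {
        // After each pass the largest remaining element "bubbles" to the end.
        for j in 0..<array.count - i - 1 where array[j] > array[j + 1] {
            array.swapAt(j, j + 1)
        }
    }
    return array
}

bubbleSort([5, 1, 4, 2, 8]) // [1, 2, 4, 5, 8]
```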

Continue Reading

Isolating DB layer dependency for a better app architecture: a use case with Realm

If you are an iOS developer and you haven't been hiding in the middle of nowhere for the last three years, you should know what Realm is: basically, it's an awesome dependency for your app.

Over the last few years we have seen other dependencies, ones that helped a lot of people build thousands of apps, get shut down. Yes, I'm referring to Parse.

So I'm going to use Realm for my latest side project, but my main goal now is to completely (or as much as possible) isolate the persistence layer from its concrete implementation (yes, you know, SOLID is just around the corner).

The first goal of introducing a layer like this one is that any client can program to an interface without caring at all about the persistence layer implementation. Another reason is that you can easily change the database implementation, or the persistence layer itself, without affecting the rest of the codebase.
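A minimal sketch of such a layer (the names are mine, not from any library): clients depend only on the protocol, and a Realm-backed type would be just one possible conformance.

```swift
// Clients program against this protocol and never see the concrete DB.
protocol PersistenceService {
    func save(_ note: Note)
    func notes() -> [Note]
}

struct Note: Equatable {
    let id: String
    let text: String
}

// One possible conformance; a RealmPersistenceService would be another,
// and swapping them doesn't affect the rest of the codebase.
final class InMemoryPersistenceService: PersistenceService {
    private var storage: [Note] = []
    func save(_ note: Note) { storage.append(note) }
    func notes() -> [Note] { storage }
}
```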

Continue Reading

CoreNFC tutorial

Well, when Apple presented iBeacon at WWDC 2013, I bet that one year later they would present NFC support. But it's well known that I'm a terrible punter, and Apple has only finally introduced NFC support this year.

This means that, starting from iOS 11 (on iPhone 7/iPhone 7 Plus and future devices), our apps can leverage NFC tags to get contextual info about the physical world around the user (especially for location-aware apps, physical marketing campaigns, ..).

As described in the documentation:

Using Core NFC, you can read Near Field Communication (NFC) tags of types 1 through 5 that contain data in the NFC Data Exchange Format (NDEF)

So this means that, for now, it's not possible to write NFC tags or do card emulation.

NDEF

NDEF (NFC Data Exchange Format) is a protocol defined by the NFC Forum that describes how an NFC tag must expose its data. Every NDEF record is made up of two components:

  1. A record type (check out this doc)
  2. Payload data

Once a device (e.g. an iPhone or an Apple Watch) that is ready to read data from an NFC tag is placed near the tag, these components are exchanged between the NFC tag and the reader.

CoreNFC is able to interact with NFC tag types 1, 2, 3, 4 and 5.

Unfortunately, the format is not fully supported: it seems that record types like MIME type or URI are not included in the framework (or maybe we have to create custom classes to support them? 🤔)
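Putting it together, reading NDEF messages boils down to starting a session and implementing two delegate methods. A minimal sketch; it must run on a real NFC-capable device with the NFC entitlement enabled:

```swift
import CoreNFC

final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {

    private var session: NFCNDEFReaderSession?

    func beginScanning() {
        // queue: nil uses a default queue; the session ends after the first read.
        session = NFCNDEFReaderSession(delegate: self,
                                       queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for message in messages {
            for record in message.records {
                // Each record exposes its type name format and raw payload.
                print(record.typeNameFormat, record.payload)
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called on errors and when the session ends.
        print(error.localizedDescription)
    }
}
```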

Continue Reading

UITests: app resume from background thanks to Siri

This is a quick post about a new feature added by Apple in Xcode 8.3 that let me test an edge case through UI tests.

The problem

Let's say we have business logic that is only applied when the app is resumed from the background, and that when this happens the app navigation changes because of that logic.

Up to Xcode 8.2, every time a new UI test is executed the app process is killed and the app is launched from scratch.

Browsing the Apple docs, I noticed that they have added access to Siri through XCUIDevice, and the only method it exposes is:

XCUIDevice.shared().siriService.activate(voiceRecognitionText: "")

This lets me simulate the flow in which the user puts the app in the background (using the Home button) and then resumes the app using Siri.
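The whole flow in a UI test looks like this (Swift 3 / Xcode 8.3 syntax; the Siri utterance and the assertion on the resumed screen are assumptions about a hypothetical app):

```swift
import XCTest

class ResumeFromBackgroundTests: XCTestCase {

    func testResumeFromBackgroundViaSiri() {
        let app = XCUIApplication()
        app.launch()

        // Send the app to the background with the Home button…
        XCUIDevice.shared().press(.home)

        // …then resume it through Siri. The utterance is an assumption:
        // use the name your app is registered with.
        XCUIDevice.shared().siriService.activate(voiceRecognitionText: "Open MyApp")

        // Assert that the resume-from-background business logic kicked in
        // (the identifier below is a placeholder, not a real element).
        XCTAssertTrue(app.otherElements["resumedScreen"].exists)
    }
}
```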

Continue Reading

Functional Lenses: an exploration in Swift

The context

The last feature I implemented in the NowTV app is a cache that stores a complex data structure: basically a subset of a JSON response containing multiple nested models.

This JSON is translated into something like

Lately I've been studying the theory behind functional programming, and I was wondering how to access/update data stored in the cache in a functional way. Thanks to manub I ended up at lenses and, even though I'm not an expert at all in FP, the aim of this post is just to share what I learned, from a lens-newbie perspective 🙂
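For readers new to the concept, a lens is just a getter/setter pair packaged as a value, and lenses compose. A minimal sketch in Swift (the model types are invented for illustration):

```swift
// A lens packages an immutable getter/setter pair for a Whole/Part.
struct Lens<Whole, Part> {
    let get: (Whole) -> Part
    let set: (Part, Whole) -> Whole
}

// Lenses compose: a lens on the whole plus a lens on the part
// gives a lens on the nested part.
func compose<A, B, C>(_ outer: Lens<A, B>, _ inner: Lens<B, C>) -> Lens<A, C> {
    Lens<A, C>(
        get: { inner.get(outer.get($0)) },
        set: { c, a in outer.set(inner.set(c, outer.get(a)), a) }
    )
}

struct Address { let city: String }
struct User { let address: Address }

let addressLens = Lens<User, Address>(
    get: { $0.address },
    set: { address, _ in User(address: address) }
)
let cityLens = Lens<Address, String>(
    get: { $0.city },
    set: { city, _ in Address(city: city) }
)

// A composed lens reads and updates the nested value immutably.
let userCity = compose(addressLens, cityLens)
let updated = userCity.set("London", User(address: Address(city: "Rome")))
// updated.address.city == "London"
```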

Continue Reading