Apple Swift – A step in the right direction (or is it?)

One of the big announcements by Apple at this week’s World Wide Developer Conference was Swift, a developer-focused tool intended to make it a lot easier to create applications (well, iOS and OS X applications, anyway).

By creating Swift, Apple is recognizing what most of us already know – creating mobile applications (or any applications) is painful, slow, complex, and open to a relatively small portion of the mobile community. In this article about the Swift announcement, Frank Bentley, a principal research associate at Yahoo Labs who teaches mobile programming at MIT, is quoted:

Historically, iOS “tends to trip up some students,” he says, with some of its more complex features that relate to things like memory management and bracket notation. When he took a first look at Swift, he saw that those barriers were gone.

The technologies behind the languages used for mobile application development have not changed or improved significantly in 20 years. Yes, you can layer visual tools and pretty editors over them, but they are still complex, esoteric beasts, and are being used to develop applications which are in many ways orders of magnitude more complex than software developed when these tools were invented.

I congratulate Apple for recognizing the problem, and taking steps to address it.

But does Swift go far enough? And is it the right direction?

Here are a few of my concerns:

  • It is yet another platform-specific tool. It does nothing to aid in the development of portable solutions (though this is, understandably, not Apple’s business model), and ignores industry trends towards mobile web apps.
  • It targets coders specifically. Yes, it will probably make Apple developers more productive. But it is still code. It is yet another syntax to learn, which is similar to but different from all the other languages out there, and is not significantly more accessible to non-coders than other programming languages.
  • As an educational tool to attract more kids and non-coders into development – it is not really an enabler of anything, just an incremental evolution of existing tools.

I would hope that in an industry like ours, we would have moved past the era of tools for platform-specific, non-portable solutions.

I would hope that by this point we could move beyond code- and syntax-focused languages, even the ones with pretty visual tools duct-taped over the front.

I would hope we would have tools which allow us to develop portable, freely distributable applications using truly visual means, without having to resort to manual coding, so that non-coders can focus on developing solutions and implementing their ideas, rather than on learning syntax.

I guess that is why I am part of a team fulfilling that exact vision.

PS – if you would like to see how we have made that vision a reality, sign up for our developer beta which is starting very soon.

Guest Article: Fred Yeomans is the VP of Technology at Agora Mobile




  1. […] out my post entitled Apple Swift – A step in the right direction (or is it?) over in the Vizwik […]

  2. Zing says:

I would say that Swift doesn’t really solve any of the problems with iOS development. If you wanted to use a more familiar-looking language, there were already other options available.

But anyway, let’s look at the problems in programming which a modern language would ideally be trying to solve, and see how Swift stacks up:

    * The null problem. Null should never have been invented and every language which imported the concept suffers. Swift introduces “optionals”, but let’s face it, optional is just another word for “a wrapper object which is either null or the value.”

    * Concurrency. As far as I can tell, Swift brings nothing to the table here.

* Immutability. Related to concurrency, but it has already been shown that languages with immutable types not only run faster under concurrency, but are easier to reason about.

    * Unicode correctness. If you were using Cocoa already, you probably already had this. Nobody has it in the language, though. Absolutely nobody. Not even Swift.

    * Representing everything as strings. People are fond of representing every kind of character data with the same “string” class, and the result is bugs:
    * Accidentally showing non-localised strings to the user.
    * SQL injection vulnerabilities
    * Cross-site scripting vulnerabilities
    All of these are possible because the same type is being abused for multiple purposes. Since Swift inherits the Objective-C platform, it can’t solve this either.
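    For what it’s worth, the optionals and immutability points above can be sketched in a few lines of Swift (the names and values here are hypothetical, purely for illustration):

    ```swift
    // Optionals: an Int? holds either a value or nil, as Zing describes –
    // effectively a wrapper that is either empty or contains the value.
    let maybeAge: Int? = 42

    if let age = maybeAge {          // unwrap only when a value is present
        print("Age is \(age)")
    } else {
        print("No age given")
    }

    // Immutability: `let` declares a constant; reassigning it is a
    // compile-time error, while `var` declares a mutable variable.
    let greeting = "hello"
    // greeting = "goodbye"          // error: cannot assign to a `let` constant
    var counter = 0
    counter += 1
    ```

    So Swift does expose immutability at the declaration level, though that is a far cry from the immutable-by-default data types Zing is pointing at.
    
    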

    • Fred Yeomans says:

      Hi Zing – I appreciate your points. I really wanted to avoid getting into the details of the language because it is exactly that, just another text-based language.

      There are other, more paradigm-shifting (sorry, cringed a little there just typing “paradigm-shifting”) approaches to solving the code crisis. Like Vizwik, for example.