Why Objective-C is doomed

My recent LinkedIn article ended with the conclusion:

One thing that is clear to observers – Objective-C’s days are numbered. Apple management are not shy about killing off technologies that no longer suit them, and once they feel maintaining Objective-C is preventing further improvement, it will be given an official end of life. This won’t happen overnight, but it will happen, and Objective-C developers ignore the warning signs at their own peril.

This resulted in an animated comment discussion – many developers who are professionally (and emotionally) tied to Objective-C are understandably resistant to the concept of its eventual demise. However, I still feel the writing is on the wall for Objective-C. I think this particular aspect deserves a bit more treatment, so I’m presenting the reasoning behind this viewpoint here.

Swift is transitional, not complementary

Apple have done a fantastic job making Swift and Objective-C code interoperate cleanly, and this leads some to believe that the two languages can coexist indefinitely. I get the feeling these developers regard Swift in the same vein as CoffeeScript – an alternative syntax that compiles down to exactly the same code, delivering some convenience and terseness to the source without fundamentally changing the underlying programming model.

However, this isn’t an accurate comparison. Evan Swick posted a detailed look at Swift internals – if I can ineptly summarise, Swift, like Objective-C, is implemented on top of the objc runtime, but differs in key areas like method dispatch. It’s a peer to Objective-C rather than just a ‘client language’ compiling down to the same executable.
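To make the dispatch point concrete, here’s a minimal sketch (my own illustrative class on an Apple platform, not taken from Evan’s post) of how Swift only routes a call through the Objective-C machinery when explicitly asked to:

```swift
import Foundation

class Greeter: NSObject {
    // Dispatched via a vtable (or statically); the objc runtime is not consulted.
    func greet() -> String { return "resolved by Swift dispatch" }

    // Opting back into the Objective-C runtime's objc_msgSend dispatch.
    @objc dynamic func legacyGreet() -> String { return "resolved via objc_msgSend" }
}

let greeter = Greeter()
print(greeter.greet())
print(greeter.legacyGreet())
```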

At a higher level, Swift departs from Objective-C in other aspects. Type-safety and the monadic option type will result in much more reliable apps, the various functional programming aspects will enable us to design software in different ways, and the (intentionally) less capable error handling and reflection features will force us to do so.
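As a small taste of what the optional type buys us (the function below is invented for illustration, not an Apple API), the compiler simply won’t let a possibly-missing value be used until the nil case has been handled:

```swift
// Dictionary lookups return an optional: the value may not be present.
func port(for service: String) -> Int? {
    let wellKnown = ["http": 80, "https": 443]
    return wellKnown[service]
}

if let https = port(for: "https") {
    print("Connecting on port \(https)")   // only runs when a value actually exists
} else {
    print("Unknown service")               // the nil path cannot be silently ignored
}
```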

For application development, Swift will be a better tool for the job by almost any measure. I can’t envision a situation where, on an ongoing basis (and purely on language merit), Objective-C will be used for some parts of a new application, and Swift will be used for others. Swift is interoperable with Objective-C only so that we (and Apple) can leverage our existing code and slowly transition to the new world, not so that we have additional language choices for development.

Swift code will outgrow Objective-C

Swift programs inherit the standard Cocoa design patterns by virtue of using the Cocoa frameworks, but there are places the Swift programming model is likely to go where Objective-C can’t follow. Some of the issues are apparent already:

  • Objective-C APIs return every object as an (implicitly unwrapped) optional and every primitive as non-optional, making optionals much less useful than they are in vanilla Swift code.
  • Objective-C APIs always deal in non-generic collections, resulting in a lot of typecasts and erosion of the benefits of type-safety (a short sketch of both issues follows this list).
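Here’s a rough sketch of both issues, using NSArray as a stand-in for a typical un-annotated Objective-C API (the data is invented):

```swift
import Foundation

// An Objective-C-style collection arrives untyped: elements are just objects.
let legacyNames: NSArray = ["Alice", "Bob", 42]

for element in legacyNames {
    // Every element needs a cast before the compiler can help us.
    if let name = element as? String {
        print("Name: \(name)")
    } else {
        print("Not a name at all: \(element)")
    }
}

// The vanilla Swift equivalent is checked at compile time instead.
let names: [String] = ["Alice", "Bob"]   // a stray 42 in here would not compile
for name in names {
    print("Name: \(name)")               // no casts required
}
```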

Swift will also diverge through its new language features. Generics, ADTs, and top-level and curried functions will allow new styles of API, but none of these are available from Objective-C. We’ll initially see third party frameworks adopt a ‘Swift-first’ or ‘Swift-only’ API approach, but I expect there’ll be increasing pressure from developers for Apple to follow suit. Note this isn’t necessarily a rational process – see Erica Sadun’s recent anecdote on ‘Cocoaphobia’.
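As a quick illustration of the kind of API shape I mean (the types below are invented), a generic enum with associated values is perfectly natural Swift, but neither the generic parameter nor the associated values can be expressed in an Objective-C header, so an API built this way is Swift-only by construction:

```swift
// A result-style algebraic data type, generic over its success value.
// None of this can be marked @objc or called from Objective-C.
enum FetchResult<Value> {
    case success(Value)
    case failure(reason: String)
}

func describe<Value>(_ result: FetchResult<Value>) -> String {
    switch result {
    case .success(let value):
        return "Fetched \(value)"
    case .failure(let reason):
        return "Failed: \(reason)"
    }
}

print(describe(FetchResult.success([1, 2, 3])))                // Fetched [1, 2, 3]
print(describe(FetchResult<Int>.failure(reason: "timed out"))) // Failed: timed out
```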

As an aside, I’ve been trying to think of similar scenarios where a platform vendor has replaced their core application language. The plethora of alternative JVM languages are all community-driven. Microsoft successfully replaced VB6 with .NET, although this also involved rewriting the application frameworks, and of course Win32 and MFC refuse to die. F# was born within Microsoft (Research), but they never really pushed it as a mainstream C# replacement – instead they’ve rolled F#-inspired enhancements into the C# language and spun off F# as a community project.

Ultimately, Apple will not be able to continue advancing the platform & the tools while maintaining backward compatibility with Objective-C. We’ll start to see new frameworks released as ‘Swift-only’, and eventually I’d expect to see Foundation and AppKit/UIKit replaced with Swift equivalents.

Objective-C is not indispensable

It’s true that Swift isn’t capable of seamlessly interleaving C & assembly like Objective-C can, but this isn’t a deal-breaker for an application programming language – most other languages in this category (Ruby, C#, Java, Python, etc) can’t do this either.

The argument I’ve seen posed is that, because of the C-level compatibility, Apple will always need Objective-C around to perform lower-level tasks (e.g. building the frameworks and tools), and will therefore continue making it available for third party devs to use.

There are a few problems with this:

  • Apple are quite okay keeping tools around for internal use only (e.g. YellowBox for Windows, which was used by WebObjects for a while). There’s a lot more involved in fully supporting a technology for third party developers vs using it internally.
  • The bulk of iOS & OS X’s low level code is written in C, C++, and assembly (like every other platform) – e.g. the kernel, the BSD subsystem, clang & LLVM, the objc runtime, CoreFoundation, CoreGraphics, OpenCL, Metal, WebKit, CoreAudio etc. I think most application developers just see the Cocoa layer and tend to overestimate its importance in the scheme of things.
  • The ‘Objective’ part of Objective-C is even less capable than Swift of expressing low-level ‘unsafe’ code, so where there is low-level code in the Foundation framework, you can conceptually regard it as being written in C.

The main reason Objective-C is used to build the application frameworks is that it’s the language used for building applications. Once we transition to Swift for applications, Objective-C will remain in those frameworks only for legacy reasons.

Doomed! Doomed I say!

I’ll finish with a quote from Bill Gates, of all people:

“We always overestimate the change that will occur in the next two years and underestimate the change that will occur in the next ten.”

Anyone today claiming iOS & OS X developers “don’t need to learn Objective-C” is jumping the gun – it’s going to be difficult to be a good developer without knowing the language the bulk of the code is written in. I expect Objective-C will still be around for most, if not all, of those ten years.

However, anyone assuming that Objective-C will stick around forever is ignoring the warning signs. Swift and the Mac & iOS platforms will outlast Objective-C, hopefully by a significant margin.

 


Why Swift was a better option than Xamarin

Matt Baxter-Reynolds over at ZDNet claims that Swift is “a total mishit that’s badly judged the state of the software development industry” and that “What Apple should have done, years ago, was bought Xamarin”. I disagree with this on a number of levels.

I think the argument hinges on the flawed concept that ‘if only’ Apple supported some sort of standard third party language, switching apps & developers between platforms would be easy and everybody would win. This doesn’t make any sense:

  • Learning a new language takes only a few weeks; learning frameworks, tools, UI idioms, platform processes and cloud service models takes years. I think there’s an implicit and significant overestimation of the actual benefit to developers of using an existing language.
  • Apple could go further and adopt wholesale a full-stack technology like J2ME or Xamarin, but this would eviscerate their ability to innovate & differentiate their platform. I’m struggling to see how this could possibly be attractive to them. There’s a definite sense of entitlement in demanding Apple cripple their business to make it slightly easier for third party developers to add value to competitor platforms.

I’d go further and say the industry as a whole would be negatively impacted by ‘language homogenisation’ — having a single language across all platforms would stifle the development of new languages and technologies that work to make the industry better. Would we all genuinely be happier and more productive if Java 6 were the only mobile development language?

A couple of other choice quotes:

When I first heard about Swift I was pleased as I assumed that Apple would look to solve the key problem faced by mobile developers — specifically that there is little overlap between developer toolsets making cross-platform mobile development extremely difficult.

I’m one of those developers and that ‘key problem’ is barely even on my list of problems. If it was, I certainly wouldn’t assume it was Apple’s responsibility to solve it.

No, the problems I battle with most are diagnosing obscure app crashes, dealing with header files, dynamic typing issues, forgetting asterisks, counting square brackets and the like (not to mention f*#!ingblocksyntax). Swift goes a long way towards removing this pain.
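For anyone who hasn’t had the pleasure, here’s a rough before-and-after for the last of those pain points. The completion-handler property is made up for illustration; the Objective-C block declaration is shown as a comment, with the Swift closure equivalent below it:

```swift
import Foundation

// The Objective-C declaration is the sort of thing that site exists to decode:
//
//   @property (nonatomic, copy) void (^completion)(NSData *data, NSError *error);
//
// The Swift equivalent is just a typed, optional closure property:
var completion: ((Data?, Error?) -> Void)?

completion = { data, error in
    if let data = data {
        print("Received \(data.count) bytes")
    } else {
        print("Failed: \(error?.localizedDescription ?? "no error info")")
    }
}

completion?(Data("hello".utf8), nil)   // prints "Received 5 bytes"
```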

It’s been a long time since the software development community accepted commercial, “ivory tower” organisations to dictate engineering approaches. The last time this happened was back in the era that brought us Java and .NET itself. Now the community itself decides.

It should make Matt happy to know that the community itself DOES have the ability to decide — there are plenty of cross-platform mobile dev environments available today, including Xamarin. The fact the community hasn’t decided to replace ‘native’ development with these platforms should prompt him to wonder why (and I don’t believe it’s due to them not being first party).

Oh come on — it doesn’t show the best in modern language thinking at all. Modern language thinking comes from the community, building incrementally within that community in a way that’s genuinely helpful to all of us.

I like to think of programming languages as a window into the general attitudes towards language design at the time they were created. Objective-C is clearly a child of the Smalltalk era. .NET was influenced by Java, which was influenced by C++. The language elements I see in Swift are a reflection of thoroughly modern thinking from (yes, community-driven) languages like Go, Rust, F#, Ruby, Python and Haskell.

It sounds like Matt’s arguing only open source languages have the right to be called ‘modern’, which seems a bit narrow-minded to me.

In essence, Apple had one job — create a new baseline tooling for iOS and show a sympatico approach with how the rest of the industry actually operates — and they blew it.

It looks like he’s called out two jobs there. It’s too early to judge whether they’ve blown the first (and most important) job, but at this stage they certainly appear to have delivered the goods. As for the second, maybe I just don’t understand ‘how the rest of the industry actually operates’, but Platform |> Developer Tools |> Software |> Profit! would be my guess. I get the feeling Matt looked at the language, decided he didn’t like it, and took it as a personal affront that Apple didn’t just adopt a language he was already comfortable with.

Lastly, I’ll just add that many commentators questioning the need for Swift seem to be assuming the language selection process was a relatively trivial window-shopping exercise — i.e. “Hey, this one looks nice!” However, that oversimplifies the decision: a number of factors needed to be considered, and key amongst them was seamless compatibility with existing C & Objective-C code. None of the existing solutions (like Xamarin) come close to doing this as well as Swift.