After developing calculus, actuarial tables, and the mechanical calculator, and after coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since childhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis: a language that perfectly represented all scientific truths and would make discovering new insights as easy as writing grammatically correct sentences. This “alphabet of human thought” was meant to leave no room for falsehoods or ambiguity, and Leibniz kept working on it until the end of his life.
In modern times, a version of Leibniz’s ambition lives on in programming languages. They don’t encompass the entirety of the physical and philosophical universe, but they map onto the next best thing: the ones and zeroes of a computer’s internal state, binary arithmetic being another of Leibniz’s inventions. Computer scientists who dare to create new languages chase their own characteristica universalis, dreaming of systems that would let developers write code so expressive that it leaves bugs no place to hide and so clear that comments, documentation, and unit tests become unnecessary.
This expressiveness, of course, owes as much to personal taste as to information theory. Just as the album Countdown to Ecstasy inspired in one listener a lifelong appreciation for Steely Dan, that same person’s preference in programming languages was shaped most of all by the first language they learned on their own: Objective-C.
To call Objective-C metaphysically divine, or even a good language, may strike some as akin to appreciating Shakespeare in Pig Latin. Objective-C is, at best, divisive. Derided for its verbosity and its odd square brackets, it was used mainly for building Mac and iPhone apps, and it might have vanished in the early 1990s had it not been for an unexpected twist of fate. Even so, while working as a software engineer in San Francisco in the early 2010s, this individual frequently found themselves defending the language’s more cumbersome design choices in SoMa dive bars and in the comment sections of Hacker News.
Objective-C entered their life when they needed it most. A rising college senior who had discovered an interest in computer science too late to major in it, they found themselves in entry-level software engineering classes alongside younger, more adept peers. Smartphones were just beginning to take off, yet their college offered no mobile development courses, and in that gap they spotted a niche. That summer, they taught themselves Objective-C from a cowboy-themed series of books, the Big Nerd Ranch guides. Watching code written on a big screen light up the pixels of a small screen in their hand for the first time, they felt the exhilarating sensation of limitless self-expression, as though they could build anything they could imagine. Here, it seemed, was a truly universal language. At least, until that perception changed.
Objective-C was born in the spirited early days of the object-oriented programming era and, by any ordinary measure, should not have outlived them. By the 1980s, software projects had grown too vast for individuals, or even single teams, to manage alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay introduced object-oriented programming, which organizes code into reusable “objects” that communicate by sending “messages” to one another. A programmer could, for example, build a Timer object able to receive messages such as start, stop, and readTime, and then reuse that object in different software programs. Enthusiasm for the approach ran so high that new object-oriented languages appeared every few months, and computer scientists speculated that a “software industrial revolution” was imminent.
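To make the idea concrete, here is a minimal sketch of such a Timer written in Objective-C, the language this essay keeps returning to; the interface and the bookkeeping inside it are illustrative assumptions rather than anything drawn from Kay’s work.

```objc
#import <Foundation/Foundation.h>

// A hypothetical Timer object. Its interface declares the messages
// it can receive: start, stop, and readTime.
@interface Timer : NSObject
- (void)start;                 // begin (or resume) timing
- (void)stop;                  // pause timing
- (NSTimeInterval)readTime;    // total seconds elapsed so far
@end

// One possible implementation, tracking when the timer was last started.
@implementation Timer {
    NSDate *_startDate;            // nil while the timer is stopped
    NSTimeInterval _accumulated;   // seconds recorded before the last stop
}

- (void)start {
    if (!_startDate) {
        _startDate = [NSDate date];
    }
}

- (void)stop {
    if (_startDate) {
        _accumulated += -[_startDate timeIntervalSinceNow];
        _startDate = nil;
    }
}

- (NSTimeInterval)readTime {
    NSTimeInterval running = _startDate ? -[_startDate timeIntervalSinceNow] : 0;
    return _accumulated + running;
}

@end
```

Any other piece of code that holds a reference to a Timer can reuse it simply by sending it those messages, which is the kind of reuse the paragraph above describes.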
In 1983, software engineers Tom Love and Brad Cox of International Telephone &amp; Telegraph combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair founded a short-lived company to license the language and sell libraries of objects, and before it went under they landed a pivotal client: NeXT, the computer firm Steve Jobs founded after his departure from Apple. When Jobs returned to Apple in 1997, he brought NeXT’s operating system, and Objective-C with it. For the next 17 years, Cox and Love’s creation would power the products of the world’s most influential technology company.
This individual’s own acquaintance with Objective-C came a decade and a half later. What stood out first was the sentence-like structure of its objects and messages, marked by square brackets, as in [self.timer increaseByNumberOfSeconds:60]. These were not brief, Hemingway-esque sentences but lengthy, elaborate, Proustian ones: syntactically intricate, and conjuring vivid imagery with method names like scrollViewDidEndDragging:willDecelerate:.
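For a sense of how those sentences read in practice, here is a small illustrative sketch; the Timer class, the timer property, and the view controller are hypothetical, while scrollViewDidEndDragging:willDecelerate: is a standard UIScrollViewDelegate callback.

```objc
#import <UIKit/UIKit.h>

// Hypothetical receiver for the message quoted above.
@interface Timer : NSObject
- (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
@end

// Hypothetical view controller that owns a timer and observes a scroll view.
@interface StopwatchViewController : UIViewController <UIScrollViewDelegate>
@property (nonatomic, strong) Timer *timer;
@end

@implementation StopwatchViewController

- (void)snoozeButtonTapped {
    // Reads almost like a clause in a sentence: tell self's timer to
    // increase by a number of seconds, namely sixty.
    [self.timer increaseByNumberOfSeconds:60];
}

// A standard UIScrollViewDelegate method. The argument labels are woven
// into the selector itself, which is what produces the long, descriptive
// names described above.
- (void)scrollViewDidEndDragging:(UIScrollView *)scrollView
                  willDecelerate:(BOOL)decelerate {
    if (!decelerate) {
        NSLog(@"Dragging ended and the scroll view will not keep moving.");
    }
}

@end
```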