Gottfried Leibniz, a towering figure of the 17th century, made monumental contributions ranging from calculus to actuarial tables to the mechanical calculator. He is also known for coining the phrase "the best of all possible worlds." Despite these achievements, Leibniz always felt that his life's work was incomplete. Since his youth, he had envisioned a groundbreaking concept he called the characteristica universalis: a universal language that could accurately embody all scientific truths and make new discoveries as straightforward as crafting grammatically correct sentences. This so-called alphabet of human thought was meant to eliminate ambiguity and falsehood, and Leibniz devoted his entire life to the project.

Today, a semblance of Leibniz's dream persists in the realm of programming languages. While they do not encapsulate the entirety of the physical and philosophical universe, they do represent an essential framework: the binary code, composed of ones and zeroes, that constitutes a computer's internal state. The binary system itself is another of Leibniz's inventions. Computer scientists, often regarded as either brave or eccentric, embark on the quest to develop new languages in pursuit of their own interpretation of the characteristica universalis. They strive to create a coding system so expressive that it leaves no room for elusive bugs and so clear that it renders extensive comments, documentation, and unit tests unnecessary.

However, the concept of expressiveness in programming is inherently subjective, influenced by personal taste as much as by information theory. My own affinity for programming languages was indelibly shaped by my first self-taught experience with Objective-C.

Making the case that Objective-C is a divinely inspired language is a bit like suggesting that Shakespeare is best enjoyed in Pig Latin. The reality is that Objective-C has long been a polarizing subject among developers. Known for its verbosity and its unique square-bracket syntax, it is primarily employed for developing Mac and iPhone applications, and one might argue it would have slipped into obscurity in the early 1990s if not for an unexpected twist of fate. Nevertheless, during my time as a software engineer in San Francisco in the early 2010s, I often found myself defending its intricate design choices in lively discussions at local dive bars or in the comment sections of tech forums like Hacker News.

Objective-C entered my life at a crucial juncture. As a college senior, I had only recently discovered my passion for computer science, a realization that came too late for me to major in it. Watching younger peers excel in introductory software engineering courses heightened my sense of urgency. At the time, smartphones were gaining traction, yet my university offered no mobile development classes, a gap I was eager to fill on my own. That summer, I committed myself to learning Objective-C through a cowboy-themed series of books from Big Nerd Ranch. The first time I saw my code light up pixels on a small screen was an exhilarating experience that left me enamored with Objective-C. It gave me a sense of unlimited self-expression and led me to believe that I could manifest any idea I envisioned. I had stumbled upon a language that felt universally right for me, until the affection began to wane.

Objective-C was born in the heady early days of the object-oriented programming (OOP) era, and by all accounts it should have faded into history along with them. By the 1980s, software projects had become too vast for a single individual, or even a small team, to manage independently. To facilitate collaboration, Alan Kay, a pioneer at Xerox PARC, developed the principles of object-oriented programming, which organized code into reusable objects that communicated by sending one another messages. For example, a programmer could create a Timer object capable of receiving commands such as start, stop, and readTime. The excitement surrounding OOP during the 1980s was palpable: new programming languages emerged almost monthly, and computer scientists were convinced we were on the brink of a software industrial revolution.
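To make that idea concrete, here is a rough sketch of what such a Timer object might look like in Objective-C, the language this essay returns to below. The class and its bookkeeping are hypothetical, kept to the three messages named above:

#import <Foundation/Foundation.h>

// A hypothetical Timer object that answers the messages start, stop, and readTime.
@interface Timer : NSObject
@property (nonatomic, strong) NSDate *startDate;   // when start was last received
@property (nonatomic) NSTimeInterval elapsed;      // seconds accumulated so far
- (void)start;
- (void)stop;
- (NSTimeInterval)readTime;
@end

@implementation Timer
- (void)start {
    self.startDate = [NSDate date];
}
- (void)stop {
    if (self.startDate) {
        self.elapsed += -[self.startDate timeIntervalSinceNow];
        self.startDate = nil;
    }
}
- (NSTimeInterval)readTime {
    // Include the in-progress interval if the timer is still running.
    return self.elapsed + (self.startDate ? -[self.startDate timeIntervalSinceNow] : 0);
}
@end

// Another object collaborates with the timer purely by messaging it:
//     Timer *timer = [[Timer alloc] init];
//     [timer start];
//     ... do some work ...
//     [timer stop];
//     NSTimeInterval seconds = [timer readTime];

The point of the exercise is not the arithmetic but the division of labor: the caller never touches the timer's internal state, it only sends messages.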

In 1983, software engineers Tom Love and Brad Cox merged the principles of object-oriented programming with the straightforward syntax of the C programming language to create Objective-C. Initially, the duo launched a brief venture to license the language and sell libraries of objects. Their efforts would likely have ended in failure had they not landed a pivotal client: NeXT, the computing company founded by Steve Jobs after his departure from Apple. In a twist of fate, when Jobs returned to Apple in 1997, he brought NeXT's operating system, and Objective-C with it, into the company, ensuring that the language would thrive for the next 17 years as the foundation for products from one of the world's most influential tech companies.

Fast-forward a decade and a half to my own experience, and I found myself captivated by how Objective-C's objects and messages fell into sentence-like structures, punctuated by its distinctive square brackets. A line of code such as [self.timer increaseByNumberOfSeconds:60] might lack the succinctness of Hemingway's prose, but it read like the intricate, flowing sentences of Proust. That elaborateness not only evoked vivid imagery but also illustrated the language's capacity for creative expression.
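For readers who have never seen the syntax, a brief sketch of how such a sentence-like message is declared and sent may help; the increaseByNumberOfSeconds: method here is hypothetical, invented only to match the line quoted above:

// A hypothetical countdown timer whose single method shows how Objective-C's
// named arguments work: the colon-terminated label introduces the argument.
@interface CountdownTimer : NSObject
@property (nonatomic) NSTimeInterval secondsRemaining;
- (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
@end

@implementation CountdownTimer
- (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds {
    self.secondsRemaining += seconds;
}
@end

// At the call site the message reads almost like a clause in a sentence:
//     [self.timer increaseByNumberOfSeconds:60];

Each bracketed expression is one message send: the receiver on the left, the method name interleaved with its arguments on the right, which is what gives the language its wordy, Proustian cadence.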