    The Subjective Charms of Objective-C

    By News Room | April 15, 2025 | 4 Mins Read

    After inventing calculus, actuarial tables, and the mechanical calculator and coining the phrase “best of all possible worlds,” Gottfried Leibniz still felt his life’s work was incomplete. Since boyhood, the 17th-century polymath had dreamed of creating what he called a characteristica universalis—a language that perfectly represented all scientific truths and would render making new discoveries as easy as writing grammatically correct sentences. This “alphabet of human thought” would leave no room for falsehoods or ambiguity, and Leibniz would work on it until the end of his life.

    A version of Leibniz’s dream lives on today in programming languages. They don’t represent the totality of the physical and philosophical universe, but instead, the next best thing—the ever-flipping ones and zeroes that make up a computer’s internal state (binary, another Leibniz invention). Computer scientists brave or crazy enough to build new languages chase their own characteristica universalis, a system that could allow developers to write code so expressive that it leaves no dark corners for bugs to hide and so self-evident that comments, documentation, and unit tests become unnecessary.

    But expressiveness, of course, is as much about personal taste as it is information theory. For me, just as listening to Countdown to Ecstasy as a teenager cemented a lifelong affinity for Steely Dan, my taste in programming languages was shaped the most by the first one I learned on my own—Objective-C.

    To argue that Objective-C resembles a metaphysically divine language, or even a good language, is like saying Shakespeare is best appreciated in Pig Latin. Objective-C is, at best, polarizing. Ridiculed for its unrelenting verbosity and peculiar square brackets, it is used only for building Mac and iPhone apps and would have faded into obscurity in the early 1990s had it not been for an unlikely quirk of history. Nevertheless, in my time working as a software engineer in San Francisco in the early 2010s, I repeatedly found myself at dive bars in SoMa or in the comments of Hacker News defending its most cumbersome design choices.

    Objective-C came to me when I needed it most. I was a rising college senior and had discovered an interest in computer science too late to major in it. As an adult old enough to drink, I watched teenagers run circles around me in entry-level software engineering classes. Smartphones were just starting to proliferate, but I realized my school didn’t offer any mobile development classes—I had found a niche. I learned Objective-C that summer from a cowboy-themed book series titled The Big Nerd Ranch. The first time I wrote code on a big screen and saw it light up pixels on the small screen in my hand, I fell hard for Objective-C. It made me feel the intoxicating power of unlimited self-expression and let me believe I could create whatever I might imagine. I had stumbled across a truly universal language and loved everything about it—until I didn’t.

    Twist of Fate

    Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should never have survived past it. By the 1980s, software projects had grown too large for one person, or even one team, to develop alone. To make collaboration easier, Xerox PARC computer scientist Alan Kay had created object-oriented programming—a paradigm that organized code into reusable “objects” that interact by sending each other “messages.” For instance, a programmer could build a Timer object that could receive messages like start, stop, and readTime. These objects could then be reused across different software programs. In the 1980s, excitement about object-oriented programming was so high that a new language was coming out every few months, and computer scientists argued that we were on the precipice of a “software industrial revolution.”
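
    To make that concrete, here is a minimal sketch of what such a Timer object might look like in Objective-C; only the messages start, stop, and readTime come from the example above, and the internals are illustrative guesses.

        // Timer.h: an illustrative object that answers the messages start, stop, and readTime.
        #import <Foundation/Foundation.h>

        @interface Timer : NSObject
        - (void)start;
        - (void)stop;
        - (NSTimeInterval)readTime;   // seconds elapsed while running
        @end

        // Timer.m: one possible implementation, assumed for the sketch.
        @implementation Timer {
            NSDate *_startDate;        // nil whenever the timer is stopped
            NSTimeInterval _elapsed;   // accumulated seconds
        }

        - (void)start {
            _startDate = [NSDate date];
        }

        - (void)stop {
            if (_startDate != nil) {
                _elapsed += [[NSDate date] timeIntervalSinceDate:_startDate];
                _startDate = nil;
            }
        }

        - (NSTimeInterval)readTime {
            return _elapsed;
        }
        @end

    Any program can then reuse it by creating an instance and sending it messages: Timer *timer = [[Timer alloc] init]; [timer start]; [timer stop]; [timer readTime];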

    In 1983, Tom Love and Brad Cox, software engineers at International Telephone & Telegraph, combined object-oriented programming with the popular, readable syntax of the C programming language to create Objective-C. The pair started a short-lived company to license the language and sell libraries of objects, and before it went belly up they landed the client that would save their creation from falling into obscurity: NeXT, the computer firm Steve Jobs founded after his ouster from Apple. When Jobs triumphantly returned to Apple in 1997, he brought NeXT’s operating system—and Objective-C—with him. For the next 17 years, Cox and Love’s creation would power the products of the most influential technology company in the world.

    I became acquainted with Objective-C a decade and a half later. I saw how objects and messages take on a sentence-like structure, punctuated by square brackets, like [self.timer increaseByNumberOfSeconds:60]. These were not curt, Hemingwayesque sentences, but long, floral, Proustian ones, syntactically complex and evoking vivid imagery with function names like scrollViewDidEndDragging:willDecelerate.
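
    A brief, hypothetical sketch of how those pieces fit together: the FeedViewController class and the timer property are invented for illustration, while scrollViewDidEndDragging:willDecelerate: is the UIKit delegate callback the paragraph alludes to.

        #import <UIKit/UIKit.h>

        // An illustrative Timer, echoing the earlier sketch but declaring the one message
        // the example in the text sends; the property and method body are assumptions.
        @interface Timer : NSObject
        @property (nonatomic) NSTimeInterval elapsed;
        - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds;
        @end

        @implementation Timer
        - (void)increaseByNumberOfSeconds:(NSTimeInterval)seconds {
            self.elapsed += seconds;
        }
        @end

        // A hypothetical view controller acting as a scroll view's delegate. The selector
        // scrollViewDidEndDragging:willDecelerate: interleaves its named parts with the
        // arguments, which is what gives the syntax its long, sentence-like feel.
        @interface FeedViewController : UIViewController <UIScrollViewDelegate>
        @property (nonatomic, strong) Timer *timer;
        @end

        @implementation FeedViewController

        - (void)scrollViewDidEndDragging:(UIScrollView *)scrollView willDecelerate:(BOOL)decelerate {
            if (!decelerate) {
                // The bracketed message send quoted in the text: receiver on the left,
                // selector and its argument on the right.
                [self.timer increaseByNumberOfSeconds:60];
            }
        }

        @end

    The argument labels are part of the method's name itself, which is why even a simple call reads like a clause rather than a terse function invocation.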
