    Technology Mag
    Business

    The New Hatred of Technology

    By News Room | November 19, 2024 | 4 Mins Read

    People have never been better, here in the Year of Our Simulation 2024, at hating the very forces underlying that simulation—at hating, in other words, digital technology itself. And good for them. These everywhere-active tech critics don’t just rely, for their on-trend position-taking, on vague, nostalgist, technophobic feelings anymore. Now they have research papers to back them up. They have bestsellers by the likes of Harari and Haidt. They have—picture their smugness—statistics. The kids, I don’t know if you’ve heard, are killing themselves by the classroomful.

    None of this bothers me. Well, teen suicide obviously does, it’s horrible, but it’s not hard to debunk arguments blaming technology. What is hard to debunk, and what does bother me, is the one exception, in my estimation, to this rule: the anti-tech argument offered by the modern-day philosopher.

    By philosopher, I don’t mean some stats-spouting writer of glorified self-help. I mean a deepest-level, ridiculously learned overanalyzer, someone who breaks down problems into their relevant bits so that, when those bits are put back together, nothing looks quite the same. Descartes didn’t just blurt out “I think, therefore I am” off the top of his head. He had to go as far into his head as he humanly could, stripping away everything else, before he could arrive at his classic one-liner. (Plus God. People always seem to forget that Descartes, inventor of the so-called rational mind, couldn’t strip away God.)

    For someone trying to marshal a case against technology, then, a Descartes-style line of attack might go something like this: When we go as far into the technology as we can, stripping everything else away and breaking the problem down into its constituent bits, where do we end up? Exactly there, of course: at the literal bits, the 1s and 0s of digital computation. And what do bits tell us about the world? I’m simplifying here, but pretty much: everything. Cat or dog. Harris or Trump. Black or white. Everyone thinks in binary terms these days. Because that’s what’s enforced and entrenched by the dominant machinery.

    Or so goes, in brief, the snazziest argument against digital technology: “I binarize,” the computers teach us, “therefore I am.” Certain technoliterates have been venturing versions of this Theory of Everything for a while now; earlier this year, an English professor at Dartmouth, Aden Evens, published what is, as far as I can tell, its first properly philosophical codification, The Digital and Its Discontents. I’ve chatted a bit with Evens. Nice guy. Not a technophobe, he claims, but still: It’s clear he’s world-historically distressed by digital life, and he roots that distress in the fundaments of the technology.

    I might’ve agreed, once. Now, as I say: I’m bothered. I’m unsatisfied. The more I think about the technophilosophy of Evens et al., the less I want to accept it. Two reasons for my dissatisfaction, I think. One: Since when do the base units of anything dictate the entirety of its higher-level expression? Genes, the base units of life, only account for some submajority percentage of how we develop and behave. Quantum-mechanical phenomena, the base units of physics, have no bearing on my physical actions. (Otherwise I’d be walking through walls—when I wasn’t, half the time, being dead.) So why must binary digits define, for all time, the limits of computation, and our experience of it? New behaviors always have a way, when complex systems interact, of mysteriously emerging. Nowhere in the individual bird can you find the flocking algorithm! Turing himself said you can’t look at computer code and know, completely, what’ll happen.
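    (To make that flocking point concrete, here is a minimal sketch, in Python, that is entirely my own illustration rather than anything from Evens or Turing: each simulated bird follows only three simple local rules, and nothing in any single bird's code says "form a flock." The rule names and coefficients are illustrative assumptions, not a standard implementation.)

        import random

        N, RADIUS, STEPS = 30, 5.0, 100

        # Each bird knows only its own position and velocity.
        birds = [{"x": random.uniform(0, 50), "y": random.uniform(0, 50),
                  "vx": random.uniform(-1, 1), "vy": random.uniform(-1, 1)}
                 for _ in range(N)]

        def neighbors(b):
            # Local rule inputs: only birds within a small radius are visible.
            return [o for o in birds if o is not b and
                    (o["x"] - b["x"]) ** 2 + (o["y"] - b["y"]) ** 2 < RADIUS ** 2]

        for _ in range(STEPS):
            for b in birds:
                near = neighbors(b)
                if not near:
                    continue
                n = len(near)
                # Alignment: nudge velocity toward the neighbors' average heading.
                b["vx"] += 0.05 * (sum(o["vx"] for o in near) / n - b["vx"])
                b["vy"] += 0.05 * (sum(o["vy"] for o in near) / n - b["vy"])
                # Cohesion: drift toward the neighbors' center of mass.
                b["vx"] += 0.01 * (sum(o["x"] for o in near) / n - b["x"])
                b["vy"] += 0.01 * (sum(o["y"] for o in near) / n - b["y"])
                # Separation: back away from any bird that gets too close.
                for o in near:
                    if abs(o["x"] - b["x"]) + abs(o["y"] - b["y"]) < 1.0:
                        b["vx"] -= 0.05 * (o["x"] - b["x"])
                        b["vy"] -= 0.05 * (o["y"] - b["y"])
            for b in birds:
                b["x"] += b["vx"]
                b["y"] += b["vy"]
        # No rule above says "flock" -- grouping, if it appears, is emergent.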

    And two: Blaming technology’s discontents on the 1s and 0s treats the digital as an endpoint, as some sort of logical conclusion to the history of human thought—as if humanity, as Evens suggests, had finally achieved the dreams of an Enlightened rationality. There’s no reason to believe such a thing. Computing was, for most of its history, not digital. And, if predictions about an analog comeback are right, it won’t stay purely digital for much longer. I’m not here to say whether computer scientists should or shouldn’t be evolving chips analogically, only to say that, were it to happen, it’d be silly to claim that all the binarisms of modern existence, so thoroughly inculcated in us by our digitized machinery, would suddenly collapse into nuance and glorious analog complexity. We invent technology. Technology doesn’t invent us.
