    Technology Mag
    Business

    To Build a Better AI Supercomputer, Let There Be Light

By News Room | April 5, 2024 | 3 Min Read

    GlobalFoundries, a company that makes chips for others, including AMD and General Motors, previously announced a partnership with Lightmatter. Harris says his company is “working with the largest semiconductor companies in the world as well as the hyperscalers,” referring to the largest cloud companies like Microsoft, Amazon, and Google.

    If Lightmatter or another company can reinvent the wiring of giant AI projects, a key bottleneck in the development of smarter algorithms might fall away. The use of more computation was fundamental to the advances that led to ChatGPT, and many AI researchers see the further scaling-up of hardware as crucial to future advances in the field—and to hopes of ever reaching the vaguely specified goal of artificial general intelligence, or AGI, meaning programs that can match or exceed biological intelligence in every way.

    Linking a million chips together with light might allow for algorithms several generations beyond today’s cutting edge, says Lightmatter’s CEO Nick Harris. “Passage is going to enable AGI algorithms,” he confidently suggests.

    The large data centers that are needed to train giant AI algorithms typically consist of racks filled with tens of thousands of computers running specialized silicon chips, with a spaghetti of mostly electrical connections between them. Maintaining training runs for AI across so many systems—all connected by wires and switches—is a huge engineering undertaking. The need to convert data between electronic and optical signals also places fundamental limits on the chips’ ability to run computations as one.

    Lightmatter’s approach is designed to simplify the tricky traffic inside AI data centers. “Normally you have a bunch of GPUs, and then a layer of switches, and a layer of switches, and a layer of switches, and you have to traverse that tree” to communicate between two GPUs, Harris says. In a data center connected by Passage, Harris says, every GPU would have a high-speed connection to every other chip.
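The topology difference Harris describes can be made concrete with a toy hop count. This is an illustrative sketch, not Lightmatter’s actual network design: the switch `radix` and the layer-counting model are assumptions chosen only to show why traversing a tree of switches grows with cluster size while an all-to-all optical fabric does not.

```python
import math

def tree_hops(num_gpus: int, radix: int) -> int:
    """Worst-case switch hops between two GPUs in a tree-style hierarchy
    where each switch layer fans out by `radix` ports. Traffic climbs to
    a common ancestor switch and then descends back down."""
    layers = math.ceil(math.log(num_gpus, radix))  # switch layers needed
    return 2 * layers  # up through every layer, then back down

def direct_hops() -> int:
    """A flat all-to-all fabric (the model Passage aims for): every GPU
    reaches every other GPU in a single hop."""
    return 1

# Illustrative cluster sizes; radix 64 is a plausible switch port count.
for n in (1_024, 65_536, 1_000_000):
    print(f"{n:>9} GPUs: tree={tree_hops(n, radix=64)} hops, direct={direct_hops()} hop")
```

The point of the sketch is only the trend: the tree’s hop count grows logarithmically with the number of GPUs, so a million-chip cluster pays several switch traversals per message, while the direct fabric stays at one regardless of scale.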

    Lightmatter’s work on Passage is an example of how AI’s recent flourishing has inspired companies large and small to try to reinvent key hardware behind advances like OpenAI’s ChatGPT. Nvidia, the leading supplier of GPUs for AI projects, held its annual conference last month, where CEO Jensen Huang unveiled the company’s latest chip for training AI: a GPU called Blackwell. Nvidia will sell the GPU in a “superchip” consisting of two Blackwell GPUs and a conventional CPU, all connected using the company’s new high-speed communications technology, NVLink-C2C.

    The chip industry is famous for finding ways to wring more computing power from chips without making them larger, but Nvidia chose to buck that trend. The Blackwell GPUs inside the company’s superchip are twice as powerful as their predecessors, but they achieve this by bolting two chips together, meaning they consume much more power. That trade-off, in addition to Nvidia’s efforts to glue its chips together with high-speed links, suggests that upgrades to other key components of AI supercomputers, like the optical interconnect Lightmatter proposes, could become more important.

    © 2025 Technology Mag. All Rights Reserved.