Technology Mag

News
    Researchers trained an OpenAI rival in half an hour for less than $50

By News Room | February 6, 2025 | 2 min read
    To do this, researchers at Stanford and the University of Washington used a method known as distillation — which allows smaller models to draw from the answers produced by larger ones — to refine s1 using answers from Google’s AI reasoning model, Gemini 2.0 Flash Thinking Experimental. Google’s terms of service note that you can’t use Gemini’s API to “develop models that compete with” the company’s AI models. The Verge reached out to Google with a request for comment but didn’t immediately hear back.
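Distillation here means generating training data from the larger model rather than copying its weights: the teacher's answers (including its reasoning traces) become a supervised fine-tuning set for the student. A minimal sketch of that data-building step, where `teacher_answer` is a hypothetical stand-in for a call to the larger model's API:

```python
import json

def build_distillation_set(questions, teacher_answer):
    """Turn a teacher model's answers into a supervised fine-tuning set.

    `teacher_answer` is a hypothetical stand-in for querying the larger
    model (in the s1 work, Gemini 2.0 Flash Thinking Experimental).
    """
    records = []
    for q in questions:
        reasoning, answer = teacher_answer(q)
        records.append({
            "prompt": q,
            # The student learns to reproduce the teacher's full
            # reasoning trace followed by its final answer.
            "completion": reasoning + "\nAnswer: " + answer,
        })
    return records

# Toy teacher for illustration only.
def toy_teacher(q):
    return ("Step 1: parse the question. Step 2: compute.", "42")

dataset = build_distillation_set(["What is 6 x 7?"], toy_teacher)
print(json.dumps(dataset[0]))
```

The real pipeline would stream these records into a standard fine-tuning job; the point is that no access to the teacher's weights is needed, only its outputs.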

The researchers based s1 on Qwen2.5, an open-source model from Alibaba Cloud. They started with a pool of 59,000 training questions, but found that the larger data set didn’t offer “substantial gains” over a whittled-down set of just 1,000. The researchers say they trained the model on just 16 Nvidia H100 GPUs.
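The whittling-down step amounts to ranking a large pool by some quality heuristic and keeping only the top slice. A sketch under assumed names (the paper's actual selection criteria, such as difficulty and diversity filters, are more involved; `score` here is a placeholder heuristic):

```python
import random

def curate(pool, keep=1000, quality=lambda ex: ex["score"]):
    """Keep only the `keep` highest-quality examples from a large pool.

    `quality` is a hypothetical scoring heuristic; s1's real pipeline
    filtered on quality, difficulty, and diversity rather than a
    single scalar score.
    """
    ranked = sorted(pool, key=quality, reverse=True)
    return ranked[:keep]

# Simulated pool the size of the original 59,000-question set.
pool = [{"q": f"q{i}", "score": random.random()} for i in range(59000)]
subset = curate(pool)
print(len(subset))  # 1000
```

Keeping a tiny curated set is what makes the sub-$50 training run possible: fewer examples means far fewer GPU-hours at the fine-tuning stage.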

The s1 model also uses a technique called test-time scaling, which lets the model “think” for longer before producing an answer. As noted in the paper, the researchers forced the model to continue reasoning by appending “Wait” to the model’s response. “This can lead the model to doublecheck its answer, often fixing incorrect reasoning steps,” the paper says.
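The control loop described above can be sketched as follows. `generate` is a hypothetical stand-in for a model's decode call returning generated text and a token count; the loop suppresses early stopping by appending “Wait” until a thinking budget is met:

```python
def budget_force(generate, prompt, min_thinking_tokens=512):
    """Sketch of the 'append Wait' trick for test-time scaling.

    `generate` is a hypothetical callable (prompt -> (text, n_tokens)).
    Whenever the model stops before spending `min_thinking_tokens`,
    we append 'Wait' and let it keep reasoning.
    """
    trace, used = generate(prompt)
    while used < min_thinking_tokens:
        # Force the model past its attempted stop and continue decoding.
        prompt = prompt + trace + "\nWait"
        more, n = generate(prompt)
        trace += "\nWait" + more
        used += n + 1  # +1 for the injected 'Wait' token
    return trace

# Toy generator that always emits 100 tokens, for illustration.
def toy_generate(prompt):
    return (" step", 100)

out = budget_force(toy_generate, "Q:", min_thinking_tokens=250)
print(out.count("Wait"))  # 2
```

The same knob works in reverse: capping the budget (cutting thinking short) trades accuracy for latency, which is what makes this a scaling axis rather than a one-off trick.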

    OpenAI’s o1 reasoning model uses a similar approach, something the buzzy AI startup DeepSeek sought to replicate with the launch of its R1 model that it claims was trained at a fraction of the cost. OpenAI has since accused DeepSeek of distilling information from its models to build a competitor, violating its terms of service. As for s1, the researchers claim that s1 “exceeds o1-preview on competition math questions by up to 27%.”

The rise of smaller and cheaper AI models threatens to upend the entire industry. They could prove that major companies like OpenAI, Microsoft, Meta, and Google don’t need to spend billions of dollars training AI or building massive data centers filled with thousands of Nvidia GPUs.

© 2025 Technology Mag. All Rights Reserved.