Technology Mag
    News

    Sora provides better control over videos featuring your AI self

By News Room · October 6, 2025 · 2 Mins Read

    Sora now lets you rein in your AI doubles, giving you more say on how and where deepfake versions of you make an appearance on the app. The update lands as OpenAI hurries to show it actually cares about its users’ concerns as an all-too-predictable tsunami of AI slop threatens to take over the internet.

    The new controls are part of a broader batch of weekend updates meant to stabilize Sora and manage the chaos brewing in its feed. Sora is essentially “a TikTok for deepfakes,” a place to make 10-second videos of pretty much anything, including AI-generated versions of yourself or others (voice included). OpenAI calls these virtual appearances “cameos.” Critics call them a looming misinformation disaster.

    Bill Peebles, who heads the Sora team at OpenAI, said users can now restrict how AI-generated versions of themselves can be used in the app. For example, you could prevent your AI self from appearing in videos involving politics, stop it from saying certain words, or — if you hate mustard — stop it from showing up anywhere near the hellish condiment.

    OpenAI staffer Thomas Dimson said users can also add preferences for their virtual doubles, such as making them “wear a ‘#1 Ketchup Fan’ ball cap in every video.”

    The safeguards are welcome, but the history of AI-powered bots like ChatGPT and Claude offering up tips on explosives, cybercrime, or bioweapons suggests someone, somewhere will probably figure out a way around them. People have already skirted one of Sora’s other safety features, a feeble watermark. Peebles said the company is also “working on” improving that.

    Peebles said Sora will continue “to hillclimb on making restrictions even more robust,” and “will add new ways for you to stay in control” in the future.

    In the week since the app launched, Sora has been complicit in filling the internet with AI-generated slop. The loose cameo controls — pretty much a yes or no to groups like mutuals, people you approve, or “everyone” — were a particular problem. The unwitting star of the platform, none other than OpenAI CEO Sam Altman, illustrated the danger, appearing in a variety of mocking videos that show him stealing, rapping, or even grilling a dead Pikachu.


    © 2026 Technology Mag. All Rights Reserved.