
    London Underground Is Testing Real-Time AI Surveillance Tools to Spot Crime

    By News Room · February 9, 2024 · 3 min read

    In response to WIRED’s Freedom of Information request, TfL says it used existing CCTV images, AI algorithms, and “numerous detection models” to detect patterns of behavior. “By providing station staff with insights and notifications on customer movement and behaviour they will hopefully be able to respond to any situations more quickly,” the response says. It also says the trial has provided insight into fare evasion that will “assist us in our future approaches and interventions,” and that the data gathered is in line with its data policies.

    In a statement sent after publication of this article, Mandy McGregor, TfL’s head of policy and community safety, says the trial results are continuing to be analyzed and adds, “there was no evidence of bias” in the data collected from the trial. During the trial, McGregor says, there were no signs in place at the station that mentioned the tests of AI surveillance tools.

    “We are currently considering the design and scope of a second phase of the trial. No other decisions have been taken about expanding the use of this technology, either to further stations or adding capability,” McGregor says. “Any wider roll out of the technology beyond a pilot would be dependent on a full consultation with local communities and other relevant stakeholders, including experts in the field.”

    Computer vision systems, such as those used in the test, work by trying to detect objects and people in images and videos. During the London trial, algorithms trained to detect certain behaviors or movements were combined with images from the Underground station’s 20-year-old CCTV cameras—analyzing imagery every tenth of a second. When the system detected one of 11 behaviors or events identified as problematic, it would issue an alert to station staff’s iPads or a computer. TfL staff received 19,000 alerts to potentially act on, and a further 25,000 were kept for analytics purposes, the documents say.
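
    The documents do not describe TfL’s software in any detail, but the workflow they outline (frames analyzed roughly every tenth of a second, multiple detection models, and alerts routed either to staff devices or to an analytics store) can be sketched roughly as below. This is an illustrative sketch only: the camera interface, function names, and the split between actionable and analytics-only categories are assumptions, not TfL’s implementation.

    import time
    from dataclasses import dataclass

    # Illustrative split only; the documents do not say which categories went
    # to staff devices and which were kept purely for analytics.
    ACTIONABLE = {"person_on_tracks", "unattended_item", "injured_or_unwell"}

    @dataclass
    class Detection:
        category: str
        confidence: float

    def run_detection_models(frame) -> list[Detection]:
        # Stand-in for the "numerous detection models" the documents mention;
        # a real system would run trained computer-vision models on the frame.
        return []

    def alert_station_staff(det: Detection) -> None:
        # In the trial, actionable alerts went to station staff's iPads or a computer.
        print(f"ALERT: {det.category} (confidence {det.confidence:.2f})")

    def keep_for_analytics(det: Detection) -> None:
        # Remaining detections were retained for later analysis, not immediate response.
        pass

    def monitor(camera, interval_s: float = 0.1) -> None:
        # Analyze the CCTV feed roughly every tenth of a second, as described in the documents.
        while True:
            frame = camera.read()  # hypothetical camera API
            for det in run_detection_models(frame):
                if det.category in ACTIONABLE:
                    alert_station_staff(det)
                else:
                    keep_for_analytics(det)
            time.sleep(interval_s)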

    The categories the system tried to identify were: crowd movement, unauthorized access, safeguarding, mobility assistance, crime and antisocial behavior, person on the tracks, injured or unwell people, hazards such as litter or wet floors, unattended items, stranded customers, and fare evasion. Each has multiple subcategories.
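
    For reference, those 11 top-level categories might be represented as a simple enumeration like the one below; the documents do not spell out the subcategories, so they are omitted here.

    from enum import Enum, auto

    class EventCategory(Enum):
        # The 11 top-level categories named in the TfL trial documents.
        CROWD_MOVEMENT = auto()
        UNAUTHORIZED_ACCESS = auto()
        SAFEGUARDING = auto()
        MOBILITY_ASSISTANCE = auto()
        CRIME_AND_ANTISOCIAL_BEHAVIOR = auto()
        PERSON_ON_TRACKS = auto()
        INJURED_OR_UNWELL_PERSON = auto()
        HAZARD = auto()            # e.g. litter or wet floors
        UNATTENDED_ITEM = auto()
        STRANDED_CUSTOMER = auto()
        FARE_EVASION = auto()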

    Daniel Leufer, a senior policy analyst at digital rights group Access Now, says whenever he sees any system doing this kind of monitoring, the first thing he looks for is whether it is attempting to pick out aggression or crime. “Cameras will do this by identifying the body language and behavior,” he says. “What kind of a data set are you going to have to train something on that?”

    The TfL report on the trial says it “wanted to include acts of aggression” but found it was “unable to successfully detect” them. It adds that there was a lack of training data—other reasons for not including acts of aggression were blacked out. Instead, the system issued an alert when someone raised their arms, described as a “common behaviour linked to acts of aggression” in the documents.

    “The training data is always insufficient because these things are arguably too complex and nuanced to be captured properly in data sets with the necessary nuances,” Leufer says, noting it is positive that TfL acknowledged it did not have enough training data. “I’m extremely skeptical about whether machine-learning systems can be used to reliably detect aggression in a way that isn’t simply replicating existing societal biases about what type of behavior is acceptable in public spaces.” There were a total of 66 alerts for aggressive behavior, including testing data, according to the documents WIRED received.
