London Underground Is Testing Real-Time AI Surveillance Tools to Spot Crime


Staff at the transportation body ran “extensive simulations” at Willesden Green station during the trial to gather more training data, the documents say. These included staff members falling to the floor, and some of the tests took place while the station was closed. “You will see the BTP [British Transport Police] officer holding a machete and handgun in different locations within the station,” one caption in the documents states, although the images are redacted. During the trial, the files say, there were no alerts for weapons incidents at the station.

The most alerts were issued for people potentially avoiding paying for their journeys by jumping over or crawling under closed fare gates, pushing gates open, walking through open gates, or tailgating someone who paid. Fare dodging costs TfL up to £130 million per year, it says, and there were 26,000 fare evasion alerts during the trial.

During all of the tests, images of people’s faces were blurred and data was kept for a maximum of 14 days. However, six months into the trial, TfL decided to unblur images of faces when people were suspected of not paying, and it kept that data for longer. The documents say staff were originally meant to respond to the fare dodging alerts. “However, due to the large number of daily alerts (in some days over 300) and the high accuracy in detections, we configured the system to auto-acknowledge the alerts,” the documents say.

Birtwistle, from the Ada Lovelace Institute, says that people expect “robust oversight and governance” when technologies like these are put in place. “If these technologies are going to be used, they should only be used with public trust, consent and support,” Birtwistle says.

A large part of the trial was aimed at helping staff understand what was happening at the station and respond to incidents. The 59 wheelchair alerts allowed staff at Willesden Green station, which does not have access facilities for wheelchairs, to “provide the necessary care and assistance,” the files say. Meanwhile, there were almost 2,200 alerts for people going beyond yellow safety lines, 39 for people leaning over the edge of the track, and almost 2,000 alerts for people sitting on a bench for extended periods.

“Throughout the PoC we have seen a huge increase in the number of public announcements made by staff, reminding customers to step away from the yellow line,” the documents say. They also say the system generated alerts for “rough sleepers and beggars” at the station’s entrances and claim this allowed staff to “remotely monitor the situation and provide the necessary care and assistance.” TfL says the system was trialed to help it improve staffing at its stations and make them safer for passengers.

The files do not contain any analysis of how accurate the AI detection system is; at various points, however, the detection had to be adjusted. “Object detection and behavior detection are generally quite fragile and are not foolproof,” Leufer, of Access Now, says. In one instance, the system created alerts saying people were in an unauthorized area when in reality train drivers were leaving the train. Sunlight shining into the cameras also made them less effective, the documents say.


