London Underground is testing real-time AI surveillance tools to detect crime


Transport body staff carried out “extensive simulations” at Willesden Green station during the trial to gather more training data, the documents say. These included staff members falling to the ground, and some of the tests were carried out while the station was closed. “You will see the BTP (British Transport Police) officer holding a machete and a gun in different locations within the station,” says a caption in the documents, although the images are redacted. During the trial itself, the records say, there were no alerts for incidents involving weapons at the station.

Most of the alerts were issued for people potentially avoiding paying for their journeys by jumping or crawling under closed fare gates, pushing gates open, walking through open gates, or following someone who had paid. Fare evasion costs up to £130m a year, TfL says, and there were 26,000 fare evasion alerts during the trial.

During the tests, images of people’s faces were blurred and data was retained for up to 14 days. However, six months after the trial began, TfL decided to un-blur images of faces when people were suspected of not paying, and to keep that data for longer. The documents say it was originally planned for staff to respond to fare evasion alerts. “However, due to the large number of daily alerts (on some days, more than 300) and the high accuracy in detections, we configured the system to automatically acknowledge the alerts,” the documents say.

Birtwistle, of the Ada Lovelace Institute, says people expect “robust oversight and governance” when technologies like these are implemented. “If these technologies are to be used, they should only be used with public trust, consent and support,” says Birtwistle.

Much of the testing was aimed at helping staff understand what was happening at the station and respond to incidents. The 59 wheelchair alerts enabled staff at Willesden Green station, which does not have wheelchair access facilities, “to provide the necessary care and assistance”, the files say. Meanwhile, there were almost 2,200 alerts for people crossing the yellow safety lines, 39 for people leaning over the platform edge, and almost 2,000 alerts for people sitting on a bench for prolonged periods.

“Throughout the PoC we have seen a huge increase in the number of public announcements made by staff, reminding customers to stay away from the yellow line,” the documents say. They also say the system generated alerts for “rough sleepers and beggars” at the station’s entrances and claim this allowed staff to “remotely monitor the situation and provide the necessary care and assistance.” TfL says the system was trialled to help it improve the quality of service at its stations and make them safer for passengers.

The files do not contain any analysis of the accuracy of the AI detection system; however, at several points the detection had to be adjusted. “Object detection and behavior detection are generally quite fragile and are not foolproof,” says Access Now’s Leufer. In one case, the system generated alerts saying there were people in an unauthorized area when in fact train drivers were exiting the train. Sunlight shining on the camera also made it less effective, the documents say.


