AI weapons scanner backtracks on UK testing claims


An AI weapons-scanning company has backtracked on claims its technology has been tested by the UK government.

Evolv Technology makes “intelligent” scanners designed to replace metal detectors by identifying people with concealed guns, knives and bombs.

But the company has come under mounting criticism for overstating what the technology can deliver.

Evolv told BBC News it had altered its claims about UK testing to “better reflect the process taken”.

The Federal Trade Commission (FTC) is investigating its “marketing practices”. The Securities and Exchange Commission (SEC) launched an investigation into the company last month.

Evolv scanners are used at Manchester Arena, as well as in many large stadiums and hundreds of schools in the US.

The company had said that its AI weapons scanner had been tested by the UK Government’s National Protective Security Authority (NPSA).

On 20 February the company put out a press release, including a claim that the NPSA was one of a number of testers who had “concluded that the Evolv Express solution was highly effective at detecting firearms and many other types of weapons”.

But BBC News can reveal the NPSA does not do this type of testing.

When BBC News put this to Evolv, the company said: “After discussion with NPSA, we updated the language used in the February 20 press release to better reflect the process taken.”

Instead, it said, an independent company had “tested and validated” Evolv’s technology, using NPSA standards.

But the UK company that did this testing, Metrix NDT, told BBC News it was “not correct to say we ‘validated’ the system”.

‘Close scrutiny’

Metrix NDT managing director Nick Fox told BBC News that Evolv’s system had indeed been tested against NPSA specifications.

But when asked if Metrix NDT had found it “highly effective at detecting firearms and many other types of weapons”, he said: “It is not within our remit to pass any value judgements on the results.”

Evolv told the BBC that, in addition to those results, it makes full third-party testing reports on detection performance available to any serious prospective customer.

Prof Marion Oswald, who was on the government’s Centre for Data Ethics and Innovation advisory board until last year, told BBC News it was worrying the technology was replacing “tried and tested” security options.

“It does highlight the need for really close scrutiny and potential additional regulation of companies making these types of claims,” she told BBC News.

And she worried how customers might be influenced, “especially if claims are being made about how certain government bodies may have been involved”.

Image: An Evolv screen that alerts users to potential threats

Evolv has previously said its technology detects the “signatures” of concealed weapons.

“Metallic composition, shape, fragmentation – we have tens of thousands of these signatures, for all the weapons that are out there,” chief executive Peter George said, in 2021, “all the guns, all the bombs and all the large tactical knives.”

But the company has faced criticism that it cannot reliably detect knives or bombs. Evolv now says it can detect “many types of knives and some explosives”.

In 2022, following a Freedom of Information request by the security-analysis company Internet Protocol Video Market (IPVM), BBC News revealed that testing by a US facility had found Evolv’s technology could not consistently detect knives and certain types of bombs.

The testers said Evolv should inform potential clients of these limitations.

But during that investigation, in August 2022, Evolv had also told BBC News the NPSA (then called the Centre for the Protection of National Infrastructure, or CPNI) had tested its system. “We have tested with the UK CPNI,” a representative told BBC News.

A Home Office official said: “We are looking to further understand the capabilities of weapons-detection equipment.”

Image: Hundreds of US schools use Evolv scanners

Evolv also amended another claim in its 20 February press release. Evolv had initially referred to the designation of its technology under the US Department of Homeland Security’s (DHS) SAFETY Act as an example of recent “third-party testing”.

This was later changed to reflect that the designation does not involve new testing by the DHS but an evaluation of existing evidence.

In May last year, BBC News revealed further details of a stabbing at a New York school that used Evolv scanners.

Proctor High School’s then superintendent, Brian Nolan, said: “Through investigation, it was determined the Evolv Weapon Detection System… was not designed to detect knives.”

The victim is suing Evolv and the scanners were replaced by 10 metal detectors.

‘Deeply regret’

The company has changed the front of its website many times.

The website initially claimed the company’s goal was to create “weapons-free zones”; it now says its mission is to create “safer experiences”.

Last year, the company said it regretted any confusion around the capabilities of its technology.

“We wholeheartedly believe in our technology and our mission and deeply regret if any of our past statements confused or appeared to generalise our capabilities at the time,” it said.

But questions remain as to what Evolv has previously told customers its technology is capable of and the testing it has gone through.
