NEC insists its face-recog training dataset isn’t biased, but refuses to share details of its Neoface system with UK court

Facial-recognition technology used by British police forces does not rely on trawling the internet for random face photos to use as training data, an NEC manager told the courts.

The statement was referred to in the Court of Appeal last week by South Wales Police’s barrister Jason Beer QC and obtained by The Register yesterday.

“Our biometric templates are unique to NEC and are not portable between vendors,” said Paul Roberts, head of global facial recognition at NEC Global subsidiary Northgate Public Services.

NEC also refused to tell even the British courts where it obtained its facial-recognition training dataset or to give details about its makeup.

Roberts’ witness statement was deployed by South Wales Police (SWP) as part of its defence against a legal challenge brought by human rights pressure group Liberty, as reported at length last week.

Responding to two written statements made by an American expert witness on Liberty’s behalf, Roberts denied that Neoface Watch (rebranded by police workers as “AFR Locate”) incorporates machine-learning technology for continual evolution during and after deployments.

Instead, he said: “We train our Neoface algorithm in our laboratories and, on a typically annual basis, release a new version of the algorithm containing improvements ranging from, among other things, additional training.”

Liberty’s expert, Dr Anil Jain, had made the opposite claim. An academic and consultant for NEC, he had told the court: “Typically, all state-of-the-art [automated facial recognition] systems, such as AFR Locate, employ a deep learning network… When given a large number of face images of different individuals under various PIE [pose, illumination, expression] and occlusion conditions, the network automatically learns patterns in the Features that best represent the person…”

You know nothing, go away

The two engaged in a revealing proxy war of words that helped push some further details of the latest police gadget into the public domain.

NEC’s man told the court that Liberty’s expert only had direct knowledge of “an older version of the Neoface algorithm SDK” that was used “for internal testing”. He added: “In any event the facial recognition product utilised in the United States and in which Dr Jain is involved on a consultancy basis, is not [Neoface Watch].”

Roberts was saying that Jain could only give the court generalised descriptions of how different AFR software deployed in a foreign country worked. If this is accurate, a significant part of Liberty’s evidence against the creepy cameras may have lost its force.

Nonetheless, Jain was not to be cowed. In a follow-up to Roberts he told the court: “I cannot comment on whether AFR Locate has a discriminatory impact as I do not have access to the datasets on which the system is trained and therefore cannot analyse the biases in those datasets. For the same reason, the Defendant [South Wales Police] is not in a position to evaluate the discriminatory impact of AFR Locate.”

We’re telling you nothing, now go away

Roberts also refused to explain to the court what the makeup of NEC’s training datasets was, claiming these were “commercially sensitive”. Jain built on this, saying that SWP’s solicitors had told him “that NEC/Northgate are not prepared to disclose summary statistics (in terms of gender, race/ethnicity, age) or any empirical evaluation of the Neoface Algorithm training dataset”. The force is said to have asked NEC to answer questions about where it got the training dataset from, which, as Jain pointed out, suggested the force didn’t know the answer itself, making it “difficult for SWP to confirm whether the technology is in fact biased”.

In terms of function, Jain asserted that SWP was using a Bosch Mic Starlight 7000 HD camera, running at 1080x1920 resolution, to feed Neoface Watch – a setup likely to be echoed by the Metropolitan Police, which is deploying Neoface Watch in London from not-quite-unmarked vans. Back in January an NEC spokesman with the Met told the nation’s press that the technology had a 70 per cent accuracy rate – an interesting contrast with both the police’s courtroom claims of 78 per cent accuracy in Cardiff and police-sponsored tests establishing a 98 per cent inaccuracy rate when used against innocent Londoners. That test was carried out later in the same year as the Cardiff deployment.
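The apparent clash between a 70-odd per cent “accuracy” figure and a 98 per cent “inaccuracy” figure is less paradoxical than it sounds: the two can measure different things. As a rough, purely illustrative sketch (all numbers below are assumptions for the sake of the arithmetic, not figures from SWP, the Met, or NEC), a system that catches most genuine watchlist matches can still produce mostly false alerts when almost everyone scanned is innocent:

```python
# Base-rate illustration: hypothetical numbers only.
crowd = 100_000            # faces scanned by the camera
watchlist_present = 10     # genuine watchlist members in the crowd
true_match_rate = 0.70     # share of genuine matches the system flags
false_alarm_rate = 0.001   # share of innocent faces wrongly flagged

true_alerts = watchlist_present * true_match_rate              # 7.0
false_alerts = (crowd - watchlist_present) * false_alarm_rate  # 99.99
share_false = false_alerts / (true_alerts + false_alerts)

print(f"{share_false:.0%} of alerts are false")  # roughly 93% under these assumptions
```

Under these assumed numbers, over nine in ten alerts are wrong even though the system flags 70 per cent of genuine matches – which is why a per-match accuracy claim and a share-of-alerts inaccuracy claim can point in opposite directions.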

Although the appeal was intended to be a landmark case setting the law on the technology’s use against the general public, the Court of Appeal judges hearing it last week seemed pointedly uninterested in wider legal and societal issues raised by the Cardiff AFR deployment. The Met’s assistant commissioner, Nick Ephgrave, previously praised judges for their “very helpful” rulings in favour of police earlier in this case.

Judgment from the Court of Appeal is expected later this year. ®
