
There is a third concern besides scale and linkage -- errors and accountability. Sometimes these services are wrong.

When they are wrong, and people get arrested, go to jail, or pay thousands in legal fees to clear themselves...there is no accountability. Sometimes the law enforcement officers who use these systems rely ONLY on the systems, not on the underlying data (e.g. images), to validate an arrest.

At a minimum, the services should have to produce the original image/video when surfacing information that will be used for legal action. Second, the services should be held financially responsible for legal costs when mistakes are made.

https://www.nytimes.com/2020/12/29/technology/facial-recogni...



I’ll add a fourth: these services will inevitably be hacked, their data added to multiple darkweb data dumps, and circulated among nation states, hackers, and advertisers for the rest of time.


If an eyewitness makes an erroneous identification, or a police officer misunderstands a piece of evidence, or a witness for the prosecution is coerced into giving false testimony, is that any different? In each case we're talking about information that turned out to be wrong. This is just new information in the world. Sometimes it will be wrong, or we will misunderstand it -- just like all other kinds of information.


> Minimally, the services should have to produce an original image/video when surfacing information that will be used for legal action

Even better would be an image that is cryptographically signed by the sensor that captured it.
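The idea above can be sketched in a few lines. This is a hypothetical illustration using Python's standard-library HMAC as a stand-in for a device signature: the sensor tags each frame at capture time, and any later alteration of the bytes fails verification. A real deployment would use an asymmetric scheme (e.g. Ed25519), so the camera holds a private key and a court can verify with only the public key; the key name and functions here are invented for the example.

```python
import hashlib
import hmac

# Hypothetical device key, assumed to be provisioned into the sensor at
# manufacture. An asymmetric keypair would replace this in a real design.
DEVICE_KEY = b"example-sensor-key"

def sign_frame(frame_bytes: bytes, key: bytes = DEVICE_KEY) -> str:
    """Tag raw frame bytes at capture time so later edits are detectable."""
    return hmac.new(key, frame_bytes, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes: bytes, tag: str, key: bytes = DEVICE_KEY) -> bool:
    """Check that the frame still matches the tag produced by the sensor."""
    return hmac.compare_digest(sign_frame(frame_bytes, key), tag)

frame = b"\x89PNG...raw image bytes..."
tag = sign_frame(frame)
print(verify_frame(frame, tag))             # unmodified frame verifies
print(verify_frame(frame + b"\x00", tag))   # any alteration is detected
```

The symmetric sketch only proves integrity to someone who shares the key; the public-verifiability the comment implies is exactly why an asymmetric signature would be the right tool.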



