A European air passenger has sued the BKA, demanding that it not store, process, or transmit his passenger name record data. The Wiesbaden Administrative Court rejected the
Privacy Project (@PrivacyProject): the New York Times Opinion section's ongoing examination of privacy.
What did we find? The big story is as you’d expect: that everything you do online is logged in obscene detail, that you have no privacy. And yet, even expecting this, I was bowled over by the scale and detail of the tracking; even for short stints on the web, when I logged into Invasive Firefox just to check facts and catch up on the news, the amount of information collected about my endeavors was staggering.
The blanket retention of licence plate data in Brandenburg is illegal and must be stopped. That is the conclusion of a suppressed report by the Interior Ministry, which we are publishing. The police have been collecting for two years without deleting anything; there would be even more data, but someone operated the computer incorrectly.
on his blog
A leak of biometric data, surprise surprise. How about changing your password?
Technology is spying on us and machines are developing human voices.
A journalist and a data scientist easily obtained data on three million users by setting up a fake marketing company, and were able to de-anonymise many of them.
During the course of a week of testing, Fowler ran into 5,400 trackers, mostly found within apps, which Disconnect told him would likely send 1.5 gigabytes of data over the course of a month.
The most secure collaboration platform.
Secure messaging, file sharing, voice calls and video conferences. All protected with end-to-end encryption.
A man was pulled to one side, grilled, and fined by cops after he hid his face from a facial-recognition system being tested on the streets of south east England.
London's Metropolitan Police was at the time running public tests of AI-powered equipment that takes photos of people out and about in the capital and runs the pics through an image database of Brits on a watch list, looking for a match.
Our research, which will be reviewed for publication this summer, indicates that the U.S. government, researchers, and corporations have used images of immigrants, abused children, and dead people to test their facial recognition systems, all without consent. The very group the U.S. government has tasked with developing best practices and standards for the artificial intelligence industry, which includes facial recognition software and tools, is perhaps the worst offender when it comes to using images sourced without the knowledge of the people in the photographs.
IBM created its own Diversity in Faces dataset, which was made up of a million labelled Creative-Commons-licensed photos downloaded from Flickr.