Apple’s CSAM Scanner has been reverse-engineered… this is dangerous.

#1

Christian Lowe
https://www.washingtonpost.com/opin...-abuse-encryption-security-privacy-dangerous/

Privacy is dead.

“Our research project began two years ago, as an experimental system to identify CSAM in end-to-end-encrypted online services. As security researchers, we know the value of end-to-end encryption, which protects data from third-party access. But we’re also horrified that CSAM is proliferating on encrypted platforms. And we worry online services are reluctant to use encryption without additional tools to combat CSAM.

We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn’t read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

After many false starts, we built a working prototype. But we encountered a glaring problem.

Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

A foreign government could, for example, compel a service to out people sharing disfavored political speech. That’s no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook, and Twitter for not removing pro-democracy protest materials.

We were so disturbed that we took a step we hadn’t seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides. We’d planned to discuss paths forward at an academic conference this month.

That dialogue never happened. The week before our presentation, Apple announced it would deploy its nearly identical system on iCloud Photos, which exists on more than 1.5 billion devices. Apple’s motivation, like ours, was to protect children. And its system was technically more efficient and capable than ours. But we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”
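
To make the matching step concrete, here is a deliberately simplified Python sketch of the “straightforward concept” described in the quoted piece. It compares an image’s perceptual hash against a set of known hashes. The average-hash function, the placeholder database value, and the plain set lookup are all illustrative assumptions, not the actual design; real deployments use far more robust hashes (PhotoDNA, Apple’s NeuralHash) and wrap the lookup in cryptography such as private set intersection, so the client can’t read the database and the server learns nothing about non-matching content.

[code]
# Toy stand-in for production perceptual hashes such as PhotoDNA or
# Apple's NeuralHash. Illustrative only: real systems use far more
# robust hashes and hide both the database and the match results.
from PIL import Image  # third-party: pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale, grayscale, threshold each pixel at the mean
    brightness, and pack the bits into one 64-bit integer."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

# Hypothetical server-side database of hashes of known content.
KNOWN_HASHES: set[int] = {0x8F3C0F1E3E7CF8F0}  # placeholder value

def report_if_match(path: str) -> bool:
    """Alert the service only when the image's hash is in the database."""
    return average_hash(path) in KNOWN_HASHES
[/code]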
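
The repurposing problem the researchers flag is visible in the same sketch: the client treats the database as an opaque set of numbers, and nothing in the code binds those numbers to CSAM. Continuing the example (the load_hashes helper and both file names are hypothetical):

[code]
def load_hashes(path: str) -> set[int]:
    """Hypothetical loader. The client never learns what the hashes
    depict, only whether a local image matches one of them."""
    with open(path, "rb") as f:
        data = f.read()
    # Treat the file as a flat array of 64-bit big-endian hashes.
    return {int.from_bytes(data[i:i + 8], "big")
            for i in range(0, len(data), 8)}

# Swapping the matching target requires no client-side change at all:
KNOWN_HASHES = load_hashes("csam.db")        # the advertised purpose
# KNOWN_HASHES = load_hashes("dissent.db")   # works just as well, silently
[/code]

That one-line swap is the whole concern: the scanning machinery stays identical, and only the operator’s choice of input changes.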
 
#5
... the best of intentions.
The road to hell is paved with them.

It’s amazing to me that people are so short-sighted that they don’t see where this can go horrifically wrong. Who knew that hiding under the guise of saving children could let an NSA program on steroids roll out with so little backlash.
 
#6
The road to hell is paved with them.

It’s amazing to me that people are so short-sighted that they don’t see where this can go horrifically wrong. Who knew that hiding under the guise of saving children could let an NSA program on steroids roll out with so little backlash.
The ABC agencies have been doing this for a while. The NSA has admitted to accidentally keeping hundreds of thousands of terabytes of Americans’ communications. I’m sure after scanning and keeping what they need they deleted it all. Uh huh.
I’m fascinated that people don’t realize that everything they do that leaves a computer trail is observed or tracked.
 
#7
The ABC agencies have been doing this for a while. The NSA has admitted to accidentally keeping hundreds of thousands of terabytes of Americans’ communications. I’m sure after scanning and keeping what they need they deleted it all. Uh huh.
I’m fascinated that people don’t realize that everything they do that leaves a computer trail is observed or tracked.
Like crime shows where they track the killer’s movements like a connect-the-dots because the idiot took his iPhone everywhere with him?
 
#8
The road to hell is paved with them.

It’s amazing to me that people are so short-sighted that they don’t see where this can go horrifically wrong. Who knew that hiding under the guise of saving children could let an NSA program on steroids roll out with so little backlash.
You have people who willingly post their personal business on FB and IG. Posting live away from home so that burglars have the opportunity to rob them, photos of their kids so that molesters can identify them... all kinds of nonsense. So it wouldn’t surprise me to see the general public not make a fuss about Apple doing this.
 
#9
The road to hell is paved with them.

It’s amazing to me that people are so short-sighted that they don’t see where this can go horrifically wrong. Who knew that hiding under the guise of saving children could let an NSA program on steroids roll out with so little backlash.

Business is full of people who say "make this happen", and they don't like people who object or tell them it won't work.
 
#10
Business is full of people who say "make this happen", and they don't like people who object or tell them it won't work.

Apple especially seems to have an issue keeping people who are prone to saying "That's a bad idea". If it’s not CSAM, it’s their logic board video converter design. Or their TriStar charging regulator chip in the iPhones.

Do they even have a risk management team to check this stuff out on the front end like they should?
 
#11
Apple especially seems to have an issue keeping people who are prone to saying "That's a bad idea". If it’s not CSAM, it’s their logic board video converter design. Or their TriStar charging regulator chip in the iPhones.

Do they even have a risk management team to check this stuff out on the front end like they should?

I've been under the impression that companies just don't care. Most stuff they'll just weather with little blowback, and they find ways to mitigate the bigger problems ... some are really innovative.

It's different, but years ago Westinghouse got caught screwing over utilities on nuclear fuel charges relating to a uranium cartel (I've forgotten the exact details). The utilities sued and won; the settlement was a work of art. The utilities would receive discounts on replacement parts and spares sold by Westinghouse. So to recover damages, the utilities had to buy more stuff from Westinghouse.
 
