
Generally I agree with everything you said. There is a subtle and important distinction here though. Encryption rarely plays a role in the distribution of CSAM, yes, but what concerns me more is the role encryption plays in the active abuse of children.

To be very clear, I think there are ways to design applications around encryption to prevent active abuse on platforms. For example: disabling encryption on minors' accounts, limiting messages from people outside of your social circle, allowing opt-in so parents can read messages from their kids, AI models that display a prompt to the child that something may be harmful and that they should talk to their parents, etc.

The biggest disappointment I had with Apple about this CSAM scanning business is that they were not talking about or taking steps toward preventing the active abuse of children. I would love to see regulation put in place for the design of apps when a minor uses them. A small part of me is worried about poor regulations, but then we have things like Kik.



You don't understand how CSAM scanning works. It relies on well-known hashes of existing abuse material - meaning it will do nothing for the novel exploitation of a child.
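
For what it's worth, the matching step in these systems boils down to checking an image's hash against a database of hashes of already-known material. A minimal Python sketch of that logic (the known_hashes set is made up, and a plain SHA-256 file hash stands in for the perceptual hashes real systems such as PhotoDNA or Apple's NeuralHash actually use):

    import hashlib
    from pathlib import Path

    # Hypothetical database of hashes of already-known abuse images.
    # Real systems use perceptual hashes that survive resizing and
    # re-encoding; SHA-256 here only illustrates the matching step.
    known_hashes = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def is_known_match(path: Path) -> bool:
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        return digest in known_hashes

A brand-new image produced during ongoing abuse has no entry in any such database, so the check returns False every time - which is exactly why scanning does nothing for a child being actively abused.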

You can also just have your kid not use encrypted apps, or not allow them to message internet strangers. That is probably the wrong thing to do - blanket bans are a bad way to teach life skills - but you as a parent can do so. You don't need government help.


Some of what you are suggesting is exactly what Apple has said they are implementing in iMessage for child accounts. In particular, the AI notifications to children that a message they are receiving or sending may contain CSAM and that they should reconsider.



