Two weeks ago, Apple Inc released new documents relating to its ongoing security threat review of its child sexual abuse material (CSAM) scanning and detection initiative, which is aimed at preventing child abuse material from being stored on iPhones and iPads. The documents offer new details about how the tech giant designed its CSAM detection system to combat child pornography.

CSAM scanning aims to identify images of child sexual abuse using a process called hashing, which converts each image into a numerical fingerprint that can be compared against a database of known material. Apple announced the CSAM initiative alongside Communication Safety in Messages, a parental-control feature that warns young children before they view explicit photos.
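
To make the idea of hashing concrete, here is a minimal sketch of a simple perceptual "difference hash" written in Python with the Pillow library. It is purely illustrative and is not Apple's NeuralHash algorithm; the file names, hash size and matching threshold are assumptions for the example.

```python
# Illustrative sketch only: a toy "difference hash" (dHash), not Apple's
# NeuralHash. It shows the general idea of turning an image into a number
# so that visually similar images produce similar hashes.
from PIL import Image


def dhash(image_path: str, hash_size: int = 8) -> int:
    """Shrink the image to grayscale and encode, as one bit per pixel,
    whether each pixel is brighter than its right-hand neighbour."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests near-identical images."""
    return bin(a ^ b).count("1")


# Hypothetical usage: compare an image against a known hash.
# known_hash = dhash("known_image.jpg")
# candidate_hash = dhash("uploaded_image.jpg")
# if hamming_distance(known_hash, candidate_hash) <= 5:  # assumed threshold
#     print("Likely a match against the known image")
```

In a real deployment, the candidate hash would be compared against a database of hashes of known material rather than a single image, and a far more robust hashing scheme would be used.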
