Two weeks ago, Apple Inc released new documents related to its ongoing security threat review of its child sexual abuse material, or CSAM, scanning and detection initiative. Essentially, the initiative is about keeping child abuse material off iPhones and iPads. The documents offered new details about how the tech giant designed its CSAM detection system to combat child pornography.

CSAM scanning aims to identify images of child sexual abuse using a process called hashing, which converts images into numerical fingerprints that can be compared against a database of known material. Apple announced its CSAM initiative at the same time as an enhanced safety feature in Messages, a parental control that warns young children before they view explicit photos.
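To make the idea concrete, here is a minimal sketch of one classic perceptual-hashing technique, the "average hash", written in Python. It illustrates the general approach only: Apple's actual system relies on a proprietary neural hash called NeuralHash, and the file names, threshold and flagging step below are hypothetical.

```python
# A toy "average hash" -- an illustration of perceptual image hashing,
# not Apple's NeuralHash, which uses a neural network instead.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint.

    Shrink the image to 8x8 greyscale, then set one bit per pixel
    depending on whether that pixel is brighter than the mean.
    Visually similar images tend to produce similar bit patterns.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count differing bits; a small distance suggests a near-duplicate."""
    return bin(a ^ b).count("1")

# Hypothetical usage: compare a photo's fingerprint against a set of
# known hashes and flag it when the distance falls below a threshold.
# known_hashes = {...}  # hashes of known CSAM supplied by a clearing house
# if any(hamming_distance(average_hash("photo.jpg"), h) < 5
#        for h in known_hashes):
#     flag_for_review()  # hypothetical downstream step
```

The point of a perceptual hash, as opposed to a cryptographic one, is that small edits such as resizing or recompression leave the fingerprint largely unchanged, which is what lets a scanner match altered copies of known images.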

You might think almost every rational human being finds CSAM abhorrent. But in the cut-throat world of Big Tech where five giants worth between US$1 trillion ($1.36 trillion) and US$2.5 trillion are battling for dominance, even keeping CSAM off your smartphone has become an excuse to hammer rivals and bring them down a notch or two.
