
Speakeasy Security
Apple to begin reporting Child Sexual Abuse Material (CSAM)
Apple recently announced that, starting with the iOS 15 update, it will begin reporting Child Sexual Abuse Material (CSAM) stored in iCloud Photos to law enforcement. The new system identifies images using a process called hashing, which converts each image into a numeric fingerprint that can be compared against a database of fingerprints of known abuse images. On this episode, we discuss how Apple’s new system will work and how this bold step in combating child sexual abuse is being received by privacy-sensitive users around the world.
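For listeners curious what "turning images into numbers" looks like, here is a minimal sketch in Python. It uses an ordinary cryptographic hash (SHA-256) as a stand-in: Apple's actual system relies on a perceptual hash called NeuralHash that tolerates resizing and re-encoding, plus additional on-device cryptography, and the function names and placeholder hash list below are our own illustration, not Apple's code.

import hashlib

# Hypothetical set of fingerprints for known images. In the real system,
# hash lists come from child-safety organizations such as NCMEC; this
# placeholder is not a hash of any real image.
KNOWN_HASHES = {
    "0" * 64,
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Turn an image's bytes into a numeric fingerprint (hex digest).

    SHA-256 only matches byte-identical files; it stands in here for a
    perceptual hash purely to illustrate the image-to-number idea.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    """Report whether an image's fingerprint appears in the known-hash set."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES

# Example: an unknown image should not match the placeholder database.
if __name__ == "__main__":
    sample = b"not a real image, just example bytes"
    print(image_fingerprint(sample))
    print(matches_known_database(sample))  # False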
Links:
Apple to combat Child Sexual Abuse Material: https://www.cnbc.com/2021/08/05/apple-will-report-child-sexual-abuse-images-on-icloud-to-law.html
National Center for Missing & Exploited Children (NCMEC): https://www.missingkids.org
Internet Watch Foundation (IWF): https://www.iwf.org.uk
This podcast is for informational purposes only and is not intended to replace professional legal, financial or insurance advice. We are not responsible for any losses, damages, or liabilities that may arise from the use of this podcast. The content and views expressed are those of the host and guests.