Apple has postponed the child protection features it announced last month, including a controversial feature that would have scanned user photos for CSAM, after heavy criticism that the changes could reduce user privacy.
The changes were originally slated to roll out by the end of the year. Apple said in a statement: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to spend more time over the next few months to gather input and make improvements before we roll out these critical child safety features.”
Apple's announcement detailed three major changes. One change to Search and Siri would point users to resources to prevent CSAM if they searched for related information.
The other two changes drew more scrutiny. One would alert parents when their children receive or send sexually explicit images, and would blur those images for the children.