Facebook Inc. said its moderators removed 8.7 million sexual photos of children over the past three months with the help of previously undisclosed software. The software automatically flags photos of child-age users that contain sexual elements.
The software was rolled out over the past year to identify such photos and images, in line with Facebook's strict enforcement of its ban on sexual photos of children.
As reported by Reuters on Thursday (10/25), Facebook's Global Head of Safety, Antigone Davis, said the software helps the company prioritize which content to review. She said Facebook is also looking at applying the software to Instagram.
Under pressure from regulators and US lawmakers, Facebook has promised to speed up the removal of extremist and illicit content. Machine learning programs that sift through the billions of pieces of content users post every day are essential to that effort.
However, the software is not perfect. Some news agencies and advertisers have complained to Facebook that their accounts were blocked by the system. Davis acknowledged that the child-safety system would make mistakes, but said affected users could appeal.
"We prefer to make mistakes for the sake of children," Davis said.
She said Facebook has for years banned even family photos of lightly clothed children uploaded with no ill intent, out of concern that such images could be misused by others.