The AFP wants images of young people to assist training its child abuse-detection algorithm
Monash University has partnered with the Australian Federal Police (AFP) to help tackle child abuse.
The pair will create an “ethically-sourced” database of images that can train artificial intelligence algorithms to detect child exploitation.
The project, an initiative of the AiLECS Laboratory – a collaboration between the Faculty of Information Technology at Monash University and the Australian Federal Police – will try to collect at least 100,000 images from the community over the next six months.
AiLECS researchers are asking individuals aged 18 and over to donate images of themselves as children to help populate the database for the My Pictures Matter crowdsourcing campaign.
AiLECS Lab Co-Director Associate Professor Campbell Wilson said machine learning models trained on images of people are often fed images scraped from the internet, without documented consent for their use.
Associate Professor Wilson said, “To develop AI that can identify exploitative images, we need a very large number of children’s photographs in everyday ‘safe’ contexts that can train and evaluate the AI models intended to combat child exploitation.”
To preserve the privacy of the contributors, the email addresses used to send the images will be kept separate.
“Sourcing these images from the internet is problematic when there is no way of knowing if the children in those pictures have actually consented for their photographs to be uploaded or used for research,” said Associate Professor Wilson.
“By obtaining photographs from adults, through informed consent, we are trying to build technologies that are ethically accountable and transparent,” he said.
AiLECS researchers have also developed comprehensive strategies for storing and using the data while preserving the privacy of those depicted in the images.
Individuals who contribute photos can receive details and updates at each stage of the research. They can also change usage permissions, revoke their research consent, and have their photos removed from the database.
AiLECS Lab research is funded by Monash University, the Australian Federal Police and the Westpac Safer Children Safer Communities Scholarship Program.
In 2020, the AFP admitted it had briefly tested Clearview AI, a controversial facial recognition tool that lets users search a database of images scraped from the internet.
The AFP was one of four Australian police agencies – along with those of Victoria, Queensland and South Australia – and some 2,200 worldwide reported to have used the platform.
Conducted by the AFP-led Australian Centre to Counter Child Exploitation (ACCCE), the limited pilot tested whether the tool could be used in child exploitation investigations.
In 2021, the AFP-led Australian Centre to Counter Child Exploitation received more than 33,000 reports of online child exploitation, and each report may contain large volumes of images and videos of children being sexually assaulted or exploited for the gratification of offenders.
Through the My Pictures Matter campaign, researchers aim to assemble a database of at least 100,000 ethically sourced images by the end of 2022 – large enough to train the AI algorithm.