
AI to Be Used for Tackling Child Abuse

This year marks the thirtieth anniversary of the United Nations Convention on the Rights of the Child (UNCRC). While much progress has been made to promote our children’s rights in the real world, safeguarding children’s rights online has been seriously neglected.

The exponential rise of the online sexual abuse of children over the last two decades is nothing short of a humanitarian crisis.

Child abuse is a crime, and the unfortunate reality is that the capabilities of those doing harm online are growing rapidly, while legislation and court precedent are slow to keep up. This gap has produced fragmented responses over the years that have failed to shape coherent policy.

To improve the situation and bring online child sexual exploitation under control, we need to understand which responses have worked and which have backfired.

Increasingly, law enforcement and industry are using Artificial Intelligence (AI) to address these problems.

The applications are still in their infancy, but experts are convinced that AI has significant potential to address the challenge. This includes not only detection (e.g. identifying child sexual abuse material or grooming behavior) and timely, targeted responses to suspicious activity (e.g. alerting moderators or investigators when a potential victim’s behavior suggests abuse is happening or may be about to occur), but also enabling law enforcement to prioritize and evolve its responses to the online abuse of children.
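
To make the detection idea concrete, here is a minimal sketch of how a text classifier for flagging potentially risky conversations could work, using scikit-learn as an assumed stack. The example messages, labels, and threshold are all hypothetical placeholders; production systems train on vetted datasets, combine many signals beyond message text, and always route flags to human reviewers.

```python
# Minimal sketch of a text classifier for flagging potentially risky
# chat messages. All training examples and labels below are hypothetical
# placeholders invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples: 1 = flag for human review, 0 = benign.
messages = [
    "what school do you go to? don't tell your parents we talk",
    "let's keep this our little secret, ok?",
    "did you finish the math homework?",
    "see you at soccer practice tomorrow",
]
labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a simple, auditable baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

# Score a new message; anything above the threshold goes to a human
# moderator, never to automated action on its own.
new_message = ["this is just between us, don't tell anyone"]
risk = model.predict_proba(new_message)[0][1]
if risk > 0.5:  # the threshold is a tunable, hypothetical choice
    print(f"flag for human review (risk={risk:.2f})")
```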

It is the goal of the International Centre for Missing and Exploited Children (ICMEC) to assist in the evolution of AI solutions to manage the online sexual exploitation of children. To that end, ICMEC hosts a multi-stakeholder digital child abuse response team. In addition to our work with law enforcement and corporate partners, we publish an annual report on the cumulative number of reports of suspected online child sexual abuse intercepted by our members’ law enforcement and industry personnel.

In 2015, over 32,000 reports of suspected child sexual abuse were made to our members’ hotlines. Of these, 77 percent came from members of the public, down from 81 percent in 2014. In other words, most reports came from individuals, including children, who suspected that child sexual abuse was taking place, rather than from law enforcement or industry acting on their own suspicions.

Alongside the above, we must consider how to better develop solutions to ensure they are as fit for purpose as possible.

The Interpol Innovation Cybercrime Working Group has recently issued a discussion paper on the prevention of child sexual exploitation online. It provides an excellent opportunity to highlight how AI can address the challenge. The paper notes that:

- the growth of the internet and associated technologies has allowed the rapid and widespread sharing and dissemination of child sexual abuse material;

- there are unprecedented opportunities to exploit children and create child sexual abuse material, and an unprecedented amount of it is already in circulation;

- the growing availability of child sexual abuse material is fueling consumer demand.

I fully agree with these points. Exploiting children in this way is barbaric and has no place in our society.

The open nature of the internet makes it difficult for law enforcement to investigate and prosecute these crimes and to protect victims. The discussion paper describes the evolution of the technology used to share child sexual abuse material (CSAM) online, which has made it harder for investigators to locate and identify specific instances of abuse.

The technologies used to share and conceal CSAM online include steganography, proxy servers, anonymous networks such as Tor, the dark web, encryption, and cryptocurrencies.

Fortunately, AI has significant potential to address these challenges.
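
As one deliberately simplified illustration on the steganography front: payloads hidden in an image’s least-significant bits tend to push that bit plane toward uniform randomness, and a crude first-pass check can measure this. The sketch below is only a toy heuristic with a hypothetical input file; real steganalysis relies on far more sophisticated statistical models, and natural images can already have noisy bit planes.

```python
# A deliberately naive steganalysis heuristic (illustrative only).
# LSB-embedded payloads tend to push the least-significant-bit plane
# toward a uniform 50/50 bit mix; this sketch measures exactly that.
import numpy as np
from PIL import Image

def lsb_uniformity_score(path: str) -> float:
    """Return how close the image's LSB plane is to a 50/50 bit mix.

    1.0 means perfectly uniform bits (worth a closer look in otherwise
    smooth images); lower values mean more natural structure remains.
    """
    pixels = np.asarray(Image.open(path).convert("L"))  # grayscale array
    lsb = pixels & 1                                    # LSB plane
    p = lsb.mean()                                      # fraction of 1-bits
    return 1.0 - 2.0 * abs(p - 0.5)                     # 1.0 at p == 0.5

if __name__ == "__main__":
    score = lsb_uniformity_score("sample.jpg")  # hypothetical input file
    print(f"LSB uniformity: {score:.3f}")
    if score > 0.98:
        print("LSB plane is near-uniform; flag for deeper forensic review.")
```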

For example, AI tools that can automatically detect, remove, and archive child sexual abuse material online are in use today. Unfortunately, it is not always easy for the average user to know when such tools are available. By collaborating with developers, companies can make it easier for users to assist with the removal of CSAM.

One example of effective collaboration between law enforcement and a commercial company is the Child Exploitation Tracking System (CETS), developed through a public-private partnership between Canadian police agencies and Microsoft to help investigators store, share, and link case information across jurisdictions. On the detection side, Microsoft’s PhotoDNA technology computes a robust hash of each image and matches it against databases of known child sexual abuse material, helping online services identify and remove CSAM at scale; Microsoft makes it available free of charge to qualified organizations, and its matches feed into law enforcement investigations.
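
PhotoDNA’s actual algorithm is proprietary, so the sketch below illustrates the general idea of robust (“perceptual”) image hashing with a much simpler stand-in, the classic average hash. The known-hash set here is a placeholder value; real deployments match against vetted hash lists maintained by organizations such as NCMEC.

```python
# Simplified illustration of perceptual hash matching. PhotoDNA itself
# is proprietary and far more robust; the classic "average hash" (aHash)
# stands in here just to show the matching workflow.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the image mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

# Placeholder "database" of hashes of known, verified material.
known_hashes = {0x8F3C_01D2_99A4_7E10}

def is_match(path: str, max_distance: int = 5) -> bool:
    """Flag the file if its hash lies within a small Hamming distance of
    any known hash; the slack tolerates resizing and re-encoding."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)
```

The design point worth noticing is that matching uses Hamming distance rather than exact equality, so minor edits such as resizing, cropping at the margins, or re-encoding do not automatically defeat the match.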

I would like to take this opportunity to remind everyone that child sexual abuse material is unique in its damaging nature. Few other crimes against children cause such lasting harm to the victims themselves or create such an urgent need to protect other children.

In the words of the late Stephen Lewis, former Canadian Ambassador for the United Nations Children’s Fund, “There is no cruelty, no sick twist of the mind or perversion that can equal the sexual abuse of a child.”

It would be wrong to suggest that the fight against this appalling crime is going well. We need to do a better job of working together to tackle the issue online. Fortunately, technology can help bring the perpetrators to justice.

How Can AI Help Stop Child Abuse in 2021?

Artificial intelligence is also changing the way we think about the future of child protection.

Among the smartest people I know are those who study child abuse and the best practices to identify, stop, and prosecute it.

There are two primary ways artificial intelligence can help.

First, AI can help identify child abuse and help protect children.

Second, AI can improve case outcomes, supporting forward-looking approaches that prevent future crimes.

A British Baby Center survey found that 9 out of 10 mothers agreed that any new technology should be used to help protect children.

In other words, how well we protect children in the future depends directly on what we learn from the past.

The vast majority of crimes against children are committed by people they know and trust. Children are often too young to understand the threat they face and simply cannot protect themselves.

From a law-enforcement perspective, identifying criminals who commit child abuse is not always easy; after all, criminals are often very good at hiding their identities.

With AI, it becomes far harder for criminals to hide their crimes.


Law enforcement can also use AI-driven data analysis to predict where and how these crimes are committed and even uncover patterns of behavior. This is vital for protecting children in the future.
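
As a rough illustration of what “uncovering patterns” might look like in practice, the sketch below runs an off-the-shelf anomaly detector over hypothetical per-account metadata. The feature names, the numbers, and the contamination setting are all invented for the example, not a description of any real system; in practice such flags would only ever be a prompt for human investigation.

```python
# Rough sketch of pattern analysis over account metadata using an
# off-the-shelf anomaly detector. All features and data are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-account features:
# [messages_to_minors_per_day, distinct_minors_contacted, account_age_days]
accounts = np.array([
    [2,   1,  900],
    [1,   1, 1200],
    [3,   2,  700],
    [40, 25,   12],   # stands out on every axis
    [2,   1,  450],
])

# Isolation forests flag points that are easy to isolate from the rest.
detector = IsolationForest(contamination=0.2, random_state=0).fit(accounts)
flags = detector.predict(accounts)  # -1 = anomalous, 1 = typical

for row, flag in zip(accounts, flags):
    if flag == -1:
        print("escalate for human investigation:", row)
```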

Based on my experience, there are at least two ways AI may be able to help protect children in 2021.

First and foremost, AI may be able to find new ways to identify and stop child sexual abuse.

To do so, AI has to build on what we already know about child abuse and its perpetrators.

As noted above, these criminals are adept at hiding their identities, but data analysis that surfaces patterns of behavior makes that far harder. It also enables law enforcement to find and stop offenders before they can commit further offenses.

Second, AI can help build better public awareness of child abuse and allow law enforcement to more quickly identify victims.

The better our understanding of child abuse, the better the future outcomes will be.
