Microsoft has donated a new technology to the National Center for Missing and Exploited Children (NCMEC) that has the potential to make a dramatic difference in the fight against the spread of child pornography online.
The technology, called PhotoDNA, was initially created by Microsoft Research and further developed by Hany Farid, a leading digital-imaging expert and professor of computer science at Dartmouth College, to help NCMEC find hidden copies of the worst known images of child sexual exploitation.
Ernie Allen, president and CEO of NCMEC, says child pornography is a problem that had all but disappeared by the late 1980s: the U.S. Supreme Court had ruled that it was not protected speech but instead constituted child abuse, and law enforcement had cracked down on its distribution and importation.
“Twenty years ago we thought this problem was virtually gone,” Allen says. “As wonderful and powerful as the Internet is, it has created an opportunity for people to network with others of like interest, and to access content in the privacy of their own homes that would have formerly put them at risk to acquire.”
Today, says Allen, the problem is exploding. Since 2003, NCMEC has reviewed and analyzed almost 30 million images and videos of child pornography. These photos of sexual abuse are seized from pedophiles who both trade in the illegal images and form communities that reinforce their shared interest in children.
Allen says that NCMEC's CyberTipline has handled 750,000 reports of child sexual exploitation and child pornography from the public and Internet service providers. “We’re currently reviewing 250,000 images every week,” Allen says. “So this is a massive problem.”
NCMEC has worked with law enforcement to identify many of the worst images of child sexual abuse and exploitation. As they are passed from pedophile to pedophile, many of these images surface repeatedly during child pornography investigations. “Our goal is to stop that victimization,” Allen says. “Using PhotoDNA, we will be able to match those images, working with online service providers around the country, so we can stop the redistribution of the photos.”
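The article does not describe how PhotoDNA computes its signatures, and the actual algorithm is proprietary. As a rough illustration of the general idea of matching uploaded images against a database of signatures of known images, the Python sketch below uses a simple "average hash" in place of PhotoDNA's robust signature; the function names, the Hamming-distance comparison, and the `threshold` parameter are all illustrative assumptions, not part of the real system.

```python
# Illustrative sketch only: PhotoDNA's signature algorithm is proprietary and
# far more robust than this. The average hash below simply shows how a service
# provider could compare an uploaded image against signatures of known images.
from PIL import Image


def average_hash(path, size=8):
    """Downscale to a small grayscale grid and threshold each pixel on the mean intensity."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = [1 if p > mean else 0 for p in pixels]
    return int("".join(map(str, bits)), 2)


def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")


def matches_known_image(path, known_hashes, threshold=5):
    """Flag an image if its hash is within `threshold` bits of any known signature."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in known_hashes)
```

The key design point this sketch tries to capture is that the comparison is tolerant of small changes (resizing, recompression), so a slightly altered copy of a known image can still be matched; exact-match hashes such as MD5 would miss those copies.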