For @Flickr, Safer’s Image Classifier is a critical tool in their effort to detect novel #CSAM. The use of the Classifier has empowered their team of fewer than 10 full-time #TrustAndSafety employees to have a huge impact.
Minors report experiencing a variety of risky online interactions with adults, such as cold solicitations, attempts at isolation, and attempts to “befriend and manipulate” them.
100,000 images of child sexual abuse removed from the open web in beta, and we're just getting started. Today @Thorn launches its first commercial product. Today the internet is Safer. #getSafer
We’re pleased to announce that Safer’s Reporting Service now includes the option to send reports to the Royal Canadian Mounted Police. Adding #RCMP reporting is a major step toward our goal of eliminating #CSAM from the internet.
Safer, built by @Thorn, is now available in AWS Marketplace so companies of all sizes can easily access technology to detect CSAM. Let’s build the internet we deserve, together.
“We believe tech companies are key partners, and @thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale.” Read more of what John Starr has to say:
When you use Safer, built by @Thorn, it’s not just about the number of images hashed and matches found—it’s about victims ID’d and children removed from harm. Here’s how our Director of Computer Vision approaches this daunting technical challenge:
Did you know you can use your existing @awscloud budget to purchase Safer and keep your platform free of child sexual abuse material? Protect your community, protect every child. Learn more about how to #GetSafer today:
Safer, built by @Thorn, began with an idea rooted in technology. Here's how @juliecordua convinced a seasoned product director to help build the first comprehensive solution for identifying, removing, and reporting #CSAM.
We’re the only all-in-one solution to detect, review & report CSAM. With our reporting tool, customers contribute to keeping kids safe online & provide intelligence that can help rescue child victims. In 2021, 30,339 reports were sent to @missingkids.
Clear trends emerged in @thorn’s latest research. Find actionable insights for #trustandsafety in our new Emerging Online Trends in Child Sexual Abuse 2023 report.
“I applaud @Flickr, the Flickr community, and every digital platform willing to join us in naming this atrocity. In doing so we are building a better, safer internet.” Read the blog post from @Thorn’s VP of Industry Sector Strategy:
Find it here, remove it everywhere. A single image match on your platform using Safer can end the revictimization of children whose abuse has been captured and shared across the internet. Let’s build a better internet — the one we deserve. #SaferTogether
Our customers detected 150,000+ images and videos of known child sexual abuse material (CSAM) in 2021. By detecting and removing CSAM, our customers are helping to build a better internet for everyone.
In 2022, we hashed more than 42.1 billion images and videos. With the largest database of verified hashes (32+ million) to match against, Safer can cast the widest net to detect known #CSAM.
Read our full impact report:
Hello TrustCon! Thank you to @tspainfo for hosting this amazing event. Our team is looking forward to learning from and sharing our expertise with this community.
Be sure to stop by and say hello.
#trustandsafety #trustcon #trustcon23
By taking a proactive approach to CSAM detection, platforms can remove this material from circulation before it does additional harm.
Here are 4 signs your platform needs proactive #CSAM detection.
In 2022, more than 60,000 reports of #CSAM were sent to @MissingKids via Safer. This provides necessary intelligence that can have a life-saving impact and lead to the rescue of child victims.
Let's demand a better internet. One where every tech platform is identifying, removing and reporting CSAM at scale. It’s possible with Safer. Let’s build the internet we deserve. #SaferTogether
Over four consecutive years, @Thorn monitored the perspectives of 9- to 17-year-olds. The research identified a sustained increase in SG-CSAM (self-generated child sexual abuse material). We share trends and recommend mitigations for SG-CSAM in our latest report.
Safer was built with humanity as a feature, not a bug. Here’s how wellness plays a central role in @thorn's newest product designed to eliminate child sexual abuse material (CSAM) from the internet. #getSafer
Behind every file of child sexual abuse material (CSAM) identified for removal is a platform committed to building a better, safer internet. The Safer community helped identify over 79K CSAM files in 2020.
Our customers detected more than 520,000 images and videos of known child sexual abuse material (CSAM) in 2022. By detecting and removing #CSAM, our customers are helping to build a better internet for everyone.
One recent image match from our product led to 21 abused children being removed from harm. If that’s the kind of impact you’re looking for in your career, apply to our Senior Software Engineer position today:
Recently, our very own John Starr, Vice President of Industry and GM of Safer, sat down with @tech_coalition to discuss child protection, transparency and how tools, such as Safer, are supporting the tech industry in combating #CSAM on the open web.
Eliminate child sexual abuse material from the internet. Your platform, your community, and your internet: Safer. Built by @Thorn for the internet we deserve. #getSafer
Wonderful news from @thorn that will play a huge role in ending the revictimisation of children through online child sexual abuse material. The "Safer" tool can provide tech companies with an invaluable resource to fight online abuse.
📣 We’re excited to announce the release of Safer Essential, our API-based solution for #CSAM detection that requires minimal engineering resources to set up. #childsafety #trustandsafety
Hashing and matching is the foundation of #CSAM detection. But that’s only part of the equation when it comes to protecting your platform and your users.
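For illustration only, here is a minimal Python sketch of that hash-and-match flow, assuming a vetted set of known hashes. It uses MD5 as a simple stand-in; production systems also rely on perceptual hashing so that resized or re-encoded copies of a known file still match. The hash values and directory name below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hex digests for known CSAM files; in practice this
# list comes from a vetted hash-sharing source, not hard-coded placeholders.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c7",
    "9e107d9d372bb6826bd81d3542a419d6",
}

def md5_digest(path: Path) -> str:
    """Compute the MD5 digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def match_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files whose digests appear in the known-hash set."""
    return [
        p for p in upload_dir.iterdir()
        if p.is_file() and md5_digest(p) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # "incoming_uploads" is a hypothetical staging directory.
    for hit in match_uploads(Path("incoming_uploads")):
        print(f"Match found; queue for review and reporting: {hit}")
```

A cryptographic hash like MD5 only catches exact byte-for-byte copies, which is why matching services pair it with perceptual hashes and, as other posts here describe, with classifiers that can surface previously unseen material.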
Today, @thorn will be in the Nonprofit Impact Lounge on the 3rd Floor of the Venetian. If you're at #AWSreinvent, we’re here until 4pm and looking forward to connecting with attendees interested in learning more about our mission. @AWS_Partners #APNproud
When we build features for Safer, we work closely with our customers to meet their needs. But we don’t stop there—we talk to researchers, experts and psychologists to ensure wellness is a core component of everything we build.
We’re thrilled to work with the Technology Coalition to ensure Safer gets into the hands of more platforms that need it. Together we will accelerate the identification, removal, and reporting of #CSAM at scale. Learn more here:
45 more companies reported #CSAM to @missingkids last year than the year before. We’re moving the needle, but we won’t end this epidemic until every platform with an upload button is proactively detecting CSAM. Now in AWS Marketplace. #GetSafer
If you are building, or know someone building, a user-generated content platform, please share this to help eliminate child sexual abuse material from the internet. Safer: Building the internet we deserve.
In 2021, we hashed more than 11 billion images and videos. We also tripled the number of hashes in our matching service. That brings the total hashes in our database to 18+ million, making it the largest hash set in the world for detecting CSAM.
Tomorrow at #AWSreInvent, Dr. Rebecca Portnoff, Head of Data Science at @thorn, will join Dr. Werner Vogels on stage during his keynote for a discussion of how machine learning can be leveraged to combat the online spread of #CSAM. @AWS_Partners #APNproud
If you’re planning to attend AWS re:Invent in Las Vegas next week, stop by to see us in the AWS Nonprofit Impact Lounge on the 3rd Floor of the Venetian. We’d love to say “hello.” @AWS_Partners #APNproud #AWSreInvent
“No matter how difficult the subject is, it’s even more important to address it.”
@Flickr’s Manager of Trust and Safety writes for @Thorn about how using Safer advances their goal to eliminate CSAM from the internet.
In the fight to eliminate child sexual abuse material from the internet, no one is doing it alone. We applaud the platforms adopting proactive CSAM detection practices to create safe spaces for their communities, their employees and the most vulnerable children.
We’ll win this battle by coming together as a tidal wave of advocates who demand an end to the online exploitation of children. Here’s why @MissingKids’ most recent update on CSAM reports is a good sign:
Safer user @vsco believes in proactively protecting the wellbeing of its global community of 200 million creators. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its community. Case study:
#safetybydesign advocates for safety built into each stage of the development process. Asking the right questions along the way can help prevent future harms. Learn more in our latest report.
If you’re planning to attend #TrustCon in San Francisco next week, stop by our table on July 12. We’d love to say “hello.”
Members from the @thorn team will also be presenting. Hope to see you there.
In 2022, using our machine learning classifiers, our customers classified 304,466 images and 15,238 videos as potential #CSAM. Using our Image and Video Classifiers empowers Safer customers to find previously unknown CSAM.
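As a rough sketch of that classifier workflow (not Safer’s actual model or API): a binary image classifier scores each upload, and anything above a review threshold is routed to human moderators rather than actioned automatically. The untrained placeholder model, threshold, and directory name below are hypothetical.

```python
import torch
import torch.nn as nn
from torchvision import transforms
from PIL import Image
from pathlib import Path

REVIEW_THRESHOLD = 0.8  # hypothetical; real deployments tune this carefully

# Untrained placeholder standing in for a model that outputs the probability
# an image is abusive material; it only demonstrates the routing logic.
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 224 * 224, 1),
    nn.Sigmoid(),
)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def score_image(path: Path) -> float:
    """Return the model's probability that an image needs human review."""
    img = Image.open(path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)  # shape: (1, 3, 224, 224)
    with torch.no_grad():
        return model(batch).item()

def route_uploads(upload_dir: Path) -> list[Path]:
    """Collect images whose score crosses the review threshold."""
    flagged = []
    for path in upload_dir.glob("*.jpg"):
        if score_image(path) >= REVIEW_THRESHOLD:
            flagged.append(path)  # send to a moderation queue, never auto-report
    return flagged
```

The point of the classifier step is reach: hash matching only finds files already known to a database, while a classifier can flag new material for a human to verify.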
Safer’s services run on your infrastructure. You select which services you need and control how they integrate with your existing systems and workflows.
Safer can be tailored to your #CSAM detection needs and grow with you as you scale.
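As one hedged illustration of what integrating with existing systems and workflows can look like in practice, with every name below hypothetical rather than Safer’s real interface: the platform’s upload handler posts each new file to a detection service hosted on its own infrastructure and queues flagged items for moderator review.

```python
import requests
from pathlib import Path
from queue import Queue

# Hypothetical endpoint for a self-hosted detection service; the URL and
# response schema are illustrative only.
DETECTION_URL = "http://localhost:8080/v1/scan"

moderation_queue: Queue = Queue()

def on_upload(file_path: Path, uploader_id: str) -> None:
    """Called by the platform's upload pipeline for each new file."""
    with file_path.open("rb") as f:
        resp = requests.post(DETECTION_URL, files={"file": f}, timeout=30)
    resp.raise_for_status()
    result = resp.json()  # e.g. {"match": true, "classifier_score": 0.93}

    if result.get("match") or result.get("classifier_score", 0.0) >= 0.8:
        # Route to human review; any reporting happens only after confirmation.
        moderation_queue.put({
            "path": str(file_path),
            "uploader": uploader_id,
            "detection": result,
        })
```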