Safer, Built by Thorn (@GetSaferio)
Followers: 616 · Following: 60 · Media: 100 · Statuses: 195
With Safer, @Thorn is equipping content-hosting platforms with industry-leading tools for proactive detection of child sexual abuse material. #CSAM
California · Joined May 2020
For @Flickr, Safer’s Image Classifier is a critical tool in their effort to detect novel #CSAM. The Classifier has empowered their team of fewer than 10 full-time #TrustAndSafety employees to have an outsized impact.
Safer, built by @Thorn, is now available in AWS Marketplace so companies of all sizes can easily access technology to detect CSAM. Let’s build the internet we deserve, together.
“We believe tech companies are key partners, and @thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale.” Read more of what John Starr has to say:
When you use Safer, built by @Thorn, it’s not just about the number of images hashed and matches found—it’s about victims ID’d and children removed from harm. Here’s how our Director of Computer Vision approaches this daunting technical challenge:
Safer, built by @Thorn, began with an idea rooted in technology. Here's how @juliecordua convinced a seasoned product director to help build the first comprehensive solution for identifying, removing, and reporting #CSAM.
We’re the only all-in-one solution to detect, review & report CSAM. With our reporting tool, customers contribute to keeping kids safe online & provide intelligence that can help rescue child victims. In 2021, 30,339 reports were sent to @missingkids
Clear trends emerged in @thorn’s latest research. Find actionable insights for #trustandsafety in our new Emerging Online Trends in Child Sexual Abuse 2023 report.
Find it here, remove it everywhere. A single image match on your platform using Safer can end the revictimization of children whose abuse has been captured and shared across the internet. Let’s build a better internet — the one we deserve. #SaferTogether
In 2022, we hashed more than 42.1 billion images and videos. With the largest database of verified hashes (32+ million) to match against, Safer can cast the widest net to detect known #CSAM. Read our full impact report:
Hello TrustCon! Thank you to @tspainfo for hosting this amazing event. Our team is looking forward to learning from, and sharing our expertise with, this community. Be sure to stop by and say hello. #trustandsafety #trustcon #trustcon23
By taking a proactive approach to CSAM detection, platforms can remove this material from circulation before it does additional harm. Here are 4 signs your platform needs proactive #CSAM detection.
In 2022, more than 60,000 reports of #CSAM were sent to @MissingKids via Safer. This provides necessary intelligence that can have a life-saving impact and lead to the rescue of child victims.
Let's demand a better internet. One where every tech platform is identifying, removing and reporting CSAM at scale. It’s possible with Safer. Let’s build the internet we deserve. #SaferTogether
It’s impossible to talk about #CSAM detection without talking about hashing and matching. We demystify this technology in our latest blog.
Hashing helps Safer identify #CSAM while protecting content moderators and survivors. Here's how:
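The core idea behind hash-and-match detection can be sketched in a few lines. This is a hypothetical toy using a cryptographic hash (SHA-256) against a placeholder hash set; production systems like Safer also rely on perceptual hashing and large databases of verified hashes, none of which are modeled here:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of the given bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Toy stand-in for a database of known hashes; real deployments match
# against tens of millions of verified entries.
KNOWN_HASHES = {sha256_hex(b"previously-verified-file-bytes")}

def matches_known_hash(file_bytes: bytes) -> bool:
    """Hash an upload and check it against the known-hash set.

    An exact match can be flagged without anyone viewing the content,
    which is how hashing protects moderators and survivors."""
    return sha256_hex(file_bytes) in KNOWN_HASHES
```

Because only digests are compared, the matching step never exposes the underlying imagery to human reviewers.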
Over four consecutive years, @Thorn monitored the perspectives of 9- to 17-year-olds. The research identified a sustained increase in SG-CSAM. We share trends and recommend mitigations for SG-CSAM in our latest report.
Our customers detected more than 520,000 images and videos of known child sexual abuse material (CSAM) in 2022. By detecting and removing #CSAM, our customers are helping to build a better internet for everyone.
If one of these signals applies to your platform, it’s worth taking a look at your company’s policies and procedures for handling #CSAM. Learn more:
Recently, our very own John Starr, Vice President of Industry and GM of Safer, sat down with @tech_coalition to discuss child protection, transparency and how tools, such as Safer, are supporting the tech industry in combating #CSAM on the open web.
#SaferTogether 🙏
Wonderful news from @thorn will play a huge role in ending the revictimisation of children through online child sexual abuse materials. The "Safer" tool can provide tech companies with an invaluable resource to fight online abuse.
📣 We’re excited to announce the release of Safer Essential, our API-based solution for #CSAM detection that requires minimal engineering resources to set up. #childsafety #trustandsafety
Hashing and matching is the foundation of #CSAM detection. But that’s only part of the equation when it comes to protecting your platform and your users.
Today, @thorn will be in the Nonprofit Impact Lounge on the 3rd Floor of the Venetian. If you're at #AWSreinvent, we’re here until 4pm and looking forward to connecting with attendees interested in learning more about our mission. @AWS_Partners #APNproud
As more people spend more time online than ever before, this #SaferInternetDay is especially important. #SID2021
We deserve a better internet, one where every child can be #SafeOnline. To date, @GetSaferio has identified well over 100,000 #CSAM files for removal from the open web. #Together, we can build the internet we deserve. #SID2021 #SaferInternetDay
We’re thrilled to work with the Technology Coalition to ensure Safer gets into the hands of more platforms that need it. Together we will accelerate the identification, removal, and reporting of #CSAM at scale. Learn more here:
45 more companies reported #CSAM to @missingkids last year than the year before. We’re moving the needle, but we won’t end this epidemic until every platform with an upload button is proactively detecting CSAM. Now in AWS Marketplace. #GetSafer
Safer is now available in AWS Marketplace, making it easier than ever before to fight the spread of #CSAM through technology. Learn more: #GetSafer #SaferTogether @awscloud
Ask us how your platform can #getSafer.
If you are building, or know someone building, a user-generated content platform, please share this to help eliminate child sexual abuse material from the internet. Safer: Building the internet we deserve.
Tomorrow at #AWSreInvent, Dr. Rebecca Portnoff, Head of Data Science at @thorn, will join Dr. Werner Vogels on stage during his keynote for a discussion of how machine learning can be leveraged to combat the online spread of #CSAM. @AWS_Partners #APNproud
Every platform for every child: Let’s build the internet we deserve, together. #GetSafer #SaferTogether
If you’re planning to attend AWS re:Invent in Las Vegas next week, stop by to see us in the AWS Nonprofit Impact Lounge on the 3rd Floor of the Venetian. We’d love to say “hello.” @AWS_Partners #APNproud #AWSreInvent
In the fight to eliminate child sexual abuse material from the internet, no one is doing it alone. We applaud the platforms adopting proactive CSAM detection practices to create safe spaces for their communities, their employees and the most vulnerable children.
We’ll win this battle by coming together as a tidal wave of advocates who demand an end to the online exploitation of children. Here’s why @MissingKids' most recent update on CSAM reports is a good sign:
Safer user @vsco believes in proactively protecting the wellbeing of its global community of 200 million creators. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its community. Case study:
#safetybydesign advocates for safety built into each stage of the development process. Asking the right questions along the way can help prevent future harms. Learn more in our latest report.
@patrickokeefe @communitysignal Hi there - please shoot us an email with the details at pr@wearethorn.org and someone will get back to you. Thanks!
In 2022, using our machine learning classifiers, our customers classified 304,466 images and 15,238 videos as potential #CSAM. Using our Image and Video Classifiers empowers Safer customers to find previously unknown CSAM.
Take the first steps in proactive #CSAM detection with Safer Essential, Thorn’s new API-based solution. Compare Safer Essential vs. Safer Enterprise:
Safer’s services run on your infrastructure. You select which services you need and control how they integrate with your existing systems and workflows. Safer can be tailored to your #CSAM detection needs and grow with you as you scale.