Safer, Built by Thorn

@GetSaferio

Followers: 670 · Following: 52 · Media: 91 · Statuses: 181

With Safer, @Thorn is equipping content-hosting platforms with industry-leading tools for proactive detection of child sexual abuse material (#CSAM).

California
Joined May 2020
@GetSaferio
Safer, Built by Thorn
2 years
For @Flickr, Safer’s Image Classifier is a critical tool in their effort to detect novel #CSAM. Use of the Classifier has empowered their team of fewer than 10 full-time #TrustAndSafety employees to have a huge impact.
@GetSaferio
Safer, Built by Thorn
1 year
Minors report experiencing a variety of risky online interactions with adults, including cold solicitations, isolation attempts, and attempts to “befriend and manipulate” them.
@GetSaferio
Safer, Built by Thorn
4 years
100,000 images of child sexual abuse removed from the open web in beta, and we're just getting started. Today @Thorn launches its first commercial product. Today the internet is Safer. #getSafer
@GetSaferio
Safer, Built by Thorn
2 years
We’re pleased to announce that Safer’s Reporting Service now includes the option to send reports to the Royal Canadian Mounted Police. Adding #RCMP reporting is a major step toward our goal of eliminating #CSAM from the internet.
@GetSaferio
Safer, Built by Thorn
3 years
Safer, built by @Thorn, is now available in AWS Marketplace so companies of all sizes can easily access technology to detect CSAM. Let’s build the internet we deserve, together.
@GetSaferio
Safer, Built by Thorn
1 year
“We believe tech companies are key partners, and @thorn is committed to empowering the tech industry with tools and resources to combat child sexual abuse at scale.” Read more of what John Starr has to say:
@GetSaferio
Safer, Built by Thorn
4 years
When you use Safer, built by @Thorn, it’s not just about the number of images hashed and matches found; it’s about victims identified and children removed from harm. Here’s how our Director of Computer Vision approaches this daunting technical challenge:
@GetSaferio
Safer, Built by Thorn
3 years
Did you know you can use your existing @awscloud budget to purchase Safer and keep your platform free of child sexual abuse material? Protect your community, protect every child. Learn more about how to #GetSafer today:
@GetSaferio
Safer, Built by Thorn
4 years
Safer, built by @Thorn, began with an idea grounded in technology. Here's how @juliecordua convinced a seasoned product director to help build the first comprehensive solution for identifying, removing, and reporting #CSAM.
@GetSaferio
Safer, Built by Thorn
2 years
We’re the only all-in-one solution to detect, review & report CSAM. With our reporting tool, customers contribute to keeping kids safe online & provide intelligence that can help rescue child victims. In 2021, 30,339 reports were sent to @missingkids
@GetSaferio
Safer, Built by Thorn
3 years
“I applaud @Flickr, the Flickr community, and every digital platform willing to join us in naming this atrocity. In doing so we are building a better, safer internet.” Read the @Thorn VP of Industry Sector Strategy’s blog post:
@GetSaferio
Safer, Built by Thorn
4 years
Find it here, remove it everywhere. A single image match on your platform using Safer can end the revictimization of children whose abuse has been captured and shared across the internet. Let’s build a better internet — the one we deserve. #SaferTogether
@GetSaferio
Safer, Built by Thorn
2 years
Our customers detected 150,000+ images and videos of known child sexual abuse material (CSAM) in 2021. By detecting and removing CSAM, our customers are helping to build a better internet for everyone.
@GetSaferio
Safer, Built by Thorn
1 year
In 2022, we hashed more than 42.1 billion images and videos. With the largest database of verified hashes (32+ million) to match against, Safer can cast the widest net to detect known #CSAM. Read our full impact report:
@GetSaferio
Safer, Built by Thorn
1 year
Hello TrustCon! Thank you to @tspainfo for hosting this amazing event. Our team is looking forward to learning from and sharing our expertise with this community. Be sure to stop by and say hello. #trustandsafety #trustcon #trustcon23
@GetSaferio
Safer, Built by Thorn
3 years
By taking a proactive approach to CSAM detection, platforms can remove this material from circulation before it does additional harm. Here are 4 signs your platform needs proactive #CSAM detection.
@GetSaferio
Safer, Built by Thorn
2 years
In 2022, more than 60,000 reports of #CSAM were sent to @MissingKids via Safer. This provides necessary intelligence that can have a life-saving impact and lead to the rescue of child victims.
@GetSaferio
Safer, Built by Thorn
4 years
Let's demand a better internet. One where every tech platform is identifying, removing and reporting CSAM at scale. It’s possible with Safer. Let’s build the internet we deserve. #SaferTogether
@GetSaferio
Safer, Built by Thorn
1 year
It’s impossible to talk about #CSAM detection without talking about hashing and matching. We demystify this technology in our latest blog.
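For readers unfamiliar with the technique these posts keep referencing, here is a minimal Python sketch of the hash-and-match idea: compute a fingerprint of an uploaded file and check it against a list of hashes of known, verified material. This is a generic illustration with placeholder values and hypothetical function names, not Safer's actual implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list of known, verified CSAM supplied by a trusted
# source (placeholder value only; real lists contain millions of hashes).
KNOWN_CSAM_SHA256 = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: Path) -> str:
    """Hash the file in 1 MB chunks so large uploads never sit fully in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Exact match against the hash list; no one has to view the content itself."""
    return sha256_of_file(path) in KNOWN_CSAM_SHA256
```

An exact cryptographic hash like SHA-256 only catches byte-identical copies; matching systems typically pair it with perceptual hashes, which are compared by distance so that re-encoded or lightly altered copies of known material still match.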
@GetSaferio
Safer, Built by Thorn
4 years
Hashing helps Safer identify #CSAM while protecting content moderators and survivors. Here's how:
@GetSaferio
Safer, Built by Thorn
4 years
Platforms can now purchase Safer directly from the #AWS marketplace. Today we're another step closer to the internet we deserve. @awscloud
@GetSaferio
Safer, Built by Thorn
1 year
Over four consecutive years, @Thorn monitored the perspectives of 9- to 17-year-olds. The research identified a sustained increase in SG-CSAM (self-generated child sexual abuse material). We share trends and recommend mitigations for SG-CSAM in our latest report.
@GetSaferio
Safer, Built by Thorn
4 years
Safer was built with humanity as a feature, not a bug. Here’s how wellness plays a central role in @thorn’s newest product designed to eliminate child sexual abuse material (CSAM) from the internet. #getSafer
@GetSaferio
Safer, Built by Thorn
4 years
Behind every file of child sexual abuse material (CSAM) identified for removal is a platform committed to building a better, safer internet. The Safer community helped identify over 79K CSAM files for removal from the internet in 2020.
@GetSaferio
Safer, Built by Thorn
1 year
Our customers detected more than 520,000 images and videos of known child sexual abuse material (CSAM) in 2022. By detecting and removing #CSAM, our customers are helping to build a better internet for everyone.
@GetSaferio
Safer, Built by Thorn
3 years
If one of these signals applies to your platform, it’s worth taking a look at your company’s policies and procedures for handling #CSAM. Learn more:
@GetSaferio
Safer, Built by Thorn
4 years
One recent image match from our product led to 21 abused children being removed from harm. If that’s the kind of impact you’re looking for in your career, apply to our Senior Software Engineer position today:
@GetSaferio
Safer, Built by Thorn
4 years
Hashing helps Safer identify CSAM while protecting content moderators and survivors. Here's how:
@GetSaferio
Safer, Built by Thorn
2 years
Recently, our very own John Starr, Vice President of Industry and GM of Safer, sat down with @tech_coalition to discuss child protection, transparency and how tools, such as Safer, are supporting the tech industry in combating #CSAM on the open web.
@GetSaferio
Safer, Built by Thorn
4 years
Eliminate child sexual abuse material from the internet. Your platform, your community, and your internet: Safer. Built by @Thorn for the internet we deserve. #getSafer
@GetSaferio
Safer, Built by Thorn
4 years
@joannashields
Joanna Shields
4 years
Wonderful news from @thorn that will play a huge role in ending the revictimisation of children through online child sexual abuse materials. The "Safer" tool can provide tech companies with an invaluable resource to fight online abuse.
@GetSaferio
Safer, Built by Thorn
11 months
📣 We’re excited to announce the release of Safer Essential, our API-based solution for #CSAM detection that requires minimal engineering resources to set up. #childsafety #trustandsafety
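As a rough illustration of what an API-based detection integration can look like at upload time, here is a hedged Python sketch. The endpoint URL, authentication header, request fields, and response shape are all assumptions made for this example; they are not Safer Essential's documented API.

```python
import requests

# Purely illustrative: the endpoint, auth scheme, and field names below are
# assumptions for this sketch, not Safer Essential's documented API.
API_URL = "https://api.example.com/v1/scan"   # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                      # hypothetical credential

def scan_upload(file_bytes: bytes, content_id: str) -> dict:
    """Send a newly uploaded file for hash-based CSAM detection."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"file": (content_id, file_bytes)},
        data={"content_id": content_id},
        timeout=30,
    )
    response.raise_for_status()
    # Assumed response shape, e.g. {"content_id": "...", "match": true}
    return response.json()
```

In a real integration, a call like this would typically sit inside the upload pipeline, with any match routed to a moderation queue rather than acted on automatically.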
@GetSaferio
Safer, Built by Thorn
1 year
Hashing and matching is the foundation of #CSAM detection. But that’s only part of the equation when it comes to protecting your platform and your users.
@GetSaferio
Safer, Built by Thorn
10 months
Today, @thorn will be in the Nonprofit Impact Lounge on the 3rd Floor of the Venetian. If you're at #AWSreinvent, we’re here until 4pm and looking forward to connecting with attendees interested in learning more about our mission. @AWS_Partners #APNproud
@GetSaferio
Safer, Built by Thorn
4 years
With more people spending more time online than ever before, this #SaferInternetDay matters more than ever. #SID2021
@thorn
Thorn
4 years
We deserve a better internet, one where every child can be #SafeOnline. To date, @GetSaferio has identified well over 100,000 #CSAM files for removal from the open web. #Together, we can build the internet we deserve. #SSID2021 #SaferInternetDay
@GetSaferio
Safer, Built by Thorn
4 years
When we build features for Safer, we work closely with our customers to meet their needs. But we don’t stop there—we talk to researchers, experts and psychologists to ensure wellness is a core component of everything we build.
@GetSaferio
Safer, Built by Thorn
4 years
We’re thrilled to work with the Technology Coalition to ensure Safer gets into the hands of more platforms that need it. Together we will accelerate the identification, removal, and reporting of #CSAM at scale. Learn more here:
@GetSaferio
Safer, Built by Thorn
3 years
45 more companies reported #CSAM to @missingkids last year than the year before. We’re moving the needle, but we won’t end this epidemic until every platform with an upload button is proactively detecting CSAM. Now in AWS Marketplace. #GetSafer
@GetSaferio
Safer, Built by Thorn
3 years
Safer is now available in AWS Marketplace, making it easier than ever before to fight the spread of #CSAM through technology. Learn more: #GetSafer #SaferTogether @awscloud
@GetSaferio
Safer, Built by Thorn
3 years
No. 3: Your Trust & Safety team is inundated with work. Learn more about the signs your platform needs proactive #CSAM detection. #GetSafer
@GetSaferio
Safer, Built by Thorn
4 years
Ask us how your platform can #getSafer.
@aplusk
ashton kutcher
4 years
If you are building, or know someone building, a user-generated content platform, please share this to help eliminate child sexual abuse material from the internet. Safer: Building the internet we deserve.
@GetSaferio
Safer, Built by Thorn
2 years
In 2021, we hashed more than 11 billion images and videos. We also tripled the number of hashes in our matching service. That brings the total hashes in our database to 18+ million, making it the largest hash set in the world for detecting CSAM.
@GetSaferio
Safer, Built by Thorn
10 months
Tomorrow at #AWSreInvent, Dr. Rebecca Portnoff, Head of Data Science at @thorn, will join Dr. Werner Vogels on stage during his keynote to discuss how machine learning can be leveraged to combat the online spread of #CSAM. @AWS_Partners #APNproud
@GetSaferio
Safer, Built by Thorn
3 years
Every platform for every child: Let’s build the internet we deserve, together. #GetSafer #SaferTogether
@GetSaferio
Safer, Built by Thorn
10 months
If you’re planning to attend AWS re:Invent in Las Vegas next week, stop by to see us in the AWS Nonprofit Impact Lounge on the 3rd Floor of the Venetian. We’d love to say “hello.” @AWS_Partners #APNproud #AWSreInvent
@GetSaferio
Safer, Built by Thorn
3 years
“No matter how difficult the subject is, it’s even more important to address it.” @Flickr’s Manager of Trust and Safety writes for @Thorn about how using Safer advances their goal to eliminate CSAM from the internet.
@GetSaferio
Safer, Built by Thorn
4 years
In the fight to eliminate child sexual abuse material from the internet, no one is doing it alone. We applaud the platforms adopting proactive CSAM detection practices to create safe spaces for their communities, their employees and the most vulnerable children.
@thorn
Thorn
4 years
We’ll win this battle by coming together as a tidal wave of advocates who demand an end to the online exploitation of children. Here’s why @MissingKids' most recent update on CSAM reports is a good sign:
@GetSaferio
Safer, Built by Thorn
11 months
Safer user @vsco believes in proactively protecting the wellbeing of its global community of 200 million creators. Deploying Safer helped VSCO deliver on its promise of being a trusted platform and providing a safe experience for its community. Case study:
@GetSaferio
Safer, Built by Thorn
1 year
#safetybydesign advocates for building safety into each stage of the development process. Asking the right questions along the way can help prevent future harms. Learn more in our latest report.
@GetSaferio
Safer, Built by Thorn
1 year
If you’re planning to attend #TrustCon in San Francisco next week, stop by our table on July 12. We’d love to say “hello.” Members from the @thorn team will also be presenting. Hope to see you there.
@GetSaferio
Safer, Built by Thorn
4 years
@patrickokeefe @communitysignal Hi there, please shoot us an email with the details at pr@wearethorn.org and someone will get back to you. Thanks!
@GetSaferio
Safer, Built by Thorn
1 year
In 2022, using our machine learning classifiers, our customers classified 304,466 images and 15,238 videos as potential #CSAM. Using our Image and Video Classifiers empowers Safer customers to find previously unknown CSAM.
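The classifiers mentioned above flag content as potential CSAM by assigning it a score; what a platform does with that score is a policy decision. The Python sketch below shows one common pattern, thresholding scores to route content to trained human reviewers. The threshold values and names are illustrative assumptions, not Safer's defaults.

```python
from dataclasses import dataclass

# Illustrative thresholds; a real deployment tunes these against its own
# precision/recall needs, moderation capacity, and legal obligations.
REVIEW_THRESHOLD = 0.70    # queue for trained human review
ESCALATE_THRESHOLD = 0.95  # fast-track review and potential reporting

@dataclass
class ClassifierResult:
    content_id: str
    score: float  # model-estimated likelihood the file is CSAM

def route(result: ClassifierResult) -> str:
    """Classifiers flag potential CSAM; a human confirms before any report is filed."""
    if result.score >= ESCALATE_THRESHOLD:
        return "escalate"
    if result.score >= REVIEW_THRESHOLD:
        return "review"
    return "no_action"
```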
@GetSaferio
Safer, Built by Thorn
4 years
Hosted on #AWS? Find Safer in the @awscloud marketplace to protect your platform from #CSAM. Let’s build a better internet.
@GetSaferio
Safer, Built by Thorn
11 months
Take the first steps in proactive #CSAM detection with Safer Essential, Thorn’s new API-based solution. Compare Safer Essential vs. Safer Enterprise:
@GetSaferio
Safer, Built by Thorn
2 years
Safer’s services run on your infrastructure. You select which services you need and control how they integrate with your existing systems and workflows. Safer can be tailored to your #CSAM detection needs and grow with you as you scale.
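To make the "select which services you need" idea concrete, here is a hypothetical deployment configuration expressed as a Python dictionary. Every key, service name, and identifier is an assumption made for illustration; Safer's real configuration format is not documented in these posts.

```python
# Hypothetical deployment configuration: every key, service name, and
# identifier here is illustrative, not Safer's actual configuration format.
SAFER_DEPLOYMENT = {
    "services": {
        "hashing": True,            # hash new uploads
        "matching": True,           # compare hashes against known-CSAM lists
        "image_classifier": True,   # flag potentially novel CSAM in images
        "video_classifier": False,  # can be enabled later as volume grows
        "review_tool": True,        # queue flagged content for moderators
        "reporting": True,          # e.g., NCMEC / RCMP reporting workflows
    },
    "infrastructure": {
        "region": "us-west-2",              # runs in the platform's own cloud account
        "upload_queue": "uploads-to-scan",  # hook into the existing upload pipeline
        "results_topic": "moderation-events",
    },
}
```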