An Investigation into Self-Generated Child Sexual Abuse Material Networks on Social Media

39:24
 

Stanford’s Evelyn Douek and Alex Stamos are joined by Stanford Internet Observatory (SIO) Research Manager Renée DiResta and Chief Technologist David Thiel to discuss a new report on a months-long investigation into the distribution of illicit sexual content by minors online.

Large Networks of Minors Appear to be Selling Illicit Sexual Content Online

The Stanford Internet Observatory (SIO) published a report last week with findings from a months-long investigation into the distribution of illicit sexual content by minors online. The SIO research team identified a large network of accounts claiming to be minors, likely teenagers, who are producing, marketing and selling their own explicit content on social media.

A tip from The Wall Street Journal informed the investigation with a list of common terms and hashtags indicating the sale of “self-generated child sexual abuse material” (SG-CSAM). SIO identified a network of more than 500 accounts advertising SG-CSAM with tens of thousands of likely buyers.
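
For readers less familiar with this kind of trust and safety research, the sketch below illustrates, in deliberately simplified form, what hashtag-seeded discovery over public data can look like: start from seed hashtags, flag accounts that advertise with them, and treat accounts interacting with those posts as likely buyers. The data model, field names, and logic are hypothetical placeholders for illustration only, not the SIO team's actual tooling or any platform's API.

    # Hypothetical sketch of hashtag-seeded network discovery over public posts.
    # The PublicPost record and its fields are illustrative placeholders, not a
    # real platform API; the report's actual methodology is far more involved.
    from dataclasses import dataclass, field

    @dataclass
    class PublicPost:
        author: str                 # handle of the account that posted
        hashtags: set[str]          # hashtags attached to the post
        interactors: set[str] = field(default_factory=set)  # accounts that liked or replied

    def discover_network(posts: list[PublicPost], seed_hashtags: set[str]):
        """Return (advertising_accounts, likely_buyers) using public data only."""
        sellers: set[str] = set()
        buyers: set[str] = set()
        for post in posts:
            # An account is flagged as advertising if a post uses any seed hashtag.
            if post.hashtags & seed_hashtags:
                sellers.add(post.author)
                # Accounts engaging with advertising posts are counted as likely buyers.
                buyers.update(post.interactors)
        return sellers, buyers - sellers

    # Toy example: one advertising account, two interacting accounts.
    posts = [PublicPost("acct_a", {"tag_x"}, {"acct_b", "acct_c"})]
    print(discover_network(posts, {"tag_x"}))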

Using only public data, this research uncovered, and helped resolve, basic safety failings in Instagram’s reporting system for accounts suspected of child exploitation and in Twitter’s system for automatically detecting and removing known CSAM.
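
Automatic detection of known CSAM, mentioned above, generally works by matching uploaded media against hash lists of previously identified material. The snippet below is a deliberately simplified, hypothetical illustration of that mechanism, not Twitter’s or any platform’s actual pipeline: production systems rely on perceptual hashes such as PhotoDNA, which tolerate resizing and re-encoding, rather than the exact cryptographic hash used here to keep the example self-contained.

    # Simplified, hypothetical illustration of hash-list matching for known material.
    # Real deployments use perceptual hashing (e.g., PhotoDNA) and hash lists shared
    # by clearinghouses such as NCMEC; an exact SHA-256 match is a stand-in here.
    import hashlib

    # Placeholder blocklist of hex digests of previously identified files.
    KNOWN_HASHES = {"d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"}

    def should_block(file_bytes: bytes) -> bool:
        """Return True if the upload matches a known hash and should be removed."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

    print(should_block(b"example upload"))  # False: not on the placeholder blocklist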

Most of the work to address CSAM has focused on adult offenders who create the majority of content. These findings highlight the need for new countermeasures developed by industry, law enforcement and policymakers to address sextortion and the sale of illicit content that minors create themselves.

Front-Page Wall Street Journal Coverage

  • A Wall Street Journal article first covered Twitter’s lapse in safety measures meant to prevent known CSAM from appearing on the site, and the importance of researcher access to public social media data for identifying and helping address such issues. - Alexa Corse / The Wall Street Journal
  • Instagram was the focus of a larger Wall Street Journal investigation, based in part on SIO’s research findings. The app is currently the most significant platform for these CSAM networks, connecting young sellers with buyers through recommendation features, hashtag search, and direct messaging. - Jeff Horwitz, Katherine Blunt / The Wall Street Journal

Bipartisan Concern and Calls for Social Media Regulation

The investigation sparked outrage across the aisle in the U.S. and grabbed the attention of the European Commission as the European Union prepares to enforce the Digital Services Act for the largest online platforms later this summer.

  • Thierry Breton, the EU’s internal market commissioner, announced that he will meet with Meta CEO Mark Zuckerberg later this month at the company’s Menlo Park headquarters to discuss the report and demand that the company take action.

In Congress, House Energy and Commerce Committee Democrats and Republican senators were the most outspoken about taking action to address the troubling findings.

Join the conversation and connect with Evelyn and Alex on Twitter at @evelyndouek and @alexstamos.

Moderated Content is produced in partnership by Stanford Law School and the Cyber Policy Center. Special thanks to John Perrino for research and editorial assistance.

Like what you heard? Don’t forget to subscribe and share the podcast with friends!
