
Content provided by ITSPmagazine, Sean Martin, and Marco Ciappelli. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by ITSPmagazine, Sean Martin, and Marco Ciappelli or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://vi.player.fm/legal.

Hacking Deepfake Image Detection System with White and Black Box Attacks | A SecTor Cybersecurity Conference Toronto 2024 Conversation with Sagar Bhure | On Location Coverage with Sean Martin and Marco Ciappelli

22:40
 
 

Manage episode 442828901 series 2972571

Guest: Sagar Bhure, Senior Security Researcher, F5 [@F5]

On LinkedIn | https://www.linkedin.com/in/sagarbhure/

At SecTor | https://www.blackhat.com/sector/2024/briefings/schedule/speakers.html#sagar-bhure-45119

____________________________

Hosts:

Sean Martin, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining CyberSecurity Podcast [@RedefiningCyber]

On ITSPmagazine | https://www.itspmagazine.com/sean-martin

Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli

____________________________

Episode Notes

The authenticity of audio and visual media has become an increasingly significant concern. This episode explores the issue with insights from hosts Sean Martin and Marco Ciappelli and guest Sagar Bhure, a senior security researcher at F5.

Sean Martin and Marco Ciappelli engage with Bhure to discuss the challenges and potential solutions related to deepfake technology. Bhure reveals intricate details about the creation and detection of deepfake images and videos. He emphasizes the constant battle between creators of deepfakes and those developing detection tools.

The conversation highlights several alarming instances in which deepfakes have been used maliciously. Bhure recounts a 2020 case in which a 17-year-old student fooled Twitter’s verification system with an AI-generated image of a non-existent political candidate. In another incident, a Hong Kong firm lost $20 million to a deepfake video impersonating its CFO on a Zoom call. These examples underline the serious implications of deepfake technology for misinformation and financial fraud.

One core discussion point centers on the challenge of distinguishing real from artificial content. Bhure explains that advances in AI and hardware make it increasingly difficult for the naked eye to tell genuine images from fake ones. Despite this, he notes that algorithms focusing on minute details such as skin textures, mouth movements, and audio sync can still identify deepfakes with varying degrees of success.
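As a rough illustration of the kind of signal-level cue such algorithms can use (a sketch for context, not a method described in the episode): generative upsampling often leaves unusual high-frequency patterns in an image's spectrum, so one classical heuristic measures how much spectral energy sits outside a central low-frequency band. The function names, the `cutoff_frac` band, and the `threshold` are illustrative assumptions; a production detector would be a trained model, not a fixed cutoff.

```python
import numpy as np

def high_freq_energy_ratio(gray: np.ndarray, cutoff_frac: float = 0.25) -> float:
    """Fraction of an image's spectral energy outside a central low-frequency band.

    GAN-style upsampling can leave periodic high-frequency artifacts, so an
    unusual ratio is one cheap cue for flagging an image for closer inspection.
    """
    # Power spectrum with the zero-frequency (DC) component shifted to the center.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    ry, rx = int(h * cutoff_frac), int(w * cutoff_frac)
    total = spectrum.sum()
    low = spectrum[cy - ry:cy + ry, cx - rx:cx + rx].sum()
    return float((total - low) / total)

def looks_suspicious(gray: np.ndarray, threshold: float = 0.5) -> bool:
    # Purely illustrative fixed threshold; real detectors learn this boundary.
    return high_freq_energy_ratio(gray) > threshold
```

A perfectly flat image puts all its energy at the DC component (ratio near 0), while noisy or artifact-heavy content spreads energy into high frequencies, which is why real systems combine many such cues rather than relying on one.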

Marco Ciappelli raises the pertinent issue of how effective detection mechanisms can be integrated into social media platforms like Twitter, Facebook, and Instagram. Bhure suggests a 'secure by design' approach, advocating pre-upload verification of media content. He argues that generative AI should be regulated to prevent misuse, while recognizing that artificially generated content also has beneficial applications.

The discussion shifts towards audio deepfakes, highlighting the complexity of their detection. According to Bhure, combining visual and audio detection can improve accuracy. He describes a potential method for audio verification, which involves profiling an individual’s voice over an extended period to identify any anomalies in future interactions.
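The voice-profiling idea can be sketched roughly as follows (an assumption-laden illustration, not Bhure's implementation): average speaker embeddings collected from known-genuine calls into a profile, then flag a new call whose embedding drifts too far from it. The embedding vectors are treated as given (a real system would extract them with a speaker-verification model), and the `0.7` cosine-similarity threshold is a made-up placeholder.

```python
import numpy as np

def enroll(embeddings: np.ndarray) -> np.ndarray:
    """Average several per-call voice embeddings into one unit-length profile."""
    profile = embeddings.mean(axis=0)
    return profile / np.linalg.norm(profile)

def is_anomalous(profile: np.ndarray, new_embedding: np.ndarray,
                 threshold: float = 0.7) -> bool:
    """Flag a call whose voice embedding is too dissimilar from the profile."""
    v = new_embedding / np.linalg.norm(new_embedding)
    similarity = float(profile @ v)  # cosine similarity of unit vectors
    return similarity < threshold
```

Profiling over an extended period, as Bhure describes, corresponds to enrolling many embeddings so the profile captures a speaker's natural variation before anomalies are flagged.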

Businesses are not immune to the threat of deepfakes. Bhure notes that corporate sectors, especially media outlets, financial institutions, and any industry relying on digital communication, must stay vigilant. He warns that deepfake technology can be weaponized to bypass security measures, perpetuate misinformation, and carry out sophisticated phishing attacks.

As technology forges ahead, Bhure calls for continuous improvement in detection techniques and the development of robust systems to mitigate the risks associated with deepfakes. He points to his upcoming session at SecTor in Toronto, 'Hacking Deepfake Image Detection System with White and Black Box Attacks,' where he will offer more comprehensive insights into combating this pressing issue.

____________________________

This Episode’s Sponsors

HITRUST: https://itspm.ag/itsphitweb

____________________________

Follow our SecTor Cybersecurity Conference Toronto 2024 coverage: https://www.itspmagazine.com/sector-cybersecurity-conference-2024-cybersecurity-event-coverage-in-toronto-canada

On YouTube: 📺 https://www.youtube.com/playlist?list=PLnYu0psdcllSCvf6o-K0forAXxj2P190S

Be sure to share and subscribe!

____________________________

Resources

Hacking Deepfake Image Detection System with White and Black Box Attacks: https://www.blackhat.com/sector/2024/briefings/schedule/#hacking-deepfake-image-detection-system-with-white-and-black-box-attacks-40909

Learn more about SecTor Cybersecurity Conference Toronto 2024: https://www.blackhat.com/sector/2024/index.html

____________________________

Catch all of our event coverage: https://www.itspmagazine.com/technology-cybersecurity-society-humanity-conference-and-event-coverage

Are you interested in sponsoring our event coverage with an ad placement in the podcast?

Learn More 👉 https://itspm.ag/podadplc

Want to tell your Brand Story as part of our event coverage?

Learn More 👉 https://itspm.ag/evtcovbrf

To see and hear more Redefining CyberSecurity content on ITSPmagazine, visit: https://www.itspmagazine.com/redefining-cybersecurity-podcast

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast


621 episodes
