Deepfakes
We've curated 72 cybersecurity statistics about deepfakes to help you understand how this evolving technology is impacting identity theft, misinformation, and online scams in 2025. Stay informed to protect yourself from these emerging threats!
99% of organizations say that deepfake defense will be important to their cybersecurity strategies over the next 12-18 months.
Only 37% of organizations said they are already investing in deepfake defense.
Email-based attacks account for 59% of deepfake threat vectors.
Over 5% of organizations targeted by deepfake-related incidents have lost $1 million or more.
1% of organizations said they had no plans to invest in deepfake defense.
30% of schools have experienced students creating harmful AI content (deepfakes of peers, etc.).
22% of respondents are concerned about targeted deepfakes.
In Singapore, 39% of respondents reported a high level of concern about deepfake impersonation.
54% of schools have not experienced students creating harmful AI content (deepfakes of peers, etc.).
16% of schools are not sure if they have experienced students creating harmful AI content (deepfakes of peers, etc.).
52% of faculty are concerned about deepfake impersonation of staff/students due to AI.
62% of students are confident in their ability to spot a deepfake.
59% of organizations report that employees find it increasingly difficult to distinguish real content from fake.
38% of organizations admit to being underprepared for AI-driven social engineering threats such as automated attacks, deepfake-based videos, and voice scams.
70% of students have seen a deepfake in the last 6 months.
32% of organizations reported being prepared for deepfake and synthetic identity attacks.
Higher-quality deepfake audio synthesis services typically cost upwards of $1,000 a month, but many platforms offer decent output starting at just $5.
The first deepfake audio-enabled CEO scam occurred in 2019.
Many audio deepfake services offer one-shot voice generation, requiring only a few seconds of source audio.
In January 2025, a threat actor offered on-demand deepfake image creation services for defeating KYC checks, citing tools such as DeepFaceLive, DeepFaceLab, and AI Voice Changer, and claimed to have bypassed the KYC checks of two different cryptocurrency exchanges.