75% of employees are already using AI tools.
The use of unauthorized AI tools among this group is
While organizations secure the front door, employees are leaving the back door wide open.
Find out why Shadow AI is the new reality.
Click-through rates for phishing emails are rising due to AI.
Clicks on AI-generated phishing emails have increased sharply since 2023.
The Halo Effect Trap
In 56% of cases, first impressions trump actual safety.
Banning ChatGPT and other AI tools is counterproductive. In our latest trend report, we explore the psychology behind “Shadow AI” and reveal why employees are circumventing the rules en masse.
89% know the rules, but 54% don’t follow them. Find out why knowledge alone isn’t enough.
More than half of your colleagues mistakenly believe that a “nice-looking” website is also a secure website.
Learn how to turn employees’ curiosity into safe innovation through AI literacy.
Comparison between Traditional Shadow IT and Shadow AI
Practical guidance for the CISO and HR
Managing AI often feels like trying to bail out a sinking ship: 75% of employees use AI tools, yet IT has visibility into less than 11% of them. How do you tackle this? In this report, we explain how to translate these insights directly into a strong business case, more focused policies, and a resilient organizational culture.
Compare your organization to the market: did you know that AI-powered phishing attacks are five times more successful than they were two years ago?
Use our analysis of 33,690 respondents to show senior management that AI awareness is urgently needed in a world where 75% of employees are already using AI tools.
Learn how to effectively address the blind spot of “free” and unauthorized AI tools by focusing on AI literacy.
Employees are experimenting faster than your policy allows. Find out how to stay in control without stifling innovation.
Safe adoption starts with behavior. Learn how to embed critical thinking and digital literacy into your culture.
Frequently Asked Questions About Shadow AI and Security Awareness Training
Shadow AI refers to the unauthorized use of AI tools for work-related purposes. While traditional Shadow IT is often limited to storage services, AI involves the processing of intellectual property and sensitive business data. The risks are growing rapidly: the click-through rate for AI-related phishing has increased fivefold in just two years.
Traditional training programs often focus on theoretical knowledge, even though 89% of employees already know the rules. However, there is a gap between knowing and doing: 56% of professionals are influenced by the Halo effect, whereby a professional-looking interface is mistakenly perceived as secure. Effective security therefore requires a psychological approach focused on behavioral change.
Blocking AI tools is counterproductive and fosters a shadow culture in which risks become invisible. Furthermore, access to AI influences talent’s choice of employer. The solution lies in facilitating safe use by increasing AI literacy, so that employees learn to assess risks independently.
The report analyzes over 100,000 data points from various sectors, including healthcare, business services, and government. These quantitative insights provide professionals with a foundation for securing budgets and building support for comprehensive behavioral programs. It translates complex security challenges into concrete actions for a resilient digital culture.
The focus must shift from strict control to human resilience and digital agility. This involves bridging the knowledge-action gap by teaching employees how to classify data correctly and independently assess the reliability of tools. Organizations must invest in this agility to turn Shadow AI into a safe and productive force for innovation.
Social Psychologist