What is Shadow AI?

‘Shadow AI’ refers to the use of artificial intelligence (such as ChatGPT or other smart tools) by employees without the organisation's knowledge or approval (IBM, 2025). This is not a sign of bad faith; employees simply want to do their work faster and better (Invicti Security, 2025). Shadow AI is no longer a marginal issue, but a dominant reality within almost every organisation. Figures from Microsoft and LinkedIn (2024) show that 75% of employees are already using AI tools. Of this group, no less than 78% admit to using unauthorised AI tools for work-related tasks.


This can be explained by the ‘Curiosity Gap’ (Loewenstein, 1994): when an official AI tool is missing, a fake email reading "Activate your licence here" (Figure 2) appeals directly to the employee who wants to use AI.


In addition, the well-known ‘Halo effect’ (Nisbett & Wilson, 1977) plays a role: employees are often misled by the appearance of a website. If a site merely looks trustworthy, more than 56% of employees assume that it is also secure (Figure 3). A single click on such a "trustworthy" but fake AI link can be enough to bring ransomware into your organisation.

 


Knowing is not the same as doing

Data from the Awareways Culture Scan questionnaires and Wave training sessions reveals a striking gap between knowledge and behaviour:

1. The Knowledge: No less than 88% of employees know that they should, in principle, request permission for new software (Figure 4).

2. The Practice: In reality, only 46% of employees consult the security team before they start working with a new tool (Figure 5).

 

Employees are aware of the official procedure, but practice shows that more than half of the organisation simply bypasses the security department to achieve faster results.

 


From control to resilience

Simply blocking AI tools no longer works and can even lead to a loss of talent: 54% of early-career employees indicate that access to AI influences their choice of employer (Microsoft & LinkedIn, 2024). The solution for CISOs, therefore, lies in increasing the human resilience of their employees when it comes to Shadow AI.

At Awareways, we believe that the employee is not the weakest link, but rather the key to secure AI use. By investing in AI literacy, we teach employees how to assess the reliability of tools and how to handle data securely within the new frameworks. In this way, we transform an invisible risk into a secure force for innovation.

 

Curious about the full figures and in-depth psychological analyses behind this behaviour? We will shortly be publishing our full Awareways Trend Report 2025, in which we dive even deeper into the world of Shadow AI.

