37 per cent of Dutch people would consider using ChatGPT for work and 38 per cent already do, according to research by Kaspersky among 1,000 Dutch people. Although 69 per cent of respondents say they know how ChatGPT processes information, a majority either do not consider it important to keep conversations and chat history private or, if they do, admit to sharing private data anyway. This attitude can put companies at risk, with potentially damaging consequences if a data breach occurs.
Knowledge level leaves much to be desired
ChatGPT is very popular in the Netherlands, with almost 1.5 million users in the first months after its launch. It is therefore no surprise that the tool is used for work, but the potential for misuse remains high. While there is something to be said for using ChatGPT to improve the way we work, the level of knowledge about how it processes information leaves something to be desired. For instance, 31 per cent say they have no idea how ChatGPT processes its information. And of the 69 per cent who think they are aware, most (42 per cent of all respondents) say they only roughly understand how it works.
Kaspersky also asked respondents for their views on sharing sensitive information with the AI tool. Almost a quarter (24 per cent) said they do not think it is important to keep conversations and chat history private. A larger proportion (34 per cent) do think it is important to keep their searches private, but admit to sharing sensitive information anyway, at least occasionally. The survey thus exposes a worrying problem: employees are sharing potentially sensitive data with ChatGPT.
About the survey
Using an online questionnaire, Kaspersky surveyed 1,000 Dutch people about their views on ChatGPT, their use of it in the workplace and their level of knowledge about it.