DMR News

Advancing Digital Conversations

Unauthorized AI Tools at Work Pose Risks to Businesses

By Dayne Lee

Aug 29, 2025


Artificial intelligence tools are becoming common in the workplace, often used by employees to clean up presentations, draft emails, or generate visuals. But when these tools are not officially sanctioned by employers, they fall under the category of shadow AI — and can create serious risks for companies.

Shadow AI arises when workers turn to third-party apps for speed and convenience, but cybersecurity experts warn that such shortcuts can unintentionally expose sensitive information, from financial statements to client data.

The Risks of Going Rogue

Robert Falzon, head of engineering at cybersecurity firm Check Point Software, said the real problem is that employees don’t realize how much data is stored and reused by chatbots. Proprietary numbers shared to create an infographic could resurface later, served back to someone outside the company by the same AI tool. Hackers, he added, are using these platforms too.

A July study by IBM and the Ponemon Institute found that 20% of surveyed firms had already suffered breaches linked to shadow AI. In Canada, the average cost of a data breach has climbed to nearly $7 million.

Calls for Guardrails

KPMG partner Kareem Sadek said governance is critical: “It’s not necessarily the tech that fails you; it is the lack of governance.” Suggested measures include forming AI committees, adopting zero-trust frameworks, and training employees about risks.

Some firms are responding by rolling out internal chatbots. While these systems reduce risks, they aren’t bulletproof. Cybersecurity researcher Ali Dehghantanha said it took him less than an hour to break into a Fortune 500 firm’s internal AI system during an audit, exposing sensitive client information.

Experts stress that awareness programs and clear internal policies can help bridge the gap between worker convenience and corporate security. Employees need training to recognize what data can be shared safely and what cannot.

What The Author Thinks

The problem with shadow AI isn’t only about software or firewalls — it’s about trust and culture. When employees feel forced to sneak around to use tools that make their jobs easier, it signals a gap between leadership and staff. If companies want to prevent risky behavior, they need to offer secure AI tools that are just as easy to use as the consumer apps people turn to. Otherwise, shadow AI will keep spreading, no matter how many policies are written.


Featured image credit: Matheus Bertelli via Pexels


Dayne Lee

With a foundation in financial day trading, I transitioned to my current role as an editor, where I prioritize accuracy and reader engagement in our content. I excel in collaborating with writers to ensure top-quality news coverage. This shift from finance to journalism has been both challenging and rewarding, driving my commitment to editorial excellence.
