80% of Canadian Nonprofits Are Using AI.
64% Have No Policy.
Here's What That Actually Means.
I've been working on NFP strategy lately, spending time in the data and in the field. And the picture that's emerging isn't the one most sector leaders are expecting.
The narrative around nonprofits and technology usually goes one of two ways: either the sector is behind and needs to catch up, or the sector is resource-constrained and can't be expected to move at the same pace as business. Both framings miss what's actually happening.
Canadian nonprofits are using AI. Widely. And they're doing it without the governance, skills, or strategy to do it well. That's not a technology problem. It's a leadership problem. And it's fixable.
Here's what I'm seeing, and what I think it means.
Deep Dive:
The Usage Gap
A January 2026 report from Imagine Canada and the Canadian Centre for Nonprofit Digital Resilience surveyed over 900 Canadian nonprofits. 80% reported using AI in some form.
That number is higher than most people in the sector expect. It's also higher than the general Canadian business population, where Statistics Canada reports about 12% of businesses are using AI to produce goods or deliver services.
Nonprofits are outpacing the private sector on adoption. That's not because they have more resources. It's because the tools are free or near-free, they're accessible through platforms organizations already use, and frontline staff are using them to get their jobs done whether the organization has a plan or not.
That last part is the problem.
67% of nonprofits use AI for communications and fundraising. 50% use it for data and information tasks. These are real activities touching real donor data, real client information, and real organizational communications. And 64% of the organizations doing this have no policies guiding how it happens.
No guidance on what data can be entered into an AI tool. No policy on who owns the output. No process for reviewing what's being generated before it goes out the door. The staff aren't acting badly. They're filling a vacuum.
The Governance Vacuum
When I look at what's missing, it's not the willingness to act. It's the framing.
Most NFP leaders I talk to think about AI governance as a technology decision. It isn't. It's a data decision, a risk decision, and increasingly, a legal decision.
Canada's privacy law is being replaced. PIPEDA, written in 2000, is expected to give way to new legislation with penalties up to $25 million or 5% of global revenue. Quebec's Law 25 is already in effect with its own requirements. Organizations operating across provinces, or holding data on vulnerable populations, are sitting in a patchwork of obligations they may not fully understand.
43% of Canadian organizations were targeted in a cyberattack in the past 12 months. 42% experienced a data breach, up from 29% in 2022. Those numbers aren't specific to nonprofits, but the sector's exposure is often higher: high volunteer turnover means access credentials proliferate and rarely get cleaned up. Donor and client data sits in systems that haven't been audited in years. Tool sprawl means nobody has a clear picture of where the data actually lives.
A single breach in an NFP context doesn't just trigger a legal obligation. It can destroy donor trust built over decades, destabilize funder relationships, and consume organizational resources that were never sized to handle a crisis.
Governance isn't a compliance exercise. It's protection for the mission.
The Expertise Gap
Here's the number that surprised me most: only 9% of Canadian nonprofits have engaged an external consultant for AI-related work.
That means 91% of organizations navigating this environment are doing it without the benefit of someone who's seen it across multiple organizations, who knows what good looks like, and who can say "here's what I'd do first."
Compare that to what the sector is up against. Vendors pitching AI tools at every conference. Funders starting to ask questions about digital maturity. Regulatory change in the near term. Staff using tools the organization doesn't officially know about.
The organizations that have brought in outside support are using AI for more activities, with more confidence, and with less exposure. That's not because consultants are magic. It's because external perspective compresses the learning curve. You get to benefit from someone else's trial and error instead of running your own.
The gap between "we need to figure this out" and "we're actively working on it" is often just one conversation.
Next Steps:
What Good Looks Like
I'm not going to pretend there's a playbook that works for every NFP. Organization size, sector, geographic reach, and data complexity all matter. But there are three things I see consistently in organizations that are getting this right.
They started with a data audit, not a technology decision. Before adding tools, they mapped what data they held, who had access, and what would happen in a breach scenario. That exercise alone surfaces the issues worth fixing and gives the board something concrete to work with.
They separated the AI conversation from the AI governance conversation. One is about tools. The other is about decisions, policies, and accountability. Boards that mix the two tend to make neither well. A short, focused conversation on governance, separate from any vendor discussion, tends to produce a usable outcome in under an hour.
They got an outside perspective before a problem forced the issue. Not necessarily a large engagement. Sometimes a structured conversation with someone who's worked across the sector, who can help the leadership team ask the right questions and see the gaps they're too close to see.
The organizations that wait tend to wait until something goes wrong. The ones that act tend to act because someone reframed the question.
The Window
This is the part I keep coming back to.
Most Canadian NFPs are not in crisis right now. The new privacy legislation hasn't landed yet. A major sector-wide breach hasn't happened. Funders haven't started demanding governance documentation. The window to build this thoughtfully, without pressure, is still open.
That window won't stay open indefinitely.
The organizations that build strong data governance and AI literacy now, before regulation tightens and before something goes wrong, will be in a materially better position two years from now. The organizations that wait will be building it reactively, under pressure, and probably with less capacity to do it well.
This is the conversation I'm having with sector leaders right now: where are you, what's at risk, and what's a reasonable first step?
If you want to have that conversation, I offer a free 30-minute call for NFP leaders. No pitch. Just a structured conversation about your situation and what a sensible next step looks like.
This piece is drawn from a research briefing built on data from Imagine Canada, the Canadian Centre for Nonprofit Digital Resilience, the Ontario Nonprofit Network, the Canadian Centre for Cyber Security, and CIRA.