Australian Not-For-Profits are collecting more data than ever - but many struggle to turn that data into meaningful insights, measurable impact, and sustainable growth.
At our recent breakfast, "Monetising Data & Leveraging Technology for NFP Growth", industry and technology experts led a focused and thoughtful discussion on how NFPs can move beyond data collection and start creating real value from the information they already have.
Key Takeaways:
- NFPs don't have a data problem - they have a fragmented data problem.
- Start small and lead with a business use case, not with the technology.
- Good data governance is essential - especially when working with AI.
- Technology change is a people change, not just a system one.
- Collaboration across the NFP sector accelerates progress - share lessons, partner with a trusted provider, and leverage community programs to move forward faster.
Event Recording
Presentation Deck
Q&A:
Traditionally, when we talked about data - especially wrangling fragmented systems - it was all about creating a single source of truth. If I’m understanding correctly, it sounds like newer technology combined with AI allows us to bypass some of that costly foundational work that often sat in the “too hard” basket. Is that right?
Answer: Yes, that’s right. Platforms like Microsoft Fabric still allow you to create a Data Lake or Data Warehouse if that’s the right approach for your organisation. However, what’s changed is that you no longer have to centralise everything upfront.
Instead, you can leave data where it lives and use tools such as generative queries or Power BI to unify, access and visualise data across multiple systems. The technology brings those insights together for you, while still ensuring that only the right people have access to the right information.
That introduces an important shift in thinking. With AI making information more visible and accessible, organisations must be deliberate about data access. Not everyone in the business should see everything. Guardrails are essential — creating clear permissions, roles and “data bubbles” across the organisation.
Another critical takeaway is data ownership. This is often overlooked. While many organisations invest in tools, they haven’t clearly defined who owns which data, systems and processes. That ownership conversation is essential to ensure data is used correctly, responsibly and sustainably.
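As a purely illustrative sketch (not from the discussion, and not tied to any particular platform), the "guardrails" idea above - clear roles, permissions and data bubbles - can be pictured as a simple mapping from each role to the data domains it may query. The role and domain names here are hypothetical examples:

```python
# Illustrative only: a minimal role-to-"data bubble" mapping.
# Role names and data domains are hypothetical examples.
ROLE_BUBBLES = {
    "fundraising": {"donors", "campaigns"},
    "finance": {"donors", "payments", "payroll"},
    "volunteer_coordinator": {"volunteers"},
}

def can_access(role: str, domain: str) -> bool:
    """Return True only if the role's bubble includes the data domain."""
    return domain in ROLE_BUBBLES.get(role, set())

print(can_access("fundraising", "donors"))   # True
print(can_access("fundraising", "payroll"))  # False
```

In practice this kind of rule would live in the platform's own security layer (for example, row-level or workspace permissions) rather than in application code, but the principle is the same: access is granted per role and per data domain, with everything else denied by default.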
I’ve been in this field for 25 years, and fundraising research has traditionally taken time: work that once took a week now takes minutes. At 62, embracing this technology has genuinely reduced my workload, and I feel it’s safe and secure. I’d encourage anyone to embrace it. But fear still seems to be the biggest barrier: where does the data go, how do I use it, and how do I get the best results?
Answer: Thank you for sharing that perspective. One of the most encouraging outcomes of this technology is that it challenges the idea that only younger generations adapt to change. What we’re seeing instead is that people of all ages, backgrounds and experience levels are on this journey together.
The fear is understandable - especially around safety, security and uncertainty - but with the right education, guardrails and organisational support, AI can genuinely reduce workload and enable people to focus on higher‑value work. When used well, it becomes an enabler rather than a risk, empowering everyone to move forward with confidence.
How can organisations use AI effectively without deskilling critical thinking? There’s a real risk that people rely too heavily on it. How do you balance efficiency while maintaining good judgment and decision‑making?
Answer: This is where an AI strategy and governance framework become essential. While much of the conversation focuses on innovation and security, there’s also a strong practical and people‑focused aspect.
AI should be treated as a support tool - not a replacement for thinking. A helpful way to frame it is to think of AI like an intern: useful for accelerating tasks and generating ideas, but not something you outsource entire jobs or final decisions to.
Importantly, people still own the outcome. That responsibility sits with individuals, leaders and teams, and it requires open conversations around expectations, accountability and value. Transparency is also key - both internally and externally - about how AI is being used.
Governance plays a critical role here: addressing bias, defining acceptable use, and aligning AI adoption with what truly adds value to your organisation. Ultimately, organisations must decide what “value” means for them and ensure AI is enhancing - not replacing - human judgment, insight and critical thinking.