Nonprofits and grantmaking organizations are not typically known for being on the forefront of technology. But when you have a lean staff and a tool comes along that can act like an unpaid intern, it's no surprise that the philanthropic sector has embraced AI adoption even more readily than for-profit businesses.
According to a research report by Twilio, 58% of nonprofits are using AI as part of their communication, compared with 47% of B2C for-profit businesses.
In a webinar, Peter Panapento, co-founder and philanthropic practice leader of Turn Two Communications, shared best practices for creating an AI policy for your grantmaking organization.
If your organization is using or exploring AI-enabled tools as part of your grantmaking processes, you need an AI policy that represents your mission, incorporates the needs of your staff, and evolves as the tools change.
Here are some of the tips Peter shared for creating an AI policy that your organization will actually use.
Understand the Needs of Your Staff
Whether you create a working committee or find ways to survey your staff, make sure your policy reflects the needs of your staff and isn’t simply a directive from leadership.
Meet your staff where they are. How much do they already know? Tools you already use may have released AI-enhanced functionality, so some staff members may be comfortable with those features or with common tools like ChatGPT. But you are likely to have a few staff members who are uninterested in or unsure about the technology.
As you are creating your organization's AI policy, put together a working committee drawn from different parts of the organization. Publicize the opportunity and the committee's progress, and invite people to participate. If you have a small staff and can't assemble a full committee, find other ways to gather input, such as a survey, to see how people are using AI tools now. This can also help you identify staff members who can review the draft of your policy.
Be Clear on What AI Can Do for Your Organization—and What It Can’t
Your AI policy should set the foundation for a common understanding of what your AI-enabled tools can do and outline how your staff should use them. Provide common definitions so everyone is grounded in what you are talking about. Make sure they understand how terms like intellectual property and third-party information relate to your AI tools. And be sure to highlight the different kinds of AI tools they may encounter and how they differ, such as generative AI versus AI-enhanced data analysis.
In addition to providing that baseline understanding of the terms, be clear on what you see as the benefit of AI: it is there to enhance the daily work of your staff, not to replace anyone. Generative AI tools are great at recognizing and repeating patterns, so they are good at helping you summarize and simplify language. They are not good at creativity or empathy, key traits you want in your grantmaking staff.
Your AI policy should also set clear expectations for how these AI tools are used. For example, no one should include proprietary, sensitive, or personal data, as defined by your classification policy, in prompts. When using an AI tool to help draft an award letter, have it produce a template that you then fill in with the grantee's information yourself.
Remind your staff members that AI output is not an endpoint. Everything that comes from an AI tool should be reviewed for potential bias and checked against your organization's tone. And everything should be fact checked: these tools are notorious for making up facts and data points that fit the pattern.
Encourage Regular Use of Approved AI Tools
Many of the tools your team uses regularly already have some AI functionality, like Zoom, Google Docs, and Canva. Empower your team with use cases and guidelines to incorporate AI-enhanced tools into their daily work so they can find the applications that work best for them.
Show them what a good use case would look like for these tools. Provide a list of approved tools and how they should be used. Your AI policy should encourage staff members to experiment and think creatively about their roles to see if there are other opportunities that still fit within the established guidelines.
Provide Ongoing Training for All Staff
Conducting organization-wide training is essential to make sure all staff members understand the AI policy and can apply it effectively.
Begin by articulating the purpose of the policy and explaining why AI is a beneficial tool for your organization. It’s important not to intimidate staff but rather to emphasize how AI can enhance their work efficiency and creativity.
Show how to use these tools ethically, in a way that supports your organization’s mission and values while upholding its integrity. Additionally, include sessions where team members can showcase their use of AI and share insights they’ve gained from their experiences. This approach will foster a collaborative learning environment and encourage the ethical and effective use of AI tools.
Don’t Let Your AI Policy Gather Digital Dust
AI and the tools that incorporate it will continue to evolve, and their use cases will keep changing, so the best practices for your organization should also shift over time. Establish a process for updating and adapting your AI policy in line with these changes. Review your policy regularly, whether monthly, quarterly, or semi-annually, to ensure that it remains relevant and effective. And communicate any changes you make to the policy promptly and transparently to keep everyone informed and aligned.
Additionally, provide a clear pathway for staff to recommend new tools for evaluation to foster innovation and continuous improvement.
Applying Your AI Policy to Potential Grantees
Once your organization has an AI policy for internal use and is comfortable with several use cases as part of your daily work, turn your attention to an AI policy for your grantees. Document the circumstances under which you expect potential grantees to use AI in their application responses. Are there any use cases where you wouldn't want an applicant to use an AI tool? Bring in a trusted grantee to help you think through this part of your policy. And when it's ready, make it easy for all applicants to access.
Want to learn more about creating an AI policy for your grantmaking organization? Check out Peter’s full webinar, AI Policies for Grantmakers: How to Manage Risk and Harness AI for Good.