Tag Archives: AI equity

Nonprofit Radio for March 18, 2024: Artificial Intelligence For Nonprofits, Redux

 

Justin Spelhaug, Amy Sample Ward, & Tristan Penn: Artificial Intelligence For Nonprofits, Redux

A second savvy panel takes on the impact, leadership demands, promises, responsibilities, and future of AI across the nonprofit community. We convened a panel in June last year. But this is an enormous shift in nonprofit workplaces that deserves another look. This panel is Justin Spelhaug, from Technology for Social Impact at Microsoft, and Amy Sample Ward and Tristan Penn from NTEN.

 

Listen to the podcast

Get Nonprofit Radio insider alerts!

I love our sponsors!

Donorbox: Powerful fundraising features made refreshingly easy.

Virtuous: Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving.


 

 

 

We’re the #1 Podcast for Nonprofits, With 13,000+ Weekly Listeners

Board relations. Fundraising. Volunteer management. Prospect research. Legal compliance. Accounting. Finance. Investments. Donor relations. Public relations. Marketing. Technology. Social media.

Every nonprofit struggles with these issues. Big nonprofits hire experts. The other 95% listen to Tony Martignetti Nonprofit Radio. Trusted experts and leading thinkers join me each week to tackle the tough issues. If you have big dreams but a small budget, you have a home at Tony Martignetti Nonprofit Radio.
View Full Transcript

And welcome to Tony Martignetti Nonprofit Radio. Big nonprofit ideas for the other 95%. I’m your aptly named host and the pod father of your favorite abdominal podcast. Oh, I’m glad you’re with us. I’d be forced to endure the pain of chronic inflammatory demyelinating polyradiculoneuropathy if you attacked me with the idea that you missed this week’s show. That one is so good it deserves two weeks, and plus I spent a week practicing it, so it lives on for one more week. Here’s our associate producer to introduce this week’s show. Hey, Tony, I’m on it. It’s Artificial Intelligence for Nonprofits, Redux. A second savvy panel takes on the impact, leadership demands, promises, responsibilities, and future of AI across the nonprofit community. We convened a panel in June last year, but this is an enormous shift in nonprofit workplaces that deserves another look. This panel is Justin Spelhaug from Technology for Social Impact at Microsoft, and Amy Sample Ward and Tristan Penn from NTEN. On Tony’s Take Two: thank you. We’re sponsored by Donorbox. Outdated donation forms blocking your supporters’ generosity? Donorbox: fast, flexible, and friendly fundraising forms for your nonprofit. donorbox.org. And by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving. virtuous.org. Here is Artificial Intelligence for Nonprofits, Redux. We’re talking this week about artificial intelligence again. It’s an important topic. We did this with a panel in June last year. Today a different distinguished panel shares their thoughts on this transformative technology. It’s timely, it’s got a lot of promise and a lot of risks, and it’s moving fast. Those are the reasons why Nonprofit Radio is devoting multiple episodes to it. What are the promises and the responsibilities? What’s the role of nonprofit leadership? What about government? What are the equity concerns, the biases?
What about access to this intelligence? What are the preconditions for successful integration at your nonprofit? What’s the future of artificial intelligence? Here to share their thinking are: Justin Spelhaug, recently promoted. He is corporate vice president and global head of Technology for Social Impact at Microsoft. You’ll find Justin on LinkedIn. Justin, welcome to Nonprofit Radio. Congratulations on your promotion from vice president to corporate vice president at the enormous company Microsoft. It’s great to be here with the pod father. It’s a new name, so I’m proud to be here and look forward to the conversation. All right, well, I’m glad it’s the first time you’ve heard the pod father. There can be only one. Really, there ought to be only one. So I’m glad it’s the first time. And I see, you know, global head. I’m sorry, you’re a little bit limited. You’re not working in the stratosphere, the ionosphere, the troposphere. You’re strictly limited to the globe. I’m sorry. We all have our constraints. We are working on Mars and the moon soon, but we’ve got to get a broader population of nonprofits there. All right, so we’re limited to the globe. I’m sorry for you. Amy Sample Ward, we know them. They are Nonprofit Radio’s technology contributor and the CEO of NTEN. They’re at amysampleward.org and @amyrsward. Amy, it’s great to see you. Welcome back. Of course. Thanks. I know there have been a number of different conversations about AI that you’ve had on Nonprofit Radio. I’ve listened to them; I haven’t been in all of them. They’ve been great. And, you know, we talked a little bit about AI, you and I, when we started off with what’s going to be big topics in the sector for 2024. So excited to be in a conversation kind of dedicated to that. I’m glad you are. And Tristan Penn, welcoming back Tristan. He is equity and accountability director at NTEN, and a Black and Navajo professional.
He’s served on previous organizations’ equity teams and been a facilitator for DEI rooted in racial equity. Tristan is on LinkedIn. Tristan, welcome back. Awesome, so happy to be here. Thank you for having me. Excited to have this conversation with Amy, who I work very closely with, and it’s really good to see you too, and also excited to have this conversation with Justin, to see what we can unearth. Yes, we’re representing the big tech perspective. Amy, since you are our tech contributor, we’re going to start off, you know, just big picture. What is your thinking? What are your concerns? Big picture stuff. Yeah, well, I’m glad that we’ve scheduled five hours for this interview. I will be taking the first four. Thank you so much. I have many thoughts, many concerns. I think there’s just a lot to get into. Some top-level bites to put at the beginning here: there’s a lot of hype, and as with anything that falls into the hype machine, I think nonprofits do not need to fall victim to, like, oh my gosh, I read this one article so I have to do the thing, right? There’s time. AI is not done. The world is not already over and everything’s predetermined, right? So you’ve seen the article that was like, AI will end humanity? OK, OK. Here we are. Let’s calm down and talk about things. I know I’ve talked to nonprofits whose boards are like, I read that article and AI is, you know, ending all of us. Like, we can take our time. That’s one piece. I also think it’s important for organizations to think about where they are already working, what communities they already work with, what data they already have. This isn’t start-a-new-project when we’re talking about AI. And so I think we’ll get into that more in our conversations here. And of course, AI isn’t new.
Well, I mean, artificial intelligence as a phrase is the broadest umbrella term we could use for these types of technologies. And so to have these sentences that say, like, AI is new and it’s here and it’s going so fast: what is that? That’s encompassing so many different components of technology. So what do we really mean when we’re talking about AI? Are you talking about a model that you set up inside of your organization, you know, to help identify program participants that need extra support? That can be AI. But that’s very different than saying, oh yeah, we’re just using ChatGPT to help start some of our drafts. OK, those are wildly different things. And so to talk about them in the same breath, as if it’s all AI, sets folks up to have kind of a disconnected conversation even from the start. All right, thank you. And hold our feet to the fire, especially me, because the three of you think about this all the time and I don’t. So if I lose that context that you just shared with us, please call me out. All right, Justin, big picture, please. To build on Amy, you know, the hype cycle of it’s going to save us, it’s going to destroy us, and now just kind of how do we make use of it: we’ve been going through this process as a community. I think one of the things, when I zoom out, I just see some tectonic shifts that are impacting the sector. Some big demographic shifts in European countries and in the United States, where the workforce is getting older; that’s putting tons of pressure on aged care and frontline community workers. Some big shifts in continents like Africa, where education, skilling, and jobs are all critical. And the nonprofits facing off on these issues aren’t getting any additional funding. GDP has stabilized in many countries, but we’ve hit a new set point for inflation that’s impacting pocketbooks.
It’s impacting people’s ability to raise money. And so really, the question that we have to ask is: how do we use AI in missions to help organizations raise more money, help them deliver more effective programs, help them rise to these challenges that are continuing to create pressure in the sector? And how do we do all of that in a way that’s responsible, in a way that’s safe, in a way that’s inclusive? That’s actually a pretty complex topic that I hope we spend some time on. Indeed, and thank you for the global perspective. Tristan, big-picture thoughts, please. I have lots of thoughts, similar to Amy. And I think where I start off is: I worked for 20 years, and I still am working, in nonprofits. And I see how, over those years, nonprofits and small organizations have seen something that’s bright and glittery, been so amazed by it, and been like, yes, we want it, we’re going to take it in. And we have no process for building it into our operations. We have no forethought for it. We have no contingency to live by when we’re folding this in. In this ideal state, we’ve already jumped multiple steps ahead to envisioning how we’re going to operate with this bright shiny tool that we have. And that’s never been the case in my years in nonprofits. If anything, it’s always been folded in in a way that doesn’t have a lot of forethought. So the things that come to mind for me, that make me curious and also a little bit reticent about just the blanket umbrella term AI, are about folding it in where it makes sense, and not where you want to add a little icing on a cake that needs none. That’s where I intersect with it. There’s another piece of it where I am a little critical of it and concerned about it.
Because I think, to Amy’s point, when we think about AI, a lot of people go in different directions. The baseline for a lot of people is they go to, like, AI-generated pictures or ChatGPT, and it’s much more than that. But I do think about a time, anecdotally, when I was at a conference and passing by a booth, and there was a very lovely picture of an older couple, and I was like, oh, that reminds me of my grandparents. It was an older Black couple, and I was like, oh, that’s so cute, it reminds me a lot of my grandparents. And then I went in closer, and this is a booth managed by a bunch of white folks, and they were like, oh, did you know that this is an AI-generated picture? And that didn’t feel good to me as a Black person. That didn’t feel good. It felt like I had been misled in a really scary way. I feel like I have a really good detector of what’s real and what’s not. My BS detector is always up and on, and that scared me, because I was duped hard. And that scares me in a way, less about nonprofits, but about the overall globalization, usage, and implementation of it: that it could go into the hands of people and create false narratives about marginalized groups, just based on what product they want to sell. And that is scary. That’s something that has stuck with me for a while. Thank you for raising the risks and the potential misuse, abuse. We needed to go to artificial intelligence to create a picture of an elderly Black couple. That was necessary to do.
And also thank you for the valuable parallel. You make me think of social media adoption, when Facebook was new. We assigned it to an intern, and we put it like the cherry on top where we didn’t need a cherry. But the intern had used it in college, so she may as well do it for us full time. A very valuable, interesting parallel. Amy, start us off with just a common AI definition. Artificial intelligence, generative artificial intelligence I mean: that’s what we’re largely going to be talking about, if not exclusively. I think so. There’s a lot of that. I think we’ll start with taking one at a time, right? Sure. No, I was just going to say, when we’re thinking about technology in our nonprofit organizations, we’re already exposed to lots of different terms, lots of different companies putting things out there with the (not necessarily cloaked; it’s not a hidden desire) aim of reinforcing that they’re specialists, they know what they’re doing, and us lowly nonprofits couldn’t understand those fancy terms, right? And so I always, I mean, I teach a course, and I always remind folks: you absolutely can know what these words mean. And I appreciate that there are so many places, even, actually (I’m never somebody that promotes these things, so folks know this), Microsoft has offered community learning spaces to say, these are what these words mean. So artificial intelligence is, like I said, the biggest umbrella term for all different types: generative AI, machine learning, all of these components that people might talk about as if they are one thing. They’re all within that same AI umbrella. And I just want to say two words, because they’ll probably come up in our conversation. I know you want to go one word at a time.
But the words I hear from folks the most, where they feel like they should know what this word means and they don’t, and they feel silly that they don’t understand, are algorithm and model. Those words are used all the time in talking about generative AI, which means the tool is set up to generate something back for you. Tristan used an image, a visual image, example, but that could be text, that could be video, that could be audio. It’s asking the tool to generate something for you. An algorithm: we’ve heard this word, like, oh, Facebook’s algorithm is choosing what I see, right? The algorithm means the set of rules. So in Facebook’s newsfeed, that set of rules says: if something already has a bunch of likes, prioritize it, right? If it has two friends that you’re connected to already commenting, prioritize it. It’s whatever that set of rules is that says, this is how to generate an image of an older Black couple. Whatever those rules were, that’s what algorithm means. And model essentially means, you can think of the word being used in the same way as when you say model about cars: it is the whole set put together, right? It’s got the data, it has the algorithm (the rules that say how to do it), it has the input, whatever you’re going to ask it to do. When people say, what’s the model, they’re really saying, OK, what’s the package of how this tool is working? Thank you for all that. It’s time for a break. Open up new cashless in-person donation opportunities with Donorbox Live Kiosk, the smart way to accept cashless donations anywhere, anytime. Picture this: a cash-free, on-site giving solution that effortlessly collects donations from credit cards, debit cards, and digital wallets. No team member required. Plus, your donation data is automatically synced with your Donorbox account.
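Amy’s point that an “algorithm” is just an explicit set of rules can be made concrete with a toy sketch. This is purely illustrative: the scoring rules, point values, and post fields below are invented for the example, not any platform’s actual ranking.

```python
# Toy illustration: an "algorithm" is just a written-down set of rules.
# The rules, point values, and field names here are invented examples.

def score(post):
    """Apply the rules: popular posts and friend activity get priority."""
    points = 0
    if post["likes"] >= 100:          # rule 1: lots of likes, prioritize it
        points += 2
    if post["friend_comments"] >= 2:  # rule 2: two friends commenting, prioritize it
        points += 3
    return points

def rank_feed(posts):
    """The 'model' bundles data + rules + input: posts in, ordered feed out."""
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "likes": 5,   "friend_comments": 0},
    {"id": "b", "likes": 250, "friend_comments": 2},
    {"id": "c", "likes": 120, "friend_comments": 0},
]
print([p["id"] for p in rank_feed(posts)])  # ['b', 'c', 'a']
```

The point of the sketch is that nothing here is mysterious: change the rules in `score` and you have changed the algorithm; the whole package of rules plus data plus input is what people mean by the model.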
No manual data entry or errors. Make giving a breeze and focus on what matters: your cause. Try Donorbox Live Kiosk and revolutionize the way you collect donations in 2024. Visit donorbox.org to learn more. Now, back to Artificial Intelligence for Nonprofits, Redux. Justin, I see you taking lots of notes. What’s going on in your head? No, I think, just as Amy highlighted, one of the things that’s important to highlight is that we’ve been using AI for a really, really long time, and there are really important use cases that have nothing to do with generative AI. Things like machine learning, right, that allows us to do things like predict donations. Things like machine translation, that allows us to translate from one language to another. Things like machine vision, that allows us to identify and classify objects. All of those are important tools as we look to solve different problems in the sector. Generative AI, as Amy was highlighting, is a new class of artificial intelligence that’s capable of creating effectively novel content, because it’s reasoning across, you know, all of the information on the internet and using, as Amy was highlighting, algorithms to identify patterns, which allows it to produce answers in many cases in intelligent ways. However, as Tristan was highlighting, ensuring that these models are inclusive, are representative, are safe, are understood: those are all things that we’re continuing to work on, to put frameworks around and tools around, so that they produce positive impact, not negative impact. And Justin, how can we ensure that that actually happens? There’s a lot of talk about biases. The large language models are trained on predominantly white language sources. So there’s bias; there are equity issues.
But what is big tech doing to actually keep equity centered, and keep freedom from bias centered, as these models are adopted, using the algorithms that Amy just defined for us? Yeah, it’s a really multifaceted answer. I’ll only hit two points, and we can go much deeper if we want. We released, in fact just in the last week, the Microsoft AI access principles, trying to get at this very problem, which has 11 core components. I’ll speak to two, to give you a flavor of the kinds of things that we need to do as we think about the AI economy globally, to ensure it’s fair, representative, and safe. One of the principles is making sure that AI models and development tools are broadly available to software developers everywhere in the world, everywhere in the world and every culture in the world, training on the language and on the history and on the societies all around the world, to create much, much more representation. As you probably know, many of the models have been developed in North America and therefore reflect some of those cultural biases. So federating these tools is critical in the AI economy. Secondly, companies and organizations that produce AI need to have rules for how they check and balance the AI, to ensure that it’s responsible, it’s fair, it’s safe, it respects privacy, it respects security, it’s inclusive, it’s transparent. We call those rules, at Microsoft, our responsible AI framework, and it’s not just a set of principles; it’s actually an engineering standard. In applying that engineering standard, we were looking at fairness in speech-to-text, taking speech and transforming it into text, and we found (this was a couple of years ago; we produced an article on it) that our speech-to-text algorithms were not as accurate for Black and African American communities in the United States as they were for Caucasian communities.
And that was largely a function of the training data that was used. So we had to take a step back, using our framework that caught this issue, to ask: how do we work with the communities more effectively? How do we bring sociolinguists in to help us understand how to capture all of the rich diversity of language, to make sure that our speech-to-text capability is representative of every citizen that we’re rolling this out to? And we did that, and today it performs much better, and there’s more work to do. But it’s those kinds of frameworks and guardrails that are really important in helping people design this stuff in a way that benefits everyone. Tristan, what’s your reaction? You’re thinking about equity all the time. What’s my reaction? What isn’t my reaction? I would say, I love what Justin was saying about making it a federated model, as opposed to, I mean, I want to say everything, but a good amount of things being generated, created, curated in North America, with biases that skew towards white men baked into those models and algorithms. And that’s not OK. I think that excludes me in particular. But also, having a plan for that, as opposed to being reactionary and being like, well, gosh, we didn’t know what was going on: being less reactionary and more forward thinking in that way. Yeah, proactive is always a good place to start.
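One way a review process like the one Justin describes can surface this kind of gap is by measuring transcription accuracy separately per speaker community rather than as one overall average. Here is a minimal sketch of that idea using word error rate (WER); the sample transcripts and group labels are hypothetical, and this is not a description of Microsoft’s actual tooling.

```python
# Minimal sketch: compare word error rate (WER) across speaker groups,
# so an accuracy gap is visible instead of hidden in one overall average.
# The transcripts and group labels below are hypothetical.

def wer(reference, hypothesis):
    """Word error rate: word-level edit distance / reference length."""
    r, h = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(r)][len(h)] / max(len(r), 1)

def wer_by_group(samples):
    """samples: (group, reference, hypothesis) triples -> mean WER per group."""
    totals = {}
    for group, ref, hyp in samples:
        totals.setdefault(group, []).append(wer(ref, hyp))
    return {g: sum(v) / len(v) for g, v in totals.items()}

samples = [
    ("group_a", "we meet every tuesday night", "we meet every tuesday night"),
    ("group_b", "we meet every tuesday night", "we met ever tuesday right"),
]
print(wer_by_group(samples))  # group_b's error rate is far higher
```

A disaggregated metric like this is the quantitative half of the fix; the qualitative half, as Justin notes, is going back to the affected communities and the training data.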
I think a few other things come to mind too, in terms of making sure that communities of color, marginalized communities, are not constantly shouldering (even outside of AI) the mess-ups of the brand new tool that came out on the market. That seems to always be the case, and there’s always a headline months later where it’s like, so-and-so found out this tool, its facial recognition, wasn’t geared towards Black folks, and it was historically wrong. So I think about those things. But I also think about it through a nonprofit lens, because we’re on a nonprofit call. And I bring up another anecdotal story, of being on a call and having an AI note-taker bot hop into the Zoom call too. Within the last half year, we’ve all been on calls where it’s like, oh, I don’t know if that’s an actual person or a thing. It’s very ambiguously named sometimes. One of them is Otter, right? And this Otter is all of a sudden in our meeting. And I think there is a lot of benefit in having note-taking tools, and also captioning tools, for folks in terms of accessibility. There are folks that have completely different learning styles, folks that take in information at different levels and on different wavelengths. And I say all that to say that it got scarier, to keep with the Otter example (not OtterBox; sorry, OtterBox is not a sponsor of this), when after the call I got an email, randomly, from the Otter AI note-taking tool, sent to all the people that were in that call with me, with a narrative recap of everything that we talked over.
It wasn’t a transcript; it was a narrative recap, which is fine enough. OK. There was a screenshot of just a random person that was on the call. And also, what’s most scary for me, or just very concerning: at the bottom it said, here’s the productivity score of the call, 84%; here’s the engagement of the call, 72%. And it’s like, where, at the very least, is the asterisk at the bottom that says, this is how we calculated this? And I immediately go, I’m not a pessimist, but in that moment I was like, this is going to be used by people in higher positions, people in power, to wield over folks in middle management and direct service, to say: hey, you didn’t have an 84% or higher engagement score on our last Zoom call, you are now on a personal improvement plan. And that is a scary place to be. So I think less about the tools themselves; they are what they are. I think about the people and the systems, the toxic systems at times, that wield these brand new shiny tools in a way that doesn’t feel good and is also working against their mission and against their employees. It’s time for Tony’s Take Two. Thank you, Kate. And thank you for supporting Nonprofit Radio. I like to say thanks every once in a while, because I don’t want you to think that we’re taking you for granted. I’m grateful for your listening. And if you get the insider alerts each week, I’m grateful that you let us into your inbox. This week I’m in Portland, Oregon, recording a whole bunch of good, savvy, smart interviews for upcoming episodes. Hopefully that helps show our gratitude: we’re out here collecting good interviews for you to listen to, if you can’t make the Nonprofit Technology Conference yourself. So thank you. I’m grateful that you listen, grateful that you’re with us week after week. That’s Tony’s Take Two. Kate?
Thank you guys so much for listening to us every week. We appreciate you. Well, we’ve got beaucoup boatloads more time. Let’s return to Artificial Intelligence for Nonprofits, Redux, with Justin Spelhaug, Amy Sample Ward, and Tristan Penn. I love that you brought that up. Don’t love that it happened, but love that you brought it up as an example here for folks, because I think it’s an easy entryway into a conversation on one of the points Tony mentioned at the start of the call: what are some of these preconditions? And you were like, oh, people are like, oh, bright shiny, right? That’s what we do. Oh, bright shiny, I’m going to use this tool that took the notes in here. And a place where we’ve seen, for many years in NTEN’s research, is that nonprofits struggle (this isn’t to say that for-profit companies don’t also struggle with this), but nonprofit organizations struggle with consent; they struggle with privacy and security. And so here’s a well-meaning, well-intentioned, right, I’m-going-to-use-this-tool, except it’s emailing you. You didn’t consent to that. It emailed all the participants in the call. There was no opt-in, right? Let alone a very clear opt-out. Like, why did I even get this? That’s not even to get into opting into sentiment analysis of what is a community Zoom call, right? And so when we peel that back and say, OK, well, we just wouldn’t use that note-taking tool, right? Sure. But when we’re thinking about preconditions for this effective work as an organization: what are your data policies in general? The number of organizations that we work with that still don’t have a data policy, because they think, well, isn’t there some law about data, so why would we have our own policy? OK, there are laws related to data, right? Different types of data have different laws. But that’s not the same as an organization saying: what data do we collect? Why do we collect it? How long do we retain it?
What if somebody wants us to remove it? How do we do that in our systems? Right? So this level of fidelity to your own data, to your own community members, to the policies that you’ve set up to manage those relationships and that trust, for so many organizations, is already not in place, or, like I said, there’s just not a fidelity to them that makes them trusted. So then to say, oh yeah, we’re ready to add this note-taking app to our community calls or our client calls: that’s the place where I have the most fear. It’s actually not the tools having bias. I know they have bias, and that is a place of concern, and a place we can address it. But the most fear I have is people operating without any of the structures or policies or training to deal with both the bias in a tool they use and their own bias, their own issues, right? And it accelerates the harm that can be created. I want to use some of that to go to Justin, on something very closely related: the nonprofit leadership role, the responsibility of nonprofit leaders. I think it gets to a lot of what Amy was just talking about. But what do you see as the responsibility of nonprofit leadership in formulating these policies, but also in just making sure that the preconditions are there so that we can be successful in integrating artificial intelligence, whether we’re bringing in an outside tool or building our own? Even that may be a big lift for a lot of listeners. But generally, the nonprofit leadership’s role.
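Amy’s checklist of questions (what do we collect, why, how long do we keep it, how do we remove it) can be written down precisely enough for systems to enforce rather than living only in a policy document. A minimal sketch of that idea, with invented category names and retention windows:

```python
# Minimal sketch: a data retention policy as enforceable rules.
# The category names and retention windows are invented examples;
# an organization would set its own, per its actual policy and laws.
from datetime import date, timedelta

RETENTION = {
    "donation_record": timedelta(days=7 * 365),  # keep seven years
    "event_signup":    timedelta(days=365),      # keep one year
    "newsletter_sub":  None,                     # kept until the person opts out
}

def is_expired(category, collected_on, today):
    """True if a record has outlived its retention window and should be removed."""
    window = RETENTION[category]
    return window is not None and today - collected_on > window

today = date(2024, 3, 18)
print(is_expired("event_signup", date(2022, 1, 10), today))    # True: past one year
print(is_expired("donation_record", date(2020, 5, 1), today))  # False: within seven years
```

Even a table this small answers two of Amy’s questions explicitly (how long, and when removal happens), which is the fidelity she’s describing: the policy and the system agree.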
Yeah, I mean, there’s a lot on the nonprofit leadership plate today, and I think we have to meet leaders where they’re at. I think the very first step, and Amy mentioned this at the very beginning of the call, is raising the capacity of their knowledge, and of their staff’s knowledge, of how these tools work: what are the edges of the tools, and how to apply them effectively in the flow of work. And there is training available. As an example, we have a four-hour course on LinkedIn. You don’t need to do it all at once, but it’s actually pretty good. It’s not for developers, it’s not for techies; it’s for frontline program staff, fundraising staff, finance staff, the ED, to really learn how to think about these tools. With that knowledge, you can then take the next step, which is starting to engage in simple ways to apply these tools, to get on-the-ground experience of what they’re good at and what they’re not good at. Using things from Microsoft, I’ll mention Microsoft Copilot, to look at writing donor appeal letters, or whatever the process may be, they can just start learning about these fundamental language models and what they’re good at. I think it’s important, as an organization thinks about getting deeper into AI, really thinking about how to apply it to their processes, whether that be fundraising or engagement with beneficiaries, that they think deeply about data and data classification. That gets a little sophisticated, but it means ensuring we’ve got a strategy to use AI for the data that we want to use AI for, and that we segment away the data that we do not want AI to reason over. So start with getting the basic skills built out.
A lot of the organizations I meet with are just at the very beginning stage of that. Use the simplest tools to accomplish the job, get some experience, and then start to think longer range around data, data classification, and the more advanced scenarios that can be applied, Tony. Can I just ask, what's that four-hour course? Let me just drill down on a free resource. I love free resources for our listeners. It's a LinkedIn course, nonprofit AI fundamentals... but let me get that for you here. OK, Tristan, go ahead. Yeah. I really like how Justin said, you know, there's a lot on nonprofit leaders' plates already, in terms of responsibility. And I want to gently push, and answer your invite to call you in, Tony, on the premise of the question, which was: what is the responsibility of nonprofit leaders now? And I would say yes, obviously there's a responsibility, as Justin has illustrated, that we need to be better in terms of strategy, in terms of tech, in terms of AI in general, on how we fold these crucial tools in. But I would also say that there's an equal, and almost larger, responsibility on those who fund nonprofits. I think a lot of times, in the nonprofits that I've worked with, interacted with, and worked within, their operational and financial model has been very ham-handedly built, in a very Dr. Seuss way, which doesn't really make sense at times. And it's because, year after year, there are different grants, different fundings, that require different things at different times, based on whatever the hot new term is. Ten or fifteen years ago it was mentoring, so a lot of times everything was geared towards mentors. And I say that because it implies that a lot of these nonprofits are already built on a structure that is very shaky. And so there's a lot of other things that need to be done.
But I do think a big responsibility sits with the folks who fund nonprofits: foundations, and also local governments and the federal government. When they are pushing a grant, or putting out an RFP for a grant, that says you need to fold in tech and you need to fold in AI in this way to get kids to learn or get kids in seats in the classroom, they need to do so in a way that creates longevity and solid nonprofit operational work, and doesn't just slap an iPad in front of a kid. I used to work with Boys & Girls Club, so that's where I always default. But I think, based on my experience, it's always been a really weird way of going into the financial model of an organization year after year, because it's like, oh, well, we started doing that because last year's grant asked for it, and now we just do it into perpetuity. And so again, you have that weird little Dr. Seuss-style way of thinking. And I think funders and grant folks can do a lot by being very clear and very forward-thinking in how they are offering up these monies. It's time for a break. Virtuous is a software company committed to helping nonprofits grow generosity. Virtuous believes that generosity has the power to create profound change in the world, and in the heart of the giver. It's their mission to move the needle on global generosity by helping nonprofits better connect with and inspire their givers. Responsive fundraising puts the donor at the center of fundraising and grows giving through personalized donor journeys that respond to the needs of each individual. Virtuous is the only responsive nonprofit CRM, designed to help you build deeper relationships with every donor at scale. Virtuous gives you the nonprofit CRM, fundraising, volunteer, marketing, and automation tools.
You need to create responsive experiences that build trust and grow impact. Virtuous.org. Now, back to Artificial Intelligence for Nonprofits, Redux. You know, it's not only the mindset, like, it feels like they're being strategic by saying, oh, we were able to pitch that in a way that got us the funding, but then that's changing their strategies all the time. It also, back to the point before, means the data you have to work with inside your organization is: OK, well, for two years we structured it this way, then for two years we structured it that way. Do we even have, like, a unique ID to connect these people and say, oh, they were in both of those programs? Our own data sets are messy and influenced by funders saying, oh, now we need you to collect these demographic markers, you know. And we as organizations are often pressured by those funders to do it the way they want, because it's easier for them and tells the story they want to tell. But that's really, really messing up the data sets, and the program processes, or business processes, that we have in place. And I just wanted to connect that to the broader things that NTEN has worked on and advocated for for many years, from the equity guide, specific to funders. And that is that funding technology projects takes time, and it takes a lot more money than, like, $30,000 for whatever the licenses are for something, right? It's not uncommon that an organization building a model, an internal-use model (this isn't some big flashy commercial thing, this is just for them to, like I said before, identify program participants that maybe could use intervention), it's not uncommon that it would take two dozen tries to get the right model in place, right? To really make sure the algorithm is fine-tuned, that the outputs are appropriate. Well, you can't go through two dozen models in three months, right?
And then have something there. A nonprofit would need a couple of years. And our funders... there are already plenty of funders saying, oh, now we have this AI grant opportunity. Is that grant gonna be comprehensive of the work to get their data in a good place? To get their program staff ready and trained, to Justin's point, every staff person trained adequately on not just what these tools are, but what's a good prompt, what's a good use case for this, right? All of those pieces, so that they can adequately and materially contribute to: then, what is this project we want to do, what is the best fit for us, and how do we build it? And just to add on, and we'll wrap up, to Amy's point and Tristan's point: AI hasn't changed the fundamental physics of what makes a good technology project. I mean, it's people, it's process, it's tools, it's capacity building, it's a long-term strategy. All that is the same. And if your listeners are wondering where do I even get started in understanding the language of this stuff, because you asked the question: it's called Career Essentials in Generative AI. It's on LinkedIn, it's free, and I've taken it, it's pretty good. So I think it's worthwhile for your listeners. Thank you, Justin. How about NTEN, Amy? What resources for folks? I mean, hopefully they're already going to the Nonprofit Technology Conference, where there are gonna be a lot of sessions on artificial intelligence. I know, because I'm gonna be interviewing a bunch of those folks. So this is probably the second of, I don't know, six or seven AI episodes, around different subjects. But NTEN as a resource for learning AI: we have lots of them. There's, you know, not a workbook, but like a guide. There's of course the equity guide, and there are some materials on the website.
We have an AI course, and other courses that talk about AI. There are community groups where you can ask questions, and of course the conference. But, thanks to Microsoft and Okta, who gave us some supporting funding, NTEN, along with Institute for the Future and Project Evident, is at the tail end of a community design process, where we've worked with over 40 organizations to create an AI framework for organizations, whether you're a nonprofit or not, who are trying to make decisions around AI. And our framing for this is the framework for an equitable world. So it isn't just that you are a 501(c)(3) registered in the US, right, or that you're a grassroots organization in whatever country. If you want to live in that equitable world, then this is the framework that we can all share and work in together. We're going to do a little preview at the NTC, and whoever comes to the session is gonna get to road test it with us, and then we'll publish it publicly after the NTC. So lots more, and obviously I'll share it with you when it comes out. But what's really important from this, I think, is that it is a framework built on the idea that all of us are part of these decisions, that all of us have responsibility in these decisions, and that all of us are accountable to building, right? This isn't, you know, the quote-unquote responsible tech. This isn't just for those projects where you're gonna do something good over here. This is: whatever we're doing, it's gotta be good. It's gotta be building us into an equitable world, because what else are we doing here, right, if it's not for that? And so I'm excited for folks to get to use it. It'll be published for free, everywhere, for anybody to use. Please go, you know. So lots more on that too. Amy, you are perfectly consistent with the framed quote that you have behind you: all of us are in this life together.
You're living your framed art. I admire it. Justin, we've got maybe 10 minutes left. What would you like to talk about that we haven't touched on yet, or go further on something we have? Well, maybe I'll just build a little bit on what Amy said, and the question you asked: what are the resources available? So I think that's pretty useful to the organization. One is the training that I mentioned. Two is, we just recently ran a Nonprofit Leaders Summit where we had 5,600 people together, about 4,500 online and about 1,000 in a room, talking about how do we grapple with AI, what are the use cases that make this make sense, how do we think about data security and privacy? And we're going to continue to invest in that. We're going to be rolling that out more globally as well, with events in Australia and elsewhere. But that convening, and just getting the community in dialogue, I think is so important. I learned a ton from that. We're also going to continue to push on affordability, making sure that we've got affordable access to our technology so that every organization can use things like Microsoft Copilot for free, provided, you know, that they've got access to our nonprofit offers. And then finally, innovation. I'm interested in looking at scenarios that span the sector, where if we invest once, we can create a multiplier effect. And one of the areas where we're partnering is with Save the Children, Oxfam, and many other organizations on the Humanitarian Data Exchange, which is a large data set used to help organizations coordinate humanitarian and disaster relief, domestically and internationally, in a more effective manner.
So our missions don't overlap. But that data set hasn't been super useful to date. Applying things like language models, training on that, and creating a tool set that is cross-sector, for many organizations: you'll see us continuing to invest in that way. And I look forward to ideas from our NTEN partners here on the phone, as well as, you know, the community at large, on where we can make bets that will really help the sector move forward together. Tristan, what would you like to touch on, or go deeper on? We've got seven or eight minutes or so. You know, I think I just wanna underscore what Amy was talking about, and what we've all been working on, which is... I'm a little tired of you underscoring Amy. You and Amy, you reinforce each other. You should have seen us; we work together. It's getting a little dull. It's a little dull now. You should have seen us when we were in the office. Our desks were 20 feet away from each other, and there was a worn line between our desks, and nobody wants to be in that 20-foot space. I will say, being a part of the community group, what Amy was saying about working with 40 other organizations to figure out what healthy, robust, and equitable processes look like for any organization to interact with and field AI, is crucial, and I'm so glad that we were able to be a part of it. And we're going to debut it at the NTC. It's something that I've learned a lot from, as someone who, again, like I said before, came from youth development. My degree is in child psych. But I've learned a lot over the years, working with NTEN, working at NTEN.
But I think one thing that's been really, really beneficial is learning from all those folks in the group. And a couple of things did come up when we were creating that framework. One is that organizations are making all kinds of decisions every day, today. And I will say, it kind of highlights that when we talk about AI, and how it will look, sound, and feel, we're not meaning it to be, but it's all kind of within a vacuum. And we can't think like that. All of us, we are now four years out from 2020. Our lives were forever changed, and every nonprofit will have their own sad story to tell about how the pandemic impacted them. And I say that to say that no one was prepared for that. So if we keep on talking about, or playing around with, this idea of AI like it's going to solve problems, or like it's going to sit in this world in a vacuum, we're not doing ourselves justice, and we're being very forgetful about the past that we just went through. If instead we're able to consider how AI will interact with the dynamic world that we all live within, that's going to better behoove us, both individually and organizationally, when we're planning strategically, whether that's year after year for you, or every five years, whatever that is. So, having that strong tech baseline for folks. And then I think the other thing is: people in all roles are considering AI and aren't sure how it applies to them. Staff, we've read stories that AI will replace workers, but have no idea what to do with that, you know, where that fear sits with them. It should just add to their work and not replace them.
And I think a lot of what we're seeing... you know, I am on TikTok, and so, you know, that's a whole other bag of algorithms, and things that we can dissect and pull apart. But there are a lot of stories of folks getting laid off left and right. And, you know, that begs the question why, generally, but also: what is the role of AI in all of this too? I think it's really interesting when layoffs happen at a time when AI is accelerating in a lot of our worlds, whether it's in tech or in other sectors across the world. And I think there is a lot to be done by organizations who don't fall prey to the siren song of AI, and are going in clear-minded, and not saying, oh, well, we can cut out this department and put in this learning module, or this, you know. I think that's really where you're going to see a lot of organizations and companies thrive, as opposed to just laying folks off. A lot there. No, yeah, we're taking it in. And the reality is, I admire the consistency between you and Amy. And generally, I mean, I made fun of you, but what it shows is you're all thinking the same way. You know, you've all got the same concern for the nonprofit community. Human first, human first. You know, like, we're all humans, and we're all prioritizing us as humans, and if we start prioritizing other things, it's not going to go well. But at NTEN, you're walking the talk, and consistently. Amy, you want to take us out with "all of us are in this life together"? Yeah. I mean, I think the biggest thing I want folks to leave with is that that future is not predetermined. We are not sitting down and saying, well, OK, I'll wait for my assigned robot to come tell me what to do, right?
It is still up to all of us to write that, every day. And the people who most need to have their sentence at the start of the article, or whatever, you know, at the start of the book, are the folks who are being told, in a lot of different systemic, media-type ways, that they do not get to have their sentence in the article, you know. And so I hope that nonprofits know this is both an opportunity, as AI tools are being developed, to shape and influence the tools that we build within our sector, for ourselves, with our communities. But it's also a responsibility. Nonprofits, who are the ones often closest to and most trusted by those systemically marginalized communities who are experiencing the most real-time harm, should be the supporter that brings them into that work. They are not necessarily going to get tapped by a company to learn this or do whatever, even though I hear Justin saying these opportunities are free and accessible. You as a nonprofit can say: we think we might build something. Can you be on our design committee? Can you work with us? We'll make sure that we all learn together, right? As an organization they're already in relationship with, where they've, you know, maybe benefited from programs or services, you have the responsibility, and the incredible opportunity, to be the conduit for so many communities to enter this quote-unquote AI world. And that's a really important gift, I think, that we have as a sector: to be the ones helping make sure so much more of the world is part of developing these tools, and designing them to be accountable to us as people. That's Amy Sample Ward, our technology contributor here at Nonprofit Radio and the CEO of NTEN. Also Tristan Penn, equity and accountability director at NTEN, and Justin Spelhaug, the new corporate vice president and global head of Technology for Social Impact at Microsoft. My thanks to each of you. Thank you very much.
Real pleasure. Thanks so much, Tony. Thanks, Justin. I'll see you in 20 feet. Thanks so much, Tony. Next week: The Generational Divide. Now, this is interesting, because we've been promising this for a couple of weeks now and it hasn't materialized. It's very relieving to have someone, an associate producer, who I can blame for this show, The Generational Divide, having been promised for weeks on end and not coming through. Even though it doesn't matter that the associate producer, Kate, has nothing to do with booking the guests, that the host takes care of that himself, that's irrelevant. I blame the associate producer, and this show, The Generational Divide, had better come through next week, or there's gonna be a shake-up. I'm the one who just reads the script. Either... oh, yeah, minimize the... OK. Your title is not script reader, it's associate producer. Well, if you missed any part of this week's show, I beseech you... look, I was slow on my cue there. I beseech you, find it at tonymartignetti.com. We're sponsored by Donorbox. Outdated donation forms blocking your support of generosity? Donorbox: fast, flexible, and friendly fundraising forms for your nonprofit. donorbox.org. And by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving. virtuous.org. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. The show's social media is by Susan Chavez, Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that information, Scotty. You're with us next week for Nonprofit Radio: big nonprofit ideas for the other 95%. Go out and be great.