
Nonprofit Radio for September 22, 2025: The State Of The Sector (Beginning With AI)


Gene Takagi & Amy Sample Ward: The State Of The Sector (Beginning With AI)

This year, any conversation about the nonprofit sector finds its way to Artificial Intelligence. So we start there, with our contributors Gene Takagi on legal and Amy Sample Ward on technology. Amy is concerned about our lack of security readiness and shares their Top 5 security must-haves. Gene explains your board’s duties around tech, budgeting and planning. They both see resilience as critical. Plus, a ton more. Gene is principal attorney at NEO Law Group and Amy is the CEO of NTEN.

Gene Takagi

Amy Sample Ward


Listen to the podcast

Get Nonprofit Radio insider alerts


We’re the #1 Podcast for Nonprofits, With 13,000+ Weekly Listeners

Board relations. Fundraising. Volunteer management. Prospect research. Legal compliance. Accounting. Finance. Investments. Donor relations. Public relations. Marketing. Technology. Social media.

Every nonprofit struggles with these issues. Big nonprofits hire experts. The other 95% listen to Tony Martignetti Nonprofit Radio. Trusted experts and leading thinkers join me each week to tackle the tough issues. If you have big dreams but a small budget, you have a home at Tony Martignetti Nonprofit Radio.
View Full Transcript

Hello, and my voice cracked. Welcome to Tony Martignetti Nonprofit Radio, big nonprofit ideas for the other 95%. I’m your aptly named host and the podfather of your favorite hebdomadal podcast. Oh, I’m glad you’re with us. I’d suffer the effects of chondrodermatitis nodularis helicis if I heard that you missed this week’s show. Here’s our associate producer Kate with what’s on the menu.

Hello, Tony.

It’s so funny, that voice crack, like I’m 14.

Hey, Tony, I hope our listeners are hungry. The State of the Sector, Beginning with AI. This year, any conversation about the nonprofit sector finds its way to artificial intelligence. So we start there with our contributors Gene Takagi on legal and Amy Sample Ward on technology. Amy is concerned about our lack of security readiness and shares their top five security must-haves. Gene explains your board’s duties around tech, budgeting and planning. They both see resilience as critical, plus a ton more. Gene is principal attorney at NEO Law Group, and Amy is the CEO of NTEN. On Tony’s Take Two: tales from the gym, the cure for dry eyes. Here is The State of the Sector, Beginning with AI.

It’s a pleasure to welcome back Gene Takagi and Amy Sample Ward, our contributors to Nonprofit Radio. Gene is our legal contributor and principal of NEO, the Nonprofit and Exempt Organizations law group in San Francisco. He edits the wildly popular nonprofitlawblog.com. The firm is at neolawgroup.com and he’s at GTak. Amy Sample Ward is our technology contributor and CEO of NTEN. They were awarded a 2023 Bosch Foundation fellowship, and their most recent co-authored book is The Tech That Comes Next, about equity and inclusiveness in technology development. You’ll find them on Bluesky as amysampleward, aptly named. Welcome. Good to see you both. Gene, Amy, welcome back.

Good to see you both as well. I actually got to see Gene in person this week, which was a real treat. But your faces coming through the internet.

Where?
In DC, in a meeting.

Oh, cool.

Yeah, it was wonderful to see Amy and hear a little bit more about her family and learn about things going on. And great to see you too, Tony.

Thank you. Last time we were together was the 50th.

That’s right. Yes.

All right. So, Amy, you have lots of conversations with funders, intermediaries, nonprofits. I’d like to start with you. What are folks talking about?

Yeah, I think there’s a lot of desire for thoughtful conversation across the sector right now, over the last handful of months and, I’m sure, the months to come. And that desire for thoughtful conversation is trying to be held in a time where things feel like they’re rapidly unraveling. A few patterns have been coming up, at least in the versions of conversations that I’m in, whether those are one-to-one with other intermediary organizations, capacity-building organizations, nonprofit service groups, or even philanthropy-serving organizations, or with funders themselves. And they’re, of course, different flavors of the same dish, maybe, but I think everyone really wants to hear and help, and it feels like there’s not that much help happening.

When you talk to funders, I presume you’re talking about, how does that go? Like, you should be funding technology, you should be funding capacity building, you should be advocating for things?

Yeah, I mean, part of what NTEN sees as our kind of theory of change, in the way that we make impact, is of course directly supporting nonprofit staff through training, but also shifting the conditions in which all of us are doing this work.
Right, so asking funders to fund adequately for the technology and data that is needed to deliver the programs they’re funding, right, is part of that, or all kinds of other advocacy, big A, little a, you know, influencing philanthropy.

And do they take these meetings? Like, they don’t mind being told what they ought to be funding?

Oh, it’s easy to take a meeting. It doesn’t mean you’re implementing. What’s the outcome, and what’s the action?

I realize that. But, OK.

I think that most of the conversations NTEN has entered into with foundations are not necessarily on the premise of, like, can we please give you this feedback to fund a certain way, right? We just say that when we have access to folks that we could share it with. But mostly, I think in these times, just like honestly in 2020, funders and other philanthropy-serving organizations are asking for what we see, because we are able to see into a lot of different types of organizations across the sector, not even just in the US, and see trends that are emerging, see what folks are really asking for help on, in a way where we’re not having to divulge, oh, this organization that’s your grantee, they don’t know how to do this, right? There’s not that vulnerability. We’re able to share trends, and unfortunately, the trends aren’t new, but at least they’re asking about them right now, and they are very vulnerable issues. Like, we are seeing an incredible lack of security readiness in organizations.
And as we’ve talked about on this show, and Gene has talked about, there’s a lot to be concerned about when you think of a nonprofit organization’s digital and cybersecurity, because it’s your staff, it’s your content, but it’s also all of your constituents, all of those people who’ve received programs and services. And if your mission and your programs and services are vulnerable, those folks in your community who’ve accessed them are 10 times more vulnerable than your organization is. And that’s something that, for us, we care about more than anything. So it really has felt like a spotlight on security. And even just to illustrate, we created a new program just to try to help in this way, a three-month, security-focused program. We had a single email that said that it was open. In four days, we had 400 applicants from 26 different countries asking to be in the 20-person cohort. So that was, I think, validation that we were really hearing the trend and hearing, OK, what’s behind some of these questions that we’re getting? What are people really struggling with? And oh my gosh, OK, we’re right, they are really struggling with security.

Let’s bring Gene in on security. You’re nodding a lot, Gene. And we have talked about it, as Amy said, but it bears amplification, because we all have talked about cybersecurity, protecting data, but especially, as Amy’s saying, the people you’re doing the work for, if you’re involved in people-oriented work. Gene, remind us.
Oh, I’m amplifying everything Amy says, as I’m wise to do. But maybe I’ll just add that when people, including funders, think about technology, and some of them are just focused on AI right now, but technology is much broader than that, of course, they really have to think of it as one of the core assets of an organization. And that’s not all, because it’s also a huge risk and liability, not only to the organization but to all its beneficiaries, the communities that they serve, and the communities that they exist in. So it’s all of that. It’s even more complicated to manage, if I might venture to say this, than your other main investments, which are in staffing and in facilities. This is stuff that we don’t have a lot of experience with. It’s newer things that are coming up. We haven’t learned how to manage it very well. It’s a little bit out of control as it develops, as with AI. We don’t even know what the laws are related to this. So this is stuff that funders need to fund and organizations need to invest in really badly, and when they don’t think about doing this, they’re really living for the short term at the expense of the intermediate term, because it’s not even that far off in the future where these risks will ripen. They will ripen very, very quickly now. So that’s my two cents.

And to add to what Gene’s saying, I talked to two different funders who are regional funders, not national funders, and said, hey, I know the folks that are your grantees. They’re predominantly rural organizations. They’re predominantly very small organizations, you know, single-digit FTEs. These are folks that we can see in our data, not as individuals or individual organizations, but by kind of organizational demographics, are very likely to have really low scores, you know, in effectiveness in these areas. We have free resources.
We’re not even asking you to fund us necessarily, which I should have been asking, but coming at it from, really, how do we get these resources available to organizations who we know are vulnerable? And their feedback was, well, security is not an issue that any of our grantees have raised with us. And I just want to pause there, because why would a grantee, in the vast power imbalance between a very small rural two-person organization and a funder, say, we don’t have a security certificate on our website, we don’t have a secure donation portal, we don’t have a database protected? Like, why would they surface these to a funder? Of course no one has brought this up, right? Why would they? You need to be thinking beyond what was in that grant application, and about really the safeguarding of that mission.

Not only why would they admit it, but it may very well have nothing to do with, although it is related to, what they might be seeking money for. But it’s not in the grant application.

Yeah, right. It’s not gonna be a question on the grant application: do you have a secure fundraising portal?

Gene, you have some advice around boards. Like, this should be a board-level, CEO conversation, right?

Yeah, I mean, that’s where it’s got to get started. And very obviously, technology comes up as a budget item, right, for the board. So when boards are approving annual budgets, are they leaving any space for technology changes? So many organizations, including governments, are just putting in patches, right? They’re investing in patches, and so they’ll patch, patch, patch. But the technology is advancing so much quicker than patches can actually address. And again, the persons and organizations at risk are not only the charity itself, right?
It’s all of the beneficiaries whose data they’ve compiled, and it potentially goes beyond that as well. So it’s really, really important now for boards to say, let’s think about this as one of our core assets and our core risks, and figure out how we’re going to properly budget for this item. And talking about risk and opportunity assessments: I’m a big fan of scenario planning, and maybe it’s hard because these things don’t have definitions, but over strategic planning for a longer-term plan, I think scenario planning right now is really important, because the environment is just shifting so quickly, right? It’s shifting every few months, it feels like. So scenario planning for different scenarios, and some of that would be, well, what happens if we don’t change our technology, or what happens if we don’t invest? What are the worst things that can happen? What are the likely things that are gonna happen? And do we actually have board members who understand any of this? Do we need to relook at our board composition? Do we have anybody younger than 50 on our board? And for a lot of organizations, too many organizations, the answer is no, which will hurt you in the fundraising pipeline down the road very quickly as well. We’re not incorporating enough Gen Z and millennials into governance and leadership positions, as boomers and even Gen X are hanging on to positions longer. You know, for a reason, for a good reason, but we need to bring more younger people into the pipelines, because they have perspectives. They have a lot of what’s at risk here as well. So that’s kind of my thinking. With respect to fiduciary duties: in the budgeting, they’ve got to understand it.
In the recruiting for board members, they’ve got to figure out how to develop the pipeline of who to bring in on the board. In their duty of loyalty to the organization’s best interests, they’ve got to be thinking not only about the purpose or the mission of the organization, they’ve got to be thinking of the values of the organization, including how much they value the community. And all of this relates to the organization’s, what I’ll call its reputation, or just its legitimacy to the public, at a time when the government is poking holes at organizations’ legitimacy. If you haven’t earned that from your own community, fundraising and everything else will just dry up. So you’ve got to invest in legitimacy. If you’re not investing in technology at this point and protecting the persons that rely on you to safeguard their data, you’re gonna lose legitimacy really quickly, and you’re gonna be irrelevant or, you know, liable.

Two quick things to add to what Gene’s saying, on the staff side but then also on the board side. Plus a million to everything Gene said about making boards more diverse, including age, but I don’t want folks to think that that means, because you need to have a 25-year-old on your board, that they’re now in charge of your technology. The board’s job is not to be in charge of your technology, but having more folks in that board meeting who have perspective, or experience that a lot of different things are possible, helps open up strategic conversations, to say, hey, have we considered this? Not that I’m now the implementer because I’m the board member. But it really does help, and I just want to draw that line: we’re not saying make someone on your board in charge of technology, but having people comfortable with technology strategy conversations is very, very valuable, of course.
The other side, on the staff side: one thing we see in our research and our different assessment tools and in our programs, yes, there are still organizations that don’t have all the policies that they could have, right? They don’t have a strong data retention policy. They only think, oh well, payroll files or HR files, right? They’re not thinking about all of the data, all of the content, all these different things. We could have a big policy book, and there’s work to be done there. But the real area of vulnerability that we see is organizations likely have some policies, but they do not have staff fidelity to those policies. So you could go through a checklist and be like, yep, data consent policy, data collection policy, but staff don’t know the policies exist, and they are not practicing them in any consistent way. And so I wanted to go back to the scenario planning note, because, yes, you could bring in a consultant, or you could get some sort of big security test going. But what you could also do is, in a staff meeting, just take that time and say, right now, if we got an email that we had been hacked, what do we all think we would do? And just talk it through together, and see, oh, this person thinks we would do this, and this person over here says, oh, we have an account here. What do we have? What is our answer, right? What are the questions we don’t know how to answer? Let’s go answer those questions for ourselves, and really have more opportunity, I think, to surface with staff where people don’t know something. Not in a shame way, but in a, like, gosh, this is what we should focus our training on. It isn’t just, let’s draft another policy. Let’s understand how to do these things, as the people doing them every day.
Amy, in a couple of minutes, after Gene and I talk about something I’m gonna ask him, I’m gonna ask you something. I don’t want to put you on the spot with no forewarning. Our audience is small to mid-size, so let’s go more toward the smaller. Let’s take a 15-person nonprofit. I’m not sure it matters what the mission is; I don’t want to constrain you, I want you to think broadly. I’m the CEO of a 15-person nonprofit, with maybe a $3 to $4 million annual budget for 15 full-time employees. What I’m gonna ask you in a couple of minutes is: what basic things can you name for us that we ought to have?

OK. Yeah, I know, you’re gonna start writing. Thank you.

Gene, I want to ask you, let’s talk about the core assets of a nonprofit. I love that you’re identifying technology as a core asset. Are there other core assets that I’m not thinking of?

The staff is typically number one, right? Facilities is typically a pretty big investment, although that’s been changing with a lot of remote working now, and organizations seeking to downsize how they allocate where their investments are, where their assets are. Staffing is also changing, in part because of technology, right? So if technology isn’t in that bucket, you may be downsizing staffing, you may be reducing facilities, but why is that happening? Probably somewhat related to your technology. If your funding stays stable, and I know that’s a big assumption, probably technology is playing a part in that. Is your technology gonna break down, like, in a year? That’s something to really think about.
If you’re now reducing staffing and reducing facilities, relying on technology that’s gonna break down in a year, or give you problems in a year, or create harm to your beneficiaries, that’s the big one that Amy raised that really hits home for me. Now you’ve got to really rethink: what was the board doing? Did you even think about that? So, as part of your fiduciary duty of care, and again I love to think of it in terms of both the mission of the organization and the values of the organization, which, if I bring it down to fundamental human rights, is preserving the dignity of your beneficiaries, right? If you’re not safeguarding your private data, if you’re letting health data flow away, and this includes your employees too, right, your key stakeholders, if they can’t trust you, then your legitimacy is also gone. So you’re really just shooting yourself in the foot. So boards have got to now rethink: maybe we weren’t thinking about technology that way so much before, but as we’ve seen the exponential changes technology creates for our organizations and the environments, and what we invest in and what our risks are, boards have got to be in the mix. And I agree absolutely with Amy: it shouldn’t be the 30-year-old or 25-year-old board member who’s, like, OK, you’re in charge of the technology.

Yeah, no, but it’s another perspective in there.

Yeah, and it’s better informed. Look, I’m the oldest person on the meeting, in our chat. They’re better informed. They have a fluidity. They think about things that a 63-year-old is not gonna think about, or a 55-year-old is not gonna think about. So I’m just kind of fleshing out: yeah, of course, different perspective, but how so?
Because, depending on their age, they either grew up with, you know, technology as an add-on to my life, or some people have had it since, like, age 5. You know, I had a rotary phone at age 5. And I always dialed it backwards. So, you know, I was challenged from the beginning. Our colleague is looking up from their homework assignment. What can you enumerate for us?

I have five things I wrote down off the top of my head. I don’t know that if I had 50 minutes instead of five minutes I would write the blog post with these same five pieces, and I know you gave me an organization, kind of 15 people, $4 million, but I don’t think any of these are unique to that organization. So I just want to say that. The first is cyber insurance. I know everybody thinks, let’s make sure we have our D&O in place. Check the box for some insurance as well.

D&O, directors and officers insurance, in case you’re not familiar with that. That’s essential, you should definitely have that. Directors and officers, thank you.

Yeah. The second piece I put down was data deletion practices. I feel like there’s such a focus on preserving data and content beyond all human reason, but actually, to what end do you have this? Especially, to Gene’s point before about the dignity of people: if they’re not in your program, you’re not reporting on them to a funder, why are you saving every bit of this, if it means somehow that list is taken? And we talk a lot in our kind of closed cohorts, when we’re working with organizations, that it isn’t that we don’t think there’s value in being able to look at longitudinal data of your programs and do that evaluation, but you don’t need to know that Amy Sample Ward was the person in that program, right?
There are ways that you could anonymize the data and still preserve the pieces that are helpful for your program evaluation, while removing the risk of it still being me or Gene or Tony associated with it. So I really think deletion practices, and policies that dictate when you delete things, how much of it you delete, what you anonymize, are really important.

Third, and this is, I think, hopefully more top of mind for folks since so many organizations became hybrid or virtual or remote permanently from the pandemic: content and machine backups and redundancy. I see a lot of organizations who say, oh, but we use the cloud, right? We use Microsoft 365 or we use Google Workspace. OK, but in your day to day, is every single document that someone’s working on in those systems? And if they’re downloading it to work on it offline for any reason, well, does it have data in it? Does it have constituent information in it? But also, if someone’s working on something and their, you know, computer is stolen or broken or vulnerable, is all of that backed up somewhere? It’s quite simple to set a full machine backup to the cloud every day, too, right? But it just takes thinking of that, prioritizing it, and setting it up. Including, with that, recognizing that employees might be using their own devices. They probably shouldn’t be, or you should at least be funding their technology, their monthly Wi-Fi bill, etc. But beyond that, recognizing that they may not even be using exclusively your technology, so then what’s the redundancy and backup on their own devices? Technology policies that say the only tool you could use is the laptop we gave you are intentionally limiting your own understanding of how those workers are working, because there’s no way that they are only using that laptop you gave them.
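Amy's anonymization point, keep the fields useful for evaluation, drop the identifiers, can be sketched in a few lines. This is a hedged illustration, not an NTEN recipe: the field names are hypothetical, and the keyed hash (HMAC) gives each participant a stable pseudonym, so longitudinal evaluation still works, but the token can't be reversed without the separately stored secret.

```python
import hashlib
import hmac

# Hypothetical sketch: field names and the salt are made up for illustration.
# Keep SALT out of the dataset (e.g., in a secrets manager), so tokens can't
# be reversed by hashing a list of known email addresses.
SALT = b"store-this-secret-separately-and-rotate-it"

def pseudonymize(record: dict) -> dict:
    """Swap direct identifiers for a keyed hash; keep fields needed for evaluation."""
    token = hmac.new(SALT, record["email"].lower().encode(), hashlib.sha256).hexdigest()[:12]
    return {
        "participant": token,          # stable pseudonym: same person -> same token
        "program": record["program"],  # longitudinal evaluation fields survive
        "year": record["year"],
        "outcome": record["outcome"],
    }
```

The same idea pairs naturally with a deletion policy: records past the retention window get pseudonymized like this, or dropped entirely.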
So, having a policy that says, this is how you safely access our tools, whether you’re using our laptop or not, at least allows you to build the practices, the human side of security, into that use, instead of pretending it doesn’t happen.

Yes. OK.

Number four and number five are somewhat similar, but again, this is where we see big breakdowns in practice. Number four is that every system that can have it has two-factor enabled and required. There are so many ways to do two-factor that it isn’t an excuse to say it’s burdensome. It doesn’t have to be a personal text message. It could be an authenticator app, whatever, but you need to have two-factor on everywhere. And you need to be using a password manager, so that staff are not sharing passwords with each other by saying, hey Gene, the password to our every.org account is this. Oh my God. We can both log in, but it’s encrypted, we don’t see the password, right? We’re sharing it in a safe way.

And then the last one, number five, is, again, a practice: organizations have established processes for admin access, for if you get locked out of something, so that it is not, I email Tony and say, oh, hey, will you send that password to me? Most of the security vulnerabilities that we see with organizations aren’t because somebody was in a basement and hacked their way in. It’s that they sent one phishing email, and a staff person responded and was like, oh yeah, here’s your password. It wasn’t hard to get in. So if you have a policy that says you’ll never email each other to say, I got locked out, then what is a more secure way? OK, well, I call you on the phone. We have a secure password that we say to each other that only staff know. I’m not saying that has to be your plan, right, but it isn’t just randomly, oh, the ED sends an email to the staff person that says, please reset my password.
Like, I don’t think that’s gonna be foolproof, you know.

OK, so it’s as simple as a procedure for what happens when somebody can’t log in.

Exactly, because that does happen. So why not create something where everybody on the team knows, this is what we do. I know I’m doing it safely, and following the procedure.

OK, those are pretty simple. You might say, well, cyber insurance, that’s not simple, it’s not like I can do it today, but you can talk to insurance brokers for cyber insurance. Data deletion policy: I’m gonna venture that NTEN has a sample data deletion policy in its resources.

There you go.

Backup and redundancy. Is there advice about that?

Yeah, there’s lots of it, but I’ll put it on our list to make sure that there’s some guidance on that on our cybersecurity resource hub, which is all free resources. I’ll make a note of that.

Beautiful. Two-factor and password manager. All right, I think that’s pretty well understood. I mean, I have clients that use the Microsoft Authenticator. As soon as I hit enter on the laptop, I can’t even turn to my phone fast enough. The Microsoft Authenticator app is already open; I’ve already got the notification in the second it takes me to turn from one side of my desk to the other. So it’s not like there’s a delay. And a procedure for not being able to log in, I bet you could find that on the NTEN site too. All right, thank you for that quick homework. This is eminently doable. And then, of course, you have to go deeper. There are policies that you need to have, but I wanted something kind of quick and dirty, so thank you for that. All right.
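A side note on the two-factor item above: the six-digit codes an authenticator app shows are not magic. They are defined by RFCs 4226 and 6238: an HMAC-SHA1 of a counter (for TOTP, the current 30-second time window) under a secret shared during enrollment. A minimal sketch, for understanding only; use a vetted library or your identity provider's implementation in production:

```python
import base64
import hmac
import struct
import time

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 of the counter, dynamically truncated to n digits."""
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks a 4-byte window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret_b32: str, period: int = 30) -> str:
    """RFC 6238: HOTP where the counter is the current time step."""
    return hotp(secret_b32, int(time.time()) // period)
```

Because both sides derive the code from the shared secret and the clock, no code ever travels over email or SMS, which is exactly the property Amy is after.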
Should we turn to the general state of the sector from our cybersecurity conversation?

Sure. Amy, you wanna kick that off?

Yeah. I do talk to lots of people, and I think we’re hitting the two-year mark of kind of the unavoidability of people constantly talking about AI, which I have my own feelings about. But if I step out of any one day’s conversations about AI and look at the last two years, we’re in a very different place in those conversations, in a way that I think I finally feel good about how the trend is going. A lot of one-on-one calls I have with really diverse organizations, small advocacy organizations, global HQs, all kinds of folks, are: how do we not use the tools that are being marketed to us? How do we build a tool that’s purpose-built, that’s a closed model, that’s just the content we want it to have, and actually useful for us? Which I think is really exciting, that folks are kind of seeing that it’s just technology. Yes, it has different capabilities, different tools do different things, of course, but I’m really excited that it feels like folks are trending towards, well, we have some use cases, how do we build for those use cases? Versus, we want to adopt these things, how could we find something to do with these things we want to adopt, which I think was the reverse order of it all.

You and I have a friend who is devoted to this exact project, George Weiner, CEO of Whole Whale. They’ve created Cause Writer, CauseWriter.ai, which is intended exclusively for the use of small and mid-size nonprofits. A limited learning model, your content safe within it. And not being skilled in artificial intelligence, that’s about the most I can say about it.
But Whole Whale, and they’re not the only one, I’m sure, has created a product specifically to take advantage of the technology of AI but reduce a small and mid-size nonprofit’s risks around your use of it, in terms of what it brings in and how it treats the data that you provide it.

Yeah, Cause Writer, Change Agent, there are a number of folks in the community trying to help organizations in this way, which I think is great. But a smaller trend in the last couple months in these AI conversations: I’m hearing from folks saying that they can tell, for example, a colleague used ChatGPT, Gemini, a large tool like that, to make this proposal that they sent to them, or this email. And when they say, hey, it’s really clear that you used gen AI tools to write this, could we talk about it and get into your thoughts more about it? Where they had in the past felt that folks were like, oh yeah, I did, and here’s what I was thinking, now there’s just complete denial that the tools were used.

They lie. People lie?

Yes, that’s right. And so they’re like, well, how do we have strategic conversations about the way we use these tools if you’re going to deny that you’re using them?

Well, let’s talk about that. When you lie to someone about anything, even something that seems innocuous to me, including AI, well, I’ll leave my own adjective out of it. I think it’s innocuous; the technology is so ubiquitous. But all right, if you lie about anything, you lose legitimacy. If I were a funder: OK, thank you very much, goodbye. Because you just lied to me about something that I don’t think is such a big deal, even. And I was able to point to it, and I’m giving you a chance to overcome it.
I want to have a chat, human to human, and you’re denying the premise of my question. OK. All right, so I’m shocked, obviously. I really, I’m dismayed that people are lying about their use. That’s completely contrary to what the advice is, the ubiquitous advice, that you’re supposed to disclose the use. Right. I’ll just throw in there that. Please, Gene, get me off my, push me off my soapbox. Well, back to kind of board composition, if you ask a bunch of board members, I think many of them would say AI is just like one thing. They have no idea that like AI is a million things, right? And you’re probably using many, many forms already, whether you realize it or not. Even on a Google search, like, you know, AI is popping up now, you might, that might be a little bit more obvious now, but just to, to know that AI, if I compared it to a vehicle, for example, it could be an airplane, it could be a bicycle, it could be a tank, right? They, they all have very, very different purposes and repercussions, and so you have to understand that, like, oh, we’re gonna like invest more in AI, that doesn’t mean a whole lot. So, um, to figure out what your, what your strategy is again, I, I, I think, um, cybersecurity, and when, when organizations are gonna venture off into AI a little bit more, they’ve got to see it as part of governance and not just information technology. It’s not just a, uh, a management tool, it’s part of their governance responsibilities. It’s time for Tony’s Take Two. Thank you, Kate. Got another Tales from the Gym. This time, two folks whose names I don’t know yet, but I do see them fairly often. They’re not as regular as Rob, the Marine, semper fi, or, uh, Roy. I’ve talked about Roy in the past. Not, not, not as common, but we’ll, we’ll, we’ll find out. Like I did find out the, uh, name of the sourdough purveyor, you recall that, just a couple of weeks ago.
Uh, I, I’m gonna hold her name, it’s in suspense now, but, uh, I learned her name, the, the one who gave the sourdough to, to Rob. So these two folks were, one of them, uh, the guy, suffers dry eyes. And the woman he was talking to had the definitive cure for dry eyes. You have to try this. And she was on him for like 5 minutes, you gotta try this. Hold, hold on to your, make sure you’re sitting, because you know you’re not, you, you’re not gonna wanna, you’re not gonna wanna stumble and fall down when you hear the startling news of the dry eyes cure of the, uh, of the century. Pistachios. Pistachios. She was very clear. A 1/4 cup. She, she did not say a handful, which to me a handful is a 1/4 cup. She didn’t say a handful. It’s a 1/4 cup of pistachios daily, right? This is a daily regimen you have to follow and you will get results within 3 to 4 hours. She swears it, 3 to 4 hours, your eyes are gonna start watering. It’s gonna be like you’re crying and tearing, like you’re at a funeral or a wedding. That’s how much water you’re gonna have. All right, I editorialized that, I added the wedding funeral, uh, uh, analogy, but she swears within 3 to 4 hours your eyes are, are gonna be watering. Follow the regimen, pistachios. She was also very precise. These are shelled pistachios. You don’t wanna get the, uh, the unshelled ones, too much work, uh, which to me, that’s interesting. Now that’s, that’s contrary to the advice that I’m hearing on, uh, YouTube. There’s that guy on YouTube, the commercial that I always skip, but sometimes I listen, uh, Doctor Gundry. You may have heard Doctor Gundry on the YouTube commercials. He talks about pistachios.
He says get the unshelled ones because that way you won’t eat too many of them because you have to go through the task of shelling them yourself so you won’t eat too many because too many pistachios, according to Doctor Gundry now this is too many pistachios is bad, but the right amount of pistachios is, is, is, is beneficial, but he’s not as precise as the gym lady. He does not say Gundry, you can’t pin Gundry down. Of course, I didn’t listen to his 45 minute commercials, so, you know, I listened for like 7 minutes and I got the, the shelling, uh, the tip from, uh, from Gundry. So, He’s not as precise as the uh the dry eyes cure lady. A 1/4 cup of pistachios shelled every day. You’re gonna get immediate results. That’s all, it’s just that simple. cure the dry eyes. Don’t buy, don’t buy the over the counter. Don’t buy the saline in the bottle. Don’t buy the uh red eyes. Well, red eyes is a different condition that, uh, it’s different. She doesn’t claim to have a cure for that. Dry eyes, she, she stays in her lane. She’s in her lane, dry eyes. That is Tony’s take too. Kate. I like the specificity of the uh the shelled unshelled unshelled, no, no, no, get the shell, the ones without the shell, they’re already been shelled. She’s very precise cause that, because the shells are gonna take up more capacity and you know, and then you’re not gonna get the full 1/4 cup uh therapy. The treatment is gonna be lacking because you’re not gonna get a 1/4 cup because the shells are taking up space in your measuring cup. Well, then my next question would be like, salted, unsalted, old bay, no old bay. It’s like, Well, you should have been there with me. Uh, she didn’t, she didn’t specify. I think just straight up. She didn’t say salted or unsalted. That’s a good question. You’re gonna have to go on your own, let’s say if it’s a, if it’s a dry eyes regimen. Then you wanna, you wanna be encouraging fluids. So I would guess, now this is not her. 
I don’t wanna, I don’t wanna impugn her, her remedy, her treatment, you know, with my, my advice. Now I’m just, stay in my lane. This is not my specialty, dry eye cures, like hers. I would say you probably want the unsalted, because salt, uh, salt causes, uh, more dryness, right? Too much salt, you know, you become dehydrated, I believe, so. But again, that’s not her. You know, I don’t wanna, I don’t wanna add anything on to her, her strict regimen. Um, oh, and by the way, uh, I heard one of the, uh, commentators I listened to on YouTube said, uh, somebody had rizz. I knew exactly what they meant, yeah, I knew exactly. I didn’t have to go look it up in the, I knew it, charisma. I said, oh, I know that. I don’t, I don’t have to go look it up in the, uh, in the slang dictionary. Oh, so proud of you. Yes, thank you. That’s just a couple of days later. All right. We’ve got boo koo but loads more time. Here’s the rest of the state of the sector, beginning with AI, with Gene Takagi and Amy Sample Ward. Now, I asked about the state of the sector and we’re back into cybersecurity. It only took about 6 minutes, uh, and we’re like 1 minute in, uh, and then we just talked about it for 5.5 minutes. So, all right, but there are bigger things going on in the nonprofit sector. You know, our, our, uh, federal government, uh, the regime, uh, has gone after nonprofits, starting with the universities. Uh, I don’t think it’s gonna stop there. Um, the left is under attack in a lot of different ways, and that, that impacts a lot of nonprofits that do the type of work that is essential, you know, whether it’s legal rights or human rights, uh, simple advocacy, um, I mean, even feeding certain populations, uh, so obviously immigrant work. Um, let’s, uh, let’s go to the uplifting subject of, uh, the, uh, the state of the sector generally. Like, let’s put AI aside now for, for 15 or 20 minutes and just talk about.
What people are, what people are feeling, what people are revealing to you. Gene, I’ll turn to you first for this, you know, what, what, what do you, what are people concerned about? What’s happening? Well, um, what’s on people’s minds is what I, what I mean. Yeah, I, I think the sector is still feeling the, the impact of the broader public being very polarized, um, and the effect of not only government actors, um, uh, inflaming the polarization, but media as well, and nonprofit media is not exempt from that, uh, as well. So it really is about trying to figure out, well, how do we move forward at a time where it is so polarized, and where for many organizations the government is acting, uh, adverse to where our mission and our values are, and they are affecting our funding, and what’s gonna happen. So one of the trends going on right now, I, I, I see is, there’s a greater understanding that we’re not gonna go back to the world that, that was a year ago, right? We’re not going back there. We’re in this, what I’ll call is probably a transitionary period. I don’t think this period will last exactly like this either, but what’s gonna be next? What’s forthcoming? Is it gonna be worse? Is it gonna be better? And what can we do now as nonprofits to shape that direction? Like, we can fight tooth and nail for everything right now, but if we’re not, and by we, I’m including myself in the nonprofit sector, so forgive that indulgence, but if we can work towards a brighter future strategically, what are we thinking about, instead of just sort of defending against every new executive order or every law and just trying to sort of fight on a piece-by-piece basis to just maintain scraps of, of rights that, that we can preserve? What, what is our future plan? Um, so we’re gonna also see, with the diminished fundraising, we’re gonna see some, um, consolidation in the sector, right? There’s, there’s a lot of nonprofits out there and there are going to be a lot fewer nonprofits in 4 years.
So what is gonna happen? So we’re gonna see more collaboration. We’re gonna see more mergers. We’re just gonna see a lot of dissolutions, um, and that’s gonna mean that a lot of communities are no longer gonna be served. So what other organizations are gonna pick that up? And if we have less funding to serve communities, do we need to find ways to do it in different ways? Um, and so, you know, back to technology, people will rely on technology, but that’s not the panacea for everything. Um, and I think collaboration is going to be a big part of it as well. So yes, there’ll be some consolidation and some mergers, but there’s gotta be other sorts of collaborations, because the need is just gonna keep growing. Uh, but also trying to shape what we want in the sector is important, and to understand that we’re not the only country that’s going through this, right? And we are more and more in a, you know, and this is one world, and everybody impacts each other. And there are other very authoritarian countries that have really harmed their civil society and their nonprofit sectors, right? Yet there are nonprofits that continue to thrive in those sectors. What are they doing? What can we learn from them? What gives them legitimacy when the government is not giving them legitimacy? There’s a lot to grow from here, evolve and adapt, um, but we are, and admittedly, we’re in really, really harsh circumstances, so everybody is just sort of, you know, running all over the place without, without any direction still, but I think there’s more and more understanding that we’re gonna have to start to gather together and, and, and create some plans. I really agree with Gene, and I, I’m also thinking about how we first started our conversation and how I said, you know, I’m experiencing folks really wanting to have thoughtful conversations, even though we may not be able to even make a container for those thoughtful conversations, because of all the pressures and the anxiety and the unknowns.
And I feel similarly here, and in the way Gene has framed, framed the, the uncertainty ahead, because I see so many organizations who have never, through all the ups and downs, even if they’ve existed for 100 years, have never had to say that their mission was political, because no one has ever said that feeding hungry children was political, or that housing people that don’t have a house is political, or, or, you know, name most of the missions across the sector, right? Um, and now we’re in a place, you know, the last few months of the budget cycle and all of those debates around SNAP, and, uh, so many programs became something where we, we saw staff in the community saying, like, oh gosh, well, normally I send a newsletter, normally, you know, this is my job, and now I’m having to defend that our organization exists, and why we would exist, and, and what our programs do. But I also think, to Gene’s point, there’s so much to learn and there is so much we already know. We do know how to do our work, right? Our folks who are running all kinds of missions and movements are experts, and so even if we are, um, looking at opportunities to collaborate, not just mergers and, and acquisitions or closing, but, but really collaborate in new and different ways, we don’t need to enter those conversations feeling like we don’t know anything. We know a lot. We’re just looking for maybe new venues or ways to apply that learning and that knowledge, and I, I just, I wanna say that part because I, I don’t want folks feeling like they can’t enter those conversations because they’ve just never done it before and they don’t know what, what to even say. No, you know all about housing. You know all about resource mobilization in your community, whatever it might be, right? And so from there, there’s lots to grow from, that, that there’s already fertile ground. We, we have, yeah, we have experience, we have wisdom. Um, it sounds like, you know, you’re, you’re both talking about resilience.
You know, we, we, we need, we’re, I guess in the current moment, we’re sort of treading water to see what’s coming, as we’re, as we’re defending our, whatever, whatever our work is, or whatever is important to us personally, because we, you know, we know that we, we can’t, we can’t take on everything, but, you know, we’re, we’re standing up for what means the most to us, as, as individuals and as, as nonprofits. And then we’re waiting to see what, you know, what the future holds, um. I, I, I agree. I, I don’t, I don’t think it’s gonna be this extreme, but I also agree we’re not, we’re not going back to, uh, 2016. Yeah, I’m just a really strong believer in, in one thing you said, Tony, about, like, what we want. There, there’s some things we want, and I think that is true of most of the country. I think for a lot of things, we want the same thing, right? Fundamentally it’s dignity for everybody, um, uh, and, and dignity for our own communities. So just trying to find that, and showing how nonprofits further that goal, and making sure that your representatives know that, is really critical. So right now our, our representatives just seem to be voting as blocs, right? They just vote along party lines and they’re not doing much more, but that would change if, en masse, like, the people that vote them into power say, these are the things that really are meaningful to us, like, do something, you know, about these fundamental things. We wanna be able to feed our children, we wanna feel safe on our streets, like, they’re just fundamental things, um, and then we can talk about how to accomplish that, and we might have disagreements on, on that, but make sure the representatives know that they’re gonna be held accountable for helping people get what they really want, and the things that are, are most important to, to them.
That are meaningful to them, um, because so many things that people are shifting the arguments towards have no real meaning to their personal lives, like attacking certain groups, you know, for, for, for allowing them to have rights. Probably, you know, the people, the people attacking them, it probably doesn’t make any difference in their day-to-day lives whether those other people have rights or not, when we’re speaking about certain minority groups. But why are they attacking it? Because that makes them, or, or they’ve been positioned, I, I think they’ve been, uh, again, with, with technology and AI, they’ve been brainwashed into thinking this is the fundamental thing that separates us versus them, and we have to be better than them, and, um, I, I, I think we’ve really got to get off of that sort of framework of thinking, and really having nonprofits connected with their communities and tying them to their representatives is really, really important at this time. Yeah, that, that zero-sum thinking, that everything somebody else gets detracts and takes away from me, my, mine, whether it’s an organization or person. It reminded me of a conversation we had on the podcast. I’m trying to remember when it was, it was years ago, years ago, um. And I don’t remember if it was, uh, a political administration change or it was a natural disaster, I don’t remember what maybe the original impetus was when we, when we very first talked about this, but it is reminding me of, you know, we’ve said before, the value that every organization has in, in kind of sharing the, the information and the data and the lessons and the truth of your community and your work, so that when people are putting into the garbage machine, you know, tell me the, tell me the real, you know, stats about hunger in my city or whatever, who, who cares about that? But if they actually came to your website, as an organization that addresses hunger, and you said, these are the real numbers, right?
This is what it, this is what hunger looks like. It looks like a lot of different things, right? It’s like AI, hunger can be all these different things, um. That’s an important role in this time that every organization, I think, can be contributing, really saying, this is what we know, this is what we see, we are experts on these topics, so that there’s a little, even if it’s a small, antidote to the spin and the, and the media and the, wherever those online conversations go, at least you were kind of putting on the record what you do know and see in your work. Exactly right. I, I think I remember we were talking about how to be heard when there’s so much noise out there in the social networks and in media. How, how does, how does a nonprofit get, get heard? And part of your advice was, you have your own channels. So, and including your own website. Yeah. Thank you. All right. All right. What are you hearing, Tony? You get to talk to people all the time too. You have your own angle. You’re sitting over here grilling Gene and I. That’s not fair. Gene, I hate when they do this to me. Gene, help me out. No, um, alright, I’m gonna put AI aside because there is so much of that. Um, still, you know, funding, uh, people still reeling from the USAID cuts, you know, it fucking kills me. It’s $1.5 billion, which, there are, there are several 1000 people in the world who could pull out $1.5 billion from their pocket and replace all the AI, all the USAID funding. See, I said AI when I’m, it’s, it’s ubiquitous, it’s, we’re, we’re, we’re like, we’re, we’re conditioned. That could replace all the USAID funding with a check or with a crypto transfer, and it wouldn’t actually be cash, like, that’s bananas, and they wouldn’t miss it. So, you know, people still reeling, um, missions still reeling from the USAID cuts. I have a client that is, but I, I, I hear about it from others as well, um.
And it wasn’t just USAID, but State Department cuts that were non-USAID funds. The State Department did a lot, um. Yeah, a little, a little in media, you know, I, I listened to some media folks, um. Voice of America, trashed, trashed under, uh, what’s her name, Kari Lake. You know, uh, used to, used to, you know, like our, our soft, what’s it called, soft diplomacy, right? Like, like bags of rice, bags of flour and sugar through USAID and State Department. News and information that was trusted, unbiased. I know there are a lot of people who would disagree that it was unbiased, but still, the, the effort was to, to be unbiased, spreading news and information around the world, around the world. Uh, and then I guess also, uh, public media cuts here in the United States, where, grossly, ironically, red rural communities are most impacted, because they’re not gonna get emergency flood warnings, like, like just failed in, help me with the state, was it Kentucky? The, the river that flooded and the, and the camp that lost 20 counselors and children, was it Kentucky? Texas. I’m sorry, it was Texas, right, thank you, um. You know, emergency warning systems, let alone news and information. You know, we’ve, we’ve gutted, uh, corporate media long ago gutted local media, but just, so, news and information lost through the Corporation for Public Broadcasting funding. Corporation for Public Broadcasting, of course, winding down in, I think, October. September or October, uh, so their funding lost, and even just as basic as, like I’m saying, you know, emergency warning systems for rural communities, horns that blow, uh, messages that get sent at 3:30 in the morning, that, that overcome your do not disturb. Lost, you know, lost. Stupidly. Um, and a, a lot of this, you know, we’re just not, what, what aggravates me personally is we’re just not gonna see the impact of it, some of it, for decades, and we haven’t even gotten into healthcare. But we’re, we’re maybe not even decades, but just several years.
It’s gonna take several years of failed warnings about things that NOAA and the National Weather Service used to be able to warn us about, you know, 8 months ago, um, and health, health impacts in terms of loss of insurance, lost subsidies around Obamacare, uh, Medicaid cuts, and Medicare cuts likely coming, you know, we’re, we’re gonna see sicker people. We’re gonna see a sicker population, but it’s gonna take time. It’s not gonna happen in 6 weeks or even 6 months, but it will within 6 years. We’re gonna be, we’re gonna be worse off, and we’re not, and we’re gonna blame the, the current, then, administration, whatever form it’s in. Nobody’s gonna be wise enough to look back 6 years and say, 6 years ago we cut NOAA, and that’s why now, today, in 2031, you didn’t get the hurricane notice. And then of course healthcare too. How about in fundraising, Tony? I mean, what I’m, what I’m hearing is, don’t rely on the billionaire philanthropists anymore. Like, yeah, yeah, we’re over, thankfully, we’re over that. I, I, I never, I, I, you know, they’re, they’re so few and far between, and, and 10,000 people, 10,000 nonprofits want to be in, um, Jeff Bezos’ ex-wife’s, uh, pocket, I can’t remember her name, MacKenzie, MacKenzie Scott’s pocket. 10,000, 100,000 nonprofits are pursuing that, you know. The focus on your relationships, build, work on donor acquisition, but not at the billion dollar level. Work on your sustainer giving program. Work on, work on the grassroots. Can you, can you do more in personal relationship building, so that, so that people of modest means can give you $1000 or $5000, and, and people who are better off can maybe give you $50,000, but they’re not ultra high net worth. But if you’re building those relationships from the sustainer base up, working on your donor acquisition program, how are you doing?
Are you doing with the petitions, emails, and then a welcome journey, and you’re moving folks along, and then you’re bringing them in and then inviting them to things, you know. Work at, work at the grassroots level, among the, the, the 99.98% of us that aren’t ultra high net worth. The other 95%, for God’s sake, we’ve been doing this since 2010. 2010. Yeah, 2010, 15 years, right? Yeah, 15 years, yeah. The other 95%, you know, don’t focus on the wealthy that everybody wants to, you know, the celebrity. I got a client with big celebrity problems on their board. Names you would know, 3 names you would, everybody would know. Um, they’re a headache. They don’t, they don’t make board meetings. They cancel at the last minute. They, uh, last minute, like a couple of hours. After all the work has been done, all the board books have been sent, and a couple of hours’ notice, they can’t make it. And then the, and then another one drops out. Well, if she can’t, then, then I can’t also. Uh, as if that’s a reason, and then, and then the board meeting is scrubbed, and now, now we’re, you know, now they’re struggling to meet the requisite board meeting requirement in the bylaws, right? But so, you know, celebrities, you don’t need celebrities, you need dedicated folks on your board who recognize their fiduciary duties, as Gene talks about often, duties of loyalty, care. Is there a duty of obedience too? Is that one? Or is, that’s, no, that’s, that’s the clergy. That’s the duty of obedience. I know it’s not celibacy. I know that’s not, I know that’s not good. Amy, why did you mute your mic when you’re laughing? Come on, let us hear you laugh. Uh, now I know it’s not celibacy, but, uh, loyalty and obedience, loyalty and care, sorry, loyalty and care. And what’s the other? There are 3. What’s the other? Obedience to the laws and internal policies. Yeah, yeah, obedience to laws and internal policies, right. So, but, but care and loyalty.
That’s another one, another one of these celebrities: giving to, giving to a charity that’s identical to the, the one that I’m, that I’m working with, in the same community, does the exact same work, and major giving to that charity. So, yeah, you, you know, focus on the, on the 99.98% of us who aren’t ultra high net worth. The grassroots, work on your, work on your donor acquisition and sustainer giving, and move folks along from the $5 level to the $50 level. This is how it gets done. Things are hard, and there are things we can do. Yeah, thank you. There are, there always are. Yeah. If we’re, if we’re focused in the right place, and, and to bring it back to artificial intelligence, you don’t even need to use artificial intelligence if you don’t want to. Amy, you’ve said this to us. You don’t need to, and it, but, you know, but that’s, it’s, that is not all of technology, and that is not all of your focus in 2025 and beyond. Especially when using it is impacting care and loyalty and obedience and data protection and everything else, right? Thank you for putting a quarter in my slot. That really worked. There’s a lot going on and there are things we can do. How about we end with that? Because that’s up, that’s upbeat. There is a lot you can do. There’s a lot you know. Amy, you were saying, we have so much you can do. There’s so much you do already know, and that doesn’t change because it is so hard. It just reinforces how important it is that you do know all of that, that you do know what you are doing, that you can take some actions, even if they feel small. Making sure 2-factor is enabled everywhere could be the thing that saves your organization from being in the news, you know, like, that’s worth it. And it didn’t feel that big or overwhelming. And also, everything is still horrible, but you did that thing and it was important to do. Know what you know. You know, a lot of people, we don’t know what we don’t know, but you, you do know what you do know.
Know what you do know, and, and take action around what you do know, whether it’s two-factor authentication, or, or, uh, talking to your board about sound technology investment, or it’s focusing on your sustainer giving. And there’s a lot going on, there’s a lot you can do. Thank you. And pat yourself on the back whenever you take those small steps, because they’re probably bigger than you think. That was Gene Takagi. Leaving it right there. Our legal contributor, principal of NEO. With Gene, Amy Sample Ward, our technology contributor and CEO of NTEN. Thank you very much, Amy. Thank you very much, Gene. We’ll see you again soon. Thanks, Tony. Thank you, Tony. Next week, better governance and relational leadership. If you missed any part of this week’s show, I beseech you, find it at tonymartignetti.com. Our creative producer is Claire Meyerhoff. I’m your associate producer, Kate Martignetti. The show’s social media is by Susan Chavez. Mark Silverman is our web guide, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with us next week for Nonprofit Radio, big nonprofit ideas for the other 95%. Go out and be great.

Nonprofit Radio for March 18, 2024: Artificial Intelligence For Nonprofits, Redux

 

Justin Spelhaug, Amy Sample Ward, & Tristan Penn: Artificial Intelligence For Nonprofits, Redux

A second savvy panel takes on the impact, leadership demands, promises, responsibilities, and future of AI across the nonprofit community. We convened a panel in June last year. But this is an enormous shift in nonprofit workplaces that deserves another look. This panel is Justin Spelhaug, from Technology for Social Impact at Microsoft, and Amy Sample Ward and Tristan Penn from NTEN.

 

Listen to the podcast

Get Nonprofit Radio insider alerts!

I love our sponsors!

Donorbox: Powerful fundraising features made refreshingly easy.

Virtuous: Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving.


 

 

 

View Full Transcript

And welcome to Tony Martignetti Nonprofit Radio. Big nonprofit ideas for the other 95%. I’m your aptly named host and the podfather of your favorite hebdomadal podcast. Oh, I’m glad you’re with us. I’d be forced to endure the pain of chronic inflammatory demyelinating polyradiculoneuropathy if you attacked me with the idea that you missed this week’s show. That one is so good, it deserves two weeks, and plus I spent a week practicing it, so it lives on for one more week. Here’s our associate producer to introduce this week’s show. Hey, Tony, I’m on it. It’s Artificial Intelligence for Nonprofits, Redux. A second savvy panel takes on the impact, leadership demands, promises, responsibilities and future of AI across the nonprofit community. We convened a panel in June last year, but this is an enormous shift in nonprofit workplaces that deserves another look. This panel is Justin Spelhaug from Technology for Social Impact at Microsoft, and Amy Sample Ward and Tristan Penn from NTEN. On Tony’s take two: thank you. We’re sponsored by Donorbox. Outdated donation forms blocking your supporters’ generosity? Donorbox: fast, flexible and friendly fundraising forms for your nonprofit. Donorbox.org. And by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer and marketing tools you need to create more responsive donor experiences and grow giving. Virtuous.org. Here is Artificial Intelligence for Nonprofits, Redux. We’re talking this week about artificial intelligence again. It’s an important topic. Uh, we did this with a panel in June last year. Today, a different distinguished panel shares their thoughts on this transformative technology. It’s timely, it’s got a lot of promise and a lot of risks. It’s moving fast. Those are the reasons why Nonprofit Radio is devoting multiple episodes to it. What are the promises and the responsibilities? What’s the role of nonprofit leadership? How about government? What are the equity concerns? The biases?
What about access to this intelligence? What are the preconditions for successful integration at your nonprofit? What’s the future of artificial intelligence? Who to share their thinking? Our Justin Spelhaug, recently promoted, Justin Spelhaug. He is corporate vice president and global head of Technology for Social Impact at Microsoft. You’ll find Justin on LinkedIn. Justin, welcome to Nonprofit Radio. Congratulations on your promotion from vice president to corporate vice president at the, uh, enormous company, Microsoft. It’s great to be here with the podfather. It’s a new name, so I’m proud to, proud to be here and look forward to the conversation. All right. Well, I’m glad it’s the first time you’ve heard the podfather. There can be only one, really, there, there ought to be only one. So I’m glad it’s the first time. Um, and I see, you know, global head. I’m sorry, you’re a little bit limited. You’re not working in the stratosphere, the ionosphere, the troposphere, you’re strictly limited to the globe. I’m sorry, we all have our constraints. We are working on Mars and the moon, uh, soon, but we gotta get a broader population of nonprofits there. All right. So we, we’re limited to the globe. I’m sorry for you. Amy Sample Ward. We know them. They are Nonprofit Radio’s technology contributor and the CEO of NTEN. They’re at amysampleward.org and at @amyrsward. Amy, it’s great to see you. Welcome back. Of course. Thanks. I know there have been a number of different conversations about AI that you’ve had on Nonprofit Radio. Um, I’ve listened to them, I haven’t been in all of them. They’ve been great, and, you know, we talked a little bit about AI, Gene and I, you know, when we started off with some of what’s gonna be big topics in the sector for 2024. So excited to be in a conversation kind of dedicated to that. I’m glad you are. And Tristan Penn. Welcoming back Tristan. He is equity and accountability director at NTEN and a Black and Navajo professional.
he's served on previous organizations' equity teams and been a facilitator for DEI rooted in racial equity. Tristan is on LinkedIn. Tristan, welcome back. Awesome. So happy to be here. Thank you for having me. Excited to have this conversation with Amy, who I work very closely with, and it's really good to see you too, and also excited to have this conversation with Justin, to see what we can unearth. Yes, we're representing the big tech perspective. Amy, since you are our tech contributor, we're gonna start off, you know, just big picture. What is your thinking? What are your concerns? Big picture stuff. Yeah. Well, I'm glad that we've scheduled five hours for this interview. I will be taking the first four, thank you so much. I have many thoughts, many concerns. You know, there's just a lot to get into. I think some top-level, you know, bites to put at the beginning here are: there's a lot of hype, and as with anything that falls into the hype machine, I think nonprofits do not need to fall, you know, victim to, like, oh my gosh, I read this one article so I have to do the thing, right? There's time. AI is not done, the world is not already over and everything's predetermined, right? So, you've seen the article that was like, AI will end humanity? OK. OK. Here we are. Let's calm down and talk about things. So I know I've talked to nonprofits whose boards are, like, I read that article and AI is, you know, ending all of us. Like, we can take our time. That's one piece. I also think it's important for organizations to think about where they are already working, what communities they already work with, what data they already have. Like, this isn't "start a new project" when we're talking about AI. And so I think we'll get into that more in our conversations here. And of course, AI isn't new.
Well, I mean, artificial intelligence as a phrase is the broadest umbrella term we could use for these types of technologies. And so to have these sentences that say, like, AI is new and it's here and it's going so fast. Like, what is that? That's encompassing so many different components of technology. And so what do we really mean when we're talking about AI? Are you talking about a model that you set up inside of your organization, you know, to help identify program participants that need extra support? That can be AI. But that's very different than saying, oh yeah, we're just using ChatGPT to help, you know, start some of our drafts. OK. Those are wildly different things. And so to talk about them in the same breath as "it's all AI" sets folks up to already have kind of a disconnected conversation, even from the start. All right. Thank you. And hold our feet to the fire, especially me, because the three of you think about this all the time and I don't. So, you know, if I lose that context that you just shared with us, please call me out. All right, Justin, big picture, please. I'd agree with Amy on, you know, the hype cycle of it's gonna save us, it's gonna destroy us, and now just kind of, how do we make use of it? We've been going through this process as a community. I think one of the things, when I zoom out, I just see some tectonic shifts that are impacting the sector: some big demographic shifts in European countries and the United States, where the workforce is getting older. That's putting tons of pressure on aged care and frontline community workers. Some big shifts in continents like Africa, where education, skilling and jobs are all critical. And the nonprofits facing off on these issues aren't getting any additional funding. GDP has stabilized in many countries, but we've hit a new set point for inflation that's impacting pocketbooks.
It's impacting people's ability to raise money. And so really, you know, the question that we have to ask is: how do we use AI in missions to help organizations raise more money, help them deliver more effective programs, help them rise to these challenges that are continuing to create pressure in the sector? And how do we do all of that in a way that's responsible, in a way that's safe, in a way that's inclusive? And that's actually a pretty complex topic that I hope we spend some time on. Indeed. And thank you for the, uh, global perspective. Tristan, big picture thoughts, please. I have lots of thoughts, similar to Amy's. And I think where I start off is, I've worked for 20 years, and I still am working, in nonprofits, and I see how, over those years, nonprofits and, you know, small organizations have seen something that's bright and glittery and been so amazed by it, and been like, yes, we want it, we're going to take it in. And we have no process for building it into our operations. We have no forethought for it. We have no contingency to live by when we're folding this in. In this ideal state, we've already jumped multiple steps to us envisioning how we're going to operate with this bright, shiny tool that we have. And that's never been the case in my years that I've been in nonprofits. If anything, it's always been folded in, in a way that doesn't have a lot of forethought. So I think the things that come to mind for me, that make me curious and also a little bit reticent about just the blanket, the umbrella term AI, are: folding it in where it makes sense, and not where you want to add a little, you know, icing on your cake where it needs none. And so that's where I intersect with it. There's another piece of it where I am a little critical of it and concerned about it.
Because I think, to Amy's point, we think about AI and a lot of people go in different directions. I think the baseline for a lot of people is they go to, like, AI generated pictures or ChatGPT, and it's much more than that. But I do think about a time, anecdotally, when I was at a conference and I was passing by a booth, and there was a very lovely, you know, picture of an older couple, and I was like, oh, that reminds me of my grandparents. It was an older Black couple, and I was like, oh, that's so cute, it reminds me a lot of my grandparents. And then I went in closer, and this is a booth that's, you know, managed by a bunch of white folks. And then they were like, oh, did you know that this is an AI generated picture? And that didn't feel good to me as a Black person. That didn't feel good. It felt like I had been misled in a really scary way. I feel like I have a really good detector of what's real and what's not. My BS detector is always up and on, and that scared me, because I was duped hard. And that scares me in a way, less about nonprofits, but just the overall globalization and usage and implementation of it: that it could go into the hands of people and create false narratives about marginalized groups, just based on what product they wanna sell. And that is scary. And that's something that I think has just stuck with me for a while. Thank you for raising the risks and the potential, you know, misuse, abuse. We need to go to artificial intelligence to create a picture of an elderly Black couple. That was, it was necessary to do.
And also thank you for the valuable parallel. You know, you make me think of social media adoption, when Facebook was new. You know, we assigned it to an intern, and we put it like the cherry on top where we didn't need a cherry, but the intern had used it in college, so, you know, she may as well do it for us full time. A very valuable, interesting parallel. Amy, start us off with just a common AI definition. You know, artificial intelligence, generative, I mean, generative artificial intelligence, that's what we're largely going to be talking about, if not exclusively. I think so. There's a lot of that. I think we'll start with taking one at a time, right? Sure. No, I was just gonna say, I think we already are exposed, when we're thinking about technology in our nonprofit organizations, to lots of different terms, lots of different companies putting things out there with the, uh, not necessarily cloaked, you know, it's not a hidden desire to reinforce that they're specialists, they know what they're doing, and, like, us lowly nonprofits don't know, we couldn't understand those fancy terms, right? And so I always, I mean, I teach a course, and I always remind folks, like, you absolutely can know what these words mean, you know. And I appreciate that there are so many places, even, actually, like, I'm never somebody that promotes these things, so folks know this, but, like, Microsoft has actually offered, you know, community learning spaces to say, these are what these words mean. So artificial intelligence is, like I said, the biggest umbrella term for all different types. Generative AI, machine learning, all of these components that people might talk about as if they are one different thing, they're all within that same AI umbrella. And I just want to say two words, because they'll probably come up in our conversation. I know you want to go one word at a time.
But the words I hear from folks the most, where they feel like they should know what this word means and they don't, and they feel silly that they don't understand, are algorithm and model. Those words are used all the time in talking about generative AI, which means the tool is set up to generate something back for you. Tristan used an image, a visual image, example, but that could be text, that could be video, that could be audio. You know, it's asking the tool to generate something for you. But an algorithm, we've heard this word, like, you know, oh, Facebook's algorithm is choosing what I see, right? The algorithm means the set of rules. So in Facebook's newsfeed, that set of rules says, if something already has a bunch of likes, prioritize it, right? If it has, you know, two friends that you're connected to already commenting, prioritize it. So it's whatever that set of rules is that says, this is how to generate an older Black couple image. Whatever those rules were, that's what algorithm means. And model essentially means, you can think of the word being used in the same way as when you say model about cars. Like, it is the whole set put together, right? It's got the data, it has the algorithm, the rules that say how to do it, it has the input, whatever you're gonna ask it to do. So when people say, what's the model, they're really saying, OK, what's the package of how this tool is working?
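Amy's definition of an algorithm as "the set of rules" can be made concrete with a tiny sketch. To be clear, this is a toy example, not Facebook's actual ranking: the rules, thresholds and field names here are invented purely to illustrate the idea that a feed algorithm is just rules deciding what gets prioritized.

```python
# Toy illustration of "algorithm = set of rules" (NOT any real platform's
# ranking). Two made-up rules mirror Amy's examples: lots of likes, and
# friends already commenting, each boost a post's priority.

def score_post(post: dict) -> int:
    """Apply the rule set and return a priority score."""
    score = 0
    if post.get("likes", 0) >= 100:          # rule 1: popular -> boost
        score += 2
    if post.get("friend_comments", 0) >= 2:  # rule 2: friends engaged -> boost
        score += 3
    return score

def rank_feed(posts: list[dict]) -> list[dict]:
    """The 'algorithm': order the feed by the rule-based score."""
    return sorted(posts, key=score_post, reverse=True)

feed = [
    {"id": "a", "likes": 5,   "friend_comments": 0},
    {"id": "b", "likes": 250, "friend_comments": 3},
    {"id": "c", "likes": 120, "friend_comments": 0},
]
print([p["id"] for p in rank_feed(feed)])  # ['b', 'c', 'a'] — b triggers both rules
```

In Amy's terms, the "model" would then be this rule set plus the data it runs over plus whatever input you hand it, the whole package working together.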
No manual data entry or errors. Make giving a breeze and focus on what matters: your cause. Try Donorbox Live Kiosk and revolutionize the way you collect donations in 2024. Visit donorbox.org to learn more. Now, back to Artificial Intelligence for Nonprofits: Redux. Justin, I see you taking lots of notes. What's going on in your head? No, I think, as Amy highlighted, one of the things that's important to highlight is we've been using AI for a really, really long time, and there are really important use cases that have nothing to do with generative AI. Things like machine learning, right, that allows us to do things like predict donations. Things like machine language, that allows us to translate from one language to another. Things like machine vision, that allows us to identify and classify objects. All of those are important tools as we look to solve different problems in the sector. Generative AI, as Amy was highlighting, is a new class of artificial intelligence that's capable of creating effectively novel content, because it's reasoning across, you know, all of the information on the internet, and using, as Amy was highlighting, algorithms to identify patterns that allow it to produce answers in many times intelligent ways. However, as Tristan was highlighting, ensuring that these models are inclusive, are representative, are safe, are understood, are all things that we're continuing to work on, to put frameworks around and tools around, so that they produce positive impact, not negative impact. And Justin, how can we ensure that that actually happens? You know, there's a lot of talk about biases. You know, the large language models are trained on predominantly white language sources. So there's bias, so there are equity issues.
But what is big tech doing to actually keep equity centered, and keep bias out, as these models are adopted, using the algorithms that Amy just defined for us? Yeah, it's a really multifaceted answer. I'll only hit two points, and we can go much deeper if we want. We released, in fact just in the last week, the Microsoft AI Access Principles, trying to get at this very problem, which has 11 core components. I'll speak to two, to give you a flavor of the kinds of things that we need to do as we think about the AI economy globally, to ensure it's fair, representative and safe. One of the principles is making sure that AI models and development tools are broadly available to software developers everywhere in the world. Everywhere in the world, and every culture in the world: training on the language and on the history and on the societies all around the world, to create much, much more representation. As you probably know, many of the models have been developed in North America and therefore reflect some of those cultural biases. So federating these tools, that is critical in the AI economy. Secondly, you know, companies and organizations that produce AI need to have rules for how they check and balance the AI, to ensure that it's responsible, it's fair, it's safe, it respects privacy, it respects security, it's inclusive, it's transparent. We call those rules, at Microsoft, our responsible AI framework, and it's not just a set of principles, it's actually an engineering standard. And when applying that engineering standard, we were looking at fairness in speech to text, taking speech and transforming it into text, and we found, it was a couple of years ago, we produced this article, that our speech-to-text algorithms were not as accurate for Black and African American communities in the United States as they were for Caucasian communities.
And that was largely a function of the training data that was used. And so we had to take a step back, using our framework that caught this issue, to say: how do we work with the communities more effectively? How do we bring sociolinguists in to help us understand how to capture all of the rich diversity of language, to make sure that our speech-to-text capability is representative of every citizen that we're rolling this out to? And that's an example. We did that, and today it performs much better, and there's more work to do. But it's those kinds of frameworks and guardrails that are really important in helping people design this stuff in a way that benefits everyone. Tristan, what's your reaction? You're thinking about equity all the time. What's my reaction? What isn't my reaction? I would say, I love what Justin was saying about making it a federated model. As opposed to, I mean, I only say everything, but a good amount of things are being generated, created, curated in North America, and baked into those models and algorithms are biases that skew towards white men. And that's not OK. I think that excludes me in particular. But also, you know, I think having a plan for that, as opposed to being reactionary and being like, well, gosh, we didn't know what was going on. Being less reactionary and more forward thinking in that way. Yeah, proactive is always a good place to start.
I think a few other things come to mind, too, in terms of making sure that communities of color, marginalized communities, are not constantly shouldering, even outside of AI, but constantly shouldering, the mess-ups of the brand new tool that came out on the market. And that seems to always be the case. There's always a headline months later where it's like, so-and-so found out this tool, this facial recognition, wasn't geared towards, like, you know, Black folks, and it was historically wrong. And so I think about those things. But I also think about it through a nonprofit lens, because we're on a nonprofit call. And I bring up another anecdotal story, of being on a call and having an AI note-taker bot hop into the Zoom call, too. I think within, like, the last half year, we've all been on calls where it's like, oh, I don't know if that's an actual person or a thing. It's very ambiguously named sometimes, where it's, like, Otter. One of them is Otter, right? And this Otter is all of a sudden in our meeting. And I think, you know, there is a lot of benefit in having note-taking tools and also captioning tools, for folks, in terms of accessibility. There are folks that have completely different learning styles. There are folks that take in information at different levels and different wavelengths. And I say that all to say that, you know, what was scarier, to keep with the Otter, or not OtterBox, sorry, OtterBox is not a sponsor of this, but the Otter AI note-taking tool, was that afterwards I got an email, randomly, from the note taker, sent to all the people that were in that call, with a narrative recap of everything that we talked over.
It wasn't a transcript, it was a narrative recap, which is fine enough. OK. There was a screenshot of just a random person that was on the call. And also, what's most scary for me, or just very concerning, is at the bottom it was like, here's the productivity score of the call: 84%. Here's the engagement of the call: 72%. And it's like, where, at the very least, where's the asterisk at the bottom that says, this is how we calculated this? And I immediately go, I'm not a pessimist, but in that moment I was like, this is going to be used by people in higher positions, people in power, to wield over folks, middle management and direct service, to say, hey man, you didn't have an 84% or higher engagement score on our last Zoom call, you are now on a personal improvement plan. And that is a scary place to be. And so I think less about these tools, which are what they are, but I think about the people and the systems, the toxic systems at times, that sometimes wield these brand new shiny tools in a way that doesn't feel good, and also is working against their mission and against their employees. It's time for Tony's Take Two. Thank you, Kate. And thank you for supporting Nonprofit Radio. I like to say thanks every once in a while, because I don't want you to think that we're taking you for granted. I'm grateful, grateful for your listening. And if you get the insider alerts each week, I'm grateful that you let us into your inbox. This week I'm in Portland, Oregon, recording a whole bunch of good, savvy, smart interviews for upcoming episodes. Hopefully that helps show our gratitude, because we're out here collecting good interviews for you to listen to, if you can't make the Nonprofit Technology Conference yourself. So thank you. I'm grateful that you listen, grateful that you're with us week after week. That's Tony's Take Two. Kate?
Thank you guys so much for listening to us every week. We appreciate you. Well, we've got beaucoup but loads more time. Let's return to Artificial Intelligence for Nonprofits: Redux, with Justin Spelhaug, Amy Sample Ward and Tristan Penn. I love that you brought that up. Don't love that it happened, but love that you brought it up as an example here for folks, because I think it's an easy entryway into a conversation on one of the points Tony mentioned at the start of the call: what are some of these preconditions? And you were like, oh, people are like, oh, bright shiny, right? That's what we do. Oh, bright shiny, like, I'm going to use this tool that took the notes in here. And a place where we've seen, for many people, for many years, in NTEN's research, is that nonprofits struggle. This isn't to say that for-profit companies don't also struggle with this, but nonprofit organizations struggle with consent. They struggle with privacy and security. And so here's a well-meaning, well-intentioned, right, I'm going to use this tool, except it's emailing you. You didn't consent to that. It emailed all the participants in the call. There was no opt-in, right? Let alone a very clear opt-out. Like, why did I even get this? That's not even to mention opting into sentiment analysis of whatever a community Zoom call is, right? And so when we peel that back and say, OK, well, we just wouldn't use that note-taking tool, right? Sure. But when we're thinking about preconditions for this effective work as an organization: what are your data policies in general? The number of organizations that we work with that still don't have a data policy, because they think, well, isn't there, like, some law about data, so why would we have our own policy? OK, there are some laws related to data, right? Different types of data have different laws. But that's not the same as an organization saying: what data do we collect? Why do we collect it? How long do we retain it?
What if somebody wants us to remove it? How do we do that in our systems? Right. So this level of fidelity to your own data, to your own community members, to the policies that you've set up to manage those relationships and trust, for so many organizations, is already not in place. Or, like I said, there's just not a fidelity to them that makes them trusted. So then to say, oh yeah, we're ready to add this note-taking app to our community calls or our client calls? That's the place where I have the most fear. It's actually not the tools having bias. I know they have bias, and that is a place of concern, and a place we can address it. But the most fear I have is people still operating within that, without any of the structures or policies or training to deal with both maybe the bias in a tool they use, and their own bias or their own issues, right? And it accelerates the harm that can be created in that. I want to use some of that to go to Justin, and something very closely related: the nonprofit leadership role, the responsibility of nonprofit leaders. I think it gets to a lot of what Amy was just talking about. But what do you see as the responsibility of nonprofit leadership in formulating these policies, but also in just, you know, making sure that the preconditions are there, so that we can be successful in integrating artificial intelligence, whether we're bringing in an outside tool or building our own? Even that may be a big lift for a lot of listeners. But generally, nonprofit leadership's role.
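Amy's data-policy questions, what do we collect, why, how long do we retain it, and what happens when someone asks us to remove it, can be sketched in code. This is a hypothetical illustration only: the data categories, retention windows, and record fields below are invented for the example, not a recommendation for any real policy or legal requirement.

```python
# Minimal sketch of a data-retention policy check. All categories, windows
# and records are hypothetical; a real policy comes from your organization
# and applicable law, not from this example.
from datetime import date, timedelta

# "What data do we collect, and how long do we retain it?" (assumed windows)
RETENTION_DAYS = {
    "donation_record": 7 * 365,  # long-term, e.g. for audits (assumption)
    "event_signup": 365,         # one year (assumption)
    "newsletter_sub": None,      # keep until the person opts out
}

def is_expired(category: str, collected_on: date, today: date) -> bool:
    """Return True when a record has outlived its retention window."""
    days = RETENTION_DAYS[category]
    if days is None:
        return False
    return today - collected_on > timedelta(days=days)

def purge(records: list[dict], today: date) -> list[dict]:
    """'What if somebody wants us to remove it?' — drop expired records
    and anything flagged by a deletion request."""
    return [
        r for r in records
        if not r.get("deletion_requested")
        and not is_expired(r["category"], r["collected_on"], today)
    ]

records = [
    {"id": 1, "category": "event_signup", "collected_on": date(2022, 1, 1)},
    {"id": 2, "category": "newsletter_sub", "collected_on": date(2020, 1, 1)},
    {"id": 3, "category": "donation_record", "collected_on": date(2023, 5, 1),
     "deletion_requested": True},
]
kept = purge(records, today=date(2024, 3, 1))
print([r["id"] for r in kept])  # only the newsletter subscription survives
```

The point of the sketch is Amy's: these answers have to exist, be written down, and be followed, before layering an AI tool on top of the same data.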
Yeah, I mean, there's a lot on nonprofit leadership's plate today, and I think we have to meet leaders where they're at. And I think the very first step, and Amy mentioned this at the very beginning of the call, is raising the capacity of their knowledge, and of their staff's knowledge, of how these tools work: what are the edges of the tools, and how to apply them effectively in the flow of work. And there is training available. As an example, we have a four-hour course on LinkedIn. You don't need to do it all at once, but it's actually pretty good. It's not for developers, it's not for techies. It's for frontline program staff, fundraising staff, finance staff, the ED, to really learn about how to think about these tools. With that knowledge, then you can take the next step, which is starting to engage in, I think, simple ways to apply these tools, to get on-the-ground experience of what they're good at and what they're not good at. You know, using things, from Microsoft I'll mention Bing or Microsoft Copilot, to look at writing donor appeal letters, or whatever the process may be. They can just start learning about these fundamental language models and what they're good at. I think it's important, as an organization thinks about getting deeper into AI and really thinking about how they apply it to their processes, whether that be fundraising, whether that be engagement with beneficiaries, that they think really deeply about data and data classification. And that gets a little sophisticated, but just ensuring that we've got a strategy to use AI for the data that we want to use AI for, and that we segment away the data that we do not want AI to reason on. So start with getting the basic skills built out.
A lot of organizations I meet with are just at the very beginning stage of that. Use the simplest tools to accommodate the job, to get some experience, and then start to think longer range around data, data classification, and more advanced scenarios that can be applied. Can I just ask, what's that four-hour course? Uh, Tristan, before you answer, let me just drill down on a free resource. I love free resources for our listeners. It's a LinkedIn course, uh, nonprofit AI fundamentals. But let me get that for you here. OK, Tristan, go ahead. Yeah. I really like how Justin said, you know, there's a lot on nonprofit leaders' plates already, in terms of responsibility. And I want to gently push, and answer your invite to call you in, Tony, on the premise of the question, which was: what is the responsibility of nonprofit leaders? And I would say, yes, obviously there's a responsibility, as Justin has illustrated, that we need to be better in terms of strategy, in terms of tech, in terms of AI, in general, on how we fold these crucial tools in. But I would also say that there's an equal, and almost larger, responsibility on those who fund nonprofits. I think a lot of times, in the nonprofits that I've worked with, interacted with and worked within, their operational and financial model has been very ham-handedly built, in a very Dr. Seuss way, which doesn't really make sense at times. And it's because, year after year, there are different grants, different fundings, that require different things at different times, based on whatever the hot new term is. Ten, fifteen years ago it was mentoring, so a lot of times everything was geared towards mentors. And I say that because this implies that a lot of these nonprofits are already built on a structure that is very shaky. And so there's a lot of other things that need to be done.
But I do think a big responsibility sits with folks who fund nonprofits: foundations, and also local governments and the federal government, in making sure that when they are pushing a grant, or putting out an RFP for a grant, that says you need to fold in tech, and you need to fold in AI in this way, to get kids to learn or get kids in seats in the classroom, that you're doing so in a way that creates longevity and solid nonprofit operational work, and doesn't just, like, slap an iPad in front of a kid. I used to work with Boys and Girls Club, so that's where I always default. But I think that, based on my experience, it's always been a really weird way of going into a financial model of an organization year after year, because it's like, oh, well, we started doing that because last year's grant asked for it, and now we just do it into perpetuity. And so again, you have that little weird Dr. Seuss style way of thinking. And I think funders and grant folks can do a lot by being very clear and very forward thinking in how they are offering up these monies. It's time for a break. Virtuous is a software company committed to helping nonprofits grow generosity. Virtuous believes that generosity has the power to create profound change in the world, and in the heart of the giver. It's their mission to move the needle on global generosity by helping nonprofits better connect with and inspire their givers. Responsive fundraising puts the donor at the center of fundraising and grows giving through personalized donor journeys that respond to the needs of each individual. Virtuous is the only responsive nonprofit CRM, designed to help you build deeper relationships with every donor at scale. Virtuous gives you the nonprofit CRM, fundraising, volunteer, marketing and automation tools
you need to create responsive experiences that build trust and grow impact. virtuous.org. Now back to Artificial Intelligence for Nonprofits: Redux. You know, it's not only the mindset. It feels like they're being strategic by saying, oh yeah, well, we were able to pitch that in a way that we got the funding. But then that's changing their strategies all the time. It also, back to the point before, means the data you have to work with inside your organization is: OK, well, for two years we structured it this way, for two years we structured it that way. Do we even have, like, a unique ID to connect these people and say, oh, they were in both of those programs? Like, our own data sets are messy, and influenced by funders saying, oh, now we need you to collect these demographic markers, you know. And we as organizations are often pressured by those funders to do it the way they want, because it's easier for them and tells the story they want to tell. But that's really, really messing up the data sets and the program processes, or business processes, that we have in place. And I just wanted to connect that to broader things that NTEN has worked on and advocated for for many years, from the equity guide specific to funders. And that is that funding technology projects takes time, and it takes a lot more money than, like, $30,000 for whatever the licenses are for something, right? Like, for an organization building a model, an internal-use model, this isn't some big flashy commercial thing, this is just for them to, you know, like I said before, identify program participants that maybe could use intervention, it's not uncommon that it would take two dozen tries to get the right model in place, right? To really make sure the algorithm is fine-tuned, that the outputs are appropriate. Well, you can't go through two dozen models in three months, right?
And then have something there. A nonprofit would need a couple of years. And there are already plenty of funders saying, oh, now we have this AI grant opportunity. Is that grant going to be comprehensive of the work to get their data in a good place, to get their program staff ready and trained? To Justin's point, every staff person really trained adequately, not just on what these tools are, but on what's a good prompt, what's a good use case for this, right? All of those pieces, so that they can adequately and materially contribute to: then, what is this project we want to do? What is the best fit for us, and how do we build it? And just to add on, and we'll wrap up, to Amy's point and Tristan's point: AI hasn't changed the fundamental physics of what makes a good technology project. It's people, it's process, it's tools, it's capacity building, it's a long-term strategy. All of that is the same. And if your listeners are wondering, where do I even get started in understanding the language of this stuff? Because you asked the question. It's called Career Essentials in Generative AI. It's on LinkedIn, it's free, and I've taken it. It's pretty good, so I think it's worthwhile for your listeners. Thank you, Justin. How about NTEN, Amy? What resources for folks? I mean, hopefully they're already going to the Nonprofit Technology Conference, where there are going to be a lot of sessions on artificial intelligence. I know, because I'm going to be interviewing a bunch of those folks. So this is probably the second of, I don't know, six or seven AI episodes around different subjects. But NTEN as a resource for learning AI, we have lots of them. There's a guide, not a workbook but like a guide. There's of course the equity guide. There are some materials on the website.
We have an AI course and other courses that talk about AI. There are community groups where you can ask questions, and of course the conference. But thanks to Microsoft and Okta, who gave us some supporting funding, NTEN, along with Institute for the Future and Project Evident, is at the tail end of a community design process where we've worked with over 40 organizations to create an AI framework for organizations, whether you're a nonprofit or not, who are trying to make decisions around AI. And our framing for this is the framework for an equitable world. So it isn't just that you are a 501(c)(3) registered in the US, right, or that you're a grassroots organization in whatever country. If you want to live in that equitable world, then this is the framework that we can all share and work in together. We're going to do a little preview at the NTC, and whoever comes to the session is going to get to road-test it with us, and then we'll publish it publicly after the NTC. So lots more, and obviously I'll share that with you when it comes out. But what's really important from this, I think, is that it is a framework built on the idea that all of us are part of these decisions, that all of us have responsibility in these decisions, and that all of us are accountable to building, right? This isn't, you know, the quote-unquote responsible tech; this isn't just for those projects where you're going to do something good over here. This is: whatever we're doing, it's got to be good. It's got to be building us into an equitable world, because what else are we doing here, right, if it's not for that? And so I'm excited for folks to get to use it. It'll be published for free. Everybody, please use it. So lots more on that too. Amy, you are perfectly consistent with the framed quote that you have behind you: all of us are in this life together.
You're living your framed art. I admire it. Justin, we've got maybe 10 minutes left. What would you like to talk about that we haven't touched on yet, or go further on something we have? Well, maybe I'll just build a little bit on the question you asked Amy: what are the resources available? So I think that's pretty useful to organizations. One is the training that I mentioned. Two is, we just recently ran a Nonprofit Leaders Summit where we had 5,600 people together, about 4,500 online and about 1,000 in a room, talking about how we grapple with AI. What are the use cases that make this make sense? How do we think about data security and privacy? And we're going to continue to invest in that. We're going to be rolling that out more globally as well, with events in Australia and elsewhere. But that convening, and just getting the community in dialogue, I think is so important. I learned a ton from that. We're also going to continue to push on affordability and making sure that we've got affordable access to our technology, so that every organization can use things like Microsoft Copilot for free, providing they've got access to our nonprofit offers. And then finally, innovation. I'm interested in looking at scenarios that span the sector where, if we invest once, we can create a multiplier effect. One of the areas where we're partnering is with Save the Children, Oxfam and many other organizations on the Humanitarian Data Exchange, which is a large data set used to help organizations coordinate humanitarian and disaster relief, domestically and internationally, in a more effective manner.
So our missions don't overlap, but that data set hasn't been super useful to date. Applying things like language models, training on that, and creating a tool set that is cross-sector for many organizations: you'll see us continuing to invest in that way. And I look forward to ideas from our NTEN partners here on the phone, as well as the community at large, on where we can make bets that will really help the sector move forward together. Tristan, what would you like to touch on or go deeper on? We've got seven or eight minutes or so. You know, I think I just want to underscore what Amy was talking about, and what we've all been working on. I'm a little tired of you underscoring Amy. Amy and you reinforce each other. You know, I agree. You should have seen us when we worked together. It's getting a little dull. It's a little dull now. You should have seen us when we were in the office. Our desks were 20 feet away from each other, and there was a worn line in between our desks, and nobody wants to be in that 20-foot space. I will say, being a part of the community group, what Amy was saying about working with 40 other organizations to figure out healthy, robust and equitable processes for any organization to interact with and field AI is crucial, and I'm so glad that we are able to be a part of it. We're going to debut it at NTC. It's something that I've learned a lot from, as someone who, again, like I said before, came from youth development. My degree is in child psych. But I've learned a lot over the years working with NTEN, working at NTEN.
But I think one thing that's been really, really beneficial is learning from all those folks in the group. A couple of things did come up when we were creating that framework. One is that organizations are making all kinds of decisions every day. And I will say, it highlights that when we talk about AI, and how it will look, sound and feel, we're not meaning it to be, but it's all within a vacuum. And we can't think like that. We are four years out from 2020; our lives were forever changed, and every nonprofit will have their own sad story to tell about how the pandemic impacted them. I say that to say that no one was prepared for that. And so if we keep on talking about, or playing around with, this idea that AI is going to solve problems, or that it's going to sit in this world, in this vacuum, we're not doing ourselves justice, and we're being very forgetful about the past that we just went through. If we're able instead to consider how AI will interact with the dynamic world that we all live within, that's going to better behoove us, both individually and organizationally, when we're planning strategically, whether that's year after year for you, or every five years; I don't know what that is for you. So, having that strong tech baseline for folks. And then I think the other thing is that people in all roles are considering AI and aren't sure how it applies to them. Staff have read stories that AI will replace workers, but have no idea what to do with where that fear sits for them, too. It should just add to their work and not replace them.
And we're seeing a lot of this. You know, I am on TikTok, and that's a whole other bag of algorithms and things that we can dissect and pull apart. But there are a lot of stories of folks getting laid off left and right. And that begs the question why, generally, but also: what is the role of AI in all of this, too? I think it's really interesting when layoffs happen at a time when AI is accelerating in a lot of our worlds, whether it's in tech or in other sectors across the world. And I think that there is a lot to be done by organizations who don't fall prey to the siren song of AI, and who go in clear-minded, not saying, oh, well, we can cut out this department and put this learning module in. I think that's really where you're going to see a lot of organizations and companies thrive, as opposed to just laying folks off. A lot there. No, we're, yeah, we're taking it in. And the reality is, I admire the consistency between you and Amy. I made fun of you, but what it shows is you're all thinking the same way. You've all got the same concern for the nonprofit. Human first. You know, we're all humans and we're all prioritizing us as humans, and if we start prioritizing other things, it's not going to go well. But at NTEN, you're walking the talk, and consistently. Amy, do you want to close us out with, all of us are in this life together? Yeah. I mean, I think the biggest thing I want folks to leave with is that that future is not predetermined. We are not sitting down and saying, well, OK, I'll wait for my assigned robot to come tell me what to do, right?
It is still up for all of us to write that every day. And the people who most need to have their sentence at the start of the article, or at the start of the book, are the folks who are being told, in a lot of different systemic, media-type ways, that they do not get to have their sentence in the article. And so I hope that nonprofits know this is an opportunity to shape and influence, as AI tools are being developed, the tools that we build within our sector, for ourselves, with our communities. But it's also a responsibility for nonprofits, who are the ones often closest to and most trusted by those systemically marginalized communities who are experiencing the most real-time harm, to be the supporter that brings them into that work. They are not necessarily going to get tapped by a company to learn this, even though I hear Justin saying these opportunities are free and accessible. You as a nonprofit can say: we think we might build something. Can you be on our design committee? Can you work with us? We'll make sure that we all learn together, right? As an organization, they're already in relationship with you; they've maybe benefited from programs or services. You have the responsibility, and an incredible opportunity, to be the conduit for so many communities to enter this quote-unquote AI world. And that's a really important gift, I think, that we have as a sector: to be the ones helping make sure so much more of the world is part of developing these tools and designing them to be accountable to us as people. That's Amy Sample Ward, our technology contributor here at Nonprofit Radio and the CEO of NTEN. Also Tristan Penn, equity and accountability director at NTEN, and Justin Spelhaug, the new corporate vice president and global head of Technology for Social Impact at Microsoft. My thanks to each of you. Thank you very much.
Real pleasure. Thanks so much, Tony. Thanks, Justin. I'll see you in 20 feet. Thanks so much, Tony. Next week: the generational divide. Now, this is interesting, because we've been promising this for a couple of weeks now and it hasn't materialized. It's very relieving to have an associate producer who I can blame for this show, the generational divide, having been promised for weeks on end and not coming through. It doesn't matter that the associate producer, Kate, has nothing to do with booking the guests, that the host takes care of that himself; that's irrelevant. I blame the associate producer, and this show, the generational divide, had better come through next week, or there's going to be a shake-up. I'm the one who just reads the script, too. Oh, yeah. OK, your title is not script reader, it's associate producer. Well, if you missed any part of this week's show, I beseech you, look, I was slow on my cue there, I beseech you, find it at tonymartignetti.com. We're sponsored by DonorBox. Outdated donation forms blocking your supporters' generosity? DonorBox: fast, flexible and friendly fundraising forms for your nonprofit. Donorbox.org. And by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer and marketing tools you need to create more responsive donor experiences and grow giving. Virtuous.org. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. The show's social media is by Susan Chavez. Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that information, Scotty. Be with us next week for Nonprofit Radio, big nonprofit ideas for the other 95%. Go out and be great.