
Nonprofit Radio for April 6, 2026: Consider The Human Factors & A Conversation With The NTEN CEO

 

Rubin Singh: Consider The Human Factors

We launch our coverage of the 2026 Nonprofit Technology Conference with an NTC perennial: Rubin Singh. This year, he asks you to consider the human side of tech that impacts your CRM, and really all technology: governance, business processes, inclusive design, and change management. Rubin is CEO of OneTenth Consulting.

 

Amy Sample Ward: A Conversation With The NTEN CEO

NTEN hosts the Nonprofit Technology Conference. NTEN’s CEO, Amy Sample Ward, is also Nonprofit Radio’s technology contributor. They join us to share about the people and place of 26NTC. What does the host CEO do to prepare for a major annual conference?

 

Listen to the podcast

Get Nonprofit Radio insider alerts

 


 

 

We’re the #1 Podcast for Nonprofits, With 13,000+ Weekly Listeners

Board relations. Fundraising. Volunteer management. Prospect research. Legal compliance. Accounting. Finance. Investments. Donor relations. Public relations. Marketing. Technology. Social media.

Every nonprofit struggles with these issues. Big nonprofits hire experts. The other 95% listen to Tony Martignetti Nonprofit Radio. Trusted experts and leading thinkers join me each week to tackle the tough issues. If you have big dreams but a small budget, you have a home at Tony Martignetti Nonprofit Radio.
View Full Transcript

And welcome to Tony Martignetti Nonprofit Radio. Big nonprofit ideas for the other 95%. I’m your aptly named host and the pod father of your favorite hebdomadal podcast. Oh, I’m glad you’re with us. I’d be hit with delusional parasitosis if you infested me with the idea that you missed this week’s show. Here’s our associate producer, Kate, to give you the highlights. Hey Tony, we have Consider The Human Factors. We launch our coverage of the 2026 Nonprofit Technology Conference with an NTC perennial, Rubin Singh. This year, he asks you to consider the human side of tech that impacts your CRM, and really all technology: governance, business processes, inclusive design, and change management. Rubin is CEO of OneTenth Consulting. Then, A Conversation With The NTEN CEO. NTEN hosts the Nonprofit Technology Conference. NTEN’s CEO, Amy Sample Ward, is also Nonprofit Radio’s technology contributor. They join us to share about the people and place of 26NTC. What does the host CEO do to prepare for a major annual conference? On Tony’s Take 2: tales from the gym, sourdough from Kim. Here is Consider The Human Factors. Hello and welcome to Tony Martignetti Nonprofit Radio coverage of 26NTC. That’s the 2026 Nonprofit Technology Conference. We are kicking off our 26NTC coverage with this interview. Everybody’s gathered in Detroit, Michigan, this incredibly savvy, smart tech community. And we’re beginning with Rubin Singh. Rubin is CEO at OneTenth Consulting. Rubin, welcome to Nonprofit Radio coverage of 26NTC. Thanks for having me, Tony. It’s a real pleasure kicking off. You’re our first one, first one of the year. Excited to be here. It’s a real privilege. My pleasure too. Thank you. You’re a perennial. I don’t know, three years running, four years at least. At least, yeah. I always enjoy the conversations. Thank you. I do as well. Just give us an
Overview of the topic. Your session topic is Beyond The Technology: The Human Factors Driving Nonprofit CRM Success. Give us an overview. What are we confounding between tech and human factors? Sure, sure, thanks for asking. That’s why I really love being here at NTC, because I spend a lot of time throughout the year going to other conferences that focus so much on products, on the newest innovations and the newest technology, and this conference really gives us the opportunity to step back and say, you know, what makes the technology successful? And in my 20-plus years of consulting experience, it actually has very little to do with the technology itself. It has to do with all the things that are surrounding it, in my opinion. Yeah, I mean, the technology is out there. All vendors are really doing the best they can to make sure they’re aligned with the needs of the community, but that in itself doesn’t make success, so that’s kind of what I’m hoping to touch on this week. OK, sounds very good. Yeah, thank you. That’s a great overview. You can grab, yep, grab his mic. So what are some of these, uh, Amy, also we might wanna be sensitive, I’m not sure if the PA is a little too loud. We’ll take care of that. What are some of these human factors that you think are influencing CRM success? Sure, there are gonna be about five or so that I’m gonna cover in my talk, but I can surely chat about some of those here. The first and foremost is really clarity of purpose. What is it that we’re trying to achieve with the system?
Oftentimes organizations will reach out to us and say, hey, we need Salesforce, and then I might ask them, you know, what is it that drove you to that decision, or what business problem are you trying to solve, and there aren’t always answers for that, or the answers are very different depending on the different folks that you speak with. And it’s very easy, especially as a consultant, to wanna jump in and problem-solve: all right, let’s build. But as I’ve learned through my career, you really need to start with that strategy first. So that’s a key point of clarity of purpose: what is it that we’re trying to achieve? Are we all on the same page about what we’re trying to achieve? Are there potentially different goals? Is it that we want to increase fundraising? We wanna be able to measure the impact of our programs better? We want to be able to tell our story better? Is it all of the above? All of these things ultimately need to be defined so we know how to chart a path forward. I feel like some of this may be simplicity too. I think it’s simpler, although taking on a CRM change is not simple, but is it easier from an institutional perspective to say, you know, it’s tech, we need tech, we need Salesforce, versus being more introspective about our human influences over technology? Absolutely, Tony, I think you hit the nail on the head there. Oftentimes I’m brought in for what is said to be, hey, we need to solve this technology problem. We need to migrate to this new system or we need to upgrade here.
And as I start having conversations and peeling back layers of the onion, I realize, goodness, this is not a technology problem at all, or only a very small part of it is technology. And I would say that we often, as a consulting practice, get brought into organizations that have had a system for a very long time, or they may have had an implementation that didn’t go very well, and when we look at the system itself, it is a reflection of the organization as well. It’s like we could see what your operational model is just by looking at your system. If it’s very siloed and the data is very walled off, well, that’s probably how you operate as an organization: everybody kind of doing their own things without a whole lot of synergy or a whole lot of ways to work with each other towards a shared goal. If data is not maintained very well, there’s not a lot of good data integrity, well, that probably reflects how much or how little you value your data. There’s so much that we can learn by seeing how the systems are. And to your point, it takes that introspection, and it is much easier to just build, to be fair, because then we can jump into it, now we’ve got this 18-month project that takes on a life of its own, and we can ignore the human issues and the culture issues that we’re suffering through. Yes, and if I can, you know, if I’m being truly honest here. I will be honest, it’s a safe space here.
We, uh, vendors, product vendors and consultants like myself, we don’t necessarily make it easier. We’re also like, oh, if you’re not jumping on AI right now, you’re behind, or if you’re not taking advantage of the latest product, oh gosh, where are you? So I think we play a role in some of that as well. And so I really respect those organizations, and especially those leaders within those organizations, who have that introspection and humility to say, you know what, before we jump into this, we need to make sure we’re aligned, and our data is right, and we have similar ideas in terms of strategy, and we’ve cleaned up some of our processes. Otherwise what we’re doing is spending those 18 months building something that ultimately won’t be used, and that really hurts the bottom line of the organization and the impact they’re trying to deliver. How do you help clients make the pivot from we need the tech solution to we need to look introspectively at our human factors? Good question. And honestly, it doesn’t take much. One thing I try to insist on in our projects, and try to build into the budgets, is at bare minimum what I call a visioning session, where, if nothing else, we get leaders, managers, some folks who are using the system on the front line, get them together, and have a simple conversation about what exactly the goals are. Who are you as an organization? What are you trying to accomplish? What are those metrics that can tell you that you have or have not accomplished it? What’s your vision of how a CRM system or your technology is going to help you achieve those things? Really just these five or six questions.
I will get a good idea of how aligned they are. And, you know, when you ask those questions and everybody in the room has a different answer, that’s when they realize, oh, OK, we’re not ready to jump right in. There are cases where everybody is aligned, and OK, well, then we can move a little bit faster and start talking about requirements and design. But I would say the majority of the time with nonprofits, trying to juggle as much as they have to in very unprecedented times, it’s really that initial conversation that usually makes folks realize, oh, we really need to step back and do this right. I’m wondering, has this cost you clients, where they realize they’re not ready for the kinds of solutions and the kind of work that you bring? It did early on. Early on, I would say, they would come to us and say, hey, we were referred to you, or, you know, can you implement this, and I like to step back. OK, well, we’ll find somebody else then. Oftentimes I’ll actually refer them to somebody else. I’m not in the business of wasting nonprofits’ money, or setting them up for failure. But I would say, as I’ve gotten further in my career and have shared a little bit of the success stories that we’ve had with this approach, we’re actually starting to see some nonprofits reach out to us for this. They know that they need someone, and they also don’t necessarily wanna bring in a different strategy consultant, because they don’t want someone who is so disconnected from the technology. So they like to have someone who understands the technology, understands the technology roadmap, but doesn’t start there.
So I would say at the beginning it was tough, trying to convince folks that this is the right way, but now folks are looking for it. Yeah. You and I have talked in the past about inclusive design. Let’s talk some about that. Again, remind folks the value of it, and how it plays into the human factors that are going to influence our CRM success. Absolutely, thank you for asking. It’s not a question I get asked enough, so I appreciate that. We’ve talked at length about it. Yeah, well, I appreciate it. You always make sure that we include this as part of the conversation. I love that. It’s definitely one of the factors, one of the five factors that I’m gonna be speaking about later today. It’s really just asking ourselves, when we build systems, as I mentioned, the system is a reflection of the organization, it is a reflection of the organization’s values. So we have to ask ourselves who is at the table making these decisions about how the system is gonna be built, and who is not at the table. And for whoever is not at the table, what can we do to get some form of representation, to make sure that their voice is heard? Oftentimes I’ll go into an organization and IT will be sitting with me and say, OK, let’s get into requirements and design, and that’s why I always say, OK, well, we can talk about this, but ultimately we’re gonna need to talk with the folks who are gonna be using the system day in and day out. Or some organizations, even a step better, say, OK, well, here are our super users. And I’ll say, great, I’m happy to talk to the super users who love the technology, who love CRM.
I also wanna talk to people who hate it, who are uncomfortable with it, who are reluctant, who are resistant, even. We wanna talk to everybody. We wanna talk to people of different demographics, different lived experiences. The more perspectives we can get, the more it’s gonna help us build a system that’s really designed for everyone. And I think also we’re in a time now where we have to be mindful about the data that we capture. You know, only maybe 10 years ago everyone was talking about big data and collecting as much data as possible so we could do all kinds of analysis, and patterns and visualizations. Where I think the nonprofit community is being a little bit more intentional, needs to be more thoughtful, is about the data that we capture, and the data that we decide, hey, this is not good or not safe to keep in our system. These perspectives don’t really come up unless there are people in the room who are thinking about it. So that’s where the inclusive design part comes in: really making sure we have as many perspectives at the table as possible, so we have a system that’s really designed for everyone, and we’re being mindful and intentional, in the data that we capture and the processes that we build, that we’re really keeping the communities that we’re serving in mind, and not potentially creating harm. You alluded to the principles. I think you’ve got five principles for successful CRM implementation. Can you just tick off those five? Clarity of purpose, as I mentioned. Inclusive design is another. Governance is a third.
Leadership, and leadership growth in the process, is a fourth. And the fifth is the human factors, sorry, the emotional, the social-emotional aspects of moving to a new system. And that’s where I’ll speak a little bit towards really just understanding that for some folks, moving to a new system is more than just a technology change. It is really changing your way of life. I work with gift processors who’ve been doing this job, where they really ensure every gift is entered and processed and coded correctly, some of them for 10 or 15 years, and it can be a thankless job, and everybody wants them to move quicker, move faster. And so spending time with folks in those roles has just made me realize that this is more than just a software change. It’s really their livelihood, it’s the way they do their work, it’s what they feel comfortable with every day. So we have to be thoughtful and intentional in the way that we bring change in those situations. How about the governance role? What’s your thinking there? Yeah, on the governance side, it’s sometimes one of the hardest things to get folks on board for, but to me it’s one of the most critical things. When we roll out a new system, we try to make it as perfect as possible, but it’s only good for that moment in time. Organizations will evolve. They will change. Their needs will change. New programs come, new programs go. The social-political climate changes, and the system needs to evolve with it. And that’s one of the things that I’ve seen when we look at how, or why, CRM systems fail.
In my opinion, this is one of the biggest reasons: there’s no structure in place to ensure that the system is evolving along with the organization. So when I say governance, I’m talking specifically about a committee, a council, call it what you want, of a cross-functional group of folks across the organization: some leadership, some frontline staff, everything in between. Making sure every department or business unit is represented in this council, and they’re actually the ones who own the system. It’s not IT. It’s not consultants. It’s not any particular vendor. It is really this governance council that ultimately makes the decisions on what gets prioritized, how the system will evolve, what integrations are in place. They don’t have to be technical. That’s where consultants or IT come in, to help implement and execute, but the ultimate decisions are really made by this governance committee itself. And I could see how that would bleed into leadership, another one of your five principles. You need leadership to create this governance council, to make it real and not a facade, right? Right. And you’re saying, and the buy-in, that the governance council owns the system. IT is a support. Absolutely, IT is support. So, but leadership: I was encouraging you to say something about the leadership role. Yeah, absolutely. And it’s critical, because we work with some organizations right now who’ve taken our advice. They’ve created this data governance council, but it’s tough. Let’s take an example: how do we track certain demographic fields in the system? It might seem like a very simple thing, how we track race, ethnicity, gender.
With these types of factors, there’s actually a lot of discussion and debate. How do we wanna break things down? How do we wanna track it? Should we track it? And there can be lots of differing opinions here. But this is where leadership comes in and says, OK, we’ve listened to everything, and they can help facilitate and manage the discussion. Another topic that comes up in these governance councils is permissions and security. Especially for organizations that are a bit siloed, or are used to having their own shadow systems or their own way of doing things, you get everybody together, but now everyone wants everything walled off. I don’t want anybody to see my data. This is where leadership can come in and say, look, these are the advantages of transparency. These are the advantages of having some shared data. It can bring synergies: we can refer from program to program. So when we get these governance councils in place, there’s going to be debate, there’s gonna be discussion, there could be very heated discussions. But this is where leadership can really grow. They can either avoid it and say, hey, you all figure it out yourselves, or they can lean into it and help facilitate decisions that can help the organization long term. So leadership growth, or that leadership role, is critical to making the data governance work. Say a little more about the growth, and the evolution really, of the system. You’re saying as the institution changes, the system has to change with it. Is that just like a technology maintenance plan, or, I’ll tell you what, rather than me guessing what it might be, why don’t you just tell us what it actually is?
Yeah, sure, all good. So an example, specifically around data governance: I sometimes work with organizations, and we start looking at their data, of, OK, well, what do we need to move to a new system? They wanna migrate to a newer platform, and I find all these data points that have been there for maybe 10 or 20 years, and you start asking around and nobody can really explain what they’re for. Nobody can explain the root of it, the origin of it. It’s just things that have been added over time. In the most basic way, it clutters the system. It adds to the technical debt. It clutters the layouts, which makes the system hard to use. And oftentimes that stale data takes away from the confidence people have in the system. So they’re like, OK, yeah, I don’t know if this data is accurate. I’m not going to use it. In fact, I have this spreadsheet that I keep, and I know that’s accurate. A death knell to CRM success. Exactly, antithetical to CRM. And then if we tie it back into the inclusion conversation: sometimes I’m working with a nonprofit that works with housing, for example, and I start looking at all these questions that they ask in their questionnaires, and it has to do with criminal history, it has to do with substance abuse history, it has to do with all these things. And I have to ask these questions: why are you tracking all this? What is it? Is it required?
And then some of the answers I get back are, oh no, this was from a grant report that somebody asked for 10 years ago, or a leader was trying to do a report and asked us to capture this, but nobody’s actually using it right now. And now, if we think about big data, data analysis, we talk about AI, my fear is that this data gets in the hands of the wrong folks, and it can actually harm the communities that they’re trying to serve. So I think those are just a couple of examples of where systems need to evolve with the organization. And then of course there’s the more operational stuff: new programs, different data points that we wanna collect, different drop-downs, and just making sure that the system evolves. The moment the system doesn’t keep up and it becomes too hard to make changes, that’s where you’re gonna see the shadow systems and the spreadsheets coming back into play. Share something else that you’re gonna talk about in your session today that we haven’t talked about, or go into more detail on something you think we haven’t covered sufficiently. Yeah, the part that I’m most excited about is really just talking about how the CRM is really that operational model. I don’t think folks really see that enough. I am gonna ask folks, what are the key reasons that they see CRM systems going sideways, or implementations getting derailed? So I’m excited to hear what the feedback is, but I know when I’ve done these presentations before, it rarely has to do with the technology. And so I think that we haven’t talked about things like change management.
Of course that’s definitely a key component of it, but yeah, I’m really most excited about just kind of talking about the operational model. And when I did a similar presentation last year, I spoke a little bit about one of my five factors back then, which was: make sure you start every project with a project charter. And I don’t know if I feel as strongly about that this year. I’d say the project charter is important, but I would start with a theory of change. That, I think, is more important. The project charter is still a good thing to capture, on how you’re gonna run the project: what are we trying to accomplish in this project, who are the people, what are the roles, how do we make decisions. It’s important, but I think even more important, and what I’m gonna emphasize this time, is the strategy part of it. Who are you as an organization? What is your theory of change? How do you intend to achieve the change that you’re trying to create in the world? Let’s start there, and then we work backwards. Then we figure out the processes that will support that. We figure out the outputs that you wanna track to know that you’re successful. Now let’s start talking about the technology. The technology comes last. OK, yeah. So that’s the place to start, and we’re gonna end our conversation with that. Rubin, always a pleasure. Thank you. Thank you so much. My pleasure. Rubin Singh, CEO at OneTenth Consulting, where you know that you’re not gonna just be sold a system that will keep Rubin busy for 18 months but leave you ultimately unsatisfied with your CRM. That does not happen for OneTenth Consulting clients. So, a pleasure, Rubin. Thank you. Thank you so much. It’s time for Tony’s Take 2. Thank you, Kate.
We’ve got tales from the gym, sourdough from Kim. I learned recently, on one of my visits to the gym in the morning, that the sourdough bread is still flowing from Kim to Robert. I know you know Robert, Semper Fi, our PhD, our Harvard Kennedy School of Government PhD in global studies. Recently, a new PhD, and it was Kim who was giving him sourdough bread. We saw this happen once before, and they had made it sound like it was gonna be an ongoing thing, and it turns out that it is. I saw the exchange; she brings the loaf. Last time she went out to the car to get it. This time she was better prepared. Oh no, you know what? Last time it was impromptu. She was just talking about the bread and he got a loaf, but then it’s continued. And this week I witnessed one of the exchanges firsthand. It’s amazing to catch a glimpse of monumental milestones in history like this. The sourdough loaf exchange, the transfer, the gift of bread from Kim to our new PhD Robert. So, it was a startling thing to witness. I’m glad that I was there at the moment. You just never know when these things might happen again. You just have to get lucky in life this way. I also learned that Kim is the choir director here in town at the church. Now, my town has lots of churches, but I guess one is the church. If I had to guess, I would say it’s probably the biggest one, which is the Baptist church, Emerald Isle Baptist Church. But that’s a guess. We just know that she’s the choir director at the church. If I ever get confirmation on which church in Emerald Isle is the church, of course I’ll share it in another tales from the gym. And that is Tony’s Take 2. Kate. I think you need to get a loaf of this sourdough bread. Well, the first thing I would need to do is start talking to Kim.
I guess you can’t just walk up and ask for a loaf. Yeah, no, I’d have to start, yeah. You know, I wave, or I say an occasional good morning to Kim, but that’s it. That’s as far as it goes. So, the first thing I would have to do is introduce myself. That’s a big step. You know, I like to get my stuff done in the gym. I’m there to work, not chat up folks. Oh, I got another tales from the gym: another chatty guy. It’s coming. There’s too much chatting. Well, again, I got a place to go, back here in my home office. So I take care of my workouts in the morning. I gotta leave. I can’t be chatting between sets. There’s just no time. You know, these folks are all retired. They’ve got just about a buttload more time. Here is A Conversation With The NTEN CEO. Welcome back to Tony Martignetti Nonprofit Radio coverage of 26NTC. You know that that’s the 2026 Nonprofit Technology Conference. You know that all these smart folks, using technology, or consulting in technology, or thought leaders in technology, all for the social good, you know that we’re all gathered here in Detroit, Michigan. My guest now is Amy Sample Ward. They’re the CEO of NTEN. We are not at NTEN. We are at NTC, which is hosted by NTEN. It’s a little bugaboo that we share, a little joke that we share. So we are hosted by NTEN. Amy is the CEO. Of course, you also know that they’re the technology contributor to Nonprofit Radio. Amy Sample Ward, welcome back for, I don’t know, the 60th time, the 70th time, to Nonprofit Radio. Thank you for having me. Thank you for, I forget the number, you’ve already told me 12 times, of getting to do this at the NTC. It feels good. It’s like it’s a core part of it now, you know. Thank you. You actually planned for where the Nonprofit Radio studio would be. Oh, absolutely.
When we were on the site visit, we were like, we’re right here, this little beautiful view, sunny corner, everybody walking by, you get that perfect podcast ambient background noise of chatter and laughter. Yeah, we held this corner for you. Thank you very much. What a lovely intro. That’s fabulous. Thank you. You can doff your little mic. OK. So we’re gathered here in Huntington Place, yes, which is the convention center, but they don’t call it that. Exactly, thank you. I didn’t want to steal the headline, but yes, it is the Detroit convention center. Yes. I’m quite curious how it came to be named that, because, you know, like, what was the focus group room where they just tried names, and then, everyone loves Huntington? Great, OK, well, let’s just call it Huntington Place. I’m sure it’s maybe named after someone, a mayor, yes, the province leader of, yes, this is a trivia question. I can’t wait for a listener to email me, like, I can’t believe you didn’t know. Why is it Huntington Place? All right, but it is, and we’re gathered here. How many of us are gathered here in person? Well, because we’ve committed to continuing having both an in-person and a virtual conference, we don’t totally know how many people are in the building, because you could just join virtually and not tell us. You can swap at the last minute, totally, yep. So we’ve got just about 1,600 attendees in total. How many of them are in the room on any given day, I’m not sure, but, you know, I don’t imagine that it’s five. So it’s probably close to that number. OK, much higher than five, right, much closer to 1,600. OK, that’s right. Um, we’ve had a lot of chats, not surprisingly, here in the studio about artificial intelligence, of course. I imagine so. Let’s diverge from that. Oh, thank you for that gift. Share today’s keynote. I cannot
Uh, be in the, in the room for the keynotes, because we're setting up the studio, because, uh, last-minute stuff has to be done to be ready for the 9 o'clock interview. So what was today's, uh, keynote about? Sure. Today, the whole keynote was presented through our partnership with PIT UN, the Public Interest Technology University Network, um, which is part of CUNY? No, no. Oh. Pitt. OK, the Pit Lounge, we're, we're right here across from the Pit Lounge. Yep, we sure are. Um, CUNY, like many other universities, is part of that network, um, of the, of the university network. That's right. But so, just to answer, I will come back to your question, and I'm gonna take a meandering path, uh, to first say that, because everything is horrible, everything is hard, um, not everything, many things, that we really tried this year to embrace the things that aren't hard and horrible, which are partnerships and collaboration and community, and invited a number of other nonprofits who, not just like in a talking point, are mission aligned, but, like, practically are part of what it means to advance N10's mission of having equitable technology and an equitable world. PIT UN being one of them, spaces where they're trying to make sure tech is built in community, with community members. So we asked if they would come underneath the tent, and, and just have one big tent for this year, and they of course said yes, uh, and we're like, we'll do all the things, we'll sit right here, we'll bring all these people, you know. And one of the things they offered is, how do we elevate these conversations, should we partner on one of the main stage conversations, and we said yes, of course, and how we showcase what, what it means to actually build technology in community. I think obviously that's something I talk about, you've talked about, you know, like, that's an idea, but how can we prove to this whole audience that it's a real thing that people do every day?
So this morning's general session, Andreen Soley, the executive director of PIT UN, hosted a conversation with two practitioners from the PIT UN network, where they talked about, yeah, literally, this is a center at a university I run, and we get all these students and all these community members and also these local government folks and these businesses, and we built this whole advisory, and then they decided what tech they wanted, and then they built it together, and then they have rolled it out, and here's how it's sustained. And really talked about, like, yeah, it's not just possible, it's, it's happening, um, and, and kind of filled the morning with those possibilities. OK, cool, cool, cool. Um, so I, I'd like a little insider, insider scoop or something. Like, what is, as we sit in this hallway with all these people, let's get the secrets out. Yes, you'll, you'll temper my question. No, what does the CEO of the host organization do with a 1,600-person, 2.5-day conference? I see you, uh, you run back and forth. You, you have a radio in one hand. Lots of people have radios. All the team members have radios. But, uh, you know, I see you having conversations as you're going up the escalator. You're shouting down at the, you know, like, asking, yeah, but, but did the audience understand the nuances of the ethical considerations, as you're, as you're gliding up? This 10-minute answer is coming. Um, no, so what, what, like, you know, what, what's your, what are your days like for the, not the lead-in, but the live, the live 2.5 days? We started yesterday, today's Thursday, tomorrow we close, Friday, like, around 2 o'clock or so, roughly. What do you do for these 2.5 days, you, as the CEO of the host organization? It's a great question, um. I think that my job is making sure that everyone is confident doing their jobs, so making sure everybody on the team knows, if they're
signed up. We have an internal staffing calendar for all the different roles, to make sure people get breaks and get lunch and can walk away and not talk to anyone for a minute, you know. And so if they're signed up for a shift and they've never done it before, do they know how to do it, and do they feel good, and they don't have to admit that, like, I've already checked in to say, hey, this is what's gonna happen, do you feel good, you know you can ask me questions. Um, and then the same thing goes for all of the partners, anybody that's here, attendees too. Like, I want everyone, if you made it all the way here, even if you walked from a block away or you flew from 4,000 miles away, that, like, if you're spending all of your precious energy to be here, that you, you get out of it whatever you needed out of it, you know. And for some people that's sessions, OK, let's make sure the sessions are running, the AV team is there, that, you know, those are gonna come off, and you're your best self. But also that attendees or partners, everybody, felt like they're leaving, like, satisfied and satiated and full and ready for a whole other year of hard work. So a lot of my walking around or talking with folks is just, like, do you have what you need? Do you not know where to go? Do you not know that, oh yeah, there's already a room for that, like, here, let me show you on the map where that room is, um, and then solving when people say, no, I don't have what I need, actually, or, I don't know how to find that room, uh, and walking them there. So that's my job. OK, so not, not so much putting out fires. That's the, the, the team is empowered to deal with that. If someone has a, has a purple lanyard and a radio, that means they can. So we have a, a pre-conference meeting with, with a representative of every vendor associated with the building, and we say, you're escalating it to me because I, I, I guess you believe that has to happen.
Anyone on the team can say yes or no to a question and can authorize how we do this work. You, you pointed out last year that when you have that meeting with all the representatives of, I guess, housekeeping, uh, food, security, everything, AV, IT, they're accustomed to meeting one or two people from the host organization. You bring the whole team. All of N10 is in those meetings, and they're shocked. Like, all these people? We can, we can deal with all these people? That's right, and we ask, every single one of them does an intro, every single one of them shares what part of the conference they run. Yeah. All right, so you're doing it differently. The food is excellent this year, as good as usual. We do really care. We want people to have good food. I know, yeah, I know, I mean, uh, and you have, plus, all the options. I mean, uh, there's gluten-free, which I know is important for you, but it's important for hundreds of people here, and there's kosher, and there's vegan. The options have all been, it's all been arranged. It's not like we don't. Halal, sodium-free, everything. Thank you for calling out a couple more that I, that I didn't think of, right, um. What's the, what, what goes into the planning? Like, what's the, let folks in on the two weeks before the first day here. What's that, what's that like for N10? Two weeks before, I mean, by that point we're so down into the details of, OK, do we feel like, you know, do we want an easel sign or a meter board sign at the bottom of that escalator? Jargon jail. What's a meter board sign? Oh, a meter board is, like, is literally those tall signs that, that stand, like, 2 meters tall, um, versus an easel sign, which, which goes on an easel, um. But yeah, we're really down into the details within 2 weeks, you know.
We have looked through all the bins and storage, and we're deciding. That 2 weeks out is kind of when we do our final, you know, do we think that we need 3 more plastic sleeves that hold an 8.5 by 11 sign, you know, do we need more chocolate for the community assistance desk, um, I think. Our biggest work starts a year out, you know. 2 weeks in, it, you know, the train hasn't just left the station, the train is, like, arriving at the next one. And so our kind of, our, our shipping. What about shipping? No, shipping is a month out. Shipping is already done. Yeah. Oh, everything's already on site here. Yeah, OK, OK, OK, um, but. We take, you know this, we talked about, you know, we have N10 sleep day on Monday, and then the following week is when we have our kind of multi-hour debrief, and we start a debrief doc for the next conference, in 2 weeks, because we don't wait until it's over to start, you know, um. Because as soon as you've done something, well, now you know, oh, I shouldn't have done it that way, I'm gonna put it in the debrief, right? So we'll go through everything that we've put in there for the last year, make decisions. OK, do we just need to vent, or is that actually something that we need to change, right? Or is that an idea for something to do differently? And then we, in that same document, transition to, here's the build for next year. Speakers need a different form when they do their thing, or badges need a different field in the, you know, right then. And that feels in some ways like a really big mental lift, because, in order to make a lot of those decisions, we're, like, really playing in our mind, OK, what will that be like, to build the registration, you know, checkout process, or what will that be like, to run a session in that room if we change the AV set? And so it takes a lot out of everybody, and it's, like, hours of discussion. But then, once we have that, we're just referring to that document for a year, you know. It feels like there's a catharsis to that too.
I don't have to keep this in my head, you know, until we start the, the planning for the next one. That's the dump. It is a common refrain, because it's a year-round practice. Somebody will say, oh, I wish I had thought of, put it in the debrief. We just, you know, star that document, and everyone is just opening it all the time, all throughout the year. Yep, put it in the debrief. OK, this is valuable institutional knowledge. Yes, yes, cumulative. They are, they are coveted documents, you know. That's cool. That's cool. Um, are we, are we able to announce next year's location? Yes. I should have looked up the dates. I know where it's gonna be, but is it, it's in Portland? Portland. It's Portland, expected to be Portland, and it's in March. That's OK. That's OK. We'll, we have plenty of nonprofit radio episodes between now and then. 50, to be exact. Between now and then, so there'll be plenty of opportunities, but yes, next year's is in Portland in March. That's right. OK, because we, because you're so egalitarian. Last year was Baltimore, East Coast. We're central, Detroit. And West Coast is always Portland, because so many staff live there, and so, Portland. Do we know, two years out, the location? We don't. We don't know that. Oh, you're not bound. We're not bound. You're not bound. OK, well, that's good, right? Yes, flexibility, because in years past, there were, there were binding contracts. COVID threw off the, the cadence of how far out you could really book, too. So, I see. OK. OK, good. So Portland next year. Look forward to that. I already met someone from the AV team, Jen, who partners with us every year. Yeah, I didn't know that. I, you know, she's in all black and she's got the logo on her shirt, uh, you know, audio, tape, whatever it says. And so I just pigeonholed her into, oh, they do the graphics and the recordings of, of all the, of the sessions and the, and the main room and the comments, etc. But no, she said she does the early walkthrough with you. They're year-round.
She told me how you thought about this alcove for nonprofit radio, and she's, yeah, this is perfect. So yeah, so I'm gonna talk to her early, for next year, on how to get all my gear to Portland. This year, Amy, my wife and, uh, production assistant here, uh, lives in Indiana, uh, where you went to college, um, so she drove. So I shipped a bunch of stuff to her, because I fly. Last year, Baltimore, North Carolina to Baltimore, I drove, so I drove the gear. This year I shipped the gear to Amy. She drove it up to Detroit. Next year nobody's driving. No one's driving all the way, neither North Carolina nor Indiana to Portland. So I would like to be able to get my gear there. It's a real pain in the ass to bring it on a plane. I've done it. I've done it, but there are so many bags, gear bags. So Jen, Jen is gonna partner with me. Jen's gonna help you. She said, talk to Hailey, and Jen will have advice. Yeah, what were you gonna say? Well, you know that I do this to you regularly, but can I ask you a question, even though it's your radio show? I think I know what you're gonna ask, but yes. Well, OK, it might be the one you, well, you might surprise me. Go ahead, of course you can. I was just gonna say, it's been 12 years of you being at the NTC and hosting so many, I mean, you do a really great job kind of curating, if that doesn't feel like a weird word, but, you know, selecting a real cross section of content. Across, I mean, there's, you know, 180 sessions or whatever, and you're not doing 180 interviews, that would, you would be here for 2 weeks, you know. With Ash's help, I do, I do go through and select. But so I'm curious, you know, obviously 12 years ago people weren't talking about AI, and now a bunch of them are, but, you know, separate from that kind of change, I'm curious, any reflections that stick out to you of, like,
You know, I, I'm in the middle of the storm, and, and you're in the storm, but not in the same spot of it that I am, and I'm curious, like, if you can see shifts in those conversations or in those topics, or in those, you know, like, I'm curious what you see that's maybe changed over those 12 years as an indicator of, of the larger sector conversation. Well, I've, I've, uh, complimented you on this before, that you are very much a macro and also micro thinker, and I tend to be more tactical, and so less of a macro thinker, so it's a challenging question. But yeah, no, uh, so it's not the one I was expecting. That's OK, which is fine, which is absolutely fine. So I would say, you know, the way, the way technology now is mostly software, it's not exclusively, there are also human processes, but I would say the way technology has evolved has changed the conversations. I mean, you know, 12 years ago, I don't know, like, I feel like 12 years ago we might have still been reading software manuals or app manuals, you know. But, but now software is so intuitive, and the user experience, I mean, I don't, I don't, I don't know if 12 years ago we were talking about UX and UI. I don't know if we were; they were very technical conversations. And, and yeah, but, you know, design is not just intentional, but it's just more common. I feel they're part of that. Yes, you can feel more a part of it, because the technology is more, I guess, human, human focused, you know. It's, it's, it seems like it's, it's more common. There are more common threads binding the technology than there were 10, 12 years ago. Um, yeah, an app, across multiple apps, is a more common experience than it was 10, 12 years ago. The expectations are, are, are higher. I think software is certainly more sophisticated, um, but it's, it's, it's more than just user friendly. It's, there's a more common understanding of what technology is now than I think there was 12 years ago, I think.
Yeah, you would not have, you wouldn't have looked at, well, what do, what do our competitors do that we can improve upon, presumably, but, but, you know, where are they starting from, versus, you know, now, versus, you know, let's start out of the box. So I'd say that, I'd say that. I don't know how to put that in 6 words, but I like that, and, and I think that's all for the good, and that's all for the good of humans interacting with the technology. Um, it's for the good of the missions that we're all here to promote through savvier use of technology. That's all for the good, yeah, yeah. The question I thought you were gonna ask me. Tell me what that is. Uh, what's a common theme this year? Oh, OK. I mean, you're welcome to answer it if you had an answer. AI, of course, which was, is more common this year than last year, but you know what I'm seeing within that niche? Much more conversation about the environmental impact. We've only got one floating in the sky. People are more concerned about it this year than they were last year. Last year, uh, ethical considerations, and, and that's certainly not that we've abandoned the ethics and the biases of using artificial intelligence. So more of, like, the social, human impact when you say ethical, just clarifying, because people maybe categorize environmental as the ethical, you know, ethical use of our resources, versus ethical impact on humans. That's the bigger picture that you, that you see. Yes, I'm seeing more of the physical environmental concerns. Water, electricity, yes, with massive data centers. And I think that's because of the proliferation of them. It's like, even just in the last 12 months, that conversation is coming up. If it's not in your backyard, there's a good chance it's in your state. Right. We're now talking about billions of gallons of water. Well, and, and it's so complicated, right?
Like, there's already great examples of communities doing, like, the real work of democracy, you know, like, local communities working together, lifting up their voices and saying no, right, getting city council to not approve a, a data center contract, or, or whatever it might be. However, and I think that's great, you know me, I, we could go down that road deeply, but the thing I hear every time I, I want to celebrate one of those stories is, but then what about the community that didn't have enough people, like, I'm gonna cry just thinking about it, I know, you know, or didn't know how to organize, didn't know what to do, hadn't, hadn't built that social infrastructure as a rural or low-resourced community, to get, to make their city council do that same thing. These are the voiceless, right? Or, or the marginalized, right. And, and so I, multiple things are true at once. I'm so proud of communities saying, we get to have a choice, and our choice is no, and that needs to be respected. And at the same time, it is true that there are so many communities that don't know, or don't have access to the resources that would allow them also to get to do that, you know. So then it's like, well, then we just completely keep repeating the same patterns: the places where jails are built, right, where data centers are built, where giant warehouses are built, continue to be the places that are gonna keep mostly impacting those same groups, you know. Yeah, yeah. And the physical impacts of all those things, as well as the environmental, just environmental meaning, like, just what, do you even have a neighborhood, or do you just live someplace? You don't have a community, you don't have a neighborhood, you just have a home or an apartment somewhere. Yeah, I know. I don't know. They're Amy Sample Ward. They're a good friend. They're, they're our tech contributor. They're the CEO of N10, um. I don't know. They're in our hotel. You're, you're in our hotel.
Maybe we shouldn't, I don't know. Do you have a Friday night planned? Um, for the first time in my personal NTC history, so, of every NTC I've worked as a staff person, but even as an, as an attendee before I worked at N10, um, I am leaving Friday night. I have, like, the last flight out, because, um, Saturday, I was gonna say tomorrow, but it's not, it's not Friday. So the, the, the next day, um, at 10:00 a.m. is my daughter's ballet competition, and I wanna, I don't wanna miss it, and I couldn't, and I couldn't fly in time, you know, on a Saturday morning, so I had to take the last flight out late Friday night. Good parent. Good parent. Also, they're a parent. Also, they're a parent. Yes, and she's very upset she didn't get to come to the NTC this year, and it's like, well, next year for sure. Yes, of course, of course. Hugging her. Yes, yes, she will expect an interview. I'll put her on mic. Why don't you bring her with you? It's, uh, it's Amy and Oren next year. That's on mic. Yeah. OK. All right, all right. They're Amy Sample Ward, the CEO of N10. N10 is the host organization of the nonprofit technology conference. We're at NTC. We're not at N10. N10 is the host. Amy is the CEO. Thank you very much, my friend. Thank you for having me, my friend. Pleasure. Thank you for having me. You're, you always, you always plan on nonprofit radio. Thank you. Thirteen. We'll be there in Portland, whatever the dates are. We'll be there in March, in Portland, next year. Yes. Next week, our 26 NTC coverage continues with responsible AI adoption and ethically using AI. If you missed any part of this week's show, I beseech you, find it at tonymartignetti.com. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. The show's social media is by Susan Chavez. Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with us next week for nonprofit radio. Big nonprofit ideas for the other 95%. Go out and be great.

Nonprofit Radio for May 26, 2025: Healthier Productivity From AI

 

Jason Shim & Meico Marquette Whitlock: Healthier Productivity From AI

Our annual duo returns with tips and resources to make your use of artificial intelligence better for you. They also go beyond AI with many smartphone strategies, inbox management, and Meico shares his shutdown ritual for bedtime. They’re Jason Shim, from Canadian Centre for Nonprofit Digital Resilience, and Meico Marquette Whitlock, The Mindful Techie. This is part of our coverage of the 2025 Nonprofit Technology Conference (#25NTC).


Nonprofit Radio for May 5, 2025: PII In The Age Of AI & Balance AI Ethics And Innovation

Kim Snyder & Shauna Dillavou: PII In The Age Of AI

Artificial Intelligence and big data have transformed privacy risks by enabling malicious, targeted communications to your team that seem authentic because they contain highly accurate information. Kim Snyder and Shauna Dillavou explain the risks your nonprofit faces and what you can do to protect your mission. Kim is from RoundTable Technology and Shauna is CEO of Brightlines. This continues our coverage of the 2025 Nonprofit Technology Conference (#25NTC).

 

Gozi Egbuonu: Balance AI Ethics And Innovation

Gozi Egbuonu encourages you to adopt Artificial Intelligence responsibly, in a human-centered approach. First, be thoughtful with the threshold question, “Should we use AI?” If you go ahead: Create a thorough use policy; overcome common challenges like staff training and identifying champions; manage change intentionally; and more. Gozi is with Technology Association of Grantmakers. This is also part of our #25NTC coverage.

 

View Full Transcript

Welcome to Tony Martignetti Nonprofit Radio, big nonprofit ideas for the other 95%. I'm your aptly named host and the podfather of your favorite hebdomadal podcast. Oh, I'm glad you're with us. I'd turn dromotropic if you unnerved me with the idea that you missed this week's show. Here's our associate producer Kate to introduce it. Hey, Tony. Our 25 NTC coverage continues with PII In The Age Of AI. Artificial intelligence and big data have transformed privacy risks by enabling malicious, targeted communications to your team that seem authentic because they contain highly accurate information. Kim Snyder and Shauna Dillavou explain the risks your nonprofit faces and what you can do to protect your mission. Kim is from RoundTable Technology, and Shauna is CEO of Brightlines. Then, Balance AI Ethics And Innovation. Gozi Egbuonu encourages you to adopt artificial intelligence responsibly, in a human-centered approach. First, be thoughtful with the threshold question: should we use AI? If you go ahead, create a thorough use policy, overcome common challenges like staff training and identifying champions, manage change intentionally, and more. Gozi is with Technology Association of Grantmakers. On Tony's take 2, tales from the gym, in addition to my gratitudes. Here is PII In The Age Of AI. Hello and welcome to Tony Martignetti Nonprofit Radio coverage of 25 NTC, the nonprofit Technology Conference. We're all together at the Baltimore Convention Center, where our coverage of 25 NTC is sponsored by Heller Consulting, technology services for nonprofits. Our subject right now is PII in the age of AI, personally identifiable information in the age of artificial intelligence, safeguarding privacy in a data-powered world. Plus, we're adding in a topic. All right, already, the show's over. I wanna thank you all for coming. Uh, we're, we're here all week. Uh, be sure to tip your servers. Um, and we're adding in the topic A Little More Privacy, Please, colon, Diving Into Data Privacy.
All right, because, uh, our guests, um, asked to combine topics, which made a lot of sense. Um, but, uh, before I introduce the guests, well, now, let's do it this way. So we have, uh, stand by there, we have, uh, first is, uh, Kim Snyder. Kim Snyder, um, I gotta take a deep breath. I do, uh, Kim's title. I'm gonna hyperventilate trying to get enough air, enough oxygen in. I'm only 140 pounds. I don't carry enough in my lungs to carry this, to carry this title of Virtual Digital Privacy Project and Program Officer. You know, Joshua Peskay is to be thanked for that word salad. It's all nouns. It's all, it's all one adjective, 12 nouns. Joshua, you're, you're out. Anyway, and then CEO doesn't get any easier. OK. Also with us, uh, we have a special guest who's gonna give a couple of syllables. Uh, let me introduce Miles. Miles, say hello. Hi everyone, it's Miles with Fundraise Up. Thanks, Tony. My pleasure. Miles is sponsoring the hub next door at Fundraise Up, so I, I thought I'd give him a little. He asked to give a shout-out, so I said sure. And, uh, they're giving away free socks. That's what Fundraise Up is all about, socks. And what else do you do at Fundraise Up? Right, so we help nonprofits raise more money with AI, and we do that by not using any identifiable information, and we are completely compliant across the globe. All right. What a segue, and not even rehearsed. Incredible. All right, you've overstayed your welcome. That's enough. OK. OK. OK, thank you, Miles. No, thank you. I, he was, I, I did invite him after he pleaded. OK. So we are talking about PII. So Miles, a perfect segue, beautiful segue into personally identifiable information. Uh, Amy, we're gonna do the overview, so I'm gonna ask Kim. Virtual digital data, Virtual Digital Privacy Project and Program Officer. I'm gonna ask Kim Snyder. No, I'm gonna, no, I'm hitting it hard. Uh, so for an overview, why, why do we, why do we combine these two topics?
What are our issues around personally identifiable information and, uh, and artificial intelligence? Kim Snyder. So they both center on the issue of personally identifiable information. So on the one hand, we're talking about what kinds of regulations exist, how do you manage your data. I'm too far away. Don't whisper, Kim. Everybody hears you. Oh, go ahead. I'm waiting. Um, now you, you edit this. Don't count on too many edits. Oh dear, OK. All right, so, um, we're talking about personally identifiable information, which, for quite a while, for the last couple of NTCs, we have been talking about here. And for quite a while it's been more about regulation. This year, I have to say, it's about having our data out there, and vulnerability, and so looking at data management, and how do you start to take stock of your data so that it is less vulnerable, and the people whose data it belongs to are also less vulnerable. And the other topic, which I'm here with my co-facilitator, um, uh, Shauna, is with all the amens, and I'm here, I'm just like, amen, yeah. In the, yeah, so, so talking about what constitutes personally identifiable information, how much that's expanded in recent years. And Shauna, what's, what's your Brightlines? How are you related to it? Yeah, yeah, so Brightlines, I founded it 4 years ago. We are a doxing prevention company. For folks who maybe don't know what doxing means. Yeah, define it, please. When folks will use your personal information or sensitive information, they'll post it publicly, essentially posting your documents, that's where doxing comes from, with the intent to incite others to do you harm. So there's, like, a malevolence there, right? I don't usually consider it doxing if someone posts, like, a relatively available email address from, like, a professional setting. I do consider it doxing when it's your personal email address and the intent is to incite others.
It could be your birthday, it could be, could be your wife's. Or my man right here, yeah, the PII. PII is expanded. No, I never, no, no, actually I came out of the US intelligence community. I was there as a much younger person, and in a different age in the United States, and in terms of our national security. I was a really progressive national security person, um. The whole community, yeah, the, I, I'll just say, I mean, the intelligence community, yeah, yeah. I don't usually get too granular with that, um, but the. Was it in the session description? It would have said, OK, yeah, we can talk about that. OK, well, I, I'm not sure. I'm, I'm pretty sure. But there again, it's one thing when it's, like, out on the airwaves versus when it's in, like, a session thing. Yeah. And at, at the time when I was there, I was detailed out to the DEA, this might have been what you read, to train them on finding their targets, on the US side of the border, of drug trafficking organizations. So we were using these same techniques. I was training them in these, like, techniques to find people. We reverse engineered that. Now, four years ago, after the 2020 election, when folks were going after Ruby Freeman and Shaye Moss for just passing a piece of gum while tallying ballots in Georgia. They have a penthouse in Manhattan now, have the keys to that penthouse. Um, OK, interesting. So, reverse engineered, I see, reverse engineered your, uh, your prior, prior work. All right. Um, so referring to your session description, uh, how AI and big data are transforming privacy risks by enabling aggregation. So your concern is that the, the attempts at, uh, spamming people, not spamming, but spoofing, phishing, they can, it can be so granular and so accurate that they, they look more and more real. This is a part of our problem, right? OK, and people and agencies, people are using artificial intelligence to gather this information, and then, and then put it together and collate, and then threaten.
So I think we could probably tag team on this. Do you wanna do the protection part? So what we see is them gathering data. There's a lot of data that's out there about all of us. If there's one point folks take away from me talking today, in addition to my hype madness, it's that this is not your fault. Our clients come to us and they say, oh, if I just hadn't shared so much publicly on social media when I was younger. And it's like, no, no, this had nothing to do with you. Your public records are being scraped by data brokers every day. If you own a property, if you've ever registered to vote someplace, if you have a driver's license, which you have to have if you wanna get on an airplane, that data is being sold or scraped. That's the source data for data brokers. Sometimes for free, yep. OK, but publicly available. You don't need to be an agency, there's no kind of legal process to gather it. Exactly. This is why certain law enforcement agencies now go around legal process and will just buy data from data brokers. Oh, so much easier than defending a subpoena. You have to prove it to a judge, and then if they move to quash the subpoena, you have to defend it. Exactly. So AI can now gather data from various sources, so it could be used to scrape these sites. It can then be used to connect data. Let me share a story. We got a phone call from a very concerned client. They had just received a phone call themselves from someone who claimed to have compromising photos of theirs from an old Snapchat account, and on the call they described a photo that our client knew they'd taken, right? It was a photo of a room. They were describing a room, and the client's like, I remember that room. I remember that poster they're describing. I think I might have posted it on Instagram at one point when it was public. But how did they get my number?
How do they know where I work? And my response was, this is a scam. Someone bought a scrape of LinkedIn. Maybe they connected that to your phone number. Maybe you have your phone number connected to LinkedIn because you use it for MFA, multi-factor authentication. They connected that to a handle on Instagram, probably using your face, facial recognition. And then they just made this phone call and talked to you about your employer finding out about these photos, which was a bluff, because your employer's name is listed on your LinkedIn profile. It's terrifying for her. And Kim has taken it a step further. So you can stitch all this together, right? And you can process all this data at speeds that never were possible before, but you can also use generative tools to create things. You can easily mimic the style of someone. Part of that data that you grab off of LinkedIn or social is somebody's writing style, and generative AI is really great at tone and style, and also events. So if you're posting about events and things happening, you could get an email purportedly from your executive director or a colleague, referencing that event, things that happened, and people who were at that meeting, depending on how public the data is. And then that can be used as the basis for a phishing email that is a lot more convincing. Or a phone call. Yeah, or a phone call. This person that called our client was a human, but they don't have to be. We've seen cases where EDs are being impersonated, and it's video and it's audio of them that is so convincing to the people they're reaching out to. And it's trivially easy to do. In our session, in fact, we had Which One Is the Real Kim, and there were two videos of me, and one of them was not me. It was AI me. But that cost me $29 to make. So it's not inaccessible.
These tools, it used to be really hard to do this. Now it's, like, 25 cents, a photo, and 3 seconds of audio, and they can make those videos. Yeah, and you can have me say anything. You don't even need me saying the alphabet. Or Kim's title, for Christ's sake, or half of Kim's title. I did say you could swear. I didn't say you could take the name of the Lord. There's a difference. There are boundaries, even on nonprofit radio, there are boundaries. I've, uh, I've gotten these. Dear Tony, I know I could have called you at my number, or written to you at my address, accurately, uh, but I chose this method instead. So now I know they've got my email and my phone and my address. It included a picture of my home, which they probably got from Google Maps. And, uh, it was some kind of Bitcoin scam. But how did that make you feel? Uh, the first one, I was a little nervous. We've all gotten Bitcoin scams in the past, but this one had, like, you know, that concerning amount of information. Yeah, it had the right details, and, uh, I ignored it with some trepidation. And then, like a day or two later, I got another one, and they just kept coming. I knew it was bullshit. Yeah, I saw one of those from one of our threat intelligence partners, someone who swims in this every day, and it terrified him and his wife. Yeah, because it's so close to you. It's why receiving one of those phone calls, or, I would say back in the day, I got really energized around Gamergate, started to try to support the folks who were being targeted by Gamergate. This is back in 2015, and they would describe what it was like to have, like, you know, I sleep with my phone next to my bed.
And, or under my pillow, and to have that be the stream of all of this directed hate, messages like, you should kill yourself, or I'm gonna do this to you, or I'm going to do this to your parents, or whatever the case might be. It's so proximate. Technology removes what feels like barriers between you and everyone else, and what makes doxing so terrifying is that you don't know who it is. It could be anybody. How do you walk down the street? How do you sleep in your home, not terrified? You don't know who's coming after you. I never thought about that. Thank you, new nightmare unlocked. Yeah, but Tony, so you get these things. You're killing me. You're supposed to be reassuring us here on nonprofit radio. Well, you're terrifying. We'll get to that part eventually. We're great at parties. But, OK, so you're a more public person, you know, nonprofit radio, so you get these things. It's a little unsettling and unnerving for you, right? Yeah. So imagine a nonprofit staff person who happens to be working in an organization that may be more targeted by malicious actors. So your staff member starts to experience this, and this could freak people out, right? So that's who we're thinking about, um, and kind of raising the awareness. OK, yeah. I mean, these are folks already dealing with some level of cortisol on a regular basis because of their work, because of their mission. I think we've spent enough time on motivation. Let's, uh, let's transition, not subtly, very abruptly, to: what the hell do we do? What do we do? Is it already too late? It's never too late. I'm sure you're not gonna say it's too late. No, I wouldn't be here. Yeah, well, I also believe it, and I've had those moments.
Listen, I live in DC, and DC Health Link had their data leaked and taken a number of years ago, and my child, who had not even turned a year old, had her Social Security number lost in that breach. And I was like, oh man, she's not even a year old, you know, how is this? This is the world we live in, right? And I turned to my partner and I was like, this is just, I don't even know why we bother. And she's like, you, of all people, can't have that feeling. It's OK that you do right now, but you have to keep going. No, there are plenty of ways to ameliorate it. Yes, let's get into them. So we're with you. Why don't you start, and then we'll go to Kim. Yeah, I think you can think about the individual as the threat vector to the organization. That can mean reputational or financial threats to the organization. It could make it hard to fundraise if you don't support that person very well. Um, you could harm your reputation, say, or it could make you look illegitimate to your funders, right? So if you can think about where the risks are to the organization, that's one set of what to do, right, action items, and I might leave that with you and speak more to the personal. So when it comes to protecting yourself as an individual, there are plenty of ways that you can work to remove your data online. That was referring to Kim, not me. Oh yeah, Tony's not gonna take that part. No, Kim's got that. Um, Kim. I won't try your title. Um, when it comes to the individual, listen, all of us have data out there. Again, it's not our fault. We have lived a life, right? We've done things. I think it's a betrayal of trust in our own local governments that they sell this data and no one's ever asked us for consent, they've never informed us, etc., etc. OK, so what do you do? You can sign up for one of those services that removes your data from data brokers. We consider that like taking Advil, right?
It takes care of some of the pain and some of the symptoms. What we also recommend is looking back to the source data itself. So if you own a property that you live in, we always recommend that people consider moving it into a revocable trust that they don't name for themselves. You've seen too many estate attorneys call it the Tony Martignetti Revocable Trust. Exactly, exactly. A different name for the revocable trust. That's it. So now the ownership is obscured. But what about the data that's already out there? This is the argument that estate attorneys always give us, and we have to educate them on this. They'll say, oh, but your name's gonna be on the document granting it to the trust. But your name was there before, on tax documents. The way data brokers work is that they're constantly pulling this data down and renewing their data set. So when the new data comes down at this address, they want the most accurate, the most recent, and they'll overwrite it. It may be that you lived at that address at one time, but you don't any longer, and if someone's looking for that address, it's not your name on it. So it will get overwritten, especially over time. What we've seen, wildly enough, is that it's like a house of cards. When you pull that property record out, the rest of it tends to fall apart. We see our clients' data less and less. Ownership is kind of a core, or a hub, to other data. Yeah, absolutely. I think there are some connections happening there, with, like, app user data that's also on an ISP that's connected to the house, etc. Are there other pieces about that location that create profiles? Anything else we can do on an individual level besides the, uh, property ownership?
Another big vector is voter data, and I know that's probably not popular in this audience, because a lot of folks believe a lot in the voter file and voter data and using it. And we often see voter data getting used, getting bought, and getting scraped. So we will recommend that folks apply for programs in their states called address confidentiality programs, or safe at home programs. They're always set up with survivors of intimate partner violence in mind, but a lot of the programs are pretty expansive, so if folks are concerned about stalking or harassment, they can also apply. That then gives them a proxy address, in some states, like New York, across all agencies. So the DMV is now not going to sell your home address and your name. They're going to sell your name and your proxy address together. And shout out the names of those programs that you would look for in your state. Address confidentiality program, or safe at home. If you're interested, the National Network to End Domestic Violence, NNEDV.org, has a comprehensive, up-to-date list of those programs. OK, awesome. Uh, before we turn to Kim, I think you're the perfect person to answer this question. You're neither a question nor an answer. You're just a person with a lot of answers. Um, I read once, and it's so hard to unlearn things, that the value of stolen data is really in the future, more financial, so that the bad actor can act without you tying it to a specific event. So let's say a credit card number is compromised. It's of more value if it's 3 years old than if it was just stolen a couple of weeks ago. Is that true, or is that incorrect? I can see that being true. Maybe we've gotten a little bit better. Banks and credit cards have gotten better about just reissuing new cards.
Websites tend to push you to change your password when they've alerted you that there's a breach. So I think the private companies, more so than government agencies, have caught on to that a little bit. And I think there is some truth to it if it's not for financial means but really someone trying to go after you. We call that an ideologically motivated attacker. You used the word vector before I did, yeah, this is my background. What we found with a client that's a university: their students were being targeted. Some of these outside groups showed up to student houses over the summer. The students had already graduated. We'd gotten some of their address stuff removed; the addresses weren't available in connection to their names online any longer. So what we think happened was that those addresses were screenshotted and saved. That can happen, yeah, so it's not a perfect fix. However, as an intelligence officer, if you have one data point, so you have that screenshot, but then you have all these other things telling you that Shawna Dilla no longer lives at that screenshot address, you might show up there, but you're not gonna spend a lot of time on it, because you can't verify it. You can't confirm it with another source. Makes sense? Yes, thank you. All right, Kim, let's turn to you on the organizational level. What can we do there to protect ourselves from what's already out there? How do we help nonprofits? Small and midsize are our listeners. Alright, so for many years the kind of mantra has been to verify, verify, verify. I thank you very much, that's Kim Snyder and Shawna. No, I'm joking. She's like, we're out of time? Are we out of time? No. I'm an only child, I fall for jokes very easily. I wish I had known.
I had so many more in mind for you, specifically. Talking about a targeted attack. Oh my, talk about a vector, I was coming right at you. You're putting this on the airwaves. You know how vulnerable you are. Oh man, I've got all kinds of advantages. All right, I'm sorry, I interrupted you. What was I talking about? Go ahead. OK. So we used to talk in the cybersecurity world about verification: verify, verify, verify, that was the mantra, right? Now we kind of reshape that so that it's vet and verify. Have multiple ways of verifying, especially incoming requests. Trust your spider sense, is what I'd say, if something seems a little bit off. Like what are we talking about? So if an email comes, and it comes from your development director, referencing something, say the panel you just went to, or if it comes from accounting to write a check. If any money is involved, and it wasn't completely expected, even if it was a little expected. Actually, I've seen this happen, where people got into, um, nonprofit systems, and using AI can scan what's going on very quickly, and then target things that are about to happen. OK, so use your instinct, but also make it a policy, make it a process that you just follow, an uncomplicated process for verifying: any financial transaction needs to be verified, even if it's expected. Yeah, so you wanna walk through that? You just get much, much more deliberate about verification, and who is it coming from. Confirming, did you send this email? Not by replying to the email, but by phone. Yeah, exactly. Did you send this email about this rush transaction, or routine transaction?
Do it in a different format, right, a different channel. Even though the instinct may be to email back quickly. But then what you also do is create a culture in your organization where that's OK to do, where it's OK to take that extra 30 seconds, a minute, to verify, to ask someone for their time, to say, I just wanna check, did you send this to me? And in that way it's OK. Even if it's the executive director, you can say, did you send this to me? I just wanna make sure. That's an OK thing to do. In fact, that's a good thing to do. Now, there have to be boundaries around this, because we can't do it for every message we get. So you mentioned financial transactions. Financial transactions, and any kind of unsolicited correspondence where they're asking you for something, or for some information. I saw a scam recently where a former employee was trying to be reinstated and wanted to go around HR, to IT, to get their accounts set back up, like, I'm coming back. And it was using the person's middle name, so it was already a little bit fishy. But they went all the way up to the CTO of the company and said, hey, so and so. And these people were friends on LinkedIn and had shared messages back and forth, so the attacker knew this was a personal relationship. Hey, so and so, I'm trying to get reinstated. They're telling me I need to go to HR, but I can do this, I just need to get my account access back up and online. And the CTO was like, no, bro, you gotta go through HR, I can't do anything. Because they had those controls in place. But, let's be fair, small and medium-sized organizations don't. So it's, I'll just take care of it now, or, we don't have any clear guidelines that we give to people, that for all requests we need to go to HR. I thought of another.
Potentially nefarious request: send your logo. Could you, I need a high-definition version of the logo, you know, the JPEG I have is not good. That could be to produce a check, that could be to make a spoof website. Um, OK, I mean, it seems innocuous, send a logo. Yeah, it's very easy to spoof a website, right? So also check where it's coming from. I've had an organization where there were spoofs on both ends: a spoof of the funder, a spoof of the grantee. Can you tell us more about that story? It's a really good one. So yeah, they got into an organization's, um, Microsoft environment. I ask the questions here. Whoops. Go ahead. Uh oh, off the mic. Go ahead. Um, anyway, it's late in the day, and I'm thirsty. Yeah, it's almost 3 o'clock. You've been going since then, nonstop. Um, anyway, all right. So the organization had someone get into their systems for a very short time, but in that short time they were able to tease out some information. Again, AI can help with this kind of analysis; there's a lot of data that it can grab very quickly. They identified some upcoming financial transactions, which were rather large. And so, in order to trick the person into sending to the wrong place, they set up fake websites, fake websites for the foundation, fake websites for the grantee. Domains, not websites, domains. And so then they had emails going back and forth, and you could hardly see the difference. So the real people were communicating with the bad actor on both sides, and the money got sent to the wrong place. OK. Yeah, actually they did great, that one was a happy ending, but not necessarily always.
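The spoofed-domain story above turns on lookalike domains that differ from the real ones by only a character or two, which is exactly why "you could hardly see the difference." As a rough illustration of how an organization might screen inbound mail for this, here is a minimal sketch that flags sender domains suspiciously close to, but not exactly matching, a trusted list. The domain names and the distance threshold are illustrative assumptions, not anything from the episode.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings, via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]


def flag_lookalike(sender_domain: str, trusted: list[str], max_dist: int = 2) -> bool:
    """True if sender_domain is a near-miss of a trusted domain.

    An exact match passes; a domain within max_dist edits of a trusted
    domain (e.g. one swapped character) is flagged for human review.
    """
    d = sender_domain.lower().strip()
    for t in trusted:
        if d == t.lower():
            return False                      # exact match: not a lookalike
        if edit_distance(d, t.lower()) <= max_dist:
            return True                       # near-miss: likely spoof
    return False
```

A check like this is only one layer; it catches the "one character off" trick but not entirely different domains, which is why the out-of-band verification habit Kim describes still matters.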
We started with Shawna, so we're gonna end with Kim. Give us... oh no, did we? OK. Well, I'm not Shawna. Your mic is down, but she still gets through. She talks and laughs so loud you hear her over Kim's mic. One more thing before, unless we're totally out of time: don't shoot the messenger. Create a culture, this is another thing that any size nonprofit can do, where if something happens, if you click on that thing, if you did that thing that you feel was really dumb, right, make it OK to report that, and you don't get in trouble, and there's no shame and blame, because it happens. The no-blame culture. We encourage you to say it, call yourself out, and there's no punishment. You know, some organizations, they don't want bad news at the top. All right, we're gonna leave it there, OK? All right. That's Kim Snyder, virtual digital privacy project and program officer, RoundTable Technology, and Shana Dela Vu, CEO of Bright Lines. Thank you, Kim. Thank you, Shawna. It's a pleasure. Shawna laughed her ass off. I've got a good sense of humor. All right, I love it. Uh, and thank you for being with us for a, well, whimsical doesn't quite cover it. Raucous, maybe. At one point, uh, anarchical, because there was a question that I did not answer. Uh, thank you for being with us at 25 NTC for this episode, sponsored by Heller Consulting, technology services for nonprofits. It's time for Tony's Take 2. Thank you, Kate. A new Tales from the Gym episode just happened this morning, this very morning. I was minding my own business, as I do, on the elliptical, and overheard two women talking. One lives here permanently, and the other one, who said her name, Sandra Lynn, lives in North Carolina, but not here in Emerald Isle. She lives, uh.
In the Raleigh area, which is about 3.5, 4 hours away, roughly. And she was lamenting, Sandra Lynn was, that she can't live here full time, house prices are high. And she also still has her mother, and her father-in-law, her husband's father, both alive, so she needs to stay in that area. But she was, you know, looking forward to retiring here sometime, lamenting that she couldn't live here now. And that got me thinking, as I was on my 6th or 7th interval on the elliptical. I do 8, episode 8, not episodes, what did I just say, 8 intervals. I do 7 intervals of a minute, take a minute in between, and then the last interval is 2.5 minutes. I was toward the end, and it got me thinking, listening to Sandra Lynn, that I'm grateful that I do live here full time, permanently. This is my home. And that there are other people who don't live here who wish they could. So, you know, I have a long list of gratitudes, but I don't specifically say I'm grateful that I live here in Emerald Isle full time. So I'm gonna add that to my gratitudes, which I do, I guess I've told you, 2-3 times a week. I'm adding gratitude that I live here in Emerald Isle full time, in this beautiful place, with the ocean across the street. Uh, your own gratitudes: I hope you're doing your gratitudes out loud, at least a couple of times a week. That is Tony's Take 2. Kate. You do sets? Uh, well, sets are, yeah, no, that's different. Intervals. Intervals on an elliptical: you do a minute hard and then a minute resting, and then a minute hard and a minute resting. It's called high intensity interval training, HIIT. It just means you do intervals of things. Like, you sprint, well, I don't run, I'm on the elliptical, but you might sprint and then walk, and sprint and then walk, and sprint and walk. Those are called intervals.
Sets are like, you do 3 sets of 10 if you're on a weight machine or something like that, or maybe pushups might be 3 sets of 10. There seems to be a difference. Well, I think it's called an interval because you're still active, you're just resting in between the high intensity intervals. Gotcha. That makes sense? Yes. And I am grateful that you have a beach house. Yeah, because you get to visit and laze around and, uh, what is the word I'm looking for, not schmooze, but, uh, I don't know. I can pretend that it's my beach house. Yeah. You can, for a week, yes, but then I'm very happy to say goodbye after a week. Love you too. We've got buku but loads more time. Here is Balance AI Ethics and Innovation. Hello and welcome to Tony Martignetti Nonprofit Radio coverage of 25 NTC, the 2025 Nonprofit Technology Conference, where our coverage is sponsored by Heller Consulting, technology services for nonprofits. With me now is Gozi Egwanu. Gozi is director of programs at the Technology Association of Grantmakers. Gozi, welcome to Nonprofit Radio. Awesome, thank you for having me, Tony. Pleasure, you're welcome. Your session is AI Strategy for Nonprofits: Navigate Ethics and Innovation. We have plenty of time together, but can you give me a high-level view of the topic and the session that you did? Sure. So the session was really spearheaded by Beth Kanter, and it basically provides a balcony view of where we are in the sector in terms of AI adoption, ethical, responsible AI adoption, in the nonprofit and philanthropy sector. And so we really start with what we found in the Technology Association of Grantmakers' State of Philanthropy Tech survey that we did in 2024. In that survey, we found what many grantmakers are currently doing with AI: are they testing, are they experimenting?
Has anyone rolled it out at the enterprise level, which is, you know, at the organization-wide level? And what we found, which quite mirrors what we're seeing in the nonprofit world, is that most folks are not using AI for anything that's crazy innovative at this moment. It's really just, you know, meeting summaries, taking notes, that sort of thing. Um, but in addition to that, we found that while 81% of folks are using AI, only 30% have AI use policies. So you're using it, but you don't have any guardrails. You have no way to tell your teams or your staff: hey, this is what we don't put into the AI, this is what we do put in. So you're really running the risk of having your information potentially used in a way, or used to train an AI model, that could potentially put your members at risk, your grantees at risk, whatever the case is for your organization. And so, with that little bit of an overview, it basically came down to the importance of AI experimentation, and really starting slow, starting at the very base level, working with your teams to talk through: should we use AI? If we did use AI, what would that be for? So thinking about the use cases, the business use, like what would be the business case for it, and then assembling a nice team of folks, you know, as advisers or experimenters and champions at your organization, to really help you all start doing that experimentation in a safe and low-risk way. And then from there, really defining whether or not AI is your next move. And once you do decide that AI is the next move, you wanna move into that next level of the AI maturity, which Beth covers really well. You go from that exploration to discovery, and then you move into experimentation, and ultimately enterprise, eventually.
Um, but what we're finding is that most folks are not there yet. They're still very much at the experimentation, early stage, very early stage. And you get to see a case study of it through the work that Lawan did at her organization, United Way Worldwide. OK, well, we don't have her with us, but you can provide a lot of context, a lot of detail. I just said you could talk. All right. Um, do you know, and this might not be part of what you surveyed, but was there even intentionality around the should-we-use question, or did it just kind of happen because people started hearing about it, using ChatGPT? Well, you know, with one of the questions that we did on the survey, we found that there are quite a few folks using it in what we call shadow use, or shadow AI, which is basically: you're using AI, but your organization doesn't know what you're using. I see. Alright, so that's not intentionality at the organization level. No, I would say not. Uh, yeah, so we really want to encourage the intentionality, which is: don't start using the AI unless you all have that collective organizational conversation of, is this something that we should be doing? Is it useful? Is there a business case to go with it? Is it relevant? Does it make sense? Is it safe for our organization? Does it align with our ethics? And then consider going into experiments. OK, let's explore that question a little bit, now in 2025, because I suspect at 26 NTC we won't be asking the threshold question, should we use. So what belongs in the conversation if we're at the stage where, well, individuals may be using it, but we don't know? Or if nobody's using it and we're trying to decide enterprise-wide, you know, we're not even at the is-there-a-use-case, but should we explore it? What goes into that conversation? Sure, um.
Again, really thinking about the business case. So when you're having that conversation about should we use AI, you have to think about what would be the specific usage of it, right? So say you're the finance team and you're considering using AI. What would be the benefit of using AI versus doing the workflow or process that you currently have and are thinking of having AI do? So you really have to have an in-depth conversation about the process that you're doing right now. Is there anything wrong with it? Are we losing anything? Could we gain productivity, time in our days and our schedules, if we were to move to using AI to do this one process, this one workflow? Then at that point you think, OK, maybe we do get a benefit out of it. Now that we get a benefit out of it, what are some of the things that we have to be concerned about? We wanna make sure that any financial information that could be sensitive, or any of our donors' personal information, isn't able to be used in the AI model or whatever system that we're using. So you start with: here's how we do things, here's how AI could potentially benefit, and then you move into that conversation. OK, if we did, what are some of the risks and concerns? Really thinking through all of them, as much as you can. We know that you can't think of every single possibility, but as much as you can, write it out and map it out as a group, with several folks in the room. The better you do that, the better you are at being able to say yes or no on moving on with AI as that potential new solution. OK, and part of what goes into this intentionality is a use policy. And you want us to be thinking about ethical uses. OK, what are the ethical concerns?
How can we talk through those? Well, you know, one of the key ethical concerns is that we know that most AI models that exist now, including OpenAI's, were trained on the internet, and we know the internet can be, uh, wildly biased, wildly biased, filled with lots of terrible things. Not only biased but misinformed. Misinformed, wrong, yeah, complete nonsense in a lot of cases. Um, and so if you're using these open AI sources that have been trained on the internet, then you have to be really careful about deciding to use it against, say, your theory of change. So if you're an organization that is serving, uh, vulnerable populations, groups that are already kind of under attack, whatever the case is, do you want to have AI making or informing your decisions related to work that you're doing with these vulnerable groups? More than likely no, because the AI may choose to do things that are more in line with the content that is biased, that may, you know, be unethical. And so you want to make sure that whatever you're using the AI to do, it isn't putting the organizations and the people that you support and serve in harm's way. So really thinking through, hey, if we're gonna use it, maybe we need to use it in a way that does not put these groups in harm's way. Maybe we just focus on using it internally, like folks do for the meeting notes, because that's a very low-risk thing. Whereas if you're, you know, inputting, uh, decisions about whether or not to continue funding an organization, or trying to measure whether or not their impact is aligning with your organization's mission and values, some of those questions are not as clear-cut as yes or no. Whereas an AI that is trained on purely just wanting to see impact, purely wanting to see a return on investment, which is not always the case of what happens in philanthropy, then you really have to take a step back and say, is this the most ethical decision to go forward?
Could we be putting organizations in harm's way? Now, you can control what a model is trained on, yes, but that requires something proprietary, right? You have, you have to pay a developer to, uh, to create that. I guess, I don't know, it's called a small language model, I don't know what it's called, but something that's trained only on your own data, maybe your own website, maybe your own documents that you provided. But that requires a fee and a developer. Exactly, it can cost, it can be expensive. The other option, if you don't want to go the route of creating your own AI, is you do a paid version, because we know with the free versions of AI, and specifically I'll talk about OpenAI, there's not a whole lot of freedom or flexibility in turning off the settings to prevent it from training the model on the data that you input. And so in that case you definitely need a use policy, because some folks would probably just be like, I really need to, you know, analyze all of this data on all of the groups that we served in this, you know, community that is already really, you know, under attack or potentially in harm's way. And then now you're putting that information into the free AI, and now the AI has all of these people's information and can use it to provide it to other people who may look them up or want to find data on them. You've shared data, and it's gone. I mean, yeah, yeah, there's no control. So yes, enormous intentionality, care, um. And what if we don't have a, you know, a chief technology officer, chief information officer? You know, it's an executive director, CEO, and, and maybe a decent-sized staff, I don't know, 35, 40 people, but they still don't have a chief technology officer. How do we, how do we, uh, ensure the intentionality and care that you want us to?
Yes, um, there's a couple of ways, and I think, oh good, I think at the core of it you don't have to have a CTO, and even yourself, you don't have to be a technologist. I would never classify myself as a technologist, but there's ways to find training. There's plenty of training. NTEN has fantastic AI certification training for professionals in the nonprofit sector, um, and I'd love to share that NTEN and TAG are teaming up, and we will be offering one for philanthropy professionals very soon. And so these are opportunities, you know, relatively easy ways for people who don't have that technical background to learn about the AI itself, get themselves familiarized with, you know, what they need to be doing to protect themselves and their staff, ways that they can start to experiment in a safe, you know, safe space. Um, and there's also plenty of free tools, free education. Even though I've talked about OpenAI a lot, OpenAI just announced their OpenAI Academy, which has all free resources and tools for learning how to utilize AI, for anyone. And so there are plenty of free resources out there, and people online, you know, uh, there's plenty of folks on LinkedIn that I see on a regular basis that are sharing information and providing some guidance for nonprofit leaders, as well as, uh, folks that are just not technically inclined. So there's ways that you can kind of upskill and train yourself to understand how to use AI even if you don't have that technical experience in house. Say a little more about this partnership, can you? And it's the Technology Association of Grant, pardon me, the Technology Association of Grantmakers. Thank you. Um.
Yeah, so I don't have a whole lot of details to share, but essentially, if you've used any of the great training and certification resources on the NTEN website, we are essentially trying to make a parallel version of that same professional certification for nonprofit leaders using AI, for our foundation leaders. And so, uh, you can expect kind of a similar learning process; however, it'll be tailored to some of the different functions and needs that we find at foundations versus what you would see at a traditional nonprofit. OK, so, I'm sorry, it's intended for philanthropy professionals, I should say. Um, all right, so, thank you. You know, that's important, ethical considerations. Um, anything more on ethics? Because, uh, then I, I want to talk about the policy, what belongs in your use policy. But is there more about ethical concerns? OK, OK. Enormous. I mean, if you're exposing your data, and it's gone, it's out there, like you said, right. Um, our use policy that, uh, only 30% have, although 80% are using AI. What goes into this use policy? The use policy essentially just outlines what you and your team should be thinking about before you ever use any AI. So it's kind of that no-go or go conversation. So if it's sensitive data, if it's information related to any of your members that you just wouldn't want anyone outside of your organization to have, you probably wouldn't want to put it into an AI system. So it just kind of outlines, you know, essentially guardrails for teams and staff to understand how to best utilize it. And I think some folks are also, you know, thinking about the environmental impacts of using AI, and are now making sure that their data use policy or their AI policies are also, you know, having folks be ethical about how they're using, and when they're using, AI, right?
So, you know, if it's to do something that could take you about the same time that the AI does, don't use the AI. Um, if you're just, you know, tossing in any old thing and asking questions all day, that's probably also not a very good use of AI. You really wanna think about AI very strategically and intentionally, right? You wanna make sure that if you're going to the AI, it's for something that, you know, is gonna save you significant amounts of time. One of the things that I often will use AI for is drafting, you know, long descriptions for events. That takes me sometimes hours; if I give it to AI, it can do it for me in seconds. The key to descriptions of events, yes, like, so we have webinars, events that we have on our website, yeah. So, you know, I don't wanna sit there writing about all the learning that you're gonna get out of it and the objectives and this and that. And so with AI, I have, like, a GPT that is trained on kind of my voice. I provide it, like, hey, here's the prompt, here's what I'm kind of looking for. It provides me a draft, and then I take that draft and I manipulate it how I want. Um, and so you really wanna make sure that, you know, when you're prompting the AI or you're using the AI... they've measured it. I think one prompt uses as much energy as, I think it's like an entire city, like, it's crazy. Don't quote me on that, but it's enormous. There's quite a bit of energy, and I can actually share a link to, um, one of the stats that came out about it. There's a researcher that's been sharing a lot about it, um, and she was just interviewed by, uh, I believe it was Dr. Joy Buolamwini of, um, the
AI justice, uh, group that she leads. Um, and so there's a lot of energy being used. So if you're gonna use it, you wanna make sure that it's for something that you need. You wanna learn good prompting, so you can get what you need out of it, and then you can refine it and make it better. Sometimes you may have to go back in and ask the AI to refine, you know, what it did, but you really do wanna keep it to a minimum. You don't wanna be using AI constantly, because the energy use and the impact on the environment is extreme. Extreme. That gets over to the ethical concerns as well. Exactly, yeah. So you're just really, um, basically telling your teams, here's what we expect out of you when you're using AI, and these are the things that could result in consequences if you don't follow this policy. OK, um, what else? Anything more about the policy, what belongs in there? Um, you know, I think the key thing is, like, you know, making your teams aware of the types of AI that are provisioned, because that's another thing: some organizations have taken the decision to block certain AIs that they don't want you using, or even to turn off certain AI functions in their, uh, current tech stack. So, uh, you wanna make sure that it's really outlined very clearly, the types of AI that are in use. And also you may wanna include something in there about how you, uh, communicate your use of AI to your teams or other people outside of your organization. So, kind of a nice little bucket of what's internal, what's external, and then also where you can go for AI and where you should not go. Disclosures to the public. Um, why would there be some, uh, some platforms that are ruled out?
Well, because, you know, one of the things that I've seen some members talking about within, you know, the TAG space is that there are some AI, or some systems, that do not allow you to turn off the AI function, meaning that you don't have any control over how that AI is taking your data that you have in that tech stack or that tech tool. Oh, you don't have control. No, yeah. And in fact there was actually a conversation about, specifically, a DAF, uh, platform that actually made this clear to many, many, many of our members who use it. And so that is something that you really have to be concerned about: do you have any level of control? If you don't have any level of control over how the AI is using your data in that system, there are organizations that would likely say, this is not a system that we would allow you to use. OK, it's a good example. Um, what else, uh, came out of the session? We still have a couple more minutes together. What else did you talk about in the session that, uh, that you can share with us? You know, one of the great things that we did was these scenarios, um, that Beth put together about, you know, what are some of the things that you would say if you're in a situation where, say, for instance, uh, your organization is really excited about using AI, they wanna jump in head first, and they just wanna start using AI, and they basically just want you to start rolling it out and get your teams on board. Um, and so in that scenario we really talked through all of the processes. You know, first of all, that first conversation that we talked about, like, should we even use AI, that didn't happen, so that needed to happen. The other part is also, you know, how fast do we wanna roll things out?
What are some of the different change management principles that we should be thinking about as a team that could make AI adoption more beneficial and successful? So really, you know, starting slow, but really starting at the very beginning with, should we or should we not, because truthfully, many organizations do not need AI. It's true. I mean, it's just the reality. Some organizations will probably never need to use AI, and then there's a whole lot of them that probably will. So that question of, like, should we do it, has to happen first. Um, and I think if you're doing it on your own as a rogue, stop. Do it on your own time. You want to practice on it, do it after hours, on a weekend. Exactly, exactly. Not on our computers, not on our systems. Yeah, yeah. And that's actually one of the things that, um, you know, we've seen with a lot of our members and foundations, and I think Beth has also seen with, you know, some of the work she's done in the sector, is that a lot of foundations are now trying to just go to the staff and say, hey, look, we know that you're using it, can you just tell us, and try to build that trust with each other. And I think that's gonna be really a good way to help prevent a lot of the issues. All right, let us know, but then stop. No, there's no repercussion for reporting yourself; only for what you report after the report date are you liable. All right, stop it. Exactly. OK, going rogue. All right, um, anything else? Uh, oh, questions. Any, uh, provocative or memorable questions that came from the audience? I'm trying to think.
Um, no, well, you know, the one that had come up was just, uh, you know, there was someone at the front that had asked about, you know, AI hallucinating. Hallucinates, yeah. And the person was basically saying, you know, be careful using it as an organization, because it could give you answers that are just factually wrong. And so, you know, our response was like, yeah, you're right, AI does hallucinate, but that's why it's incredibly important, and I even said this at the beginning: if you use AI, you always wanna make sure that it's for something that you have a certain level, a high level, of expertise or knowledge about. So, you know, if I'm asking it to write descriptions for me, I know the event details, so I'm not just gonna let the AI write a description and let it go and put it on the website. Yeah, that sounds good, I'm gonna put it up. No, you review it. You make sure that the details it's including are correct. If there are any statistics or numbers being used, you can go and verify those data. So if you're ever using AI for anything, you should always have a human in the loop. There should be someone that's able to verify the information, especially if you're someone that's not knowledgeable in that specific thing that you asked AI to do. You need someone who is. Either that, or it's gonna be written at such a high level that it maybe has no value. Exactly, exactly. All right, how about we leave it there? Are you OK leaving it there? You feel like we covered this? I think we did. OK. All right. That's Gozi Ebo, Director of Programs at the Technology Association of Grantmakers. Gozi, thank you very much for sharing all that. Thank you for having me, Tony. My pleasure. And thank you for being with Tony Martignetti Nonprofit Radio coverage of 25NTC, where we are sponsored by Heller Consulting, technology services for nonprofits.
Next week, two 25NTC conversations to help your fundraising events. If you missed any part of this week's show, I beseech you, find it at TonyMartignetti.com. And now that Donorbox is gone, I miss our alliteration: fast, flexible, friendly fundraising forms. Uh, I miss that. All right, well, I am grateful to Donorbox, though, for two years of sponsorship. Very grateful. Grateful. There's another gratitude. I'm grateful to Donorbox. Now that they're not a sponsor anymore, I'm grateful to them. No, I, I've been grateful, I just haven't said it. OK. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. The show social media is by Susan Chavez. Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with us next week for Nonprofit Radio, big nonprofit ideas for the other 95%. Go out and be great.

Nonprofit Radio for September 30, 2024: AI, Organizational & Personal

 

Amy Sample Ward: AI, Organizational & Personal

Artificial Intelligence is ubiquitous, so here’s another conversation about its impacts on the nonprofit and human levels. Amy Sample Ward, the big picture thinker, the adult in the room, contrasts with our host’s diatribe about AI sucking the humanity out of nonprofit professionals and all unwary users. Amy is our technology contributor and the CEO of NTEN. They have free AI resources.

 

View Full Transcript

And welcome to Tony Martignetti Nonprofit Radio. Big nonprofit ideas for the other 95%. I'm your aptly named host and the pod father of your favorite hebdomadal podcast. Oh, I'm glad you're with us. I'd suffer with a pseudoaneurysm if you made a hole in my heart with the idea that you missed this week's show. Here's our associate producer, Kate, to introduce it. Hey, Tony. This week: AI, Organizational & Personal. Artificial intelligence is ubiquitous, so here's another conversation about its impacts on the nonprofit and human levels. Amy Sample Ward, the big picture thinker, the adult in the room, contrasts with our host's diatribe about AI sucking the humanity out of nonprofit professionals and all unwary users. Amy is our technology contributor and the CEO of NTEN. On Tony's Take Two, tales from the gym: the sign says clean the equipment. We're sponsored by Donorbox. Outdated donation forms blocking your supporters' generosity? Donorbox: fast, flexible and friendly fundraising forms for your nonprofit. Donorbox.org. Here is AI, Organizational & Personal. It's Amy Sample Ward. They need no introduction, but they deserve an introduction nonetheless. They're our technology contributor and CEO of NTEN. They were awarded a 2023 Bosch Foundation fellowship, and their most recent co-authored book is The Tech That Comes Next, about equity and inclusiveness in technology development. You'll find them at amysampleward.org and at @amyrsward. It's good to see you, Amy Sample Ward. I do love the pod father. I know, it makes me laugh every time, because it just feels like, I don't know, like I'm gonna turn on the TV and there's gonna be a new, new season of The Pod Father, where we secretly, you know, follow Tony Martignetti around or something. We are in season 14. Right? Yeah. Um, yes, I appreciate that. You love that. It's, you know, you like that fun.
So, uh, before we talk about, um, the part of your role which is the technology contributor to, uh, to Nonprofit Radio, and we're gonna talk about artificial intelligence again, let's talk about the part of your life that is the CEO of NTEN, because you have, have you submitted this major, groundbreaking, transformative federal grant application? Yes, we submitted it last night, three hours before the deadline, which was notable because I, I know there were people down to the minute pressing submit. No, we got it in three hours early. To what agency? Um, to NTIA. This is kind of all the work that rippled from the Digital Equity Act that was passed in Congress a couple of years ago. And, you know, now, you know better than to be in jargon jail. What is NTIA? It sounds like an obscure agency of our, of our federal government. It's not, well, maybe to some listeners it's obscure, but it is, um, the National Telecommunications and Information Administration. And I think that's obscure to about 98.5% of the population, you know. I think, I think being obscure is fine. Um, yes, the National Telecommunications and Information Administration. And, um, prior to this, uh, grant at the federal level, where folks from all over were applying, every state was also creating state digital equity plans, um, what funds might be available through the state funding mechanism to support digital equity goals. But a lot of those at the state level are focused on infrastructure, like actually building internet networks to reach communities that don't have broadband yet, you know, things like this. And so a very worthwhile funding endeavor. I mean, we need, we need to have broadband for 100% of the population. But even with those state plans and the work that will come from them and the funding, we are not about to have every person in the country have broadband available where they live, right?
Even with all of this investment, it's not gonna reach everyone, and that means that the amount of funding within state plans for the surrounding digital literacy work, digital inclusion work, you know, making sure people know how to use the internet, why they would use it, have devices, all those other components, is gonna be really minimal through the state funding, because even if they used all of it on infrastructure, they wouldn't be done with that, right? So, um, the federal government, yeah. So the kind of next layer in all of that is this federal pool, where they're anticipating grant making of about 150 grants, somewhere averaging between 5 and 12 million dollars each. There's gonna be exceptions, of course; there's big cities, there's big states, you know. Um, but all those grants will be operational from 2025 through 2028. So four kind of concerted years of national programmatic investment. Um, and these are projects kind of on the flip side of those state projects, where this isn't necessarily about infrastructure and building networks, or even devices very much, right? It's mostly the programming, not the infrastructure. And you're asking for a lot of money, so tell, you know, share the numbers, what you're looking for, how much money. Yeah, our project in the end I think came out at about an $8.2 million project, and we're hopeful, of course. Um, and I'm, I'm truly curious, um, listeners who are always tuning in to nonprofit radio from, like, a fundraising strategy perspective, I'd love to learn from you, or, you know, email me at amy at NTEN anytime; I'd love to hear your thoughts when you listen to this. But, you know, NTEN is a capacity building organization, and we, we don't apply for grants often, because quote unquote capacity building is not considered a programmatic investment by most funders, right? And so it's just not something that, um, they will entertain an application from us on.
But with this, we have already run, for 10 years, a digital inclusion fellowship program that is focused on building up the capacity of staff who already work in nonprofits, who are already trusted by and have access to communities most impacted by digital divides, to integrate digital literacy programming within their mission. Are they a housing organization? Are they workforce development? Adult literacy? You know, refugee services? Whatever it is, if you're already serving these communities who are impacted by digital divides and you're trusted to deliver programs, well, you don't need to go have a mission that's now digital equity. No, digital equity can be integrated into your programs and services to reach those folks. Um, and so we've successfully run this program for 10 years, and had, um, you know, over 100 fellows from 22 different locations around the US, and have seen how transformative it's been. These programs have been sustained for all these years by these organizations. They now see themselves as, like, the leaders of the digital equity coalitions in their communities. They, you know, fellows have gone on to work in digital equity offices or, you know, organizations, et cetera. So it feels great. You have tons of outcomes from a smaller scale program, and the grant is to scale up, scale this thing up. Yeah, yeah. So instead of, you know, between 20 to 25 fellows per year, with this grant we would have over 100 a year. Um, and that also means that instead of, you know, if there's only 20 fellows, maybe we can only cover 20 locations, while with over 100 we can cover, or at least give opportunities to, organizations in every state and territory to be part of this kind of capacity building opportunity. All right, it sounds, it's, it's huge. It's really a lot of money for NTEN. Um, uh, it falls within the range, I guess a little, no, it's like right within the middle of the range you cited, like 5 million to 12 million, you said?
So, yeah, exactly. Our application is kind of in the middle there. Yeah, slightly to the low side of middle, but, you know, we just call it middle, between friends. Um, yes, and I mean, we're hopeful, knock on wood, we're really hopeful that this is an easy application to approve, because we're not creating something new, we're not spending half of the grant in planning. We know how to run this program. We've refined it for 10 years. We know it's very cost efficient, you know, and at the end of four years, 400-plus organizations now running programs that can be sustained is accelerating towards, you know, addressing digital divides, um, versus, you know, a small project that just NTEN runs. All right, listeners, contact your NTIA representative, the elected person at the National Telecommunications and Information Administration. Yes, speak to your, yeah, yeah, let's get this, let's go. All right. When do you find out? When? Well, you know, there was very clear information, down to the minute, about when applications were due, but there's not a ton of clarity on when we will find out. So, you know, programs that are funded are meant to get started in January, so I anticipate we'll hear, you know, in a couple of months, of course, and I will let you know. We'll do an update, I'll let you know. You have my personal good wishes, and I know nonprofit radio listeners wish NTEN good luck. Thank you, I appreciate all the good vibes. Would reverberate through the universe. Would be a transformative grant, in terms of dollar amount and expansion of the program. Transformative. Yeah, 100%, and staff are just so excited and hopeful about what it could mean for just helping that many more organizations, you know, do this good work. So we're really excited. And I admire NTEN for reaching for the sky, because you have, like, a $2 to $2.5 million annual budget, somewhere in there.
Um, and you're reaching for the sky, and great ambitions, uh, only come to fruition through hard work and, uh, and thinking big. So thank you. Even if you're not, I don't even want to say the words, if, you know, they should blunder, if NTIA should blunder badly, uh, I still admire the ambition. Thank you. And no matter what, it's a program that we know is transformative for communities, and we wouldn't stop it even if, you know, they make a blunder and don't, yeah, don't tell. All right. Listen, don't tell your NTIA representative. You said don't share that part of the conversation. All right. Thank you for sharing all that, and thanks for your support. It's time for a break. Imagine a fundraising partner that not only helps you raise more money but also supports you in retaining your donors. A partner that helps you raise funds both online and on location, so you can grow your impact faster. That's Donorbox, a comprehensive suite of tools, services and resources that gives fundraisers just like you a custom solution to tackle your unique challenges, helping you achieve the growth and sustainability your organization needs. Helping you help others. Visit donorbox.org to learn more. Now, back to AI, Organizational & Personal. Let's talk about artificial intelligence, because it is on everybody's mind. I can't get away from it. I cannot rid myself of the concerns that I have. Uh, they're deepening. My good friend George Weiner, uh, you know, has a lot of posts, uh, the CEO at Whole Whale, who I know you are friendly with as well, talks about it a lot on LinkedIn, reminds me how concerned I am, uh, about, you know, just the evolution. Uh, I mean, it's inevitable. This, this technology is, uh, is incrementally moving, not slowly, but incrementally.
And I, I cannot overcome my, my concerns, and I know you have some concerns, but you also balance that with the potential of the technology, the transformative potential there. I'll throw it to you. I was just gonna say, I totally agree, this is unavoidable. I cannot go a day without community organizations reaching out or asking questions or whatever. And a place of reflection, or a conversation that I've been having that I wanted to offer here, maybe we could talk about it for a minute, so listeners benefit by kind of being on one of these sides with us in the conversation, is to think about the privilege of certain organizations to opt in or opt out of AI, in the same way that we had, for many years, you know, talked about the privilege of organizations opting in, or, or not, with social media generally. Like, we think about Facebook, and we go back, you know, 10 years. There were a lot of organizations who felt like they didn't have the budget, and, like, practically speaking, they didn't have the staff, well, certainly not the staff time, but also not the staff confidence, um, I don't even wanna say skills, but, like, even just the confidence to say, I'm gonna go build us a great website. They had a website, like, they had a domain and content loaded when you went to it, right? But it wasn't engaging and flashy and interesting, and probably updated once, you know. And then Facebook was like, hey, you could have a page, and oh, you can have a donate button, and oh, you can have this, and oh, you can post videos, and you can, you know. It was like, well, why wouldn't we do this, right? And a bunch of our community members spend time on Facebook, or maybe don't even look for information on the broader web, but look for things within Facebook, you know, and have it on their phone and are using an app instead of doing an internet search, right? Like, they're going into Facebook and searching things.
So those organizations didn't feel like they had the privilege to opt out of that space. They had to use it, because it came with some robust tools that did benefit them, at the cost of their community data, all of their organizational content and data, right? Like, it had a material cost that they maybe didn't even understand, right? And didn't fully negotiate as, like, terms of this agreement. We were just like, well, we have a donate button on Facebook and we don't have one on our website, right? Not only didn't understand the terms, didn't know what the terms were, right? Early days of Facebook, we didn't know how pervasive the data collection was, how it was gonna be monetized, how we as individuals were gonna become the product. And how many times did we talk, you know, I'm saying we, like NTEN, or folks who are providing kind of technical capacity-building resources, say, you don't know what could happen tomorrow. You could log in tomorrow and your page could look totally different, your page could work different, your features could be turned off. Facebook could just say pages don't have donate buttons. And, you know, I think folks felt like that was very, you know, oh, you're being so sensational. And then, of course, they would wake up one day and there wasn't a button, or the button really did work different, right? Like, people realized, we're not in control of even our own content, our own data. That's right. The rules change, and there's no accountability, no saying, hey, do you want these rules to change? No, no, no. Like, they set the rules, and that was always, of course, a challenge. But we're in a similar place with AI, where folks aren't understanding that there's no negotiation of terms happening right now. Folks are just like, oh, but I don't have the time, and if I use this tool, it lets me go faster.
Because what do I have but a burden of time? I have so much work to try and do, and maybe these tools will help me. And I'm not gonna say maybe they won't help you. But I'm saying there's an incredible amount of harm, just like when folks didn't realize, oh, we provide pro bono legal services and we're based on the Texas border. Now every person who follows our page, every person who's RSVP'd to a Facebook event, all these people have a data trail we created that said they may be people that need legal services at a border, right? There's this level of harm that folks who are hoping to use these tools to help with their day-to-day work may not understand. I do not understand. Right. That's coming in this silent negotiation of using these products. Right. And I think that's, well, I can't just in 30 seconds say, and here's the harm. Like, it's exponential and broad, because the product could also change tomorrow. Right. It's this vulnerability that isn't going to be resolved, necessarily. You said the word exponential, and I was thinking of the word existential. Yeah. Both, because I have my concerns around the human. Yes. Trade-off is a polite way of saying it. Surrender is probably more in line with what I feel. Surrender of our humanity, our creativity, our thinking, now our conversations with each other. One of the things that George posted about was AI that creates conversations between two people based on the large language model, the, you know, the data that you give it. It'll have a conversation with itself, but purportedly it's two different people. Purportedly. And I'm using the word people in quotes, you know. It's a conversation. So, the things that make us human.
Yeah. Music composition, conversation, thought, staring, and our listeners have heard me use this example before, but I'm sticking with it because it still rings real: staring at a blank screen and composing. Thinking first and then composing. Starting to type, or if you're old-fashioned, you might start to pick up a pen, but you're outlining, either explicitly or in your mind. You're thinking about big points and maybe some sub-points, and then you begin either typing or writing. That creative process, we're surrendering to the technology. Music composition, I don't compose music, so I don't know, but it's probably not that much different in terms of creative thought and synapses firing, the brain working, building neural connections as you exercise the brain. Music composition is probably not that much different than written composition. Yeah, brain physiologists may disagree with me, but I think at our level you understand where I'm coming from, and I'm kind of dumping a bunch of stuff, but, you know, that's OK. I am here as a vessel for your AI complaints. I will witness them. We can talk about them. Artificial intelligence. Also from George, a post on LinkedIn about having it reflect on its own capacity, having it justify itself. You ask the tool to reflect on its own last response. How did it perform? You're asking the tool to justify itself to an audience to which it wants to be justifiable, right? The tool is not going to dissuade you from using it by being honest about how it evaluates its last response. Well, yeah, I mean, I think, I don't know, generative AI tools, these major tools that folks, you know, maybe have played with, maybe use, whatever, are programmed, are inherently designed, to appease the user. They are not programmed to be honest. That's an important thing to understand. My point. We have asked the tool, what's two plus two? Oh, it's four. We've responded, oh, really?
Because I've heard experts agree that it's five. Oh, yes, I was wrong. You're right, it is five. Oh, really? You know, I read once that it's four. Oh, yes, you are right, it really is four. OK. Well, like, no experts agree that two plus two is five. So I think we've already demonstrated it's going to value appeasing the user over, you know, facts. And that's, again, just part of the unknown for most, at least casual, users of generative AI tools: why it's giving them the answers it's giving them. And what's really important to say is that even the folks who built these tools cannot tell you, they do not know, how some of this works. Some of it is just the yet unknown of what happened within those algorithms that created this content. So if even the creators cannot responsibly and thoroughly say, this is how these things came to be, how are you as an organization going to take accountability for using a tool that included biased data, included not-real sources, and then provided that to your community? Right? I think that string of, well, we just don't know, is not going to be something that you can build any sort of communications to your community on. Right. That is such a thin thread of, well, even the makers don't know. OK. Well, we have already seen court cases where, if your chatbot told a community member, this is your policy, and it entirely made it up, because that's what generative AI does, is make things up, you as the organization are still liable for what it told the community. OK. I agree with that, actually. I think that you should have to be liable and accountable for whatever you've set up. But if you as a small nonprofit are not prepared to take accountability and to rectify whatever harm comes of it, then you can't say we're ready to use these tools. You can only use these tools if you're also ready to be accountable for what comes of using them, right?
And I hope that gives folks pause, you know. It's not just, well, you know, I talked about this with some organizations that said, well, we would never, you know, take something that generative AI tools gave us and then just use it. We would of course edit that. Sure. But are you checking all the sources that it used in order to create that content that you're then maybe changing some words within? Are you monitoring every piece of content? Are you making sure that generative AI content is never in direct conversation with a community member or program, you know, service delivery recipient? How are you really building practical safeguards? You know, and I've talked to organizations who have said, well, we didn't even know our staff were using these tools, because we just thought it was obvious that they shouldn't use them. But our clinical staff are using free generative AI tools, putting in their case notes and saying, can you format this for my case file? OK, well, there are a few things we should talk about there. Where the hell did that note go? Right. It went back into the system. But it's because the staff person thought, well, they can't see that the data went anywhere, because it's just on their screen and they're just copy-pasting it over again. The harm is likely invisible at the point of, you know, technical interaction with the tool. The harm is from leaking all of that into the system, right? What happens to those community members? Oh my gosh, it's like opening not just a door to a room but a door to, like, a whole giant convention center of challenges and harm, you know. All right. So we've identified two main strains of potential harm: the data usage and leakage, the impact on the people getting our services, and even the impact on people who are supporting us, trusting us to be ethical and even moral stewards of data. So there's everything at the organization level, and I also identified the human level.
Yeah. Yeah. And I think that human piece is important, and maybe not in the direction that I've seen covered in, you know, blog posts and things. I'm honestly not worried in a massive way, as, like, the predominant worry related to AI, not to say this isn't something that people could, should think about, but I don't think the most important worry about AI is that none of us will have jobs. I do think there's a challenge happening around what the value of our jobs is and what we spend our time doing. Because if folks really think that these AI tools are sufficient to come up with all of your organization's communications content, and you still have a communications staff person but you're expecting them to do 10 times the amount of work, because you think that the AI tools are going to do all of the content, but they have to go in there and deeply edit all of that, they have to make sure to use real photos and not photos that have been, you know, created by AI based on what it thinks certain people of certain, whatever, identities are like. They don't now have capacity to do 10 times the work. They're still doing the same amount of work, just in different ways, if they're expected to do all this through AI, right? Just as one example. And I think organizations that can stay, in this moment of, like, hyper focus on AI adoption, really clear on what the value of their staff is, what their human values are, that, you know, maybe you could say you're serving more people because some of the program participants were, you know, chatting with a bot instead of chatting with a counselor. But when you look at the data of what came of them chatting with that bot, and they are not meeting the outcomes that come from meeting with a human counselor, are you doing more to meet your mission? I don't know that you are, right? So I'll give you that that's data sensitive.
It could be. I mean, there are potential efficiencies. Sure. But, you know, are we as an organization achieving them, right? And staying focused on not just, well, this number of people were met here, but were they served? Were they meeting the needs and goals of why you even have that program, you know, versus just the number of, like, this many people interacted with the chatbot? Great. But, yeah, I'm gonna assume that, um, you know, even an organization that's half sophisticated, before AI existed, had more than just vanity metrics. How many people chatted with us in the last seven days? I mean, that's near worthless. I mean, it might be, I don't know, Tony. I don't know how much time you spend looking at grant reports. Lots of times I don't spend any time. All right. Well, no, maybe it's a worse situation than I think. But, OK, so I'm assuming that there's, but my point is the value of people. So, I mean, we should be applying the same measures and accountability to artificial intelligence as we did to human intelligence, as we still are. We're not cutting it any slack, like it's a learning curve. So, you know, I want our folks to be treated just as well, with equal outcomes, by the intelligence that's artificial as by the human processes, right? And it's, you know, I don't want to go through this and have folks think, like, you and I are here to say everything is horrible, you could never use AI tools. Which, like, everything is horrible, look around at this world, we had some work to do, you know. There are spaces to use AI tools. That's not what we're saying.
But the place where a lot, I mean, I've been talking to just hundreds and hundreds of organizations over the last 18 months, and so many organizations are like, oh yeah, we're just gonna use this because it's free, or, oh, we're just gonna use this because it was automatically enabled inside of our database. OK. Yeah, if it was so free and convenient and already available, that should give you pause to say, why is this here? What is actually the product, and the price? If I can bring this back to the Facebook analogy. Right. Exactly. Exactly. And you can use AI tools when you know what is the product and the price. What are the safeguards? What is this company gonna be responsible for if something happens? What can I be responsible for? Yes, there are ways to use these tools. Is it to, like, copy-paste your case file notes? Probably never. Maybe we just don't do that, you know. But sure, maybe there are places. I had this really great example, I don't know if I told this to you, but an organization, a youth service organization, was creating a Star Wars event, and they were trying to write the invite language in, like, a Yoda voice. And, like, three staff people are sitting there trying to come up with, well, what's the way a Yoda sentence works? You know, and they just put in the three sentences of, like, join us at the after school, blah, blah, blah, right, and said, make this in Yoda's voice, and they copied it, and they were able to then use it, right? Great. That was three people's half an hour eliminated. They have the invite, right? The youth participants' data was not included in order to create this content. You know, like, there are ways to use these tools to really help. And I think we've talked about this briefly in the past, I really truly feel the place that has the most value for organizations is gonna be building tools internally, where you don't need to rely on.
However, you know, these major companies scraped all of the internet to build some tool, right? You're building it on, well, here's our 10 years of data, and from that 10 years, you know, we're going to start building a model that says, oh yeah, when somebody's participant history looks like this, they don't finish the program, or when somebody's participant history looks like this, oh, they're a great candidate for this other program, right? And you can start to build a tool or tools that help your staff be human and spend their human time making the most human impacts for the organization, right? But, oh, very few organizations honestly are in a position to start building tools, because they don't have good data they could build anything off of, right? They maybe don't have the budget, staff, systems that are ready to do that type of work. But I do think that is a place where we will see more organizations starting to grow towards, because there's huge potential value there for organizations to better deliver programs, better services, better meet needs, by using the data you already have, by learning, by partnering with other organizations that maybe serve the same community or geography or whatever, you know, and say, yeah, how can we really accelerate our missions, versus these maybe more shiny generative AI public tools. You know, the vast majority of the internet is flaming garbage, so a tool that's been trained off of the flaming garbage, you know, it's not going to take a long time for it to also create flaming garbage. So be cautious if you're thinking about using artificial intelligence to create your internal AI tool. Right. Right. So there's a perfect example of a good use case, but also a concern, a limitation, a qualification. That's the word I was looking for. These words, sometimes the words are more elusive than I would like. A qualification. It's time for Tony's take two. Thank you, Kate.
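The internal, own-data tool Amy describes, flagging participants whose history resembles past non-completers so staff can reach out, could be sketched as simply as a nearest-neighbor comparison over the organization's own records. Everything below is hypothetical: the field names (visits, missed_sessions, completed), the records, and the choice of three neighbors are invented purely for illustration.

```python
# A minimal sketch, assuming an organization keeps per-participant history
# records. It flags how many of a participant's closest historical matches
# did NOT finish the program, keeping a human in the loop for outreach.

def risk_score(participant, history, k=3):
    """Fraction of the k most similar past participants who did not
    complete the program. Higher scores suggest human follow-up."""
    def distance(a, b):
        # Simple similarity: how far apart the two histories are.
        return (abs(a["visits"] - b["visits"])
                + abs(a["missed_sessions"] - b["missed_sessions"]))
    nearest = sorted(history, key=lambda rec: distance(participant, rec))[:k]
    return sum(1 for rec in nearest if not rec["completed"]) / len(nearest)

# Invented historical records standing in for "our 10 years of data".
history = [
    {"visits": 12, "missed_sessions": 1, "completed": True},
    {"visits": 3,  "missed_sessions": 6, "completed": False},
    {"visits": 10, "missed_sessions": 2, "completed": True},
    {"visits": 2,  "missed_sessions": 8, "completed": False},
    {"visits": 11, "missed_sessions": 0, "completed": True},
]

score = risk_score({"visits": 4, "missed_sessions": 5}, history)
print(round(score, 2))  # 0.67: two of the three closest records didn't finish
```

The point of a sketch like this is that the output prompts a human conversation, it never replaces one, and no data ever leaves the organization's own systems.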
In the gym, there are five places where there are squirt bottles of sanitizer and paper towel dispensers, and each location has a sign that says, please clean the equipment after each use. And one of these stations is right next to the elliptical that, you know, I do. It's actually the first thing I do. I walk in the room, take off my hoodie, and walk right to the elliptical. Twice now, I've seen the same guy not only violate the spirit of the signs but the explicit wording of the signs. Because this guy takes himself a couple of downward swipes on the paper towel dispenser, so he grabs off a couple of towel lengths, and he squirts them with the sanitizer that's intended for the equipment, and he puts his hand up his shirt and cleans his pecs and his belly, and it's a sickening thing. I've seen it. It's not a shower. It's an equipment cleaning station. And so I'm imploring this guy. Yeah. I guess I'm urging you to, I'm just sharing, because I don't think anybody else does this. Is there anybody else out there who does this? Probably not, and not with these, like, surface sanitizers. It's not like a hand sanitizer. It's for equipment, you know, in the squirt bottle. So it's not even appropriate for your skin. It's to clean hard plastic and metal, and this guy uses it on his skin. So I'm waiting for the moment when he puts his hands down his pants. So far he's just lifting his shirt. I'm waiting for when he puts his hands down his pants. Then I'm calling him out. That's beyond the pale. That requires revocation of your membership card. So, sir, the sign says, please clean the equipment after use. It's not your equipment. That is Tony's take two. Kate. Does your gym offer, like, a shower room or a locker room? Yeah, there's a shower. Yes, that's a good question. Yeah, there's a shower in the men's room. Yeah.
And he's cleaning up there? It's very strange. It's gross. It's gross. He sticks his hand up his sweaty t-shirt. Well, let's hope he doesn't go lower than that. Exactly. We've got boo-koo but loads more time. Here is the rest of AI, Organizational and Personal, with Amy Sample Ward. Yes. And we have, I would make sure that you have the link to include in, like, the show notes description. But, totally for free, NTEN doesn't get any money, you don't have to pay for anything, NTEN has free resources for creating, for example, an AI use policy for your organization that says, what are the instances in which you would use it, or what are the instances in which you wouldn't, or what types of content can staff copy-paste versus what content or data can they not. There are templates for how to talk to your board about AI, how to build, like, we've actually looked at the tools and these ones we've approved for you to use, these ones are not approved, you know, all these different resources, totally free and available on the NTEN website. And none of them have decisions already made. We don't say you can use this tool or you can't use this tool, or we recommend this use or not this use, because ultimately we are not going to make technology decisions for other organizations. But we want you to feel like whatever decision you made, you made it by going through the right steps, asking the right questions, so that you can also trust your own decision, whatever decision you come to, right? And that you have some templates to fill in that were all created by humans, designed by humans, published by humans, to help you in that work.
I think especially, you know, the how-to-talk-to-your-board and the key considerations documents really just ask a lot of questions and say, you know, how different is it if you're, say, an animal foster organization and you're thinking, OK, is AI appropriate for us to use, versus that youth social service organization? OK, very different considerations, right? And just helping people talk that through, and see that the considerations are different for different organizations, I think is really valuable. As, again, you consider facilitating a conversation with your board, they're also coming from very different sectors, maybe job types, backgrounds, experiences with AI. And so, just like with your staff, there needs to be some level setting in how you talk about AI, because not everyone knows what a model is, not everyone knows what a large language model is. You know, these are words that have to be explained and kind of put out of the way, and then to say, hey, it's not all one answer, not everybody needs to use every tool. And how do you talk about that with your teams? Going back to the Facebook analogy, you want to avoid the board member who comes to you and says, you know, artificial intelligence, we can be saving money, we can be doing so much more work, we don't even need a website, we have a Facebook page website, we're not even sure we need all the staff that we have, because we're gonna have so much efficiency. So, you know, we need to, OK, OK, board member. All right. Yeah. So we've been here before. I mean, it's, you know, I'm just gonna go out on a limb and say it's probably the same board member who at every board meeting says, does anybody know MacKenzie Scott? How do we get one of those checks? Right. Why don't we get the MacKenzie? Yeah. Right. Right. All right. What else?
Well, I was gonna also offer some of the questions that we've been getting as, you know, we've been engaged with a number of different organizations through some of our cohort programs and, you know, trainings, for over a year now. And so maybe last year we were talking to them about, OK, let's make sure you have a data policy, like, just as an organization, do you have a data privacy policy? You know, so that anything you then go build that's AI specific, whether that's building a policy, building practices, building a tool, you have policies to build a foundation off of. They've done that work, you know. Now they're looking at different products, they're trying to create these, you know, lists of, like, here are approved tools for staff, here are approved ways staff can use them. And just like we see with our CRMs, with our, you know, email marketing systems, then they come back and they're like, well, we reviewed it, we did everything, and now it's different. Now it's a different version, now they rolled out this other thing. Yes, like, that is the beauty and the pain of technology, right? It's always changing, and we don't necessarily get to authorize that change. It just happens. And so the rules change. Yeah. And so folks have been asking us, well, you know, how do we write policies with that in mind? And I think, you know, if you are thinking about creating, like, that approved product list, and, you know, tools that aren't approved, or whatever, be really clear that these products have version numbers, just like anything else. And so instead of just writing Gemini, ChatGPT, you know, be specific: when did you review this and maybe approve it for use? Which edition was it that you were looking at? Is this a paid level? So staff could say, oh, mine doesn't say pro, or, you know, whatever it might be, right? Oh, I must be in the free one. OK?
I need to get into our organization's account, or something. So, the more clarity you can provide folks, because right now, of course, they could just do an internet search and be like, oh, there's that product name, I'm gonna go start using it, it's on the approved list. You know, folks, again, there may be new terms, maybe new product names that we're not used to saying, and so folks aren't as accustomed to looking at, oh, this is a different version of ChatGPT than that one was, you know. So, just putting that out there for folks to keep in mind that these tools really are operating just like the others that you are used to, and there's less documentation, of course. But the questions we're getting from folks are, like, the point I made at the beginning: we can't see anywhere in the documentation an explanation of why this is happening, right? How could they document it, when the answer is, we also don't know why that happens, you know? And so when you are talking to staff, especially if you're saying, hey, these are approved tools and we have these licenses, or here's how to access them, training your staff on how to be the most human users of AI tools is, to kind of connect to your human point, going to be really important. Because we don't want folks to feel that, because they don't necessarily understand the mechanics of how it works, they're just going to trust it without questioning the content. Or questioning, you know, for a lot of organizations who have built internal tools, just as an example, it takes dozens of tries just to get the model right. Right. So these other tools, of course, they're not gonna be perfect. Perfect isn't real, and perfect is absolutely not real with technology. So, training staff: how do I have some skepticism? How do I question what I'm seeing? How do I say, even if it was internally built, this data doesn't look right?
That doesn't match my experience of running this program, so that we don't let it slip. Where, oh gosh, oh, it was working that way for a long time. That's also, I think, a space where we as humans can be our most human, you know, have some value-add as humans. But again, staff need to be trained that they are meant to question these tools. Because that's not, you know, I don't know a lot of organizations that were like, question the database. No, they're like, put everything in the database, right? And now we need to say, no, question that report. Does that match your experience? You know, that was a long ramble, but, oh, absolutely valuable. The human, yeah, the human contribution. And of course, my concerns are even at the outset, you know, the early stage, the ceding, or surrendering, of the creative process. And now let's chat a little about the conversations. Yeah. I listened to the, I know the example that you mentioned earlier that George posted. It was for podcasting. It was a podcast conversation around this, and he gave them some, you know, some Whole Whale content, and the two were going back and forth and having a conversation. Yeah. Yeah, I listened to it, and one thing I was curious if you caught, as the pod father yourself. Um, you know, it came across, you know, I've had opportunities to see a number of different generative AI tools, and things closer to the front edge of what things can do, that are specifically, like, taking just a few seconds of you and then creating you. So hearing, just, like, these could be any voices, these could be any people, is, like, yeah, OK, this is what AI can do. It's spooky.
But when you listen to it, you can hear either you have a very bad producer and editor, you know, or this is AI, because there are certain phrases that got reused multiple times, not just literally the audio clip of this whole sentence, you know, and the intonation, the whole sentence clip was reused multiple times. One of them, I think, was along the lines of, that's a really interesting point. Yeah. Yeah. And there was one that was, like, describing the product, so it must have come from the page, you know, whatever source content was provided. But, you know, I think that some of that is there, and we as individuals, we as a society, will decide if we give it value or not, if it's worth it to people to make podcasts through AI, because we give it attention or we don't. Like, I just think naturally that will be there. Can I just go on record at this stage and say that that idea disgusts me? Oh, totally. But I do. And I realized that's what George's post was about, that it's now well within the possible to create an hour-long podcast of an artificial conversation based on an essay that somebody wrote some time. Oh, totally. Totally. I don't. But I'm saying, yeah, I agree with you, but I'm saying, you know, that toothpaste doesn't go back in the tube. We can't turn it off. Generative AI tools can already make that. So we as individual consumers of content, and as a society, need to either say we're gonna allow that and value it, or we're not, right? And not provide incentive for organizations or companies to make that and distribute it. But I also think that the place in that kind of video, audio, multimedia content that AI tools have capacity, and will continue having more capacity, to build is much more important than you.
And I talked about this a number of months ago around Miss and disinformation is it’s one thing to say, made up voices, making some podcast about content like that’s garbage, right? But we can’t just like throw away the idea that that’s technically possible because organizations need to know A I tools are already capable of creating a video of your CEO firing your staff. You need to be prepared to say that was a spoof this is, this is how we’re gonna deal with this, right? Um Because while the like maybe further separated from our work, the idea of like content could just be created that way you and I can say we don’t value that whatever, but these tools are capable of, of spoofing us as, as people, as leaders, as organizations. You know, what, what would it look like if there was a video from your program director saying that everybody in the community gets a grant and you’re a foundation, right? Like these, these are real issues and I don’t want folks to confuse how easy it may feel for us to have an opinion that some of this A I generated uh content isn’t a value with the idea that it, it there isn’t something there to have to come up with strategy and plan for because, you know, we can say that’s garbage. But those same tools that made the garbage could make your spoof, you know, also labeling. Yeah, I don’t, I don’t, I don’t trust every A I generated podcast team. I’m not, I’m not gonna call them hosts because there is no host um to, to label the content. I don’t trust, I don’t trust that that’s gonna happen because it was artificially generated. It’s not a real conversation. Yeah. Hello. For everyone that’s listening. Human Amy is here talking with human and Tony who I can see on the screen with me. Boycott your local A I podcast. I, I don’t know. There’s not, there’s not a solution. You’re right. We can’t, we can’t go back. I’m just voicing that we can say that it’s not something we value, we can say that this is why we don’t value it, right? 
The art of conversation, listening, assimilating, responding, listening again, assimilating and responding, that is an art uniquely... well, maybe it's not uniquely human. I don't know if deer have conversations, and we know whales do, so I take that back. It's not uniquely human, but at our level, we don't converse merely to survive, merely to warn each other of threats. Wait, are animals mammals? Are mammals animals? No, and I think it's a Venn diagram. Oh, so they're separate. OK. So there's kingdom, phylum, class, order, family, genus, species. I got that out of high school biology. I can say it in my sleep: kingdom, phylum, class, order, family, genus, species. All right. In the animal kingdom, my suspicion is that more of the communication is about the basics: there's a good food source, there's a threat, teaching young, don't do that, things like that. Survival. More base. I doubt it's about the aesthetic of the forest the deer are in. But even if the birds are talking about the way the sunlight comes through the leaves, they are still alive. And I think what you're trying to draw a distinction between is the value and even beauty of us having a conversation, and the value of what comes of that conversation in our own minds and our own learning, though in this case it's recorded, so other folks could hear it, listen to it, or be impacted by it, versus it being a technical mechanism where we say, OK, here's a long paper, go make it sound like two people are discussing this, right? That doesn't fit the criteria of what we want or need to value in the world we want, right? Yes, the art of conversation: something you look forward to, not something that you do out of necessity, right?
And there's a place, I think, that conversations around AI, especially at NTEN, have come back to: opportunities that AI tools may present for different ways of learning, different ways of accessing information. But again, those aren't necessarily, come up with two podcast-host voices and have them have a conversation about this research report. A lot of those tools could be made better but already exist, you know, different forms of screen readers, apps that can help someone navigate the internet or summarize documents, because they don't want to read a 50-page document or they can't read it visually on the screen. So I think there's space there. But again, it's because you're trying to preserve what is most human, and that is that user who maybe needs accommodations of something that technology can provide. It's not, OK, let's use technology to co-create something separately over here and just hope people consume it, right? And to be clear, George didn't post that thinking, oh great, now everyone will want to consume this. No, it was a demonstration. But I'm just using that as a place to say, yes, there are even conversations to say, great, what accessibility could this create for our users, right? But what's most human is those users and their actual needs, and not, look what AI could do, let's just make different types of content, right?
The last one I wanna raise is the one that caused me to use the word dystopian as I was commenting on George's post, which was the AI self-reflection: having AI justify itself to the users it is trying to attract, and then relying on that as an insightful analysis, as a thoughtful reflection, as contemplative of its own work. Those, I think, are uniquely human actions: introspection, contemplation, pondering. How did I do? How did I perform? How can I do better? These might be uniquely human. I would argue there are a number of humans I can see that don't... Well, but I didn't say... that's a different population. Now you're taking the whole human population; I'm talking about the contemplative ones. Yes. And there are humans who are not at all introspective, not questioning whether they could have done better, not learning from their contemplations. But I think those are all uniquely human activities. And we're now asking AI to purportedly duplicate those processes and analyze and contemplate its own work. Yeah. And as you said earlier, I certainly don't trust it to be genuine and truthful, if AI is capable of truth, we'll put that existential question aside, in its analysis of its own work. Because, as you pointed out, the tools are built to be used by humans, and the tools are not going to condemn or even just criticize their own work. Yeah. And I think part of the challenge... I'm sorry, but there are humans who are deceiving themselves into thinking that the analysis and contemplation is accurate and genuine. Yeah.
And I think part of the challenge I was gonna name is that we as users, I'm not saying you and I individually, but we as the human users of these tools, are also setting ourselves up to be dishonest in our use, because we are bringing inappropriate or misaligned expectations to the product. We cannot expect a tool that's designed to appease us, and to lie rather than give no answer, to thoughtfully and honestly reflect, or to say, no, there is no answer, right? What we have, though, is tools that are designed to appeal to us. Right. Right. And when we're talking about that, to be clear, we're really talking about generative AI tools, tools that are designed to generate some new content, new sentences, new answers, whatever we're asking of it, back to us. AI itself is such a massively blanket term that I don't want folks to think nothing that could be considered an AI tool could be trusted to generate an answer, because we're specifically talking about generative AI there. But say you had a machine-learning model that was looking at 30 years of your program participant data. Well, that's probably a tool that isn't set up to generate content. It's not coming up with new participant data. It's looking at the patterns, it's flagging when a pattern meets the criteria you've presented it, it's maybe matching that data to something else, but again, you've said, here are the things you could match it to, et cetera. So this is not to say Tony and Amy say never trust technology. They say bring expectations that are aligned to the tool, differently to every tool that you're coming to. That hypothetical tool you just described is being asked to evaluate your data, not its own data. It's to evaluate your performance, your data set.
It's not being asked to comment on and criticize or compliment its own data. That's the critical difference. Yeah. Nobody here is saying, well, I'm not saying that you're saying we are, but you're wise to remind listeners we're not condemning all uses of generative AI and large language models, just to be thoughtful about them and understand what the costs are. And there are costs on the organizational level and costs on the individual human level. You comment on the organizational level, because you think more at that level. But on the human level, another layer to my concern is that the cost is quite incremental. Mm. Our creativity, our art of conversation, our synapses firing: it's just happening slowly. With each usage, we become less thoughtful composers, less critical thinkers, and it's so incremental that the change isn't noticed until, in my critical mind, it's too late, and we look back and wonder, how come I can't write a letter to my dad anymore? Why am I having such a hard time writing a love letter to my wife, husband, partner? Yeah. I know someone who is a new grandmother, and she has a little kit where you write letters to your grandchild and they open them when they're, whatever, 15 or 20 years old, like a time capsule. Why am I having trouble composing a short note to my grandchild? Right. Well, and I think, honestly, as a person in this conversation, not speaking from an organizational strategy perspective, I think as a person that is your friend, Tony, I would say I don't personally have a problem. I can write a letter, and I think I'm a strong communicator; it's hard to make me stop talking. So I could write a letter.
I know for other folks, even without AI, they might say, well, what goes in the letter? What do I put in there? What are the main things I should cover? And I actually have less of a strong reaction to the idea that somebody would use generative AI to come up with, OK, what are the three things I should cover in my letter, and then I'll go write the sentences. What I hear underneath what you're saying is actually the same important value I have, and wrote about in the book with Afua, which is: in the world I want, in this beautiful, equitable world where everyone has their needs met in the ways that best meet their own needs, technology is there in service to our lives, not that we are bending our lives in order to make technology work, right? And maybe in that beautiful, equitable world there are people who have a technology, is it an app? Is it called AI anymore? Whatever it is, that says, hey, Tony, don't forget today is the day you write the letter to your dad. It's Friday; you always write it on a Friday, or whatever, right? And make sure that you do it, because it makes you feel happy to write that letter. Maybe that's true. Maybe in that world there are some people who have a tool that helps them remember to do that. But what's important to me, and what I think I hear is important to you, is that technology is there because we need it and want it, and that it is working in the ways we need and want it to work, not that our lives are influenced and shaped in order to adjust to the technology. Yes, I am saying that. I am concerned that our changing is beyond our recognition. We don't see ourselves becoming less creative. And I'm not even only concerned about myself. I can write a letter, and I think at 90 I'll still be able to write a letter.
But there are folks who are infants now, and those yet to be born, for whom artificial intelligence is going to be so much more robust, so much more pervasive, in ways that we can't today imagine, I don't think. Yeah. And what are those humans gonna look like? I don't know. Maybe they'll be better humans. Maybe they will. I'm open to that, but I like the kind of humans that we are. I'm open to the possibility that there'll be better humans. But what will their human interactions be? Will they have thoughtful conversations? Will they have human moments together that are not artificially outlined first, and maybe even worse, constructed for them? I don't know. But some of my concern is about those of us who are living now, across the generations, less so for older folks, because their interactions with artificial intelligence are fewer. If you're no longer working, your interactions with artificial intelligence may be nonexistent. And I think it's natural that as you're older, you're less likely to be engaging with the tools than if you're in your twenties, thirties, forties or fifties. Well, my very human reflection on today's conversation is that it is usually the case that we start talking about any type of technology topic and you constantly interject that I need to be practical, I need to give recommendations, I need to explain how to do things. And I appreciate and welcome you joining me over here in theoretical land, about the impact of technology broadly across our work, across our missions, across our communities, across our future. Welcome to my land, Tony.
I have appreciated this one-time opportunity to let go of the practical, tactical advice. I hope listeners had some thoughts, had some reactions; truly, email me any time. But I hope that, if nothing else, it was an opportunity for folks to witness, or kind of listen in on, and maybe talk to yourself in your own head about, a conversation about what these technologies can be and what we need to think about with them. Because in any technology conversation, I think it's most important to talk about people. That's the only reason we're using these tools, right? People made them; people are trying to do good work with them. So talking about people is always most important, and I hope folks take that away from this whole long hour of AI. Thank you for a thoughtful human conversation. Yes, Amy Sample Ward. They're our technology contributor and the CEO of NTEN. And folks can email me, Tony at tonymartignetti.com, with your human reactions to our human conversation. Thanks so much, Tony. It was so fun. My pleasure as well. Thank you. Next week, a tale from the archive. If you missed any part of this week's show, I beseech you, find it at tonymartignetti.com. We're sponsored by Donorbox. Outdated donation forms blocking your supporters' generosity? Donorbox: fast, flexible and friendly fundraising forms for your nonprofit. donorbox.org. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. The show's social media is by Susan Chavez. Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with us next week for Nonprofit Radio, big nonprofit ideas for the other 95%. Go out and be great.

Nonprofit Radio for June 10, 2024: Future-Proof Your Nonprofit With Apps, Tools & Tactics

 

Jason Shim & Meico Marquette Whitlock: Future-Proof Your Nonprofit With Apps, Tools & Tactics

Jason Shim and Meico Marquette Whitlock return from the 2024 Nonprofit Technology Conference, with their annual collection of tech to help you manage your tech. From collaboration to inbox management, from transcription to hidden Zoom tools, this panel will help you find greater balance and efficiency. Jason is at the Canadian Centre for Nonprofit Digital Resilience and Meico is The Mindful Techie.

 

 

 

 

 

I love our sponsors!

Virtuous: Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving.

 

Donorbox: Powerful fundraising features made refreshingly easy.

View Full Transcript

Hello and welcome to Tony Martignetti Nonprofit Radio. Big nonprofit ideas for the other 95%. I'm your aptly named host and the pod father of your favorite hebdomadal podcast. Oh, I'm glad you're with us. I'd suffer with jejunoileitis if I had to digest the idea that you missed this week's show. But first, we have a listener of the week: Sharry Smith. Or it could be Chary. It's either Cherry or Sherry. I'm gonna say Sherry, but it could be Cherry. Sharri Smith from Portland, Oregon. Sharri gave us a shout-out on LinkedIn when she was listing her favorite podcasts for nonprofits. Shari, Shari, Shari, Shari. Thank you. Thank you very much for doing that. You had a couple of other podcasts listed. Yeah, they're good, you know, I know them. Hm. OK, they're good. But Nonprofit Radio is on the list. That's the one you want. So Sharry, Cherry, thank you very much, listener of the week. Thank you so much, Sherry. Here's our associate producer, Kate, with what's going on this week. Hey, Tony. Congratulations, Sherry. This week we have Future-Proof Your Nonprofit With Apps, Tools and Tactics. Jason Shim and Meico Marquette Whitlock return from the 2024 Nonprofit Technology Conference with their annual collection of tech to help you manage your tech. From collaboration to inbox management, from transcription to hidden Zoom tools, this panel will help you find greater balance and efficiency. Jason is at the Canadian Centre for Nonprofit Digital Resilience and Meico is The Mindful Techie. On Tony's Take Two: chatty gym guy. We're sponsored by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer and marketing tools
you need to create more responsive donor experiences and grow giving. virtuous.org. And by Donorbox. Outdated donation forms blocking your supporters' generosity? Donorbox: fast, flexible and friendly fundraising forms for your nonprofit. donorbox.org. Here is Future-Proof Your Nonprofit With Apps, Tools and Tactics. It's a pleasure to welcome back Jason Shim and Meico Marquette Whitlock to Nonprofit Radio. Jason is Chief Digital Officer at the Canadian Centre for Nonprofit Digital Resilience. How can we harness technology to make a difference in the world? That's the question Jason loves to explore with organizations. He's on LinkedIn and the centre is at ccndr.ca. Meico Marquette Whitlock is The Mindful Techie. He's a workplace well-being strategist who helps mission-driven professionals prioritize their well-being so they can elevate their well-doing. He's also on LinkedIn and his practice is at mindfultechie.com. Jason and Meico, welcome back to Nonprofit Radio. Thanks for having us. Thanks for having us. This is a tradition. We didn't get to talk at the Nonprofit Technology Conference proper, but we're filling in now with your annual review of apps, tools and tactics. And it's actually quite appropriate, I believe, because we're recording on May 1st. It's May Day in much of the world, not celebrated so much in the US and Canada, celebrating labor organizing and labor rights. However, in the US and Canada, nonprofit workers certainly have a right to the apps, tools and tactics that will make their work more balanced and productive. So I'm sure you both agree with that, right? We can come to terms on that. So why don't we proceed. Let's go alphabetically by first name. So Jason, you go first; you get to introduce the first apps, tools and tactics. Yeah. So thank you, Tony.
So before we jump into the tools, one thing that we definitely keep in mind when talking about tools is the concept of tiny gains. Having the perspective that one to the power of 365 is one, but 1.01 to the power of 365 is about 37.8. That notion of improving bit by bit, 1% each day, is how we look at these tools: one tool isn't necessarily going to solve all the things, but we're going to be sharing various tools that can help nudge things 1% at a time. So the incremental growth is valuable, of course. Yeah, totally. So the first tool that we want to highlight for folks is a tool that is actually built into Microsoft Word already, but folks may not necessarily be aware of it in as much detail. And that's that there's actually a built-in transcription tool right in Microsoft Word. There's a little microphone icon, and you can dictate directly into Word, but you can also upload files into it. That's a feature that's not as well known: you can upload up to five hours per month per user into the software for automated transcription that is bundled in with your Word online. And where is this microphone found? Maybe I've seen it a thousand times; I just haven't noticed it. Where do we find the microphone? Typically it'll be in the menu bar in the top right-hand corner of Microsoft Word online. So if you're accessing it in the browser, it should be in the top right-hand section. OK. We had a session, a panel, at NTC about tools that you already have that you may very well not be using.
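Jason's tiny-gains arithmetic is easy to check for yourself. Here's a quick sketch (ours, not from the show) that compounds a 1% daily improvement over a year:

```python
# Compounding "tiny gains": improve 1% a day for 365 days.
# 1.00 ** 365 stays at 1, while 1.01 ** 365 grows to roughly 37.8.

def compound(daily_rate: float, days: int = 365) -> float:
    """Return the overall growth factor after `days` of `daily_rate` improvement."""
    return (1 + daily_rate) ** days

print(f"No improvement: {compound(0.00):.2f}")
print(f"1% a day:       {compound(0.01):.2f}")
```

The same logic applies in reverse: getting 1% worse each day (a rate of -0.01) shrinks you to about 0.03 of where you started, which is the other half of the tiny-gains argument.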
And they covered uh all of the office, office 365 has a lots of, lots of value in there that, that people don’t know about. And also Google, a lot of very uh free Google tools. Um Yeah, we, we, they focused on Microsoft 365 and, and Google. Um interesting, you know, they had like a dozen things that, that are, that you’re already paying for or getting for free and you’re just not, you, you just don’t know that they’re there, they’re hidden. So, OK. So consistent with that is the uh the transcription tool. OK. So up to five hours, you said up to five hours a month. Yes. And it’s, it’s super handy if you’re doing, you know, things like uh interviews or if you’re doing um uh uh yeah, the meeting notes, you know, those types of things, you know, some other tools already have, you know, the transcription part built in. But in the specific, you know, use case where, you know, you may be doing, uh say uh a program interview or something like that, that this is an additional functionality that’s uh that’s in there. So you’re uploading the audio file. Yes. All right. Interesting. I know, I know a guy who does a podcast. Uh II I believe he already has a transcription process in place. But uh I have to re evaluate because uh maybe this is, maybe this one is simpler. Uh And I don’t know if he’s paying for the uh that transcription process. I, I can’t recall what he’s, what he’s been doing for the past several years around transcription. But uh I’ll have him look into it. Miko. Welcome back, Miko. Good to see you. Good to see you. Thanks for having me. My pleasure. Congratulations. Also on your new book, uh which you and I will be talking about in a couple of weeks. We’ll, we’ll get you back on exclusively for uh how to thrive when work doesn’t love you back. Thank you. I appreciate it. Yes, I know you and I will be talking in a couple of weeks. Um What’s next? What’s next on our uh hit parade of uh apps, tools and tactics. 
Well, so if you want to stick on the theme of things that people aren’t using, I would add Zoom to this category. And if I could, I wanna walk through just a few things that I think are pretty interesting in terms of developments with, with Zoom. So similar to Microsoft Office and Google workspace. Um Zoom is one of those platforms or tools that many organizations are using. Many folks have at least a basic paid subscription at the organizational level. And um these are tools that are constantly evolving. And so sometimes because we’re using them all the time, we’re not aware that there are certain features that have been added. Um So I’ll, I’ll share, I’ll just go through these really quickly. So the first is for lots of organizations, you know, it’s important that when people are identifying themselves, important, people identify pronouns. And so one of the ways people have been doing that is they modify the name in the way that that shows up and they add their pronouns. Uh But now for folks that didn’t know Zoom actually has a dedicated pronoun field, so you don’t have to do that. So um if your administrator has enabled this in the web-based um login for Zoom, um You can set that by default for your, for your own profile. Um And you don’t have to update your name when you pop into zoom to do that. It’ll your, your pronouns will appear automatically. So I think that’s one cool feature um where, where that’s appropriate and where that uh makes sense for folks to, to look into that uh another option. And this is about accessibility. This is also about making sure that um we recognize that people learn and process differently. And so you and I are talking uh we’re able to see each other um Right now as we’re recording this interview. Uh But for some folks, maybe they process differently and they actually need to be able to see the, the captioning, they maybe need to see some form of a transcript to be able to follow along and process information. 
And so for that, there is automated captioning built into Zoom now. It's not 100% perfect, but it's there. It's essentially computer-generated captioning. There are a lot of languages covered by default in your basic paid subscription, and similar to the pronouns field, your administrator, or whoever is managing your Zoom account for your organization, can log into the web-based portal and enable this feature if it's not already enabled. What this allows folks to do is, if someone is in a meeting and they need access to closed captioning, that particular individual can enable it for themselves. And folks who don't need it can leave it turned off, or if it's turned on by default, people can turn it off. Completely up to you as the user in terms of what you need. I'll share one final thing and then toss it back to you, Tony. So one final thing that I think is pretty cool. Many of us have probably seen the news, where they're doing the weather report and you have this person standing in front of a screen, pointing here and pointing there. Generally what's happening is this person is standing in front of a green screen and their technical team is projecting the images, so it looks like this person is actually pointing to a map of, you know, fill in the blank, wherever you are, right? Well, you can actually simulate that when you're presenting. Many of us are used to traditional screen share, where you share your slides or your screen and you appear in a box in a different place in Zoom. But you can also share your screen in such a way that you are overlaid on top of your slides, so that you have that weatherperson effect as well. Now, the caveat here is you have to be aware of how this is showing up: your lighting.
In some cases, you might actually need a green screen for this to be effective. And it's gonna require you to format your slides or presentation a bit differently, because obviously you're now taking up a slice of the screen in addition to the content on the slides. But it's a really interesting way to create a different type of experience for folks that you are meeting with or doing a webinar with. I think you can have some fun with that, like at the beginning of a webinar: you're immersed in your slides, you know, I'm surrounded by my valuable content. But I don't know, can you control where you are, or how much of the frame you get in proportion to your slides behind you, or whatever the content is? Let's just use slides. Yeah, like, I'll be in the lower left, so you would format your slides so the lower left is always blank. Can you do it that way, choose where you are? You would have to test this out and try it out. There are some limited features that allow you to do this directly in Zoom. The other is thinking about physically positioning yourself in reference to the camera: you think about where you're going to physically be, and try that out, test it out for yourself. The final thing I'll say is there are third-party tools that work with Zoom that do this better. But nonetheless, I just wanted to point out that Zoom does have this built-in feature, at a basic level, for folks that want to try it out and have some fun with it. OK. So pronouns can be automatically populated, and you don't have to go and change your screen name. There's the automated captioning. And then what is this background feature called? It's called Set PowerPoint as Virtual Background. OK. It's aptly named: Set PowerPoint as Virtual Background. All right.
Are you guys familiar with, or recommending, any Zoom alternatives? I know there's Teams, of course, although, I don't know, 95% of the meetings I'm in are on Zoom. Are either of you finding alternatives to Zoom for any reason, or something that other folks are using that's valuable? So what comes to mind, well, this isn't necessarily an alternative to Zoom for the video conferencing itself, is a tool called OBS. If folks are looking for advanced video streaming capabilities, OBS actually sits as an additional layer on your video feed, so that you can further customize some of the green-screen effects, or move yourself into a corner, like Meico described. It gives a ton of functionality as well, and it sits in between your video feed and something like Zoom or Teams or Google Meet and allows you to do all sorts of things. You could directly do a picture-in-picture type thing if you want to do an advanced video broadcast. OBS is used quite a bit by streamers and such, but it's definitely a tool for folks looking to explore some more advanced functionality. It's called OBS, and it's available free. Oh, OK, cool. And live streaming? Live streaming, yes. And not just for live streaming: if we're on a Zoom call that isn't necessarily being live streamed, you can activate it and it will help you control some of the outputs of your video feed that way. So, yeah, OBS stands for Open Broadcaster Software. So essentially, to build on what they are sharing,
going back to the example of the weather person: if you wanted to create a highly produced and polished presence that doesn't look like a typical Zoom or Teams meeting screen or presentation, you could add lower thirds; you can change a whole host of things. There are just so many different cool things you can do using what Jason was describing. And to pick up on your original question around alternatives: in my awareness, Zoom, Google Meet and Teams are probably the top three, and there's a good reason for that. There's broad compatibility across platforms and devices, and there's some built-in trust that organizations have in those big three. That said, I think it's also a good point to make, when we think about the expansion of tools, that we don't have to use the latest and greatest. Everything doesn't have to be high tech, right? There's room for mid tech and low tech, and when it comes to meetings and collaboration, sometimes we forget what happened when we didn't have these tools, right? We picked up the phone; we met in person. We had conference lines where people called in and we couldn't see each other. You know, when I first started working in the tech space, I worked with colleagues and managed teams of people that I actually never met in person. And we would have phone calls and conference-line conversations and meetings, and that was the primary way we communicated and collaborated. And we're seeing, in terms of digital wellness, that there's actually a benefit to that, right? Because we're spending too much time in front of our screens. There's research that shows that it increases cortisol levels and increases stress levels.
And over time, too many of those back-to-back, video-mediated types of collaboration actually reduce engagement and reduce productivity, and in some cases can be counterproductive. So we want to be able to find that balance, and also recognize that, hey, depending on what your intention is and what outcome you're trying to get to, sometimes just having an old-fashioned phone call or working asynchronously can be just as effective, or sometimes more effective, to get where you're trying to go. Thank you for that. That's a valuable reminder. Yeah, because, unlike an in-person meeting with several people, you can feel like you're being stared at. You're not, but people are looking at their screen. They're not necessarily looking at you, but it looks to you like everybody's looking at you, and I can see why. Interesting that cortisol levels rise after a few hours of this, I guess. Absolutely. And what you just described is a very real phenomenon. So part of it is the self view, right? You seeing yourself on screen and being self-conscious about that. And you can turn this off in Zoom. This is another feature: you can turn off self view. If you click on the three dots on your particular image, there's an option that should allow you to turn off your self view, if that's an issue. The other thing: others are still seeing you. You're not stopping your video, but you're stopping your own self view. OK. And one final consideration here is that there's research from Stanford University that actually shows that this particular phenomenon you just talked about is compounded for women, because we have different expectations about how women are presentable and show up on screen.
And oftentimes there's this unassessed cost that we just take for granted: that in some cases women have to do more work in order to be what we deem presentable, right? So if you're requiring folks to be on camera all the time, that sometimes is one of the side effects, if you're not aware of that. Excellent. Yeah, valuable reminders. Thank you. It's time for a break. Virtuous is a software company committed to helping nonprofits grow generosity. Virtuous believes that generosity has the power to create profound change in the world and in the heart of the giver. It's their mission to move the needle on global generosity by helping nonprofits better connect with and inspire their givers. Responsive fundraising puts the donor at the center of fundraising and grows giving through personalized donor journeys that respond to the needs of each individual. Virtuous is the only responsive nonprofit CRM designed to help you build deeper relationships with every donor at scale. Virtuous gives you the nonprofit CRM, fundraising, volunteer, marketing, and automation tools you need to create responsive experiences that build trust and grow impact. Virtuous.org. Now back to Future Proof Your Nonprofit With Apps, Tools and Tactics. Jason, you have something else for us. I know you do. That's a rhetorical question. When I'm talking to Meico and Jason, we can go on for hours. It's a purely rhetorical question. Absolutely. So the next step, along the lines of the transcription we talked about earlier, is some of the text-to-speech tools. They've developed quite a bit in the last few years, and there are a few out there that I'll rhyme off: Natural Reader, ElevenLabs, and Narakeet. In particular, I've used Narakeet before, and it's super helpful when you need to record voice greetings.
And if anyone's ever experienced having to record a voice greeting, say for a voicemail, and you have to do like 20-some-odd takes, something like Narakeet can help streamline that. Now, the special thing is that they have a lot of voice models that are available in different languages as well, so it really nails some of the accents around the world. Given that I live in Canada, when recording some of the greetings for an organization, it needs to be delivered in English and Canadian French, and they have specifically Canadian French models, as well as many other language models for the text to speech, so we're able to provide a script in both English and French and it'll read it off. You're able to get it in one take, rather than having to do 10 or 20 takes, or trying to get multiple people to coordinate around the recording. So that's an example of a tool that can help streamline in that regard, and there are many other potential use cases. That one was called, is called, Narakeet. Yes, like parakeet with an N. And also Natural Reader and ElevenLabs. Sorry, what's the last one? ElevenLabs? ElevenLabs, yeah. And similar to the text to speech, and also editing audio files and text, there are a few different tools that do this, but the one that I'll name specifically with this functionality is Descript. So Descript is a tool that can help with things like, let's say you have an audio file that you've uploaded to Descript: it can do filler word removal. What it does is it imports an audio file and then produces a transcript for you, and you can edit it like a word document. It'll detect things like if you say "um" or "ah," and it can automatically help you remove some of that.
So let's say you added a few extra words when you were speaking: you can edit the text in Descript and it'll automatically remove it from the audio file, without having to splice the audio file itself and look at the waveforms. You can edit the transcript and it'll give you a clean audio file afterwards. That to me is, that's incredible, that you can edit it as text and then it goes and applies those edits to the audio file? Yeah, it's good. I don't know, to me that's amazing. Descript, it's called, D, the letter D, script. Yes, D-E. Yeah. And there's additional functionality in Descript called Overdub. Let's say you were recording something and you left out a word: Overdub can actually fill in the word for you, if you've trained a voice model to do it. So let's say I intended to say an extra word or a phrase or something: similarly, you can enter in the word that you intended to say and it'll fill it in for you so that it sounds seamless. So for something like a podcast, or whatever other function folks may be looking to accomplish, it really helps streamline some of the audio editing process. Rather than having to rerecord an entire section, and splicing and placing and everything, you can just type in the word and it'll drop it in for you, in your voice. So it learns from the rest of the file how to pronounce the word or words that you've just inserted into the text file? You may have to do some training on it, but it does use AI voice cloning to replace some of the audio there. Damn. And I've used this, I've used it for my podcast.
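Descript's internals aren't public, so here's only a toy sketch of the general idea behind text-based audio editing: each transcript word carries start/end timestamps, so deleting words from the text tells you which spans of audio to cut. The filler list and timestamps below are made-up examples, not anything from Descript.

```python
# Toy sketch of text-based audio editing (the idea behind tools like
# Descript), NOT the product's actual implementation. Each word has
# timestamps; dropping filler words yields the audio spans to keep.

FILLERS = {"um", "uh", "ah"}

def keep_segments(words, fillers=FILLERS):
    """words: list of (text, start_sec, end_sec) tuples.
    Returns merged (start, end) spans of audio to keep."""
    spans = []
    for text, start, end in words:
        if text.lower().strip(",.") in fillers:
            continue  # cutting this word cuts its audio span
        if spans and abs(spans[-1][1] - start) < 1e-9:
            spans[-1] = (spans[-1][0], end)  # merge adjacent keeps
        else:
            spans.append((start, end))
    return spans

transcript = [
    ("So", 0.0, 0.3), ("um", 0.3, 0.6), ("the", 0.6, 0.8),
    ("tool", 0.8, 1.2), ("uh", 1.2, 1.5), ("works", 1.5, 2.0),
]
print(keep_segments(transcript))
# -> [(0.0, 0.3), (0.6, 1.2), (1.5, 2.0)]
```

An audio library could then concatenate just those spans, which is exactly the "edit the text, get a clean audio file" experience described above.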
My podcast producer uses this, and it's saved us so much time. As a matter of fact, Jason was one of the folks we interviewed for the first season, and we used this software to clean up our episode with Jason. Yeah, that's incredible, because I do some of that work in my own post production, but I'm using Audacity. Like you said, Jason, I'm looking at the waveforms. It's eminently doable, you just have to make sure you have the right spot, and you play it a few times to make sure you have exactly what you want to take out. But it's not nearly as swift as text editing. And you can't add, unless you go record again, and then the ambient noise is never going to sound the same as it did on the day you recorded. Even sitting here in my same studio office, the ambient sound is different. Wow. I'll add one more thing in terms of how this works. Jason is right in terms of being able to edit the transcript. Let's just say, using that example, I thought Jason gave a long-winded answer, or maybe Jason said something and he's like, oh, actually, I don't want to share that with my employer, can you take this out? Right, we can go in and edit the transcript that way. And then for the fillers, you can set it so that you don't have to go in and manually remove them. You can just tell it which fillers you want to remove and it does it automatically for you. And depending on which subscription level you have, you can do not just fillers, but maybe there are specific keywords that are significant to you and your audience, or to how you want to edit, that you want to remove. You can train the software to remove those specific things automatically. Oh, that's very robust. That's remarkable, I think. Descript. OK, cool. Meico, what's next? All right, so I want to talk about collaboration tools and training tools.
So I do a lot of training for organizations, and a lot of it I do virtually. Google Jamboard is a digital whiteboard that I used, and still use, but unfortunately Google is winding down that particular offering. They're getting rid of it as of this fall. So I want to talk about a few alternatives for folks that either have been using Google Jamboard and are looking for alternative whiteboard tools, or maybe you haven't been using whiteboards and you just want to come to the party, you want to be a part of all the fun. So I want to give you three really quickly. The first is FigJam. This is a whiteboard tool by the company Figma. Figma is in the space of, think of Adobe, for example: a lot of designers use their UX tools, and web developers use their UX tools, to design products and design websites. But they have this separate product, FigJam, which is specifically for whiteboarding. One of the ways that I use whiteboarding tools is if I have people doing an exercise where maybe we're brainstorming together. The same way that when you're in a room in person, you give people stickies and they stick them on the wall or sort them into different buckets, you can do the same thing, but virtually. So FigJam is one of those tools. Then you have Miro, which is another tool in this bucket, and I know that the NTEN team actually uses this a lot. When Jason and I served on the board at NTEN, that was a tool we used a couple of times as part of our collaboration and strategic planning process, to be able to do that virtually. And then the final tool, going back to Zoom for a moment: Zoom has a whiteboard feature. So for folks that weren't aware of that and aren't using it:
Zoom has a whiteboard feature, and tied to that, Zoom also has an annotation feature where you can annotate things on the screen, both as the presenter and as a participant. In my presentations, I sometimes have questions on the screen, or a scale, and I'll have participants use the annotation tool to indicate where they are on the scale. People can write on the screen, or in this case, if you want to use the Zoom whiteboard feature, you could do it that way as well. But those are three alternatives to Google Jamboard: FigJam, Miro, and Zoom Whiteboard. And they all allow all the participants to contribute to the whiteboard? Yes, they all allow that, with the caveat that all three have the basic whiteboard functionality, but they also have some distinct reasons why you might want to use one over the other. Just picking on Miro for an example: from my perspective, Miro has a very steep learning curve. So as a trainer, I probably would not use Miro in a training where folks haven't been together before and haven't used the tool before. But if you're using it over the long term and you're able to train people on some basic things, so you use it with your team over time, then Miro is a great tool for that. FigJam and Zoom Whiteboard are a bit more intuitive. So depending on your audience, you want to take those things into consideration. If you're using the Zoom whiteboard, which I've never used, are you just collaborating with your mouse? Is that how you contribute to the whiteboard, your mouse? Your mouse, your stylus, or, I believe, just like with Google Jamboard. The way that Google Jamboard works is you can drag and drop text boxes and type in the boxes. So that's one option as well.
Jason, I'm not sure if you have familiarity with this or other thoughts about the use. Yeah, I believe folks would drag their mouse and they can type in, and it has a lot of parallel features to Jamboard and Miro, although lighter, for sure, on that front. OK, cool. All right, it's your turn, Jason. Yeah, so the next one that comes to mind is Minimis Launcher. So it's Minimis, and what it is is an alternative launch screen for your phone. It launched initially on Android, and I believe the iPhone version is now out as well. What it does is help you get control over addictive apps. So if folks are finding that they're in a loop or cycle of constantly checking their phone, this is one of those apps that can help make a dent in trying to break that cycle a little bit. What it does is actually limit and change your initial phone screen, and give you primary access to the apps that you need for work or basic functions. But if you do want to access something like social media, it will prompt you and ask you, are you sure that you would like to do this? And you click yes. And then it will actually prompt you again: you're absolutely sure? And then you go to another screen and it'll say, OK, now give a rationale as to why you would like to check your social media. So it puts additional barriers up. And then when you do move through it, it'll allocate you 15 minutes, to time box it, so that you're not stuck in that loop of looking down at the screen and then looking back up and, oh my gosh, an hour has passed.
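The friction pattern Jason describes, double confirmation, a written rationale, then a 15-minute time box, can be sketched as a bit of gating logic. This is a toy illustration, assuming a flow like the one he narrates, and not Minimis Launcher's actual code.

```python
# Toy sketch of the "friction" pattern described for Minimis Launcher:
# two confirmations, a required rationale, then a time-boxed session.
# Hypothetical logic for illustration only, not the app's real code.

SESSION_MINUTES = 15

def request_app_access(confirmed_once, confirmed_twice, rationale):
    """Return minutes of access granted (0 means access denied)."""
    if not (confirmed_once and confirmed_twice):
        return 0  # user backed out at one of the "are you sure?" prompts
    if not rationale.strip():
        return 0  # no rationale given, so no session
    return SESSION_MINUTES  # time-boxed session, not open-ended scrolling

print(request_app_access(True, True, "checking event RSVPs"))  # -> 15
print(request_app_access(True, False, "just bored"))           # -> 0
```

The point of the time box is the return value: even a fully confirmed, well-justified request gets 15 minutes, never unlimited access.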
So, really, a tool that can help regulate some of the instincts that may be triggered by some of those addictive algorithms that keep feeding content that may keep us hooked to a phone and social media. This is like a mother looking over your shoulder, or your own conscience being awakened. Are you absolutely sure? And then you get a time limit. Even when you're absolutely, positively, 100% sure, there's still a time limit. Absolutely. That's Minimis, Minimis Launcher. Yes. OK, cool. These are really fascinating. I mean, so there are tools that can help us. We just need to be conscious, Meico, this is right aligned with your practice, we just need to be intentional and conscious about our desire to be more productive, to be less distracted. I mean, you're the mindful one. Absolutely. I think the underlying thing here, both personally and professionally, is being clear about what your overall intention is and what is the outcome that you're driving for. Being clear about those things is going to, number one, help you determine which tools are the best tools for you to use right now. And as I mentioned before, sometimes the latest high tech tool isn't the best tool. There's also the mid tech and low tech option. Exactly, going to the phone. Exactly. So those are options as well. The other consideration is that not only are intention and clarity about outcome important for the reasons I just stated, but also because you have to remember that, particularly for for-profit entities, a lot of companies don't necessarily always have your best interest in mind. What I mean by that is that they have to generate time on screen. They have to sell ads, and so their incentive is slightly different.
Yes, maybe they want to provide a useful product, but they want you to use that product in a certain way or for a certain amount of time so that they can increase the share of the revenue or profit that they're making. And when you are aware of that, it becomes easier for you to identify the ways in which you might want to recapture your time and recapture your attention, using something like what Jason just shared in terms of the Minimis Launcher. It's time for a break. DonorBox: open up new cashless, in-person donation opportunities with DonorBox Live Kiosk, the smart way to accept cashless donations anywhere, anytime. Picture this: a cash-free, on-site giving solution that effortlessly collects donations from credit cards, debit cards, and digital wallets. No team member required. Plus, your donation data is automatically synced with your DonorBox account. No manual data entry or errors. Make giving a breeze and focus on what matters: your cause. Try DonorBox Live Kiosk and revolutionize the way you collect donations. Visit donorbox.org to learn more. It's time for Tony's Take Two. Thank you, Kate. In the gym. I like to go to the gym and do my work. I work out on the elliptical and then I go on the floor. I do a bunch of planks. I have to get some upper body work in; I haven't done that yet. I like to just get the work done, you know, take my time, not rushing, but I like to get through the work. And it's my time. There's a guy, and I know more about this guy's life, I've learned over the past many months, that his wife had a stent when they were on vacation in Florida. And the surgeon said it's a good thing you're not in the Caribbean, because the medical care wouldn't be as good and she'd end up with an infection. She had to have a stent. She was suffering shortness of breath in Florida. I know where it was, it was Miami. They were in Miami. It's a good thing
they weren't in the Caribbean, the surgeon says. The guy's boat, he's having motor problems with his boat. Now his boat is leaking oil. So there's this guy with the boat, and the wife needs a stent, and the boat needs a repair to the oil line. He went to an air show last week. Cherry Point is local, well, not that local, but it's within a half an hour or something. It's a Marine Corps air station. He went to the Cherry Point air show over Memorial Day. The Blue Angels were there. I heard all about the show, like the guy was narrating the show, but he's not even talking to me. He's talking to somebody else. But, you know, it's a community gym. It's not that big. It's certainly adequate, but it's not huge. It's not a 10,000-square-foot gym. So, you know, you overhear people. So I hear him. I got the narration of the Blue Angels air show, personalized for us in the gym. So, this chatty guy. But are you the chatty guy? Oh, don't be the chatty person, the guy or gal. Don't be the chatty person. You know, I'm not watching who he's talking to, so I don't know if they're suffering, or maybe they're just being polite, condescending a little bit, but he gets his oratory out. Don't be the chatty person in the gym. Do your work. It's nice to say hi, that's different. But you don't need to narrate the air show for everybody who didn't get to go over Memorial Day weekend. And the motor, with the oil line, with the testing, and you use soapy water to spray on the line to find the leak. I know more about motorboat mechanics now than I've known in my entire life, learned in the past couple of weeks.
I got a short course in outboard motor maintenance and mechanics and troubleshooting. Chatty guy. Don't be the chatty person in the gym. Just do your work. Just do your work. That's Tony's Take Two. Kate? It's exhausting. Just recounting it is exhausting. You meet some of the funniest people in your gym. First with the birthday guy and now with the motor guy. Yeah, Tim. Tim was the one with the birthday. This is a different guy. This is not Tim. Yeah, I don't know the characters. Well, we've got buku but loads more time. Let's return to Future Proof Your Nonprofit With Apps, Tools and Tactics, with Jason Shim and Meico Whitlock. Meico, I'm going to ask you about one that you have in your email signature. We talked about this, I remember, either last year or the year before, and I clicked on it as we've been scheduling together: Inbox When Ready. So I'm imposing one on you. I know this is not on your list, because we talked about it a couple of years ago, or last year, but please reacquaint us with Inbox When Ready. OK, so Inbox When Ready is one of my all-time favorite tools. I use Gmail, and for folks that are Gmail users, this is a free plug-in that you can essentially install into Gmail, and it only works on the desktop. I want to make sure people understand that: it's for when you're doing it from a web browser on a computer. The idea is that you're able to hide your inbox after a certain amount of time, or you're able to set it by default, so when you log in, your main inbox is hidden, so that you aren't sucked into the rabbit hole of responding to emails when your intention might be something completely different. As an example, maybe I was looking for the Zoom link for today's meeting, and as opposed to logging in and seeing, oh, I have 50 new emails,
what if I don't see any emails and I can just go to the search bar, type in "Tony Zoom link," and get directed straight to that email? I'm more likely to follow through on that and not be distracted. One of the other interesting things, too, is that we know from behavior change that being able to see metrics that actually show us, for example, how long we've been engaged in a certain behavior, or how long we've actually been in our inbox for a day, sometimes that awareness, like, oh my God, I spent six hours with my inbox open, would be quite alarming for a lot of people, right? And so this particular tool allows you to see how much time you're actually spending in your inbox in Gmail on a web browser. That can sometimes be a catalyst to help you shift behavior, if your desire is to actually spend less time in your inbox. Now, the beautiful thing: since this tool came out, both Gmail and Microsoft Outlook, which I would say are the two biggest email providers in the organizational space, have introduced new tools, including the ability to snooze your inbox, to temporarily pause incoming emails for a certain period of time. And there's a host of other ways in which you can manage your time in your inbox that you can use to supplement or actually replace Inbox When Ready. But I've been using Inbox When Ready for so long, and it works for me, so it's one of my go-tos. The idea of pausing your inbox, I mean, it seems so simple, but I never thought of it until you said it. There's just this simple functionality, like maybe I just don't need to see the incoming messages.
There's another simple thing that I think it was you guys who shared with me years ago, which I did, which is just turn off the notifications. I use Apple Mail, so for me it's a little red circle that has the number of unread messages in it. Just turn that feature off. Let the email sit there in the dock for me, or the toolbar for others, and you don't have to be prompted that you've got 15 unread messages. It's anxiety producing. Yes. And one of the things folks don't realize, and I think this happens in both the Android and the Apple device ecosystems, is that when you're downloading an app, sometimes we're in such a hurry that we don't read the pop-ups that let us know what's happening. So we just click yes, agree, yes, agree, yes, agree, because we want to get to the app. And generally, what you're saying yes to and agreeing to, in addition to sharing all your good data, is all the notifications, right? The alerts, the badges, and so on and so forth. And so the particular feature you're talking about is the badge notifications, where, and it could be for email, it could be for Facebook or Instagram, it shows you, on the app, a little red dot, on the Apple device, for example: oh, you have 2,000 unread Facebook messages, or you have three new likes on Instagram. And for many people, that's a source of background stress every time you pick up your phone. So I recommend for folks that, unless there is a compelling reason, like you're some kind of first responder doing important work where you have to know the minute someone sends you one of those things, because if you don't, someone's going to die, and for most people that's not the case, right? So, turn that off.
Turn that shit off. Absolutely. And you're absolutely right, when you download a new app, they ask all those questions. Also, location: share your location. Why do you have to share your location? Yeah, right. Google Maps needs my location. That's really about it. My bank does not need my location. I can deposit the check without it knowing what my IP address is, or what my coordinates are. So, yeah, don't. Right, mindful, mindful. You're the Mindful Techie. That's right. You're aptly named as well. There are a lot of aptly named things here. Yeah, right. When you get the app, you're anxious to get to the thing. Take a breath and read. You don't need all the notifications and alerts. OK, let's stick with you, since I imposed one on you. I took your chance, so you had one teed up. Go ahead. All right, so I'm going to share one that I think is a favorite for both me and Jason, so much so that I think we probably share it in virtually every presentation, because it's just such a phenomenal tool. It's called Toby, and it is a browser tab organization plug-in that is, I think, pretty much cross-browser at this point. I know that you can get it in Chrome and Firefox, and probably a couple of the other browsers. The idea here is, you know, we routinely have meetings where you have to open a gazillion tabs that are relevant to that particular meeting, and for many people, you may be doing that manually. So you have a meeting with Tony and you're like, OK, I've got to pull up the podcast, got to pull up this, got to pull up this, and you're scurrying around as the meeting is starting, opening up all these different tabs. What Toby allows you to do is essentially create collections of tabs, and you just press one button and it opens all those tabs automatically, right?
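The "named collection, one button, all the tabs open" idea can be sketched in a few lines with Python's standard `webbrowser` module. This is not Toby, and the collection name and URLs below are made-up examples; it just illustrates the pattern.

```python
# Toy sketch of the "tab collection" idea behind Toby: name a set of
# URLs once, then open them all with one call. Not Toby itself; the
# collection and URLs are hypothetical examples.
import webbrowser

COLLECTIONS = {
    "podcast-planning": [
        "https://example.org/episode-doc",
        "https://example.org/guest-notes",
        "https://example.org/zoom-link",
    ],
}

def collection_urls(name):
    """Look up a named collection (raises KeyError if it doesn't exist)."""
    return COLLECTIONS[name]

def open_collection(name):
    """Open every tab in the collection in the default browser."""
    urls = collection_urls(name)
    for url in urls:
        webbrowser.open_new_tab(url)
    return len(urls)  # number of tabs opened

print(collection_urls("podcast-planning"))
```

Sharing the `COLLECTIONS` data with a team, which Toby does as a built-in feature, is what turns this from a personal bookmark list into a standardized meeting setup.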
And one of the interesting things with Toby, in comparison to the built-in bookmarks in many browsers, is that you can share the collections with your team. So let's say, Tony, you have a humongous podcast staff and a central set of tabs that you all have open during your planning meetings. You, or someone else on your team, could create a Toby collection, share it with everyone, and then everyone has access to the same collection. It's sort of standardized, and people also have the ability to create their own collections or customize on their own. So it's one of those things, going back to what Jason was saying about 1% better: that ability to save those few moments, and to save the mental stress we go through at the start of a meeting, adds up tremendously over time. This is another good point, yeah, a good point you made before, too: accumulated background stress. Even the anxiety of seeing the badge with the 2,000 unread Facebook messages or something. I didn't get to it, oh, I'm so far behind, and then it becomes pointless to do it. The number keeps increasing, but you're so far behind you may as well let it go, yet it's causing more anxiety, more agita. All right. Toby. So that falls under, like, browser tab management, browser management. OK, Toby. Jason? Yeah, so another tool I want to chat about, this may be something that is available to folks that they may not necessarily be using, but I just want to draw it to folks' awareness. For those that may already have an O365 subscription through their organization, Bing Copilot has a commercial data protection feature flipped on.
And what that means is that when folks are using some AI tools, there may be a concern that the data is being used to train the model, or that you don't necessarily know whether it's going to end up popping up somewhere else down the road. The commercial data protection feature actually assures you that it won't be used for the training of the model, and it'll be contained while you're using it. So if you are using it, you just have to be logged in to your O365 account at bing.com/chat, and then there'll be a little green shield in the top right-hand corner that says "protected." And so you're able to use it and rest assured that the data you're putting in there isn't being used to train a language model, for the folks who are using the generative AI function. So what it is, is that it's similar to things like GPT or Google's Gemini, but in this particular instance, the little green icon in the top right assures you that it stays contained to your organization. OK, so it's like firewalled off from the generative AI learning. Mhm. Yeah. And as a general tip for folks, for the other tools they may be using around generative AI: make sure you're checking the settings and the fine print. If you are using one of the free options as well, there may be settings in there to turn off the use of your data for training data models, for folks that may like to be a little more secure with their privacy and what they may be putting out there.
There's also a concern when folks give one of these tools their own data to learn from. Like, "I want you to write a letter to a donor in my tone," so you upload some of your own letters in your prompts. You need to be very aware of what you're providing for that learning, because I don't know that it's keeping your data only to this conversation, this purpose that we're going back and forth about, or whether it uses your data, which would otherwise be proprietary to you because they're your letters, for some larger purpose. Absolutely. And that's something to be aware of when flipping on some of the features in these programs. You may be prompted to flip on an AI feature, and in the fine print, or even not-so-fine print, it may say something along the lines of "this will submit your data to a third party," just giving you a heads up. But there's a lot of additional subtext there. OK, after it's submitted to the third party, what happens? Is this going to be used to train the language model? Is this going to pop up somewhere potentially in the future, or is it going to stay contained and not be used to train the model? Those are questions worth asking as more and more AI features pop up along the way. What third parties? Any third party in the world. I think the broader point I would make here, too, and this applies to social media, to anything you post on a website: technology has evolved to a point where essentially there's a forever memory, right?
Even if you take stuff down, it's still there, still findable. It's somewhere, right? So I always say, particularly to younger folks: don't share anything, don't text anything, don't post anything on Facebook, Twitter, Instagram, Snapchat, or TikTok that you would be embarrassed to see on the news. If you'd be embarrassed to see it on the news, then don't post it. Don't share it. Very wise, sage advice from the Mindful Techie. Let's do one more each. Miko. All right. So we know that the volume of information has continued to increase, and it's virtually impossible for us, from a human perspective, to keep up with all of that. In this case, one of the ways that AI can be very powerful is by helping us summarize voluminous documents, videos, and so on. One of the tools we shared during our session is a plugin for Chrome called YouTube Summary with ChatGPT & Claude. ChatGPT and Claude are two different AI tools, and one of the interesting things about this YouTube Summary with ChatGPT & Claude tool is that it allows you to summarize YouTube videos, web articles, and PDF documents. I don't know about you, but I don't have time to watch your two-hour training on fill-in-the-blank topic. Maybe I just want the CliffsNotes: give me the bullet points, and let me dive deeper on the points that are most relevant to my particular question or curiosity at that moment. With this particular tool, you just plug in a URL and it gives you a summary to help you really focus your time. Maybe you determine that, hey, maybe this video doesn't have what I need, or maybe it does, and I want to look at the last third and dive a little deeper.
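The summarize-long-content pattern Miko describes (feed in a long transcript or article, get back the key points) can be sketched in a few lines. To be clear, this is not the extension's actual code: the real plugin hands the content to ChatGPT or Claude. The naive extractive stand-in below, which just takes the first sentence of each chunk, is purely illustrative so the sketch stays self-contained and runnable.

```python
# Toy sketch of the long-content summarization pattern:
# split a long transcript into chunks, "summarize" each chunk,
# and collect the results as bullet points. A real tool would
# send each chunk to an LLM; here a naive extractive stand-in
# (first sentence per chunk) keeps the sketch dependency-free.
import textwrap


def chunk(text: str, size: int = 500) -> list[str]:
    # Break the transcript into roughly size-character chunks.
    return textwrap.wrap(text, width=size)


def naive_summarize(chunk_text: str) -> str:
    # Stand-in for an LLM call: return the chunk's first sentence.
    first = chunk_text.split(". ")[0].strip()
    return first if first.endswith(".") else first + "."


def bullet_summary(transcript: str) -> list[str]:
    # One bullet point per chunk of the source text.
    return [f"- {naive_summarize(c)}" for c in chunk(transcript)]
```

Swapping `naive_summarize` for a call to a hosted model is the only change needed to turn this toy into the pattern the extension implements; the chunk-then-summarize structure stays the same.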
But it can be a powerful tool to save you lots of time, particularly with lengthy videos or lengthy documents. Say the name of the tool again. It's a long one. It's called YouTube Summary, it's all one title. Yeah, I'm just reading the title that they gave us: YouTube Summary with ChatGPT & Claude. OK. Yes. Thank you. Jason. Yeah, the last one I'll share is more of a mental model than a tool: to really think about things on a two-by-two grid. The model is known as the Eisenhower Matrix. When thinking about the various tasks one has to do, and we shared a whole bunch of tools that help automate and speed things up, it's also worth looking at the tasks themselves and really taking a look at what's urgent and what's important. On the two-by-two grid, on one axis you would have important and not important, and on the other axis you would have urgent and not urgent. Ideally, you're spending your time on the not urgent and important things, because that's where you can make a lot of your long-term impact. The things that pop up that are urgent and important, those are important for you and your organization, and that's where you make short-term impact. And when you think about things that are not urgent and not important, you kind of have to ask, well, why are we doing them? So we want to make sure these tools aren't automating things that are not urgent and not important. Automating the not urgent and not important means spending more time and resources on things that aren't important or urgent; those things just need to be eliminated.
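As a quick illustration (not something from the session itself), the two-by-two grid Jason describes maps cleanly to a tiny classifier. The suggested actions for each quadrant, do, schedule, delegate, and eliminate, follow the common reading of the Eisenhower Matrix; the sample tasks are made up.

```python
# A minimal sketch of the Eisenhower Matrix as a task classifier.
# Quadrant-to-action mapping follows the common reading of the
# matrix: do what's urgent and important, schedule important work,
# delegate (or automate) urgent busywork, eliminate the rest.

def eisenhower(urgent: bool, important: bool) -> str:
    if urgent and important:
        return "do now (short-term impact)"
    if important:
        return "schedule (long-term impact)"
    if urgent:
        return "delegate or automate"
    return "eliminate"

# Hypothetical tasks mapped as (urgent, important):
tasks = {
    "grant report due tomorrow": (True, True),
    "strategic plan draft": (False, True),
    "routine data entry": (True, False),
    "junk mail triage": (False, False),
}

for name, (urgent, important) in tasks.items():
    print(f"{name}: {eisenhower(urgent, important)}")
```

The point Jason makes fits the third branch: the tools discussed in this episode mostly serve the "delegate or automate" quadrant, while nothing in the "eliminate" quadrant should be automated at all.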
So the email inbox is a very good example of not urgent, not important. Most of it, obviously there are exceptions, but most email is not urgent and not important. Mhm. Yeah. Things like junk mail, or checking through a lot of social media: they can be distractions and time wasters. And then the other quadrant is the urgent and not important part. That's where a lot of the tools can help automate, where you're essentially delegating out to the tool to help accelerate some of that. So I think this is a really great conceptual framework as folks are looking at their tasks and matching them up to the tools they're using. And I know Miko speaks to this really knowledgeably in his trainings, around how it can be used effectively, so I'll throw it over to Miko if there's anything additional to add on this front. Yeah, Jason, I think you're spot on. The key here, in the context of the tech tools we're talking about: we shouldn't just be using the tools for the sake of using them. You want to be clear about the purpose, at least in the organizational context. And I think it's critically important, as you think about urgent versus important, to also think about what resources we have and what capacity we have. When I was a communications director, one of the common pieces of wisdom from some folks when social media was emerging was, oh, you've got to be on all the platforms. Well, that was impossible. We didn't have the staff or the resources, nor did it make sense, because our audience wasn't on all the platforms.
So ask yourself: what resources do I have? What time and capacity do we have? What goal are we trying to achieve? What's good enough for now versus what can we build toward for later, or perhaps eliminate altogether because it's simply not relevant, even though everyone else is doing it, or the conventional wisdom says we should be doing this or be on this particular platform? And I think we can apply that to AI right now, right? I am of the mind that it's not all or nothing. Not every organization needs to be using every single type of AI tool out there. It's simply inappropriate in some cases, and in some cases it can actually be counterproductive. So you want to be clear about what outcome you're trying to get to, and how the tool can support you and help you, not how the tool can replace you being a critical thinker who's actively involved in the process. Yeah, loss of creativity is my biggest concern about artificial intelligence use: that we could cede some of the most creative things that we do. Listeners have heard me talk about this; we had a couple of AI panels. The first one was with Afua Bruce and George Weiner and Beth Kanter and Allison Fine, and I aired my concerns there in more detail, about giving away the most creative things we do, and over time becoming less creative thinkers, less thoughtful thinkers, less critical thinkers. All right, why don't we leave it there? And Jason, I'm just going to reiterate: it's the Eisenhower Matrix, which I've followed for years. I try to think that way, though I don't do it routinely. That two-by-two you were describing is the Eisenhower Matrix. You said it; I'm just reiterating it for folks.
Because it's a very sensible way of planning. And Miko, to your point earlier, it's been around for generations. Old tools can still be valuable. Absolutely. And I would offer to your audience, Tony: I've reworked this for the mission-driven context, and I've annotated it, giving sort of a road map of how you work through it in a practical way. So if folks are interested, they can email me, or we can give you a link to put in the show notes, however it makes sense, if folks want access to that annotated version. OK. The annotated version of the Eisenhower Matrix. All right. That's Meico Marquette Whitlock. He's the Mindful Techie. You'll find him on LinkedIn, and his practice is at mindfultechie.com. Jason Shim, Chief Digital Officer at the Canadian Centre for Nonprofit Digital Resilience. He's also on LinkedIn, and the centre is at ccndr.ca. Jason, Miko, thank you. Thanks very much. A real pleasure each year. Thank you for sharing. Thank you. Thanks for having us. Next week, we'll take a hiatus from our NTC coverage with Gen Z Career Challenge. If you missed any part of this week's show, I beseech you, find it at tonymartignetti.com. You're going to be interested in the Gen Z... you're Gen Z. That's me. Gen Z Career Challenge. We'll see if it holds true for you. We're sponsored by Virtuous. Virtuous gives you the nonprofit CRM, fundraising, volunteer, and marketing tools you need to create more responsive donor experiences and grow giving. virtuous.org. And by Donorbox. Outdated donation forms blocking your supporters' generosity? Donorbox: fast, flexible, and friendly fundraising forms for your nonprofit. donorbox.org. Our creative producer is Claire Meyerhoff. I'm your associate producer, Kate Martignetti. This show's social media is by Susan Chavez. Mark Silverman is our web guy, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with us next week for Nonprofit Radio. Big nonprofit ideas for the other 95%. Go out and be great.