Amy Sample Ward & Afua Bruce: The Tech That Comes Next
Social impact orgs, technology developers, funders, communities and policy makers can all do better at technology development, argue Amy Sample Ward and Afua Bruce in their new book, “The Tech That Comes Next.”
Listen to the podcast
Get Nonprofit Radio insider alerts! I love our sponsors!
Turn Two Communications: PR and content for nonprofits. Your story is our mission.
Fourth Dimension Technologies: IT Infra In a Box. The Affordable Tech Solution for Nonprofits.
We’re the #1 Podcast for Nonprofits, With 13,000+ Weekly Listeners
Board relations. Fundraising. Volunteer management. Prospect research. Legal compliance. Accounting. Finance. Investments. Donor relations. Public relations. Marketing. Technology. Social media.
Every nonprofit struggles with these issues. Big nonprofits hire experts. The other 95% listen to Tony Martignetti Nonprofit Radio. Trusted experts and leading thinkers join me each week to tackle the tough issues. If you have big dreams but a small budget, you have a home at Tony Martignetti Nonprofit Radio.
View Full Transcript
Hello and welcome to Tony Martignetti Nonprofit Radio, big nonprofit ideas for the other 95%. I'm your aptly named host of your favorite abdominal podcast. Oh, I'm glad you're with me. I'd bear the pain of pseudagraphia if I had to write the words you missed this week's show: The Tech That Comes Next. Social impact orgs, technology developers, funders, communities and policymakers can all do better at technology development for greater equity, argue Amy Sample Ward and Afua Bruce in their new book, The Tech That Comes Next. On Tony's Take Two: heading to the Holy Land. We're sponsored by Turn Two Communications, PR and content for nonprofits. Your story is their mission: turn hyphen two dot c o. And by Fourth Dimension Technologies, IT Infra In a Box, the affordable tech solution for nonprofits. tony-dot-M.A.-slash-Pursuant D, just like 3D, but they go one dimension deeper. It's my pleasure to welcome Amy Sample Ward, returning. She's the CEO of NTEN and our technology and social media contributor. She's at amysampleward.org and at Amy R S Ward. And to welcome Afua Bruce. She is a leading public interest technologist who has spent her career working at the intersection of technology, policy and society. She's held senior science and technology positions at DataKind, the White House, the FBI and IBM. She's at afua underscore Bruce; Afua is A-F-U-A. Together they've co-authored the book The Tech That Comes Next: How Changemakers, Philanthropists and Technologists Can Build an Equitable World. Their book is at thetechthatcomesnext.com. Amy, welcome to Nonprofit Radio. Thanks [00:02:26.04] spk_1:
for having us. [00:02:27.35] spk_2:
I’m glad to [00:02:28.07] spk_1:
hear what you [00:02:29.25] spk_0:
think both of you for the first time. Very nice to meet you. Glad to have you. [00:02:34.51] spk_2:
I’m so excited to be here. [00:02:53.06] spk_0:
Thank you, excited. That's terrific. You may be more excited than I am. I don't know, but I know I'm very excited. I'm very pleased. I already said I was pleased; excited is even better than pleased. Thank you. Uh, let's start with you, since people know Amy Sample Ward's voice. Um, I feel like we should start with a definition of technology, the way you two see it. [00:03:45.79] spk_2:
Absolutely. Technology can mean many things to many different people, and when people simply hear the word technology, it can conjure hope for the future and assistive devices that may transform our world, but it can also bring up feelings of trepidation and confusion. And so in the book, when we talk about technology, we define it very broadly as the tools that exist to help us exist in the world. Um, and so this can be anything from digital systems and websites and, like, AI, for example, but it's also more basic things such as, you know, paper or other tools that are just used. And so we define it extremely broadly in the book. The book does focus on digital technologies, though, really looking at adoption and use and development of digital technologies, especially as it relates to the social impact sector. [00:04:07.28] spk_0:
and what what troubles you about our relationship to technology? [00:04:36.38] spk_2:
Um, well I am an engineer, a computer engineer specifically. And so I love technology. I love being in technology. I love doing all sorts of things with technology. I love designing new ways to use technology and figuring out how to design technology to support new ways of interacting that we have. I think one of the things that [00:04:41.22] spk_1:
does [00:05:34.13] spk_2:
give me pause, though, is how some see technology, or some try to position technology, as the be-all and end-all, the magic solution that we could have to solve all of our problems. That if we simply find the right technology, if we simply insert technology into any societal problem that we're facing, that technology will magically fix whatever we have been facing. And that's simply not true. Technology is not a natural phenomenon. It is something that we create. It's something that we should be intentionally creating to minimize bias, to make sure that technology is developed and used in inclusive ways and really does enhance what we want to do as humans, which is hopefully live well together in community. Um, and not just be used as some big tool to force, um, different, often disproportionately impacting outcomes. [00:05:45.54] spk_0:
and you have a lot to say about development, specifically more equitable development.
Yes, absolutely. Um, I think equitable development of technology is something that can and should continue to grow. I think historically, especially when we look at the past several decades of the rise of digital technologies and technology more broadly, the power, the money, the education has been concentrated in one group, and a lot of other groups, and that includes a lot of historically underrepresented or overlooked communities, um, based on ethnicity, based on gender identity, based on sexuality, based on ability, physical ability, mental ability or more, um, have really been left out or forgotten about. And so when we talk about a more inclusive design process and a more inclusive development process for technology, we're talking about, one, being more inclusive about who is actually allowed in the room when we talk about technology design. So who do we see as capable of being technologists, um, and who has those abilities to engage that way. But also recognizing that because technology does not exist alone, because technology doesn't exist in a vacuum, because technology can't magically solve all of our problems on its own, even if you're not a technologist, you should be at the table in some of these design conversations, because you are part of communities that have needs, and those needs should be articulated at the start of the design process. You might understand a particular subject matter. I think in the book we talk about using technology in the education space, in the food space, in other spaces as well. You may have some of that knowledge that is critical to making sure that the technology supports the overall goals of those sectors. And so it is important that as we think about being inclusive in developing technology, we make space for not just different types of people who are able to be technologists, but also different types of expertise that we need in that development process. [00:08:09.00] spk_0:
So you're not so pleased with the model where rich, privileged white males develop technology, identify what's going to be solved and how best to solve it. I assume that that model is not working for you. [00:08:27.26] spk_2:
I would say I would go even further than [00:08:30.25] spk_0:
going out and [00:08:30.92] spk_2:
it’s not working for most of us. Um so it is not working for most of us to have the power concentrated in that [00:08:52.64] spk_0:
way. Okay. And in fact, uh, someone, see, I don't know who wrote which sentences, but somebody wrote, "We can't continue to perpetuate the belief that those with the most money know best." I don't know, maybe your editor put that in. It may not even be one of the two of you. I don't know. Maybe [00:08:57.13] spk_2:
I [00:08:58.76] spk_0:
trust [00:09:36.43] spk_2:
me, Amy and I spent many, many hours on many, many aspects of writing and editing to make sure that what is in the book, we both stand behind. And so absolutely, that sentence is something that I think we both stand behind. Um, we can't let, as you said, we can't let one small population, in this case rich, privileged white men, be the ones who design all of the technology and decide all of the outcomes for everyone. We really need to, and in the book we talk a lot about how it's so important, why it is so important, to go back to communities, communities who understand their needs, who understand their priorities, and let communities drive that process. That would then include, um, policymakers. That then includes funders, that includes, um, technologists themselves, and that includes [00:09:53.21] spk_0:
uh [00:09:54.53] spk_2:
the leaders and employees at social impact organizations. [00:09:58.85] spk_0:
Another aspect of it is just, what problems get solved? What gets attention? [00:10:06.18] spk_2:
Absolutely. And um I think we have lots of ideas on this, but I have been talking for so long. Um I would love to pass it. [00:10:29.88] spk_0:
Amy Sample Ward will get their chance. Okay. Um, alright, if you insist, Afua. All right. Um, okay, if we have to go to Amy now. All right. Uh, you say, somebody wrote this sentence, uh, exactly related to what I was just saying: We dream of community centered work that builds from community centered values. And there's a lot of emphasis on going back to values. Um, why don't you, uh, just sort of introduce us to some of those values, Amy. [00:14:05.53] spk_1:
Sure. Happy to. Um, I think that, you know, one thing we say in the book, and we've enjoyed getting to talk to a lot of groups about it since the book has come out, is that everything we do as people is centered on values, but oftentimes we don't talk about them, we don't make sure that our values are aligned when we start working on something. And so then those values become assumptions, and I think we've all heard many different puns about what happens when you operate on assumption. Right. And so that's kind of part and parcel of also assuming that the only people that can make technology are people with certain degrees, or that have a certain amount of money, or that, you know, look a certain way. Um, again, those are values that we're not talking about and that we need to talk about, so that we can be really intentional about what we want to focus on. Um, and in the book, you know, Afua has already been speaking to some of those values: that there's an important role, and we need to prioritize lots of different lived experience, as an important part of any technology project. Um, that a lot of different people should be involved in every single stage of that process, not, like, at the end, once we build something, and we, like, pull the little cover off and are like, ta-da, we built it, what do you think? There should be no pulling the cover off, you know, everyone should have already been part of it and known it was being built this whole time. Um, but also values that I think are important to kind of name early in the conversation, around accessibility. So much of the barriers and the walls around technology projects that are there, you know, again, whether people are talking about them explicitly or not, that are maintaining this false reality that only certain people can be involved, are coming from a place of saying, oh, we speak a certain way, we use these acronyms, we talk about things without slowing down for other people to be involved. So what does accessibility look like? Not just that a tool could be used with assistive devices, but really that you are not using jargon, that you're making sure things are being held at the time of day when those folks that you want involved can be there. Um, that child care is provided at your user group meetings, you know, every level, that you are operating in ways that really do make things accessible to everyone. Um, and I think another value that we like to talk about early in conversations is the book is kind of a big idea, like, what if the world was not the world we have right now, what if it was equitable and just and wonderful. Um, and I know you want to talk about the illustrations, colorful, uh, you know. So to get there, it's not like two steps. It's not, okay, that'll be on, like, the 2024 plan, right? It's a lot of work. And so technology, and the relationship and expectations we put on it, just like social change, are that we can make incremental, right now, immediate changes, and at the same time we can be working on really big changes, the shifts that get us to a very different world. We have to do both. We can't just say, well, let's live with harmful technologies and harmful realities until we can all of a sudden just change over to the, like, non-harmful one. Um, you know, we need to make changes today as we're building for bigger change. [00:16:31.93] spk_0:
It's time for a break. Turn Two Communications. They have a bi-weekly newsletter that I get. It's called On Message, and they had something interesting in the last one. It was five ways to find the timely hook, and I've talked about news hooks with them. That can be a great opportunity for you to be heard when there is some kind of a news hook. So how do you find these timely hooks? A couple of their ideas: track recognition days and months. I just did that in August, it was National Make-a-Will Month, and I did a bunch of content around that. And, you know, there are days and months for everything, like Pickle Day and a lot of others. So you can search for the recognition days and months, find something that fits with your work. Another one was just staying current with the news. They said they were gonna send their e-newsletter on the day that Queen Elizabeth died, but they thought better of it, because you're not gonna be able to get people's attention. People are just gonna be deleting emails more rapidly because they're consumed with the death of the queen. So they held off a day or two. Um, and tying to a trend is another one that they suggested. Uh, and they give the example of when, um, including salaries in job postings was trending, and they used the example of somebody who actually wrote contrary to that idea, but it was timely because it was something that lots of people were talking about. So there's a couple of ways of identifying the hooks. You can get their newsletter, On Message. Of course, you go to turn hyphen two dot c o. Turn Two Communications. Your story is their mission. Now back to The Tech That Comes Next. How is it that technology is not neutral? Amy. [00:16:36.71] spk_1:
well, [00:16:37.09] spk_0:
humans, humans, [00:17:26.74] spk_1:
I don't think humans have the capacity to be neutral, and we are the ones creating technology. I mean, even before digital technologies, you know, the number of, um, pieces of farm equipment that could be considered technology. You know, humans built those, and they kill people who are left-handed, because the tool was built by right-handed people to be used with your right hand, right? Like, there's not a lot of evidence that humans can be neutral. And so then you add to that that we're building it with an often very small group of people, not talking about values, for something that is meant to be, you know, used in a different context with different people. It just doesn't have the capacity to be neutral. Let's [00:17:43.78] spk_0:
take something that's so ubiquitous, it's an easy example. Let's take Facebook. So somebody's Facebook is there, you can use it or not use it. How is that not just a neutral entity sitting there for you to use or not use, [00:19:31.46] spk_1:
I mean, you are welcome to use or not use Facebook, but just because you have the choice to use something or not use it doesn't mean it's neutral. The platform is collecting your data, is selling your data, is deciding whether and how you can use the tool to connect with other people or to create groups, right? It is not allowing you control over how your data and use of the platform goes. So it's kind of a false choice, really. Um, and for a lot of people, it is very much a false choice. There isn't the feeling that they could choose not to use it, if it's the only quote unquote free tool that they could use to find certain resources or to otherwise, you know, talk and stay in communication with certain people. But at what cost, you know? And I think that's the kind of conversation we're trying to spark in the book: technology isn't neutral, we just accept that, and then we say, and so at what cost, at what harm, are people having to make these choices around how they navigate technology? And we have never presupposed in this book or in our lives that Facebook or any other platform is going to necessarily make the choices that are best for the community, and that's why policymakers have an entire chapter in the book. You don't need to be a tech specialist or have a, you know, technical background to be a policymaker that's making smart, protective policies for users. We need to say, hey, people should be able to access and protect and restrict their data, let's make some policies around that. Right? Because the platforms are not going to make that policy themselves, that restricts them. Um, and so I think, again, all of these different groups together get us to the tools that we need, and not just the technology developers themselves, [00:19:56.28] spk_0:
Afua, anything you want to add to that? Uh, my question about why Facebook is not a neutral tool. [00:20:50.18] spk_2:
I think Amy gave a really good overview as to why technology, and Facebook in this case, is not neutral. I think, um, you know, a lot of people now you'll hear say, the algorithm made me see it, the algorithm made me see something, and that just also goes to the fact that someone has programmed the algorithm. Someone has decided what will be given more weight or what will be given less weight, what will be emphasized, what won't be emphasized. And so that then drives your interactions, and the biases that the programmers have, or the stated goals that the owners of the platform have, then get encoded into the technology that you use, whether it's Facebook or any other platform, and that then can affect how you interact. Even if you do decide to opt in to using the technology, as Amy mentioned, you always, well, not always, but you often have a choice as to which technology you want to use, what platform you want to log into, whether you want to engage with it or not. But once you're there, your choices are often limited in ways you might not realize, because of the fact that technology is not neutral.
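To make that weighting point concrete, here is a minimal, hypothetical sketch of an engagement-ranking score. The signal names and coefficients below are invented for illustration only; this is not Facebook's or any real platform's algorithm, just a picture of how programmer-chosen weights decide what gets emphasized.

```python
# Hypothetical illustration: programmer-chosen weights decide what a feed emphasizes.
# Signal names and coefficients are invented for this example.

POST_WEIGHTS = {
    "comments": 3.0,   # someone decided heated discussion gets boosted
    "shares": 2.0,
    "likes": 1.0,
    "reported": -5.0,  # someone decided this penalty, too
}

def rank_score(post: dict) -> float:
    """Score a post by summing its weighted engagement signals."""
    return sum(post.get(signal, 0) * weight for signal, weight in POST_WEIGHTS.items())

posts = [
    {"id": "a", "likes": 120, "comments": 4, "shares": 1},
    {"id": "b", "likes": 15, "comments": 60, "shares": 20},
]

# Sorting by this score is the "the algorithm made me see it" moment:
# change one weight and a different post rises to the top.
for post in sorted(posts, key=rank_score, reverse=True):
    print(post["id"], rank_score(post))
```

Change one coefficient and a different post wins, which is the sense in which "the algorithm" always traces back to choices people made.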
[00:21:20.02] spk_0: We're getting into the idea of oppressive systems, which the book talks about. Afua, you wanna explain? So, Facebook may very well be an example, but what's oppressive systems, generally? [00:23:01.71] spk_2:
You know, I think one of the underlying themes of our book is that technology can really be used to enhance goals and to sort of enhance missions, and we argue in the book that we want, you know, social impact organizations, especially communities, to find ways for technology to enhance their mission, to help them accomplish their goals more often. But the reality is that technology, again, sits on top of people, because it's created by people. And so to the extent to which, um, there are oppressive systems in society, whether that's around how people get jobs or access education or access other resources, um, that can then just be translated into the technology systems that then help facilitate our lives. It's the same principle: if you have different sort of outdated policies that have been rooted in unequal access, for example, and you just take those policies and write code that directly reflects those policies, the new system, this technical system you've created, has those same oppressive aspects in that system. And so again, when we talk about designing technology, it does need to come back to, what communities are we designing for? Are we talking to them? Are we letting communities really drive that work? And through the development process, are we really keeping in mind some of the historical context, some of the social context, some of the knowledge about biases and how that appears in different technology, and what ties that has to how organizations function and how policymakers do their work, um, what we need to be funding to make sure that we have the time and the money to invest in a more inclusive process. [00:25:40.82] spk_1:
I just want to add, as Afua was talking about that, um, and kind of trying to, like, hear our own conversation while we're in it, and to share the reminder that, while of course, like, Facebook is this giant, huge technology platform, um, we are also talking about technologies that nonprofits make. You know, an organization that decided to have their staff, or hire a web designer, to help build something on their website that allowed users to complete their profile or to donate on their website. All of these things that organizations are doing with technology is also developing technology, right? It also needs to be inclusive. It should also have a lot of your community members and users part of that process the whole way, right? This isn't just for for-profit giant tech companies to hear this feedback. This is everyone, including the way we fund our own technology inside of organizations, the way we prioritize or build, or don't prioritize or, you know, don't build, technology. And when we think of it that way, you know, it's just so easy, I think, to say, oh my gosh, Facebook is an oppressive platform, all of these things are horrible, it's done all of these things, we could search for news articles from a decade of issues, right? But that kind of shifts the attention, um, and acts like we as organizations don't have any blame to share in that. Not that we're sharing in Facebook's blame, but, like, we too are part of making not great decisions around technology, you know. Um, there's an organization, I experienced this as a user on their website and had to give them some feedback. They collected demographics as you're creating your profile, super common to do, right? Um, their race and ethnicity category, for, like, all humans that would answer this category, only had four options total. Of all of the races and ethnicities in the world, there were four. Not one of those options was multiracial, not one of those was other, let me tell you the thing you didn't list here, right? You had to pick, required question, with four radio [00:25:52.25] spk_0:
buttons. [00:26:29.98] spk_1:
That is harmful, right? Like, and maybe there was a good reason, not a good reason, maybe there was a reason: that you felt, you know, your funder makes you report in those four categories. I totally understand how hard it is to, like, manage your work as well as meeting all these funder reporting requirements. That's something we talk about in the book, that is an issue, we need to go fix funders' reporting requirements. But just because a funder says give us data in these four categories does not mean those are your four categories, right? You have an obligation to your community to be better than that. Um, and so I just want to name that as an example that we're not just taking the easy route of complaining about Facebook, which I would love to do for, like, five more hours.
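As a concrete sketch of the alternative Amy is describing, here is a small, hypothetical example; the option list and bucket names are invented, not a recommended taxonomy or anything from the book. The idea: collect race and ethnicity as an optional, multi-select question with self-describe and prefer-not-to-answer options, and collapse to a funder's narrower categories only at reporting time.

```python
# Hypothetical sketch: collect race/ethnicity inclusively, map to a funder's
# reporting buckets only when the report is generated. Option and bucket names
# are invented examples, not a recommended taxonomy.

FORM_OPTIONS = [
    "Black or African American",
    "Asian or Asian American",
    "Hispanic or Latino/a/x",
    "Indigenous or Native American",
    "Middle Eastern or North African",
    "White",
    "Multiracial",
    "Prefer to self-describe",   # paired with a free-text field on the form
    "Prefer not to answer",
]

# The funder's four required reporting categories (example only).
FUNDER_BUCKETS = {
    "Black or African American": "Category 1",
    "Asian or Asian American": "Category 2",
    "Hispanic or Latino/a/x": "Category 3",
    "White": "Category 4",
}

def to_funder_report(selections):
    """Collapse an optional, multi-select answer into the funder's buckets."""
    if not selections:
        return "Not reported"  # the question was never required in the first place
    mapped = {FUNDER_BUCKETS.get(choice, "Other/multiple") for choice in selections}
    return mapped.pop() if len(mapped) == 1 else "Other/multiple"

print(to_funder_report(["Multiracial"]))                             # Other/multiple
print(to_funder_report(["White", "Indigenous or Native American"]))  # Other/multiple
print(to_funder_report([]))                                          # Not reported
```

The funder still gets its four columns; the people filling out the form are never forced into them.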
[00:26:41.44] spk_0: No, Facebook is not even, Facebook is not even, [00:26:43.71] spk_1:
you know what I mean? Also trying to name it as something we’re doing inside our organizations to [00:28:58.50] spk_0:
your example reminds me of the example you cite from Jude Shimmer, who says, you know, she's filling out a donation form, they're filling out a donation form, and there's no Mx option. It was Mr, Mrs, Miss, I guess. No Mx. Um, by the way, you had several Nonprofit Radio guests quoted in the book: Jason Shim, Steve Heye, Jude. So I'm glad Nonprofit Radio brought these folks to your attention, you know, elevated their voices so that you became aware of them, because you would not have known them otherwise. Well, that's elevating voices. That's exactly, exactly right. It's time for a break. Fourth Dimension Technologies. Technology is an investment. Are you seeing this? You're investing in staff productivity, you're investing in your organization's security, donor relations, because you're preserving giving and all the actions and all the person's preferences and their attendance and things. So you're certainly investing in your donor relationships, uh, in your sustainability, because technology is gonna help you preserve your mission into the future. So I don't want to just throw something out and then not explain it. So see technology as an investment. Fourth Dimension can help you invest wisely, so, uh, make those savvy tech investment decisions. You can check them out on the listener landing page. Just like 3D, but, you know, they go one dimension deeper. Let's return to The Tech That Comes Next. All right. So let's bring it, all right, so, no, as I said, Facebook is not mentioned in the book. I was choosing that as a ubiquitous example. But let's bring it to something that is nonprofit created. Who wants to talk about it? I kind of like the John Jay College case, because I used to do planned giving consulting for John Jay. Who, which of you knows that story better? Nobody [00:30:43.41] spk_2:
Looking at each other's resumes, but I will, I will hop in and talk about the John Jay College example. So just briefly, for folks who might not have read the book or gotten to that section of the book yet: um, John Jay College, an institution in New York City, had recognized that they had a lot of services geared towards making sure people finished their freshman year and started their second year, but not as many services geared towards, um, making sure people then ultimately graduate. And so specifically, they had noticed that they had a large number of students, or a not insignificant number of students, who completed three quarters of the credits they needed to graduate but didn't ultimately complete their degree and graduate. They partnered with DataKind, which is an organization that provides data science and AI expertise to nonprofits and government agencies. Um, so they worked with those data scientists to really understand their issue, to look at the 20 years of data that the academic institution had collected. The data scientists ran about two dozen models, I think it was, and ended up developing a specific model, a specific tool, for John Jay College to use that identified students who were at risk of dropping out, and potential interventions. The John Jay College staff then made the final determination as to what intervention would be done and how that would be done. And two years after this program was started, John Jay College credits the program with helping an additional 900 students graduate. Um, and so that is, I think, you know, one of the examples that we're talking about, of really the technology coming together with the subject matter experts, really being used to enhance the mission, and then, really again, technology and humans working together to make sure that the outcomes are best for everyone.
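The book describes the John Jay and DataKind work at a high level rather than in code, so purely as an illustration of the general technique, not their actual model, here is a minimal risk-flagging sketch. The features (GPA, fraction of credits completed), the tiny training set, and the threshold are all invented for the example.

```python
# Minimal, hypothetical sketch of a dropout-risk flagger in the spirit of the
# John Jay / DataKind example. Features, data, and threshold are invented here;
# the real project evaluated roughly two dozen models on 20 years of college data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training rows: [GPA, fraction of credits completed]; label 1 = did not graduate.
X = np.array([[3.6, 0.95], [2.1, 0.70], [3.2, 0.80], [1.9, 0.75],
              [3.8, 0.90], [2.4, 0.72], [2.9, 0.78], [3.4, 0.85]])
y = np.array([0, 1, 0, 1, 0, 1, 1, 0])

model = LogisticRegression().fit(X, y)

# Current students who have completed roughly three quarters of their credits.
current = np.array([[2.3, 0.76], [3.5, 0.77]])
risk = model.predict_proba(current)[:, 1]

# The model only flags; staff decide the intervention and do the outreach.
for student, p in zip(["student_a", "student_b"], risk):
    if p > 0.5:  # arbitrary threshold for the sketch
        print(f"{student}: flag for advisor outreach (estimated risk {p:.2f})")
```

The output of something like this is only a flag list; as the conversation below emphasizes, the human staff still decide what intervention happens and do the outreach.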
[00:31:04.33] spk_0: There's some takeaway there too in regard to ethics, the collection and use of the data. Can you talk about that? Absolutely, [00:31:51.44] spk_2:
Absolutely, absolutely. As we think about data collection, data use, data analysis, I think in general, especially in the social impact space, you want to make sure that you got consent when you collected the data, that you're collecting it in ways that make sense, that you're not necessarily over-collecting, um, you're storing it in the right ways, it's protected in the right ways. Um, and then, as you need to do something with it, you can access it, you can use it as a way to foster communication across different departments. I think one thing that was really exciting in talking to the John Jay College staff is they said this program and that development actually forced conversations across departments, which, if you've ever done any work at an academic institution, you know working across departments on campus can be challenging. And so sometimes the data can force those conversations, and can also help strengthen arguments for the creation or, um, termination of different programs.
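A small, hypothetical sketch of what those practices can look like in code (the field names are invented): record consent at collection time, store only the fields the program actually needs, and refuse to store anything when consent is absent.

```python
# Hypothetical sketch of consent-aware, minimal data collection. Field names are
# invented; the point is storing only what the program needs, with consent recorded.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ParticipantRecord:
    participant_id: str
    program: str
    consent_given: bool
    consent_timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def collect(raw_form: dict) -> Optional[ParticipantRecord]:
    """Store a record only if consent was given, and only the fields actually used."""
    if not raw_form.get("consent"):
        return None  # no consent, nothing is stored
    return ParticipantRecord(
        participant_id=raw_form["id"],
        program=raw_form["program"],
        consent_given=True,
    )

# Extra fields submitted on the form (say, household income) are simply never kept.
print(collect({"id": "p-001", "program": "tutoring", "consent": True, "income": 52000}))
print(collect({"id": "p-002", "program": "tutoring", "consent": False}))
```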
[00:32:15.79] spk_0: Thank you, because ethics is one of, one of your core values, ethical considerations around technology development. And [00:33:23.63] spk_1:
I think, I like that you're bringing that up, Tony, because I think it reinforces, I mean, Afua was saying this, but just to kind of, like, explain those words: when we're saying that technology is there to help humans, it means that algorithm that was created is not moving forward and sending, you know, a resource or sending an intervention to a student. It is not there to do the whole process itself, right? It's there for its portion, and then humans are looking at it. They are deciding, you know, who needs what resources, who needs what intervention, and they then do that outreach, right? Versus that idea that I think nonprofits especially think of all the time, like, if we just got the tool, then this whole, like, thing will be solved, and it'll just, like, somehow run its course, you know, and, like, the robots will be in charge. And that's not great. We don't need to do that. We're not looking for robots to be in charge. But also, in this really successful example of technology being used, it still required people, you know. The technology isn't here to replace them. It's to do the part that we don't have the time to do, like crunch all those numbers and figure those things out, and then the people are doing what people are meant to do, which is the relationship side, the intervention side, the support side, you know. Um, and [00:33:43.70] spk_0:
I just want to kind [00:33:44.49] spk_1:
Of separate the two right? [00:33:46.71] spk_0:
The tool was to flag those who are at greatest risk of not graduating after they have I think three quarters of the points or credits. Uh so so that [00:33:58.99] spk_1:
that [00:34:13.73] spk_0:
right, that's an ideal, that's an ideal, uh, data mining, artificial intelligence task. Just flag the folks who are at greatest risk, because we've identified the factors. Like, I don't remember what any of the factors were, GPA I think was one, but whatever the factors are, identify them, now flag these folks. Now it's time for a human to intervene and give the support to this population, so that we can have 900 more folks graduating than we expect would have without the use of the tool. [00:35:19.70] spk_2:
Yeah, absolutely. And just to continue to build on what Amy was saying, I think sometimes, as nonprofits are considering technology, or maybe hearing pitches about why they should use technology or why they should select a particular technology, it can be overwhelming, because sometimes the perception is that if you adopt technology, it has to then take over your system and remove sort of the human aspect of running your nonprofit. And that's simply not the case. You can always push back as to what those limits need to be, sort of in general, but also very specifically for your organization, for your community. What makes sense? What doesn't make sense? And so really prioritizing, as Amy said, using the technology to do those tasks that are just simply more efficient, that computers are more capable of doing, while you use the humans involved for the more human touch and some of those more societal factors. I think really, um, it's important to emphasize that as leaders of social impact organizations, as leaders of nonprofits, you have that agency to sort of understand and to decide where the technology is used and where it isn't used. [00:36:57.67] spk_1:
Yeah, we were really conscious, when we were working on the book, to disrupt this pattern that, you know, it's like you learn a new word and then you see it in everything that you read. Um, once we talk about it here, you're gonna, like, go, and everything you click on on the internet, you're going to see it. But technology companies have been trying to sell us for a long time, very successfully, that their product is a solution, and technologists are constantly using that language. When you're looking at their website, when they're talking to you, you know, this is an all-in-one CRM solution, this whatever. They are not solutions, they are tools, and as soon as we, as, you know, nonprofit staff, start adopting that they are the solutions, we then start kind of relinquishing the control, right? And thinking, oh, well, the solution is that this tool has all of this. It is just a tool. You are still the solution, right? You are still the human. And we didn't want to have that language in the book. So, you know, we're always talking about technology as a tool, because without humans needing to put it to work, it doesn't need to exist. We don't need to have a world that's trying to make sure we can maintain all of this technology if we don't need it anymore. Thank you for your service, like, please move along. We don't need that anymore. And that's okay. We don't need to feel bad that a tool isn't needed anymore. It's not needed. Great. We have different needs now, you know. Um, and changing that kind of dynamic and relationship inside organizations. [00:37:24.77] spk_0:
A CRM database is a perfect example of that. It's not gonna build relationships with people for you. It's just gonna keep track of the activities that you have, and it's gonna identify people's giving histories and event attendance and help with ticketing, etcetera. But it's not going to build the personal relationships that are gonna lead to greater support, whether it's volunteering or being a board member or donating, whatever, you know. It's [00:37:39.88] spk_1:
not the mission, It’s not the food at the gala. Even if it sold the tickets to the gala right? Like it isn’t at all. [00:38:13.47] spk_0:
So I gathered, so Wiley did most of the writing on the book, is what I gather, because I mentioned a couple of quotes and nobody, like, nobody claimed them. So, um, and also, I see there's only two pictures. I like a lot of pictures in books. You only have two pictures, and then you repeat the same two pictures from the beginning, you repeat them at the end, and they're in black and white, they're not even four-color pictures. So there's a few little shortcomings. [00:38:15.70] spk_1:
that's because in the print book they could only be black and white, but in the e-book they can, the one that's meant to be in color can be in color. [00:38:25.00] spk_2:
And also we knew that our readers have imaginations of their own and the words that we have on the page would evoke such strong images we didn’t want [00:38:33.81] spk_0:
to overly [00:38:34.58] spk_2:
provide images in the book. [00:40:08.82] spk_0:
Very good, well played. Okay, it's time for Tony's Take Two. I'm headed to the Holy Land in November. I'm traveling to Israel for two weeks, and I'm wondering if you have suggestions of something that I should see. We can crowdsource my sightseeing. A few things that are already on my itinerary: of course, the Old City in Jerusalem, um, Haifa and the Baha'i Gardens, the Dead Sea, and, uh, Mitzpe Ramon. You may have some other ideas, things that, uh, you found, or places to eat, maybe. That would be great, little, uh, terrific places that I should try in either Jerusalem or Tel Aviv. I'll be spending a lot of time in those two places, but also near these other ones that I mentioned, too, Haifa. So if you know a good restaurant, eatery, I'd appreciate that too. You can get me at tony at tony-martignetti dot com. I'd be grateful for your Israel travel suggestions, and anything else that you may recommend about Israel travel. I haven't been there, so I'd be grateful to hear from you. That is Tony's Take Two. We've got boo-koo but loads more time for The Tech That Comes Next with Amy Sample Ward and Afua Bruce. Let's talk about another story. Let's talk about, yeah, you all pick one, pick one of your case stories to talk about, that you like. [00:44:39.78] spk_1:
I can talk about one, since Afua already talked about one. But I was thinking, because you already said it earlier, the food sector. So there's one in there on Rescuing Leftover Cuisine, an organization founded in New York. Um, and I think a pretty classic example of nonprofit trajectory: like, someone has personal lived experience they want to address, you know, make sure people don't have the experience they had, and they create an organization kind of accidentally, like, they just start doing the work and they're like, wait, what am I doing? Wait, we've just created a nonprofit, you know? And they kind of want to build, because they start to have success actually doing the thing that they set out to do. Um, but like many nonprofits, you reach the limit of human scale. Like, you get to the, this is only the number of people I can personally talk to, or physically carry food, you know, from one restaurant to a shelter or whatever. Um, and realize, oh, we're gonna need some tools to help us make this thing work, um, and grow beyond just the handful of initial people. And also, like many nonprofits, that was a very reactive process, right? Like, oh gosh, we need a calendar tool, here's one. Oh gosh, we need a, you know, a phone tool, here's one. And not, what is the best, you know, what do we really need? How do we solve these goals? So they found themselves a few years in with, very common in the nonprofit sector, like, a little patchwork, you know, all different kinds of things they've kind of forced together. And often the integration, to use the technical term, the integration between tools was humans: like, answered the phone and then typed it into the tool, because the person on the phone doesn't have access to type it into the scheduler, right? Like, they were having to be the tech integrations as humans, which meant humans were not doing human work, right? Humans were doing work that the robots should be able to do. Um, and that's when they brought in more strategic, dedicated technology staff to help build. And again, what they didn't really realize at first is they were building a product, you know? Um, I think this is a bigger conversation Afua and I have with organizations: we have products, we've built products. It's not bad. And I think, especially in the US, we've come to think that product is, like, a for-profit word and we will have nothing to do with it. But what it just means is, like, it's a package, it is a thing that's doing what it's meant to do, and we should think about how we make sure it works and who can access it, and, you know, we bring some strategy to it. Um, but their process is really what drew us to including them in the book. They had a really inclusive process where all the different folks that were users, so the volunteers who physically, like, went to the restaurant and picked up that food and took it to an agency, the people in the agencies, the people in the kitchen of the restaurants, all those different people were able to say, oh, I wish the tool did this, I wish that I could do this every day when I need to pick up food, I wish I could get this kind of message. Everyone was able to give that feedback and then see everybody else's requests, so that as the staff and community and the tech team prioritized, okay, well, what works together? What can we build next? What's in line to be built next? Everyone had transparency. Everyone could see that, everyone understood, okay, my thing is last, or, like, I know why my thing is last, right?
Like, people could really see and give feedback and be part of the process the whole time. Kind of back to the very beginning of this conversation, where Afua said, even if they were not the technical developers themselves, they had important expertise, right? It was good to know, oh, these five different restaurants all want the same thing, what's happening, right? Like, what is the thing that's happening for restaurants trying to offer food? Let's figure that out. We know who to get feedback from, you know. Um, they're just such a wonderful example of people really having everyone involved in the whole process. Um, and as they have done that and continue to do that, they were able to move people out of, you know, answering the phone to type into the calendar, and move people into human jobs. Um, grew the organization, it's now in eight different cities in different states. Um, and that's just more of the mission happening, right? Because technology was invested in in the right kind of way. [00:45:02.73] spk_0:
So takeaways are transparency in prioritizing development inclusiveness, including [00:45:10.61] spk_1:
the, including [00:45:11.71] spk_0:
the community, all the, all the different [00:45:14.65] spk_1:
people [00:45:15.63] spk_0:
who are impacted, giving them agency [00:45:18.80] spk_1:
to [00:45:19.70] spk_0:
contribute and not not have it developed. [00:45:24.33] spk_1:
Yeah. And they had, [00:45:25.28] spk_0:
I don’t know how much [00:46:40.74] spk_1:
of this made it into the book, but, you know, in talking with them and having conversations, you know, there were a number of times where the thing they were hearing from all these different users that needed to be prioritized wasn't something, as staff, they maybe would have identified, or at least prioritized. But when you're really listening and having the community drive that development, you know that what you're investing in is actually going to make it better for your community, right? It's the thing that they're asking for, versus you saying, gosh, what's next on our development docket, wonder what we could build, like, let's think of something. You're not kind of guessing, you know, exactly what needs to be built, and that's kind of reinforcing for your users that you are listening, that they are valued, that you want this to be as good of an experience as possible for them, right? Which is really kind of, um, bringing people in closer. And I think we all know, especially, Tony, as the fundraiser, like, keeping people is a lot easier than bringing in new people. So if you can keep those partners in, great, you know, you keep those volunteers in, instead of having to recruit new ones because you're burning them out, because they don't like working with you, it's not a good experience, you know? Um, yeah, [00:47:26.71] spk_0:
let's talk about the funding, but not from the funders' side, because very few of our listeners are on the funding side. They're on the grantee side. And so, from the, well, the book, you talk about social impact organizations, but this is tony-martignetti Nonprofit Radio, not tony-martignetti Social Impact Organization Radio. So if we could, please use nonprofits as an example. In their funding requests, they're doing grants, what can nonprofits do smarter about requesting funds around technology, the development and the use that's going to be required for, you know, for the project that they're trying to get funded? [00:47:32.08] spk_1:
Yeah, [00:47:32.45] spk_2:
absolutely. This is a question that Amy and I have gotten so many times since the book has come out. [00:47:42.97] spk_0:
Okay, well, I'll give you a milquetoast, bland, ubiquitous question then, not that [00:49:01.18] spk_2:
it's a milquetoast question, but it is one that is so important to organizations, and that even for nonprofit organizations that have thought about technology before, then the question becomes, how are you going to get it funded, right? And so, um, it's an incredibly important question. And so I think that there are a couple of things that nonprofits can do. One is to seek out funders who are explicitly funding technology. We've seen an increase, I think, over the past several years, in different foundations, different companies who are specifically funding technology, and so looking for those types of funders, um, I think is really important. I think then another thing to do is to really make the case, as we make in the book, that, um, funding technology is part of funding programs of the organizations and part of funding the running of the organization. Um, it's not simply an overhead cost that is a nice-to-have, that if you get around to it, you can do it. But really, you need to have strong technology and data practices in order to design your programs, to run your programs. Um, people, you know, are used to being out in the world and interacting with technology in certain ways, and so when they come to your nonprofit, they still probably would like to have a website that sees them, that recognizes them, that's useful. They might like to know how to get connected to other people in your community, other staff members, and what those communication technologies might look like, and more. And so really looking for ways to write technology into program design, as nonprofits are doing that, [00:49:25.77] spk_1:
as well. And [00:49:25.97] spk_2:
then I think, thirdly, just being connected with other nonprofits through organizations such as NTEN, and listening to other great podcasts such as this one, um, to hear what other nonprofits are doing and what's been successful as well, and applying some of those techniques to your own organization. [00:49:47.95] spk_0:
I feel bad that I gave short shrift to the foundation listeners. So, I mean, there are lessons in what you just said. Um, are there one or two other things that we can point out for, uh, foundation listeners, to raise their consciousness? [00:51:25.89] spk_2:
Absolutely. Um, I think, you know, there are many things about technology that can be funded, especially with nonprofit organizations, and I really encourage foundations to think about what it means to really fund that inclusive innovation process. And when I say innovation, I mean recognizing that version one might not be perfect, and so funding version 1.1 and 1.2 and version 2.0 is just as valuable as funding version one. We see this all the time in the private sector: you know, my phone gets updates on a regular basis, and I still have a, and that's okay. And so really wanting to make sure that funders recognize that we don't need to just create new technology every time for the sake of creating something new, but really allowing the space for that iteration and really adjusting to the community needs is really important. I think also making sure that we're funding inclusivity, and so that can be things such as, uh, compensating people, you know, from the community for their time, um, as they are involved in this development process; making sure that there's money in the budget for all staff, not just a member of the tech team, to get training on technology, that there's money for all staff to get training on the different technologies that the organization is using; um, and also that the timelines that are given to nonprofits doing their programs allow for that really critical community listening and community input process, into developing any technology and then ultimately developing and executing programs, [00:51:49.02] spk_0:
I'm glad you just used community as an example, because I wanted to probe that a little deeper. How [00:51:55.99] spk_1:
I [00:52:11.32] spk_0:
guess, I guess I'm asking how you define community, because you say that, you know, technologists and social impact orgs and policymakers and communities can be, should be, more involved in, uh, technology development. How are you defining communities there? [00:54:23.04] spk_1:
We're not, in a way, because technology that NTEN builds for, you know, the community that we have is very different than, um, you know, that would be a bunch of nonprofit staff from mostly US and Canada, but also all over the world, um, of all different departments, right? That would be the community that NTEN has. But the community around, um, you know, the Equitable Giving Circle in Portland, well, that's Portland-specific, very, you know, geographically different than the NTEN community. Um, it's folks who can do monthly donations, that want to support, uh, you know, the Black community in Portland. Community is meant to be defined based on what is trying to be built and for whom it's meant to be used. Um, and that's going to be flexible. But I think where it really comes in is what we talked about in the book, in the funding section, but also all of the sections: what does it look like when we expect that transfer to community ownership is the final stage of technology development? Right. And so if that is the final stage, if, um, the community, you know, owning the technology that was developed by someone, um, is the final step, well, there needs to be a level of training and an investment that is very different than if you're planning to keep this privately yourself the whole time, right? If you're going to turn it over to the community to own it and maintain it, you're going to be investing in that community in the process in a very different way. You're going to be including people in a different way. You're going to be thinking about knowledge transfer, not just technical transfer, right? Um, and so that relationship with the community is inherent to the goal at the end. And I think that's, for us, part of what is so important about thinking about that big question of what does it look like for community to really own technology, like, even in the biggest, widest sense. Because right now, we as users don't own the internet, right? Really, there are 45 million people just in the US that can't even access broadband. So the idea that any of these tools, even in the widest, biggest, you know, most accessible sense, are collectively owned isn't real. And so that goes back to community, but it also goes back to policy, it goes back to how we're investing in these tools, what values we are even using when we access them. Um, that's the whole book right there, I guess. [00:55:00.40] spk_0:
Uh, the book is also, uh, a lot of questions. I always hope to get answers when I read books. This book, lots of questions, questions at the end of every chapter, and then they're compiled at the end, they're organized differently at the end. Why did you take that tack? [00:56:06.65] spk_2:
Absolutely, yes. Our book does perhaps answer some questions, but it does provide questions. And that's because what this work looks like varies based on the community you're in, based on your nonprofit organization, based on your role as a policymaker, based on your role as a funder, perhaps. Um, it varies, and so what your specific solution will look like, there will be some of the same building blocks, but the actual techniques you use will need to vary. And so the questions that we have at the end of each chapter, at the end of the chapter on social impact organizations, for example, there are, I think, 25 questions, and five of those are questions that you, as someone at a nonprofit, can ask of other nonprofits about technology. There are questions that you, as someone at a nonprofit, can ask of your funders, to start that conversation with funders that we were just sort of summarizing now: what are specific questions that you should be asking of your funders? What are specific questions you should be asking of technologists that come to you and say, have we got a solution for you? Um, what are specific questions that you should be asking policymakers, um, within the realm of what's allowed for nonprofits to do as part of the policymaking process? And what are some real questions that you can ask of the communities that you serve and the communities you partner with, to really get at, what are their needs, and how might that tie to some of the technology needs for your organization? [00:56:43.69] spk_0:
So what haven't we talked about yet that either of you would like to, uh, you feel like I've spent enough time on the, well, here I am asking you and then I'm proposing something, so I'll cut myself off. What haven't we talked about yet, either of you? That [00:58:18.09] spk_1:
I mean, I think one thing that we have experienced is that there are some topics, like how do we do this, or how do we fund this, or how do we make change, um, you know, there's some topics that recur throughout a lot of conversations. But ultimately, we have never had the same conversation about the book twice, because that's part of writing a whole book that's just questions, you know, and isn't all the answers, that isn't, oh, great, you know, turn to chapter three where we list the 10 things you need to do tomorrow. Like, there are no, I mean, there's probably 100 things, right? But because of that, what we wanted to do when we wrote the book, even if, you know, we said at the beginning, even if no one reads this but ourselves, we want to feel like we are starting a conversation that we are just going to keep starting and keep having and keep getting closer to figuring out what's next, because it's gonna be a whole long path. Um, and if we're here to write a how-to book, well, who are we to write that? Who are we to write the how-to book on how we completely change the world? But what if we wrote a book that said, y'all, how do we change the world? Like, really, truly, how? Let's go, let's go figure that out. That motivates us. And so if it motivates us, it probably motivates others. And these conversations, I mean, I just love them, because, yes, we have some of those recurring themes that all of us think about all the time, but this was a completely different conversation than we've had before, and, well, you know, different than we'll have tomorrow. And I think what we've talked about, the two of us, is when we have [00:58:31.93] spk_0:
not only not only different, but better, [00:59:15.21] spk_1:
but when we have opportunities to talk about the book together with folks like you, knowing that people are listening, right, thousands of nonprofit radio listeners, we want to, in a way, have this be like a practice session for all of them, so that when they finish the podcast and they go to their staff meeting, they're like, hey, Afua and Amy, like, never had their sentences thought out before they started, probably said um a million times. The bar isn't high. I can just start asking questions, right? That's why we have all the questions at the end. I can just start talking about this. There is no perfect, perfect doesn't exist. So let's not worry that I don't know the exact way to talk about this technology project. Let's just start talking about it and get in there and have these conversations. That we have almost modeled that process of just practicing the work of changing things. [00:59:33.45] spk_0:
Anything you would like to, uh, leave us with? Anything we haven't talked about that you would like to? [01:00:00.54] spk_2:
you know, the subtitle of the book talks about building a more equitable world and we call out a few specific roles. But really I think it’s just important to recognize that we all have a role to play in building a more equitable world. And so if you see something in this world that you want changed. Hopefully this book does give you some real ideas about how you can go about doing that, some real questions to ask to find other people who can help you along that journey because really building an equitable world is an inclusive process and that includes you. So that’s that’s all I would add. [01:00:43.80] spk_0:
She's Afua Bruce, at afua underscore Bruce. Her co-author is Amy Sample Ward, at Amy R S Ward, and you'll find the book, The Tech That Comes Next: How Changemakers, Philanthropists and Technologists Can Build an Equitable World, at thetechthatcomesnext.com. Amy, thank you very much. Pleasure. [01:00:46.45] spk_1:
Thanks so much Tony. [01:00:48.33] spk_2:
Thank you. [01:01:39.63] spk_0:
You're welcome. Thank you. Next week, Gene Takagi returns with Trust in Nonprofits. If you missed any part of this week's show, I beseech you, find it at tony-martignetti dot com. We're sponsored by Turn Two Communications, PR and content for nonprofits. Your story is their mission: turn hyphen two dot c o. And by Fourth Dimension Technologies, IT Infra In a Box, the affordable tech solution for nonprofits. tony-dot-M.A.-slash-Pursuant D, just like 3D, but they go one dimension deeper. Our creative producer is Claire Meyerhoff. The show's social media is by Susan Chavez. Marc Silverman is our web guy, and this music is by Scott Stein. Thank you for that affirmation, Scotty. Be with me next week for Nonprofit Radio, big nonprofit ideas for the other 95%. Go out and be great.