What if the key to our future lies not in what we know, but in what we dare to imagine together?
For our 60th episode, Creativity Squared has partnered with the Twin Cities Innovation Alliance (TCIA) for a special three-part data justice series. The intention of these conversations is to invite the audience to reimagine our relationship with the future.
TCIA is a coalition of cross-sector stakeholders building and developing problem-solving ecosystems in collaboration with communities.
These interviews feature the distinguished speakers from TCIA’s 2024 conference, Data 4 Public Good (D4PG). D4PG taps into the collective power of community-based changemaking through technology, democracy, and justice. The timely and important themes from these interviews include co-powering, digital justice, data privacy, A.I. in education, Afrofuturism, and the power of narrative for social change.
Today’s episode guests include Aasim Shabazz and Marika Pfefferkorn of TCIA; Dr. Cierra Kaler-Jones of Rethinking Schools; Dr. Chelsea Barabas of the NOTICE Coalition; and Alicia Ranney and Cassandra Hendricks of the Metropolitan Alliance of Connected Communities (MACC).
How can we reimagine our relationship with the future? Keep reading and listen in to find out!
To participate in the “Data Justice Week of Action” taking place September 16-20, 2024, visit: https://www.tciamn.org/data-justice-futures
Also, mark your calendars for July 15-20, 2025 when the D4PG conference will return to Macalester College in the Twin Cities. Sign up for TCIA’s newsletter so you don’t miss the opportunity to join next year: https://www.tciamn.org/d4pg
Aasim Shabazz
Artificial intelligence is a complicated and polarizing topic, and that complexity effectively shuts many people out of public discussions about A.I. policy.
To ensure the future of A.I. benefits the public good, though, the conversation has to include as many voices as possible. The best way to do this is at the community level, because much of the mainstream discourse about A.I. ignores local issues like surveillance in schools and bias in public health. These community concerns can serve as the entry point for somebody who may not otherwise care to learn or think about A.I. technology. When a community is engaged and educated, the silos that divide its members in other aspects of life begin to fall, and people are free to collectively design the future they want. Aasim calls this “co-powering”: imagining the future in terms of “both/and” and “what about” instead of accepting an illusion of limited choice and uncomfortable compromise.
Community efforts shouldn’t come out of a deficit-based narrative, or an attitude of going along with A.I. to get along with service providers. Instead, efforts should reflect an asset-based narrative, which is rooted in the resiliency of human ingenuity.
TCIA supports grassroots and community efforts to pursue data justice with infrastructure for dialog, such as D4PG and local workshops. TCIA also sponsors the No Tech Criminalization in Education Coalition (NOTICE Coalition), which advocates against tech and policies that increase the chances of any student ending up in the criminal justice system.
Aasim Shabazz (he/him) is the architect of the Data 4 Public Good conference as well as a vision-driven technologist and Co-Founder and President of the Twin Cities Innovation Alliance (TCIA). Aasim drives innovative solutions to complex problems faced by communities and organizations. Throughout his career, Aasim has contributed leadership by serving on various boards and commissions, including advancing Minnesota’s equitable light rail development as a founding co-chair of the Blue Line Coalition.
Marika Pfefferkorn (she/her) is Co-Founder as well as the Solutions and Sustainability Officer of TCIA. An interdisciplinary and cross-sector thought leader and community advocate, Marika is a change agent working to transform systems and scale successes across education, technology, civic leadership, and entrepreneurship. She works along the continuum from community to theory to practice, integrating collective cultural wisdom and applying a restorative lens to upend punitive conditions in education and to reimagine education through a liberatory lens. She has successfully co-led campaigns to end discriminatory suspension practices in Minnesota schools, to remove the presence of police in Minneapolis and St. Paul schools, and to increase investment in indigenous restorative practices in education and community settings, and she successfully advocated for an Ethnic Studies requirement in Minnesota schools. Marika co-founded and led the Solutions Not Suspensions Coalition and the Education for Liberation MN Network, and participates as a member of the Safety Not Surveillance Coalition and the Dignity in Schools Campaign.
Dr. Cierra Kaler-Jones
How do we shift the narrative toward technology interventions that support students rather than simply manage them?
One of the most effective ways to inspire change is through storytelling. Dr. Kaler-Jones saw this firsthand when she successfully fought against sharp cuts to her high school’s budget for arts programs.
The power of a compelling story is its ability to stir people out of complacency. Stories build into narratives, which in turn shape beliefs, which ultimately affect how we act.
It’s not enough, though, to dismantle existing narratives about A.I. and other technology. At the same time, we have to write new narratives that affirm people’s visions for the future.
The connection between storytelling and community isn’t a coincidence. Political agendas demonizing critical race theory or erasing LGBTQ+ people often roll down into schools from the mouths of politicians and pundits, and it falls to the local community either to accept that narrative or to get in front of it and change it.
Don’t be afraid to channel the emotion and imagery that issues like school surveillance and student safety can evoke, Dr. Kaler-Jones says.
At the same time, “know your audience so well that you know what they eat for breakfast,” she advises. Knowing your audience can help you tailor your message and hold them accountable to the principles they say they support.
Dr. Cierra Kaler-Jones (she/her) is the first-ever Executive Director of Rethinking Schools, the nation’s leading grassroots publisher for racial and social justice education. She is also on the leadership team of the Zinn Education Project and a steering committee member of Black Lives Matter at School. Previously, she was the Director of Storytelling at Communities for Just Schools Fund. As a community-based researcher, Dr. Kaler-Jones supports communities in leveraging participatory, arts-based research methods for storytelling and narrative power-building. She also runs a community-based program with Black girls that uses art and political education to fuel social change in the D.C. Metro Area. Over the past ten years, Cierra has learned alongside preschoolers, K-12 students, college students, and adults as a teaching artist. With her roots in dance and arts education, Cierra has also taught classes on U.S. history, public policy, storytelling, digital media, and social change & leadership. She believes that through storytelling, we can create narrative change that disrupts oppressive structures and systems and builds something more beautiful, loving, and liberatory instead.
Dr. Chelsea Barabas
In school districts across the country, well-intentioned leaders are adopting technology that claims to make schools safer and more efficient. These tech interventions enter through two main points of concern for school leaders: budget constraints and student safety.
Automating education and in-school counseling might be seen as a way to reduce the burden on taxpayers, even though the long-term effects of reducing human interaction in education could become a bigger burden on us all.
There’s also been a growing trend of surveillance and facial recognition software in school buildings, despite the mounting evidence of how much bias is built into such surveillance tech.
In Minnesota, Alabama, and Texas, devices that detect gunshot sounds are being installed in school bathrooms. Why install them in the bathroom? Because the same device is also designed to detect vaping. In the places where devices like that are being used, police are also a more common sight inside school buildings. In some cases, school districts voluntarily share data about students with law enforcement. Rather than keeping kids safe, these interventions can cause more harm than good by getting students, especially BIPOC students, mixed up in the criminal justice system because of what they do in school.
Dr. Chelsea Barabas
Although technology can seem like an intimidating topic for parents and student guardians to take on, Dr. Barabas encourages stakeholders to start small by attending and participating in school board meetings. For instance, parents might ask school leaders about the software on their student’s school-issued device, what kind of data is being collected, and who that data is being shared with. The school budget is where to go for answers about how much money is allocated to student support services versus security hardware, software, and policing. Those who look may be surprised at how much of the budget goes toward surveillance and policing that can criminalize students rather than toward services that help them grow.
Dr. Chelsea Barabas (she/her) serves on the steering committee of the No Tech Criminalization in Education Coalition (NOTICE) and is an incoming research fellow at the University of Texas at Austin. She holds a PhD in Media Arts and Sciences from MIT, where she studied the spread of algorithmic decision-making tools in the US criminal legal system. Her work focuses on unpacking and transforming mainstream narratives around school safety, criminal justice reform, and data-driven decision-making. Visit Dr. Barabas’s personal website at chelspar.com.
Cassandra Hendricks
A large part of Alicia and Cassandra’s work is helping nonprofits that provide human services manage the mountain of data they have to collect and report so their donors can measure impact.
They both advocate for changes to nonprofit data collection practices to better serve clients. Their data philosophy at MACC is twofold: all data should be useful, and all data have costs.
All data should be useful, not only because collecting it takes time and resources, but also because extensive data collection as a prerequisite for assistance can itself become a barrier for the very people who need help.
Useful data is any data that serves a specific purpose to improve services for people in need. Limiting data collection makes the process of seeking assistance less of a burden on the person in need and improves data quality by filtering out the noise.
Alicia Ranney
Limiting data collection also reduces costs and increases efficiency in an organization. This leads to better experiences for both providers and clients by freeing up time and resources for work that directly advances the nonprofit’s mission.
Advancing data justice in the nonprofit sector can take many forms. Nonprofits can establish constituent advisory groups for feedback about improving the intake process for clients. Nonprofit leaders can and should also start dialogues with their contacts at donor institutions about how particular data is used.
Alicia and Cassandra also support broader systemic changes in how nonprofits are held accountable by their donors. Trust-based philanthropy is a growing trend in which institutions grant money to human service organizations with far fewer conditions.
Standardizing impact evaluations is another avenue for advancing data justice in the nonprofit space. Instead of requiring agencies to collect unique data points for each donor institution, a standardized evaluation format could help them access more funding with fewer hoops to jump through.
Alicia Ranney (she/her) is the Vice President of Data and Evaluation at the Metropolitan Alliance of Connected Communities (MACC). She has over a decade of experience supporting nonprofit social service providers in strategic data decision-making, database development, and uplifting data justice in all aspects of evaluation.
Cassandra Hendricks (they/them) is a Data Consultant at MACC. They are deeply invested in the public sector and have spent the last decade working to build youth voices in public libraries and the labor movement. Lessons from these roles inform their current work integrating data justice into human services work. In their free time, they grow food, cook food, and share food in the community.
If you enjoyed these conversations, subscribe to Creativity Squared for two more episodes coming out with more highlights from D4PG.
Part two of the D4PG series features:
Thank you to all of D4PG’s distinguished speakers for joining us on this special episode of Creativity Squared.
This show is produced and made possible by the team at PLAY Audio Agency: https://playaudioagency.com.
Creativity Squared is brought to you by Sociality Squared, a social media agency that understands the magic of bringing people together around what they value and love: http://socialitysquared.com.
Because it’s important to support artists, 10% of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized non-profit that supports over 150 arts organizations, projects, and independent artists.
Join Creativity Squared’s free weekly newsletter and become a premium supporter here.
TRANSCRIPT
[00:00:00] Aasim: We’re at a time, at an inflection point where the value of the human intellect and its experiences is seen as competing with AI. It’s seen as competing with polar opposite agendas. And the purpose, one of the purposes of D4PG, Data For Public Good, is that the public good is a shared discourse. It has shared meaning, and it’s for the good of all.
[00:00:31] Helen: Welcome to Creativity Squared. Discover how creatives are collaborating with artificial intelligence in your inbox, on YouTube, and on your preferred podcast platform. Hi, I’m Helen Todd, your host, and I’m so excited to have you join the weekly conversations I’m having with amazing pioneers in this space.
[00:00:50] Helen: The intention of these conversations is to ignite our collective imagination at the intersection of AI and creativity, to envision a world where artists thrive.
[00:01:07] Helen: For the 60th episode, Creativity Squared has partnered with TCIA, the Twin Cities Innovation Alliance, for a special three-part data justice series. The intention of these conversations is to invite the audience to reimagine our relationship with the future. TCIA is a coalition of cross-sector stakeholders building and developing problem-solving ecosystems
[00:01:35] Helen: in collaboration with communities. These interviews feature the distinguished speakers from TCIA’s 2024 conference, D4PG, Data For Public Good. D4PG taps into the collective power of community-based changemaking through technology, democracy, and justice. The timely and important themes from these interviews include co-powering, digital justice, data privacy, AI in education, Afrofuturism, and the power of narrative for social change.
[00:02:13] Helen: Today’s episode guests in part one include:
Marika: My name is Marika Pfefferkorn.
[00:02:18] Aasim: My name is Aasim Shabazz.
[00:02:20] Cierra: My name is Cierra Kaler-Jones.
[00:02:22] Cassandra: My name is Cassandra Hendricks.
[00:02:24] Alicia: My name is Alicia Ranney.
Chelsea: My name is Chelsea Barabas.
Helen: For more information on these speakers and the topics they discuss and to support their organizations, visit CreativitySquared.com
[00:02:37] Helen: for the blog post accompanying this episode. Also, mark your calendars for July 15th through 20th, 2025, when the D4PG conference will return to Macalester College in the Twin Cities. With that, the mic is in the hands of our guests and their thought provoking interviews. Enjoy.
[00:03:01] Marika: My name is Marika Pfefferkorn. I am the co-founder and solutions and sustainability officer of the Twin Cities Innovation Alliance and the executive director of the Midwest Center for School Transformation. I have to also throw in there that I’m the co-founder of the NOTICE coalition, which stands for No Tech Criminalization in Education.
[00:03:21] Aasim: Hi, my name is Aasim Shabazz, and I’m the co-founder of Twin Cities Innovation Alliance and also the founder of what we are designing as the Futures Lab. So today we’re in St. Paul on the Macalester College campus for Data For Public Good, which is supported and sponsored by Twin Cities Innovation Alliance.
[00:03:39] Aasim: It is designed as a container for conversation and convening that activates community through imagination, bringing together a cross-sector of teachers, administrators, entrepreneurs, and futurists to solve our world’s problems. We work with community organizers, and it creates a rich conversation. So that’s why we’re here today.
[00:03:58] Aasim: The topics can range from themes that center on independence at the local level, minimizing the harmful impacts of poor policies and implementations of technology that harm emerging communities, indigenous, black, and brown communities, and rural communities. So we’re thinking about carceral technologies that are being deployed within our schools, without our parents’ permission, and having a negative impact.
[00:04:22] Aasim: We also are looking at the futures that we want to design. We’re also looking at public health. So those are emergent themes within the space that we have. And through the algorithmic improv, we work with community members and teachers and those who are not comfortable with the topic to learn to play with it and experiment.
[00:04:39] Marika: I would love to give a shout out to our sponsors and partners. We’re actually really excited about the Borealis Fund and Communities Transforming Policing, Macalester College, and the Communities for Just Schools Fund. We also have CS Fund as well as Aspirations. So these are all folks that saw the vision for Data For Public Good and made sure that it was realized here this year and going forward.
[00:05:06] Marika: And we also want to thank our Dignity in Schools campaign partners and our NOTICE coalition partners that are all here in support to kind of help us build out this community container so that we can address the future that we want for our communities and our families.
[00:05:21] Aasim: We’re at a time at an inflection point where the value of the human intellect and its experiences is seen as competing with AI.
[00:05:31] Aasim: It’s seen as competing with polar opposite agendas. And the purpose, one of the purposes of D4PG, Data for Public Good, is that the public good is a shared discourse. It has shared meaning and it’s for the good of all. And right now what we’re experiencing is that a lot of the voices defining what that good is are not at the table.
[00:05:50] Aasim: And to open that pathway back up to community technologists, who are using technology every day with their students, with their families, to advance their jobs, is to reframe that idea and center it around co-powering. And that co-powering is in the space of willingness to shape and be shaped in the direction that we want to go in.
[00:06:08] Aasim: And so how do we define that? So this is that moment for us. When you say Data for Public Good, if you just move that D as if it was on a rotating reel and put climate for public good, education for public good, communities for public good, then you start to expand what that means, and you start to remove the silos and realize that we’re all connected.
[00:06:26] Aasim: And this is that space for us to help amplify that and model that across the U.S. and the world.
[00:06:31] Marika: I think the important takeaway that comes out of Data for Public Good is that we are really interrupting a narrative about AI: that it is this force that is not actually grounded in human intelligence.
[00:06:48] Marika: And really, AI is built on data. And so we bring community members together, as Aasim said, technologists and folks, to kind of disrupt what is currently the dominant narrative and make sure people understand what is behind this. AI is not something new. It’s just something that’s being branded really well right now.
[00:07:12] Marika: But we really need to unpack what that means for our schools, for our communities, for our nonprofits, our social service agencies, and really break this down so that everyone sees themselves as a part of this process and understand where their point of entry is to actually contribute to the conversation and the narrative that we’re trying to build so that we actually are in control of the messaging.
[00:07:34] Aasim: Co-powering, what is it? It is about the shape of a conversation of being: the willingness to be shaped and to shape, how we stay connected together. Today we find ourselves with a lot of information that pushes us into a certain camp, and that cuts us off from the whole. Co-powering is about willingness to have a “both and,” and “what about.”
[00:07:54] Aasim: Knowledge is concrete, but the imagination? Well, it’s more important to where we want to go. And so in that conversation and community about imagining what the futures are that we hold, there’s no single dominant narrative. And in order to get to where we all want to go, it has to be a both and, and what about.
[00:08:10] Aasim: And we have to flex and develop that muscle, build institutions, design frameworks that allow us to experience that and move it forward. And at D4PG and at TCIA, that’s what we focus on: building the infrastructure, the narrative, the conversation, the toolkits, to enable communities to get what they want as a collective.
[00:08:29] Aasim: So shared vision, shared mission, shared responsibility for the outcomes. And that’s how we drive to a better future.
[00:08:35] Marika: I really think we’re at a critical juncture right now, as we’re thinking about this next iteration of generative AI, again, recognizing that this is nothing new, but the packaging is new for folks.
[00:08:49] Marika: And so we’re seeing on headlines where AI is being positioned as the newest solution to all of our problems, and then also juxtaposed as the worst thing that’s happened to us, something that’s harming everybody. And the conversation is so binary around it that it leaves so many people out of the conversation.
[00:09:10] Marika: And so decisions are being made, and the pace is happening really quickly. What we are really trying to do is catalyze our communities by educating them on what’s happening, equipping them with the resources to actually respond in real time, and then activating them to know how to address it in their local communities, because a lot of people are feeling like this is inevitable, like there’s nothing that we can do about it. But that is absolutely not the case, especially when we think about what is happening at the local level. Folks are looking to the federal government for guidance and oversight, but that’s a really limited purview. Where the power is, and where we can really hold it, is in our local communities: making sure, again, that we’re educating, equipping, and activating so that people are not just on the periphery but are at the table making decisions, and that they are not being engineered into this process but are the designers who have the vision for how it should be used.
[00:10:07] Aasim: A better future is what we imagine when we dare to dream, and a better future is when we are all consistently
[00:10:14] Aasim: and in pattern dreaming of what is possible and then acting on it, not repeating the same “they’re never going to let us do that.” But when we feel that we are empowered in our community, that we have the paths that we want in spite of some of the friction, there will be setbacks, but the resiliency of human ingenuity is centered first and the language reflects that; it’s not a deficit-based narrative that we’re hearing about why we need to move in a certain direction, but an asset-based narrative.
[00:10:40] Aasim: That each one of our communities and individuals, you know, we’re doing this because we want to see this in our communities, we’re doing this because we want to see this in our life, and not from a deficit of what I can’t do or what I’m not able to do. And so the purpose of this theme of this conversation is reimagining our relationship with the future, and slowly shifting that narrative internally and externally in the world. That’s what this moment is about.
[00:11:04] Marika: So as I think about the Twin Cities Innovation Alliance and our framework of educate, equip, and activate, really focusing on the activate, there’s so many different ways that folks can get involved. One I want to share is that the NOTICE Coalition is convening communities across the country with the No Data About Us Without Us Fellowship.
[00:11:24] Marika: And that means that we are building community infrastructure and supporting organizations and people that are addressing the needs around AI data technologies in their community. So check in with us to see how you can get involved and bring a No Data About Us Without Us Fellowship to your community.
[00:11:42] Marika: The other piece that I would share with everybody is we have the next Data For Public Good coming up in 2025. So, this is your early invitation to get involved. Think about whether you want to be a presenter, whether you want to be an attendee, or what is it that you can bring to this community that we’re building to have these conversations that are taking shape in a really new way.
[00:12:06] Marika: And the other invitation that I will extend is, as we’re talking about this emergence of AI, think about what is your space? What is your role? And what is the silo that you’re stuck in? Because we need to break down those silos. It’s not just education. It’s not just housing. It’s not just transportation.
[00:12:25] Marika: This is threaded across all of these different sectors, and when we’re having these isolated conversations, we’re limiting the capacity to address a holistic solution and actually focus and invest our energy in the future. This is not about just taking down AI. This is about envisioning what we want for our future, and we need to hear what your solutions are.
[00:12:47] Marika: And so I also just briefly want to share one of the things that we have been investing in to help support communities in thinking about solutions: the Ideathon. We recently had the Digital Justice Ideathon, where we bring communities together. We take them through the design process, and the outcome is that their imagination is activated to think about solutions where they can plug in, whether it’s at the ground level or in a collective way.
[00:13:14] Marika: There’s so many different ways for people to plug in. We just need to help you find your point of entry so you can do so.
[00:13:24] Cierra: My name is Cierra Kaler-Jones, and I am the Executive Director of Rethinking Schools, which is the nation’s leading grassroots publisher for racial and social justice education. I live in Virginia, and Rethinking Schools is actually located in Milwaukee, but I’m so excited to be here for the Data for Public Good conference.
[00:13:41] Cierra: At the Data for Public Good conference, I’m speaking about narrative power building. How do we shift and shape some of the harmful narratives around AI and technology? And how do we build something more beautiful, more loving, more liberatory? And a lot of that is really centered in community. How do we make sure that community has the tools to be able to strategize around narratives, so that we can do a “both and” strategy of [dismantling] the harmful narratives and also building?
[00:14:08] Cierra: This topic is important to me because I get so jazzed about all things storytelling and narrative power building, because I’ve personally experienced the power of sharing my own story to advocate for change at a school board meeting when I was a high school student. We won our campaign, and ever since then I realized that storytelling could move people. And then I started to learn more about how stories build into narratives, and those narratives shape our beliefs and values, which ultimately impacts our actions and whether or not we uphold or dismantle the status quo. I want the attendees and viewers of the Data For Public Good conference to know that we have power.
[00:14:50] Cierra: A lot of the most prevalent narratives in AI and technology right now are evoking feelings of fear because fear is actually one of the easiest stories to sell and believe. And so I want people to know that we are not powerless as the opposition might try to make us believe with these fear mongering narratives, but we actually can create narratives that are affirmative and provide affirmative visions for people.
[00:15:17] Cierra: Let people know what you want them to do. Let people know where we’re going. Let people know the vision, and start with the vision first, and let’s leave the fear narratives behind. In this moment in time, it is so critical that we engage in narrative power building, because we have seen the anti-critical race theory legislation.
[00:15:38] Cierra: We’ve seen the anti-LGBTQ+ legislation. We’ve seen the anti-organizing legislation, all amid a climate crisis and an upcoming presidential election. And so as these narratives circulate, we have to get in front of them, especially now. And getting in front of them means having a strategy. It means collaborating.
[00:16:00] Cierra: It means being in community, because that’s the only way we’re going to be able to build the people power. With narrative power building, in order to build a narrative, we need people to do it. And as people, we have created narratives, but also as people, we can change the narrative too. One action that I would encourage people to take is to learn about AI, to learn about technology, to demystify some of these narratives that are circulating. I think one of the challenges of the moment is that because it seems so scary, and oftentimes it can be perceived as dangerous, it can make people feel as though we don’t have the power to change it. So the first step is to learn, to engage, and then to be not only a part of the conversation, but for us to get in front of the conversation.
[00:16:53] Cierra: I draw inspiration from my students. So I come to education work by way of being a teaching artist. I’ve taught babies as young as two up through adults and in every conversation that I have with them, I see how they are making critical connections, how they are talking about dismantling structures and systems, how they are also talking about what could be possible.
[00:17:16] Cierra: Recently, when I was talking to students about AI and technology, but also about Afrofuturism, I encouraged them to just draw: what does your Afrofuturistic world look like? And so one of my students, Naima, created this beautiful animation of a home that was powered by water, powered by a waterfall. And what she said to me was, you know, Miss Cierra, I wanted to make connections between climate change and how we could use technology to solve some of these pressing issues. And I even have students that say we can actually use AI for better or for good.
[00:17:51] Cierra: What if we actually flipped the script and had it so that AI would call out teachers if they were to say something racist, call out something problematic, call out oppression? What if we actually created these counter-technologies that centered our voices, our experiences? And that’s how I know that the future will be better.
[00:18:10] Cierra: My vision for a better future, I think, encompasses so many things. I think it’s the dismantling, like abolishing, of the prison industrial complex, abolishing policing in all of the ways and forms that it shows up, abolishing all of the ways that oppression shows up on a daily basis in every single part of society, from healthcare to law to every single possible place.
[00:18:36] Cierra: But it’s also about building something that is joyful. And so if you were to ask me, I think I would think about dance and having a dance break in the middle of the day. I think about community centers. I would think about more green spaces, more garden spaces, community gardening. I would think about education systems in schools where students could engage in creativity and criticality, to dialogue about some of the world’s most pressing issues and come up with solutions together.
[00:19:09] Cierra: So I think that the vision is so broad and I would probably need many, many days to talk about all of the different pieces of what it could look like. So when I was a junior in high school, that’s the first time that I ever shared my story. Believe it or not, I was very shy, so shy that I hated raising my hand in class.
[00:19:30] Cierra: But, during my junior year of high school, the board of education decided that it was going to cut all of the arts programs because of statewide budget cuts. So we immediately started organizing. I started gathering peers, teachers, community members. We were writing op eds to the local newspaper. We started a social media campaign and we were showing up at the board of education meeting.
[00:19:54] Cierra: And so if I were to share a tip with folks, I would say, share your personal story. That’s what I did at the board of education meeting. And I wanted them to feel the story. I was giving them stats and facts about arts education and all of the ways that it is so crucial to democracy, to engaging in this society and being creative.
[00:20:15] Cierra: But what I really honed in on was the feelings and the emotions of what the arts did for me. And we were actually successful in that campaign. We won with a unanimous vote from the board of education. And so I try to center that story in a lot of the work that I do, because I have to model for other people.
[00:20:32] Cierra: If I’m asking them to share their stories, I need to do that as well. So that’s one tip that I have: really hone in on some of those emotions in storytelling. We talk a lot about imagery and how to make people feel as though they’re there with you, because the body engages in a physiological exchange when we feel a story.
[00:20:55] Cierra: And ultimately, what compels people to take action is when they feel. I am a dancer. I’ve been dancing for over 25 years now, but I’m also a dance teacher. So a lot of the storytelling work that I do is not only in verbal expression and written expression, but also thinking about how do we reclaim our bodies and reclaim our movement.
[00:21:18] Cierra: I think particularly in an education system where our bodies are policed often, especially for black and brown young people, how do we break out of that? By knowing ourselves intuitively and being able to better understand our body. So that’s something that I like to bring into every space that I enter.
[00:21:39] Cierra: At our breakout workshop, one of the areas of narrative power building that we dove into was all about audience, message, and values. Knowing your audience is one of the most strategic ways to create narrative change. And my students roll their eyes when I say this, but you need to know your audience so well that you know what they eat for breakfast.
[00:22:02] Cierra: And that’s where the strategy lies: knowing, are they a part of your base of supporters? Are they in the middle? Are they somebody that is not ever going to be a part of your narrative or your organizing? And so when you know your audience, you then know what messages you need to tailor, and the messages need to be tailored to the audience in order for them to actually understand it.
[00:22:26] Cierra: And then there’s values. Values are crucial to understand about your audience, because you want to hold them accountable to their values. If they say they value safety, well, if we’re actually talking about public safety and community safety, police don’t keep us safe. If we’re actually talking about love and justice and democracy, well, let’s talk about how we’ve never really ever lived in a democracy outside of Reconstruction, which is the closest thing, and they burned that all down, right?
[00:22:51] Cierra: So think about some of these values to be able to have those conversations with the audience, because it’s as you build and build and engage in dialogue that you make that change. Right now we are living in a moment where almost every state has either introduced or passed legislation banning critical conversations about racism and oppression in school.
[00:23:15] Cierra: One of the harmful narratives that has really circulated is that the opposition has tried to brand critical race theory, which is an important scholarly legal theory, as something that is negative. And so on our side, we are working to flip that narrative. Part of what the opposition is doing is using fear tactics to make people afraid of learning about the history and grappling with things that are hard so that we can dream new futures.
[00:23:44] Cierra: So in thinking about flipping that, one of the things that a lot of narrative strategists are talking about is actually love, because we love people, we love a world that doesn’t exist yet, and we want it to be better. But in order to do that, we have to learn the history, we have to be aware of it, because that’s the only way that we’re going to make change.
[00:24:01] Cierra: And so that’s a narrative strategy that is still in process, but I think it’s one that we have to lean into, because it is so connected to AI. It’s so connected to technology, especially because so much of that is affecting and impacting black and brown students, and in schools they wouldn’t even be able to teach about it or to learn about it, as we’ve seen many educators being terminated or doxed for teaching the truth about the systems that impact them and impact their students the most.
[00:24:30] Cierra: I think one of the ways to continue on is to find community and to find connection. One thing that I talk to folks about often is that in this capitalist society that we live in, it’s actually not structured for us to seize places of joy and of freedom and of, you know, collaboration and creativity and of rest.
[00:24:52] Cierra: And so we have to rest. We have to be well because opposition doesn’t want us to be well. That’s part of the system. And so when we are in moments where there’s lots of struggle, we just have to be reminded that there are other people that are in the work with us. I think sometimes what makes it feel so daunting is that we might feel isolated or siloed and we feel like we have to carry the weight of the world on our shoulders.
[00:25:18] Cierra: But it’s all about community and knowing that there are other people in your community who can pick up where you need to leave off while you take breaks and you rest. And we have to seize that rest for ourselves in order to be well and in order to continue on. One of the major challenges that we’re fighting right now with tech and AI is just all of the ways that it’s been co-opted and weaponized against black and brown students in particular. As I talk about schools, we see tech coming up as a way to replace teachers, as a way to replace school counselors, as a way to replace human interactions.
[00:25:58] Cierra: We also see facial recognition technology that we know includes a lot of bias, further criminalizing and further punishing black and brown students and LGBTQ+ students. So we have to keep fighting some of the ways that AI is just another way that oppression morphs
[00:26:21] Cierra: into different structures and systems, and the ways that it continues to get perpetuated through these technologies.
[00:26:31] Chelsea: My name is Chelsea Barabas. I am based out of Austin, Texas, and I work at a place called the Edgelands Institute. I’ll also be a research fellow at the University of Texas in the fall, and I’m on the steering committee of the NOTICE Coalition, which stands for No Tech Criminalization In Education. So yeah, I’m here at Data for Public Good to participate on a panel called Challenging Carceral Technologies in Schools.
[00:26:56] Chelsea: So I’ll be on this panel with many people that I work with on this coalition, the No Tech Criminalization in Education Coalition. And we’ll be just sharing some of the work that we’ve been doing across the country, helping to raise awareness and build power amongst young people and people who care about them, to resist and reimagine the role of technology in the classroom.
[00:27:20] Chelsea: I think this is a really important topic because, first off, technology and surveillance is something that’s growing a lot in our public school system, and our public schools, I think, are an incredibly important public institution that we have, not only for preparing young people for the future, but also for our democracy, in terms of just raising the future citizens of this country who can think and engage civically, be grounded in history, and things like that.
[00:27:46] Chelsea: And the role of technology and surveillance in schools is undermining that or potentially undermining public schools as a space for kind of preparing young people for the future, as well as pushing young people, particularly young people from marginalized backgrounds, black and brown kids, neurodivergent kids, queer kids, out of school and into the criminal legal system.
[00:28:10] Chelsea: So this is just a really important topic to raise because of those issues, as well as seeing this as a potential opportunity to really think through the ways we could use technology differently, to serve the interests and needs of those kids. I think a lot of people really are not aware of what’s happening right now, because a lot of the decisions that are being made around adopting a given technology are being made behind the scenes or in situations where there’s just not a lot of community or public awareness.
[00:28:39] Chelsea: So it’s not surprising to me when I talk to people and they say, wow, I’ve never heard of this issue before. And I guess one of the big reasons for this (there are many different social factors shaping the adoption of this technology) is narratives around safety and security that are tied to tragic events like mass shootings in schools, concerns around teen suicide, and bullying online. So that’s one entry point for this technology. Another one is around widespread budget deficits and resource crises within schools, and trying to figure out how do we supplement the limited resources we have in the classroom to meet minimum standards. So this comes in the form of remote teaching, online tutoring, AI-assisted education that is really trying to fill a gap that’s too big to fill with things that are not actual real humans who care about kids in the classroom.
[00:29:38] Chelsea: So that’s another way this stuff is coming in. Yeah, I’d say those are the two big ways that schools are thinking about adopting this stuff. One example would be the growth in the use of vape detectors, or sensors marketed to detect a variety of things, that are placed in bathrooms or gym locker rooms. They’ll often be marketed on the one hand as ways of, say, capturing the sound of potential gunshots or aggressive noises, but then they’ll also have functionalities around detecting when somebody might be using a vape in the bathroom.
[00:30:14] Chelsea: And I like to use this example because, first off, parents are usually not aware that these things are being used, but I think it also shows the ways that something might be marketed as a solution to gun violence but then gets used, at the end of the day, in day-to-day routines to punish kids for something that is ultimately a public health crisis. And it’s increasingly being used here in Minnesota and, where I’m from, in Texas.
[00:30:41] Chelsea: We know quite a bit of this is being used in Alabama, and not only that, it’s being connected to the criminal legal system, to youth courts and drug courts that are being set up. So that’s a big issue. Another big issue, though, in the form of instruction and learning, is software that’s installed on student devices, often referred to as student activity monitoring software, which monitors keywords and different types of behaviors that students might be doing on their devices.
[00:31:10] Chelsea: So where this comes up as an issue often is, for example, if an LGBTQ kid is looking up support resources online, they might be flagged for using words like “queer” or something like that, which get categorized as sexually explicit content, and that can then render them more vulnerable to punishment down the line.
[00:31:31] Chelsea: So yeah, those are two examples. One of the most important messages for people who are starting to think about these issues is that parents and students and people who’ve been thinking about educational justice for a long time have an essential role in these conversations. Sometimes, when words like AI or cutting-edge technology get thrown around, people can kind of take a step back and think, maybe I don’t have the expertise to have input in this conversation, and I would say that’s absolutely not true.
[00:32:04] Chelsea: And actually, the people whose voices need to be heard the most are the people who have direct experience with these things, as well as people who can help us contextualize these technologies within much longer histories of issues we’ve seen in the public school system, such as the school-to-prison pipeline.
[00:32:23] Chelsea: So this issue is important especially right now, I think, because there is growing awareness across the country that these technologies are being used, and there are connections being made, both cross-generationally between parents and students and teachers, but also across different school districts and different cities across the country.
[00:32:46] Chelsea: So through the NOTICE Coalition, we’re really trying to build a network of people, not only to raise awareness, but also to support ongoing grassroots efforts to raise important conversations around this topic. So I think this is a moment of opportunity, more than anything else, to not accept this stuff as inevitable, but really enter a conversation to change the trajectory of how things are going.
[00:33:11] Chelsea: There are a few really great first steps that people can take to get involved with these issues. I think, at the local level, attending a school board meeting and starting to ask questions is a great starting point. School boards are often the place where these things do happen. And they’re a great place to just start to learn what technologies your school has adopted, how much they’re spending, things like that.
[00:33:37] Chelsea: If people are interested in getting involved in a conversation with a broader community, they can reach out to the No Tech Criminalization in Education Coalition, the coalition that I’m a part of. We’re online. You can find us there. And we will be doing programming and workshops, just to bring people in and support them however we can.
[00:33:58] Chelsea: So for people who do attend a school board meeting, I think a great starting point is to ask some questions about specific technologies. So for example, if your student has a school-issued device, you could ask: what types of software are installed on my student’s device to monitor their activities?
[00:34:20] Chelsea: What types of software are installed on their devices to prevent them from accessing certain kinds of content? Another important question is asking what data is being collected about my student and with whom it is being shared. In many places, we found that school districts are also sharing information with law enforcement agencies, and that’s been a big starting point for mobilizing people around these issues.
[00:34:46] Chelsea: So asking, does my school district have a data sharing agreement with the local police department, is another great question. And then a last question can be around just the budget: how much is our school spending on student support services like social work and counselors versus security hardware and software, policing, and things like that?
[00:35:08] Chelsea: I’ve never seen a school district that spends more money on social work than it does on surveillance and policing. So I think that ratio is always important to look at and to work toward reversing. No technology is inherently bad or harmful. And I think that’s actually what’s exciting about all of this: technology is a terrain for us to negotiate how we want to relate to each other and how we want our institutions and organizations to treat the people that are a part of them, more than anything else.
[00:35:42] Chelsea: And so I see AI and technology in general as a terrain for negotiation and a place for us to think about these bigger issues of how do we want to treat one another? How do we want to break cycles of discrimination or inequality and think about forging a different path? So, that’s what excites me about technology is I think it’s a space for opportunity more than anything else that often starts with naming the risks and the potential for harm.
[00:36:11] Chelsea: But I think that can easily translate over to a conversation around how we can do things differently. I think ultimately we’re working towards a world that will equip the people who have historically been the targets of technology to be the users of technology, and the builders of technology that serves them and their interests.
[00:36:34] Chelsea: So the work of the people at this conference, I think, is really all about that. Not just bringing people to the table, but convening different kinds of conversations and equipping people, not only with the hard technical skills, but also with the capacity to imagine a world that looks very different from the world we’re in now, so that we can combine those together to build and use technologies that center their experiences.
[00:37:00] Chelsea: If you want to learn more about me, you can find me on my website, which is chelspar.com, and if you want to connect with the coalition, you can find us by Googling the No Tech Criminalization in Education Coalition. Our website is housed on the Twin Cities Innovation Alliance web page, so you can go through there to find out more.
[00:37:25] Cassandra: My name is Cassandra Hendricks, and I use they/them pronouns. I work at MACC, the Metropolitan Alliance of Connected Communities, and I’m a data consultant on the data services team. I grew up in Wisconsin, but have been in Minnesota for well over a decade now.
[00:37:44] Alicia: And my name is Alicia Ranney. I use she/her pronouns.
[00:37:47] Alicia: I’m the vice president of data and evaluation at the Metropolitan Alliance of Connected Communities, MACC. I am originally from Florida, but I’ve lived here in Minnesota and Minneapolis for about 17 years. So today we are here at Macalester College at the Data For Public Good conference, and we are here to speak about data justice, introducing our lens at MACC as a way of adding to the ongoing conversation of: what is data justice?
[00:38:20] Alicia: What does that mean? And our organization focuses on what that means for human service nonprofits.
[00:38:28] Cassandra: Data justice to us is looking at all of the data that is collected by human services nonprofits, typically stipulated by their funding requirements and stuff. So, you know, if you go in and you need a bag of food, you might be asked your gender identity, your name, the race of all of your kids, your birthday.
[00:38:53] Cassandra: Our lens is questioning that and pushing back on the requirements that end up falling most heavily on poor people, and we see that as like an extra tax on them. So we want people to both be thinking critically and advocating where we can to push back on those systems.
[00:39:15] Alicia: I would add that Data Justice is really focused on the use of data and asking ourselves, well, what is this used for?
[00:39:25] Alicia: How can this be of service to the communities and to the work that we do, to further understand the challenges and problems that we face as a society? We take two main assumptions, which are the key talking points from our presentation today: all data should be useful, and all data have costs. It’s really important to weigh those two key aspects in all of our work so that, like you were sharing, Cassandra, we’re not wasting time, energy, and resources and causing harm for something that’s going to get shelved, something that’s just there to check off a box, something nobody ever really thought about. Why are we collecting this information?
[00:40:15] Alicia: We don’t know. We’re really trying to emphasize that if we are collecting information, it should be useful. It should serve a goal of helping an organization, an institution, a people learn about the challenges they’re facing, or improve programming so that people are getting better services.
[00:40:37] Alicia: and like just asking those critical features and questions.
[00:40:42] Cassandra: When we say a tax on poor people: by human service organizations, we’re talking about domestic violence organizations, organizations that help unhoused youth, food shelves, community centers, et cetera, a lot of which are predominantly used by folks who do not have access to those resources privately, right?
[00:41:04] Cassandra: So that is predominantly poor people. And all of these data requirements are expected of them: if you want housing, give me your full chemical dependency history, in detail. Have you worked the last 52 weeks of this year? What were those jobs like? How much were you making? It ranges from basic stuff that feels like doing your taxes to things that can be very private.
[00:41:30] Cassandra: And like Alicia was saying, it ends up in a database that’s then going to get sent to a funder, to these unknown locations. How is that going to come back and benefit the actual participants? We want to bring a trauma-informed lens to that: if you are giving your data, you should have both access to it and a clear understanding of how it’s going to benefit you.
[00:41:56] Alicia: There’s something that we hear a lot in our world of working with so many human service organizations: the frontline staff working directly with folks are really trying their hardest to minimize the number of times someone has to tell their story. And storytelling is powerful.
[00:42:19] Alicia: It’s important. It’s a really important piece for learning about challenges or supporting someone in coming into their own. But when you’re doing it in exchange for a basic need, and you’re doing it over and over again, you start to get the sense that nobody really is listening, right?
[00:42:40] Alicia: Nobody really cares, but I will jump through these hoops because I really need to get this rental assistance today, or I would really like to figure out some options to get out of the situation I’m in for the next couple of weeks. What are my options? So there’s an exchange that happens.
[00:42:59] Alicia: And I think the important piece is that some exchange is necessary. In some instances, you want to show compliance, that you’re actually working with a human being and not just making up your information, right? But there should be something that goes beyond that. And if it is just for compliance, then what is the bare minimum that should be requested of an individual to show compliance that you did X, Y, and Z? And what are the other parts that would be supportive in helping them really gain autonomy, independence, and choice?
[00:43:30] Alicia: And then lastly, and most importantly, you alluded to this, Cassandra: the importance of consent. Consent is so important, and in the United States I think we’re still at the earliest stage of understanding it. Everybody likes to talk about HIPAA. HIPAA is fine. How many HIPAA consent forms have I filled out at my doctor’s office? Yes, I’ll sign that and fill that out.
[00:43:59] Alicia: What does it mean when we’re providing our data to just anyone? This is where the overlap with the rest of the data justice ecosystem sits: whether it’s an algorithm or AI or social media, we have gotten into this practice of giving our information away. We don’t know where it goes, but we definitely know it often does not end up in the places we want it to be.
[00:44:30] Alicia: And that’s something that we’re investigating at a micro level, or I would say a micro-macro level, depending on how small you consider human service nonprofits. In this sector, we’re investigating and asking: what are the best practices that we can push forward as a sector, really elevate, and then bring forward from the human service nonprofit sector?
[00:44:58] Alicia: Not someone swooping in and saying, hey, maybe you should do it this way, and we say okay. We’re really trying to remember that all these folks have been working in the field, some of them for decades, and they know what’s best. There’s a lot to learn.
[00:45:16] Alicia: And I think they know what trauma-informed practice feels like. Collecting an inventory of information before you can even have a conversation about what someone showed up for is not a trauma-informed practice. So we’re thinking about ways we can bring that up, help build more partners and practitioners, and bring the larger institutions who are often asking for the data into this conversation. That’s ultimately where we’re headed, but today we’re just doing an introduction to data justice and building an understanding of what it looks like and how it shows up in nonprofit human service organizations.
[00:46:03] Cassandra: Yeah, one thing I would love to add is that we have seen good models. There’s an organization, for example, that has a survivors’ council made up entirely of folks who have experienced sexual exploitation or done some type of sex work as youth, and they inform what the intake looks like for that organization. They still work within funding requirements, but they look at the language that’s used, how the interview is conducted, et cetera, and inform that process.
[00:46:39] Cassandra: And that feels like a peek into what’s possible. But it’s certainly not standard across organizations at this point.
[00:46:47] Alicia: You know, we’ve talked a bit about what data justice is, and hopefully people are walking away with an understanding of what it looks like and how it shows up in our ecosystem.
[00:46:56] Alicia: And maybe they’re also coming away with ideas, a thread of inquiry into the avenues where they come across similar practices, whether it’s in a nonprofit human service organization or at their university or school, or maybe they find themselves in philanthropy or in government. You can take this same line of inquiry into all of these different sectors and ask the same thing.
[00:47:24] Alicia: Well, what is this data used for? What purpose does it serve? What are the participants’ experiences as they are giving their information away? How is this being used? What decisions get made as a result? What actions are taken?
[00:47:45] Alicia: And if, when they get to the decisions and actions, the answer is “I don’t know” or “we’ve just always done it this way,” we hope they maybe come away with some sense of empowerment that says, well, maybe we shouldn’t collect it
[00:47:57] Alicia: if there’s no purpose, if it doesn’t have a direct action or decision related to it, maybe we should leave it aside and reduce that amount.
[00:48:06] Alicia: That improves the data you do want to collect, because you’re getting higher quality from a shorter stream of information. You get more out of less data, instead of asking for more data and then throwing away half of it because you realize you don’t really need it after all.
[00:48:28] Alicia: I would say the other piece is coming away with some ideas and inspiration. Cassandra mentioned constituent groups as one method of bringing in community to help interrogate and ask: what would be useful for us? What do we want to learn? What would be helpful for us to know? And to help construct the inventories, the surveys, the intakes, the questionnaires that get collected. Hopefully people also come away with some ideas of how to get started. Where do we begin? You don’t have to start with a really hard conversation with your grant partner or your government contact person about what data justice is.
[00:49:18] Alicia: You can just start by taking a look at your intake, reviewing the questions, and asking yourself: do we still use this? Is this required? Could this be worded in a way that would be more supportive of an individual’s identity and well-being? I mean, I think it’s always important, if you’re a consumer or a constituent or the recipient of “can you please fill this out,”
[00:49:43] Alicia: to ask them: well, what is this used for? And you can kind of tell at that point whether the staff person has had any other conversations about data justice and how connected they even are to that inventory or survey or intake, because if they don’t know, that’s unfortunately just a common experience. But the more of us who can start asking ourselves, well, what is this used for?
[00:50:04] Alicia: What decisions are going to be made with this information? Can I get a report back on this information? When could I expect it? These simple questions will start to prompt: oh, we should maybe think about these things and, as a community, come up with some decisions so we can be ready to answer them, because I think we’ve collectively taken it for granted.
[00:50:29] Alicia: I certainly have in my history. I wasn’t born understanding data justice; I had a trajectory and a journey myself. And these are questions that would have been helpful for me to hear years and years ago when I was a program manager working with people.
[00:50:46] Cassandra: Yeah, and I think we’ve also seen collective action already taken that we can similarly look to as examples.
[00:50:55] Cassandra: Every time the census is collected, there is huge debate around the questions on the census, most recently the citizenship question. And even interrogating what’s on my driver’s license: do they actually need to know X, Y, and Z about gender identity, et cetera, on this thing I carry with me in order to drive a car?
[00:51:17] Cassandra: So it’s both looking to groups that have already been doing that work, and, you know, what we’re doing too. We’re not just talking about data justice. We are working with data administrators at a lot of the human services nonprofits we work with. We’ve formed a cohort and are looking to have meetings with funders and ask those questions, because a lot of times what our nonprofits experience is that they receive the grant and then see all the stipulations: oh yeah, and by the way, to get that $50,000, you’re going to have to tell us all of these things and figure out how to collect that.
[00:51:53] Cassandra: So there are ways to push back individually, like you were saying, when you experience it as a consumer at a doctor’s office, et cetera. But there are also collective ways: what is your school asking that they need to know about your kid when you enroll your child? Is all of that information actually useful?
[00:52:13] Cassandra: How often is it being updated? Where is it being stored? Et cetera.
[00:52:17] Alicia: Yeah. And notice when you see industry leaders that are actually doing good work: the movement around trust-based philanthropy, the movement around offering more gen ops (general operating) funding. I would love to see more folks in philanthropy elevating general organizational evaluation practices instead of a structured or specific report back.
[00:52:42] Alicia: If a portion of your funding goes back to the organization for evaluation, let that organization tell you what they want to learn about their work, and let all of the funds collectively help them do that work. Not 1,500 different ways of learning about one specific thing that one funder is interested in, but a gen ops model for evaluation, learning, and storytelling.
[00:53:10] Alicia: That way, human service nonprofits can focus their efforts, and we’re seeing some of that. So as constituents, as folks who are watching this: if you see that too, give kudos, tell them they should keep doing it, and share those leaders with other folks.
[00:53:30] Alicia: That way we can know who to look to as examples and mentors in this movement. I think we’re in a moment right now that makes all of this conversation super timely: the rules haven’t yet been made, for better or for worse. We have an opportunity here to think about what we’re doing before we do it, in the context of AI and of the large amounts of information that are given away, and sometimes stolen, just on the internet. There are other avenues and groups in which data is being collected, and it’s maybe not as egregious or scary as, say, having all of my banking data stolen and then finding it on the dark web.
[00:54:22] Alicia: Right. But it’s part of that conversation. It’s about almost aligning yourself as an individual with your data and saying: this is a part of me. This is not something that’s distinct from myself, my body, my identity. This is actually mine. Because it is; it’s your data. And so I think we have an opportunity at this moment to ask ourselves, well, what do we want to see?
[00:54:52] Alicia: What world do we want to build with all of these technologies that we now have the opportunity to be supported by? In what ways do we want to ensure that we don’t cause harm or further harm? All of these pieces are really important for nonprofits too. One last point: I think we’re slowly being ushered into an age of compliance-based reporting, and there’s this fear that human services are out to game the system, which is entirely untrue.
[00:55:30] Alicia: There have been a couple of really notorious news stories as of late, but they are not even close to representative of the hard work, the community-building efforts, and the expertise of the people who are out there on the ground, really holding our communities together in challenging times with housing, healthcare, child care costs, and general uncertainty. MACC is representative of a network of organizations that are doing a really good job and working so hard. It isn’t that they don’t want to collect data. It’s that they want to do it in a way that ensures they’re actually learning something from that information, that it isn’t just taken for granted and given away, or collected without a specific use. I think that’s the opportunity we’re at right now.
[00:56:28] Cassandra: Yeah, I would just add that it’s a moment for solidarity, too. It feels like a lot of the concern around data right now is coming from people who haven’t historically had all of their data collected, right? So we’re talking about class in that.
[00:56:49] Cassandra: A lot of these data practices in human services have been happening for a long time, regardless of whether or not the data has been in a database. And if you have never used human services, remember that the people subject to them are folks who also have cell phones, are using apps, are using banks, et cetera.
[00:57:07] Cassandra: So how do we widen the net, and our lens, rather than resorting to individual panic: oh no, my banking info, my use of a store’s website that got leaked? I went to the University of Minnesota and got notified that my information got leaked, but being able to attend higher education is obviously a massive privilege.
[00:57:40] Cassandra: Some people have had their data collected by human services since they were a child. So we’re thinking about the scope and breadth, and about widening that net. We want to push wider to see who is actually impacted at all the different levels, not just by some of these newer technologies.
[00:58:04] Alicia: Where are the intersections of privacy, of data, and of what we need to get by? I think there is a commodification of data; it’s evident. In essence, we share it, especially for compliance-based reporting, and especially as we get ushered into this next icky era of being accountable.
[00:58:31] Alicia: There’s an exchange, so it’s almost like data is a form of currency in exchange for basic things. We go to the grocery store right now and we can exchange currency, but if I don’t have enough money right now to pay my rent, I’m making some choices. I might go to the food shelf, and I exchange my data for a bag of food.
[00:58:51] Alicia: And that organization then exchanges data for actual currency so they can continue their work. I’m not saying there’s anything wrong with compliance and accountability; I think that is important. I just think we need to be specific and thoughtful about how much is actually required.
[00:59:11] Alicia: And going back to your question about privacy: that needs to shift, and it isn’t something that gets talked about as much here in the US as it has been in Europe and other places around the world. That’s where I first found the concept of data justice, not in the United States but in Europe, on a data justice website focusing on AI and social media and our data there. I think there is an importance to understanding data as an extension of who you are, and that you have a right to that information. Someone can’t just extract that, cut it away from you, and say: thanks, I’ll take this from here.
[00:59:56] Alicia: It does not become their property. It’s still about you. What is it? No data about us without us, right? If it’s about you, if it is you, then you should have an inherent right to that data. And I don’t think we’re all there. We have a long way to go, and there are certain corporations and companies and folks in this world who think: nah, once you give it to me, it is mine, and I can do with it what I want. I think that’s a fundamental framework we need to push back against. You want to say the motivation is well intended. There is an idea around being able to learn about a situation, understand a group of human beings, understand what challenges and barriers they’re facing.
[01:00:46] Alicia: So that you can change the world and change that experience. But I don’t think that’s why human services exist. They exist as an important social safety net that catches human beings in the face of systemic oppression and all of the factors weighing against human beings and communities in the world.
[01:01:11] Alicia: They are the safety net that we depend on. And I think the folks who provide important donations and funding opportunities are gearing towards that. They don’t want to see youth homelessness. They don’t want to see people going hungry. They don’t want to see folks who have to make hard decisions like whether I should stay with this person who is unsafe for me and my family because it’s a house, or whether I could go somewhere else when I have nowhere else to go. The intentions of human services are noble and important, and they fill an important void that we have as a society. And I think data and information and learning have just sort of organically grown out of this: well, we should collect some data on this.
[01:02:04] Alicia: I think it has come from a sense of distrust. That goes back to the eighties, the Reagan era, of people thinking that folks are there just to take advantage of the system, that they’ve got nothing else to do but take advantage of the system, so you have to have data and compliance to make sure nobody is taking advantage of the system.
[01:02:26] Alicia: But I can tell you, that’s a lot of work just to take advantage of a system, and nine times out of ten people are not arriving because they want to save money or get a free ride that doesn’t exist. They’re there because this is an avenue that will support them in finding better solutions in their lives, when they don’t have any other options.
[01:02:54] Alicia: So it has been, I think, steeped in accountability: proving that the people you’re working with have enough problems to justify the dollars being spent. The movement of trust-based philanthropy is seeking to flip that, and data justice is part of that conversation: we actually trust our human service organizations with the resources to make the decisions about what programming they think is most necessary.
[01:03:25] Alicia: And I think we can also trust that, if they had the resources, they would funnel all of that information into real evaluation and learn about ways they could improve their programming. There was a question, I think, from the last Data 4 Public Good conference that has stuck with me. It wasn’t a question.
[01:03:44] Alicia: It was a statement. The group that was on stage shared that instead of asking the participants what barriers they face in life, they asked their participants: well, what barriers do you see in our programming that are keeping you from succeeding? Because we all have barriers and challenges, right?
[01:04:06] Alicia: But what are the barriers in programming and services that are keeping you from succeeding? If we could invest all of that time, effort, and resources into that, we would have some of the most amazing programming out there. I mean, it’s still amazing, but it would be even better, in terms of having that understood and then shifting it back to the community so they can have a piece of that as well.
[01:04:30] Alicia: Just because you can collect it doesn’t mean it is useful. Being able to collect more data doesn’t mean it would be helpful. Just because there’s a shiny new AI that can look at your data in new ways doesn’t mean it will actually improve your programming. As databases have become more common practice, especially in human service organizations around the world, the idea has become: well, we should collect more. We see a virtual buffet of data collection points that we could possibly ask.
[01:05:03] Alicia: And if the answer is, well, it could be useful maybe in the future, that’s not good enough. That is not good enough for right now.
[01:05:11] Cassandra: And I would just add that government entities fund so much of this, right? When we talk about funders, we’re largely talking about the government, whether county or state, as a huge piece of it. Those offices also have to constantly defend the amount of funding they have; human services are incredibly politicized in the society we currently live in. And there is the idea that the more data we have, the more storytelling we can do, and that will keep us securing the funding. We don’t do that with a lot of other sectors. It is a very big question to push back on all of us with: why do we make people who are in crisis and in need defend that need?
[01:06:00] Alicia: My dream, and I think I’ve alluded to this a little bit, is that nonprofit human service organizations get funding to invest in their data collection practices, to ensure the data is useful and being used, and that they have the funding and resources to bring together the constituencies and communities they’re working with, to be informed, to educate themselves about the challenges and issues they’re up against, and to bring forward ideas
[01:06:36] Alicia: that could be supportive to that programming. Potentially, the future human service nonprofit is getting paid for all of the different avenues of work that help meet its mission, and there’s a source of funding that ensures it can do an organizational evaluation
[01:06:55] Alicia: that’s not being split, sliced, and diced apart into 500 different funding streams, which is the current case, and then retranslated and reinterpreted to meet this funder’s need and that funder’s need. That’s where all of the time, energy, and resources are going right now. In the future,
[01:07:17] Alicia: I think they’ll be able to say: this is our organizational report. To all 50 of my funders: here, this is the report I did. I’m not changing what we did just for your eyes, but thank you for your funding. You made our programming work this year, and this is what we did with it.
[01:07:37] Alicia: This is what we learned. This is what didn’t work. This is what we’re going to change in the future as a result. That type of learning is such a privilege to be able to do, and it doesn’t happen enough right now. But in the future, I think that’s where data justice is.
[01:07:53] Cassandra: Yeah.
[01:07:53] Cassandra: Thank you. I would just build on that. When we talk about human services, like direct human services, we’re talking about care, right? We’re talking about your [inaudible] situation, where you need to receive care in facing that crisis. That care is child care. That care is having one-on-ones with someone who can talk with you, therapy, et cetera, all of those things.
[01:08:19] Cassandra: And we’re talking about all of these different spaces for people to receive care around their basic needs. What I would love to see is that we get back to that vision and fund it fully, and that it’s trust-based, like you’re alluding to. What if we trusted our case managers, that they had the conversations they said they did? And trusted that the participants are who they say they are, that they are experiencing what they say they’re experiencing?
[01:08:51] Cassandra: What we see currently in so many organizations is that the data work is a begrudged chore: “and then I have to go do all my case notes, and I have to put in all of this data about these people.” No one goes to school for that. No one is excited about that piece of their social work. But it doesn’t have to be that. What if it was actually really exciting to see, and there was autonomy in it?
[01:09:20] Cassandra: I think about curiosity, and about getting back to the vision of, well, what is all of this for in the first place, and refocusing all of us. All of that sprawl you’re talking about, the compliance reports, has to go to someone who has to look at it, and that’s an office.
[01:09:40] Cassandra: Those resources could be going elsewhere, right? Those resources could actually just go to the work, go to participants, go towards housing stabilization.
[01:09:50] Alicia: So if you want to learn more about this important work, folks can check out our website at mac-mn.org. Or you can Google data justice
[01:10:01] Alicia: and MACC, and you’ll be brought to our data justice page. And then we are going to be having a week of action this September, the Data Justice Week of Action, which TCIA and MACC are co-sponsoring through our Data Justice Futures initiative. We’ll be having website information go live, I think in the next week.
[01:10:27] Alicia: I don’t have a web page for you just yet though.
[01:10:30] Cassandra: Yes. And on that will be our data justice report that we’re releasing as well.
[01:10:39] Helen: Thank you for spending some time with us today. We’re just getting started and would love your support. Subscribe to Creativity Squared on your preferred podcast platform and leave a review. It really helps. And I’d love to hear your feedback. What topics are you thinking about and want to dive into more?
[01:10:55] Helen: I invite you to visit CreativitySquared.com to let me know. And while you’re there, be sure to sign up for our free weekly newsletter so you can easily stay on top of all the latest news at the intersection of AI and creativity. Because it’s so important to support artists, 10 percent of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized nonprofit that supports over 100 arts organizations. Become a premium newsletter subscriber or leave a tip on the website to support this project and ArtsWave. Premium newsletter subscribers will receive NFTs of episode cover art and more extras to say thank you for helping bring my dream to life.
[01:11:35] Helen: And a big, big thank you to everyone who’s offered their time, energy, encouragement, and support so far. I really appreciate it from the bottom of my heart. This show is produced and made possible by the team at Play Audio Agency. Until next week, keep creating.