Young or old, rich or poor, technology is changing our world and our interactions faster than we can adapt. How do we look out for ourselves and each other at the different stages of life: in school, parenthood, work, and our hopes for our future selves?
For our 62nd episode, Creativity Squared has partnered with the Twin Cities Innovation Alliance (TCIA) for the third and final installment of a special three-part data justice series. The intention of these conversations is to invite the audience to reimagine our relationship with the future.
TCIA is a coalition of cross-sector stakeholders building and developing problem-solving ecosystems in collaboration with communities.
These interviews feature the distinguished speakers from TCIA’s 2024 conference Data 4 Public Good (D4PG). D4PG taps into the collective power of community-based changemaking through technology, democracy, and justice. The timely and important themes from these interviews include co-powering, digital justice, data privacy, A.I. in education, Afrofuturism, and the power of narrative for social change.
Today’s episode guests include Dr. Catherine Squires, Dr. Michael Dando, Shreya Sampath, Sophie Wang, and Heather Willems.
Whether you’re a parent, community member, creator, designer, or just somebody who values their work-life balance, the conversations below hold powerful insights for navigating the uncertainties unfolding before us — enjoy! And if you missed the first two parts of the series, be sure to catch up on part one and part two!
To participate in the “Data Justice Week of Action” taking place September 16-20, 2024 visit: https://www.tciamn.org/data-justice-futures
Also, mark your calendars for July 15-20, 2025 when the D4PG conference will return to Macalester College in the Twin Cities. Sign up for TCIA’s newsletter so you don’t miss the opportunity to join next year: https://www.tciamn.org/d4pg
Dr. Catherine Squires
Dr. Catherine Squires, a writer, editor, and yoga teacher from St. Paul, Minnesota, presented findings on the impact of COVID-19 on technology use and surveillance in schools. Her work, in collaboration with the Midwest Center for School Transformation and the Dignity in Schools Campaign, analyzed nationwide survey results and focus group data from students and parents.
Squires reported that 77% of respondents across 29 states said their schools were using surveillance technology to monitor student behavior and activity online. This adoption has been driven by concerns about school safety and student mental health, particularly following school shootings and the pandemic.
She noted that there’s little evidence these surveillance technologies prevent the issues they claim to address. Squires pointed out that even some companies selling these technologies admit they can’t prevent violence, as most school shooters are from within the community.
Squires outlined several key points about school surveillance technologies.
She emphasized the need for transparency, suggesting parents and community members ask about the contracts schools sign with third-party companies and the scope of surveillance conducted. Liz Sullivan-Yuknis and Ruth Idakula of Partners for Dignity called for similar action during their remarks in Part 2 of our D4PG recap.
Squires mentioned that students with special needs, as well as Black and brown students, often face greater scrutiny from these systems.
Among her recommended actions, Squires advocated for redesigning school environments based on student needs and input. She stressed the importance of building supportive, enriching spaces that prioritize relationship-building.
Regarding decision-making, Squires noted that surveillance technology choices are usually made at the district level or department of education level. She pointed out that practices can vary significantly even between neighboring school districts.
The full report, including recommendations, will be available on the Dignity in Schools Campaign website.
She concluded by emphasizing the importance of public education and the need for those most impacted to be central in designing future educational systems.
Dr. Catherine Squires (Writer, Editor, & Yoga Practitioner) – Dr. Catherine R. Squires has engaged in multiple community partnerships in the Twin Cities to uplift and share local Black histories, support BIPOC writers, share accessible yoga practices, curate panels, host conferences and facilitate intergenerational story sharing. Catherine is the author of multiple books and articles on media, race, gender, and politics, including Dispatches from the Color Line (2007) and The Post-Racial Mystique (2014), and the edited collection Dangerous Discourses: Feminism, Gun Violence & Civic Life (2016). After a two-plus decade career in academia, she retired from her position as Associate Dean of the Humphrey School of Public Affairs and Professor of Communication Studies at the University of Minnesota, where she is now professor emerita. Dr. Squires is currently working to support the work of local organizations, editing, writing, and facilitating healing spaces.
Dr. Michael Dando
Dr. Michael Dando, an associate professor at St. Cloud State University, collaborates with young people to help them tell their own stories and share them publicly. His work centers on popular culture, particularly comic books and hip-hop, as tools for civic literacy. He explores what these media teach about society rather than just using them as educational props.
Dando emphasized the fundamental importance of stories in human society, arguing that storytelling is even more crucial than the invention of fire because knowledge spreads through stories. He sees storytelling as a powerful way for youth to engage with societal challenges and make their voices heard.
In his view, comic books and hip-hop are widely consumed forms of popular culture that serve as effective vehicles for storytelling and public engagement. Dando works with local artists and youth to develop storytelling skills, aiming to create a collaborative space where young people can feel they’re making a difference in their community.
Dando spoke about the importance of collective action in addressing societal issues. He views events like the Data for Public Good conference as vital for building a group of people committed to serving the public interest.
When discussing the future, Dando described a vision where people can be their authentic selves without fear, benefiting others without causing harm. He framed this as collective flourishing rather than just individual freedom.
Dando encouraged wide-ranging reading, from academic papers to science fiction, as a way to understand how people have approached current challenges. He also stressed the importance of creating and leaving one’s mark for future generations.
Acknowledging the unprecedented nature of building a diverse, democratic society, Dando characterized our current efforts as an attempt to do something entirely new in human history.
Dr. Michael Dando (Author, Artist, Educator, & Scholar, St. Cloud State University) – An award-winning author, artist, educator, and scholar with two decades of classroom experience, his research and writing explore ways teachers and schools collaborate with communities to build collective, civically engaged, democratic opportunities and systems for social justice education. Particularly, his research examines ways youth employ various cultural forms, including hip-hop and comics, to construct social, cultural, and political identities and literacies that generate educational opportunities for sustained, critical, democratic engagement for social justice.
Dr. Michael B. Dando earned his PhD in Curriculum and Instruction with a focus on multicultural education from the University of Wisconsin-Madison and is currently an Associate Professor of Communication Arts and Literature at St. Cloud State University in Minnesota.
Shreya Sampath
Shreya Sampath, a rising sophomore at George Washington University and the director of US chapters at Encode Justice, brought a unique perspective to D4PG as a public school student who’s now advocating for human-centered artificial intelligence and algorithmic justice.
Despite the prevalence of surveillance tech in schools, students, parents, and even teachers often aren’t aware that they’re being monitored. School-sponsored computers frequently come with pre-installed monitoring software that tracks student activity, including emails and browsing history. This surveillance extends to school-given email accounts, potentially monitoring students even on personal devices.
Sampath highlighted the negative consequences of this pervasive surveillance: a “chilling effect” that leads students to self-censor what they search and say, misidentification of students of color by facial recognition, heightened data-collection risks for students whose only access to technology is a school-issued computer, and the potential for flagging systems to be weaponized against already-marginalized groups such as LGBTQ+ students.
She said that while these technologies are often implemented with good intentions, such as student safety or mental health support, there’s little evidence of their effectiveness. Instead, they risk making students feel less a part of the community and more scrutinized.
Sampath called for greater transparency from schools about their use of surveillance technologies. She encouraged students and parents to ask questions about what data is being collected, how it’s being used, and who has access to it.
As a call to action, Sampath urged listeners to start conversations with teachers, administrators, and parents about their experiences with school surveillance, share those experiences to raise public awareness, and join communities like Encode Justice that organize around privacy, surveillance, and A.I. ethics.
Sampath’s vision for a better future includes schools where students feel safe and supported without constant surveillance. To get there, she says that school officials need to include student voices in discussions about technology implementation.
Shreya Sampath (Sophomore International Affairs And Economics Major, The George Washington University) – Shreya Sampath is a rising sophomore at The George Washington University studying international affairs and economics. Passionate about representing Gen Z in the equitable technology conversation, Shreya is an executive member of Encode Justice, a youth-led coalition fighting for human rights in the digital age. As Director of Chapter Projects, she advises more than 20 U.S. chapters advocating in the privacy, surveillance, and A.I. governance space. She also co-led an investigation into school surveillance, partnering with ACLU-NJ. Recognized by the Princeton Prize in Race Relations for her work, she is excited to mobilize local communities to fight for algorithmic justice.
Sophie Wang
Sophie Wang, an artist, zine maker, and educator from Minneapolis, outlined a powerful framework of “algorithmic ecology” for understanding and challenging harmful algorithmic systems.
Algorithmic ecology is a concept developed by the Stop LAPD Spying Coalition and Free Radicals. This four-part framework helps map the fight against problematic algorithms or programs. It consists of: the community impact (who is harmed, and how); the operational layer (the algorithm or software itself and whoever implements it); the institutional layer (the institutions that created, funded, and legitimized it); and the ideological layer (the values that connect all of the other parts).
To illustrate this framework, Wang used the example of PredPol, a crime-prediction model that Los Angeles used for 10 years before public pressure killed the program in 2021. In that time, critics say that PredPol worsened racial profiling while failing to accurately predict future crimes. Operationally, LAPD used it to create “hotspots” for increased policing. Institutionally, the program was supported by academics, City Hall members, and nonprofits. Ideologically, it connects to broader issues like gentrification.
The algorithmic ecology framework makes it clear that community campaigns to challenge public tech interventions like PredPol need to be abolitionist. Removing one program doesn’t eliminate harm, as the entire ecosystem must be addressed. This approach creates more targets for campaigns and ways for people to get involved.
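To make the framework concrete, here’s a minimal illustrative sketch in Python (our own, not anything published by Wang or the coalition) that encodes the four layers and fills them in with the PredPol details above; the class, field, and method names are hypothetical conveniences:

```python
# Hypothetical encoding of the four-layer algorithmic ecology framework,
# populated with the PredPol details discussed above.
from dataclasses import dataclass, field

@dataclass
class AlgorithmicEcology:
    name: str
    community_impact: list[str] = field(default_factory=list)  # who is harmed, and how
    operational: list[str] = field(default_factory=list)       # the software and its operators
    institutional: list[str] = field(default_factory=list)     # who built, funded, legitimized it
    ideological: list[str] = field(default_factory=list)       # the values connecting the layers

    def campaign_targets(self) -> list[str]:
        """Every layer, not just the software, is a place to intervene."""
        return self.operational + self.institutional + self.ideological

predpol = AlgorithmicEcology(
    name="PredPol",
    community_impact=["displacement and banishment in Skid Row",
                      "intensified policing of Black and brown neighborhoods"],
    operational=["PredPol hotspot model", "LAPD patrol deployment to hotspots"],
    institutional=["academic creators", "City Hall backers", "nonprofits lending legitimacy"],
    ideological=["gentrification", "claims that the math is neutral"],
)
print(predpol.campaign_targets())
```

Laid out this way, Wang’s point is visible in the structure itself: deleting the operational entry still leaves the institutional and ideological layers standing, ready to back a replacement system.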
A key point in Wang’s talk was challenging the notion of “neutral” technology. She argued that we must look beyond “dirty data in, dirty data out” and question the fundamental purpose of these algorithms. Wang stressed the importance of asking why technologies exist, who benefits, and who is harmed.
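Wang’s point about feedback loops, the “ratchet effect” she describes later in the transcript, can be illustrated with a toy simulation. The numbers below are entirely invented and model no real system: two neighborhoods share the same true crime rate, but one starts with more recorded incidents, and patrols are allocated in proportion to past records:

```python
# Toy model of the "ratchet effect": patrols go where past records are highest,
# and patrols can only record crime where they are watching, so an initial
# recording bias keeps reinforcing itself. All numbers are invented.
import random

random.seed(0)
TRUE_RATE = 0.1                  # identical underlying crime rate everywhere
recorded = {"A": 12, "B": 10}    # neighborhood A starts with a small data bias
PATROLS_PER_ROUND = 100

for _ in range(20):
    total = sum(recorded.values())
    for hood in recorded:
        # Patrols are allocated in proportion to past recorded incidents...
        patrols = round(PATROLS_PER_ROUND * recorded[hood] / total)
        # ...and each patrol records an incident with probability TRUE_RATE.
        recorded[hood] += sum(random.random() < TRUE_RATE for _ in range(patrols))

print(recorded)  # A's recorded total pulls further ahead of B's,
                 # even though the true rates never differed.
```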
Wang encouraged listeners to get involved in campaigns against harmful algorithmic systems at whatever layer of the ecology their skills fit. Her vision for a better future includes redirecting resources from surveillance to community support and improving working conditions for educators and healthcare workers.
Wang’s presentation offered valuable insights for critically examining technology’s role in society and working towards community-centered solutions. Her framework provides a structured approach to understanding the complex ways these systems affect our communities, encouraging a more holistic view of technology’s societal impact.
Sophie Wang (Agent for Algorithmic Justice, Zine maker, and Artist) – Sophie Wang is a researcher, educator, artist, and zine maker currently based in Minneapolis and with roots in Los Angeles. She makes zines/comics/art that bring a critical power lens to science, technology, epistemology, and forms of knowledge-making. Her work draws from experience organizing with radical scientists/knowledge workers and working on collaborative campaigns as part of Free Radicals with Stop LAPD Spying Coalition against police surveillance technology and the use of predictive policing by the LAPD.
Heather Willems
Heather Willems, a visual strategist and founder of TwoLine Studios in Minneapolis, Minnesota, spoke to attendees about visual strategies to beat burnout.
Willems has seen the importance of addressing burnout first-hand, drawing from her personal experience of not recognizing its impact on her leadership and team interactions. She introduced visual strategies, a storytelling and problem-solving technique she has found especially effective in corporate settings, as a tool for tackling burnout.
Echoing Dr. Eric Solomon’s remarks from Part 2 of our D4PG recap, Willems outlined several areas to address burnout and build resiliency. These included managing workload by creating awareness and identifying tasks to remove or reschedule, finding areas of control in daily life to exercise autonomy, and incorporating moments of celebration to recognize successes, both big and small.
The importance of addressing burnout, according to Willems, is heightened by global uncertainties in politics, economics, and the environment, as well as rapid technological changes, including the integration of A.I. tools. These factors contribute to increased pressure and expectations at work.
Willems draws inspiration from the people she interacts with, valuing new perspectives and ongoing learning. Her vision for a better future involves integrating compassion into leadership across various sectors, including universities, nonprofits, and corporations. She believes that training future leaders in compassionate leadership will have a significant impact on organizational cultures.
Visual strategies and compassionate leadership can both help create positive changes in work environments and organizational cultures. She suggested that small, regular actions can lead to significant changes in how people show up in their work and personal lives.
Willems’ book, “Draw Your Big Idea,” offers visual strategies applicable to burnout and general business strategy. She shares additional resources through her LinkedIn and TwoLine Studios.
Heather Willems (CEO & Visual Strategist, TwoLine Studios) – Heather Willems, founder of TwoLine Studios and bestselling co-author of “Draw Your Big Idea,” revolutionizes creative problem-solving in business through art and storytelling. A Graphic Facilitation pioneer, Heather’s global influence is marked by her transformation of complex business strategies into visual narratives. With a Stanford University certification in compassionate leadership, she is dedicated to fostering empathetic leadership and mitigating burnout with creative practices, promoting a workplace culture of compassion and understanding.
Thank you to all of D4PG’s distinguished speakers for joining us on this special episode of Creativity Squared.
This show is produced and made possible by the team at PLAY Audio Agency: https://playaudioagency.com.
Creativity Squared is brought to you by Sociality Squared, a social media agency who understands the magic of bringing people together around what they value and love: http://socialitysquared.com.
Because it’s important to support artists, 10% of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized non-profit that supports over 150 arts organizations, projects, and independent artists.
Join Creativity Squared’s free weekly newsletter and become a premium supporter here.
TRANSCRIPT
[00:00:00] Sophie: It’s always important to look beyond just the program or just the software or just the algorithm or just the technology. It’s always important to ask, why does this technology exist? Who makes it possible for this technology to exist? Who benefits from it and who is harmed by it? Because just starting with, how can I fix this technology to make it less racist or sexist or less biased?
[00:00:27] Sophie: The real answer and the root answer to that is never going to be within the actual technology itself. It’s always going to be somewhere in the larger ecosystem of that technology.
[00:00:43] Helen: Welcome to Creativity Squared. Discover how creatives are collaborating with artificial intelligence in your inbox, on YouTube, and on your preferred podcast platform. Hi, I’m Helen Todd, your host, and I’m so excited to have you join the weekly conversations I’m having with amazing pioneers in the space.
[00:01:02] Helen: The intention of these conversations is to ignite our collective imagination at the intersection of AI and creativity to envision a world where artists thrive.
[00:01:19] Helen: For the 62nd episode, Creativity Squared has partnered with TCIA, the Twin Cities Innovation Alliance, for a special three part data justice series. The intention of these conversations is to invite the audience to reimagine our relationship with the future. TCIA is a coalition of cross sector stakeholders building and developing problem solving ecosystems in collaboration with communities.
[00:01:50] Helen: These interviews feature the distinguished speakers from TCIA’s 2024 conference, D4PG. Data 4 Public Good. D4PG taps into the collective power of community based change making through technology, democracy, and justice. The timely and important themes from these interviews include co-powering, digital justice, data privacy, AI in education, Afrofuturism, and the power of narrative for social change.
[00:02:24] Helen: Today’s episode guests in part three include:
Catherine: I’m Catherine Squires.
[00:02:29] Mike: My name is Mike Dando.
[00:02:31] Shreya: My name is Shreya Sampath.
Sophie: Hi, my name is Sophie Wang.
Heather: Hi, I’m Heather Willems.
Helen: For more information on these speakers and the topics they discuss and to support their organizations, visit CreativitySquared.com for the blog post accompanying this episode.
[00:02:49] Helen: Also, mark your calendars for July 15th through 20th, 2025, when the D4PG conference will return to Macalester College in the Twin Cities. With that, the mic is in the hands of our guests and their thought provoking interviews. Enjoy.
[00:03:11] Heather: Hi, I’m Heather Willems. I am a visual strategist and founder of TwoLine Studios. I’m living here in Minneapolis, Minnesota, and today I get to talk at Data 4 Public Good about visual strategies to beat burnout. The topic of beating burnout is really important to me. One, because from my own personal experience, I didn’t recognize I was going through burnout.
[00:03:38] Heather: And it just really impacted the way that I showed up for my team. It impacted my leadership and how I was showing up in my life. So I want to share [these] compassionate leadership techniques that you can do to restore your energy and just really show up as your best self in any situation. I come from an arts background.
[00:03:58] Heather: And so thinking visually and telling visual stories is second nature to me. It was only when I started working with business executives and corporate leaders in the fortune 100 companies that I recognized it was an incredible tool. So visual strategies, I mentioned, it’s a visual storytelling technique.
[00:04:17] Heather: It could be in the form of graphic recording, which TwoLine Studios provides, but we also create a lot of visual templates and just drawing as a way to widen your perspective, shift the way that you look at things. And it’s injecting a little bit of fun and creativity into something that’s really hard.
[00:04:38] Heather: If you can’t have fun with a problem, you’re really never going to solve it. So visual strategies is a way to connect with some of these really difficult topics around burnout or maybe not feeling your best when you’re at work. And it’s a way to start shifting the way that you think about things and map it out so you can start to take action to improve.
[00:05:02] Heather: The key messages that I’m interested for people to really recognize coming out of the session today is that one, burnout is real. It is not something that you can fix on your own. It’s not something that you need to be personally stronger or have your own personal, like resilience to overcome this.
[00:05:25] Heather: This is a cultural and environmental situation. So one thing’s burnout is real. Second thing, you don’t have to do it alone. And the third thing is that there are, we have six different ways that you can start to take tiny actions on a regular basis to build up the way that you show up every day and to start [to] have a ripple effect that can change the culture of your environment.
[00:05:54] Heather: So there are six areas that we address when we’re talking about burnout and building that resiliency that you need in your everyday life. So one is workload. The first thing, always the first thing is to create awareness of what your situation is. So maybe it’s looking at your calendar and finding one thing that you can take off your calendar for that day or move to the next day.
[00:06:20] Heather: Another area is control. We want autonomy in order to feel safe in our environments. So the question is like, look at your life and where’s one area that you can identify where you have control over the situation. Like maybe it’s your workload that you need to deliver during that day, or maybe it’s the environment that you sit in.
[00:06:41] Heather: Like maybe you want to take and sit outside for a couple hours during the day versus in the office. So where’s one area that we, you can have control in your daily life. And then the third area is celebration. Celebration and rewards are huge. I was just in a strategy session yesterday, and we talked about if you’re not celebrating your successes, your strategy is never going to work.
[00:07:06] Heather: So where can you incorporate celebration into your daily life? Maybe it’s having coffee with friends, or maybe it’s just writing yourself a like, “Way to go! You did a great job today!” And putting that post it note on your computer. Like how can you find a moment of celebration in your everyday life?
[00:07:26] Heather: This talk and like addressing burnout is so important right now. We are in a world of uncertainty where political uncertainty, economic, environmental, and then, you know, the introduction of technology and AI into our everyday lives so quickly, there’s a lot of emotional and mental chaos that’s happening right now.
[00:07:49] Heather: Where as service providers or just working in everyday life, we’re expected to do more with less. Including like bridging that gap between, you know, where we were yesterday and where we are today with technology. So, these are just some tools that everybody can use to help get them centered, to help find some stability and then find that control where you’re able to actually calm down, think clearly, and make some decisions that are right for you in this chaotic environment.
[00:08:25] Heather: I draw my inspiration from the incredible people that I surround myself with. It’s like everywhere I go, I get to talk to amazing humans who are doing just incredible research, reading incredible books, or maybe just have a new perspective that I’ve never considered before. And so that’s where I draw my everyday inspiration: from the people I surround myself with, or even people that I come across in my daily life, going for a walk or, you know, being in a new business meeting. There’s always something to learn from other folks. I am very excited for a better future. And what that looks like to me is integrating compassion into leadership.
[00:09:07] Heather: It is so important to bring that compassion and empathy and just an understanding for common humanity into our leadership, whether we’re at the university, whether we’re working in nonprofits, whether we’re working in the corporate environment, we have a great opportunity to train the leaders who are coming up with us and behind us, to bring more of that understanding and compassion into how they lead their teams. And from there, you know, if our next level of leaders have this greater compassion and the people that they’re leading to have this greater compassion, I really think that’s going to have a huge impact on our culture.
[00:09:50] Heather: And really shape the future for a better good.
[00:09:59] Mike: My name is Mike Dando. I am an associate professor of English and education at St. Cloud State University and the director of the Institute for Speculative Design and Education. And I am going to be speaking today or speaking at the conference Data 4 Public Good about Afrofuturism, speculative design, and civic engagement in literacies.
[00:10:22] Mike: I do a lot of work with youth in the community and telling stories they want to tell on their terms, in their ways, and then getting those stories that they’re telling out into the public as a way of engaging how they want to solve problems and address challenges that they see in the world in ways they think they’re going to be meaningful.
[00:10:43] Mike: So giving holding space for their voices to be heard is a lot of what I do. I spent a lot of my time working and collaborating with students around popular culture and civic literacy. So what those things teach us, not how those things can be used as a teaching tool, right? So I spend a lot of my time talking and working with comic books and also hip hop culture broadly.
[00:11:09] Mike: And so it’s not just rap. It’s also beat making, street art, break dancing, DJing, those types of things as a way of creative self expression. So stories matter. There’s power in narrative. And students, young people, all people have stories to tell. And so these are two ways that people have told stories that are widely seen, widely consumed and are in the public square.
[00:11:35] Mike: So any and everywhere you go, whether it’s on the city street or at a bookstore, you’re going to run into these types of stories that are being told. And so that’s one of the things that I do; I collaborate with local artists, area artists, and youth to learn how to tell those stories, right? So it’s one thing to know that you want to tell the story.
[00:11:56] Mike: It’s another thing to have these skills to do that. So these are tools for self expression. Young people develop them, they have them, but to work with someone who is maybe a little further along in the process creates sort of a generative, collaborative social space where they can feel like they’re making a difference in their community just by having their voice out there.
[00:12:19] Mike: This topic is important to me because stories are the most important thing we as humans have learned to do, even more so than fire. And here’s why I say that, because we had to share the story of how to do that with somebody else a billion years ago, however long years ago, you had to tell that story.
[00:12:41] Mike: Here’s how you do it. Here’s how I did it. That’s a story. Technology is a tool to tell stories. Podcasting, for example. I have a podcast called Comic School. It’s a way of sharing stories. It’s that the technology has changed, but the power that stories hold, and the potential that they hold, I think is fundamental to a flourishing society and democracy.
[00:13:04] Mike: Being able to share that story. This is who I am. This is what matters to me. And then to hear that story in return: do I know who you are? Do you know who I am? Do we see each other clearly? And that’s why stories matter. So yes, that happens in rap, in music, in art. In comics. And so that’s why the stories matter.
[00:13:23] Mike: And those are the two most commonly consumed and produced forms of popular culture. Just from an economic standpoint, everybody knows Marvel movies, right? The most recent rap beef that broke through, everybody was talking about it, right? People that hadn’t even heard a Kendrick album were talking about it because it was a story.
[00:13:46] Mike: It was a way to engage and interact with each other. And that matters. The message that I would like for people hearing this or attending the conference to walk away with is: you matter, and it is possible. So I guess that’s two. I guess I cheated. But right, so when we see challenges, they may seem insurmountable, and it may seem as though, well, what can I do as one person?
[00:14:18] Mike: Well, it’s not just one person. It’s many of us that are all pulling in the same direction. Now there’s going to be many different ways to do that. There are going to be people who feel different ways about AI or different ways about any of the other technological tools. But those folks that are gathered here understand fundamentally that data, information must be used for the public good, because it’s certainly being used for private interests, right?
[00:14:49] Mike: So, yes you can. Yes, we can. Your stories matter, your experiences matter, and this is a place where we can engage in community, communication and solidarity, right? And collaboration. It is important for us to build a critical mass of people who are dedicated to the public good, especially as that public good, I think, is being systematically dismantled.
[00:15:20] Mike: And so that’s what I want people to walk away with is yes, times are challenging. They always are. They always have been. It’s not new. It’s our turn. Not just buck up, stiff upper lip, but no we got each other’s backs here. Groups like this, opportunities like this, organizations like this, gatherings like this are important because I live here.
[00:15:45] Mike: But we live here. This is where I keep my stuff. We got one earth. Okay. We got one shot at this. That’s my goofy answer, but the serious answer: gatherings like this are important in this moment in time, because they have always mattered in moments like these. We have an ongoing commitment, the folks that I’ve been with here, that I’ve talked with and collaborated with, to a just, equitable, and flourishing society for everyone.
[00:16:17] Mike: So it very much takes, fundamentally takes the proposition. I think that’s yet to be fulfilled that all people are created equal. Seriously, right? And so it is, it matters because there has always been, especially in trying or challenging times, gatherings like these have always mattered. Because there’s always pushback.
[00:16:42] Mike: There’s always pushback when somebody says, well I would like to use, I would like to have a more equitable society, less hierarchical society in terms of power structures, et cetera. Somebody’s always going to push back against that. And it might be a person, it might be an organization, might be a group of people, but there have always been people that have pushed back against that in particular ways on the macro level, and the micro level, the local and the global. And so that’s why gatherings like this matter because justice always matters. And this is an instance where we are gathered together to quote Minneapolis native son, Prince. We are gathered together to get through this thing called life. That’s what we do.
[00:17:31] Mike: That’s why these things matter, because we’re all we got, right? And we have to lean on each other, especially when the going gets tough. The tough get going, but they get going together: they link arms and head in the same direction. And that’s what this is. Action item: read, read, read, read, read, read.
[00:17:49] Mike: And when you’ve done your reading, read some more. That’s how you plant your feet by the people that have come before you. What are people saying? And when I say do your own research, I don’t just mean go to social media. I mean, look at what have people said and thought that came before you and how can you mobilize similar strategies in your approach to addressing the challenges and the problems that you’re facing today, right?
[00:18:17] Mike: So when we’re looking at, oh, surveillance. Surveillance isn’t new. 1984; it was written well before 1984, right? So that was a discussion then, right? Fahrenheit 451. And I’m not even saying just read boring research papers. I love reading boring research papers, ‘cause I’m a scholar. I’m an academic. Read Octavia Butler, right? Read Parable of the Sower. Read sci-fi. Read comics. Read what people have thought about what the current moment requires, because you are living in their future. And you are also living in somebody else’s past. So read, read, read, and when you’re done, and I know some say just do one thing, but when you’re done reading, write, write, write, or create, create, create.
[00:19:05] Mike: If you don’t want to write, then make something, leave your mark here now for people that come later. A better future looks like a place where folks can be their true, full human selves without fear and in such a way that it benefits others. It doesn’t come at the expense of someone else. So it’s not just individual freedom, it’s collective flourishing.
[00:19:37] Mike: That’s what a just future is. It’s collective flourishing. It’s us together, all of us living in peace and harmony. I know it sounds so corny, but just ‘cause it’s corny doesn’t mean it’s not true, right? What we’ve got to do is something we’ve never tried; nobody on the face of the planet has ever tried to do what we’re doing! To live together, a bunch of people from across the globe in one space. The United States is impossibly large. But nobody’s tried to live in a multi-racial, multi-ethnic, democratic society before. We’re doing new things, you know? How long have people been here?
[00:20:15] Mike: We’re trying something new. So, a just, equitable, verdant, like, society. It’s not all paved, like there’s flowers and stuff. That matters. That’s what a better future looks like.
[00:20:29] Shreya: My name is Shreya Sampath. I go by she, her pronouns, and I am a rising sophomore at the George Washington University and also the director of U.S. chapters at Encode Justice. Encode Justice is a youth led coalition fighting for human centered artificial intelligence and algorithmic justice. And of course, it’s brought together a lot of high school students and college students in this movement. Today at the Data 4 Public Good conference, I’m here to speak about school surveillance technologies and coming from a student perspective, discussing how students have organized around this and, you know, what it feels like to be a student in a public school where school surveillance is very prominent.
[00:21:10] Shreya: This topic is important to me because I’m coming into this space as a student who went to public school and now is in university. I’m also from a background, a minority background, and oftentimes people with my skin tone are misrecognized or misidentified by facial recognition, which is one of my motivations for starting in this space.
[00:21:30] Shreya: Another reason why I joined is because I was encouraged by my peers, including Marissa Syed, who was my chapter co-lead in New Jersey, and of course the entire Encode Justice community. So, it was a combination of, you know, personal motivation to advocate for students in the space that may not feel as comfortable about speaking on these topics, and, you know, my background, as well as just being in a community that was supportive.
[00:21:57] Shreya: I have, I think, three buckets of ideas that I want to express during this conference and just in general about this topic. I think the number one, you know, message that I want to put out there is that, you know, student voices are important in this conversation and they should be included. And then the second bucket is that, you know, advocacy is important, but legislative work is also equally important.
[00:22:21] Shreya: And in this space where technology is advancing so quickly, government really needs to step up and start regulating or providing guidance to other, you know, public institutions like schools on how to most effectively use this technology. Which gets into my last point, which is that AI, of course, has so many negative implications, but it can be used for good and there have been instances where it’s really helped students, especially in their learning process.
[00:22:49] Shreya: I am familiar with many students who use AI powered technology to explain problems if they don’t have access to a personal tutor, for example. So I think in this space, it’s really important to recognize that, you know, AI is innovative and that it can, you know, fill in gaps, but it should not be used as a replacement for solutions or be used in a way that puts students civil liberties and rights at risk.
[00:23:13] Shreya: And that should be part of the conversation, you know, protecting students in all aspects when acquiring or even, you know, developing this technology. Some of the surprising things that, you know, most students or parents may not be aware of is that school powered computers or school sponsored computers carry a lot of these, you know, pre installed technologies, including, you know, software that monitors your activity in all aspects, whether it’s emails, history, or, you know, just
[00:23:42] Shreya: anything really that you do on your school sponsored computer. This technology is also linked to your school given email account. So even if you’re logged in to your account on a personal computer, there is a possibility that this software is being used on you to monitor you. And I think that’s just the most like, most creeping in type of technology that is being used in schools right now where you’re not really truly aware but you know students are aware when you get an alert that your teacher is able to watch your screen during some type of assessment.
[00:24:13] Shreya: And students are aware that they need to be careful about what they search or else, you know, a blocked website comes up. Or their search history is flagged, and then that’s, you know, that’s just a general, I guess, agreement that students have kind of come into, which is why I always say students have slightly been desensitized in this, not slightly actually, very desensitized in this kind of state that they’re in schools where surveillance is just the norm and that they’ve just, you know, kind of grown up around it.
[00:24:42] Shreya: But yeah, I think there’s a lot of different ways that technology appears in schools, but one of the ways that is kind of hidden from public awareness, which, you know, facial recognition isn’t as hidden because you can see cameras around your schools. And vape detectors and gun detectors are also, you know, physical pieces of technology that are installed in schools.
[00:25:01] Shreya: But student monitoring software is an example of something that, you know, is installed in your school computers that you may not necessarily be aware of, but you internally are aware of. So when we speak about awareness about the use of school surveillance technologies, I think we have to acknowledge that all students are in some, at some level aware and they tend to self moderate what they search up and what they, you know, do on their school computers when they are aware of this.
[00:25:29] Shreya: And this can really have negative consequences for students whose only access to technology might be their school computer. And obviously, you know, self moderation does not create the most comfortable learning environment, where engaging with controversial topics is necessary to grow as a person.
[00:25:46] Shreya: But, you know, once we gain that awareness, people self moderate, and of course, you know, if you do want to take action on this and are, you know, gravely affected by it once you’ve really understood its impact, I would of course recommend, you know, first just starting conversations with your teachers and with your administrators.
[00:26:05] Shreya: And parents about, you know, personal experience because this is one of the few, you know, spaces where students voices are, you know, treated with real reverence and, you know, are valued. So I think you have a special, you know, value to this conversation and it would be, you know, one of the action items you could take on is to just speak about how this technology makes you feel and how, you know, maybe you were sent to the principal’s office one time because you were flagged for something, but it was for a history project, not for some nefarious reason, right?
[00:26:41] Shreya: Speaking about your experiences to, not even publicly, but to people close to you would be incredibly valuable to just, again, raise awareness publicly. And of course, you know, I will start plugging my organization right now, but joining Encode Justice is a great way to really find a community of students who are also, you know, equally aware and passionate about issues like privacy, surveillance, AI ethics and beyond.
[00:27:06] Shreya: And, you know, we have multiple opportunities that you can participate in that help you take action on this technology. Some of the real world implications of surveillance technologies include, of course, self moderation and what is known as the “chilling effect,” where students don’t feel as comfortable expressing their true feelings.
[00:27:28] Shreya: Some more real life implications are, I think in this case, I like to use hypothetical situations to really exemplify how specifically, you know, student monitoring technology and facial recognition could impact students. Of course, facial recognition has been, you know, proven by many different studies to be inaccurate on certain skin tones and certain races and ethnicities and genders.
[00:27:50] Shreya: So, misidentification is always an issue, and in the case of student monitoring softwares, students who may not have access to technology beyond their school sponsored computers, which is a point that Marissa Syed brought up in an op ed that we wrote with the American Civil Liberties Union of New Jersey.
[00:28:09] Shreya: She went to a school that was predominantly low income, and she recognized that students whose only access to technology is school computers might be more likely to share personal information on those computers and be, you know, more susceptible to data collection of sensitive info. And then, you know, if they also, you know, look up something that might be flagged for some reason or the other, they would be, you know, in contact with some type of disciplinary measure.
[00:28:36] Shreya: And another aspect of that is just the general, you know, issue of how flagging lists could, you know, point towards certain communities. For example, if a school is, you know, moderating students’ LGBTQ plus identities, and someone who wants to look into that community for some type of project or is just curious about that aspect of life looks up something related to gender identity or sexual identity.
[00:29:03] Shreya: They could be, you know, policed for doing that, which is really scary, especially considering the political context in the United States and how bills like Don’t Say Gay in Florida have been passed. So this kind of technology can be really weaponized against certain groups that are already marginalized in our communities.
[00:30:12] Shreya: So I think that kind of encompasses some of the real life implications and also just implications for students and environments because of the use of these technologies. I think this is one of the best times to speak about issues like this because the awareness around artificial intelligence has really boomed since generative AI or generative artificial intelligence has, you know, come into the, main, you know, ecosystem of, technology and just in general conversation.
[00:30:42] Shreya: I think more people understand at least from, you know, very basic lens what artificial intelligence is and are more curious to learn how artificial intelligence could potentially affect their own lives. A lot of people are concerned about AI and how it could replace, you know, certain jobs, AI’s impact on hiring practices.
[00:31:02] Shreya: And this is just another aspect, right? Artificial intelligence is really booming in schools, mainly, you know, facilitated by the COVID 19 pandemic and remote learning. A lot of, you know, these technologies kind of emerged during that period, and that, coupled with this rising awareness about advanced technologies like AI, really, you know, creates an environment where these conversations can actually be had.
[00:31:29] Shreya: I don’t need to, you know, give somebody like a course or, you know, an explanation about how artificial intelligence generally works because they’re more exposed to it just in their daily lives now. So I think that’s a part of it. The fact that, you know, the general public has become more cognizant about how AI is being used in so many different contexts.
[00:31:51] Shreya: And another part of it is just the fact that, you know, students voices are being amplified as well. I think people are recognizing that the younger generation, also deserves a place in the political process, and, you know, this is a great, it’s a good time to, come, like, be a student who wants to empower other students because people are willing to listen and treat you with legitimacy and respect because they understand where you’re coming from.
[00:32:19] Shreya: Some of the actions that I would want to, you know, uplift during this interview are actions that are spearheaded by other people at Encode Justice, so I want to take the opportunity to really uplift their work. First I want to talk about, you know, the idea of school surveillance work. Marissa Syed and I really spearheaded that at Encode Justice.
[00:32:38] Shreya: And since then it’s really taken off within our organization. Because like I mentioned, I think students’ voices are especially important in this space and are treated with more respect. So we’re able to make more change and enter more spaces, compared to other issues in the data privacy, surveillance, and AI governance umbrella, for lack of a better word.
[00:33:00] Shreya: But yeah, so, some opportunities that have come up with that is, you know, collecting student testimonies about school surveillance, which is something that I’ve led at Encode Justice. And then we’ve started a campaign to ban facial recognition in public schools. This has been led by GM Michelle, who is with the Encode Justice New York chapter.
[00:33:20] Shreya: She’s absolutely incredible, and she’s working with Fight For The Future, which is a pretty well known digital activism organization, to really uplift this idea that, you know, facial recognition is really doing more harm than good in schools because of the risks I mentioned earlier, like misidentification and increasing the possibility of students being faced with disciplinary action, which is already on the rise since COVID.
[00:33:46] Shreya: So, that is a really important campaign that’s happening right now. And that’s led again by GM Michelle, and it’s being supported by chapters in the DMV region and in Massachusetts and California. And yeah, those chapter leads, who I will also name because they’re all amazing and incredible, Emily Garibrandt, Diksha Vaidyanathan, they’re all, you know, really working together.
[00:34:12] Shreya: And Siri Maneri from the Encode Justice North Carolina chapter. They’re all really working together to make this more of a nationwide movement because the New York chapter was successful in banning facial recognition in schools. So they wanted to really take this on and implement it in other regions, and especially locally.
[00:34:32] Shreya: So we’re recruiting high school delegates at the local level for that. But, those are, you know, some people that I wanted to uplift because Encode Justice is so, it’s huge, but there are so many, you know, amazing standout advocates. And we, you know, pride ourselves in also being able to recognize everybody’s efforts and give opportunities to students who want, you know, want to be a leader or want to learn about advocacy, and really, you know, uplift them.
[00:35:01] Shreya: So that’s, you know, a part of my job that I really like, and I’m excited that I’ve been given this platform to do that. Some of the people that I derive inspiration from, of course, I’m going to point to the Encode Justice community again. When I joined as a junior in high school, I wasn’t really aware of the, you know, real activism that youth especially lead in this country.
[00:35:24] Shreya: And there’s a long tradition of, you know, youth really taking on the lead to address issues that are arising in the moment. I think, you know, technology policy is such a, it’s a budding intersection, right? And it’s, you know, there’s a very huge lack of precedent, which kind of motivates me because I would love to be part of a movement that sets the stage for future generations, and being the first generation to grow up under this kind of technology, I think myself and my peers have, you know, a special platform and special perspective that should be uplifted. So the fact that this community of people at Encode Justice exists, whose sole purpose is to do that and also empower each other, is really inspiring to me. The president of Encode Justice, Sneha Revanur, started this whole organization with a ballot measure in California addressing the proposed use of a recidivism algorithm in bail reform, which has also been proven to be very biased.
[00:36:22] Shreya: She started that movement, you know, so many… it was our fourth anniversary, I think, a week ago, which is so crazy. But, she started that movement and that just has expanded beyond what anyone could have imagined. And it’s just amazing to see, you know, people, youth so interested and invested in their futures, not just in the United States, but also globally, and this, you know, intersection of AI and policy is really important for the future. And that’s something that we recognize and that we empower each other to participate in. So that’s, of course, I think my main source of inspiration, and this is another source that I’ve, you know, hinted at earlier, is just the tradition of youth getting involved in activism.
[00:37:06] Shreya: It’s very cool to see, you know, years of history that, you know, we’re kind of walking in the footsteps of, and seeing youth in other movements, like the climate movement and the reproductive rights movement. I think when their perspectives are really empowered and valued in these spaces, it’s great to see that they’re making progress, especially also in the gun reform movement or the gun control movement.
[00:37:34] Shreya: I think spaces like that, have really shown the value and power that youth have. And I want to emphasize power because we’ve made an impact so far and I want [to] continue being in that movement and also, you know, translate that work into the legislative process because although we’re activists now eventually we’ll graduate from college and you know, we want to go to grad school and I’m excited to see that generation of youth activists, you know, grow into policymakers and hopefully, you know, legislators who are making change, you know, actively happen within government as well.
[00:38:09] Shreya: To me, a better future looks like a space where students feel safe in their schools first and foremost. And more generally that, you know, youth are valued in conversations that, that relate to their future and that, you know, impact their future greatly. I think obviously this conference is a great example of, you know, uplifting youth voices
[00:38:30] Shreya: and I want to help continue doing that throughout, you know, whatever space that I enter and I think for me, you know, a future that is more inclusive and that is more understanding of each other and also aware of different perspectives, those are all general ideas, but I think my main, you know, goal for the future is to really center youth and people who’ve experienced the things that we’re talking about in the, when we have conversations about regulating or, you know, doing something else that might change the outcome of whatever is happening.
[00:39:04] Shreya: Join Encode Justice. That’s an easy one. But yeah, I think just the idea of youth empowerment and the power of youth, that’s something that, you know, I think, I hope that Encode Justice has, you know, demonstrated. And I want to encourage more youth, especially, to join, you know, movements that they’re passionate about or gear their careers towards helping people because, you know, regardless of whatever space you’re in, you’re able to do that.
[00:39:33] Shreya: And also, you know, this is a message to parents and educators and just adults to include youth more meaningfully in conversations, not just as, you know, a placeholder. Or just to show that, you know, a youth is, you know, involved somehow, or that a young person is somehow involved. Yeah, I think, that it’s changed over the last few years, but I still think we could be more meaningful in enveloping youth into, you know, conversations and, you know, conversations about policy, especially because, that is, you know, of course the most like impactful part of this work.
[00:40:10] Shreya: So I just want to, you know, emphasize the fact that youth obviously, you know, are educated and we deserve to be included in conversations about policy that could potentially change our future, or that does change our future. Please do join Encode Justice. Our website is just EncodeJustice.org. You can also reach out to me at Shreya@EncodeJustice.org
[00:40:32] Shreya: if you’re interested in starting a chapter, just learning more about the movement. There’s this place for everyone in our community and whether you come from a political background or a technical background, we would love to have you and hear your thoughts on all of this work. And I also want to emphasize, you know, we don’t just do school surveillance work.
[00:40:52] Shreya: We work across the entire technology policy landscape. For example, we have an entire international chapter sector that focuses on, you know, how AI affects democracy abroad. And we have other projects related to police surveillance technology and data privacy legislation and AI education work, how, you know, AI should be better implemented in schools, and should facilitate learning, and a bunch of other things that, you know, are amazing and interesting.
[00:41:23] Shreya: So you should definitely join and check us out.
[00:41:29] Sophie: Hi, my name is Sophie Wang. I’m based in Minneapolis. I’m an artist, a zine maker, an educator, and I’m a speaker here at Data 4 Public Good. I’m here speaking on the algorithmic ecology. So the workshop is exposing algorithmic ecologies: community tools for resistance. So the algorithmic ecology is a framework and a tool that we developed at Stop LAPD Spying Coalition and Free Radicals.
[00:41:58] Sophie: So in collaboration, through a shared working group, the Data and Algorithms Working Group. And the algorithmic ecology is basically a four part framework that helps you map your fight against a particular algorithm or program that is causing harm in a community, or that you just want to have more context for.
[00:42:19] Sophie: And so, it started out with mapping the fight against a predictive policing program called PredPol, which is really central to kind of the story of the development of the framework. And the framework itself has four parts. So there’s the community impact, which is kind of the first layer at the bottom.
[00:42:37] Sophie: And then there is the operational layer, which is, you know, the algorithm itself or the program, the software, and then whoever is using or implementing it. And then the layer on top of that is the institutional layer. So that’s things like, who were the institutions that were involved in the creation of this algorithm, what are the institutions that funded the creation of it, that made it possible for it to be implemented. So in the PredPol case, just to talk about those layers, we’re starting with a community based in Skid Row, but also more broadly Black and brown communities in Los Angeles, so talking about the impact on those communities, which includes displacement, banishment, gentrification of communities, thinking about the ways that it exacerbates redlining. And then on the second layer, operationalizing it, the LAPD used this program called PredPol to put, you know, data into it to create these 500 by 500 foot squares that are called hotspots. And then those hotspots are places where they are predicting more crime will happen, and they’ll send more police officers out to those areas.
[00:43:49] Sophie: And then the layer on top of that, the institutional layer, helps us contextualize it: okay, if we are thinking about PredPol within its context, why was it created? Is it just here to find crime or to deploy police officers? No, we find that in its actual historical and institutional context, it was developed also to gentrify Los Angeles and quarantine the Skid Row area.
[00:44:16] Sophie: So, who are the academics who were involved in the creation of it, who are the city hall members who were involved in the implementation of it within LA beyond just the LA police department, and who are the nonprofits taking money to put a rubber stamp of approval on this program.
[00:44:36] Sophie: So those are the first three layers, and then the final, fourth layer is the ideological layer. What are the values at play that connect all of these different parts of the ecology? And what we really find is that this helps us understand that all of these campaigns and all of these fights are really, at their core, abolitionist. It’s not just about removing one program, because removing one program doesn’t make the harm disappear. It’s an ecology of functions, like any ecosystem: ecosystems are resilient, so if you remove just one thing, they will fill that niche with something else. We’re seeing this in LA now that PredPol has been discontinued: the LAPD has simply replaced it with data-informed community-focused policing and other forms of data-informed policing. So we really are seeing that the ecology is resilient, and in order to remove that harm you need to tackle and address all of the different parts of the ecosystem. And also,
[00:45:37] Sophie: that creates more targets for a campaign, for a fight: more ways for people to get involved, more areas for expertise to be relevant. So really, we created the algorithmic ecology both as a way to map a fight and as a way to communicate what you’re trying to do with your campaign in a visual diagram.
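If it helps to see the framework’s shape, here is one way the four layers might be sketched as a data structure, populated with the PredPol examples Sophie walks through. The class and field names are hypothetical; the actual Algorithmic Ecology is a community mapping tool and visual diagram, not software.

```python
from dataclasses import dataclass

# Hypothetical sketch: the real Algorithmic Ecology is a community
# mapping tool and visual diagram, not a piece of software.
@dataclass
class AlgorithmicEcology:
    ideological: list[str]       # values connecting every part of the ecology
    institutional: list[str]     # funders, academics, city hall, nonprofits
    operational: list[str]       # the software itself and whoever deploys it
    community_impact: list[str]  # harms felt on the ground

predpol_map = AlgorithmicEcology(
    ideological=["'math, not magic' objectivity rhetoric"],
    institutional=["academic partners", "city hall", "rubber-stamp nonprofits"],
    operational=["PredPol software", "LAPD patrols sent to hotspot squares"],
    community_impact=["displacement", "banishment", "gentrification", "redlining"],
)
```

Mapping a fight this way makes each layer a concrete list of targets a campaign can organize around, which is the point Sophie makes about creating more ways for people to get involved.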
[00:46:04] Sophie: There’s this quote from an LAPD spokesperson talking about predictive policing. He says, “It’s math, not magic. It can’t be racist.” And that’s one of the reasons that we as Free Radicals, which is an activist collective I’m a part of at the intersection of science and social justice, got involved in this fight: this use of the rhetoric of science and technology as objective or as value neutral.
[00:46:33] Sophie: And we really, really push against that, right? Because this technology and this use of data are being used to claim that things are objective, when of course we should ask: why is that data being collected? Who is making the decisions to collect this data? Who is that data being shared with?
[00:46:50] Sophie: And then what are the outcomes of that data being put into use in certain ways, for example in a predictive policing system? And so I think really what we want to push against is this idea that any kind of data or any kind of algorithm is neutral. All of these algorithms are not only not neutral in the way that they function, right?
[00:47:10] Sophie: I think oftentimes we hear this narrative of dirty data in, dirty data out, which is very true, right? If you’re using historically racist or biased data, like crime data, for example, and you put it into a system that’s going to tell you where crime is going to happen, it’s going to spit out an equally racist outcome. And it creates this ratchet effect: okay, then we’re going to deploy police more into one area, and then of course, if there’s more police there, they’re going to find more crime there, et cetera, et cetera. But what the algorithmic ecology really shows is that it goes beyond that, right?
[00:47:51] Sophie: It goes beyond just dirty data in, dirty data out, crime data leading to more deployment, leading to more finding of crime. It really says: okay, even if you could create a neutral data set, what is this algorithm here to do? This algorithm is here to find crime. This algorithm is here to create a class of people who have been disenfranchised and can be exploited, whose labor can be exploited.
[00:48:22] Sophie: And they can be removed from a certain area, so that area’s property and land can be taken by people who are going to use it for profit. So, at the end of the day, it’s not just about de-biasing the algorithm, or about making the data that goes in less racist so that the data that comes out is less racist, but really thinking about: what is this program or this algorithm?
[00:48:45] Sophie: Why does it function the way that it does, and do we want that function to exist at all? I first got involved with the campaign against PredPol that the Algorithmic Ecology framework and diagram came out of because I am from LA. Or, I guess, if you’re from LA, I’m from the Inland Empire.
[00:49:08] Sophie: I’m from LA County. But for people who aren’t from LA, I’m from LA. I did live in LA city proper while I was involved in this, but for me it was always very important that the work I was doing was local, and that the work I was doing with Free Radicals at the time, which was a lot of public education and political education around the politics of science and technology, be grounded and also useful and generative in a local context. So for me, it was really about this: I am in community with these people around me, so I need to understand the ways that what I’m talking about is actually being operationalized in my communities, so that I can be part of fighting against this harm, and so that we can all, because we’re all so deeply interconnected, be part of creating something that’s better, a world that’s better. You know? I have friends who are public high school teachers in Los Angeles, and their lives are directly affected by their students’ lives, and also, of course, by administration and by city decisions, including funding, which are impacted by things like how much funding goes to policing.
[00:50:32] Sophie: And then their students are directly impacted by that funding that goes to the police, and then by what the police actions are. So all of the people that I’m in community with are deeply impacted by all of these things, and being there for them is always really important to me. One of the main things that I want people to walk away with from this workshop, or from what I’m sharing, is that it’s always important to look beyond just the program, or just the software, or just the algorithm, or just the technology.
[00:51:06] Sophie: It’s always important to ask: why does this technology exist? Who makes it possible for this technology to exist? Who benefits from it, and who is harmed by it? Because if you just start with “how can I fix this technology to make it less racist or sexist or less biased?”, the real answer, the root answer, is never going to be within the actual technology itself.
[00:51:28] Sophie: It’s always going to be somewhere in the larger ecosystem of that technology. I think we’re seeing a real burgeoning of the use of AI in various forms, whether machine learning or large language models or even simple algorithmic automation of different things. And one of the things that’s often present in those conversations is who’s benefiting and who’s being harmed by these technologies.
[00:51:59] Sophie: But what’s often missing from those conversations is the impact on the actual people who are using this technology or whose lives are being displaced by it. So one of the things that I worked on last summer, with a professor in Orange County, California, was a story about longshore workers at the Port of Los Angeles being displaced by AI and automation.
[00:52:35] Sophie: And this is not a new story, but as AI develops and becomes a bigger part of the conversation, I’m constantly getting ads about AI integration in apps that I currently use, or ads for new forms of AI that I can implement into my daily workflow or my life. As this becomes a larger and larger industry, I think it’s very important for people to think not just about, oh, what’s the danger of using AI in my life and maybe it being inaccurate, which is a huge concern, but beyond that: what is the function of this AI? Why are people creating it? Who is funding it? And what do they want the world to look like? And then, how do you as the consumer want the world to look? What do I want the world to look like?
[00:53:26] Sophie: An action that folks who are listening to this can take is to just find out what’s happening in your local communities. I think that’s a very easy answer, but it’s also kind of hard to do, so I would encourage folks to think about getting involved and reaching out to local organizations that are already doing work that is relevant to your life.
[00:53:53] Sophie: And then I also think it’s important to take this lens to all of the things that you encounter, to ask questions about those things, and to not take it for granted that anything is just neutral. I think my inspiration for a better future comes from the fact that when I look outside, I can see trees, and the earth around us is so beautiful, and I want to be able to experience that for myself as I grow older, and for future generations to be able to experience the beauty and joy
[00:54:28] Sophie: of the world around us and the relationships that we have with each other. And that’s not possible when people are being deeply exploited by capitalism, or placed into carceral systems, or when we’re experiencing climate change at unprecedented levels. So what really drives me forward is just a deep appreciation and joy in the world and the people around me.
[00:54:53] Sophie: I think a better world is one where all of the things that make it impossible for my friends working in these kinds of communities, for example as educators, to do their jobs, and all of the funding that goes into those things, goes instead into equipping people to do things in relationship with each other.
[00:55:22] Sophie: So for example, one of the things we’ve talked about a lot at the conference already is different forms of surveillance technology. What would it look like if, instead of putting in vape detection and ShotSpotter and spending hundreds of thousands, if not millions, of dollars on those technologies, you spent that on making it so teachers didn’t have to spend their own money to buy classroom supplies? Or so that there are more teachers in schools and classrooms aren’t 40 kids? Or so that doctors actually have enough time to spend with each patient? Things like that.
[00:56:03] Sophie: First, I would like to plug Stop LAPD Spying Coalition. I think the research and the work that they’re doing, based in Skid Row out of LA Community Action Network, is so, so vital, and it’s part of a long history of work that they’ve been doing since 2010, and even further back than that.
[00:56:20] Sophie: And so you can go find them. I think they’re just @stopLAPDspying on Instagram and Twitter/X, or you can find them at their website, which I think is www.stoplapdspying.org. And then if you’re interested in reading a little bit more about Free Radicals, we’ve been on a bit of a hiatus since about 2021, but there’s still a lot of content on the blog, and that’s just at freerads.org.
[00:56:49] Sophie: I think I’ll just leave it there at the two org plugs, and if folks want to reach out to me, they can. I won’t plug my own personal stuff, since this is really coming out of the work that we did as these two collectives.
[00:57:06] Catherine: I’m Catherine Squires. I live in St. Paul, Minnesota, and I’m a writer, editor, and yoga teacher. I’m a speaker at this year’s Data 4 Public Good conference. I came to the Data 4 Public Good conference to be on a panel presenting the results of a nationwide survey of students and parents about how COVID-19 impacted the use of technology, and particularly surveillance technology, in online learning.
[00:57:34] Catherine: I was invited by Marika Pfefferkorn, as part of her work with the Midwest Center for School Transformation and the Dignity in Schools campaign, to help analyze the data that had been collected by the community researchers for the Dignity in Schools campaign. So we looked at the survey results, as well as the information from the focus groups and interviews, to analyze how parents and students experienced things like technological difficulties or trouble accessing technology while they were trying to navigate the world of online learning in the wake of the COVID shutdown.
[00:58:11] Catherine: I think this is a really important thing for everyone to be thinking about: the way that technology is ramping up in schools, particularly surveillance technologies. Many companies are promising to help keep kids safe, or even to address the spike in depression and other mental health challenges that we’re seeing in young people these days. But there’s actually no evidence that the kinds of surveillance technology they say will flag problematic behavior or cries for help actually prevent any of the things they claim to prevent. Instead, they make students who already feel overburdened by surveillance and punishment feel even less a part of the community.
[00:58:52] Catherine: Because school districts in the United States are locally controlled, there isn’t a lot of data yet on how many schools are using surveillance software like Gaggle, or Sentinel, or one that’s even called Bark, as if there’s a guard dog in the school watching for when kids are using social media when they’re not supposed to, or something like this.
[00:59:13] Catherine: But in our survey, which covered 29 states and had over 500 respondents, we found that 77 percent reported that their school was using some kind of surveillance technology to monitor student behavior and activity online. It’s not surprising to me that so many students and parents reported that this kind of technology was being used in their schools.
[00:59:36] Catherine: Unfortunately, since the spikes in school shootings and, you know, the wave of mental health problems that many students have faced in the last 10 years, many school districts are turning to technology as a cheap way, quite frankly, to try to mitigate harm and also to give parents a sense that they’re doing everything they can to protect students.
[01:00:00] Catherine: But even some of the companies that sell cameras and facial recognition software to schools admit these things don’t stop that kind of violence because you actually have to know the person and their face in order to screen them from coming into the school. And what we know is that unfortunately most school shooters are from the community.
[01:00:22] Catherine: They’re not outsiders who already have a criminal record or a history of gun violence. So these tools are probably giving people a false sense of security and safety, whereas building better relationships between students and the adults in their schools and in their lives could help with the communication that might actually identify a struggling student before they go down a dangerous path.
[01:00:48] Catherine: One thing that I’d like people to take away from our study is not only how prevalent these technologies are, but how little people know about what is happening with their students’ data and who else might be brought into the dragnet. So for example, if a student has a program like Gaggle monitoring the work they’re doing at school and their social media messages, that monitoring is happening on their school device, and it continues if they bring that school device home, or if they log into the school system using a sibling’s device or their parent’s device.
[01:01:24] Catherine: Let’s say they ran out of battery power and forgot their charger in their locker. Well, if they log in on their mom or dad’s computer using the school system, now mom and dad’s computer is being monitored. And so it’s a much wider set of people who are getting their information taken without their knowledge.
[01:01:46] Catherine: And so it’s really important for parents and students and advocates to ask school systems, districts, principals, and the people who do procurement to be transparent about the kinds of contracts they are signing with these third-party companies, and to give much more information up front about what students are actually consenting to when they say, I promise to use this device that the school gives me, because there is probably much more surveillance happening, not just at school but also in their home.
[01:02:18] Catherine: We didn’t cover in this study who the decision makers are, but we do have evidence, from some Freedom of Information Act investigations and from The 74, an online education magazine that’s been doing a lot of investigative work on the way surveillance is invading schools, that it’s usually at the district level or the department of education level.
[01:02:42] Catherine: But again, in the United States, school districts are so decentralized, our whole educational system is decentralized. So what one district is doing in Ramsey County, where we are, and what’s happening in Hennepin might be completely different. For example, Minneapolis Public Schools used Gaggle.
[01:03:01] Catherine: St. Paul Public Schools didn’t. So even just a city next door can be completely different, depending on what the school board, the superintendent, and other people decide to use with the different kinds of grants they get for student well-being. And indeed, in the Gaggle case, it was money provided during COVID to support students that was used to create the contract with Gaggle.
[01:03:25] Catherine: If people want to see the findings from this report, we’re finishing up the recommendations that are coming out of it, and it will be available on the Dignity in Schools campaign website in a few weeks. It’s really important for people to be thinking about these issues at a time when schools, like hospitals, are becoming more and more targeted by folks who are trying to do things like data ransom.
[01:03:49] Catherine: So unfortunately, Minneapolis Public Schools was the target of a ransomware attack, and the attackers were able to get student records. Some of those student records included the ways that students were being flagged by Gaggle and the documents that were flagged. This can include anything from a message between classmates to a school assignment or a journal reflection.
[01:04:13] Catherine: So very personal information was available, to the point where some of the things that were released after the district refused to pay the ransom identified students. And because these companies are able to use proprietary information and other veils to avoid saying exactly what they’ve got, what they store, how they store it, and how long they store it,
[01:04:39] Catherine: your data could be vulnerable. And again, the data of other people in your house could be vulnerable too, if the student is logging in on a different device. Then, if a ransomware attack or some other data leak happens, it could expose many more people than just the student. These are things we don’t like to think about, because we hope they won’t happen, but more and more schools are getting targeted by bad actors who know that schools don’t have the kind of money to pay for security systems, or that they’re using third parties, and that’s a backdoor to get into their data.
[01:05:13] Catherine: Some people might be familiar with the FERPA regulations, which are supposed to protect student and family data from the time a student is in school all the way up through college. But if you actually look at the court cases where people felt their FERPA rights were violated, there has never been a case where a school, a district, or even a department of education had to pay any damages.
[01:05:37] Catherine: So there are many, many loopholes that allow law enforcement and other actors to get access to student data, and I don’t think people understand that enough, that FERPA regulations don’t protect you. And they were written at a time when we didn’t have all the different kinds of electronic devices and data and cameras that we have in schools now.
[01:05:59] Catherine: So it really needs to be updated and strengthened if people want to have the kinds of privacy protections that they deserve. If people are concerned and want to find out what is actually happening in their schools, I would argue they probably need to ask multiple people in different roles at the school.
[01:06:18] Catherine: Sometimes these programs for surveillance are brought in under words like “student well-being” or “student support,” but it adds up to a new surveillance regime. It’s quite likely your student’s teacher does not understand, or even know about, the scope of the surveillance that might be happening in the school, because they may not have been briefed on it, or they may not have heard the whole story.
[01:06:45] Catherine: For example, we know that many teachers in Minneapolis Public Schools didn’t know that Gaggle had been implemented, or to what extent it was able to surveil students beyond the school day. In fact, most of the red flags that happened with the Gaggle system happened after school hours. So if you don’t get an answer from a teacher, go to administrators and school board members. You also might make a Freedom of Information request of the department of education or your local school board, because some of those contracts are never going to be in a simple, easy place to find online. And sometimes things can be decided in a consent decree or on a consent agenda in a board meeting, which means they’re not actually aired for debate.
[01:07:30] Catherine: So the paper trail can be very hard to follow. Ask lots of questions of different people, hopefully alongside other parents and concerned community members, so you can share the burden and not have to do it all by yourself. I’m inspired to do this work because I was brought into it in the early 2000s, when I met Marika Pfefferkorn while she was working on the school-to-prison pipeline.
[01:07:55] Catherine: Then, seeing the ways that these new computing programs and big data were being used to predict students failing and to predict students becoming criminals, she realized that the scope of the work had to shift, and the emphasis moved to the school-to-prison algorithm. And as a mother of a student with special needs, I know how much
[01:08:19] Catherine: more of a burden is on those students, as well as on Black and brown students, when they’re at school and being surveilled more than their white peers. So I have a personal stake in building a better way of supporting students without just resorting to data-rich programs that actually replicate so many of the biases that already exist in our school system.
[01:08:43] Catherine: A better future for our schools, for me, starts with the students and the people who care about them getting together, having time to dream about what a truly supportive, enriching school environment looks and feels like, and designing it from the ground up, because we know that so many of our school systems are struggling and not coming up with ideas for how to fix the problems that have plagued them for so long.
[01:09:13] Catherine: And I believe in public schools. I come from the public school system. I taught in public universities for over 20 years and I know how important education is. And I really want the people who are most impacted to be at the center of designing those futures. And that’s what this conference is really all about.
[01:09:34] Helen: Thank you for spending some time with us today. We’re just getting started and would love your support. Subscribe to Creativity Squared on your preferred podcast platform and leave a review. It really helps. And I’d love to hear your feedback. What topics are you thinking about and want to dive into more?
[01:09:49] Helen: I invite you to visit CreativitySquared.com to let me know. And while you’re there, be sure to sign up for our free weekly newsletter so you can easily stay on top of all the latest news at the intersection of AI and creativity. Because it’s so important to support artists, 10 percent of all revenue Creativity Squared generates will go to ArtsWave, a nationally recognized nonprofit that supports over 100 arts organizations.
[01:10:14] Helen: Become a premium newsletter subscriber or leave a tip on the website to support this project and ArtsWave. Premium newsletter subscribers will receive NFTs of episode cover art and more extras to say thank you for helping bring my dream to life. And a big thank you to everyone who’s offered their time, energy, encouragement, and support so far.
[01:10:36] Helen: I really appreciate it from the bottom of my heart. This show is produced and made possible by the team at Play Audio Agency. Until next week, keep creating.