Without Ethics, Digital Equity is Impossible: Three Questions Leaders Must Answer

Since the start of the pandemic, educators, leaders, and policymakers have undertaken substantial efforts to increase student access to devices and high-speed internet in an attempt to close the digital divide. However, improved digital access should not be conflated with digital equity: the notion that students not only have access to and ownership of the tools that best support them as learners, but also the opportunity to develop the skills and competencies required to leverage those digital resources within a system designed to sustain and deepen their education. A short-sighted focus solely on digital access continues to obscure the broader issue of digital equity.

Of particular concern, when it comes to education technology, ethics has largely gone unaddressed. Despite good intentions, schools and districts implement digital tools without considering the broader ramifications. This has become even more salient given the current social and political agendas driving education and the role that technology plays in both arenas. With anti-LGBTQ+ laws on the rise, women’s reproductive rights under attack, books being banned, and educators feeling silenced by their school boards, ethics could offer educators and leaders a pathway to better support students through a focus on digital equity.

As leaders and educators address the ongoing challenges of the school year and look to implement new technologies, three ethical questions can guide decision making to ensure that all students experience an education that helps them reach their full potential. Whereas the first two explicitly address critical ethical challenges associated with technology, the third raises a broader question about the types of opportunities available to all students.

Question #1: Are we building equitable learning environments if we fail to address the unintended consequences of the technologies that we put in schools?

At the end of his book, “Diffusion of Innovations,” Everett Rogers raises the issue of unintended consequences. He asserts that despite the best of intentions, new technologies have the potential to cause harm. Extending Rogers’ premise, Professor Yong Zhao explains that unlike in medicine, education never examines the “side effects.” For example, a reading initiative might result in an initial uptick in literacy scores, but the unintended effect could be that students grow to hate books.

From security systems that leverage facial recognition to AI-enabled platforms that promise to meet students’ varying needs, new technologies have flooded into schools. Although digital tools and data are often viewed as inherently neutral, they carry ethical ramifications. Sociologist Ruha Benjamin argues that technologies and their underlying algorithms have the potential to deepen disparities and perpetuate systemic inequities, creating what she terms the “New Jim Code.”

Consider the issues surrounding facial recognition. As the pandemic forced students online, higher-ed institutions (and some K-12 districts) invested in virtual proctoring systems. Students of color found that the camera could not accurately detect their faces without additional lighting. The Center for Democracy & Technology reported that these systems unfairly flagged students who might have atypical eye movements, motor tics, speech patterns, or the need to use assistive technology.

Similarly, AI-enabled learning platforms might operate under the guise of personalizing learning; however, they can also unfairly penalize language learners or students from different cultural backgrounds. Before the pandemic, students in several schools even staged walkouts to protest a “personalized learning” platform that was meant to improve their education. Although the initiative was intended to address digital equity by increasing access to technology that delivered individualized instruction, leaders failed to consider the unintended side effects of asking students to spend their days working relatively independently. As a result, the students said they felt as though they were not learning anything and that they hated school.

Action Step

Before implementing any new program or platform, ask the developers to explain the dataset driving their algorithms and whether they tested for bias. Even more importantly, question the side effects:

  • Does student engagement change?

  • Do particular subgroups of students seem adversely affected?

  • Are your students represented in the platform’s content and in the dataset driving its algorithm?

Question #2: Are we choosing the technology that best supports our students’ learning if it makes them feel as though they are under constant surveillance?

With increased virtual and hybrid learning, the lines between home and school have become blurred. Federal legislation, such as the Children’s Internet Protection Act, requires schools to monitor students’ technology use on district-issued devices and networks out of concern for safety, compliance, and community well-being. However, this becomes problematic when systematic monitoring turns into surveillance.

From Gaggle to GoGuardian, these tools purport to improve student safety, academic integrity, and the efficiency of technology monitoring. However, without careful consideration of what data are being collected and screened, and when, districts run the risk of making students feel as though they are learning in a state of constant surveillance.

Consider the conflict that emerged this fall when students and teachers discovered that their schools had adopted e-Hallpass, an app that tracks students’ movements throughout the building. Though heralded as a tool to increase safety and efficiency, its implementation raised a number of questions related to student data and privacy as well as the ethics of creating a surveillance state in schools. Opponents of the software argued that it would contribute to the over-policing of students with disabilities who may need additional breaks, LGBTQ+ students, and Black and Latino/a students who are often unfairly targeted.

A recent EdSurge article asserts that with “Don’t Say Gay” laws and abortion bans proliferating across parts of the country, student surveillance raises a host of new ethical concerns. In response, many students have already started shifting from school to personal devices, adopting encrypted messaging apps such as Signal, and even teaching themselves about everything from protecting their data to sex education. However, for students who do not have their own devices or rely on district-issued internet connections, there is no escaping the constant monitoring.

Action Step

Take a step back and go for a walk in your students’ digital shoes. The Stanford d.school Shadow a Student Challenge helps leaders build empathy and identify areas for improvement by experiencing school as their students do. A similar approach can be applied to students’ virtual lives. Work through all of the tasks and activities that students need to complete digitally each day, and then ask the following:

  • How many times were your digital interactions tracked or recorded?

  • Which of your messages (email, text, voice, etc.) were potentially scanned?

  • What options did you have to ensure your own privacy, and do your students have those same options?

Question #3: Are we ensuring that every student experiences an education that prepares them for success in a technology-rich world?

Educational technology has long promised that learning would become more dynamic, differentiated, and personalized. Unfortunately, failure to consider disparities in opportunity, representation, and bias can perpetuate long-standing inequities. Ensuring that technology-rich learning leads to greater equity requires explicit attention.

From creativity to computational thinking and media literacy to makerspaces, today’s students need the opportunity to develop the academic, cognitive, and non-cognitive skills required for success in a technology-rich, global society. And yet, despite a body of literature indicating that all students benefit from meaningful opportunities to learn with technology, disparities persist, especially for students of color, those who are learning English, those living in low-income communities, and those enrolled in remedial or lower-level courses.

Lack of resources, professional learning, technology support, and home access have all been cited as reasons for not providing students with meaningful learning opportunities. Even more concerning, adult bias about students’ technology use results not only in inequitable academic experiences but also in discriminatory disciplinary practices and the policing of behaviors. As Matthew Rafalow explains in “Digital Divisions: How Schools Create Inequality in the Tech Era,” the behaviors, experiences, and opportunities celebrated in predominantly white, affluent schools are often considered irrelevant or disruptive in other contexts.

Action Step

As students and teachers settle into the routine of a new school year, it is imperative that leaders pay explicit attention to ethics. In your next team, committee, cabinet, or faculty meeting, consider starting with these questions:

  • Is it ethical that all students may not have the same opportunity to develop the critical digital and media literacies needed to become informed digital citizens?

  • Is it ethical for students to have inequitable access to technology-rich learning experiences depending on their school or even their classroom teacher?

  • Is it ethical that student learning, performance, academic outcomes, and discipline could be determined by bias in an algorithm or dataset, or by the preconceptions of educators and leaders?

Wrestling with these ethical questions may create more complexity and ambiguity than answers, and there are multiple barriers that leaders and policymakers must address to move beyond digital access and towards digital equity. However, it’s important to remember that digital equity is an ongoing, iterative process for schools and districts. Open-source resources, like the TLA Digital Equity Guide, offer additional guidance and strategies to help leaders and educators develop tactical plans of action, no matter where they are on the journey towards true digital equity.

This blog is part of a series on digital equity. Read the first blog on digital privacy here.

About the Author

Beth Holland is a Partner at The Learning Accelerator and leads the organization’s work in research and measurement. She brings both a rigorous academic background and practical experience to the team’s research efforts.
