I’ve been employed as a “helper” for twenty years, and my skill with technology derives entirely from a service perspective. I’ve never had formal training in computers, computing, instructional design, server administration, or dozens of other skills I use every day as an academic technologist. Instead I’ve applied the critical skills from my undergraduate liberal arts education to learn on the job, gaining a deep understanding of technology, its benefits, and its challenges through the experience of supporting people. As a result, my approach to technology is deeply pragmatic in a way that puts me at odds with many in my field. I’ll consider paying attention to virtual and augmented reality, for example, when the headsets don’t make people sick, and when someone can show me — without merely asserting that “it’s different this time” — how the technology will solve any of the problems educators and learners actually face.
After a decade of watching my colleagues be seduced, again and again, by the Next Big Thing, I am tired of hearing from those who can afford to ignore how deeply embedded racism, misogyny, and other forms of bias and hate impact our algorithms, our software, our “smart” cars, our automated personal surveillance assistants. And I’m tired of watching our colleges and universities willingly and freely sign over — often without consent or notification — the large data sets needed to develop and “improve” these platforms so they can be sold back to us at ever-increasing cost. Eighty-four percent of information security leaders are White, and 83% of them are men; the people most impacted — Black people, queer and trans people, poor people, disabled people, undocumented people, people who are all of these and more — aren’t even in the room. Throwing our hands into the air and declaring ours a post-privacy world doesn’t help; it only abdicates responsibility. What does help is resisting. I want to develop models by which our departments, our schools, our institutions of higher education can resist, and reshape, and reform prevailing data practices.
To do that, I need several things: a more detailed understanding of how institutions of higher education and their technology divisions function; familiarity with the existing literature on surveillance, privacy, data ethics, and data ownership; the ability to conduct both qualitative and quantitative research; and the writing skills to contextualize findings for academic and public audiences alike. Consulting with and advising individual faculty on these issues is rewarding, and it reaches more students than I could on my own, but my current work does not address the damage being done at the institutional level. Ideally, the skills I learn and refine through this program will help me move up into a leadership role within academic technology, sideways into a support role focused on data ethics, or out to an adjacent organization engaged in research, scholarship, or action on ethics and technology in higher education.