The use of digital educational technologies has surged during the pandemic, bringing students and faculty a range of new options for learning. In so many ways, this is a great thing, and we've covered some examples of 萝莉社-Dearborn professors being inspired to use new techniques as a result of their remote teaching experiences. As more learning goes online, though, colleges and universities are also having to wrestle with some emerging privacy and data ethics issues, according to The Hub for Teaching and Learning Resources' Autumm Caines.
The conversation about digital data ethics in higher education largely mirrors how we think about privacy and data in our daily lives. As you know, many digital platforms, like social media sites and websites, track lots of data about our behavior, and companies use that data for various purposes. We've gotten pretty used to some of these practices, like how tracking allows businesses to show you products you might want based on your web searches. Sometimes, though, this kind of data mining can lead to real harm. For example, your data can be stolen, or it might unintentionally reveal something about you to another person that you'd prefer stayed private.
In a higher education environment, learning technologies are also collecting data about their users, who are primarily students. The kind of data can vary quite a bit by platform. A learning management system like Canvas, for example, has built-in metrics that allow faculty to see which course pages a student viewed, track completed assignments and grades, and view students' "last activity" and "total activity" in the course. But say a professor wants you to download an app or visit a website for a one-off assignment. That app or website might very well be collecting a broader set of data about your identity and behavior. And here lies one of the murky ethical issues, says Caines. In this case, the professor is more or less requiring the student to download third-party software. But it's the student who has to check the box and agree to the terms of service, which likely requires sharing potentially sensitive data, and thus carries some risk. So the question is: Is compelling a student to share their data in this way ethical or fair?
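For the technically curious, those Canvas numbers are exposed through its public REST API. Here's a minimal sketch of reading the "last activity" and "total activity" fields for a course's students; the host, course ID, and token are placeholders, and pagination and error handling are omitted for brevity.

```python
# A minimal sketch of pulling the "last activity" / "total activity" fields
# from the Canvas REST API. BASE_URL, COURSE_ID, and API_TOKEN are
# placeholders, not real values.
import requests

BASE_URL = "https://canvas.example.edu"  # your institution's Canvas host (assumption)
COURSE_ID = 12345                        # hypothetical course id
API_TOKEN = "your-canvas-access-token"   # generated under Account > Settings in Canvas

resp = requests.get(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/enrollments",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    params={"type[]": "StudentEnrollment"},
)
resp.raise_for_status()

for enrollment in resp.json():
    name = enrollment["user"]["name"]
    # Canvas reports these per enrollment; total_activity_time is in seconds.
    print(name, enrollment.get("last_activity_at"), enrollment.get("total_activity_time"))
```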
Sometimes, Caines says, even seemingly more benign data sets, like Canvas metrics, can be used in harmful ways. The most high-profile example came last year, when Dartmouth College's medical school accused students of cheating on remote exams. The college claimed that Canvas metrics revealed students were accessing course pages containing information relevant to their exam while they were taking their tests. "The problem is Canvas metrics were never designed to be used for this kind of forensic investigation," Caines says. "They're simply not that precise, and Canvas even says their metrics can't be reliably used in this way. If you leave a Canvas page open or have one open on another device – and sometimes even if it's closed – the app might still be pinging the server trying to maintain its connection. So it can look like you're active in the system, even if you're not." Eventually, Dartmouth came to this conclusion too, dropping its case and apologizing to the students.
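Why are those numbers so easy to misread? A toy model helps show the ambiguity Caines describes. This is purely an illustration, not Canvas's actual implementation: if an idle-but-open tab sends periodic keep-alive pings, the server-side log can look just like a student actively working.

```python
# A toy model (not Canvas's actual implementation) of why server-side
# "activity" logs are ambiguous: an idle-but-open tab that sends periodic
# keep-alive pings can look just like a student actively clicking around.
import random
from dataclasses import dataclass, field

@dataclass
class SessionLog:
    requests: list = field(default_factory=list)  # request timestamps, in minutes

def active_student(duration: float = 60.0) -> SessionLog:
    """A student actually working: irregular but frequent requests."""
    log, t = SessionLog(), 0.0
    while t < duration:
        t += random.uniform(0.5, 3.0)  # a click every couple of minutes
        log.requests.append(round(t, 1))
    return log

def idle_open_tab(duration: float = 60.0, heartbeat: float = 5.0) -> SessionLog:
    """A tab left open: the client pings on a fixed keep-alive interval."""
    log = SessionLog()
    log.requests = [heartbeat * i for i in range(1, int(duration / heartbeat) + 1)]
    return log

for label, log in [("active student", active_student()), ("idle open tab", idle_open_tab())]:
    # From the server's point of view, both sessions look "active" all hour.
    print(f"{label}: last activity at minute {log.requests[-1]}, "
          f"{len(log.requests)} requests logged")
```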
Other times, the risks of harm come from the now ubiquitous practice of selling data. Caines says the recent poster child case for this involves Naviance, a software platform used by 10 million high school students every year to research colleges and submit college applications. Not surprisingly, Naviance also collects a lot of student data, which it then, through its sister company, Intersect, uses to sell universities targeted recruitment and ad campaigns. (Both Naviance and Intersect are owned by the ed tech company PowerSchool, which is in turn owned by private equity firm Vista Equity Partners.) But an investigation by The Markup, a nonprofit big data watchdog organization, found this data is being used in ethically problematic ways. Using the ad and recruitment platform, college admissions offices can target students based on their location, grades, the majors they're interested in – and also their race. The Markup found multiple cases of universities deliberately targeting only white students.
So what's the best way for universities to navigate this increasingly complex data landscape? Caines says in some situations, the approaches can be relatively straightforward. If a professor is asking students to use an app for an assignment, they could also offer an alternative assignment for those who are uncomfortable with the app's terms of service. Students can also consider using alias email accounts for these kinds of activities, which offers some, though hardly foolproof, protection (one simple version of this is sketched below). For broader issues, like how an institution uses data to recruit students or enforce academic standards, Caines says we'll likely need to have deep, ongoing conversations that include a wide range of constituencies at the university, so we can reduce the risks of harm to vulnerable groups. Right now, in fact, the three U-M campuses are considering what such a data policy might look like. Moreover, Caines says, it always helps to be transparent with students and parents about what our practices are.
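On the alias idea, one low-effort version is "plus addressing," which providers such as Gmail support: mail sent to user+anything@gmail.com still lands in user@gmail.com's inbox. Whether your provider supports this is something to verify first, and the addresses below are hypothetical.

```python
# A sketch of alias emails via "plus addressing." Note the real address is
# trivially recoverable from the alias, so this only raises the bar slightly,
# matching the "hardly foolproof" caveat above.
def plus_alias(address: str, tag: str) -> str:
    local, domain = address.split("@", 1)
    return f"{local}+{tag}@{domain}"

print(plus_alias("student@example.edu", "polling-app"))  # student+polling-app@example.edu
print(plus_alias("student@example.edu", "quiz-site"))    # student+quiz-site@example.edu
```

A side benefit: because each tool gets its own tag, it's obvious which service leaked or sold your address if spam starts arriving at one of the aliases.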
One other bit of wisdom from Caines: Remember that the fast-growing and increasingly profitable ed tech industry isn't just targeting students; it's targeting us as institutions. Colleges and universities are some of the biggest consumers of educational technology, and we can be savvy about what we choose to bring into our classrooms. On one hand, Caines says we're not helping our students if we don't introduce them to the latest tools and technologies that are relevant in the workforce, simply because they carry some risk. But we can also choose to pass on enticing, heavily advertised products that overpromise and underdeliver. One trendy selling point nowadays, for example, is "engagement metrics," where platforms attempt to score how engaged students are with learning via an amalgam of clicks, page views, and time spent on a page. The problem, Caines says, is those metrics paint a pretty superficial picture of student engagement, and aren't all that useful to skilled instructors who are already tracking this through other, more nuanced instructional practices. She says there's also a general danger of becoming too reliant on data. "It can definitely become reductive. If we can only measure certain things, then we run the risk of only valuing those things, and that's problematic for a bunch of different reasons. Ask anyone who's been in education a long time and they'll tell you there are many things that are extremely valuable that we simply can't quantify."
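To make that "engagement metrics" critique concrete, here's a toy version of such a score. The formula and its weights are invented for illustration; vendors don't publish theirs. Even so, it shows how a clicks-plus-time amalgam can rank an idle open tab above a focused student who downloaded the readings and worked offline.

```python
# A hypothetical "engagement score": a weighted amalgam of clicks, page
# views, and time-on-page. The weights are assumptions for illustration only.
def engagement_score(clicks: int, page_views: int, minutes_on_page: float) -> float:
    return 0.5 * clicks + 0.3 * page_views + 0.2 * minutes_on_page

print(engagement_score(clicks=4, page_views=3, minutes_on_page=180))  # idle tab: 38.9
print(engagement_score(clicks=12, page_views=8, minutes_on_page=15))  # focused session: 11.4
```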
###
Story by Lou Blouin. Want to learn more about data ethics in digital education? The Hub recently held a webinar featuring Bonnie Stewart, an expert in this area, and you can find it on the Hub blog.