At Educause, a Push to Monitor Student Data Is Met with Concerns About Privacy and Equity


CHICAGO — Colleges are increasingly using Big Data to monitor students, control their access to information and set them on learning paths they may not have chosen, argues Chris Gilliard, a professor at Macomb Community College, who says the practices add up to “digital redlining.”

“I don’t think education is a predictive task,” said Gilliard, criticizing the use of data systems that allegedly forecast student success or failure. “I’m seeing a whole lot of wreckage in the process that people aren’t considering.”

Gilliard presented his critique on Wednesday at the annual conference of Educause, one of the largest gatherings of college technology leaders. The very tools he objected to were prominently on display here, in a sprawling exhibit hall of vendors and at numerous sessions touting the benefits of algorithms in higher education.

For instance, brightly colored booths throughout the exhibit hall promised to ingest all kinds of student information and spit out revelations for campus administrators. A director of innovation at the University of Texas at Dallas described in a presentation how his institution tests Echo Dots, which record sound data, in residence halls. And a representative from the University of Houston explained in a poster session how students can unlock the university’s campus app with their fingerprints or faces.

Plenty of parents, students and faculty may find all this data collection, analysis and application “creepy,” said Rob Curtin, director of higher education for Microsoft, during a conference session. But the implications may cross from mere discomfort into actual harm for students of color and other minorities on campus, limiting their learning potential without their consent.

“People think the harms are far off,” Gilliard said. “The harms are right now.”

The professor raised his concerns even as, elsewhere in the cavernous convention center, his hosts and the leaders of two partner associations prepared to promote a statement they published in late August imploring colleges to move “aggressively forward” in the use of statistical analysis and predictive modeling to recruit students, help them graduate and improve institutional efficiency.

Gilliard used part of his session to question the bold promise made in the statement’s title: “Analytics can save higher education. Really.”

Who is doing the saving? Gilliard asked. For what? From whom?

When it came to answering those queries, the professor said, “I don’t think this document did a good job.”

Calling Out Misuse

Digital redlining, as Gilliard defines it, comes in many forms. One is restricting access to information: denying students at community colleges access to academic journal subscriptions, or filtering websites with parental controls, a practice intended to block objectionable material but one that may also impede research on topics of legitimate scholarly interest. Another is digital surveillance, through tools that track eye movement while students read, document their paths across campus or, as some companies are trying to encourage, monitor their electrical brain activity while they sit in class.

Gilliard raised alarm about facial recognition systems in particular, partly because the technology has been found to fail people of color and transgender people. “The technology doesn’t ‘recognize’ you, it tells you who it thinks you are,” Gilliard argued.

Facial recognition technology also exemplifies how student data can escape college control and be sold or borrowed in unsettling ways, Gilliard said. He cited the case of Duke University researchers recording student faces to build a facial recognition dataset that made its way into tools the Chinese government uses to monitor Muslim minorities.

Indeed, college administrators should be vigilant in guarding student privacy against non-consensual uses, especially by ed tech vendors with whom institutions sign contracts, said Scott Schafer, university privacy officer at the University of Pennsylvania, in a different session.

“Data is the new gold for vendors,” he explained. “I view it as my duty to make sure we’re signing the right privacy protection schedules.”

But Gilliard and other educators worry about how colleges use student data internally, too. For example, data tracking can be used to steer students away from the fields that most interest them or delay their progress by prompting advisers to repeatedly sideline them in remedial classes, said Pam Eddinger, president of Bunker Hill Community College.

Chris Gilliard used this version of a popular meme to critique the use of student data at colleges.

Striking a Balance

Leaders from Educause, the National Association of College and University Business Officers and the Association for Institutional Research were prepared for their joint statement to face skepticism, they said, explaining that they’re not blind to the adverse effects that analytics can have on some students.

Indeed, the document includes language about avoiding “pernicious discrimination and bias” and “inappropriate sharing and use of data.” And in the most recent edition of its magazine, Educause invited instructional designer Autumm Caines and digital scholarship librarian Erin Glass to explore issues related to digital redlining. In their op-ed, the authors support permitting students to opt out of data collection and propose that professors incorporate data-privacy lessons into their digital literacy efforts.

Still, association representatives defended practices like placement testing, which can help decrease the time it takes students to complete their degrees, and data analysis that identifies which students most need emergency aid or additional academic advising.

“There’s an obligation to support students as learners, and you can’t not use data,” said Lindsay Wayt, director of analytics at the National Association of College and University Business Officers, in an interview with EdSurge.

There were signs throughout the conference that educators are trying to strike a better balance with their data practices. Several speakers mentioned how, for the first time, privacy ranked as one of the top 10 IT concerns that Educause members have, according to the organization’s annual survey. The chief information officer at Dartmouth College, Mitchel Davis, explained how he is working to rewrite campus contracts about what tech vendors can do with student data. At UT Dallas, where the Echo Dots are being tested, administrators collaborated with student government to better inform students about how their data may be collected.

“Privacy is a big deal,” said Kishore Thakur, senior director of innovation, cloud and platform services at the university.

Gilliard said his criticism stems from his acknowledgement that education technology tools are, to some extent, necessary.

“There are lots of people who cannot afford to not use these things,” he told the large audience assembled for his session. “If you’re the child of a billionaire, it doesn’t matter what you do in school. If you’re a child growing up in certain areas, it really matters. That technology poses some capabilities that are going to be important.”

But the way ed tech is developed, tested and launched on campus should change, he said. His vision: out with the “move fast and break things” mentality famously championed by Facebook, and in with small-scale trials that identify and fix mistakes using the insight of the people who will be most affected.

“We have been burdened with some definitions of ‘innovative’ that benefit Silicon Valley and companies and not humans,” Gilliard said. “How we figure out the potential harms, who is involved in the process to begin with, all of those things can mitigate some of those issues.”


