Computer Science, the Liberal Arts Way

Summer 2022

Covering the walls of McGregory 314 this spring, colorful Post-it notes bore statements like “I feel useless” and “I value playing well over winning” and “I don’t like not being in control.” This affinity diagram represents one of the group projects for the course Human-Computer Interaction (COSC 480), taught by Assistant Professor Nick Diana.

Assistant Professor of Computer Science Nick Diana in front of his students’ affinity diagrams. Photo by Mark DiOrio

The class has two objectives, Diana explains: “The first is understanding that human beings have certain strengths and weaknesses, and machines also have certain strengths and weaknesses. How can we build systems that complement the strengths and weaknesses of these two kinds of agents in a way that’s productive — that’s better than just a human being alone, or better than just a machine alone?” Secondly, “How do we do that in a way that is human centered? We want to create technology that actually helps people. It is not just easy to use — that’s the baseline — but it’s meaningful, it’s joyful to use … not destructive.”

Affinity diagrams are a method for gleaning high-level insights from user interview data, Diana explains. In the example pictured here, the students were interested in the emotions of competitive gamers. The researchers interviewed their subjects before, during, and after playing video games, taking detailed notes about the gamers’ emotional states at each stage. The students then looked for commonalities across their interviews and created a hierarchy of data based on insights from those common threads. “The idea behind this model is, if you have users doing or saying the same things, then that’s probably important.”

The findings were unexpected, Diana says. The students found that the emotion the competitive gamers felt after winning a match wasn’t happiness or enjoyment but relief. “So this is abatement of negative emotions rather than seeking of positive emotions, which is really interesting.”


After the interviews, the students developed a solution and went back to the field several times to ask users for feedback. The affinity diagrams stayed on the walls throughout the semester so students could refer back to the data as they were developing opportunities for positive change. By the end of the iterative process, the students “have something that they can be pretty sure people actually want,” Diana says. “It’s a more reliable way of designing useful, meaningful technology.”

The final solution was a “match review” interface that gamers could access between matches. “The idea is that an algorithm would churn through their gameplay data and look for common errors, then the user could watch short video clips on how to improve in the future,” Diana explains. The solution is relatively straightforward, he adds, “but this group found that, at least for competitive gamers, it was impossible to separate their emotions from their performance.” The better the gamers performed, the better they felt, so the students focused on ways to improve how users felt about their performance. “It also emphasizes some key insights [the students] found through their user research,” Diana says, “namely that these kinds of gamers care a lot about the nitty-gritty details of their performance and only have the mental bandwidth to process this kind of feedback after the match is over.”

He views the class as a bookend for the students’ computer science education. Because it is an upper-level course taken by upperclass students, they’ve completed their core liberal arts education and have spent their last couple of years becoming computer science experts. “My course asks, ‘How do we circle back to that liberal arts education? How do we apply those skills in my domain of expertise?’” he says. “I think the course injects some humanity into computer science, and hopefully they’ll take that with them when they build new technologies … so they can do so in a responsible and human-centered way.”