Illustration: Andrea D’aquino

Information schools sit at the intersection of computing technology, data and people. Here at UW–Madison, we work diligently to ensure that humans are at the heart of our classes and research. This is especially important when grappling with the many controversies involving social media, AI and big data, including issues surrounding privacy, cybersecurity, intellectual property and disinformation.

Take, for example, this question: Are social media companies responsible for protecting their users from targeted disinformation? That’s at the core of a recent U.S. Supreme Court case I’ve been following for my research. In Murthy v. Missouri, two states and several individuals sued the Biden administration, claiming that a number of federal agencies — including the FBI, the Department of Homeland Security and the Cybersecurity and Infrastructure Security Agency — communicated with social media companies to remove or demote content based on concerns about its legality, public health effects, terrorist threats and election interference. The plaintiffs alleged that these communications created unreasonable, coercive pressure on the companies and thus violated the First Amendment.

Alan Rubel is a professor and the director of the Information School in the School of Computer, Data & Information Sciences. His research fields include information ethics, policy and law.

Yes, this is a case about people and their rights, but to understand it you also need a fundamental knowledge of the computational technologies, large troves of data and artificial agents at play. This includes how malicious actors are able to deploy AI-controlled bots to spread misinformation and exploit social media algorithms to target audiences, as well as how federal agencies are able to collect information about those activities.

With that foundation, it is then possible to discuss the bigger questions of the case: how to discern disinformation, whether social media companies are responsible for stopping it, and what the proper scope of government action is in addressing malicious information. Ultimately, the Supreme Court concluded that the plaintiffs did not have standing to bring the lawsuit and so never addressed the merits of Murthy v. Missouri. But in the iSchool, these are the questions we are preparing students to grapple with every day.

Versions of this case are likely to return, and so are other issues at that same intersection of computing technology, data and people. Just as the fall semester began, the U.S. Department of Justice indicted two employees of Russia Today for a covert, $10 million effort to create and distribute Russian-government content to U.S. audiences. In a separate case, the DOJ seized a raft of internet domain names that had been used as part of foreign malign influence campaigns. And the CEO of Telegram was indicted in France for allowing criminal activity on the app. In some sense, these cases are about technology, but as in Murthy v. Missouri, the fundamental issues are social and human.

In the College of Letters & Science, we extol the value of a broad liberal arts education. As faculty and instructors, we prepare students to analyze and evaluate complicated, real-world problems like these by teaching critical thinking, contextual understanding and problem solving. The humanities and social sciences are absolutely crucial in addressing key issues that arise in the context of computation, data and people. Our aspiration is that students across L&S, like my colleagues in the iSchool, will use their broad and deep understandings from across disciplines to engage with these kinds of questions.
