
Neely Center Efforts Turn Design Codes into Policy

The Center’s upstream design solutions are propelling Minnesota legislation to improve social media platforms.

04.01.24
Neely Center’s Social Media Design Codes are finding favor with policymakers in the U.S., U.K., and Kenya. [iStock]


The idea of censorship is anathema to American democracy and free speech. Yet there is a deep divide over social media platforms, long considered vulnerable to bad actors and manipulation. Calls for state and federal regulation clash with outcries against government interference.

“While people agree about the need to protect users from harmful content, people disagree about what specific content qualifies as ‘harmful,’” said Ravi Iyer, managing director of the Neely Center for Ethical Leadership and Decision Making. “Allowing either companies or the government to define what is or is not harmful creates justifiable concerns about the abuse of such power.”

Nathanael Fast, director of the Neely Center and holder of the Jorge Paulo and Susanna Lemann Chair in Entrepreneurship, contends that is why regulating content moderation is an inadequate solution.

“We have an estimated 5 billion social media users generating a never-ending stream of content with constantly evolving patterns of communication. That makes it difficult, if not impossible, to stay on top of hate speech and other types of content we might not want,” Fast said.

Experts at the Neely Center have determined through research and collaboration that “upstream design” is a more effective way to improve social media’s impact on society. The Design Code for Social Media was developed as an answer for leaders in government and the digital world. A suite of recommendations covering privacy, usage, and content consumption, the design codes give users agency, empowering individuals to control what and with whom they engage, while making it more difficult for individual actors to push harmful content to thousands or millions of users.

“[This] allows companies to mitigate the harmful effects of their product without being forced to make decisions about what anyone is or is not allowed to say,” Iyer explained.

Policymakers are taking notice.

On February 28, the Minnesota state legislature introduced bill HF 4400, incorporating several elements of the design codes: providing default privacy settings that users can change if they choose; optimizing content based on users’ “expressed preferences” rather than what they merely engage with; and setting daily engagement limits for new users to reduce the reach and influence of “superspreaders.”
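To make these provisions concrete, the following is a minimal, hypothetical sketch in Python of how a platform might encode the first and third elements; the ranking element is sketched further below. None of these names, fields, or thresholds come from the bill or any real platform — all are illustrative assumptions.

from dataclasses import dataclass, field

# Hypothetical illustration of two HF 4400 design-code provisions.
# Every class name, field, and limit here is invented for this sketch.

@dataclass
class AccountSettings:
    # Provision 1: privacy-protective defaults users may later change.
    profile_public: bool = False
    discoverable_in_search: bool = False

@dataclass
class Account:
    age_days: int
    posts_today: int = 0
    settings: AccountSettings = field(default_factory=AccountSettings)

NEW_ACCOUNT_AGE_DAYS = 30      # assumed cutoff for a "new" account
NEW_ACCOUNT_DAILY_POSTS = 20   # assumed daily engagement limit

def may_post(account: Account) -> bool:
    # Provision 3: daily engagement limits for new accounts curb the
    # reach of would-be "superspreaders" before they build influence.
    if account.age_days < NEW_ACCOUNT_AGE_DAYS:
        return account.posts_today < NEW_ACCOUNT_DAILY_POSTS
    return True

Under these assumed defaults, a new account starts private and hidden from search, and may_post(Account(age_days=3, posts_today=20)) returns False until the next day.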

Our design code is ‘content neutral,’ meaning that it does not require a platform to make decisions about which content to allow or amplify. Instead, it anchors on signals from users as to what they prefer and/or consider to be higher quality content. 

— Ravi Iyer
Managing Director, Neely Center for Ethical Leadership and Decision Making

If passed, the Prohibiting Social Media Manipulation Act would take effect in July 2025, protecting Minnesota residents from unwanted online experiences.

Fast is pleased the codes are having their desired policy impact.

“Our approach at the Neely Center is to create tools at the intersection of ethics and technology for leaders across all sectors in society,” Fast explained. “In the case of the design code, we are especially interested in improving government and company policies; so, the fact that policymakers and tech companies are beginning to use our design code tells us we’re on the right track.”

Iyer is slated to appear at various state hearings to advocate for the bill. His earlier collaboration with Minnesota Attorney General Keith Ellison on the state’s initial report on protecting children from the harmful effects of technology helped spur the bill’s introduction.

Instead of a social media platform’s algorithmic ranking system dictating what content appears in an individual’s feed — based on whatever they happen to engage with — the most effective safeguard is to optimize for what the individual says they want to see. Under this type of redesign, recommended by the Neely Center and included in Minnesota’s legislation, platforms would compete on value and quality rather than on capturing engagement.
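As a rough illustration of this distinction — not the Neely Center’s or any platform’s actual ranking formula — consider two scoring functions over a hypothetical feed candidate; every signal name and weight below is an assumption.

from dataclasses import dataclass

@dataclass
class Candidate:
    # Hypothetical signals a platform might attach to a feed item.
    predicted_clicks: float   # behavioral engagement prediction
    predicted_dwell: float    # behavioral engagement prediction
    stated_quality: float     # user's explicit "worth my time" rating
    follows_author: bool      # explicit, user-chosen connection

def engagement_score(c: Candidate) -> float:
    # Status quo: rank by whatever users happen to engage with.
    return 0.7 * c.predicted_clicks + 0.3 * c.predicted_dwell

def preference_score(c: Candidate) -> float:
    # Redesign: rank by what users say they want. Content neutral:
    # the score reads user signals, never the content itself.
    score = 0.8 * c.stated_quality
    if c.follows_author:
        score += 0.2
    return score

Under engagement_score, a provocative post that users click but regret outranks a post they actually value; preference_score reverses that ordering using only user-supplied signals.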

Minnesota is just one example of the Neely Center’s advocacy among government entities seeking new legislative models for protecting children, and the design codes have the potential to scale worldwide.

Overseas, the Neely Center presented the design code to numerous people within the U.K. government and the U.K.’s communications regulator (Ofcom), as they have been working on the new Online Safety Act. Ofcom also has a history of measuring user experiences online, similar to the Neely Center Indices, and there is much to be learned methodologically across both efforts. As Ofcom implements the Online Safety Act, Iyer will advise them on conceptual frameworks and empirical approaches to understand, measure, and improve outcomes for people in digital communications.

Two of the Neely Center’s partners are also having fruitful dialogues in Africa and South Asia about how the design codes can be integrated into government policies. Build Up has engaged with the Kenyan and Ghanaian governments, while Search for Common Ground recently organized a gathering in Sri Lanka of global civil society organizations working to combat technology-facilitated gender-based violence (TFGBV).

These are powerful validations of the impact of the Neely Center’s work.

“It’s encouraging that we’re making a difference both nationally and internationally. I have a strong conviction that universities can and should contribute thought leadership that can benefit all,” Fast shared. “This is especially true with AI, as we are all affected by the tools we’re building, so the more we can create a global conversation that makes us all smarter, the better off we’ll be.”

Recently in Washington, D.C., Iyer gave an invited talk before the Federal Trade Commission about how upstream design could counter the disempowering effects of what the government agency identifies as “dark patterns” — tricks and traps that mislead consumers into making decisions they might not otherwise make if they had access to full information.

The Neely Center organized and convened the Design Solutions Summit 2024 at Search for Common Ground’s D.C. headquarters. The Summit gathered academics, technologists, civil society members, policymakers, and influencers who are focused on design solutions for improving online civic discourse. Efforts are heightened given the scale of global elections in 2024, in which at least 64 countries, representing nearly 49% of the people in the world, will head to the polls.

As the debate rages on over what is or isn’t considered free speech, government entities and digital leaders are searching for other ways to protect the online community. According to Fast, outlawing design choices that lead to negative and harmful outcomes is proving to work.

“It's important to note that we’re testing the impact of design changes that are made over time. Through our Neely Ethics and Technology Indices, we are tracking user experiences of negative and positive content longitudinally, and we’ve already started to see the improvements that occur when individual companies make design changes.”