
Oct. 29, 2024

Ask the expert: Technology regulation proves key issue in election

This is part of Michigan State University’s “Ask the Experts: 2024 Election Issues” series where experts answer questions on specific and relevant issues that affect people’s day-to-day lives. This one focuses on policies for technology regulation.

The past few years have seen major technological changes — from the integration of generative artificial intelligence, or AI, in our daily lives to advanced robotics and automation. As a result, questions abound concerning regulation of this new tech and, while government leaders have introduced some policies, there’s a great deal of uncertainty in terms of how much the government will step in to regulate these growing and highly integrated innovations.

Anjana Susarla is the Omura-Saxena Professor of Responsible AI in Michigan State University’s Broad College of Business.

As the presidential candidates prepare for Election Day — now just days away — it’s clear that technology regulation is shaping up to be a big issue for both Vice President Kamala Harris and former President Donald Trump. And regardless of who wins, the new president is likely to face mounting pressure to confront questions of technology regulation throughout their presidency.

Anjana Susarla is the Omura-Saxena Professor of Responsible AI in Michigan State University’s Broad College of Business. She is an expert in information systems and artificial intelligence and, here, she answers questions on the different ways Trump and Harris could approach technology regulation. 

Responses are excerpts from an article published in The Conversation.

How do Harris and Trump differ on AI policy?

One area where there are stark differences between the candidates is when it comes to recognizing and addressing algorithmic harms from the widespread use of AI technology. AI affects your life in ways that might escape your notice. Biases in algorithms used for lending and hiring decisions could end up reinforcing a vicious cycle of discrimination. For example, a student who can’t get a loan for college would then be less likely to get the education needed to pull herself out of poverty.

At the AI Safety Summit in the U.K. in November 2023, Harris spoke of the promise of AI but also the perils of algorithmic bias, deepfakes and wrongful arrests. President Joe Biden signed an executive order on AI on Oct. 30, 2023, that recognized that AI systems can pose unacceptable risks of harm to civil and human rights and individual well-being. In parallel, federal agencies such as the Federal Trade Commission have carried out enforcement actions to guard against algorithmic harms.

By contrast, the Trump administration did not take a public stance on mitigating algorithmic harms, and Trump has said he wants to repeal Biden’s AI executive order. In recent interviews, however, Trump has noted the dangers of technologies such as deepfakes and the security challenges posed by AI systems, suggesting a willingness to engage with the growing risks of AI.

What actions have the Trump and Biden-Harris administrations taken on regulation?

The Trump administration signed the American AI Initiative executive order on Feb. 11, 2019. The order pledged to double AI research investment and established the first set of national AI research institutes. It also laid out a plan for AI technical standards to promote the use of trustworthy AI in the federal government.

The Biden-Harris administration has tried to go further. Harris convened the heads of Google, Microsoft and other tech companies at the White House on May 4, 2023, to secure a set of voluntary commitments to safeguard individual rights. The Biden administration’s executive order contains an important initiative to probe the vulnerability of very large-scale, general-purpose AI models trained on massive amounts of data. The goal is to determine the risks hackers pose to these models, including the ones that power OpenAI’s popular ChatGPT and DALL-E.

How have the Trump and Biden-Harris administrations addressed antitrust?

Antitrust law enforcement — restricting or conditioning mergers and acquisitions — is another way the federal government regulates the technology industry. The Trump administration’s antitrust record includes its attempt to block AT&T’s acquisition of Time Warner: a federal judge eventually allowed the merger after the Justice Department under the Trump administration sued to block the deal. The Trump administration also filed an antitrust case against Google focused on its dominance in internet search.

Biden signed an executive order on July 9, 2021, directing federal agencies to enforce antitrust laws to counter the anticompetitive effects of dominant internet platforms. The order also targeted the acquisition of nascent competitors, the aggregation of data, unfair competition in attention markets and the surveillance of users. Additionally, the Biden-Harris administration has filed antitrust cases against Apple and Google.

The Biden-Harris administration’s 2023 merger guidelines outlined rules for determining when mergers can be considered anticompetitive. While both administrations filed antitrust cases, the Biden administration’s antitrust push appears stronger in its potential to reorganize or even break up dominant companies such as Google.

How could their approaches differ on cryptocurrency?

Late in his administration, Trump tweeted in support of cryptocurrency regulation. Around the same time, the federal Financial Crimes Enforcement Network proposed regulations that would have required financial firms to collect identifying information about the owners of cryptocurrency wallets to which their customers sent funds. The regulations were not enacted.

Trump has since shifted his position on cryptocurrencies. He has criticized existing U.S. laws and called for the United States to be a Bitcoin superpower. The Trump campaign is the first presidential campaign to accept donations in cryptocurrency.

The Biden-Harris administration, by contrast, has pursued regulatory restrictions on cryptocurrencies through the Securities and Exchange Commission, which has brought a series of enforcement actions. The White House also opposed the Financial Innovation and Technology for the 21st Century Act, a bill favored by the cryptocurrency industry that aimed to clarify how cryptocurrencies are regulated.

What actions could come next on data privacy?

Across the U.S., several states have tried to pass legislation addressing aspects of data privacy. At present, there is a patchwork of statewide initiatives and a lack of comprehensive data privacy legislation at the federal level. The lack of federal data privacy protections is a stark reminder that while the candidates are addressing some of the challenges posed by developments in AI and technology more broadly, a lot still remains to be done to regulate technology in the public interest.

Biden’s AI executive order calls on Congress to adopt privacy legislation, but it does not provide a legislative framework for doing so. The Trump White House’s American AI Initiative executive order mentioned privacy only in broad terms, calling for AI technologies to uphold “civil liberties, privacy and American values,” but it did not say how existing privacy protections would be enforced.

Overall, the Biden-Harris administration’s efforts at antitrust and technology regulation seem broadly aligned with the goals of reining in technology companies, protecting consumers and reimagining monopoly protections for the 21st century. That appears to be the chief difference between the two administrations.

 
