Media Tip Sheets

AI Ethicist Addresses Safety and Oversight Concerns

Thursday, March 30, 2023, By Christopher Munoz

Artificial intelligence is advancing at a rapid pace, and some top researchers are calling for a pause. An open letter issued by the Future of Life Institute argues for a six-month pause on the training of AI systems more powerful than GPT-4, or a government-imposed moratorium if all key labs will not comply voluntarily. The letter’s stated goal is to develop shared safety protocols, overseen by independent experts, that would ensure these systems are safe beyond a reasonable doubt.

Baobao Zhang is a political science professor at Syracuse University’s Maxwell School of Citizenship and Public Affairs. She is also a senior research associate with the Autonomous Systems Policy Institute, specializing in the ethics and governance of artificial intelligence. She answered some questions about the state of AI research and the concerns raised by the open letter.

The letter refers to an “out-of-control race” to develop technology that no one can predict or reliably control. How significant is this concern?

There is a race between major tech companies and even smaller start-ups that are trying to release generative AI systems, including large language models, text-to-image models, and multimodal models that work with several different types of media. The main concern is that these models are being deployed across different settings without sufficient safety audits and guardrails. For example, earlier this year, we saw an early version of Bing’s chatbot powered by ChatGPT threaten, emotionally manipulate, and lie to users. More recently, we have seen many people fooled by synthetic images (e.g., the Pope wearing a stylish puffer jacket) generated by Midjourney, a text-to-image AI system. Because these generative AI systems are relatively general-purpose, it’s much harder for those developing or deploying them to know what risks they could pose.

The letter also raises a number of ethical concerns about what we should allow machines to do. Do you feel there is enough ethical oversight at the companies where this technology is being developed?

I don’t think there is sufficient ethical oversight at the companies where these technologies are being developed. Given the economic pressures that these companies face, internal AI ethics teams may have limited power to slow or stop the deployment of AI systems. For example, Microsoft just laid off an entire AI ethics and society team that was supposed to make sure its products and services adhere to its AI ethics principles. At this point, I think ethical oversight should come from governmental regulation and public scrutiny. The European Union’s Artificial Intelligence Act is a step in the right direction because it scales regulatory scrutiny with risk. Nevertheless, we need to rethink how to classify risk when it comes to more general-purpose AI systems, where some applications are high-risk (e.g., generating political news content) and some are low-risk (e.g., generating a joke for a friend).

What could a six-month pause on AI experimentation accomplish, and can we expect that enough governments and researchers would abide by that to make an impact?

I agree that we need to slow down the development and deployment of powerful generative AI systems. Nevertheless, a six-month pause on AI experimentation is not particularly helpful by itself. We have to consider longer-term technical and governance guardrails for the development of more general-purpose AI systems. Furthermore, how can we ensure that AI developers abide by the six-month moratorium? At a minimum, we would need a scheme to monitor how these AI developers use computing resources, or a whistleblower protection program for those who want to disclose that their employer is violating the moratorium.

What should AI researchers consider as they push forward with new technology, and is there anything the general public should keep in mind as they see the headlines?

AI researchers should consider working with social scientists, civil society groups, and journalists as they develop new models. It’s critical that we study and anticipate how powerful AI systems can impact society before we deploy them. It’s a confusing time for the general public because there is expert disagreement about whether we are developing AI systems that pose an existential threat to humanity. But there is expert consensus that generative AI could be hugely impactful, if not disruptive, to how we work and relate to each other now and in the near future. One of the risks the open letter noted is the proliferation of “propaganda and untruth.” Harms from misinformation and disinformation are not new, but generative AI would allow bad actors to greatly scale and personalize their campaigns.

To request interviews or get more information:

Chris Munoz
Media Relations Specialist
cjmunoz@syr.edu
