Media Tip Sheets

AI Ethicist Addresses Safety and Oversight Concerns

Thursday, March 30, 2023, By Christopher Munoz

Artificial intelligence is advancing at a rapid pace, and some top researchers are calling for a pause. An open letter issued by the Future of Life Institute argues for a six-month pause on the training of AI systems more powerful than GPT-4, or government moratoriums if all key labs won’t comply. Its stated goal is to develop shared safety protocols, overseen by independent experts, that would ensure systems are safe beyond a reasonable doubt.


Baobao Zhang

Baobao Zhang is a political science professor at Syracuse University’s Maxwell School of Citizenship and Public Affairs. She is also a senior research associate with the Autonomous Systems Policy Institute, specializing in the ethics and governance of artificial intelligence. She answered some questions about the state of AI research and the concerns raised by the open letter.

The letter refers to an “out-of-control race” to develop technology that no one can predict or reliably control. How significant is this concern?

There is a race between major tech companies and even smaller start-ups that are trying to release generative AI systems, including large language models, text-to-image models, and multimodal models that work with several different types of media. The main concern is that these models are deployed across different settings without sufficient safety audits and guardrails. For example, earlier this year, we saw an early version of Bing’s chatbot powered by ChatGPT threaten, emotionally manipulate, and lie to users. More recently, we have seen many people being fooled by synthetic images (e.g., the Pope wearing a stylish puffer jacket) generated by Midjourney, a text-to-image AI system. Given that these generative AI systems are relatively general-purpose, it’s much harder for those developing or deploying AI systems to know what risks these AI systems could pose.


The letter also raises a number of ethical concerns about what we should allow machines to do. Do you feel there is enough ethical oversight at the companies where this technology is being developed?

I don’t think there is sufficient ethical oversight at the companies where these technologies are being developed. Given the economic pressures these companies face, internal AI ethics teams may have limited power to slow or stop the deployment of AI systems. For example, Microsoft recently laid off an entire AI ethics and society team that was supposed to make sure its products and services adhered to its AI ethics principles. At this point, I think ethical oversight should come from governmental regulation and public scrutiny. I think the European Union’s Artificial Intelligence Act is a step in the right direction because it scales regulatory scrutiny with risk. Nevertheless, we need to rethink how to classify risk when it comes to more general-purpose AI systems, where some applications are high-risk (e.g., generating political news content) and some are low-risk (e.g., generating a joke for a friend).

What could a six-month pause on AI experimentation accomplish, and can we expect that enough governments and researchers would abide by that to make an impact?

I agree that we need to slow down the development and deployment of powerful generative AI systems. Nevertheless, a six-month pause on AI experimentation is not particularly helpful by itself. We have to consider longer-term technical and governance guardrails for the development of more general-purpose AI systems. Furthermore, how can we ensure that AI developers abide by the six-month moratorium? At a minimum, we would need to create a scheme to monitor how these AI developers use computing resources, or a whistleblower protection program for those who want to disclose that their employer is violating the moratorium.

What should AI researchers consider as they push forward with new technology, and is there anything the general public should keep in mind as they see the headlines?

AI researchers should consider working with social scientists, civil society groups, and journalists as they develop new models. It’s critical that we study and anticipate how powerful AI systems can impact society before we deploy them. It’s a confusing time for the general public because experts disagree about whether we are developing AI systems that pose an existential threat to humanity. But there is expert consensus that generative AI could be hugely impactful, if not disruptive, to how we work and relate to each other now and in the near future. One of the risks the open letter noted is the proliferation of “propaganda and untruth.” Harms from misinformation and disinformation are not new, but generative AI would allow bad actors to greatly scale and personalize their campaigns.

To request interviews or get more information:

Chris Munoz
Media Relations Specialist
cjmunoz@syr.edu

