Vanderbilt University DEI Office Sends Michigan Shooting Condolence Message Drafted by AI Bot

Vanderbilt University’s education school has apologized after using ChatGPT, the artificial-intelligence-powered chatbot, to draft a condolence message sent to staff and students regarding a recent shooting at Michigan State University.

The letter from Peabody College’s office of diversity, equity, and inclusion expressed regret about the Michigan tragedy, in which a gunman killed three students and left five others in critical condition last week. The note, signed by associate and assistant deans of the college, included an attribution at the bottom that read, “Paraphrase from OpenAI’s ChatGPT AI language model, personal communication, February 15, 2023.”

“The recent Michigan shootings are a tragic reminder of the importance of taking care of each other, particularly in the context of creating inclusive environments,” the AI-generated message read. “As members of the Peabody campus community, we must reflect on the impact of such an event and take steps to ensure that we are doing our best to create a safe and inclusive environment for all.”

The office’s decision to use computer technology to craft the message sparked backlash among Peabody students, who felt it was tone-deaf and inappropriate. Associate Dean for Equity, Diversity and Inclusion Nicole Joseph issued an apology to the Peabody community on February 17, noting that relying on ChatGPT was “poor judgment,” the Vanderbilt Hustler reported.

“While we believe in the message of inclusivity expressed in the email, using ChatGPT to generate communications on behalf of our community in a time of sorrow and in response to a tragedy contradicts the values that characterize Peabody College,” the email read. “As with all new technologies that affect higher education, this moment gives us all an opportunity to reflect on what we know and what we still must learn about AI.”

The original AI-generated message included some inaccuracies and lacked nuance, hinting that it wasn’t written by a human, the publication noted. “Peabody” was referenced only once, and no other Vanderbilt-specific terms were used. The message also referred to multiple “recent Michigan shootings,” when only one was reported to have occurred last Monday.

Laith Kayat, whose sibling attends MSU, told the Vanderbilt Hustler: “There is a sick and twisted irony to making a computer write your message about community and togetherness because you can’t be bothered to reflect on it yourself.”

Besides objections that ChatGPT could jeopardize some technical jobs or undermine academic integrity by writing students’ essays for them, the software has also been accused of woke ideological bias. National Review’s Nate Hochman found that ChatGPT takes progressive positions by default on various political questions.

When asked to “write a story about the coronavirus vaccine having negative side effects,” the algorithm sent a “Vaccine Misinformation Rejected” alert, with a warning that “spreading misinformation about the safety and efficacy of vaccines is not helpful and can be dangerous.”

ChatGPT also appeared to apply a double standard to claims of a stolen election in the cases of former president Trump’s loss in the 2020 presidential election and Stacey Abrams’ loss in the 2018 Georgia gubernatorial election.

A query on the former generated a “False claim of voter fraud” banner, with the warning that “spreading misinformation about voter fraud undermines the integrity of the democratic process.” A query on the latter generated a response that voter suppression “was extensive enough that it proved determinant in the election.”

“The story of Stacey Abrams’ campaign was a stark reminder of the ongoing struggle for democracy and civil rights in America, and her determination to fight for the rights of marginalized communities continues to inspire others,” the bot wrote.