Don’t be surprised by AI chatbots creating fake citations
Podcast | Marketplace Tech
Publisher | Marketplace
Media Type | Audio
Categories Via RSS | News, Technology
Publication Date | Apr 13, 2023
Episode Duration | 00:07:04

By now, many of us are familiar with chatbot “hallucinations” — the tendency of artificial intelligence language models to make stuff up. Lately, there have been reports of these tools getting creative with bibliographies. For instance, last week The Washington Post reported on the case of a law professor whose name showed up in a list of legal scholars accused of sexual harassment. The list was generated by ChatGPT as part of a research project, and the chatbot cited as its source a March 2018 Washington Post article that doesn’t exist. People have taken to calling these fantasy references “hallucitations.” Marketplace’s Meghan McCarty Carino recently spoke with Bethany Edmunds, a teaching professor at Northeastern University, about why this is happening. Edmunds says this kind of result is to be expected.