Submitted by Sydney Brogden
It is an exciting time in technology! Emerging tools and technologies in Artificial Intelligence (AI) are making waves across library service areas and demanding the attention of students and professionals everywhere.
These tools have the capacity to directly affect AskAway, and have already started to. Some service providers have reported requests from patrons seeking help tracking down resources recommended to them by generative AI chatbots like OpenAI’s ChatGPT or Google’s Bard. Users can feed a prompt into these tools asking for a list of peer-reviewed articles. Theoretically, this would expedite the research process and leave only the work of tracking down the full text of the recommended articles. However, these technologies have not yet matured enough to do this reliably, and there are examples of ChatGPT providing blatantly false information to users. The following example illustrates what are referred to as AI ‘hallucinations’. You can learn more about AI hallucinations in this blog post from Seattle University Library.
During a recent AskAway interaction, a patron asked for help locating full text access for this reference:
"The African American Dream: A Conceptual Framework" by Nikitah Okembe-RA Imani, published in the Journal of African American Studies, Vol. 13, No. 2 (June 2009), pp. 133-147.
The formatting looks convincing, and a little digging shows that the journal is real (with valid volume and issue numbers) and that the author is a real scholar. However, no paper exists with this combination of information. The real information in the fake citation has been pieced together in a way that matches the prompt the user fed into the AI tool. This demonstrates why the name ‘hallucination’ fits: the AI has produced a result that does not really exist, but that looks convincing to the end user.
This can be particularly frustrating, as a patron may feel that they have discovered the perfect resource to strengthen their research project or assignment, only to be told that it does not actually exist. These interactions can play out in a variety of ways. Sometimes patrons are upfront about their request for help finding references they retrieved from an AI; they may believe that someone with more search expertise should be able to find the elusive resource. Other times, the involvement of an AI tool is not mentioned. This difference can shape an AskAway service provider’s experience of the interaction and send them on a wild goose chase for a resource that does not exist.
So, what can we do as AskAway service providers when asked for help with AI-generated references?
- Use a search engine to verify ready reference requests
  - For ready reference questions, use a search engine to confirm that the resource the patron is interested in really exists before spending time trying to track it down in a library system
  - Once you have confirmed that it is a real resource, switch back to showing the patron how to navigate to it via the library, or request an interlibrary loan as needed
- Communicate generously with frustrated patrons
  - Recognize that these tools produce results that seem convincing, and can often seem uniquely well suited to the patron’s needs
  - Asking a patron directly whether they retrieved the reference from ChatGPT or another AI tool may elicit defensive responses
  - It may be more effective to ask the patron for background information (e.g., “Can you tell me more about how you found out about this resource?”) rather than questions that could seem accusatory (e.g., “Did you get this from ChatGPT?”)
  - Acknowledge frustration and validate as appropriate
  - Offer support in redirecting the search to the library and relevant databases, or provide guidance on search terms by conducting a reference interview
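For service providers comfortable with a little scripting, the verification step can also be run against a scholarly metadata index instead of a general search engine. Below is a minimal Python sketch that builds a lookup URL for Crossref’s public REST API, whose `query.bibliographic` parameter accepts free-form citation text; this is an illustrative sketch of one possible workflow, not an AskAway-supported tool, and the helper function name is our own.

```python
from urllib.parse import urlencode

# Public Crossref REST API endpoint for searching scholarly metadata.
CROSSREF_API = "https://api.crossref.org/works"

def crossref_lookup_url(citation_text: str, rows: int = 5) -> str:
    """Build a Crossref bibliographic-search URL for a pasted citation.

    Crossref's `query.bibliographic` parameter takes free-form citation
    text (title, author, journal, year) and returns the closest matches.
    """
    params = {"query.bibliographic": citation_text, "rows": rows}
    return f"{CROSSREF_API}?{urlencode(params)}"

# The suspect citation from the interaction described above:
url = crossref_lookup_url(
    "The African American Dream: A Conceptual Framework "
    "Journal of African American Studies 2009"
)
print(url)
```

Opening the generated URL in a browser returns JSON results that can be scanned for a matching title; if nothing close comes back from the publisher metadata, that is a strong hint the citation was hallucinated rather than merely hard to find.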
Questions or Suggestions?
Do you have any suggestions to add based on your experience or any questions? Let us know.