Generative artificial intelligence (GAI)

Keith Nockels, Academic Librarian (medicine, clinical sciences, healthcare), Library and Learning Services, University of Leicester

Before the column itself, an announcement. I have decided it is time for me to step down from writing this column, not least because I discover I have been doing it since 2005! It is definitely time, I think, for someone new: someone with a fresh approach and perspective, and new ideas about what to cover.

So, this is the last column from me. If you are interested in taking the column on, I would encourage you to contact the editors at Newsletter.HLG@cilip.org.uk. If you would like to chat to me about how I have done it, please contact me at khn5@le.ac.uk; but it does not need to be done like this, and I am sure there are other ways to do it!

I don’t think I expected to do this for so long, but I have enjoyed it. Some of the columns have found their way into material for my students. Some topics have come from things raised by those students, or discussed within the academic areas I support. Thank you for reading the column, for however many of the last nearly 20 years you have been doing so; I do hope it has been useful and helpful to you.

So, a hot topic for the last column: generative artificial intelligence (GAI). I’ve included some introductory sites, some general GAI tools, and some material about its use in healthcare. Some of the things below are from my own experience. No doubt the list will quickly go out of date, but for now I hope it is a useful starting point if this is an area you have been meaning to explore.

The University of Leicester’s policy and guidance

The University of Leicester has a policy about using GAI in teaching, learning and assessment. This was developed over the summer, and academics are making students aware of it. If you are in higher education, perhaps your institution is doing the same. Ours is a PDF at https://le.ac.uk/-/media/uol/docs/policies/quality/ai-policy.pdf.

I was interested in how good things like ChatGPT and Copilot are as a source of health information, so included this in the library class with our first-year medical students, with the approval of academic colleagues. I use the class to check they are aware of the GAI policy, and to tell them about the series of guides the Library has produced. We also talk about the hazards of sharing information with a GAI tool, such as whether it ends up being shared with others – not a good idea if the information is under copyright, is personal data, or is something like course material that you should probably not share.

I have done a video covering the use of generative AI as a source of health information, which is at: https://tinyurl.com/2hjxeuw7

This covers all those issues, maybe in a bit more detail.

Information about generative AI

JISC National Centre for AI blog

https://nationalcentreforai.jiscinvolve.org/wp/: A good place to start for discussion of current issues, and an awareness of what those issues are, in HE and FE.

JISC Innovation – AI

https://www.jisc.ac.uk/innovation/artificial-intelligence: More detailed guides, including a primer, and demos of some of the tools.  

JISC AI community

https://www.jisc.ac.uk/get-involved/artificial-intelligence-community: Join JISC’s community, which conducts discussions using a Jiscmail email list.

SCONUL list of resources

https://www.sconul.ac.uk/knowledge-hub/ai/resources-and-links/: Guides, and links to lists of resources. SCONUL also have an AI and information literacy Community of Interest Group, which you can register an interest in by email (see https://www.sconul.ac.uk/cpd-hub/community-of-interest-groups/).

ITHAKA product listing

https://sr.ithaka.org/our-work/generative-ai-product-tracker/: A list of products marketed for post-secondary education. Ithaka S+R “… generate action-oriented research for institutional decision-making and act as a hub to promote and guide collaboration across the communities we serve. With our partners, we design and evaluate projects that make higher education, scholarly communication, and cultural collections more accessible to diverse populations”.

Other directories of AI tools are Future Tools, https://www.futuretools.io/, and There’s an AI for that, https://theresanaiforthat.com/

I don’t say anything in this column about image generators like DALL-E that use GAI, but the directories above will list such tools.

With generative AI tools like the ones in the previous section, what you get depends on how you phrase the question. A number of UK HE libraries have LibGuides or similar on generative AI, and those might have information about this “prompt engineering”, like the one from Sheffield Hallam University at https://libguides.shu.ac.uk/ai/gettingstarted or this from Newcastle University, https://www.ncl.ac.uk/academic-skills-kit/information-and-digital-skills/ai-literacy/prompts/. These sites will have information on other aspects of AI, and a search like generative ai site:ac.uk in Google will find them, along with information on institutions’ approaches to AI in assessment.

My grateful thanks to colleagues in the University of Leicester for sharing details of many of the above in the Teams space where our Special Interest Group on AI meets!

Generative AI tools

First, some “chatbot” type sites that use generative AI.  I have not included sites like Google Maps that might use GAI as part of doing something else.

ChatGPT

https://chatgpt.com/: I first covered generative AI in classes with medical students in October 2023, and used this. Jisc has a guide at https://nationalcentreforai.jiscinvolve.org/wp/2023/03/14/getting-started-with-chatgpt/, although it might now be out of date.

Claude

https://claude.ai/: I have just signed up to this with my Google account. You give a phone number and are texted a code, which you input to finish setting up the account. You then give your full name and what you would like to be called, and you get a link to Anthropic’s Usage Policy, along with a note explaining that conversations are checked by an automatic abuse detection system, which (I assume) refers things to humans if need be. Then you see:

“Claude may occasionally generate incorrect or misleading information, or produce offensive or biased content.

Claude is not intended to give advice, including legal, financial, & medical advice. Don’t rely on our conversation alone without doing your own independent research.”

This applies to all similar tools and I have been saying as much to students. 

Claude has a limited free plan.

Copilot

https://copilot.cloud.microsoft/: On my work device, this links itself to my corporate Microsoft access. In this year’s class with medical students, I used this instead of ChatGPT. We are satisfied that Copilot meets our requirements for the security of data shared with it. You can also access it via Bing, and a page of Bing search results has a Copilot tab.

Gemini

https://gemini.google.com/: This is Google’s. I assume it is responsible for the “AI Overview” that can appear above search results in Google. That overview has a note at the end, certainly if you ask a medical question:

“This is for informational purposes only. For medical advice or diagnosis, consult a professional. Generative AI is experimental”.

You can log in with a Google Account.

Others which I have explored less are:

Elicit

https://elicit.com/: Pricing is at https://elicit.com/#Pricing; there is a free plan with more limited features.

Grok

https://x.ai/: This is related to the social media platform X which I have just removed myself from.

Julius

https://julius.ai/: For data analysis.

Perplexity

https://www.perplexity.ai/: I have been telling students that generative AI like ChatGPT or Copilot is not a search engine and does not search live. But Perplexity is a search engine that uses AI.

ScholarAI

https://scholarai.io/: The free plan only allows limited use (see https://scholarai.io/pricing).

Finding references through generative AI

As well as using GAI to find information about a health condition, I wondered if students might use it to find actual references. So, I asked ChatGPT for 10 articles on the use of cannabis as a treatment for epilepsy.

My prompt was “Find me ten research articles published between 2020 and 2024 that discuss cannabis as a treatment for epilepsy”. I am not saying this is a good prompt, and have read more about prompt engineering since I wrote it!

ChatGPT gave me ten references. I then asked for links or DOIs. Seven of the ten DOIs did not work at all; the other three resolved, but to a different article. I tried to find the articles in PubMed by searching for each article title, and found none of them, although some of the authors do write on the topic, and at least some of the journals exist!
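If you want to screen a batch of GAI-supplied DOIs before checking each one by hand, a quick first pass is a syntax check. Below is a minimal sketch in Python; the pattern is the one Crossref has recommended for matching modern DOIs. Note that this is a plausibility test only: a well-formed DOI string can still fail to resolve, or resolve to a different article, which is exactly what happened here.

```python
import re

# Crossref-recommended pattern for matching modern DOIs.
# A match means the string *looks* like a DOI; it does not mean
# the DOI resolves, or that it points to the article cited.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/[-._;()/:A-Za-z0-9]+$")

def looks_like_doi(doi: str) -> bool:
    """Return True if the string is syntactically plausible as a DOI."""
    return bool(DOI_PATTERN.match(doi.strip()))

# Illustrative strings (the first is the DOI of the DOI Handbook).
print(looks_like_doi("10.1000/182"))    # True
print(looks_like_doi("doi not given"))  # False
```

Strings that pass this check still need to be resolved via https://doi.org/ and compared against the cited title and authors, as the DOIs ChatGPT produced were often well-formed but wrong.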

I have asked Claude the same question and am part way through analysing its list of ten articles. Signs after the first two are not good: the year and volume of the journal match, and the journals exist, but the starting page number falls mid-article, and it is not the right article. The authors do seem to write on the topic.

I think all this demonstrates that the tools are not searching the literature as we would, but matching language patterns, which leads them to generate references as plausible-looking patterns of text.

However, I used the same prompt to ask Copilot the same thing. I got just four articles, although two of them were the same article (one with a publisher link, the other with a DOI). The links all worked. All were from 2020–2021 (a quick PubMed search suggests there are newer ones). Copilot asked if I wanted additional articles; I said yes, but got the same ones again. Three of the four were open access, though I am not sure whether the fourth is. That fourth, subscription, one appears not to be in PubMed, interestingly.

We are aware of tools like ResearchRabbit and Litmaps, which use GAI, and we will be evaluating them soon. How tools like these fit into the systematic review process is not clear to me, although my first thought is that what they do may not be entirely systematic or reproducible, and that, if used, they should be reported alongside other ways of finding references, such as citation searching. I have been thinking recently (as you may know if you are on the Jiscmail list I asked on!) about “automation tools”, which, if used, you need to declare in your PRISMA flow diagram. These may include the Cochrane RCT Classifier, which uses AI and which you can train to identify RCTs and, by extension, trials that are not RCTs. I notice that Covidence (I have access only through an invitation) has a link to articles “auto marked as ineligible”, and I wonder if this is using GAI.

AI and health research

A quick look in a database like Medline shows there is a lot of work being done on uses of AI in health (for example, interpreting images or ECGs), and on its use in health education. A search of your favourite database will be more useful than a list here! Or sign up for an alert from biomed.news. I receive Thomas Krichel’s Biomedical Librarianship alert, which each time lists some articles about the use of generative AI in health research, and some about how reliable the health information from GAI tools is. This alert is listed at https://biomed.news/reports, along with another, from Farhad Shokraneh, on AI in evidence synthesis.

Thank you for reading this column, however long you have been reading it for!

Keith Nockels