With all the controversy around generative AI these days, one question some students and researchers are raising is: can I just cite AI as a source? The answer is, definitively, no. There are numerous reasons why citing AI would be a self-defeating and pointless exercise. Here, I will focus on four:
1. Generative AI is not a stable reference.
2. Generative AI is not a scholarly source.
3. Generative AI is a tool, not an authority.
4. Citing AI would render the entire exercise of citing sources pointless.
Generative AI is not a stable reference. When you cite a book or an article, you are telling the reader that this is where you got the information, and where they can check its validity or learn more. A typical reference, such as a published book, scholarly article, webpage, or government report, is a stable text: published on a certain date, in a certain place, or available at a certain web address, and so on. The reader should be able to go to that source and verify that it contains the information cited.
Generative AI bots like ChatGPT and the rest do not have this stability. If you ask ChatGPT a question and then I ask it the same question, we will not get the same answers. In fact, many AI bots give you the option to generate a different answer right there! Because an AI bot does not reliably give the same information twice, it literally cannot serve as a reference.
Generative AI is not a scholarly source. By a scholarly source we typically mean an academic text that has gone through some kind of peer review, editing, or fact-checking to ensure its validity. When you write a prompt and get a generated response from an AI bot, the text you see is brand new and has never been reviewed by anyone. It is simply impossible for generated text to have the level of quality control expected of academic sources.
Generative AI is a tool, not an authority. If you do want to use a chatbot, Google's AI Overview, or the like to find information, do so with caution. Treat it like a tool, not like an authority. Consider that before the current wave of AI products came along, Google's search algorithm was already, at base, a not-so-different technology: you enter a prompt, and it responds, not with a string of likely words, but with a ranked list of websites. When you go to those websites or articles and find useful information, you cite those sources; you are never expected to cite the Google (or Bing, or whatever) search engine that helped you find them.
In the same way, if you get information on any subject from an AI bot, get it to tell you the sources of that information. Then go to those sources and read them: first, to make sure they actually exist, and second, to find out what they actually say. Do not rely on an AI summary; these are not reliable, and at best they can only approximate the information you need. The actual sources are the authority you cite; the AI is just a tool, and does not merit citation.
These three reasons alone should be enough to make clear why citing AI as a source can never make sense. But if you want another reason, here is a very serious one: citing AI would render the entire exercise of citing sources pointless. An AI source is completely unaccountable: no one can independently check what it did, in fact, tell you. That means a writer who is allowed to cite AI could make any claim whatsoever and then attribute it to AI. For instance:
"Seventy-five percent of left-handed people develop brain cancer (ChatGPT)!"
"Anthony Kalamar is the greatest writer of the Twenty-first Century (Google AI)!"
See? Students (for example) citing AI in their papers would no longer have to do any research whatsoever; in fact, they wouldn't even have to bother using AI. They could just make up facts, statistics, and so on out of whole cloth and claim to have gotten them from AI. And no one could ever prove they were lying.
The entire point of citing references for your information is to create a reliable structure of relations between texts, in which the true sources of statistics, arguments, and evidence can be tracked down and evaluated by readers. Because generative AI bots are not stable, accountable, or authoritative sources of information, they cannot be part of such a reliable structure, and so no, you can't cite AI as a scholarly source.