ChatGPT has many useful applications, but it cannot do everything. The things it is good at, it does very well; where it fails, the results can be disastrous. For this reason, among others, we recommend against using ChatGPT as a source of information.
Reasons:
1. ChatGPT doesn't "know" anything. It simply generates the most probable response to the prompt it is given, and it cannot verify information because it has no mechanism for doing so.
2. ChatGPT cannot cite where its information comes from. It has no way of explaining or tracking why it responds the way it does, so there is no way to cross-reference its output.
3. It is frequently wrong. Because ChatGPT has no means of "knowing" what it is saying, only how to respond to prompts, it cannot fact-check or verify information and should not be trusted to perform those tasks.
ChatGPT is an effective and versatile tool, but that does not mean it can do everything. It is important to learn the limitations of the software, and of any language model, before applying it in the workplace or the classroom.
"ChatGPT is incredibly limited, but good enough at some things to create a misleading impression of greatness. it's a mistake to be relying on it for anything important right now." - Sam Altman, CEO of OpenAI