Generative artificial intelligence (AI) tools like ChatGPT and Google Bard have made headlines since their introduction for everything from managing stock portfolios and building software in minutes to being just plain creepy (existential dread much, ChatGPT?).
But as human-like as ChatGPT's responses can be, the real issue isn't whether AI is going to take over the world (we promise ChatGPT is not alive). It's that, for now, generative AI remains largely unchecked, which can put your data and your credibility at risk.
Large language models (LLMs) are designed to hold conversations with users and generate responses based on the data their developers trained them on. Those data sources can include websites, articles, books, and more.
Since generative AI is still an emerging technology, our Chief Global Information Officer, Dr. Curtis Cole, recently released guidance on using generative AI. By the end of this year, University committees plan to recommend a set of generative AI tools that will meet the needs of the Cornell community. In the meantime, keep two things in mind: you are accountable for your work regardless of the tools you use to produce it, and because generative AI tools are considered public, you cannot enter any Cornell information, or another person's information, that is confidential, proprietary, subject to federal or state regulations, or otherwise considered sensitive or restricted.
So, should you use generative AI? With caution. LLMs can work in a pinch for problem-solving (like writing a complicated Excel formula), exploration, or boosting productivity, but they're by no means a perfect solution.
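As an illustration of that "in a pinch" use case, here is a minimal sketch of asking an LLM to draft an Excel formula through the OpenAI Python SDK. This is a hypothetical example, not a Cornell-recommended tool or workflow: the model name and prompt are placeholders, and, per the guidance above, nothing confidential or restricted should ever go into a prompt.

```python
# Minimal sketch: asking an LLM to draft an Excel formula.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name and prompt below are placeholders, not recommendations.
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

# Keep prompts generic: never include confidential, proprietary, or regulated data.
prompt = (
    "Write an Excel formula that sums column B only for rows "
    "where column A says 'Completed'."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# The model's suggestion still needs human review before you rely on it.
print(response.choices[0].message.content)
```

Whichever tool you use, the accountability point above still applies: check the output yourself before putting it to work.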
Try our 10-question pop quiz on cybersecurity. We’ll announce some winners the week of Oct. 23.
October is National Cybersecurity Awareness Month, an annual collaborative effort between government and industry to ensure you have the resources you need to stay secure online. Throughout October, we'll be sending you tips on protecting your information and avoiding malicious attempts to extract your personal data. Visit the ITS website for our 2023 tips.