Assessing Risk and Finding Inspiration
Like many of us, when generative AI took off with ChatGPT’s release in 2022, I was skeptical about its capabilities and usefulness. I recall using it early on as a supply chain consultant to do some research, and the website the tool linked to in its output belonged to a different company than the one it named in its narrative! How could I trust this tool?
That experience deepened my doubts about its practical applications. Over the last few years, however, I have watched others become more productive while using AI, and I began to experiment again with AI-based tools at work and at home:
- As a contracts manager, I often review edits (e.g., redlines) from contractors to standard contract language. I use Copilot, an AI-powered digital assistant developed by Microsoft Corporation, to compare the original and edited paragraphs to assess the risk exposure to our company if we accept the changes. While AI is not a substitute for a lawyer, it can provide valuable insights during negotiations.
- As an avid artist in my personal life, I provide Copilot with a few concepts for a painting to draw inspiration. After a few iterations, I usually find examples that align with my vision. If I don’t find good examples, I realize the concept might not translate well onto canvas. This process helps me experiment with ideas before investing too much time into a piece that might not work.
While AI has many uses, it is essential to exercise caution, especially with proprietary or confidential data. If you are using AI tools for work, make sure to follow your company’s AI usage policy. Hopefully, you are also able to find ways to become more productive or creative in professional and personal settings.
— Alina Bartley
Taking Notes and Writing Specs
Working for a technology company means I often “dogfood,” that is, use our products early and regularly to see how well they work and identify improvements before new features reach our customers. I was initially resistant to the surge in AI tools: Is it just a fad? Will it really save me time and energy? However, I have seen firsthand and through colleagues how it can impact day-to-day workflows.
I regularly interact with Microsoft Copilot, a tool integrated into Microsoft 365 that assists with productivity and creativity tasks. However, you could insert your choice of AI tool here and likely see similar benefits.
As a product manager, I run a lot of meetings and am the person on point for managing our time effectively and driving action items forward. With recording and transcription in Microsoft Teams, I can be more present in the conversation, both vocally and in chat, instead of meticulously documenting who said what. Copilot also generates meeting insights with a discussion recap, action items with ownership, and more. I still do my due diligence and take my own notes, but these summaries are extremely helpful as a way to verify and catch anything that I might have missed.
Another part of my job involves writing specs, also known as specification or requirement documents, which articulate a vision and plan for a given feature. Even if it is a half-formed idea or a list of bullets, tossing whatever I have top of mind into Copilot gives me a starting point and a template that I can modify and fill out in more depth. Getting examples from Copilot is particularly helpful for inspiration when writing about more objective topics like metrics or roles and responsibilities per discipline.
One of the challenges I have run into is learning how to improve my prompt-writing skills so I get better results. Some tips that help include being more specific, using “do” and “don’t” statements to improve the output, adding examples as a reference point, and building on previous prompts to give more context or adjust tone and voice.
If you want a fun way to dip your toes in, ask Copilot to assess your emails, chats, files, and more to tell you what superhero character you are most like and identify your strengths and work style. Magazine personality quizzes, move over — AI can tell you if you are most like Wonder Woman, Storm, or Black Widow!
— Nicole Woon
Determining Drug Targets
In my role as a scientist in drug development, one of the key aspects of my job is exploring and validating new drug targets. Before AI tools in the biomedical research space existed, maintaining a thorough knowledge of the disease area I work in (heart failure), reading scientific literature, and evaluating collated human data were critical for initial evaluations of robust drug targets. These activities could span hours to days.
Over the past several years, biological knowledge graphs have been built (and published) by many groups to support biomedical research and drug discovery, some of which are AI generated. These tools can integrate large amounts of heterogeneous data to support accurate and rapid retrieval of data for knowledge discovery.
While the time-tested skillset of exploring literature is still critical for this process, if I need to quickly familiarize myself with a new disease area or understand the validity and strength of a new potential target, I now have another option. With a properly constructed search, biological knowledge graphs can give initial insight within minutes.
— Emily Ongstad, Ph.D.
Learning AI Prepares Students for the Future
Over the course of my studies in computer engineering at the University of California, Santa Cruz, I have seen tools like ChatGPT go from being completely prohibited to being integrated into course material.
Many professors state in their syllabi that using ChatGPT or other large language models, known as LLMs, is an academic misconduct violation and will be penalized as such. These restrictions are in place because of concerns about academic integrity, originality, and the potential for misuse. However, some professors actually integrate LLM usage into their course materials. One of my professors provided examples in class of how to write prompts to get the results you want, how to determine the “correctness” of the answer to the prompt, and how to cite the LLM responses. The professor strongly emphasized the importance of citing ChatGPT usage in this class: all submissions that used ChatGPT had to include a citation to receive credit.
I believe professors like this who soften their stance on LLM usage recognize its potential as a valuable tool for learning and problem-solving. This class was one of the most interesting and practical classes I have taken. By teaching students how to use these tools responsibly, professors are preparing students for real-world applications, ensuring they develop practical skills.
I believe it is realistic to integrate LLM usage into academia as long as the source is cited. These are tools that will only continue to become more prominent in our everyday lives and therefore prominent in academia and industry. When I was in middle school, using internet sources rather than library sources was frowned upon. Now citing online sources is the norm, and I believe LLM usage and citation is the next step in this evolution.
As with any information, it is important to cross-reference AI outputs with credible sources, ensuring that the generated content is factually accurate.
Those who are not allowed to use LLM tools in academic and professional settings risk falling behind those who leverage these technologies effectively. Leveraging LLM tools responsibly while critically assessing their outputs ensures students can stay ahead and excel in an evolving academic environment.
— Amanda Harrison