Generative AI and College Writing

The development of generative AI has broad implications for both academic research and college teaching. The information provided here focuses on the particular affordances of generative AI that relate to academic writing and reading. Caltech's Center for Teaching, Learning, and Outreach offers faculty additional resources on teaching and generative AI.

This page is updated regularly, but guidance about generative AI can change quickly; the technologies themselves change frequently, often without advance notice and sometimes without clear communication as to exactly how they have changed. Caltech faculty who would like to consult about questions of generative AI in their writing, teaching, or mentoring may reach out to us for a more detailed conversation.

Within the broad array of technologies that may be described as "artificial intelligence," the term generative AI describes those that produce previously unseen writing, data, images, sounds, or code in response to prompts from human users. These tools are built using very large data sets, with the result that a tool can imitate a certain type of writing, image, etc., without directly copying from any single source text.

It is worth noting that some researchers have raised concerns about applying the term "artificial intelligence" to these tools, since the term carries associations for some that do not accurately map onto what generative AI tools actually do. A well-known paper described text generation tools as "stochastic parrots" (Bender et al., 2021) to emphasize that the tools fundamentally imitate human speech and do not understand their own outputs in the way the word "intelligence" implies.

Most users of generative AI tools based on large language models (LLMs) do not have a deep understanding of how such tools actually work. While many Caltech students are knowledgeable about computer science and machine learning, we should not assume they all understand what a chatbot is or how it works.

While faculty don't need to be able to build a generative AI chatbot in order to coach students through using or avoiding such tools in a particular lab or course, some faculty may find it helpful to explore the technological details, both to deepen their own understanding of what the tools actually are and to speak confidently with students about the tools' uses and limits.

We have found the following two explanations to be helpful for readers outside of computer science who are looking for an accessible, in-depth introduction:

If you have questions, we recommend talking with colleagues in Computing and Mathematical Sciences who understand these tools at a deep level. At Caltech, we are lucky to be surrounded by faculty who are knowledgeable in this area.

Generative AI tools are proliferating at a rapid pace, and each tool is different. Some have access to the internet and others do not. Some are free, while others require monthly fees or are included in subscription productivity suites. Engineers build varied guardrails into each tool to shape the results it provides, but information about those guardrails is often not publicly disclosed. The tools' outputs can also change over time because of changes developers make behind the scenes.

Some of the most popular tools include:

Text Creation

Code Writing

Image Creation and Editing

Generative AI tools have many potential uses that align with tasks students commonly undertake, including:

  • Locating research or information on a particular topic
  • Summarizing and taking notes on reading
  • Brainstorming and generating ideas for projects or papers
  • Planning and outlining writing
  • Composing writing
  • Revising writing, including reconceptualizing, reorganizing, and/or rewording for clarity
  • Proofreading writing
  • Generating images
  • Describing images
  • Writing and debugging code
  • Analyzing data
  • Solving problems

Another way of thinking about how students can use generative AI is in terms of roles the technology can play, rather than tasks it can accomplish. Ethan Mollick, who has written extensively on generative AI and higher ed, explains in a post on his website that we can think in various ways about the type of role generative AI can play for a college student, such as: mentor (provides feedback), tutor (offers direct instruction), coach (prompts metacognition), teammate (improves team outcomes), student (receives explanations from learner), simulator (provides deliberative practice), and tool (accomplishes task).

It is up to each instructor to determine which uses of generative AI will enhance a student's learning and which might inhibit it in a particular context. See below for more guidance on creating a policy for generative AI use in a course.

The HWC recommends that every instructor develop course-specific policies regarding student use of generative AI. Not only do these policies help your students understand your expectations in this area; they also help us support your students in meeting those expectations.

Almost every college course asks students to complete assignments that generative AI can assist with (see the previous question for more details). Most rules and norms about academic honesty are based on long-standing academic practices, but the wide availability of generative AI is quite new. At present, it is unclear which uses of generative AI are ethical and which are not. The tools are so new that best practices for using them, and consensus about when they should not be used by students or by academic researchers, are still being formulated. Given this uncertainty, it is important that faculty give students clear guidance about which uses of generative AI are expected or allowed and which will be considered cheating or academic misconduct.

We recommend that a course policy contain the following information:

  • What generative AI tools, if any, are students allowed to use?
  • For what purposes can students use those tools? For what purposes is use of the tools disallowed?
  • What kind of documentation should students provide to the instructor regarding their use of the tools?
  • What actions is the student expected to take to ensure the outputs of the tools contain accurate information (or functional code), are free from bias, and do not include plagiarism? (See the next question in this series for more information about citation and generative AI.)
  • What should students do when they have questions about whether a particular use of generative AI is allowed or not?

We also recommend talking with students about your policy and the rationale behind it. Help students see how your policies are thoughtfully connected to content, skills, and habits of mind you want them to learn in your course.

If you are seeking examples of generative AI policies, this list offers a wide variety of real policies used in college courses.

When academic writers use another writer's prose, we put it in quotation marks. Academic disciplines have detailed conventions for attributing quotations, paraphrased ideas, or borrowed data to their original authors. Conventions for citing or disclosing prose produced or edited by generative AI tools are just beginning to emerge. Just as standards for citation differ by discipline, so does guidance about how to handle generative AI.

Here are a few examples from academic professional organizations and publishers of guidance or policies on citing/attributing/disclosing the use of generative AI. Their recommended practices may or may not fit the types of assignments you have students complete. If not, the examples may help you see how to develop a custom-built practice for a particular assignment or course.

American Psychological Association (APA)

Modern Language Association (MLA)

Institute of Electrical and Electronics Engineers (IEEE)


NeurIPS 2023 (a large machine learning conference)


At present, we have almost no research-based evidence to consult about how generative AI affects student learning. Faculty are left to think logically about what we are trying to teach students and how particular uses of generative AI would help or impede that process.

To illustrate this, let's imagine two hypothetical courses: Entrepreneurship 101 and Shakespeare 101. The first course introduces students to launching a start-up business; the second introduces students to Shakespeare's tragedies.

Entrepreneurship 101 assigns students to develop a pitch deck for meetings with funders. In this context, students need to develop a clear, compelling description of their product or business. A professor might encourage students to use both ChatGPT and DALL-E 2 to generate dozens of different ways to describe their product or business, as well as to produce evocative images for the slide deck. Then, students have to choose an approach, create their slide deck, and justify their choices. This is similar to the situation a start-up CEO might encounter in the real world, in which a marketing professional provides options for branding a product and the CEO works with those suggestions to create a document that will accomplish their goals.

One skill Shakespeare 101 teaches is how to connect knowledge about the political history of the author's period to the themes of the plays. Students are asked to read two articles by historians about the court of Queen Elizabeth and then to write a short analysis of how the history of the court does or doesn't seem to shape one of the plays. ChatGPT could provide students with a rough draft of this paper that points out a couple of ways King Lear seems to comment on contemporary politics in Elizabethan England. However, having generative AI suggest the connections between history and literature allows the student to skip the type of analysis the course is trying to teach. The instructor might determine that if students no longer need to read the essays or closely reread the play to complete the analysis paper, they are less likely to learn or remember anything about the history of this period. Further, they may not develop the skill of connecting the facts of a historical period to the subtle and creative ways literary texts comment on contemporary politics.

In the first example, generative AI could help make an assignment work effectively. In the second, it may allow students to skip crucial cognitive work that is a focus of the course.

We have much to learn about how generative AI tools can help students learn. In the coming years, new tools will be developed that are custom-built for particular learning purposes (like tutoring students in chemistry courses, for example). For now, faculty must think proactively about their course learning goals and whether certain uses of generative AI support or disrupt those goals. Trial and error will be required as faculty learn from experiments and as the tools themselves evolve.

The pace of change in generative AI is rapid. Advice or guidance can become dated within months or even weeks. Here are a few resources we currently recommend to keep up with news in this area:

Academic publishing moves more slowly and will take time to catch up with the ongoing innovation in this area. We recommend checking for news and resources in your field by seeing what is offered by your professional association(s) and by disciplinary pedagogy journals. This short annotated bibliography by Anna Mills focuses on the intersection of generative AI and education.

As always, exercise caution when reading preprints. Some researchers share preprints to get information out to readers quickly, but these papers sometimes generate headlines only to be questioned later by experts.