Timeline of AI in writing


This is a Timeline of AI in writing.

Sample questions

The following are some interesting questions that can be answered by reading this timeline:

Big picture

Time period Development summary More details

Full timeline

Year AI subfield Writing domain Event type Event description
1966 Natural language generation Dialogue systems Concept demonstration Joseph Weizenbaum develops *ELIZA*, an early rule-based chatbot that mimics human-like conversation using pattern-matching techniques. While simple, it marks the first instance of AI simulating natural written interaction.[1]
~1976 Natural language processing Style and grammar analysis Tool development American computer scientist Lorinda Cherry at Bell Labs in Murray Hill, NJ, begins developing programs to analyze English texts for weaknesses in diction and style.[2]
Late 1970s Expert systems and grammar checking Technical writing and editing Tool development Bell Labs’ Documentation Technologies Group in Piscataway, NJ, creates complementary programs forming the UNIX™ Writer’s Workbench Software suite.[2]
1981 Natural language processing Writing instruction and analysis Tool development Michael Cohen at UCLA develops HOMER, a program that analyzes student prose for stylistic weaknesses as part of writing instruction.[2]
2017 Transformer models Text generation and comprehension Model introduction Google researchers introduce the *Transformer* architecture, which becomes the foundation of GPT, BERT, and other generative models, revolutionizing natural language processing and written language generation.
2018 (July 24) Deep learning and sequence modeling Predictive typing Product launch Google announces the rollout of Smart Compose. Initially an experimental Gmail feature, Smart Compose helps autocomplete email sentences, especially for common phrases, greetings, and addresses. It learns users' writing styles over time, including names and jargon, to offer personalized suggestions. Users can accept suggestions by pressing the tab key. The move marks a shift from Google's usual practice of delaying feature releases after announcement. The feature aims to enhance email productivity without replacing full message composition. It becomes a key part of Gmail’s intelligent, AI-driven tools.[3]
2019 (February 18) Pretrained language models Journalism and technical writing Product deployment OpenAI releases GPT-2, a large language model capable of generating highly coherent and human-like text. While the research is widely praised for its technical achievements, OpenAI’s decision not to release the model’s largest version sparks controversy over ethical concerns. The company cites the potential for misuse in disinformation and abuse at scale. Critics accuse OpenAI of hype-generation.[4]
2022 (March) Natural language generation Local journalism and weather reporting Newsroom automation The Argentine local paper Diario Huarpe starts using AI-powered automation from United Robots to publish around 250 football articles and 3,000 weather reports monthly. This helps its small newsroom expand coverage, especially of sports and local weather, despite limited staff. The system uses structured data and Natural Language Generation (NLG) to create articles in natural regional Spanish. While some initial resistance exists, journalists come to use the automation to free up time for deeper stories. The main challenge is accessing structured data, especially for local leagues, though weather data is readily available. Automation boosts efficiency, SEO traffic, and audience reach in San Juan (an illustrative sketch of this kind of data-to-text generation appears after the timeline).[5]
2022 (November 30) Conversational AI and large language models General-purpose writing and content creation Product launch OpenAI introduces ChatGPT, a language model designed to hold natural conversations. This launch would significantly impact writing by making advanced language generation accessible to a broad audience. Writers gain a powerful tool for drafting, editing, brainstorming, and generating content across genres and formats. ChatGPT reduces time spent on routine tasks, such as grammar correction or summarization, and helps overcome writer’s block by suggesting ideas or alternative phrasings. It also democratizes writing support, benefiting non-native speakers and professionals alike. However, it raises questions about originality, authorship, and reliance on AI-generated text.[6]
2023 (February) Generative AI critique and human–AI collaboration Professional and creative writing Opinion and commentary In a publication, Ken Scudder explores the rise of AI-generated text and its implications for human writers. Using ChatGPT to generate an article on AI and writing, he notes the machine’s competent yet generic output, lacking creativity, style, and personal insight. While AI can assist with tasks like grammar checks and brainstorming, it cannot replicate human flair, specific lived experiences, humor, or the effective use of quotes. Scudder argues that writers should embrace AI as a tool to enhance productivity, not fear it as a replacement. Ultimately, staying relevant means being more human, not less.[7]
2023 (March 14) Generative AI and productivity tools Business and personal productivity writing Product integration Google announces a major step in integrating generative AI into Google Workspace. Starting with Docs and Gmail, trusted testers would gain access to new AI-powered writing tools that help users draft, rewrite, summarize, and adjust tone effortlessly. Over time, similar enhancements would roll out to Slides, Sheets, Meet, and Chat, enabling features like auto-generated visuals, data analysis, and note-taking. Google emphasizes user control, data privacy, and responsible AI design in alignment with its AI Principles. The initiative aims to transform Workspace into a collaborative AI partner, enhancing productivity while preserving human creativity and decision-making.[8]
2023 (March 16) Large language models and enterprise AI Workplace communication and document generation Product launch Microsoft 365 Copilot is introduced as an AI-powered productivity tool that integrates large language models like GPT-4 with Microsoft 365 apps and user data through the Microsoft Graph. It aims to transform how people work by enhancing creativity, boosting productivity, and upgrading skills using natural language prompts. Embedded in Word, Excel, PowerPoint, Outlook, Teams, and more, Copilot automates tasks, summarizes content, and generates creative outputs. It also introduces Business Chat, which pulls contextual data across apps. Copilot prioritizes security and privacy, using a permission-based model and preserving enterprise compliance.[9]
2023 (March 27) Generative AI ethics and pedagogy Educational writing and academic instruction Scholarly analysis In a publication, three English professors explore the educational, cognitive, and ethical implications of generative AI. Chris Anson warns that relying on AI for complex writing tasks can hinder intellectual growth, as it bypasses the cognitive processes essential to learning. Huiling Ding highlights ethical issues like authorship, plagiarism, and the displacement of artists and writers. Paul Fyfe encourages structured classroom experimentation with AI to foster critical understanding and AI literacy.[10]
2023 (July) Large language models and academic AI use Academic and educational writing Institutional guidance The UNC handout Generative AI in Academic Writing offers guidance on using tools like ChatGPT and Microsoft Copilot in academic contexts. It explains how generative AI works by predicting text based on large language models trained on internet data. Potential uses include brainstorming, outlining, summarizing, editing, translating, and drafting transactional communications. The handout warns against over-reliance, highlighting risks like factual inaccuracies, fabricated citations, biased outputs, and violations of academic integrity.[11]
2023 (September 24) Generative AI criticism Creative writing and literary expression Opinion and commentary In an essay, Alex Roddie argues that generative AI threatens the essence of creative writing by automating the thinking and emotional processes essential to genuine expression. While AI can mimic language, it lacks lived experience, intentionality, and individuality—the core of meaningful writing. Roddie rejects the idea that AI can improve writing, warning that outsourcing creativity weakens critical thinking and erodes the human voice. He believes generative AI fosters conformity, dilutes originality, and risks turning writers into passive consumers of machine output. Ultimately, he sees AI not as a tool for empowerment, but as a force that may degrade human expression and personhood itself.[12]
2023 (November 16) Instruction-following language models Technical writing and task-oriented content Model launch OpenAI introduces InstructGPT, a language model fine-tuned from GPT-3 to follow specific instructions with greater precision and contextual relevance. Utilizing the Transformer architecture, it combines instruction parsing with advanced text generation capabilities. InstructGPT is available in multiple parameter sizes and accessed via the OpenAI API. It is particularly suited for structured, task-oriented applications such as technical documentation, code annotation, content generation, translation, and automated customer support. Unlike ChatGPT, which is optimized for interactive dialogue, InstructGPT is designed for directive-based outputs. Its training incorporates human feedback and few-shot learning, enhancing its adaptability and alignment with user-defined tasks (a sketch of calling an instruction-following model through the API appears after the timeline).[13]
2024 (March 18) Generative AI writing assistance Essay and creative writing Research study A study by Zhuoyan Li, Chen Liang, Jing Peng, and Ming Yin, presented at the CHI Conference on Human Factors in Computing Systems, explores how people value and experience generative AI-powered writing assistance. In a randomized experiment with 379 participants writing essays or stories, the researchers compare independent writing, AI editing help, and AI-driven drafting. Participants show they are willing to forgo pay for AI support, especially for creative tasks and full content generation. AI is found to boost productivity, confidence, and grammar but also to reduce accountability, satisfaction, and writing diversity, revealing both benefits and risks for education and professional use.[14]
2024 (April 23) Generative AI critique Blogging and personal writing Opinion and commentary In a blog post, Jim Eagar argues that AI should not replace human writing. He criticizes AI-generated content for lacking personal voice, creativity, and authenticity, stating that he stops reading blog posts when he realizes they’re written by AI. Eagar believes relying on AI stunts a writer’s development and turns them into editors rather than creators. He emphasizes that AI can’t produce original thought or emotion, often sounding artificial and corporate. While AI can aid brainstorming or summarizing, he urges writers to use it sparingly and preserve their unique voice and creative growth.[15]
2024 (October 1) AI-assisted academic tools Scholarly and research writing Ethical guidance and policy commentary An article by Charlotte Huff explores how AI tools can support psychology researchers and students with tasks like grammar correction, citation formatting, and idea generation. However, the article warns against over-reliance on AI, emphasizing the need for human authorship, ethical oversight, and transparent citation. APA Style guidelines stress that only humans can be listed as authors and that all AI usage must be properly documented. Risks include misinformation, fabricated citations, detection failures, and bias—particularly against non-native English speakers. Ultimately, Huff points out that AI should augment, not replace, the scholar’s role in the research process.[16]
2025 (January) Multimodal generative systems and narrative AI Interactive storytelling and fiction writing System development and research study An article presents a novel interactive system, ChatGeppetto, for AI-assisted narrative generation based on semiotic reconstruction. Drawing on syntagmatic, paradigmatic, meronymic, and antithetic relations, the system enables users to remix and create stories from existing narratives. The AI co-author, powered by ChatGPT and Stable Diffusion, uses abductive reasoning to generate text and visual scenes. A user-friendly prototype supports casual users and writers, aiming to enhance creativity and story coherence. While initial user studies show no statistically significant differences in satisfaction, the semiotic model is praised for enriching emotional and memorable content. Ethical concerns around authorship, bias, and AI’s creative role remain central to future research.[17]
2025 (February 13) Large language models and human–AI co-creativity Creative writing Qualitative research study A study explores how 18 creative writers intentionally integrate AI—specifically large language models—into their writing practice without compromising core values like authenticity and craftsmanship. Unlike other forms of writing, creative writing is seen as highly personal and choice-driven. While some see AI as reducing creativity to mere prompting, the writers studied develop nuanced, evolving relationships with AI tools. They make deliberate choices about when and how to use AI, building custom workflows that preserve their creative control. These writers see AI not just as a tool, but as a dynamic collaborator, offering insight into ethical and sustainable ways to support creativity with technology.[18]
2025 (February 20) Large language models and creative support tools Professional and creative writing Practical guidance and commentary In her article, Sanjida O’Connell outlines ten ways writers can use AI, particularly large language models like ChatGPT, to enhance their craft without sacrificing creativity. Initially skeptical due to unauthorized use of her work in AI training, she now views AI as a tool to support—not replace—human creativity. Her suggestions include using AI to brainstorm ideas, generate writing prompts, locate literary agents, craft pitches, summarize research, proofread, and create publicity material or illustrations. O’Connell emphasizes prompt engineering as essential and advises writers to retain control and individuality. According to O’Connell, while AI output can be clichéd, it can still spark meaningful writing when used mindfully.[19]
2025 (May 23) AI and authorship discourse Literary writing and personal expression Reflective essay and cultural critique An essay by Luke Beesley defends the tactile and personal nature of writing by hand—specifically with pencil—as a quiet act of resistance against AI’s growing role in literature. He recounts his lifelong habit of handwriting first drafts, highlighting the sensory experience and human imperfection involved. As AI-generated content becomes more prevalent and harder to distinguish from human work, he argues that authenticity will increasingly lie in process, practice, and material traces like notebooks and pencil stubs. The draft, archive, and physical record of creation may soon serve as proof of genuine human authorship amid growing concerns about trust, plagiarism, and creativity in the age of AI.[20]
2025 (May 26) AI and journalism Newsroom reporting Research and commentary A study suggests journalism is not dying but evolving with AI. Despite fears of newsroom collapse, Danish labour market data show little evidence of job losses linked to large language models. Public anxiety remains high: surveys reveal most Americans expect declining news quality, job cuts, and misinformation. Yet investigative work demonstrates AI's constructive role. Outlets like The Wall Street Journal, The Washington Post, and the Associated Press use machine learning and geospatial tools to uncover stories, suggesting AI augments reporting with speed and depth while preserving human judgment.[21]
2025 (June 12) Narrative coherence modeling and Retrieval-Augmented Generation Long-form story generation Framework development and evaluation A paper introduces SCORE, a framework designed to enhance coherence and emotional depth in AI-generated stories from Large Language Models (LLMs). SCORE identifies narrative inconsistencies by tracking key items, generating episode summaries, and using a Retrieval-Augmented Generation (RAG) approach with TF-IDF and cosine similarity. It addresses common LLM issues such as inconsistent character behavior and emotional tone. Inspired by memory mechanisms in generative agents, SCORE evaluates character consistency, emotional flow, and plot logic. Tests on LLM-generated narratives show that SCORE significantly improves coherence and stability compared to standard GPT models, offering a more structured method for refining long-form AI storytelling (an illustrative sketch of the TF-IDF retrieval step appears after the timeline).[22]
2025 (June 27) Generative AI and publishing ethics Book publishing and literary labor Advocacy and public statement Over 70 authors, including Dennis Lehane and Lauren Groff, publish an open letter on Lit Hub urging major U.S. publishers to limit their use of generative AI. The letter demands commitments not to release AI-generated books, replace human workers with AI, or use AI-trained on copyrighted material without consent. It also calls for the continued use of human narrators and translators. The initiative quickly gains over 1,100 signatures. Organizers emphasize the existential threat AI poses to authors' livelihoods, especially following recent court rulings favoring AI companies. Only Simon & Schuster responds, affirming its commitment to protecting authors’ rights.[23][24]
2025 (June 30) Generative AI in education College and academic writing Cultural analysis and commentary A New Yorker article by Hua Hsu explores how generative AI is reshaping college writing and, by extension, higher education itself. Students increasingly rely on tools like ChatGPT for everything from essays to personal communication, often bypassing the writing process altogether. While some educators respond with in-class exams and handwriting exercises, others embrace AI's potential as a learning aid. The shift raises concerns about authenticity, skill development, and the value of education in an AI-saturated world. According to Hsu, rather than mere cheating, students’ use of AI reflects broader systemic trends—efficiency, consumer-minded education, and unclear learning goals—forcing a reevaluation of writing’s role in learning and human growth.[25]
2025 (August 28) On-device AI assistants Everyday messaging Product launch WhatsApp introduces "Writing Help", an AI-powered assistant that offers in-app suggestions to refine a message's tone, style, and clarity. Users compose a message, tap the pencil icon, and choose from options such as professional, funny, rephrase, supportive, or proofread to enhance the text before sending. Powered by Meta's Private Processing technology, all suggestions remain encrypted and private; neither WhatsApp nor Meta can access the original message or the AI output. The feature first rolls out in English to select regions, with broader availability expected later in the year.[26][27][28]
2025 (September 10) Generative AI in education Creative writing instruction Course launch Rice University launches a course that examines how generative AI can both inspire and challenge creative writing. Rather than outsourcing storytelling to ChatGPT, students explore ways to incorporate or resist AI's influence, using it to spark ideas while confronting its limitations. The class engages with critical essays and AI-generated texts, while also addressing ethical issues such as copyright and environmental costs. Associate teaching professor Ian Schimmel emphasizes that grappling with AI's flaws and controversies fosters deeper reflection on creativity and technology.[29]
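
Illustrative sketches

The Diario Huarpe entry above (March 2022) describes generating routine articles from structured data. As a rough illustration of how such data-to-text generation can work, the following Python sketch fills a hand-written template from a structured weather record; it is a hypothetical example, not United Robots' actual system, and all field names and thresholds are invented.

    from dataclasses import dataclass

    @dataclass
    class WeatherRecord:
        # Hypothetical structured input, e.g. pulled from a weather data feed.
        city: str
        date: str
        high_c: int
        low_c: int
        rain_prob: float  # probability of rain, 0.0 to 1.0

    def render_weather_brief(record: WeatherRecord) -> str:
        """Turn one structured weather record into a short natural-language brief."""
        rain_clause = (
            "with rain likely" if record.rain_prob >= 0.6
            else "with a chance of showers" if record.rain_prob >= 0.3
            else "under mostly clear skies"
        )
        return (
            f"{record.city}, {record.date}: highs near {record.high_c}°C and lows "
            f"around {record.low_c}°C, {rain_clause}."
        )

    print(render_weather_brief(WeatherRecord("San Juan", "14 March 2022", 29, 16, 0.2)))

A real system would typically layer many such templates and phrasing variants over the data feed; here, the regional Spanish phrasing described in the entry would come from the template library rather than from the data itself.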
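The InstructGPT entry above notes that the model is accessed through the OpenAI API and is geared toward directive-based rather than conversational outputs. The sketch below shows one way to send a single instruction-style prompt through the OpenAI Python SDK's completions endpoint; the model name, prompt, and parameters are illustrative placeholders rather than a prescribed configuration.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    response = client.completions.create(
        model="gpt-3.5-turbo-instruct",  # illustrative instruction-tuned model
        prompt=(
            "Rewrite the following sentence as a concise step in a user manual: "
            "'The user should make sure that the device has been powered off "
            "before they attempt to remove the battery.'"
        ),
        max_tokens=60,
        temperature=0.2,
    )

    print(response.choices[0].text.strip())

Unlike the chat endpoint used for conversational models, the completions endpoint takes a single prompt string, which matches the directive, non-dialogue style of use described in the entry.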
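The SCORE entry above mentions retrieving earlier episode summaries with TF-IDF and cosine similarity before evaluating or generating the next part of a story. The following sketch, assuming scikit-learn is available, shows that retrieval step in isolation; the summaries and the query episode are invented, and the real framework combines this with item tracking and LLM-based evaluation.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical summaries of earlier episodes in a generated story.
    episode_summaries = [
        "Mara finds a broken compass in the attic and hides it from her brother.",
        "Her brother discovers the hiding place and confronts Mara at dinner.",
        "A storm strands the family at the lighthouse; the compass goes missing.",
    ]

    def retrieve_relevant_summaries(query_episode, summaries, top_k=2):
        """Return the top_k past summaries most similar to the current episode text."""
        vectorizer = TfidfVectorizer(stop_words="english")
        matrix = vectorizer.fit_transform(summaries + [query_episode])
        scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
        best = scores.argsort()[::-1][:top_k]
        return [summaries[i] for i in best]

    # The retrieved summaries would be prepended to the LLM prompt so the model
    # keeps characters and key items (here, the compass) consistent.
    print(retrieve_relevant_summaries(
        "Mara searches the lighthouse for the lost compass.", episode_summaries))

In a full retrieval-augmented setup, the retrieved summaries become part of the context for the next generation or evaluation call, which is how consistency of items and character behavior is maintained across a long narrative.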

Meta information on the timeline

How the timeline was built

The initial version of the timeline was written by Sebastian.

Check [2].

Funding information for this timeline is available.

Feedback and comments

Feedback for the timeline can be provided at the following places:

  • FIXME

What the timeline is still missing

Timeline update strategy

See also

References

  1. "ELIZA 1966 – The First Chatbot". Microsoft Store. Retrieved 30 June 2025.
  2. 2.0 2.1 2.2 Smith, Charles R.; Kiefer, Kathleen E.; Gingrich, Patricia S. (July–December 1984). "Computers Come of Age in Writing Instruction". Computers and the Humanities. 18 (3/4). Springer Nature: 215–224.
  3. Lardinois, Frederic (2018-07-24). "Google's Smart Compose is now ready to write emails for G Suite users". TechCrunch. TechCrunch Media LLC. Retrieved 23 July 2025.
  4. Lowe, Ryan (2019-02-18). "OpenAI's GPT-2: the model, the hype, and the controversy". Medium (Towards Data Science). Retrieved 23 July 2025.
  5. Oliver, Laura (2 June 2022). "How a local paper in Argentina uses AI to publish hundreds of sports pieces a month". Reuters Institute for the Study of Journalism, University of Oxford. Retrieved 17 July 2025. https://reutersinstitute.politics.ox.ac.uk/news/how-local-paper-argentina-uses-ai-publish-hundreds-sports-pieces-month
  6. "ChatGPT – OpenAI". OpenAI. OpenAI. Retrieved 11 July 2025.
  7. Scudder, Ken (February 2023). "Surviving in the Age of AI Writing". PRSA Strategies & Tactics. Public Relations Society of America. Retrieved 11 July 2025.
  8. Voolich Wright, Johanna (2023-03-14). "A new era for AI and Google Workspace". Google Workspace Blog. Google LLC. Retrieved 23 July 2025.
  9. Spataro, Jared (16 March 2023). "Introducing Microsoft 365 Copilot – your copilot for work". Microsoft Blog. Microsoft. Retrieved 11 July 2025.
  10. Garbarine, Rachelle (27 March 2023). "How is AI Changing How We Write and Create?". College of Humanities and Social Sciences. North Carolina State University. Retrieved 9 July 2025.
  11. "Generative AI in Academic Writing". UNC Writing Center. University of North Carolina at Chapel Hill. July 2023. Retrieved 9 July 2025.
  12. Roddie, Alex (2023-09-24). "Generative AI will not make you a better writer – it will destroy creative writing as a way of expressing the human experience". Alex Roddie. Retrieved 2025-07-09.
  13. Kumari, Priyanka (16 November 2023). "Unveiling InstructGPT: A Powerful Language Model by OpenAI". Labellerr. Labellerr. Retrieved 17 July 2025.
  14. Li, Zhuoyan; Liang, Chen; Peng, Jing; Yin, Ming (2024). Study on the value of generative AI-powered writing assistance, presented at the CHI Conference on Human Factors in Computing Systems. arXiv preprint.
  15. Eagar, Jim (23 April 2024). "From Copycats to Creativity and Authenticity: Why AI Isn't the Future of Writing". Original Mac Guy. Retrieved 9 July 2025.
  16. Huff, Charlotte (1 October 2024). "The promise and perils of using AI for research and writing". American Psychological Association. Retrieved 9 July 2025.
  17. de Lima, Edirlei Soares; Neggers, Margot M.E.; Feijó, Bruno; Casanova, Marco A.; Furtado, Antonio L. (2024). "An AI-powered approach to the semiotic reconstruction of narratives". Entertainment Computing. doi:10.1016/j.entcom.2024.100810. Retrieved 11 July 2025.
  18. Study of how creative writers integrate large language models into their writing practice (13 February 2025). arXiv preprint.
  19. O’Connell, Sanjida (2025-02-20). "10 Ways AI Can Help Writers". Royal Literary Fund. Retrieved 2025-07-09.
  20. Beesley, Luke (2025-05-23). "I am writing this with a pencil – it could be an author's last line of defence against AI". The Guardian. Retrieved 2025-07-09.
  21. Zajmi, Xhoi (5 September 2024). "Journalism is not dying, it's evolving with AI, says new study". Euractiv. Retrieved 14 September 2025.
  22. Yi, Qiang; He, Yangfan; Wang, Jianhui (12 June 2025). "SCORE: Story Coherence and Retrieval Enhancement for AI Narratives". arXiv. arXiv.org. Archived from the original on 12 June 2025. Retrieved 11 July 2025.
  23. Redgate (organizer); et al. (June 2025). "Against AI: An Open Letter from Writers to Publishers". Literary Hub. Retrieved 11 July 2025.
  24. Veltman, Chloe (28 June 2025). "Authors petition publishers to curtail their use of AI". NPR. Retrieved 11 July 2025.
  25. Hsu, Hua (2025-07-07). "What Happens After A.I. Destroys College Writing?". The New Yorker (July 7 & 14, 2025). Retrieved 2025-07-09.
  26. Ginzburg, Daniela (28 August 2025). "WhatsApp launches AI-powered writing assistant". MLQ. Retrieved 2025-09-10.
  27. "WhatsApp introduces AI Writing Help: Smarter and secure messaging with Meta AI". WABetaInfo. 28 August 2025. Retrieved 2025-09-10.
  28. "WhatsApp launches AI-powered 'Writing Help' feature to help you adjust your message tone: What is the feature and how to use it". The Times of India. 28 August 2025. Retrieved 2025-09-10.
  29. Chiu, Abigail (September 10, 2025). "New fiction course allows writers to incorporate and "resist" AI influence". The Rice Thresher. Retrieved 2025-09-10.