This lecture is from the course Generative AI for Writers.
Token limits cap how much text an AI model can handle at once, and they apply both to the input you provide and to the output the model generates.
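To get a feel for how much of your writing fits inside a limit, a common rule of thumb is that one token is roughly four characters of English text. This is only an approximation (real tokenizers vary), and the function names and the 8,000-token limit below are illustrative, not tied to any specific model:

```python
# Rough token estimate: roughly 4 characters of English per token.
# This is a rule-of-thumb sketch, not an exact tokenizer.
def estimate_tokens(text: str) -> int:
    return max(1, len(text) // 4)

def fits_in_context(text: str, limit: int = 8000) -> bool:
    # `limit` is a hypothetical context window; real models differ.
    return estimate_tokens(text) <= limit

chapter = "word " * 3000  # about 15,000 characters of draft text
print(estimate_tokens(chapter))   # roughly 3,750 tokens
print(fits_in_context(chapter))   # True under the assumed 8,000 limit
```

A quick check like this can tell you, before pasting a long chapter into a chat window, whether you should expect the model to see all of it or only part.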
I've discovered that when working with AI for writing, it's important to keep token limits in mind, especially when handling large amounts of text. Even when a model accepts a long input, I've noticed it can start to lose track of details mentioned earlier in that text.
As newer generations of AI models arrive with larger context windows, this problem may become less significant over time.