How Do Prompt Tokens Work in ChatGPT?
Have you ever wondered how ChatGPT understands and processes the prompts you provide? One important aspect of this is prompt tokens.
Prompt tokens are the pieces of words that ChatGPT generates when it processes your input. Tokens can include trailing spaces and even sub-words, and tokenization is language-dependent.
For instance, Spanish words tend to have a higher token-to-character ratio, making it more expensive to use the API for languages other than English.
What are tokens in ChatGPT?
To give you an idea of how tokens work, here are some helpful rules of thumb:
- 1 token ~ 4 chars in English
- 1 token ~ ¾ words
- 100 tokens ~ 75 words
- 1–2 sentences ~ 30 tokens
- 1 paragraph ~ 100 tokens
- 1,500 words ~ 2,048 tokens ~ 5.4 pages
- 3,000 words ~ 4,096 tokens ~ 10.8 pages
- 6,000 words ~ 8,192 tokens ~ 21.6 pages
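The rules of thumb above can be turned into a rough estimator. Here is a minimal sketch in Python; the function names are illustrative, and the heuristics are the approximations from the list, not a real tokenizer:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters-per-token
    rule of thumb for English text (an approximation only)."""
    return max(1, round(len(text) / 4))

def estimate_tokens_from_words(word_count: int) -> int:
    """Alternative estimate: 1 token is ~3/4 of a word,
    i.e. roughly 4 tokens for every 3 words."""
    return round(word_count * 4 / 3)

# 75 words ~ 100 tokens, matching the rule of thumb above
print(estimate_tokens_from_words(75))    # 100
# 1,500 words lands close to the 2,048-token figure
print(estimate_tokens_from_words(1500))  # 2000
```

For anything where the exact count matters (billing, hard limits), use a real tokenizer instead of these heuristics.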
How many tokens can you use in ChatGPT?
Depending on the model, a request can use up to 32,768 tokens shared between the prompt and the completion.
In GPT-4, which launched in March 2023, the token limit is 8,192 tokens. The 32,768-token context is, for now, only offered to a select group of test users.
If you're using ChatGPT 3.5 (the model that's still free for the public to use), the limit is 4,096 tokens.
Older models, like 3.0, are limited to 2,048 tokens or less.
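For quick reference, those limits can be collected into a small lookup table. A minimal sketch; the model names are illustrative shorthand, and the figures are the approximate, as-of-early-2023 values discussed above:

```python
# Approximate combined (prompt + completion) context limits,
# as of early 2023 -- these change as new models are released.
TOKEN_LIMITS = {
    "gpt-4-32k": 32768,      # limited test access only
    "gpt-4": 8192,
    "gpt-3.5": 4096,         # the free ChatGPT model
    "older-models": 2048,    # GPT-3.0 and earlier
}

def context_limit(model: str) -> int:
    """Look up the shared prompt + completion token limit."""
    return TOKEN_LIMITS[model]

print(context_limit("gpt-3.5"))  # 4096
```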
Example of Prompt Tokens:
Let’s say you’re using the free version of ChatGPT 3.5 with the 4096 combined limit.
You enter a prompt that's 4,000 "prompt" tokens, approximately 3,000 words.
That leaves your result — the “completion” tokens — with only 96 remaining tokens to be used.
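The arithmetic above can be sketched as a simple budget check. The function name is illustrative, not part of any API:

```python
def completion_budget(prompt_tokens: int, limit: int = 4096) -> int:
    """Tokens left for the completion once the prompt is
    counted against the shared context limit."""
    remaining = limit - prompt_tokens
    if remaining <= 0:
        raise ValueError("Prompt alone meets or exceeds the token limit")
    return remaining

# A 4,000-token prompt against the 4,096-token limit
# leaves only 96 tokens for the completion.
print(completion_budget(4000))  # 96
```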
What happens if I go over the token limit?
There are usually two outcomes if you go over the token limit. The most common is simply an error message telling you that you're over the length limit and should try again.
When that happens, I usually copy my prompt into another editor, trim it down, start a new chat (upper-left corner of the ChatGPT screen), make sure my initial instructions are in place, and paste in the shorter version.
Another possible outcome is that it simply starts ignoring some of your previous explanations. If you're familiar with prompt-sequencing, where you work with the AI to sequentially produce an output, this means at some point it will start "forgetting" your initial conversation.
The problem? It will be pretty hard for you to know when this happens — it won’t tell you.
Whenever that occurs, I copy-paste the initial instructions I gave it and add a reminder to the bot.
But don’t worry too much about the total token limit, there are often creative ways to work within the limit, such as condensing your prompt or breaking the text into smaller pieces.
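Breaking text into smaller pieces can be done mechanically. Here is a minimal sketch that splits on word boundaries and sizes the chunks with the ~4-characters-per-token heuristic; a real implementation would count tokens exactly with a tokenizer such as Tiktoken:

```python
def chunk_text(text: str, max_tokens: int = 3000) -> list[str]:
    """Split text on word boundaries into chunks that stay under
    an estimated token budget (~4 characters per token)."""
    max_chars = max_tokens * 4
    chunks, current, length = [], [], 0
    for word in text.split():
        # +1 accounts for the joining space
        if length + len(word) + 1 > max_chars and current:
            chunks.append(" ".join(current))
            current, length = [], 0
        current.append(word)
        length += len(word) + 1
    if current:
        chunks.append(" ".join(current))
    return chunks
```

Each chunk can then be sent as its own prompt, with your standing instructions repeated at the top so the model doesn't lose context between chunks.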
Things to look out for with ChatGPT prompts
When writing prompts, it’s important to provide context through instructions and examples. Quality also matters, so avoid spelling errors and unclear text.
The prompt size should also be considered: the prompt and the resulting completion together must stay under your model's limit, whether that's 2,048, 4,096, or 8,192 tokens.
If you want to learn more about tokens or try tokenizing text programmatically, there are several helpful tools available, including OpenAI’s interactive Tokenizer tool and Tiktoken, a fast BPE tokenizer specifically used for OpenAI models.
To wrap it up, prompt tokens are the pieces of words generated when ChatGPT processes input, and understanding them is essential for crafting effective prompts, managing costs, and staying within your model's token limit.
By keeping these tips in mind, you’ll be well on your way to unlocking the full potential of ChatGPT.