Tokens can be thought of as pieces of words. Before processing, the input is broken down into tokens that don't necessarily align with word boundaries.
For a detailed explanation of tokens and how to count them, see the OpenAI Tokenizer Guide.
This calculator uses the actual GPT-2 tokenizer (the same one used by many OpenAI models) to provide precise token counts.
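If you want to cross-check a count outside the browser, the same byte-pair encoding is available in Python's tiktoken library under the "gpt2" encoding. The sketch below is illustrative only (it is not the calculator's own code); it splits a string into token pieces and counts them:

```python
# Illustrative sketch: count GPT-2 tokens with tiktoken (not the calculator's own code).
import tiktoken

enc = tiktoken.get_encoding("gpt2")  # GPT-2 byte-pair encoding

text = "Tokenization doesn't align with word boundaries."
token_ids = enc.encode(text)

# Decode each ID individually to see how the text is split into pieces.
pieces = [enc.decode([tid]) for tid in token_ids]
print(pieces)
print(f"{len(token_ids)} tokens")
```

Because the pieces come from a byte-pair vocabulary, a single word may span several tokens and a token may include leading whitespace or punctuation.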
Note: While this calculator uses the GPT-2 tokenizer, some newer models might use slightly different tokenizers. For model-specific token counts, please use OpenAI's Tokenizer tool.
Note: All calculations are performed locally - no text leaves your browser.