Tokens can be thought of as pieces of words. Before a model processes text, the input is broken down into tokens, which don't necessarily align with word boundaries.
For a detailed explanation of tokens and how to count them, see the OpenAI Tokenizer Guide.
This calculator uses the common rule-of-thumb approximations for English text: roughly 4 characters per token, or about 3/4 of a word per token (around 100 tokens per 75 words).
Note: These are rough estimates based on testing with various tokenizers. Actual token counts vary with the specific text, tokenizer, and model used. For exact token counts, use OpenAI's Tokenizer tool.
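As a minimal sketch of how such an estimate can be computed (the calculator's actual formula is not shown here, so this assumes the common chars-per-token and words-per-token heuristics):

```python
def estimate_tokens(text: str) -> int:
    """Roughly estimate the token count of English text.

    Uses two common heuristics and averages them:
    - ~4 characters per token
    - ~3/4 of a word per token (i.e. words * 4/3 tokens)
    These are approximations only; use a real tokenizer for exact counts.
    """
    by_chars = len(text) / 4
    by_words = len(text.split()) * 4 / 3
    return round((by_chars + by_words) / 2)
```

For example, `estimate_tokens("Hello world")` yields 3, in the same ballpark as what a real tokenizer would report for this phrase.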
Note: All text processing and calculations are performed locally - no text leaves your browser.
Tool made by Denis Shiryaev, shir-man.com