One of the biggest gains, according to Meta, comes from using a tokenizer with a vocabulary of 128,000 tokens. In the context of LLMs, a token can be a few characters, a whole word, or even a phrase. AIs break human input down into tokens, then use their vocabularies of tokens to generate output.
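
To make that concrete, here is a minimal sketch of the encode/decode round trip in Python. It uses the open-source tiktoken library as a stand-in (an assumption for illustration: its "cl100k_base" encoding is GPT-4's roughly 100,000-token vocabulary, not Llama 3's 128,000-token one, but the mechanics are the same):

```python
# A minimal sketch of tokenization using the open-source tiktoken library.
# Illustrative assumption: "cl100k_base" is GPT-4's ~100,000-token
# vocabulary, not Llama 3's 128,000-token tokenizer.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenizers break human input into tokens."
token_ids = enc.encode(text)                       # text -> integer token IDs
pieces = [enc.decode([tid]) for tid in token_ids]  # each ID -> its text piece

print(token_ids)              # the integer IDs the model actually sees
print(pieces)                 # the characters/words each ID stands for
print(enc.decode(token_ids))  # decoding round-trips back to the original text
```

In general, a larger vocabulary lets the tokenizer cover the same text with fewer, longer tokens, so the model spends its fixed context window and compute on fewer pieces per sentence; that is where the efficiency gain comes from.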