Tokenization

Splitting text into smaller units (tokens), such as words or subword pieces, that AI models process as input.
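As a minimal sketch, the Python snippet below illustrates the idea with a naive word-and-punctuation split (the helper names `tokenize` and `build_vocab` are hypothetical; production models typically use subword schemes such as byte-pair encoding rather than this word-level approach):

```python
import re

def tokenize(text: str) -> list[str]:
    """Split text into word and punctuation tokens (naive word-level scheme)."""
    return re.findall(r"\w+|[^\w\s]", text)

def build_vocab(tokens: list[str]) -> dict[str, int]:
    """Assign each distinct token an integer ID; models consume IDs, not strings."""
    vocab: dict[str, int] = {}
    for tok in tokens:
        vocab.setdefault(tok, len(vocab))
    return vocab

text = "Tokenization splits text into units."
tokens = tokenize(text)
ids = [build_vocab(tokens)[t] for t in tokens]

print(tokens)  # ['Tokenization', 'splits', 'text', 'into', 'units', '.']
print(ids)     # [0, 1, 2, 3, 4, 5]
```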