AI Models

Tokenization

The process of breaking text into smaller units, called tokens, that an AI model can read and process. Correct tokenization ensures the model interprets your prompt as intended.

Usage Example

When you write a prompt in Videz such as 'a person dancing on a beach at sunset', the AI first tokenizes the text into smaller pieces, then uses those tokens to interpret the request and generate the corresponding video.
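The idea can be sketched in a few lines of Python. This is a simplified word-level tokenizer for illustration only; production models typically use subword schemes such as byte-pair encoding, and the `tokenize`/`encode` helpers here are hypothetical, not part of any real API.

```python
import re

def tokenize(text):
    """Split text into lowercase word tokens.

    Real models use subword schemes such as byte-pair encoding (BPE);
    this word-level split is a simplified illustration of the same idea.
    """
    return re.findall(r"[a-z']+", text.lower())

def encode(tokens, vocab):
    """Map each token to an integer ID, growing the vocab as new tokens appear."""
    return [vocab.setdefault(tok, len(vocab)) for tok in tokens]

prompt = "a person dancing on a beach at sunset"
vocab = {}
tokens = tokenize(prompt)
ids = encode(tokens, vocab)
print(tokens)  # ['a', 'person', 'dancing', 'on', 'a', 'beach', 'at', 'sunset']
print(ids)     # [0, 1, 2, 3, 0, 4, 5, 6]
```

Note that the repeated word 'a' maps to the same ID both times it appears; it is these integer IDs, not the raw characters, that the model actually consumes.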