Tokens 101: Making sense of tokenization in AI

Tokenization is a fundamental concept for large language models, but what exactly is a token, how do tokenizers work, and why do we use tokens in the first place? Join this session and together we will unravel how textual data is transformed into a machine-understandable format. Through real-world examples and demos, you will grasp the essence of tokenization, its pitfalls, its relevance to prompt engineering, and why it is important to understand these fundamental building blocks of large language models.
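As a small taste of what the session covers, here is a minimal sketch of how a subword tokenizer might split text using greedy longest-match lookup. The toy vocabulary is hand-picked purely for illustration; real tokenizers (e.g. BPE or WordPiece) learn their vocabularies from data and behave differently in detail.

```python
# Illustrative only: a tiny hand-picked vocabulary.
# Real tokenizers learn tens of thousands of subwords from a corpus.
TOY_VOCAB = {"token", "iza", "tion", "un", "happi", "ness"}

def tokenize(text, vocab):
    """Greedily match the longest known subword at each position."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, shrinking until a match.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # No vocabulary entry matches: fall back to a single character.
            tokens.append(text[i])
            i += 1
    return tokens

print(tokenize("tokenization", TOY_VOCAB))  # ['token', 'iza', 'tion']
```

Note how an unfamiliar word degrades gracefully into smaller pieces rather than failing outright; this is one reason subword tokenization replaced fixed word-level vocabularies in modern language models.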

FURTHER SESSIONS FROM #AI

  • Building the future with Azure AI

  • Custom Copilots and Process Orchestration with Microsoft Semantic Kernel and Gen AI

  • The A in IoT Stands for AI: A Brief History of Artificial Intelligence for IoT Professionals

  • The AI Act Is Here: A Game Changer for AI in Europe?

Have you discovered something exciting? Register today!

Don't miss the chance to learn from leading experts and make valuable contacts!