Tokens 101: Making sense of tokenization in AI
Tokenization is a fundamental concept for large language models, but what exactly is a token, how do tokenizers work, and why would we want to use tokens in the first place? Join this session and together we will unravel the mechanisms behind transforming textual data into machine-readable formats. Through real-world examples and demos, you will grasp the essence of tokenization, its pitfalls, its relevance to prompt engineering, and why it pays to have some understanding of these fundamental building blocks of large language models.
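To give a flavor of what the session covers: a tokenizer splits text into pieces (tokens) drawn from a fixed vocabulary. The sketch below is a minimal greedy longest-match tokenizer over a made-up toy vocabulary; real tokenizers for LLMs (e.g. byte-pair-encoding variants) learn their vocabularies from data and are considerably more involved.

```python
# Illustrative subword tokenizer: greedy longest-match against a toy,
# hand-picked vocabulary (an assumption for this demo, not a real
# LLM vocabulary).
TOY_VOCAB = {"token", "iza", "tion", "izer", "un", "ravel", " "}

def tokenize(text: str, vocab: set[str]) -> list[str]:
    """Split text into the longest matching vocabulary entries,
    scanning left to right. Unknown characters fall back to
    single-character tokens."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match first.
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # fallback: one character
            i += 1
    return tokens

print(tokenize("tokenization", TOY_VOCAB))
# → ['token', 'iza', 'tion']
```

Already this toy version hints at one of the pitfalls discussed in the session: the same word can split into different tokens depending on the vocabulary, which in turn affects prompt length and model behavior.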
Speaker
Roelant Dieben
Owner @ XPRTZ.cloud
With over 20 years of experience developing software on the Microsoft stack, Roelant Dieben has a lot to share about stuff that has been obsolete for years. With XPRTZ.cloud he helps companies with their Azure cloud challenges. He is a Microsoft Azure MVP with a passion for machine learning & AI and application lifecycle management.