EN tokenization
LV sadalīšana daļiņās
RU разметка
DE Tokenisierung
FR segmentation du texte en unités lexicales
Definition: The process of identifying meaningful units within strings, such as words, keywords, phrases, symbols, and other elements called "tokens."
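The definition above can be illustrated with a minimal sketch; the regex pattern and function name are illustrative assumptions, not part of any Microsoft product:

```python
import re

def tokenize(text: str) -> list[str]:
    # Illustrative tokenizer: treat runs of word characters as one token
    # and each remaining non-whitespace symbol as its own token.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Hello, world!"))  # → ['Hello', ',', 'world', '!']
```

Real tokenizers vary widely (whitespace-based, rule-based, subword-based); this sketch only shows the general idea of splitting a string into meaningful units.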
Microsoft Terminology 2023. Entry from the Microsoft Language Portal.
© 2023 Microsoft Corporation. All rights reserved.