Answer:

Tokenizing is the process of splitting data or text into its minimal useful or meaningful units.

Tokenizing breaks a sequence of characters (a string) into keywords, words, and meaningful phrases. Each of these resulting elements is known as a token.

A token is an individual word, phrase, or sentence produced by the tokenizing process; some characters, such as punctuation, are typically discarded during this step.
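As a minimal sketch of the idea described above, the following Python function splits a string on whitespace and strips punctuation, so the punctuation characters are discarded and only the word tokens remain (the function name `tokenize` is just an illustrative choice):

```python
import string

def tokenize(text):
    # Split the text on whitespace, then strip surrounding punctuation
    # from each piece; punctuation is discarded, as described above.
    tokens = []
    for word in text.split():
        cleaned = word.strip(string.punctuation)
        if cleaned:  # skip pieces that were pure punctuation
            tokens.append(cleaned)
    return tokens

print(tokenize("Hello, world! Tokenizing splits text into units."))
# → ['Hello', 'world', 'Tokenizing', 'splits', 'text', 'into', 'units']
```

Real tokenizers (for example, those in NLP libraries) use more sophisticated rules, but the core idea is the same: a string goes in, and a list of tokens comes out.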