Lexing

Lexing, also called scanning or tokenization, is the process of dividing a textual input into a sequence of small annotated chunks, called tokens, that are suitable for processing by a parser. Each token pairs a category (such as "identifier" or "number") with the slice of input text it covers.
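
As a minimal sketch of the idea (in Python, with illustrative token categories not drawn from any particular language), a lexer can be built from one regular expression with a named group per category; each match of that expression yields one token:

    import re

    # One named group per token category; the names and patterns here
    # are illustrative, not a real language's specification.
    TOKEN_SPEC = [
        ("NUMBER", r"\d+"),
        ("IDENT",  r"[A-Za-z_]\w*"),
        ("OP",     r"[+\-*/=]"),
        ("LPAREN", r"\("),
        ("RPAREN", r"\)"),
        ("SKIP",   r"\s+"),   # whitespace separates tokens but is discarded
    ]
    MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

    def lex(text):
        """Yield (category, text) pairs -- the annotated chunks a parser consumes."""
        pos = 0
        while pos < len(text):
            match = MASTER.match(text, pos)
            if match is None:
                raise SyntaxError(f"unexpected character {text[pos]!r} at {pos}")
            pos = match.end()
            if match.lastgroup != "SKIP":
                yield (match.lastgroup, match.group())

For example, lex("x = 42 + y") produces ("IDENT", "x"), ("OP", "="), ("NUMBER", "42"), ("OP", "+"), ("IDENT", "y"): the parser then works with this flat token stream instead of raw characters.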