Tokenization definitions

Tokenization

Tokenization is the process of breaking a stream of text up into words, phrases, symbols, or other meaningful elements called tokens. The list of tokens becomes input for further processing such as parsing or text mining. Tokenization is useful both in linguistics (where it is a form of text segmentation) and in computer science, where it forms part of lexical analysis.
Found on http://en.wikipedia.org/wiki/Tokenization
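
As an illustration (not part of the source definition), a minimal tokenizer in this sense can be sketched in a few lines of Python; the regular expression used here is one simple choice among many:

    import re

    def tokenize(text):
        # Words stay whole; each punctuation mark becomes its own token.
        return re.findall(r"\w+|[^\w\s]", text)

    print(tokenize("Tokenization splits text into tokens!"))
    # ['Tokenization', 'splits', 'text', 'into', 'tokens', '!']

Practical tokenizers for parsing or text mining go well beyond this sketch, handling contractions, hyphenation, Unicode, and domain-specific symbols.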

Tokenization

In data security, tokenization is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning or value. The token is a reference (i.e. an identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods that render the token infeasible to reverse without access to the tokenization system, for example tokens created from random numbers.
Found on http://en.wikipedia.org/wiki/Tokenization_(data_security)
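
As a rough sketch of this idea (an illustration only, with invented names; a real tokenization system adds access control, persistence, and auditing), the core is a random token generator paired with a protected lookup table:

    import secrets

    class TokenVault:
        """Maps sensitive values to random tokens; the mapping lives only here."""

        def __init__(self):
            self._token_to_value = {}

        def tokenize(self, value):
            # A random token carries no information about the original value.
            token = secrets.token_hex(8)
            self._token_to_value[token] = value
            return token

        def detokenize(self, token):
            # Only a holder of the vault can map the token back.
            return self._token_to_value[token]

    vault = TokenVault()
    token = vault.tokenize("4111-1111-1111-1111")
    print(token)                    # e.g. 'f3a9c1d2e4b5a6c7'
    print(vault.detokenize(token))  # '4111-1111-1111-1111'

Because the token is random rather than derived from the data, an attacker who obtains only tokens learns nothing about the sensitive values.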