Tokenization

Tokenization is a data protection technique that replaces sensitive data with a surrogate value (token) that has no intrinsic meaning or value. The original data is stored separately and securely.
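
A minimal sketch of this idea in Python, assuming a vault-based design with an in-memory store; the TokenVault class and its method names are illustrative, not any specific product's API:

```python
import secrets

class TokenVault:
    """Illustrative token vault mapping random tokens to original values.

    A real deployment would keep the mapping in hardened,
    access-controlled storage; this in-memory dict is a sketch only.
    """

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # Generate a random surrogate with no mathematical
        # relationship to the original data.
        token = secrets.token_urlsafe(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # e.g. a card number
print(token)                    # surrogate, safe to store downstream
print(vault.detokenize(token))  # original, recoverable only via the vault
```

Because the token is generated randomly rather than derived from the input, a system that stores only tokens exposes nothing useful if breached; recovering the original data requires access to the vault itself.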

Along with encryption, tokenization is one of the key data protection techniques. Unlike encryption, a token has no mathematical relationship to the original data, so it cannot be reversed without access to the token store.

For more details, see the dedicated section on tokenization.