Tokenization
Replacing sensitive data with a non-sensitive placeholder (token) that has no exploitable value on its own.
Tokenization is a data protection technique that replaces sensitive data (e.g., a credit card number) with a non-sensitive placeholder called a "token." The original data is stored in a secure token vault, and the token can only be mapped back to the original data by an authorized system.
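The vault-based lookup described above can be sketched as follows. This is an illustrative sketch, not a production design: the class name, the in-memory dictionary standing in for a secure vault, and the token length are all assumptions.

```python
import secrets

# Minimal sketch of a token vault: a mapping from randomly
# generated tokens back to the original sensitive values.
# (Illustrative only; a real vault is a hardened, access-controlled store.)
class TokenVault:
    def __init__(self):
        self._store = {}  # token -> original value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it has no mathematical
        # relationship to the original data.
        token = secrets.token_urlsafe(16)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only a system with access to the vault can reverse the mapping.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
original = vault.detokenize(token)
```

Note that without `vault`, the token by itself reveals nothing: there is no key that can decrypt it.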
Tokenization vs Encryption
| | Tokenization | Encryption |
|---|---|---|
| Reversible? | Only with access to the token vault | Yes, with the decryption key |
| Format-preserving? | Yes (token can match the original format) | Usually not (format-preserving encryption is an exception) |
| Mathematical relationship? | None | Based on algorithm and key |
Common Uses
- Payment card processing (PCI DSS compliance)
- Storing sensitive identifiers in databases
- Cloud data protection
Legal Relevance
Tokenization is a form of pseudonymisation, which GDPR Article 32 lists among the appropriate technical measures for securing personal data.