Tokenization (data security)

[Figure: A simplified example of how mobile payment tokenization commonly works via a mobile phone application with a credit card.[1][2] Methods other than fingerprint scanning or PINs can be used at a payment terminal.]

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference (i.e., an identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods that render tokens infeasible to reverse in the absence of the tokenization system, for example tokens created from random numbers.[3] A one-way cryptographic function may also be used to convert the original data into tokens, making it difficult to recreate the original data without access to the tokenization system's resources.[4] To deliver such services, the system maintains a vault database of tokens that are connected to the corresponding sensitive data. Protecting this vault is vital to the system, and strong processes must be in place to ensure its integrity and physical security.[5]
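
As a concrete illustration, the sketch below models such a vault-based tokenizer in Python. It is a minimal sketch, not a production design: an in-memory dictionary stands in for the hardened vault database, and the names TokenVault, tokenize, and detokenize are illustrative assumptions rather than a standard API.

    import secrets

    class TokenVault:
        """Minimal vault: maps random tokens to the sensitive values they replace."""

        def __init__(self):
            self._token_to_value = {}  # the "vault database": token -> sensitive data
            self._value_to_token = {}  # reverse index so a repeated value reuses its token

        def tokenize(self, sensitive: str) -> str:
            # Reuse the existing token if this value has been tokenized before.
            if sensitive in self._value_to_token:
                return self._value_to_token[sensitive]
            # A token drawn from a cryptographically secure random source carries
            # no information about the original value, so it cannot be reversed
            # without the vault mapping.
            token = secrets.token_hex(16)
            self._token_to_value[token] = sensitive
            self._value_to_token[sensitive] = token
            return token

        def detokenize(self, token: str) -> str:
            # Redeem a token for its original value (KeyError for unknown tokens).
            return self._token_to_value[token]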

The tokenization system must be secured and validated using security best practices[6] applicable to sensitive data protection, secure storage, audit, authentication and authorization. The tokenization system provides data processing applications with the authority and interfaces to request tokens or to detokenize them back to sensitive data.
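
The interface described above can be sketched as a thin service layer that checks each caller's authorization, records an audit trail, and delegates storage to the vault. This builds on the hypothetical TokenVault sketch above; the allowlist mechanism and the logging-based audit trail are simplifying assumptions, not a prescribed design.

    import logging

    logging.basicConfig(level=logging.INFO)
    audit = logging.getLogger("token-service")

    class TokenService:
        """Enforces authorization and auditing in front of a TokenVault."""

        def __init__(self, vault, detokenize_allowlist):
            self._vault = vault
            self._allowed = set(detokenize_allowlist)

        def tokenize(self, caller: str, sensitive: str) -> str:
            audit.info("%s requested tokenize", caller)
            return self._vault.tokenize(sensitive)

        def detokenize(self, caller: str, token: str) -> str:
            # Only explicitly authorized callers may recover sensitive data.
            if caller not in self._allowed:
                audit.warning("%s denied detokenize", caller)
                raise PermissionError(f"{caller} is not authorized to detokenize")
            audit.info("%s performed detokenize", caller)
            return self._vault.detokenize(token)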

The security and risk-reduction benefits of tokenization require that the tokenization system is logically isolated and segmented from the data processing systems and applications that previously processed or stored the sensitive data now replaced by tokens. Only the tokenization system can tokenize data to create tokens, or detokenize tokens back to sensitive data, under strict security controls. The token generation method must be proven to have the property that there is no feasible means, through direct attack, cryptanalysis, side-channel analysis, token mapping table exposure or brute-force techniques, to reverse tokens back to live data.
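
The sketch below illustrates why randomly generated tokens satisfy this property while a naive unkeyed hash of the data does not: because the first six digits (the BIN) and last four digits of a card number are routinely exposed, only the six middle digits are secret, and an unsalted hash can be reversed by simple enumeration. The function names are illustrative.

    import hashlib
    import secrets

    def random_token(n_digits: int = 16) -> str:
        # Random digits bear no mathematical relationship to the original PAN,
        # so cryptanalysis and brute force have nothing to work with.
        return "".join(secrets.choice("0123456789") for _ in range(n_digits))

    def naive_hash_token(pan: str) -> str:
        # An unsalted, unkeyed hash is NOT a safe token; see brute_force below.
        return hashlib.sha256(pan.encode()).hexdigest()

    def brute_force(hash_value: str, bin6: str, last4: str):
        # With the 6-digit BIN and the last 4 digits known, only the 6 middle
        # digits are secret: 10**6 candidates, enumerable in seconds.
        for middle in range(10**6):
            candidate = bin6 + format(middle, "06d") + last4
            if hashlib.sha256(candidate.encode()).hexdigest() == hash_value:
                return candidate
        return None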

Replacing live data with tokens in systems is intended to minimize exposure of sensitive data to those applications, stores, people and processes, reducing risk of compromise or accidental exposure and unauthorized access to sensitive data. Applications can operate using tokens instead of live data, with the exception of a small number of trusted applications explicitly permitted to detokenize when strictly necessary for an approved business purpose. Tokenization systems may be operated in-house within a secure isolated segment of the data center, or as a service from a secure service provider.
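
A hypothetical usage of the sketches above shows the intended pattern: ordinary applications hold only tokens, and detokenization is confined to an explicitly allow-listed caller. The application names are invented for illustration.

    vault = TokenVault()
    service = TokenService(vault, detokenize_allowlist={"settlement-app"})

    # An ordinary application stores and passes only the token, never the PAN.
    token = service.tokenize("checkout-app", "4111111111111111")
    order = {"order_id": 1001, "card": token}

    # Only the allow-listed application may redeem the token when strictly necessary.
    pan = service.detokenize("settlement-app", order["card"])

    # Any other caller is refused:
    # service.detokenize("analytics-app", order["card"])  # raises PermissionError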

Tokenization may be used to safeguard sensitive data involving, for example, bank accounts, financial statements, medical records, criminal records, driver's licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII). Tokenization is often used in credit card processing. The PCI Council defines tokenization as "a process by which the primary account number (PAN) is replaced with a surrogate value called a token. A PAN may be linked to a reference number through the tokenization process. In this case, the merchant simply has to retain the token and a reliable third party controls the relationship and holds the PAN. The token may be created independently of the PAN, or the PAN can be used as part of the data input to the tokenization technique. The communication between the merchant and the third-party supplier must be secure to prevent an attacker from intercepting it to gain the PAN and the token."[7]

"De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value."[8] The choice of tokenization as an alternative to other techniques such as encryption will depend on varying regulatory requirements, their interpretation, and acceptance by the respective auditing or assessment entities. This is in addition to any technical, architectural or operational constraints that tokenization imposes in practical use.
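
As one illustration of a token "created independently of the PAN", the sketch below generates a random surrogate that preserves the PAN's length and last four digits. That convention is an assumption for this example (it is common because receipts and legacy format checks keep working), not a method prescribed by the PCI Council.

    import secrets

    def pan_surrogate(pan: str) -> str:
        # Preserve length and the last four digits; randomize everything else.
        # The surrogate reveals nothing beyond what it deliberately preserves.
        last4 = pan[-4:]
        body = "".join(secrets.choice("0123456789") for _ in range(len(pan) - 4))
        return body + last4

Some schemes additionally force the surrogate to fail the Luhn check so that a token can never be mistaken for a live card number.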

  1. ^ "Tokenization demystified". IDEMIA. 2017-09-19. Archived from the original on 2018-01-26. Retrieved 2018-01-26.
  2. ^ "Payment Tokenization Explained". Square. 8 October 2014. Archived from the original on 2018-01-02. Retrieved 2018-01-26.
  3. ^ CardVault: "Tokenization 101"
  4. ^ Ogigau-Neamtiu, F. (2016). "Tokenization as a data security technique". Regional Department of Defense Resources Management Studies. Zeszyty Naukowe AON. 2 (103). Brasov, Romania: Akademia Sztuki Wojennej: 124–135. ISSN 0867-2245.
  5. ^ Ogîgău-Neamţiu, F. (2017). "Automating the data security process". Journal of Defense Resources Management (JoDRM). 8 (2).
  6. ^ "OWASP Top Ten Project". Archived from the original on 2019-12-01. Retrieved 2014-04-01.
  7. ^ Stapleton, J.; Poore, R. S. (2011). "Tokenization and other methods of security for cardholder data". Information Security Journal: A Global Perspective. 20 (2): 91–99. doi:10.1080/19393555.2011.560923. S2CID 46272415.
  8. ^ PCI DSS Tokenization Guidelines
