PCI DSS Tokenization Guidance Supplement
PCI Security Standards Council
Date of Publication:
The purpose of this Information Supplement is to provide guidance for payment industry stakeholders when developing, evaluating, or implementing a tokenization solution, including how tokenization may impact Payment Card Industry Data Security Standard (PCI DSS) scope. This document provides supplemental guidance on the use of tokenization and does not replace or supersede PCI DSS requirements.
This document does not define the technical specifications or steps required to implement a tokenization solution, nor does it describe how to validate PCI DSS compliance for environments using tokenization. This document is not an endorsement of any specific technologies, products, or services.
This Information Supplement is intended for merchants that store, process, or transmit cardholder data and are seeking guidance on how implementing a tokenization solution may impact the scope of their PCI DSS compliance efforts. Other payment industry stakeholders, including payment processors, acquirers, service providers, assessors, and solution vendors, may also find the information in this document useful.
Introduction to Tokenization
Tokenization is a process by which the primary account number (PAN) is replaced with a surrogate value called a token. De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value.
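The vault-based approach described above can be sketched as follows. This is a hypothetical, minimal illustration only (the `TokenVault` class and its method names are not from the document): a real tokenization system must itself be protected with strong access controls, encryption, and monitoring, and may use other token-generation methods.

```python
import secrets

class TokenVault:
    """Illustrative token vault mapping random surrogate tokens to PANs.

    Hypothetical sketch only. Because each token is generated randomly,
    knowing only the surrogate value is infeasible for recovering the PAN;
    the mapping exists solely inside the (strongly protected) vault.
    """

    def __init__(self):
        self._token_to_pan = {}   # token -> PAN (must be strongly protected)
        self._pan_to_token = {}   # PAN -> token, so a repeat PAN reuses its token

    def tokenize(self, pan: str) -> str:
        # Reuse the existing token if this PAN was already tokenized.
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # Random surrogate: carries no information derived from the PAN.
        token = secrets.token_hex(8)
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # De-tokenization redeems a token for its associated PAN.
        return self._token_to_pan[token]
```

Under this sketch, systems that handle only the token never see the PAN; only a call back into the vault (de-tokenization) can redeem it.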
Depending on the particular implementation of a tokenization solution, tokens used within merchant systems and applications may not need the same level of security protection associated with the use of PAN. Storing tokens instead of PANs is one alternative that can help to reduce the amount of cardholder data in the environment, potentially reducing the merchant’s effort to implement PCI DSS requirements.
The following key principles relate to the use of tokenization and its relationship to PCI DSS:
- Tokenization solutions do not eliminate the need to maintain and validate PCI DSS compliance, but they may simplify a merchant’s validation efforts by reducing the number of system components for which PCI DSS requirements apply.
- Verifying the effectiveness of a tokenization implementation is necessary and includes confirming that PAN is not retrievable from any system component removed from the scope of PCI DSS.
- Tokenization systems and processes must be protected with strong security controls and monitoring to ensure the continued effectiveness of those controls.
- Tokenization solutions can vary greatly across different implementations, including differences in deployment models, tokenization and de-tokenization methods, technologies, and processes. Merchants considering the use of tokenization should perform a thorough evaluation and risk analysis to identify and document the unique characteristics of their particular implementation, including all interactions with payment card data and the particular tokenization systems and processes.
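One part of verifying a tokenization implementation, per the second principle above, is confirming that PAN is not retrievable from components removed from PCI DSS scope. A common supporting check is scanning stored data for values that look like PANs. The sketch below is an illustrative assumption, not a validation procedure from this document: it combines a digit-length pattern with the Luhn checksum (the check digit scheme used by payment card numbers) to reduce false positives.

```python
import re

def luhn_valid(number: str) -> bool:
    """Luhn checksum test; payment card PANs carry a Luhn check digit."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

# PANs are 13-19 digits; require word boundaries so tokens with
# interleaved letters (e.g. hex surrogates) are not flagged.
PAN_PATTERN = re.compile(r"\b\d{13,19}\b")

def find_pan_candidates(text: str) -> list[str]:
    """Return digit strings that look like PANs (length + Luhn check)."""
    return [m for m in PAN_PATTERN.findall(text) if luhn_valid(m)]
```

For example, a scan of a supposedly out-of-scope data store should return no candidates; any hit would need investigation before that component is treated as outside PCI DSS scope.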