1. Complimentary Webinar:
PCI Compliance Without Compensating Controls
How to Take Your Mainframe Out of Scope
Complying with PCI is not easy. For the past seven years, organizations have found themselves in a
perennial battle not just to comply with PCI but to keep pace with its evolution. PCI DSS v2.0 does
not make that task any easier: it now requires all stored cardholder data to be identified, then
protected or deleted.
With older technologies and techniques now appearing almost obsolete, companies are asking
themselves what their long-term plan is for addressing PCI compliance. With the vast amounts of
structured and unstructured data stored on the mainframe, they have been forced to rely on
compensating controls as a stop-gap measure to take the mainframe out of scope.
Protegrity and Xbridge have teamed up to make your decision easier. Using new and proven
mainframe discovery and tokenization tools, there is no longer a need to delay compliance each
year through compensating controls. Now you can quickly discover and map all cardholder data on
the mainframe, tokenize it, and permanently eliminate it from scope.
Join this webcast on April 12 to learn more about:
• New requirements with PCI DSS 2.0 and what they mean to you
• Automated data discovery on the mainframe
• How the combination of data discovery and tokenization can support PCI compliance and
ensure that performance, availability, transparency, and your existing SLAs are never impacted
2. Speakers:
Mike Kibort joined Xbridge Systems in 2008 with experience spanning 20 years in
technical sales, product management, and marketing. He has extensive product,
project, and partner management experience, as well as experience managing
company operations. Mr. Kibort’s experience has ranged from selling complete
engineered solutions for factory automation and equipment, to providing IT services
and software solutions to some of the largest companies in the world. Mike is a
participant and/or member of multiple data security and industry focus organizations
such as the PCI SSC, ISACA, and the Information Security Group, and has authored the
white paper “Achieving PCI Compliance on the Mainframe.”
Ulf Mattsson is Chief Technology Officer at Protegrity, where he created the architecture
of the Protegrity database security technology. Prior to joining Protegrity, he worked
20 years at IBM in software development as a consulting resource to IBM's
Research organization, specializing in the areas of IT Architecture and IT Security.
He is the inventor of more than 20 patents in the areas of Encryption Key
Management, Policy Driven Data Encryption, Internal Threat Protection, Data Usage
Control and Intrusion Prevention. Ulf received a master's degree in physics from
Chalmers University of Technology in Sweden, and holds degrees in electrical
engineering and finance.
4. Agenda
Introductions
Business Drivers for Data Protection
Changes in PCI DSS V2.0 – What they mean
Mainframe Data: Challenges preventing compliance
Taking the mainframe out of scope of PCI DSS
• Who is Xbridge Systems?
• DataSniff Mainframe Data Discovery Software
• Who is Protegrity?
• Protegrity Tokenization
Questions
5. Business Drivers for Data Protection
Government
• Sarbanes Oxley Act
• Gramm Leach Bliley Act
• Health Insurance Portability & Accountability Act (HIPAA)
• Federal Information Security Management Act (FISMA)
• State Breach Notification Laws (e.g. California State Bill 1386)
Industry
• Payment Card Industry Data Security Standard (PCI DSS)
• Health Insurance Portability & Accountability Act (HIPAA)
• Health Information Technology for Economic and Clinical Health
Act (HITECH)
Company
• Brand Protection in general
• High-wealth individuals, etc.
6. Data Security Impacts a Wide Range of Data
Regulations:
• State Breach Notification Laws (e.g. CA SB 1386)
• Payment Card Industry Data Security Standard (PCI DSS)
• Federal Legislation (e.g. SB 751)
• Health Insurance Portability & Accountability Act (HIPAA) – medical related information (Patient / Doctor, etc.)
• Other Laws – Sarbanes-Oxley Act (SOX), Gramm-Leach-Bliley Act
Data covered:
• Credit / Debit Card Numbers
• Social Security Number
• Driver’s License Number
• Financial Account Numbers
• Passport Number
• State or U.S.-Issued Driver's License or ID Number
• Date of Birth / Birth Place
• Postal or Email Address
• Telephone Number
• Mother's Maiden Name
• Alien Registration Number
• Employer or Tax ID Number
• Medicaid or Food Stamp Account Number
• Bank or Debit Card Account Number, Together With PIN
• Vehicle Registration Number
• Biometric Data – Face, fingerprint, handwriting
• Unique Electronic Number, Address, or Routing Code
• Medical Records / Health Information
• Telecommunication ID Information or Access Device
• and more
7. Changes in PCI DSS V2.0 Affecting Stored PII
Must define the Cardholder Data Environment (CDE)
• Verify and document that no cardholder data exists outside of the CDE
• PCI DSS defines all cardholder data, within or outside of the CDE, as IN SCOPE unless it is deleted, migrated, or consolidated into the defined CDE, or the CDE is expanded to include that data
• Scoping results must be documented for assessor reference
• Mainframe data is not excluded
• Compensating controls are no longer adequate
• Access controls are only part of the PCI DSS requirement
8. PCI DSS V2.0: Compensating Controls
PCI DSS V2.0 relating to data at rest and compensating controls
• Only companies that have performed a risk analysis and have legitimate technical or documented business constraints can consider the use of compensating controls to achieve PCI compliance. Compensating controls may be considered when an entity cannot meet a requirement explicitly as stated, due to legitimate technical or documented business constraints, but has sufficiently mitigated the associated risk through implementation of other controls.
Compensating controls must satisfy the following criteria:
1. Meet the intent and rigor of the original stated PCI DSS requirement;
2. Provide a similar level of defense as the original PCI DSS requirement;
3. Be "above and beyond" other PCI DSS requirements (not simply in compliance with other PCI DSS requirements); and
4. Be commensurate with the additional risk imposed by not adhering to the PCI DSS requirement.
In addition, the assessor is required to thoroughly evaluate compensating controls during each annual PCI assessment.
9. PCI DSS V2.0: Access Controls
Access controls are only part of an overall PCI DSS solution (see Requirement 7 of PCI DSS V2.0)
PCI DSS requires access controls combined with data remediation to meet compliance with PCI DSS V2.0
The Scope of Assessment for Compliance with PCI DSS requires you to understand and manage the people, processes, and technology that store, process, or transmit cardholder data or sensitive authentication data:
• Discover, define, and create an inventory of all locations of cardholder data – create a CDE
• Encrypt, tokenize, or delete all cardholder data
• Create and manage access controls relating to all cardholder data
A fundamental problem with achieving compliance on the mainframe has been the challenge of creating a comprehensive CDE that includes mainframe data
10. Mainframe Data – The Critical Data
Up to December 31st, 2010:
[Pie chart: mainframe data with compensating controls (70%) vs. other databases (30%)]
70% of the world’s mission-critical data is stored on mainframes*
Compensating controls have been widely used to exempt mainframe data from the PCI compliance process
*Source: IBM / SHARE Mainframe Executive Study, 2007
11. Mainframe Data – The Critical Data
As of January 1st, 2011:
[Pie chart: mainframe data with compensating controls (70%) vs. other databases (30%)]
As of January 1st, 2011, PCI DSS Version 2.0 requires that ALL cardholder data be identified and protected
ALL mainframe data is now “IN SCOPE” for PCI compliance
Previous use of “compensating controls” through RACF, Top Secret, or ACF2 is now considered insufficient protection for these large-scale stores of sensitive data
12. The Mainframe Data Discovery Challenge
Companies do not know what really resides in their
mainframes
They do not know where ALL of their sensitive data
is located
They do not know how they will meet compliance
without knowledge of mainframe data
They do not know how to manage/prepare for the
auditing process to ensure success and compliance
13. The Mainframe Discovery Challenge (cont.)
Why the challenge?
No standard network access to mainframe data for a broad class of data file types
No standard network access to mainframe metadata
Internal mainframe access to metadata is not supported by standard programming languages (C, COBOL, Java)
Lack of facilities to access production data while minimizing impact on production throughput
Packed decimal presents a real challenge to standard crawling tools
14. Mainframe Data is not Open Systems Data
Scope of environment
Terabytes of clear text and encoded text data
No tools have been available for searching for text within all mainframe files
Storage methodologies
Data is “owned” by database subsystems and not accessible by other applications
No established standards for identifying structure in older databases like IMS & IDMS
No structured directories like open systems
Datasets and types must be dealt with on an individual basis (IBM IMS, DBMS, DB2, VSAM, Sequential, CA IDMS, BDAM, PDS/PDSE, Flat Files, Migrated, Tape)
15. Solution Overview
Using DataSniff Mainframe Data Discovery
software and Protegrity Tokenization to
take the mainframe out of scope
16. Who is Xbridge Systems?
Founded by Dr. Gene Amdahl and Ray Williams Jr. in
1994 as Commercial Data Servers
Changed name to Xbridge Systems in 1999
Experts in mainframe data access technologies
Shifted focus to data security in late 2009
Released DataSniff Mainframe Data Discovery Tool in
late 2010
17. DataSniff Mainframe Data Discovery Software
Software Architecture
Generating an accurate assessment of the entire
Cardholder Data Environment within the
Mainframe
Discovering and mapping the location of cardholder
data on the mainframe
20. Why DataSniff for Mainframe Data Discovery?
DataSniff is the only automated data discovery tool for
mainframe systems
DataSniff provides the capability to meet the critical
first step in PCI compliance and assures all cardholder
data within the enterprise is identified for protection
Developed to minimize the potential impact of performing analysis on production systems or systems that have restricted availability
Provides confirmation that all sensitive data within scope of PCI DSS has been remediated and/or risk-assessed
21. Who is Protegrity?
Proven enterprise data protection software leader since the late 1990s
Business driven by compliance
• PCI (Payment Card Industry)
• PII (Personally Identifiable Information)
• PHI (Protected Health Information) – HIPAA
• State and Foreign Privacy Laws
Servicing many Industries
• Retail, Hospitality, Travel and Transportation
• Financial Services, Insurance, Banking
• Healthcare
• Telecommunications, Media and Entertainment
• Manufacturing and Government
22. Current, Planned Use of Enabling Technologies
Strong interest in database encryption, data masking, tokenization
Technology                      Evaluating   Current Use   Planned Use <12 Months
Access controls                     1%           91%               5%
Database activity monitoring       18%           47%              16%
Database encryption                30%           35%              10%
Backup / Archive encryption        21%           39%               4%
Data masking                       28%           28%               7%
Application-level encryption        7%           29%               7%
Tokenization                       22%           23%              13%
23. PCI DSS - Ways to Render the PAN Unreadable
Two-way cryptography with associated key management
processes
One-way cryptographic hash functions
Index tokens and pads
Truncation (or masking – xxxxxx xxxxxx 6781)
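The options above can be sketched in a few lines of Python. This is an illustrative sketch only; the function names are ours, not part of any PCI tool, and a real deployment would add salting, key management, and audit trails:

```python
# Three of the PAN-protection options listed above, illustrated with a
# made-up 16-digit number (not a real card number).
import hashlib

def hash_pan(pan: str) -> str:
    """One-way cryptographic hash: irreversible, fixed-length output."""
    return hashlib.sha256(pan.encode()).hexdigest()

def truncate_pan(pan: str) -> str:
    """Truncation: store only the last four digits."""
    return pan[-4:]

def mask_pan(pan: str) -> str:
    """Masking for display: show only the last four digits."""
    return "x" * (len(pan) - 4) + pan[-4:]

pan = "1234561234566781"
print(mask_pan(pan))   # xxxxxxxxxxxx6781
```

Note how hashing changes both the length and the character set of the value, while truncation and masking preserve only the digits a downstream application is allowed to see.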
24. Evaluating Field Encryption & Tokenization
[Chart: intrusiveness to applications and databases vs. resulting data length]
Standard encryption – output longer than the original:
• Hashing – !@#$%a^///&*B()..,,,gft_+!@4#$2%p^&*
• Strong Encryption – !@#$%a^.,mhu7/////&*B()_+!@
Tokenizing or formatted encryption – original length preserved:
• Alpha encoding – 123456 aBcdeF 1234
• Partial – 123456 777777 1234
Clear Text Data – 123456 123456 1234
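The “Partial” format shown above, where a token keeps the first six and last four digits and replaces only the middle, can be sketched as follows. The helper name is ours for illustration, not Protegrity's API:

```python
# Generate a "partial" format-preserving token: same length and character
# set as the PAN, with first six and last four digits kept in the clear.
import secrets

def partial_token(pan: str) -> str:
    middle = "".join(secrets.choice("0123456789")
                     for _ in range(len(pan) - 10))
    return pan[:6] + middle + pan[-4:]

tok = partial_token("1234561234566781")
# e.g. "123456" + six random digits + "6781" – same length as the original
```

Because the token has the same length and format as the original PAN, it can pass through existing applications and database schemas unchanged, which is the transparency advantage the chart illustrates.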
25. Positioning Different Protection Options
[Matrix: Strong Encryption vs. Formatted Encryption vs. Next Gen Tokenization, rated best to worst on each criterion]
Security criteria:
• High risk data
• Compliance to PCI, NIST
Initial cost criteria:
• Transparent to applications
• Expanded storage size
• Transparent to database schema
• Performance impact when loading data
Operational cost criteria:
• Long life-cycle data
• Unix or Windows mixed with “big iron” (EBCDIC)
• Easy re-keying of data in a data flow
• Disconnected environments
• Distributed environments
26. Different Approaches for Tokenization
Traditional Tokenization
• Dynamic Model
• Pre-Generated Model
Next Generation Tokenization: Protegrity Tokenization
27. Traditional Tokenization: Dynamic Model
[Figure: applications querying a dynamic lookup table of token / encrypted CCN pairs]
Dynamic Token Lookup Tables
• Lookup tables are dynamic; they grow as more unique tokens are needed (example: the number of credit cards processed by a merchant)
• Table includes a hash value, a token, the encrypted CCN, and other administrative columns
• Large footprint, on the order of tens or hundreds of millions of CCNs
Performance
• 5 tokens per second (outsourced) to 5,000 tokens per second (in-house)
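The dynamic model above can be sketched as a table that grows by one row per unique card number. This is a minimal illustration with assumed names; string reversal stands in for real encryption, which would of course use a proper cipher and key management:

```python
# Minimal sketch of a dynamic token lookup table: one row per unique CCN,
# each row holding a hash, a random token, and the "encrypted" CCN.
import hashlib
import secrets

class DynamicTokenTable:
    def __init__(self):
        self.rows = {}       # hash(CCN) -> (token, encrypted CCN)
        self.by_token = {}   # reverse index for detokenization

    def tokenize(self, ccn: str) -> str:
        h = hashlib.sha256(ccn.encode()).hexdigest()
        if h in self.rows:                 # card seen before: reuse its token
            return self.rows[h][0]
        token = self._fresh_token(len(ccn))
        enc = ccn[::-1]                    # placeholder for real encryption
        self.rows[h] = (token, enc)        # table grows by one row
        self.by_token[token] = enc
        return token

    def _fresh_token(self, n: int) -> str:
        while True:
            t = "".join(secrets.choice("0123456789") for _ in range(n))
            if t not in self.by_token:     # retry on token collision
                return t

table = DynamicTokenTable()
t1 = table.tokenize("1234567890123456")
t2 = table.tokenize("1234567890123456")
assert t1 == t2                            # same card, same token, one row
```

The sketch makes the slide's footprint point concrete: every distinct card number adds a row, so a merchant processing hundreds of millions of cards carries a table of that size.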
28. Traditional Tokenization: Pre-generated Model
[Figure: applications querying a pre-generated static lookup table of token / encrypted SSN pairs]
Pre-Generated Static Lookup Tables
• Lookup tables are static; all possible combinations are pre-generated (example: all social security numbers required to support a healthcare provider’s membership)
• Table includes a hash value, a token, the encrypted SSN, and other administrative columns
• Large footprint, on the order of tens or hundreds of millions of SSNs
• Pre-generation may be impractical due to the sheer size of all combinations (example: credit card numbers)
Performance
• Improved performance by avoiding per-request operations (dynamic token generation and encryption)
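The pre-generated model can be sketched with a small domain standing in for the full SSN space; the function name and three-digit domain are ours for illustration. The exponentially larger credit card space is exactly why the slide notes pre-generation can be impractical:

```python
# Sketch of a pre-generated static lookup table: every possible value in a
# (deliberately tiny) domain is mapped to a shuffled token up front, so
# tokenization at run time is a single constant-time dictionary read.
import random

def build_static_table(domain_size: int, seed: int = 42) -> dict:
    values = [f"{i:03d}" for i in range(domain_size)]
    tokens = values[:]
    random.Random(seed).shuffle(tokens)   # fixed seed: table is reproducible
    return dict(zip(values, tokens))      # value -> token, frozen at build time

table = build_static_table(1000)          # one row per possible 3-digit value
token = table["042"]                      # lookup only; nothing is generated
```

Scaling the same idea to all nine-digit SSNs means a billion pre-built rows, which illustrates both the model's speed (pure lookup) and its footprint problem.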
29. Additional Complexity with Additional Tokenization
[Figure: one token server holding separate lookup tables for credit card numbers, social security numbers, and passport numbers]
Token Server – Dynamic & Pre-Generated Models
• The large footprint becomes larger with the addition of more data categories to protect
• This makes tokenizing additional categories of data a major challenge
30. Performance
Traditional Tokenization
• 5 tokens per second (outsourced)
• 5000 tokens per second (in-house)
Protegrity Tokenization
• 200,000 tokens per second (Protegrity)
• Single commodity server with 10 connections.
• Will grow linearly with additional servers and/or connections
• 9,000,000+ tokenizations per second (Protegrity /Teradata)
31. Tokenization Summary
Footprint
• Traditional Tokenization: Large, expanding. The large and expanding footprint is its Achilles’ heel – the source of poor performance, poor scalability, and limitations on expanded use.
• Protegrity Tokenization: Small, static. The small static footprint is the enabling factor that delivers extreme performance, scalability, and expanded use.
High Availability, DR, and Distribution
• Traditional Tokenization: Complex replication required. Deploying more than one token server for high availability or scalability requires complex and expensive replication or synchronization between the servers.
• Protegrity Tokenization: No replication required. Any number of token servers can be deployed without replication or synchronization between them – a simple, elegant, yet powerful solution.
Reliability
• Traditional Tokenization: Prone to collisions. The synchronization and replication required to support many deployed token servers is prone to collisions, a characteristic that severely limits usability.
• Protegrity Tokenization: No collisions. Because no replication or synchronization is needed, the potential for collisions is eliminated.
Performance, Latency, and Scalability
• Traditional Tokenization: Adversely impacts performance and scalability. The large footprint severely limits the ability to place the token server close to the data; the resulting distance creates latency that adversely affects performance and scalability, to the extent that some use cases are not possible.
• Protegrity Tokenization: Little or no latency; the fastest tokenization in the industry. The small footprint enables the token server to be placed close to the data to reduce latency; when placed in-memory, latency is eliminated.
Extendibility
• Traditional Tokenization: Practically impossible. Given all the issues inherent in tokenizing even a single data category, tokenizing more data categories may be impractical.
• Protegrity Tokenization: Unlimited tokenization capability. Many data categories can be tokenized with minimal or no impact on footprint or performance.
32. Tokenization Server Location
[Matrix: tokenization server locations rated best to worst on each criterion]
Location options: Mainframe (DB2 Workload Manager, Separate Address Space) vs. Remote (In-house, Out-sourced)
Operational criteria:
• Availability
• Latency
• Performance
Security criteria:
• Separation
• PCI DSS Scope
33. Data Protection Challenges
Actual protection is not the challenge
Management of solutions
• Key management
• Security policy
• Auditing and reporting
Minimizing impact on business operations
• Transparency
• Performance vs. security
Minimizing the cost implications
Maintaining compliance
Implementation Time
34. Data Protection on z/OS
[Architecture diagram. Elements shown: Applications (API), DB2 (Fieldproc, Editproc, UDF), RACF, ICSF, Data Security Solution, Hardware Security Module, z/OS mainframe, file utility, Central Security Administration, DB2 LUW, Informix, System i]
35. Encryption Options for DB2 on z/OS
[Matrix: encryption interfaces rated best to worst on Performance, PCI DSS, Security, and Transparency]
Encryption interfaces compared:
• API
• UDF (DB2 V7 & V8)
• UDF (DB2 V9)
• Fieldproc
• Editproc
36. Protegrity Data Security Platform
[Platform diagram. Elements shown: Enterprise Security Administrator (policy, usage, audit log, auditing & reporting), Database Protector, File System Protector, Application Protector, Token Server, and secure archive, storage, distribution, and collection]
37. Enterprise Deployment Coverage
Enterprise Security Administrator (ESA)
• Deployed as Soft Appliance
• Hardened, High Availability, Backup & Restore, Scalable
Data Protection System (DPS)
• Data Protectors with Heterogeneous Coverage
• Operating System: z/OS, AIX, HP-UX, Linux, Solaris, Windows
• Database: DB2, SQL Server, Oracle, Teradata, Informix
• Platforms: iSeries, zSeries
Extensible with Data Protectors on Demand
• Database / Operating System Certifications
• Operating System Versions
• API Language Support
38. Xbridge and Protegrity
[Diagram. Elements shown: Mainframe, Databases, Database Files, Applications, Sensitive Data Map, Data Security Policy, Security Officer, External Systems]
39. Protegrity / Xbridge Partnership
Initiated in early 2011
Complementary technologies to provide capability
for identifying, then protecting all sensitive data in
mainframe environments
Can be engaged as separate entities, or as single
customer-facing provider
40. Summary
Mainframe utilization is increasing
External and insider threats rapidly increasing
PCI requirements specifically target all stored data
Compensating controls no longer adequate for mainframe
compliance with PCI DSS V2.0
Access Controls only part of a PCI DSS solution
Identification of ALL stored cardholder data is a critical first
step for a successful PCI compliance initiative
DataSniff is the world’s first and only automated mainframe
data discovery software.
Remediation of all stored cardholder data is of paramount
importance for any complete data protection initiative
Protegrity tokenization is the most effective method available
for remediation of mainframe and other data
41. Questions, Next Steps
For more information contact:
Elaine Evans
Protegrity
203.326.7200
elaine.evans@protegrity.com
www.protegrity.com