1. Risk Management Practices
for PCI DSS 2.0
Ulf Mattsson
CTO Protegrity
ulf.mattsson [at] protegrity.com
© 2011 ISACA. All rights reserved.
2. Ulf Mattsson
• 20 years with IBM Development & Global Services
• Inventor of 22 patents – Encryption and Intrusion Prevention
• Co-founder of Protegrity (Data Security)
• Research member of the International Federation for
Information Processing (IFIP) WG 11.3 Data and Application
Security
• Member of
– PCI Security Standards Council (PCI SSC)
– American National Standards Institute (ANSI) X9
– Information Systems Audit and Control Association (ISACA)
– Institute of Internal Auditors (IIA)
– Information Systems Security Association (ISSA)
– Cloud Security Alliance (CSA)
3. ISACA Articles
Volume 6, November, 2011
Choosing the Most Appropriate Data
Security Solution for an Organization
Ulf Mattsson, CTO, Protegrity
5. Agenda
1. Present trends in data breaches
2. Review data security methods available today
3. Discuss emerging technologies for protecting data
4. Present what tokenization is and how it differs from
other data security methods
5. Review case studies on how new technologies can
minimize costs and complexities
6. Summary and conclusion
7. The Attacks on Sony Corporation
• Majority of attacks were SQL injection - not advanced
• Data, including passwords and personal details, was not encrypted
• Data stolen:
– 77 million names, addresses, passwords
– 102 million email addresses, birth dates
– 25 million phone numbers and payment cards
– 54 MB of Sony software code
• 21 uncoordinated attacks between Apr 4 and Jul 6, 2011
• 55 class action suits in the US (as of 8/16/2011)
• Sony’s stock dropped 30 percent after the attacks were disclosed
Source: http://defensesystems.com/articles/2011/07/18/digital-conflict-cyberattacks-economy-security.aspx
8. The Attack on RSA Security
• Attackers stole information relating to the
SecurID two-factor authentication technology
• APT malware tied to a network in Shanghai
• 60 different types of customized malware were used to
launch the attacks
• Based on a commonly used tool written by a
Chinese hacker 10 years ago
9. The AT&T Breach
• Security vulnerability in a Web site used by iPad customers
• A script tricked the site into divulging e-mail addresses
• 100,000 e-mail addresses and iPad identification numbers were exposed
• Some examples
• New York Mayor Michael Bloomberg,
• FBI,
• US Departments of Defense and Justice,
• NASA,
• Executives from Google, Microsoft,
• Amazon, Goldman Sachs, and JP Morgan
Source 2010: http://news.cnet.com/8301-27080_3-20007417-245.html#ixzz1Y9IW9a7o
10. Best Source of Incident Data*
“It is fascinating that the top threat events
in both 2010 and 2011 are the same
and involve external agents hacking and installing malware
to compromise the confidentiality and integrity of servers.”
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team
*: Securosis
11. Who is Behind Data Breaches*?
External agents (+22%)
Insiders (-31%)
Multiple parties (-18%)
Business partners (-10%)
*: Number of breaches
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
12. Demographics
• Criminals may be making a classic risk vs.
reward decision and opting to “play it safe” in
light of recent arrests and prosecutions following
large-scale intrusions into Financial Services
firms.
• Numerous smaller strikes on
hotels, restaurants, and retailers represent a
lower-risk alternative
– Cybercriminals may be taking greater advantage of
that option
13. Compromised Data Types*
Payment card data
Personal information
Usernames, passwords
Intellectual property
Bank account data
Medical records
Classified information
System information
Sensitive organizational data
*: Number of records
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
14. Industry Groups Represented*
Hospitality
Retail
Financial Services
Government
Tech Services
Manufacturing
Transportation
Media
Healthcare
Business Services
*: Number of breaches
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
15. Breach Discovery Methods*
Third party fraud detection
Notified by law enforcement
Reported by customer/partner affected
Unusual system behavior
Reported by employee
Internal security audit or scan
Internal fraud detection
Brag or blackmail by perpetrator
Third party monitoring service
*: Number of breaches
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
16. Trend in Number of Breaches*
Hacking (+10%)
Malware (+11%)
Physical (+14%)
Misuse (-31%)
Social (-17%)
*: Number of breaches
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
17. How are Records Breached*
Hacking
Malware
Physical
Error
Misuse
Social
*: Number of records
Source: 2011 Data Breach Investigations Report, Verizon Business RISK team and USSS
18. Most Important Trends in DBIR
• The industrialization of attacks:
– There is an industry encompassing many actors, from coders to attackers to
money launderers
– They use automated tools and manage themselves and their operations as
businesses – including use of independent contractors
• All forms of attack are up by all threat actors:
– APT and IP loss serious issues
• Law enforcement really does catch some of the bad
guys:
– 1,200 arrests over the past several years by the Secret Service alone.
– Many of these bad guys/gals attack small businesses
Source: Securosis comments on 2011 Data Breach Investigations Report, Verizon Business RISK team
19. Data Security - Catch-22
• We need to protect both data and the business processes
that rely on that data
• Enterprises are currently on their own in deciding how to
apply emerging technologies for PCI data protection
– Data Tokenization - an evolving technology
– How to reduce PCI audit scope and exposure to data
21. Protecting the Data Flow - Example
[Diagram: an example data flow with enforcement points, distinguishing unprotected from protected sensitive information.]
22. PCI DSS
23. PCI Encryption Rules
[Diagram: an attacker on the public network sees only SSL-encrypted data (PCI DSS). Inside the private network, data passes in clear text between the application and the database. Data at rest in the OS file system and storage system is encrypted (PCI DSS).]
24. PCI DSS
Payment Card Industry Data Security Standard (PCI DSS)
The PCI Security Standards Council is an open global forum
American Express, Discover Financial Services, JCB
International, MasterCard Worldwide, and Visa Inc
The PCI standard consists of a set of 12 requirements
Four ways to render the PAN (credit card number) unreadable
Two-way cryptography with associated key management processes
Truncation
One-way cryptographic hash functions
Index tokens and pads
Source: https://www.pcisecuritystandards.org/organization_info/index.php
25. PCI DSS #3 & 4
Protect Cardholder Data
• 3.4 Render PAN, at minimum, unreadable anywhere it is
stored by using any of the following approaches:
• One-way hashes based on strong cryptography
• Truncation
• Index tokens and pads (pads must be securely stored)
• Strong cryptography with associated key-management processes and procedures
• 4.1 Use strong cryptography to safeguard sensitive
cardholder data during transmission over open, public
networks.
• Comments – Cost effective compliance
• Encrypted PAN is always “in PCI scope”
• Tokens can be “out of PCI scope”
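To make requirement 3.4 concrete, here is a minimal Python sketch of three of the approved approaches: truncation, a keyed one-way hash, and an index token. The sample PAN, the key handling, and the token format are illustrative assumptions, not a production design.

```python
import hashlib
import hmac
import secrets

PAN = "4000001234567899"  # example card number, not a real PAN

# Truncation: keep at most the first six (BIN) and last four digits
truncated = PAN[:6] + "******" + PAN[-4:]

# One-way hash based on strong cryptography: an HMAC with a secret key,
# so the small PAN keyspace cannot simply be brute-forced
secret_key = secrets.token_bytes(32)  # must itself be stored and managed securely
digest = hmac.new(secret_key, PAN.encode(), hashlib.sha256).hexdigest()

# Index token: a random surrogate with no mathematical link to the PAN;
# the real PAN would live only in a hardened token vault
token = "".join(secrets.choice("0123456789") for _ in range(16))

print(truncated)  # 400000******7899
```

Only the token variant can fall "out of PCI scope" in the sense the slide describes, since it has no computable relationship to the PAN.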
26. PCI DSS - Appendix B:
• Compensating controls must satisfy the following criteria:
– Meet the intent and rigor of the original PCI DSS requirement.
– Provide a similar level of defense as the original PCI DSS requirement, such that the
compensating control sufficiently offsets the risk that the original PCI DSS
requirement was designed to defend against.
– Be “above and beyond” other PCI DSS requirements. (Simply being in compliance
with other PCI DSS requirements is not a compensating control.)
28. PCI DSS - Network Segmentation
• Network segmentation of, or isolating (segmenting), the
cardholder data environment from the remainder of an
entity’s network is not a PCI DSS requirement.
• However, it is strongly recommended as a method that
may reduce:
– The scope of the PCI DSS assessment
– The cost of the PCI DSS assessment
– The cost and difficulty of implementing and maintaining PCI DSS
controls
– The risk to an organization (reduced by consolidating cardholder
data into fewer, more controlled locations)
29. PCI Mandates Risk Assessment
• PCI 2.0 Requirement 12.1.2 mandates the use of formal risk
assessment based on established methodologies
• 12.1.2 Includes an annual process that identifies threats and
vulnerabilities, and results in a formal risk assessment
– 12.1.2.a Verify that an annual risk assessment process is documented
that identifies threats, vulnerabilities, and results in a formal risk
assessment
– 12.1.2.b Review risk assessment documentation to verify that the risk
assessment process is performed at least annually
• Examples of risk assessment methodologies include but are not
limited to OCTAVE, ISO 27005 and NIST SP 800-30
Source: https://www.pcisecuritystandards.org/organization_info/index.php
31. Advancements in Cryptography
Year – Event
2010 – In-memory Data Tokenization introduced as a fully distributed model; Centralized Data Tokenization introduced with hosted payment service
2005 – DTP (Data Type Preserving encryption) used in commercial databases; attack on SHA-1 announced; DES was withdrawn
2001 – AES (Advanced Encryption Standard) accepted as a FIPS-approved algorithm
1991 – PGP (Pretty Good Privacy) was written
1988 – IBM AS/400 used tokenization in shadow files
1978 – RSA algorithm was published
1975 – DES (Data Encryption Standard) draft submitted by IBM
1900 BC – Cryptography used in Egypt
32. Intrusiveness of Data Formats
Intrusiveness (to applications and databases), from most to least intrusive:
• Standard encryption and hashing: binary output such as !@#$%a^///&*B()..,,,gft_+!@4#$2%p^& or !@#$%a^.,mhu7/////&*B()_+!@ (longer than the original data)
• Strong encryption
• Alpha encoding: aVdSaH 1F4hJ 1D3a
• Tokenizing or formatted encryption, numeric: 666666 777777 8888
• Partial: 123456 777777 1234
• Clear text data: 123456 123456 1234 (original data length)
33. Encryption vs. Tokenization
Encryption uses a cipher system: cryptographic algorithms and cryptographic keys.
Tokenization uses a code system: code books and index tokens.
Source: McGraw-Hill Encyclopedia of Science & Technology
34. Use of Enabling Technologies
Percentages show Evaluating / Current Use / Planned Use (<12 months):
• Access controls: 1% / 91% / 5%
• Database activity monitoring: 18% / 47% / 16%
• Database encryption: 30% / 35% / 10%
• Backup / archive encryption: 21% / 39% / 4%
• Data masking: 28% / 28% / 7%
• Application-level encryption: 7% / 29% / 7%
• Tokenization: 22% / 23% / 13%
35. Use of Enabling Technologies, by Maturity
36. Hiding Data in Plain Sight
[Diagram: at data entry, the card number 400000 123456 7899 is captured (transmitted as encrypted data, e.g. Y&SFD%))S() and sent to a tokenization server, which returns the data token 400000 222222 7899; the application and database store only the token.]
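The tokenization server above can be sketched as a small in-memory vault. This is an illustrative toy, not Protegrity's implementation: it assumes a multi-use, format-preserving token that keeps the first six and last four digits, as in the slide's example.

```python
import secrets

class TokenVault:
    """Toy tokenization server: the token preserves the card's first six
    (BIN) and last four digits; the random middle digits are linked to the
    real PAN only through this vault's lookup tables."""

    def __init__(self):
        self.pan_to_token = {}
        self.token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        if pan in self.pan_to_token:      # multi-use token: stable per PAN
            return self.pan_to_token[pan]
        while True:
            middle = "".join(secrets.choice("0123456789") for _ in range(6))
            token = pan[:6] + middle + pan[-4:]
            # retry on the rare collision, and never hand back the PAN itself
            if token not in self.token_to_pan and token != pan:
                break
        self.pan_to_token[pan] = token
        self.token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self.token_to_pan[token]   # only the vault can reverse a token

vault = TokenVault()
t = vault.tokenize("4000001234567899")
print(t[:6], t[-4:])  # 400000 7899 - BIN and last four survive tokenization
```

Because the middle digits are drawn at random rather than computed from the PAN, nothing outside the vault can reverse a token.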
37. Reducing the Attack Surface
[Diagram: across applications and databases, most systems hold only data tokens (e.g. 123456 999999 1234); unprotected sensitive information (123456 123456 1234) remains in only a few places, reducing the attack surface.]
38. Fully Distributed Tokenization
• How do you deliver tokenization to many locations without the impact of latency?
[Diagram: instead of every customer application calling one central token server, token servers are distributed alongside the customer applications.]
39. A Fully Distributed Approach
• Multi-Use Tokens
• Random static lookup tables (1,000,000 max entries each)
– Remain the same size no matter the number of unique tokens (example: 50 million = 2 million)
– Performance: 200,000 tokens per second on a commodity standard dual-core machine
[Diagram: applications draw tokens such as 288910 directly from the in-memory tables.]
40. Evaluating Encryption & Tokenization
Compares four approaches – database file encryption, database column encryption, the old token approach, and the memory token approach – on these criteria:
• Availability
• Scalability / latency
• CPU consumption
• Data flow protection
• Compliance scoping
• Security
• Key management
• Separation of duties
(The original chart rated each combination from best to worst.)
41. Evaluating Encryption & Tokenization
Compares strong field encryption, formatted encryption, and the memory token approach on these criteria:
• Disconnected environments
• Distributed environments
• Performance impact when loading data
• Transparent to applications
• Expanded storage size
• Transparent to database schema
• Long life-cycle data
• Unix or Windows mixed with “big iron”
• Easy re-keying of data in a data flow
• High risk data
• Security – compliance with PCI, NIST
(The original chart rated each combination from best to worst.)
42. Best Practices for Tokenization
Token generation methods and token types (published July 14, 2010):
• Reversible – algorithm and key (known strong algorithm); unique sequence number
• Irreversible – one-way hash function (single-use token: secret per transaction; multi-use token: secret per merchant); randomly generated value
43. Comments on Best Practices
• Visa’s recommendation should simply be to use a random number
• The only way to recover PAN data from a real token is a
(reverse) lookup in the token server database
• The odds are that if you are saddled with PCI DSS
responsibilities, you will not write your own “home-grown”
token server
44. Secure Tokenization
What Makes a “Secure Tokenization” Algorithm?
• Ask vendors what their token-generating algorithms are
• Carefully analyze the security of anything other than
strong random number generators.
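A quick way to see why random tokens beat hash-derived ones: a plain, unsalted hash of a PAN can be brute-forced, because the first six digits (BIN) and last four are often known, leaving only a million candidates for the middle. The sketch below is illustrative, using an assumed example PAN.

```python
import hashlib

# An unsalted hash of a PAN is weak: with the BIN and last four known,
# only the middle six digits are actually secret.
pan = "4000001234567899"
leaked_hash = hashlib.sha256(pan.encode()).hexdigest()

def crack(bin6, last4, target):
    # At most 10**6 candidates: trivial to exhaust on any laptop.
    for middle in range(10**6):
        candidate = f"{bin6}{middle:06d}{last4}"
        if hashlib.sha256(candidate.encode()).hexdigest() == target:
            return candidate
    return None

recovered = crack("400000", "7899", leaked_hash)
print(recovered)  # 4000001234567899 - the hash alone did not protect the PAN
```

A randomly generated token has no such structure to exploit; recovering the PAN requires access to the token server itself.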
45. A Retail Scenario
[Diagram: stores send authorization traffic through token servers and an aggregating hub for the store channel; integration points connect to settlement, loss prevention, analysis/EDW, and ERP systems.]
46. Case Study – Simplify Compliance
Case Study - Large Chain Store Uses
Tokenization to Simplify PCI Compliance
• By segmenting cardholder data with tokenization, a
regional chain of 1,500 local convenience stores is
reducing its PCI audit from seven to three months
• “We planned on 30 days to tokenize our 30 million card
numbers. With Protegrity Tokenization, the whole
process took about 90 minutes.”
47. Case Study – Simplify Compliance
• Qualified Security Assessors had no issues with the
effective segmentation provided by Tokenization
• “With encryption, implementations can spawn dozens of
questions”
• “There were no such challenges with tokenization”
48. Case Study – Simplify Compliance
• Faster PCI audit – half that time
• Lower maintenance cost – don’t have to apply all 12
requirements of PCI DSS to every system
• Better security – able to eliminate several business
processes such as generating daily reports for data
requests and access
• Strong performance – rapid processing rate for initial
tokenization, sub-second transaction SLA
49. Why Tokenization?
Why Tokenization – A Triple Play
1. No Masking
2. No Encryption
3. No Key Management
Why In-memory Tokenization
1. Better
2. Faster
3. Lower Cost / TCO
50. Risk Management
51. Data Security - Catch-22
[Chart: risk vs. access level. The old, flawed model - traditional access control with minimal access levels so people can only carry out their jobs - keeps access low to keep risk low. The new view is that creativity happens at the edge; data tokens keep risk low even at high access levels.]
Source: InformationWeek, Aug 15, 2011
52. Data Masking vs. Tokenization - Risk
[Chart: risk by system type - test/dev, integration testing, trouble testing, production. Masking carries two exposures: on display, data is only obfuscated, and at rest, data is in the clear before masking. Data tokens keep risk low across all system types.]
53. Choose Your Defenses
[Chart: cost vs. risk level. The cost of protection falls as the accepted risk level rises (from active to passive protection), while the expected losses from the risk rise. Total cost is their sum, and the optimal risk level sits at its minimum.]
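The trade-off in the chart can be made concrete with a toy model: pick any falling protection-cost curve and rising expected-loss curve, and the optimal risk level is wherever their sum bottoms out. All numbers below are invented for illustration.

```python
# Toy model of the slide's cost curves (all figures are illustrative):
def protection_cost(risk):
    # active protection is expensive at low accepted-risk levels
    return 100.0 / risk

def expected_loss(risk):
    # expected losses grow with the accepted risk level
    return 4.0 * risk

risk_levels = [r / 10 for r in range(1, 101)]  # 0.1 .. 10.0
total = {r: protection_cost(r) + expected_loss(r) for r in risk_levels}
optimal = min(total, key=total.get)
print(optimal, total[optimal])  # 5.0 40.0 - where the two curves balance
```

This is the "risk management as cost-benefit tradeoff" point from the speaker notes: spending past the optimum buys less loss reduction than it costs.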
54. Cost Effective PCI DSS
Percentage of respondents deploying each technology:
• Firewalls
• Encryption/Tokenization for data at rest
• Anti-virus & anti-malware solution
• Encryption for data in motion
• Access governance systems
• Identity & access management systems
• Correlation or event management systems
• Web application firewalls (WAF)
• Endpoint encryption solution
• Data loss prevention systems (DLP)
• Intrusion detection or prevention systems
• Database scanning and monitoring (DAM)
• ID & credentialing system
Source: 2009 PCI DSS Compliance Survey, Ponemon Institute
55. Matching Data Protection with Risk Level
Field-level data risk scores:
• Credit Card Number: 25
• Social Security Number: 20
• CVV: 20
• Customer Name: 12
• Secret Formula: 10
• Employee Name: 9
• Employee Health Record: 6
• Zip Code: 3
Risk levels and matching solutions:
• Low Risk (1-5): monitor
• At Risk (6-15): monitor, mask, access control limits, format control encryption
• High Risk (16-25): replacement, strong encryption
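The table's mapping from risk score to protection method can be sketched directly; the field names, scores, and thresholds come from the slide, while the function name is a placeholder.

```python
# Map a field-level risk score to the slide's recommended protection.
def protection_for(score):
    if score <= 5:
        return "monitor"
    if score <= 15:
        return "monitor, mask, access control limits, format control encryption"
    return "replacement, strong encryption"

field_risk = {
    "Credit Card Number": 25,
    "Social Security Number": 20,
    "CVV": 20,
    "Customer Name": 12,
    "Secret Formula": 10,
    "Employee Name": 9,
    "Employee Health Record": 6,
    "Zip Code": 3,
}

for field, score in field_risk.items():
    print(f"{field} ({score}): {protection_for(score)}")
```

Matching the strength (and cost) of the control to the score is the point of the slide: not every field earns strong encryption.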
56. Speed of Different Methods
[Chart: transactions per second for 16-digit values, on a log scale from 100 to 10,000,000, comparing traditional data tokenization, format preserving encryption, data type preservation encryption, AES CBC (Encryption Standard), and memory data tokenization. Speed will depend on the configuration.]
57. Security Level of Different Methods
[Chart: relative security level, from low to high, for the same five methods: traditional data tokenization, format preserving encryption, data type preservation encryption, AES CBC (Encryption Standard), and memory data tokenization.]
58. Speed and Security
[Chart: overlays the speed and security curves from the two previous slides; memory data tokenization rates high on both speed and security. Speed will depend on the configuration.]
59. Applications and Data - PCI vs. PII
[Chart: quality of information (granularity) and performance need vs. number of applications, contrasting PII (e.g. 123-12-1234, 777-77-1234, 777-77-8888) with PCI data (e.g. 400000 123456 7899, 400000 222222 7899, 111111 222222 7899).]
60. Risk Management Aspects
• Different data security methods and algorithms
• Enforcement implemented at different system layers
[Matrix: data security methods (hashing, formatted encryption, strong encryption, data tokenization) against system layers (application, database column, database file, storage device), rated from best to worst.]
61. Risk Management Aspects
• Different methods integrated at different system layers
[Matrix: the same methods (hashing, formatted encryption, strong encryption, data tokenization) against system layers (application, database column, database file, storage device), rated from best to worst; some combinations are not applicable.]
63. Risks with Cloud Computing
• Handing over sensitive data to a third party
• Threat of data breach or loss
• Weakening of corporate network security
• Uptime/business continuity
• Financial strength of the cloud computing provider
• Inability to customize applications
Source: The evolving role of IT managers and CIOs – findings from the 2010 IBM Global IT Risk Study
64. What Amazon’s PCI Compliance Means
• PCI-DSS 2.0 doesn't address multi-tenancy concerns
• You can store PAN data on S3, but it still needs to be encrypted in accordance with
PCI-DSS requirements
• Amazon doesn't do this for you – it's something you need to implement
yourself, including key management, rotation, logging, etc.
• If you deploy a server instance in EC2 it still needs to be assessed by your
QSA
• What this certification really does is eliminate any doubts that you are allowed to
deploy an in-scope PCI system on AWS
• This is a big deal, but your organization's assessment scope isn't necessarily
reduced
• It might be when you move to something like a tokenization service where you
reduce your handling of PAN data
Source: securosis.com
65. Summary
66. Data Protection Challenges
Actual protection is not the challenge
Management of solutions
Key management
Security policy
Auditing, Monitoring and reporting
Minimizing impact on business operations
Transparency
Performance vs. security
Minimizing the cost implications
Maintaining compliance
Implementation Time
67. Best Practices - Data Security
[Diagram: an enterprise data security administrator centrally manages policy and the audit log; enforcement points include a file system protector, a database protector, and an application protector, backed by a tokenization server and a secure archive.]
68. Risk Management Practices
for PCI DSS 2.0
Ulf Mattsson
CTO Protegrity
ulf.mattsson [at] protegrity.com
Speaker notes

HTran, a “rudimentary” bouncer tool written by a well-known Chinese hacker 10 years ago, was being used by various attackers to redirect traffic from infected computers to command-and-control servers. A piece of code used for debugging purposes in HTran would return an error message to the infected computer if the C&C server was unavailable, Stewart said, and that error message revealed the final IP address of the server. The redirect tool routes traffic through several proxy servers to make it look like it is going through servers in the United States, Norway, Japan and Taiwan, in order to obscure where the attack is originating. The botnet owners and attackers “didn’t realize fully how HTran works” and were clearly unaware of the debugger or the fact that the error message was being displayed. Dell researchers uncovered a few networks, all of which had China-based addresses. The team scanned a list of 1,000 IP addresses that had previously been identified as being used in an advanced persistent threat attack and uncovered a “short list” of Chinese networks hosting the C&C servers. While it was not 100 percent certain that the 18 servers it uncovered are the final destination, the fact that so many campaigns traced back to a handful of IP addresses seems promising, Stewart said. The addresses appear to belong to ISPs in Beijing and Shanghai, such as state-owned telecommunications giant China Unicom, but the carriers are big enough that it would be difficult to identify the individuals without assistance from the Chinese government, according to the Dell SecureWorks report. His research answered the question of “where” some of the advanced persistent threats originated, but not “who” the perpetrators were. However, organizations now have a signature that can be used to identify some of the APT activity in their networks. Not all attackers use this tool, but by looking for the error messages and using the Snort-based signatures the team developed to detect this particular Trojan, an IT department would at least be able to stop this particular APT. Stewart acknowledged that hackers using HTran would likely abandon the tool or fix the bug now that Dell SecureWorks has publicized the issue. “It is our hope that every institution potentially impacted by APT activity will make haste to search out signs of this activity for themselves before the window of opportunity closes,” Stewart wrote in the report.

“Risk management” is just another term for the cost-benefit tradeoff associated with any security decision. Protecting data according to risk enables organizations to determine their most significant security exposures, target their budgets toward addressing the most critical issues, strengthen their security and compliance profile, and achieve the right balance between business needs and security demands. A report by the Ponemon Institute, a privacy and information management research firm, found that data breach incidents cost $202 per compromised record in 2008, with an average total per-incident cost of $6.65 million. Security spend figures produced by government and private research firms indicate that enterprises can put strong security into place for about 10% of the average cost of a breach. You can find the right balance between cost and security by doing a risk analysis.

Just because AWS is certified doesn't mean you are. You still need to deploy a PCI-compliant application or service, and anything on AWS is still within your assessment scope. The open question: PCI-DSS 2.0 doesn't address multi-tenancy concerns, and AWS being certified as a service provider doesn't mean all cloud IaaS providers will be.