Enterprise Data Protection - Understanding Your Options and Strategies
Ulf Mattsson
CTO Protegrity
Ulf.mattsson AT protegrity.com
Ulf Mattsson
• 20 years with IBM Development & Global Services
• Inventor of 22 patents – Encryption and Intrusion Prevention
• Co-founder of Protegrity (Data Security)
• Research member of the International Federation for Information Processing (IFIP) WG 11.3 Data and Application Security
• Member of:
  • PCI Security Standards Council (PCI SSC)
  • American National Standards Institute (ANSI) X9
  • Information Systems Audit and Control Association (ISACA)
  • Cloud Security Alliance (CSA)
  • Information Systems Security Association (ISSA)
ISACA Articles – Data Security
Topics
• Review the changing threat landscape
• Present different options for data security for PCI DSS
• Review a case study
• Show how to protect the entire data flow
• Discuss how to protect against advanced attacks
• Show how to balance performance and security with different approaches to tokenization and encryption
• Review security enforcement at the application level, database level, file level and storage level
The Changing Threat Landscape
• Some issues have stayed constant:
  • The threat landscape continues to gain sophistication
  • Attackers will always be a step ahead of the defenders
  • We're fighting highly organized, well-funded crime syndicates and nations
• A move from detective to preventative controls is needed:
  • Several layers of security to address the more significant areas of risk
Source: http://www.csoonline.com/article/602313/the-changing-threat-landscape?page=2
2010 Data Breach Investigations Report
• Six years, 900+ breaches, and over 900 million compromised records
• Over half of the breaches occurred outside of the U.S.
• Online data is compromised most frequently
• 90% of compromised records were lost in highly sophisticated attacks (see the report's Threat Action Categories)
Source: 2010 Data Breach Investigations Report, Verizon Business RISK team and USSS
Payment Card Industry Data Security Standard (PCI DSS)
• The PCI Security Standards Council is an open global forum
  • American Express, Discover Financial Services, JCB International, MasterCard Worldwide, and Visa Inc.
• The PCI standard consists of a set of 12 rules
• Four ways to render the PAN (credit card number) unreadable:
  • Two-way cryptography with associated key management processes
  • Truncation
  • One-way cryptographic hash functions
  • Index tokens and pads
Source: https://www.pcisecuritystandards.org/organization_info/index.php
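Two of the four approaches above, truncation and a one-way cryptographic hash, can be sketched in a few lines of Python. This is illustrative only; the function names and the salt handling are assumptions, not from the deck, and a real deployment would follow PCI DSS key-management requirements.

```python
import hashlib

def truncate_pan(pan: str) -> str:
    """PCI DSS truncation: keep at most the first 6 and last 4 digits."""
    return pan[:6] + "*" * (len(pan) - 10) + pan[-4:]

def hash_pan(pan: str, salt: bytes) -> str:
    """One-way cryptographic hash of the PAN (cannot be reversed)."""
    return hashlib.sha256(salt + pan.encode()).hexdigest()

pan = "4000001234567899"
print(truncate_pan(pan))        # 400000******7899
print(hash_pan(pan, b"demo"))   # 64 hex characters, irreversible
```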
PCI Encryption Rules
[Diagram: an attacker faces SSL on the public network and encrypted data at rest (PCI DSS) in the application, database, OS file system, and storage system, but clear-text data still flows between those layers inside the private network.]
Not Enough to Encrypt Pipes & Files
Protecting the Data Flow - Example
[Diagram: enforcement points placed along the data flow keep sensitive information protected end to end; sensitive information is unprotected wherever no enforcement point is applied.]
Current, Planned Use of Enabling Technologies
• Strong interest in database encryption, data masking, and tokenization
[Chart: percentage of respondents evaluating, currently using, or planning use within 12 months of access controls, database activity monitoring, database encryption, backup/archive encryption, data masking, application-level encryption, and tokenization.]
Data Security Today is a Catch-22
• We need to protect both data and the business processes that rely on that data
• Enterprises are currently on their own in deciding how to apply emerging technologies for PCI data protection
  • Data tokenization is an evolving technology
  • How to reduce PCI audit scope and exposure to data?
Hiding Data in Plain Sight – Data Tokenization
[Diagram: the data entry application sends the real card number (400000 123456 7899) to the tokenization server, which stores the protected value (Y&SFD%))S() and returns a token (400000 222222 7899); downstream applications and databases hold only the token.]
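The flow above can be sketched as a toy token server. This is a hypothetical illustration, not Protegrity's implementation: the token is a random value with no mathematical relationship to the PAN, so the only way back is a lookup in the server's table. The sketch keeps the first 6 and last 4 digits, matching the example token.

```python
import secrets

class TokenServer:
    """Toy multi-use token server: same PAN always yields the same token."""

    def __init__(self):
        self._pan_to_token = {}
        self._token_to_pan = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._pan_to_token:           # multi-use token
            return self._pan_to_token[pan]
        middle = len(pan) - 10
        while True:
            # Random middle digits: not derived from the input PAN
            token = (pan[:6]
                     + "".join(str(secrets.randbelow(10)) for _ in range(middle))
                     + pan[-4:])
            if token not in self._token_to_pan and token != pan:
                break
        self._pan_to_token[pan] = token
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the token server can map a token back to the PAN
        return self._token_to_pan[token]
```

Because the token is random, an attacker holding only tokens (and not the server's table) learns nothing about the underlying PANs.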
Retail Scenario with Tokenization
[Diagram: token servers sit at the aggregating hub for the store channel and at the central site; integration points include store authorization, loss prevention, settlement, ERP, and analysis in the enterprise data warehouse (EDW).]
Case Study - Large Chain Store Uses Tokenization to Simplify PCI Compliance
• By segmenting cardholder data with tokenization, a regional chain of 1,500 local convenience stores is reducing its PCI audit from seven to three months
• "We planned on 30 days to tokenize our 30 million card numbers. With Protegrity Tokenization, the whole process took about 90 minutes"
• Qualified Security Assessors had no issues with the effective segmentation provided by tokenization
  • "With encryption, implementations can spawn dozens of questions"
  • "There were no such challenges with tokenization"
• Faster PCI audit – cut from seven to three months
• Lower maintenance cost – don't have to apply all 12 requirements of PCI DSS to every system
• Better security – able to eliminate several business processes such as generating daily reports for data requests and access
• Strong performance – rapid processing rate for initial tokenization, sub-second transaction SLA
Field Encryption & Tokenization – Data Formats
Data formats, by increasing intrusiveness to applications and databases:
• Clear text data: 123456 777777 1234 (original length)
• Tokenizing or formatted encryption:
  • Partial: 123456 123456 1234 (original length)
  • Numeric: 666666 777777 8888 (original length)
  • Alpha: aVdSaH 1F4hJ 1D3a (original length)
• Standard encryption: !@#$%a^.,mhu7/////&*B()_+!@ (longer)
• Strong encryption / hashing: !@#$%a^///&*B()..,,,gft_+!@4#$2%p^&* (longer)
Risk Management and PCI – Security Aspects
• Different data security methods and algorithms: hashing, formatted encryption, strong encryption, and data tokenization
• Policy enforcement implemented at different system layers: application, database column, database file, storage device
[Two matrices: each data security method rated from best to worst (some cells N/A), first for security and then for integration, at each system layer.]
A Distributed Tokenization Approach
• Large companies may need to utilize tokenization services for locations throughout the world
• How do you deliver tokenization to many locations without the impact of latency?
[Diagram: customer applications at multiple sites each use a local token server instead of routing every request to a single central token server.]
Distributed Approach to Generate Random Tokens
• Multi-use tokens
• Random static lookup tables (1,000,000 max entries per table)
  • Remain the same size no matter the number of unique tokens
  • Example: 50 million card numbers = 2 million tokens
  • Performance: 200,000 tokens per second on a commodity standard dual-core machine
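One way to read the static-lookup-table idea is as a fixed random table generated once and distributed to each location, so tokens can be derived locally with no round trip to a central server and the table never grows with the number of PANs tokenized. The sketch below is an assumed, simplified design, not the vendor's actual scheme; a production system would chain multiple tables and protect the table material itself.

```python
import random

# One static table covering the 6-digit space (1,000,000 max entries),
# generated once and shipped to every location. The fixed seed here is
# only so the demo is reproducible; a real table would be generated
# from a secure random source and kept secret.
rng = random.Random(42)
TABLE = list(range(1_000_000))
rng.shuffle(TABLE)

def tokenize_static(pan: str) -> str:
    """Map the middle digits through the static table, keeping the
    first 6 and last 4 digits in the clear."""
    middle = pan[6:-4]
    out = []
    for i in range(0, len(middle), 6):
        group = middle[i:i + 6]
        mapped = TABLE[int(group)]          # pure lookup, no math on input
        out.append(str(mapped).zfill(6)[:len(group)])
    return pan[:6] + "".join(out) + pan[-4:]
```

Every location holding a copy of the table produces the same token for the same PAN, which is what makes the approach both distributed and multi-use.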
Evaluating Encryption & Tokenization Approaches
[Matrix: evaluation criteria – scalability, availability, latency, CPU consumption, security, data flow protection, compliance scoping, key management, randomness, separation of duties – rated from best to worst for database file encryption, database column encryption, centralized tokenization (old), and distributed tokenization (new).]
Evaluating Field Encryption & Distributed Tokenization
[Matrix: evaluation criteria – disconnected environments, distributed environments, performance impact when loading data, transparency to applications, expanded storage size, transparency to database schema, long life-cycle data, Unix or Windows mixed with "big iron" (EBCDIC), easy re-keying of data in a data flow, high-risk data, security (compliance to PCI, NIST) – rated from best to worst for strong field encryption, formatted encryption, and distributed tokenization.]
Best Practices for Tokenization
Published July 14, 2010.
[Table: token generation methods – reversible algorithm and key (known strong algorithm), unique sequence number, one-way irreversible function (number, hash), and randomly generated value – mapped against token types: single-use tokens (secret per transaction) and multi-use tokens (secret per merchant).]
Comments on Visa's Tokenization Best Practices
• Visa's recommendation should simply be to use a random number
• If the output is not generated by a mathematical function applied to the input, it cannot be reversed to regenerate the original PAN data
• The only way to discover PAN data from a real token is a (reverse) lookup in the token server database
• The odds are that if you are saddled with PCI DSS responsibilities, you will not write your own 'home-grown' token servers
What Makes a "Secure Tokenization" Algorithm?
• Ask vendors what their token-generating algorithms are
• Be sure to analyze anything other than strong random number generators for security
Strong Cryptography - PCI DSS Glossary
• Cryptography based on industry-tested and accepted algorithms, along with strong key lengths and proper key-management practices
• See NIST (National Institute of Standards and Technology, US) Special Publications
NIST Proposed Encryption Modes
Appearance of a mode in this list does not constitute endorsement or approval by NIST:
1. FCEM - Format Controlling Encryption Mode (U. Mattsson)
2. FFX - Format-preserving Feistel-based Encryption Mode (M. Bellare, P. Rogaway, T. Spies)
3. …
http://csrc.nist.gov/groups/ST/toolkit/BCM/modes_development.html
Data Protection Challenges
• Actual protection is not the challenge
• Management of solutions
  • Key management
  • Security policy
  • Auditing, monitoring and reporting
• Minimizing impact on business operations
  • Transparency
  • Performance vs. security
• Minimizing the cost implications
  • Maintaining compliance
  • Implementation time
Best Practices - Data Security Management
[Diagram: an Enterprise Data Security Administrator centrally manages policy, audit log, and secure archive; enforcement points include the Database Protector, File System Protector, Application Protector, and Tokenization Server.]
Privacy - More Lax in the US than in the E.U.

European Union:
• The E.U. Data Privacy Directive 95/46/EC governs the protection and movement of personally identifiable information between E.U. member countries and to the outside
• Firms are responsible for protecting PII data and also for managing its transfer to others by monitoring the compliance of recipients
• Medical records are no different from other E.U. citizens' personal information because a degree of data protection is already afforded

United States:
• Rules are primarily state-by-state
• Once the data has been yielded to a company, the company is largely free to use it as it wishes, subject to local state regulations
• Concern over medical records privacy may increase with the push to reduce health care costs through greater automation
Questions?
Click on the questions tab on your screen, type in your question, name and e-mail address; then hit submit.
In the Case Study, Tokenization was yielding some benefits for the retailer. Please select ALL relevant options from below:
• Faster PCI audit
• Effective segmentation of cardholder data environments
• Lower maintenance cost
• Better security
• Strong performance
ALL is the correct answer
What Makes a "Secure Tokenization" Algorithm according to Gartner research? Please select ONE option from below:
• Hashing algorithms
• Encryption algorithms
• Random values
• Homegrown algorithms
"Random values" is the correct answer
The PCI standard consists of how many rules? Please select ONE option from below:
• 6
• 8
• 12
• 16
12 is the correct answer
The PCI standard allows how many different ways to render the PAN (Credit Card Number) unreadable? Please select ONE option from below:
• 2
• 3
• 4
• 5
• 6
4 is the correct answer