What You Should Take Away from the PCI DSS 3.0 – Part 2

By Jeff Hall

In Part 1, I discussed a number of issues that I thought were very important for organizations to understand and begin dealing with before moving to v3 of the PCI DSS. Here is the conclusion, covering the rest of what I believe organizations need to get a handle on for PCI DSS v3 compliance.

Penetration Testing

One of those “best practices” that organizations will need to take some time to prepare for is the set of changes to requirement 11.3. These changes are the result of the frustration all parties have had in understanding what constitutes an appropriate penetration test. The number of issues that 11.3 raises makes it difficult to know where to start, but I will try.

The first bullet is probably the one that will give people the most trouble. It states:

  • “Is based on industry-accepted penetration testing approaches (for example, NIST SP800-115)”

The issue here will come down to “industry-accepted.” The NIST SP800-115 document is given as an example, which means it will become the de facto standard that most people gravitate toward, just as the OWASP Top 10 has for the requirements in 6.5. There is also the Open Source Security Testing Methodology Manual (OSSTMM), which offers another acceptable penetration testing methodology.

The bottom line is that the methodology needs to be recognized as an industry-accepted standard, and there are very few penetration testing methodologies around that meet that bar. As a result, you will not have a lot of choices.

Once you have selected your preferred methodology, you must be able to show that you have implemented it. This is where I expect things to go from bad to worse for even the largest organizations. It is very rare for an organization to have the skill set in house to do its own penetration testing. In my experience, most organizations outsource penetration testing because of the difficulty of keeping the right people employed. I expect that this requirement alone will drive all but the most stubborn organizations to outsource their penetration testing.

But if the methodology is not enough, the next two bullets specified in the requirement will push you over the edge.

  • “Includes testing to validate any segmentation and scope-reduction controls”
  • “Includes review and consideration of threats and vulnerabilities experienced in the last 12 months”

Validation of network segmentation is not always as simple as one might think and could, in some instances, require a true penetration testing “artist” to confirm. Those “artists” are rarely found anywhere but at a third party, which pushes you even further toward outsourcing.
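
To illustrate why even the simplest part of this is non-trivial, here is a minimal Python sketch (with invented addresses and ports) of the kind of first-pass reachability check a tester might run from a supposedly out-of-scope segment. A genuine segmentation test goes far beyond simple connection attempts, but the sketch shows the basic question being asked: can this segment reach the cardholder data environment at all?

# Minimal sketch of a first-pass segmentation check: from a host on an
# out-of-scope network segment, confirm that in-scope hosts/ports are
# unreachable. A real penetration test goes far beyond this (VLAN hopping,
# firewall rule analysis, pivoting, etc.). Hosts and ports are hypothetical.
import socket

IN_SCOPE_TARGETS = [
    ("10.10.1.20", 443),   # hypothetical CDE web server
    ("10.10.1.30", 1433),  # hypothetical CDE database
]

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_segmentation() -> None:
    for host, port in IN_SCOPE_TARGETS:
        if is_reachable(host, port):
            print(f"FAIL: {host}:{port} is reachable from this segment")
        else:
            print(f"PASS: {host}:{port} appears blocked from this segment")

if __name__ == "__main__":
    check_segmentation()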

But the final “nail in the coffin” is the review and consideration of threats and vulnerabilities experienced over the last 12 months. First, this is going to require you to document and track every threat and vulnerability to your environment over a rolling 12-month window. While that might sound straightforward, it is not as easy as it appears.

You are not being asked to inventory every patch issued by Microsoft, Red Hat, Cisco or any other vendor you use in your environment. Rather, this requirement is going to force your organization to finally and truly implement requirement 6.2, now 6.1 in v3 of the PCI DSS.

As a reminder, requirement 6.1 states:

  • “Establish a process to identify security vulnerabilities, using reputable outside sources for security vulnerability information, and assign a risk ranking (for example, as “high,” “medium,” or “low”) to newly discovered security vulnerabilities.
  • Note: Risk rankings should be based on industry best practices as well as consideration of potential impact. For example, criteria for ranking vulnerabilities may include consideration of the CVSS base score, and/or the classification by the vendor, and/or type of systems affected. Methods for evaluating vulnerabilities and assigning risk ratings will vary based on an organization’s environment and risk assessment strategy. Risk rankings should, at a minimum, identify all vulnerabilities considered to be a “high risk” to the environment. In addition to the risk ranking, vulnerabilities may be considered “critical” if they pose an imminent threat to the environment, impact critical systems, and/or would result in a potential compromise if not addressed. Examples of critical systems may include security systems, public-facing devices and systems, databases and other systems that store, process, or transmit cardholder data.”

Essentially, you will have to develop a true vulnerability management program. Again, there are plenty of third parties out there that can provide such services at a cost far lower than developing an in-house program. Even so, a lot of organizations will try to do this in house in the mistaken belief that it is cheaper, and they will end up spending a fortune before finally calling it off.
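
As a rough illustration of what such a program has to track, here is a minimal Python sketch of a vulnerability log that assigns an example high/medium/low ranking from the CVSS base score and pulls the trailing 12 months for penetration test planning. The thresholds, record fields and sample entries are my own assumptions for illustration, not anything prescribed by the PCI DSS.

# Minimal sketch of a vulnerability log supporting requirement 6.1-style risk
# ranking and the 12-month lookback needed for penetration test planning.
# The CVSS thresholds below are illustrative assumptions, not values
# prescribed by the PCI DSS.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Vulnerability:
    identifier: str        # e.g., a CVE number or internal ticket ID
    description: str
    cvss_base_score: float
    discovered: date
    affects_critical_system: bool = False

def risk_ranking(vuln: Vulnerability) -> str:
    """Assign an example high/medium/low ranking from the CVSS base score."""
    if vuln.cvss_base_score >= 7.0:
        return "high"
    if vuln.cvss_base_score >= 4.0:
        return "medium"
    return "low"

def last_twelve_months(vulns: list[Vulnerability],
                       as_of: date | None = None) -> list[Vulnerability]:
    """Return vulnerabilities discovered in the trailing 12 months."""
    as_of = as_of or date.today()
    cutoff = as_of - timedelta(days=365)
    return [v for v in vulns if v.discovered >= cutoff]

# Example usage with fabricated sample entries.
log = [
    Vulnerability("CVE-XXXX-0001", "Sample TLS library issue", 7.5, date(2014, 4, 10)),
    Vulnerability("INT-0042", "Weak cipher on POS terminal", 5.3, date(2013, 11, 2), True),
]
for v in last_twelve_months(log, as_of=date(2014, 9, 1)):
    print(v.identifier, risk_ranking(v))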

Regardless of how you choose to meet this requirement, this is not something you can do overnight and be done with. As a result, you need to start planning your solution now so that you are not caught flat-footed at the deadline.

Wireless Inventory

Requirement 11.1.1 states:

  • “Maintain an inventory of authorized wireless access points including a documented business justification.”

This is not an inventory of only those wireless access points that are in scope. This is an inventory of all wireless access points. For organizations that have invested heavily in wireless, this could be an issue and take a while to produce.

Wireless security and management solutions such as those from Cisco, AirDefense, Fortinet or similar vendors can typically provide the inventory. However, the business justification could take time to develop given all of the uses of wireless networks. That is because the PCI SSC told QSAs at the Community Meeting that business justifications need to be a true discussion of all of the business purposes, not “just because.”
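
For illustration, here is a minimal Python sketch of the kind of record such an inventory might hold, with the business justification captured alongside the technical details. The field names and sample values are assumptions on my part; in practice this data would live in your wireless management platform or CMDB.

# Minimal sketch of a wireless access point inventory record for
# requirement 11.1.1. Field names and sample values are illustrative
# assumptions, not a prescribed format.
from dataclasses import dataclass

@dataclass
class WirelessAccessPoint:
    ssid: str
    mac_address: str
    location: str
    owner: str
    business_justification: str  # the hard part: a real reason, not "just because"

inventory = [
    WirelessAccessPoint(
        ssid="WAREHOUSE-SCANNERS",
        mac_address="00:11:22:33:44:55",
        location="Distribution center, building 2",
        owner="Logistics IT",
        business_justification=(
            "Handheld inventory scanners require wireless connectivity to the "
            "warehouse management system; segmented from the CDE."
        ),
    ),
]

for ap in inventory:
    print(f"{ap.ssid} ({ap.mac_address}) - {ap.business_justification}")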

So if you do not have an inventory of all of your wireless access points, be prepared to create an inventory or invest in a solution that will develop it automatically. And then start the process of justifying that wireless network. But before going out to buy that wireless management solution, you might want to move on to requirement 2.4…

Maintain an Inventory of In-Scope Devices

If 11.1.1 is a problem, then requirement 2.4 will probably push you over the edge. Requirement 2.4 states:

  • “Maintain an inventory of system components that are in scope for PCI DSS.”

Just as requirement 10 drove sales of security information and event management (SIEM) solutions, requirement 2.4 will likely drive sales of configuration management database (CMDB) solutions.

This is not just the systems and devices that come into direct contact with sensitive authentication data (SAD), also known as category 1 in the Open PCI Scoping Toolkit. It also includes the systems and devices that could influence those category 1 systems, otherwise known as category 2. As a result, the inventory will likely get very long very quickly and become difficult to maintain and manage.
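
As an illustration of what each inventory entry needs to capture, here is a minimal Python sketch that records a scoping category for every component using the Open PCI Scoping Toolkit’s category 1/category 2 language. The fields and sample entries are assumptions for illustration only, not a prescribed format.

# Minimal sketch of an in-scope system component inventory for requirement
# 2.4. Categories follow the Open PCI Scoping Toolkit language used above;
# field names and sample entries are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SystemComponent:
    hostname: str
    component_type: str    # e.g., "POS terminal", "firewall", "domain controller"
    scoping_category: int  # 1 = direct contact with SAD, 2 = can influence category 1 systems
    description: str

inventory = [
    SystemComponent("pos-001", "POS terminal", 1, "Processes card-present transactions"),
    SystemComponent("dc-01", "domain controller", 2, "Authenticates administrators of CDE systems"),
]

in_scope = [c for c in inventory if c.scoping_category in (1, 2)]
for c in in_scope:
    print(f"{c.hostname}: category {c.scoping_category} ({c.component_type})")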

But in order to meet this requirement, you will have to have developed the integrated data flow diagram that includes the network diagram required by requirement 1.1.3, which I discussed in Part 1. Without that diagram, developing a truly complete inventory will be nearly impossible because you will be unable to determine what is in scope and what is not.

For most people, this is not going to be a simple task, and you will probably need to get started sooner rather than later if you expect to meet the 2015 deadline.

Service Provider Credentials

A best practice until June 30, 2015, requirement 8.5.1 will likely take service providers a while to implement. Requirement 8.5.1 states:

  • “Service providers with remote access to customer premises (for example, for support of POS systems or servers) must use a unique authentication credential (such as a password/phrase) for each customer.
  • Note: This requirement is not intended to apply to shared hosting providers accessing their own hosting environment, where multiple customer environments are hosted.”

The driver behind this requirement is that too many breaches were determined to have been caused by a vendor having remote access to customers’ equipment and using the same credentials to gain access to every customer. 

Obviously, once those credentials were compromised, it became relatively simple for an attacker to gain access to any customer serviced by that vendor. As a result, the PCI SSC now requires service providers to use different credentials to gain access to each of their customers’ networks.

Now this would seem like a fairly simple problem to solve. There are all sorts of enterprise password vault solutions built for exactly this purpose. However, I can tell you from the experience of service provider clients that have implemented such solutions that making them work without impacting service level agreements (SLAs) and other contractual obligations can be very problematic.

It can also take up to a year to work out all of the bugs and get back to normal operations. During that time, penalties for missed SLAs and other contractual issues can add significantly to the cost of implementing the vault.
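
Setting the operational pain aside, the core idea of the requirement is simple: every customer gets its own credential, never a shared one. Here is a minimal, purely illustrative Python sketch of that idea using an in-memory store; a real deployment would rely on an enterprise password vault for generation, storage, rotation and check-out, and the customer names below are invented.

# Minimal sketch of the per-customer credential idea behind requirement
# 8.5.1: each customer gets its own randomly generated secret, never a
# shared one. This toy in-memory store is only an illustration; a real
# deployment would use an enterprise password vault.
import secrets

class CredentialStore:
    """Toy per-customer credential store (real deployments use a vault)."""
    def __init__(self) -> None:
        self._secrets: dict[str, str] = {}

    def issue(self, customer_id: str) -> str:
        """Generate and record a unique credential for one customer."""
        credential = secrets.token_urlsafe(24)
        self._secrets[customer_id] = credential
        return credential

    def get(self, customer_id: str) -> str:
        return self._secrets[customer_id]

store = CredentialStore()
for customer in ("acme-retail", "example-grocer"):  # invented customer names
    store.issue(customer)

# No two customers share the same remote-access credential.
assert store.get("acme-retail") != store.get("example-grocer")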

As a result, if you are a service provider and you are providing management, monitoring or other services that require your personnel to have remote access to your customers’ networks, then you need to start planning for a credential vault solution as soon as possible so that you can have it operating effectively by the June 30, 2015 deadline.

6.5.6 Didn’t Make the Final Cut – Are You Sure?

A lot of people have commented on the fact that new requirement 6.5.6 is missing from the final version of the PCI DSS. Requirement 6.5.6 in the draft was focused on security of SAD in memory.

The guidance for this requirement very clearly explains why the Council thinks it is important. It is a response to attackers infiltrating merchants’ POS environments and using memory scrapers to obtain SAD.

  • “Attackers use malware tools to capture sensitive data from memory. Minimizing the exposure of PAN/SAD while in memory will help reduce the likelihood that it can be captured by a malicious user or be unknowingly saved to disk in a memory file and left unprotected. This requirement is intended to ensure that consideration is given for how PAN and SAD are handled in memory. The specific coding techniques resulting from this activity will depend on the technology in use.”

At the 2013 North American PCI Community Meeting, most of the comments about this requirement revolved around how QSAs were expected to actually test for this. A lot of people felt it was going to be too easy for developers to blow off QSAs on this issue, so why bother to test for it?

Apparently all of our interest in the requirement had an effect, because draft requirement 6.5.6 was removed.

However, draft requirement 6.5.6 did not entirely go away. If you look at testing procedure 6.5.c in the final version of the PCI DSS, it was changed from this:

  • “Examine records of training to verify that software developers received training on secure coding techniques.”

To this:

  • “Examine records of training to verify that software developers received training on secure coding techniques, including how to avoid common coding vulnerabilities and understanding how sensitive data is handled in memory.”

QSAs will have to determine whether developers understand that SAD in memory is a security issue that needs to be addressed. The key takeaway from this discussion, however, is that unlike draft requirement 6.5.6, secure memory handling is not a best practice deferred until June 30, 2015. Developers will have to begin addressing this issue immediately and will need to be able to prove that it is being addressed.
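
As a purely illustrative sketch of the underlying principle, the Python snippet below keeps sensitive data in a mutable buffer and scrubs it as soon as it has been used. Garbage-collected languages can still leave copies of the data around, and as the Council’s guidance notes, the specific technique depends on the technology in use, so treat this as a sketch of the idea rather than a prescribed coding practice.

# Minimal sketch of the "minimize exposure of SAD in memory" principle: keep
# sensitive data in a mutable buffer and overwrite it as soon as it has been
# used. Illustration only; the right technique depends on the technology in
# use, and managed runtimes can still leave copies of the data in memory.
def process_track_data(track_buffer: bytearray) -> None:
    """Use the track data, then scrub the buffer in place."""
    try:
        # ... hand the buffer to the payment-processing code here (hypothetical) ...
        pass
    finally:
        # Overwrite the sensitive bytes so they do not linger in memory
        # or end up unprotected in a memory dump or swap file.
        for i in range(len(track_buffer)):
            track_buffer[i] = 0

track = bytearray(b"%B4111111111111111^DOE/JOHN^2512...?")  # fake sample track data
process_track_data(track)
assert all(b == 0 for b in track)  # buffer has been scrubbed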

I believe this is the requirement that is likely to create the most problems.


Jeff Hall

Principal Security Consultant

Jeff Hall is a principal consultant in Optiv’s advisory services practice on the Payment Card Industry (PCI) compliance team. Jeff’s role is to provide post-sales support and consulting to Optiv’s clients as well as providing support and mentoring to other Optiv team members. He has more than 30 years of experience in project management, information security, information security strategic planning, software evaluation, selection and implementation, voice and data networking, systems analysis and design, information system audit, systems programming, and data center operations.