Lessons from the Equifax Data Breach Report

Nick Deshpande
6 min read · Dec 18, 2018


This month’s House Oversight and Government Reform Committee report on the Equifax data breach is worthwhile reading for cybersecurity practitioners. The ~100-page document is instructive, with implications across the gamut of security functions: compliance and governance, cryptography, DFIR, vulnerability management and remediation, and so on. While some of the shortcomings that led to the breach are highly contextual and specific to Equifax, one can still derive lessons that apply more broadly to organizations in any industry. To that end, I’ve made a few observations with red teaming, governance, and general sound practice in mind.

For Red Teamers

No one can say for certain that a red team (either internal or professionally engaged) or involvement in a bug bounty program would have led to the discovery of the vulnerabilities and shortcomings that led to the massive breach. But as a practice, red teaming would likely have uncovered some of the issues that contributed to the breach, and could have validated the sense of urgency assigned to open issues, thus helping with prioritization.

In this case, there was some low-hanging fruit (unencrypted credentials, unencrypted PII, etc.), and arguably a red team should be free to focus on more deliberate attack techniques against crown jewels and their supporting systems.

The report suggests that a successful red team engagement relies on understanding the target from an organizational-behaviour perspective. The attackers were clearly able to exploit gaps in Equifax’s processes and team responsibilities. A target system, composed of technical components, doesn’t exist in a vacuum. Rather, it should be understood as the result of processes established by the organization that owns it. While time-consuming, it can be helpful to establish a narrative about the organization: its values, structure, hiring, financial filings, etc.

Red teams might also choose to:

  • Maintain a library of webshells as JSP files or servlets to test against target web servers.
  • Compile a list of all internet-facing applications and services belonging to the target (fairly standard).
  • Acquire audit findings, if available — they make great blueprints.
  • Monitor certificate issuance and expiry dates — if the process to renew certs (e.g. on web servers) is manual, the lapse may grant attackers a window of opportunity.
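On the last point, a minimal sketch of expiry monitoring (Python; the host list and alert threshold are assumptions you would supply from your target inventory):

```python
import ssl
import socket
from datetime import datetime, timezone

def cert_days_remaining(not_after, now=None):
    """Days until a certificate expires, given its notAfter field
    in OpenSSL's text format (e.g. 'Jun  1 12:00:00 2026 GMT')."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    now = now or datetime.now(timezone.utc)
    return (expires - now).days

def fetch_not_after(host, port=443, timeout=5.0):
    """Fetch the leaf certificate's notAfter field from a live endpoint."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.getpeercert()["notAfter"]
```

Running fetch_not_after over the internet-facing inventory from the second bullet and flagging hosts where cert_days_remaining falls below a threshold gives a rough picture of where manual renewal processes may be lagging.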

For Governance and Compliance Practitioners

The Equifax hackers compromised a system commissioned in the 1970s. It had undoubtedly experienced many iterations over the years, only to find itself hosted in a legacy environment running outdated technology and vulnerable middleware. None of this is unique to Equifax (which was undertaking a massive, multi-year migration to an SDN environment). The services and applications an organization builds can grow its attack surface if not properly governed.

Own your apps. With increasingly sensitive operations performed via web portals, security governance and involvement in the project lifecycle have never been more important. Privacy- and security-by-design remain elusive even as the business case for such practices grows more compelling. An organization’s security team must demonstrate intimate familiarity* with custom-built applications, essentially those at greatest risk of becoming “legacy” because they are not maintained by an external entity held to SLA and maintenance terms. The accountability is internal, which places a greater burden on ensuring ownership across the organization: within the business, IT, and security.

*Intimate familiarity applies to dependencies as well: integrations, third-party code libraries, their licenses, versions, etc. It starts with the development and delivery teams, and the security practitioners assigned to them, taking stock of all of it, documenting it, and maintaining it throughout the application’s lifecycle. This becomes part of the wider inventory of software and assets leveraged across the organization.
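As a starting point for that inventory, a minimal sketch (Python; the manifest filenames listed are an assumption — extend the set for the ecosystems your teams actually use) that walks a source tree and surfaces the dependency manifests worth cataloguing:

```python
from pathlib import Path

# Common manifest files that declare third-party dependencies.
# This set is illustrative; extend it for your organization's stacks.
MANIFESTS = {"requirements.txt", "package.json", "pom.xml",
             "go.mod", "Gemfile", "build.gradle"}

def find_dependency_manifests(root):
    """Return every dependency manifest found under `root`, sorted by path.
    Each hit points at third-party code that belongs in the wider inventory."""
    return sorted(p for p in Path(root).rglob("*") if p.name in MANIFESTS)
```

Feeding the results into whatever asset register the organization already maintains is the point; the script only finds the raw material.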

Controls. When deviations are uncovered, it’s vital to immediately propose compensating controls that can be provisioned before a full remediation of the vulnerable system can occur. Greater scrutiny of such systems should be facilitated by a SOC or, if required, a third party.

Data governance. Typically led by the legal department, data governance practices (e.g. record retention) must be imbued across the organization. All data must be subject to protocols informed by applicable laws, regulations, and internal standards. Today, data, PII included, is aggregated in data lakes and in systems used to derive value from it. Leaders must understand what data is kept where and what safeguards have been applied to it. In part, this is an inventory challenge, which lies at the heart of so many security incidents: if we don’t know we have an asset, or what state it’s in, how can we protect it? The same goes for data.

Other lessons

Reporting. There’s a lot to be said about metrics, since it appears there was a degree of guesswork about the status of remediation in both the Equifax security and IT teams. Senior leaders decried the lack of visibility into the more mundane tasks of tracking vulnerable systems and remediation efforts; while they may not have been responsible for such work, they remained accountable.

Leaders must demand sound reporting, which becomes the basis for decision-making related to resource allocation, budget justifications, employee advancement, and so on. Teams should be given (and have input about) clear measures of effectiveness (MOE) and measures of performance (MOP) to understand whether a desired result is being achieved, and how efficiently. Such metrics, combined with other factors, can serve as unbiased evidence for team or employee scorecards. More importantly, they allow for transparent goal setting and visibility at the right levels of the organization.

Granted, there are ways of counting that amplify or hide problematic issues, so treat metric design as an iterative project with executive oversight and input from across the organization. MOEs and MOPs are also challenging when decision-making and approvals are diffuse; thus, the time it takes to make a decision (e.g. as it relates to incidents or remediation) should itself be measured.
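As one concrete MOP, a small sketch (Python; the record shape is illustrative, not from the report) that measures time-to-remediate from finding records — the kind of number that replaces guesswork about remediation status:

```python
from datetime import date
from statistics import median

def days_to_remediate(opened, closed):
    """Elapsed days between a finding being opened and closed."""
    return (closed - opened).days

def remediation_mop(records):
    """A simple measure of performance: how fast do we close findings?
    `records` is a list of (opened, closed) date pairs for remediated findings."""
    durations = [days_to_remediate(o, c) for o, c in records]
    return {"count": len(durations),
            "median_days": median(durations),
            "max_days": max(durations)}
```

Tracking the median alongside the maximum matters: a healthy median can coexist with a long tail of stalled remediations, which is exactly the visibility gap the report describes.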

Pull security in. The narrative about the various teams that emerges in the House report reflects a dynamic that exists in most enterprises — security imposes, the business (and IT) comply, and security checks compliance. It’s a push model.

This must change.

The security team’s mission is to keep the organization — composed of people, property, assets, and data — secure (at the most basic: available, confidential, and intact). This is a shared mission. Increasingly, the business and IT teams must be inspired to pull security into their systems and applications, and to demand that the most secure environment possible is achieved and maintained, while understanding there is no perfect security. Leaders in these teams must continuously challenge their counterparts in security to meet them where it matters most: in the service of customers. Practically, this could mean getting a security representative in the room while a project is still conceptual. Eventually, a process should be established to formalize such engagements, but it’s equally dependent on culture.

About culture. No vendor sells culture. There are solutions to build awareness and training to develop capacity in the areas where risk is manifest (e.g. secure coding), but only capable leaders — starting with those most senior — can lay the foundation for a security-conscious culture that embraces x-by-design principles and eradicates the push-pull dynamic that tends to exist between security and the business/IT teams. Senior leaders should frame security as an enabler at every opportunity and, as above, select metrics that inform them about security conscientiousness.

[Figure: a timeline of vulnerability disclosures related to Apache Struts 2 and the Equifax breach. The attacker was able to remain in the network for at least three months. By the author.]

Conclusion

The report does much to dispel the narrative, advanced a few months ago and not widely embraced in any case, that the breach could be attributed to a single employee failing to notify system owners about a patch. At the same time, it highlights issues that, when considered, can become important lessons. Equifax learned a hard lesson and made many changes following the breach. A read of the House report and other assessments could mean that other organizations don’t have to experience the same pain before building sound security practices into their journey of continuous improvement.
