
Trusted Assurance Simplified

Kathleen Moriarty


In the third of this series of guest posts, Kathleen Moriarty talks about the importance of posture assessment - the process of evaluating organisational or system security - and looks at solutions that simplify that process and could help organisations achieve higher levels of trusted assurance.


Security automation for posture assessment has been difficult to achieve, even though many standards-based and proprietary solutions have been developed. The primary problem is the complexity of these solutions, which require customisation by each enterprise. Additionally, some solutions work best in a homogeneous environment and become more complex when the environment is heterogeneous. This overhead limits posture assessment to organisations with the resources to manage complex solutions, or to those that outsource to managed security service providers.

The Importance of Posture Assessment to Trusted Assurance

Posture assessment is the process of evaluating organisational or system security by examining the respective components. It aids in an overall understanding of the security posture of a network and is an important input to comprehensive Governance, Risk, and Compliance (GRC) solutions. An understanding of an organisation's current security posture makes business risk decisions easier to take. The organisation can then establish and maintain security, privacy, safety, IT, compliance, and legal controls, even as new vulnerabilities, protocol changes, or other required adaptations emerge.

How do we eliminate the requirement for customisation at each location and close this gap? Products that are secure out-of-the-box and provide an automated posture assessment capability for system and application security at the time of deployment centralise security policy establishment with a small group of experts. This eliminates the need for distributed expertise, lessening the burden through an architectural pattern for security management that scales.

Attestation from a Root of Trust

Posture assessment requires periodic comparisons of current controls to expected values. Having posture assessment as part of products when deployed is possible, but it will take some time until the industry can provide these capabilities.

Attestations, described simply, are signed objects that contain evidence. Attestation from a root of trust holds promise as a scalable solution to assure system and application-level security. Attestation at boot and at runtime assures that the individual hardware and software components are as expected (verified) and can therefore be trusted. The attestation is provided as part of security solutions that build in controls not only to assess hardware and firmware but also to remediate at a granular component level. A root of trust may be hardware-based (e.g., TPM, AWS Nitro, OpenTitan) or virtual, where attestations may be created or verified in a trusted execution environment (TEE), for instance. Support for attestations that assess system policy or measurements is beginning to move up the stack.

We are starting to see progress in work using attestations. The evidence may be generated at boot or at runtime, from hardware or software, and is then compared to a set of established policies or measurements. The policies or measurements may be provided by the manufacturer or by another authoritative entity such as a trusted integrator or managed service provider. The policies and measurements must also be protected to ensure that verification can be trusted.
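
To make the idea of "signed objects that contain evidence" concrete, the sketch below builds and verifies a minimal attestation for a firmware image. It is an illustration only: an HMAC with a device key stands in for a hardware-backed signature such as a TPM quote, and the reference measurement is assumed to be published by the manufacturer.

```python
# Minimal attestation sketch - not any specific vendor or TPM API.
# An HMAC stands in for a hardware-backed signature; a real root of
# trust would keep the signing key in hardware.
import hashlib
import hmac
import json

ATTESTATION_KEY = b"device-unique-key"  # hypothetical device key for illustration

def create_attestation(component: str, firmware_image: bytes) -> dict:
    """Produce a signed evidence object for one measured component."""
    measurement = hashlib.sha256(firmware_image).hexdigest()
    evidence = {"component": component, "measurement": measurement}
    payload = json.dumps(evidence, sort_keys=True).encode()
    signature = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    return {"evidence": evidence, "signature": signature}

def verify_attestation(attestation: dict, reference_measurement: str) -> bool:
    """Check the signature first, then compare evidence to the expected value."""
    payload = json.dumps(attestation["evidence"], sort_keys=True).encode()
    expected_sig = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected_sig, attestation["signature"]):
        return False  # evidence was tampered with or signed by an unknown party
    return attestation["evidence"]["measurement"] == reference_measurement

# Example: verify a (made-up) firmware image against its expected hash.
firmware = b"\x7fELF...firmware bytes..."
att = create_attestation("uefi-firmware", firmware)
print(verify_attestation(att, hashlib.sha256(firmware).hexdigest()))  # True
```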

Reference Integrity Measurements

The measurements and policies may be implemented as described in "Reference Integrity Measurements", used locally on the system for remediation or by a management system via remote attestation (a topic that will be covered in the next blog in this series). At boot, each component relies on the measurements of the components it depends on meeting expected values; otherwise, remediation actions may be triggered. Measurements of firmware or software might include hashes of the binary, side-channel information, or other observable pieces of information. The policies may even be provided in levels, depending on the expected business or application requirements. The administrator may then select which pre-configured security level is desired for their application and business needs.
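
The comparison step itself is straightforward, as the sketch below shows: measured boot components are checked against a vendor-supplied reference manifest and any mismatch is flagged for remediation. The manifest layout, component names, and hash values are hypothetical and do not follow the Trusted Computing Group's RIM schema.

```python
# Illustrative comparison of boot-time measurements against a reference
# manifest; component names and hash values are placeholders only.
REFERENCE_MANIFEST = {
    "bootloader": "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    "kernel":     "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def assess_boot_measurements(measured: dict) -> list:
    """Return the components whose measurements differ from the reference."""
    needs_remediation = []
    for component, expected in REFERENCE_MANIFEST.items():
        if measured.get(component) != expected:
            needs_remediation.append(component)
    return needs_remediation

# Simulate drift in the kernel measurement; the mismatch would trigger a
# remediation action, e.g. re-provisioning the component or quarantining the host.
measured_at_boot = dict(REFERENCE_MANIFEST, kernel="deadbeef" * 8)
print(assess_boot_measurements(measured_at_boot))  # ['kernel']
```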

While the boot processes and runtime measurements may be the same independent of the security policy, other settings that may be attested to could vary, such as the use of encryption, algorithms selected, key sizes, or password strength. Those responsible for managing systems may select from vendor-established policy levels, easing the need for customisation, while allowing for the flexibility often required to make business risk-based decisions.
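
As an illustration of how vendor-established policy levels might be offered to an administrator, the sketch below encodes two hypothetical levels and checks a system's attested settings against the selected one. The level names and setting keys are assumptions made for this example, not drawn from any published benchmark.

```python
# Hypothetical vendor-provided policy levels; names and settings are
# illustrative, not taken from any published benchmark.
POLICY_LEVELS = {
    "baseline": {"disk_encryption": True, "min_rsa_key_bits": 2048, "min_password_length": 12},
    "strict":   {"disk_encryption": True, "min_rsa_key_bits": 3072, "min_password_length": 16},
}

def check_settings(attested: dict, level: str) -> dict:
    """Compare attested settings to the selected policy level; return any failures."""
    failures = {}
    for setting, required in POLICY_LEVELS[level].items():
        value = attested.get(setting)
        if isinstance(required, bool):
            ok = value is required
        else:
            ok = isinstance(value, int) and value >= required
        if not ok:
            failures[setting] = {"required": required, "attested": value}
    return failures

# An administrator selects the level matching their business risk appetite.
print(check_settings(
    {"disk_encryption": True, "min_rsa_key_bits": 2048, "min_password_length": 14},
    "strict",
))  # both the key size and password length fall short of the 'strict' level
```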

In a model where the policies and measurements for automated verification are vendor-provided, this technology assures transparency in the logs and evidence created. Technical analysts at an organisation can inspect the attestation verification results in log files, while organisations that lack such resources still receive fully automated assessments to the expected assurance levels. Even where on-site staff do not inspect the attestation results, auditors may inspect all or a sample of the gathered and assessed attestations.

How Transparency Helps to Build Trusted Assurance

Transparent verification of controls to meet trusted assurance levels aids in building trust in the provider of the software or hardware. Transparency of key performance indicators, telemetry, and controls bolsters trust in infrastructure, applications, and data. Attestation can shift the responsibility for establishing posture assessment automation for data, application, and system assurance to the vendor, thus helping to close the gap in the number of information security professionals required.

Vendors should base their attestations on trusted controls, such as the CIS Benchmarks and CIS Controls, NIST Special Publications (e.g., NIST SP 800-193, NIST SP 800-53), or the Trusted Computing Group’s Reference Integrity Measurements. This reliance on controls and benchmarks established by a trusted set of experts in a formal process aids in customer adoption.
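
One way to keep that link to established controls auditable is to tag each attested check with the control identifiers it provides evidence for, as sketched below. The NIST SP 800-53 control IDs shown are real, but the check names and the mapping itself are a hypothetical illustration.

```python
# Hypothetical mapping of attested checks to published control identifiers;
# the check names and the mapping are illustrative only.
CONTROL_MAPPING = {
    "firmware_measurement_verified": ["NIST SP 800-53 SI-7"],   # software/firmware integrity
    "disk_encryption":               ["NIST SP 800-53 SC-28"],  # protection of information at rest
    "min_password_length":           ["NIST SP 800-53 IA-5"],   # authenticator management
}

def controls_satisfied(check_results: dict) -> dict:
    """Group check results by the controls they provide evidence for."""
    satisfied = {}
    for check, passed in check_results.items():
        for control in CONTROL_MAPPING.get(check, []):
            satisfied.setdefault(control, []).append((check, passed))
    return satisfied

print(controls_satisfied({"firmware_measurement_verified": True, "disk_encryption": True}))
```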

__________________________________________


Originally published as a CTO blog post on the CIS website.



About the author

Kathleen Moriarty is a technology strategist and board advisor, helping companies lead through disruption, and an Adjunct Professor at Georgetown SCS, where she also offers two corporate courses on Security Architecture and Architecture for the SMB Market. As the former Chief Technology Officer of the Center for Internet Security, Kathleen defined and led the technology strategy, integrating emerging technologies.

Prior to CIS, Kathleen held a range of positions over 13 years at Dell Technologies, including Security Innovations Principal in the Dell Technologies Office of the CTO and Global Lead Security Architect for the EMC Office of the CTO, working on ecosystems, standards, risk management, and strategy. In her early days with RSA/EMC, she led consulting engagements with hundreds of organisations on security and risk management, gaining valuable insight into managing risk against business needs. During her tenure in the Dell EMC Office of the CTO, Kathleen had the honor of being appointed and serving two terms as the Internet Engineering Task Force (IETF) Security Area Director and as a member of the Internet Engineering Steering Group from March 2014 to 2018.

Kathleen was named in CyberSecurity Ventures' Top 100 Women Fighting Cybercrime and is a 2020 Tropaia Award Winner for Outstanding Faculty, Georgetown SCS. She is a keynote speaker, podcast guest, frequent blogger bridging a translation gap for technical content, conference committee member, and has been quoted in publications such as CNBC and Wired. Kathleen has over twenty-five years of experience driving positive outcomes across Information Technology Leadership, short- and long-term IT Strategy and Vision, Information Security, Risk Management, Incident Handling, Project Management, Large Teams, Process Improvement, and Operations Management in multiple roles with MIT Lincoln Laboratory, Hudson Williams, FactSet Research Systems, and PSINet. Kathleen holds a Master of Science degree in Computer Science from Rensselaer Polytechnic Institute, as well as a Bachelor of Science degree in Mathematics from Siena College.

Published work: Transforming Information Security: Optimizing Five Concurrent Trends to Reduce Resource Drain, July 2020.
