NIST Alignment with Artifactory

Wayne Chatelain
Sr. Mgr, Software Engineering

The National Institute of Standards and Technology (NIST) in the US publishes guidelines for maintaining application security and countermeasures for managing major risks.

With binary repositories and container registries being key components in the enterprise software delivery lifecycle, Artifactory plays a central role in security and risk management.

We will explore some of the NIST recommendations and examine how Artifactory can be aligned with the NIST guidelines.

Video transcript

Hello everyone, my name is Wayne Chatelain and I would like to again welcome you to the JFrog SwampUP conference for 2021. Since the conference is virtual this year, we will be able to interact through chat as the session is being presented today. I want to encourage everyone to reach out in the chat feature and ask questions. We can leave time towards the end to get deeper into questions, but just remember that it will all be done in the chat session.

Again, I would like to thank everyone for attending today. In this session, we’re going to look at some key techniques to bring your Artifactory implementation into alignment with NIST. But first, let me introduce myself.

My name is Wayne Chatelain, and I’ve been with Capital One for over 15 years. I lead a team of software developers and site reliability engineers that is responsible for the entire Artifactory ecosystem at Capital One. We centralize binary management capabilities at a global scale, and help enable other teams across Capital One to deliver great software capabilities for our customers and associates.

In this session today, we are going to look at a few topics around NIST.

First, I’m going to talk about what NIST is, then we are going to dig a little deeper into the history of NIST, and finally, we’re going to review some of the best practices and recommendations of NIST and how they impact Artifactory. When I was thinking about this presentation, the main focus I wanted to touch on was the configuration and implementation of Artifactory itself. A lot of us have probably read about best practices and recommendations from JFrog or other companies for installing, rolling out, and configuring Artifactory.

We’ve had a lot of opportunities to interact with other companies, at conferences like this one, to discuss how everyone is being successful. But really, how do we know if what we’re doing with Artifactory is good enough? That’s what I want to take a deeper dive into today.

You know, there are actually several different ways to interpret this question of whether your Artifactory implementation is good enough. Usually you think about it in terms of cyber security. But are there other vectors that we should be considering as well?

Like developer experience, audit experience, and compliance experience; we’ll talk about the impacts on these other personas as well. This is where NIST can help us.

NIST actually provides us with standards, recommendations, and best practices across all the different parts of technology. It addresses security, compliance, software development, resiliency, and a host of other ideas and patterns. By using this guidance, we can get an independent, third-party viewpoint into Artifactory, the product’s implementation, and its configurations.

 Some of these recommendations are familiar to everyone and come out of the box with Artifactory. Some are the best practices we hear about from JFrog. But some of the recommendations rely on the way that Artifactory is implemented and configured in your organization.

Okay, so I think I know what you’re thinking about this acronym, NIST. I’ve said it a few times already. What does it even mean? NIST stands for the National Institute of Standards and Technology. It’s an agency of the United States government and part of the Department of Commerce. It’s a non-regulatory body that tries to help drive innovation and competitiveness through measurement science, standards, and technology.

 NIST has actually been around for quite a long time. I’m not going to go through the entire history of NIST, but I did want to point out some of the highlights.

NIST was started all the way back in 1901, and was first known as the National Bureau of Standards, or NBS. Its original focus was to provide guidance on weights and measures, and to serve as a physical laboratory for the US.

When we talk about weights and measures, we start thinking about things like kilograms, pounds, meters, feet, and so on. This is on the right track for what the NBS was originally responsible for. They ended up setting up a program to provide metrology services to the US; metrology is the scientific study of measurement.

 Over the years, this continued to evolve with the direction of government to develop standards for commercial materials and products. Quality standards were introduced for things like clothing, headlights for vehicles, electrical safety, and much more.

This eventually led to the development of one of the first generations of computers. Some of us may have heard of SEAC, the Standards Eastern Automatic Computer, which was developed by the NBS. As the world modernized, the influence and responsibilities of the NBS continued to evolve, which eventually led to the organization being renamed in 1988 to the National Institute of Standards and Technology.

Today, their measurements support everything from the smallest of technologies, like objects smaller than a human hair, to the largest, like earthquake-resistant skyscrapers. Over the years, NIST has made significant impacts and contributions to the evolution of industry and innovation.

I wanted to highlight a few of the accomplishments and achievements that came out of NIST. The timeline we’re looking at is not representative of all the achievements from NIST, just a few of the ones that I thought were important and probably familiar to some of us. On here, you can see things like the first neon sign, which debuted in 1904 at the World’s Fair; the National Electrical Safety Code, introduced in the US in 1915; the first natural-color photographs of a solar eclipse, taken in 1936; and one of the first atomic clocks, introduced in 1949.

In 1950, the SWAC computer debuted, and in 1953, the first panoramic dental X-rays were taken. In 1957, the first digital images were taken with the help of NIST. Some of the first data encryption standards were introduced in the US with the help and influence of NIST in 1977.

In 1993, the Internet Time Service was launched with the influence of NIST, and in 2000, the Advanced Encryption Standard, or AES, was introduced. Some of these incredible accomplishments have had such a notable impact that several people who have collaborated with NIST have been awarded Nobel Prizes. In 1997, Bill Phillips was awarded the Nobel Prize in Physics for his work with laser cooling. In 2001, Eric Cornell was awarded the Nobel Prize in Physics for creating a never-before-seen state of matter, the Bose-Einstein condensate. In 2005, Jan Hall was awarded the Nobel Prize in Physics for contributions to the optical frequency comb technique for precision measurements with lasers. In 2011, Dan Shechtman was awarded the Nobel Prize in Chemistry for discovering crystal structures previously unknown to science, quasicrystals. And in 2012, Dave Wineland was awarded the Nobel Prize in Physics for experimental methods that enabled measuring and manipulating individual quantum systems.

Now, I’m not going to claim to be an expert on all of this, but I wanted to highlight how NIST has been highly influential in the US across multiple domains. One that we already touched on in the last few minutes is information technology. Just like in other fields, NIST covers a variety of topics in the IT domain as well: everything from artificial intelligence, cloud computing, IoT, and mobile to biometrics and augmented reality, among others.

NIST does research and collaborates with industry experts and professionals to help produce standards, recommendations, and publications to advance and measure existing and emerging technologies. Each of these topics has a host of subtopics as well. In cybersecurity, for example, this covers topics related to cryptography, identity and access management, risk management, trustworthy networks, and more. NIST develops cybersecurity standards, guidelines, best practices, and resources to meet the needs of US industry, federal agencies, and the broader public.

Following these recommendations and best practices is where we can leverage Artifactory and our implementation of the product features to help measure and ensure that what we are doing aligns with the highest industry standards.

I’ve done some of that analysis and will highlight some of what I have learned about the NIST recommendations when it comes to a binary repository. One of the first recommendations we’re going to look at is vulnerability detection and management. NIST recommends that your applications are built with repeatable stages and that those stages are automated within a pipeline.

 This is probably something that a lot of us are doing, especially with the evolution of DevOps and DevSecOps but it’s a good reminder to take a step back and evaluate your pipeline. Look across all of the stages of your release strategy, and determine where there are manual steps and work towards automating them. Also, vulnerability detection should be injected across all of the stages where your application is transformed, packaged, stored, or running.

Anywhere binaries and applications are pushed across different platforms is a potential point of injecting vulnerabilities into your application. One key point here is to make sure that you have vulnerability scanning enabled with Artifactory.
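As one concrete illustration, here is a minimal sketch that pulls the recorded vulnerabilities for a single artifact, assuming JFrog Xray’s artifact summary endpoint (POST /xray/api/v1/summary/artifact); the host, token, and repository path are hypothetical placeholders for your own environment.

```python
import requests

BASE_URL = "https://artifactory.example.com"  # hypothetical host
TOKEN = "YOUR_ACCESS_TOKEN"                   # hypothetical credential

def get_vulnerability_summary(repo_path: str) -> list:
    """Return the issues Xray has recorded for one artifact path."""
    resp = requests.post(
        f"{BASE_URL}/xray/api/v1/summary/artifact",
        json={"paths": [repo_path]},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    artifacts = resp.json().get("artifacts", [])
    return artifacts[0].get("issues", []) if artifacts else []

if __name__ == "__main__":
    # Hypothetical artifact path; adapt to your repository layout.
    for issue in get_vulnerability_summary("docker-local/my-app/2.3/manifest.json"):
        print(issue.get("severity"), "-", issue.get("summary"))
```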

 Next, NIST recommends that you have visibility into all layers of your application, not just the base layers. An evaluation for vulnerabilities needs to be done at every level, including the final product. Also, any open source or custom frameworks or software that’s being used with an application needs to be checked for vulnerabilities. This also suggests that reporting and monitoring should be centralized across your entire organization.

Remember earlier when I mentioned that we would be looking at personas like cyber security and audit? This type of centralized reporting enables those stakeholders to better understand the risk posture of your organization. This reporting should include the vulnerabilities detected and highlight the ones that do not meet your organizational standards.

Finally, NIST recommends that quality gates are inserted at each stage of the build and deployment cycles. NIST is not very prescriptive on which specific quality gates should be implemented; that is left up to your organization’s standards. But it’s good to keep in mind where there may be gaps in your own builds and releases. Not only should you have quality gates, but when the quality standards are not met, the build process should be halted and any deficiencies remediated before proceeding.
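Building on that, a hedged sketch of a severity-based quality gate might look like the following; the blocking-severity policy is an assumption to adapt to your organization’s standards, and the issue list would come from your scanner (for example, the Xray summary sketch above).

```python
import sys

# Example organizational policy: adapt the blocking severities to your standards.
BLOCKING_SEVERITIES = {"Critical", "High"}

def quality_gate(issues: list) -> None:
    """Halt the pipeline stage (non-zero exit) if any blocking issue is present."""
    blockers = [i for i in issues if i.get("severity") in BLOCKING_SEVERITIES]
    if blockers:
        for issue in blockers:
            print(f"BLOCKED  {issue.get('severity')}: {issue.get('summary')}")
        sys.exit(1)  # a non-zero exit stops the build until remediation
    print("Quality gate passed.")

if __name__ == "__main__":
    # Hypothetical scanner output used for illustration only.
    quality_gate([{"severity": "Critical", "summary": "example critical CVE in a base layer"}])
```

Next, let’s look at configurations.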

NIST recommends that application configurations are validated against predefined settings and that organizations should use tools and processes to enforce configuration compliance.

When evaluating software configurations, it’s important to consider best practices and recommendations, like those from the open source community or from third-party vendors. It’s also important to think about how software configurations are monitored when the code is built and stored in Artifactory.
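As a minimal sketch of enforcing configuration compliance in a pipeline step, the snippet below compares an application’s configuration against a predefined baseline; the baseline keys and the app-config.json file name are hypothetical examples of organizational standards.

```python
import json

# Predefined settings your organization enforces (hypothetical examples).
BASELINE = {
    "tls_enabled": True,
    "anonymous_access": False,
    "log_level": "INFO",
}

def check_compliance(config: dict) -> list:
    """Return a (key, expected, actual) tuple for every point of drift."""
    return [
        (key, expected, config.get(key))
        for key, expected in BASELINE.items()
        if config.get(key) != expected
    ]

if __name__ == "__main__":
    with open("app-config.json") as fh:  # hypothetical config artifact
        drift = check_compliance(json.load(fh))
    for key, expected, actual in drift:
        print(f"NON-COMPLIANT: {key}: expected {expected!r}, found {actual!r}")
```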

For software configuration, it’s also recommended that you have centralized monitoring and reporting.

Again, this would be beneficial for people in your security and audit divisions, so that they are aware of the overall compliance posture of your organization. But in addition to just having the monitoring, it’s important that the thresholds are continuously kept up to date as standards and processes evolve.

 One thing to consider that NIST recommends is to prevent the software from running if it’s determined to not be in compliance with the configuration standards.

This enforcement can come from Artifactory, from a quality gate in the build pipeline, or even at runtime.

Also, NIST suggests that applications be sourced only from trusted sources.

This may be your Artifactory implementation for any internally built software packages, but it’s also important to consider the remote repositories configured in Artifactory.

 You must be sure that any open source sites or vendor provided sites are trusted when sourcing packages or dependencies from them.
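One hedged way to keep an eye on this is to audit the configured remote repositories against an allowlist, assuming Artifactory’s repositories endpoint (GET /api/repositories?type=remote); the host, token, and trusted URL prefixes below are placeholders.

```python
import requests

BASE_URL = "https://artifactory.example.com/artifactory"  # hypothetical host
TOKEN = "YOUR_ACCESS_TOKEN"                               # hypothetical credential
TRUSTED_PREFIXES = (                                      # example allowlist
    "https://registry.npmjs.org",
    "https://repo.maven.apache.org",
)

def audit_remote_repositories() -> None:
    """Flag any remote repository pointing at a source outside the allowlist."""
    resp = requests.get(
        f"{BASE_URL}/api/repositories",
        params={"type": "remote"},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    for repo in resp.json():
        url = repo.get("url", "")
        if not url.startswith(TRUSTED_PREFIXES):
            print(f"UNTRUSTED SOURCE: {repo['key']} -> {url}")

if __name__ == "__main__":
    audit_remote_repositories()
```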

 And NIST recommends that a minimalistic approach is taken when building software. This helps reduce any potential attack surfaces.

Remember those 20-gig Docker images in your Artifactory registry? It might be time to think about what is really contained in them and what’s really needed. This might seem pretty logical, but all the secrets and sensitive data used by applications should be stored externally to them.

Actually, NIST recommends that secrets are injected and provided at runtime. Approved vaults and secrets management processes should be leveraged in organizations, and native capabilities of orchestrators or providers should also be leveraged. Additionally, secrets should be provided only to the applications that need them, and should be encrypted at rest.
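As a minimal sketch of that pattern, the snippet below reads a secret that an orchestrator or vault integration injects at runtime, rather than baking it into the binary or image; the DB_PASSWORD variable name is a hypothetical convention, not any specific vault product’s API.

```python
import os
import sys

def get_database_password() -> str:
    """Read a secret injected at runtime by the orchestrator or vault agent."""
    secret = os.environ.get("DB_PASSWORD")  # hypothetical variable name
    if not secret:
        # Fail fast rather than falling back to a value baked into the image.
        sys.exit("DB_PASSWORD was not injected; refusing to start.")
    return secret
```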

Under this same standard, you should evaluate how secrets are detected and managed within your Artifactory instance.

What processes are in place to ensure that developers in your organization are using the proper vaults, and not accidentally using Artifactory by baking a secret or some sensitive data into a binary or a Docker image?

Next, let’s dig a little more into trust. NIST recommends that organizations centralize all of the packages and registries that are allowed to operate in the network. Artifactory, of course, provides us with some of the tooling and capabilities to achieve this, but you should establish the ground rules of what it actually means to trust both open source remote sites and external vendor sites. Also, applications should be able to be identified with a cryptographic signature.

In the use case of Docker and containers, you should look at using tools like Docker Notary in your environments. For other package types, you will need to evaluate how you establish signatures and validate them. Signatures should be validated before the software packages are actually executed. This helps ensure that packages have not been tampered with or compromised, and can run as trusted packages.
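For non-container package types, one hedged example is validating a detached GPG signature with the standard gpg --verify CLI before a package is used; the file names below are hypothetical.

```python
import subprocess
import sys

def verify_signature(package: str, signature: str) -> None:
    """Abort unless the package's detached GPG signature verifies."""
    result = subprocess.run(
        ["gpg", "--verify", signature, package],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        sys.exit(f"Signature check failed for {package}:\n{result.stderr}")

if __name__ == "__main__":
    # Hypothetical package and detached signature file names.
    verify_signature("my-app-2.3.tar.gz", "my-app-2.3.tar.gz.asc")
```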

 Enforcement policies and controls should be added to ensure that your hosts and orchestrators only run applications that come from your trusted and approved sources. This is another example where Artifactory can help ensure that trust can be established within your repositories. But additional actions are necessary in your environment to ensure that only Artifactory is used for sourcing your software packages.

Sometimes it goes without saying, but packages and repositories in Artifactory should be regularly monitored and updated as things change. For example, new vulnerabilities may be detected, making some repositories and packages less secure. Also, configuration requirements evolve and change in your organization, and application packages need to be maintained to ensure that compliance goals are still met.

When things do change, and new vulnerabilities are discovered or configuration standards are updated, the old versions of applications should be pruned and removed from your repositories and registries. This process should be automated, and can be based on a time threshold, specific labels or metadata, or the actual vulnerability or configuration posture of the applications; whatever works best for your organization, it’s recommended to remove the old binaries from your repos. Also, pruning doesn’t necessarily mean deleting. I’d recommend moving those old and unused binaries into an offline repository that is not accessible by the general developer community within your organization.
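Here is a hedged sketch of that kind of automation, assuming Artifactory’s AQL search (POST /api/search/aql) and move (POST /api/move/{path}) endpoints; the repository names, 180-day retention window, and credentials are placeholders to adapt.

```python
import requests

BASE_URL = "https://artifactory.example.com/artifactory"  # hypothetical host
TOKEN = "YOUR_ACCESS_TOKEN"                               # hypothetical credential
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Find artifacts older than the (hypothetical) 180-day retention window.
AQL = (
    'items.find({"repo": "libs-release-local", '
    '"created": {"$before": "180d"}})'
)

def prune_to_offline_repo() -> None:
    """Move stale binaries into an offline archive repo instead of deleting."""
    resp = requests.post(
        f"{BASE_URL}/api/search/aql",
        data=AQL,
        headers={**HEADERS, "Content-Type": "text/plain"},
        timeout=60,
    )
    resp.raise_for_status()
    for item in resp.json().get("results", []):
        src = f"{item['repo']}/{item['path']}/{item['name']}"
        # Move (not delete), keeping the binary recoverable for forensics/replay.
        requests.post(
            f"{BASE_URL}/api/move/{src}",
            params={"to": f"/offline-archive-local/{item['path']}/{item['name']}"},
            headers=HEADERS,
            timeout=60,
        ).raise_for_status()

if __name__ == "__main__":
    prune_to_offline_repo()
```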

If something were to happen, it should be easy to recover a binary for forensic analysis, to replay a release, or to dig deeper into some defect testing.

Next, the operational practices of your organization should put emphasis on using software binaries with immutable names that specify discrete version numbers. So rather than configuring a pipeline to download a file called my-app, for example, use the version number, like my-app-2.3, to ensure that the correct and known-good version of the application is being used in the pipeline, build, and release. This is also where use of “latest” needs to be carefully evaluated. “Latest” does not always align with a discrete version number of your software, and it’s not always guaranteed to be the exact most up-to-date version that has gone through testing cycles. So just make sure that your build and release automation is using the proper version of software that it intends to use.
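As a small illustration, a pipeline helper might simply reject mutable references before a download or pull; the naming convention below is hypothetical.

```python
def resolve_version(reference: str) -> str:
    """Reject mutable references so builds only use discrete, tested versions."""
    tag = reference.rsplit(":", 1)[-1] if ":" in reference else ""
    if tag in ("", "latest"):
        raise ValueError(f"{reference!r} is not pinned to a discrete version")
    return reference

if __name__ == "__main__":
    print(resolve_version("docker-local/my-app:2.3"))  # accepted: pinned version
    try:
        resolve_version("docker-local/my-app:latest")  # rejected: mutable tag
    except ValueError as err:
        print(f"Rejected: {err}")
```

Now let’s take a look at a different aspect of security that deals with authentication and authorization.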

One of the key aspects in the recommendations from NIST is to understand where your sensitive and proprietary information is located; all access to your sensitive data should require authentication. That means that if you’re storing, or even have the potential to store, sensitive data in Artifactory, you should ensure that anonymous access is deactivated, and that every user, whether a human user or a system user, is required to present credentials to access the data and binaries in Artifactory.
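As a hedged sketch of auditing this, the admin-only configuration endpoint (GET /api/system/configuration) returns the artifactory.config.xml document, which carries an anonAccessEnabled flag; the host and token below are hypothetical placeholders.

```python
import requests
import xml.etree.ElementTree as ET

BASE_URL = "https://artifactory.example.com/artifactory"  # hypothetical host
TOKEN = "YOUR_ADMIN_TOKEN"                                # hypothetical credential

def anonymous_access_enabled() -> bool:
    """Check the anonAccessEnabled flag in the Artifactory configuration XML."""
    resp = requests.get(
        f"{BASE_URL}/api/system/configuration",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    root = ET.fromstring(resp.text)
    # The config XML is namespaced, so match on the local tag name.
    for elem in root.iter():
        if elem.tag.endswith("anonAccessEnabled"):
            return (elem.text or "").strip().lower() == "true"
    return False

if __name__ == "__main__":
    if anonymous_access_enabled():
        print("WARNING: anonymous access is enabled; consider disabling it.")
```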

Another best practice is to limit who can push packages to Artifactory. Authentication should be required not just for downloading, but also when publishing packages. And in addition to authentication, proper authorization to publish packages should be validated.

With authorization, access should be further restricted with fine-grained permissions and least-privileged access. This means that a user should only be allowed to publish to their own spaces and repositories within Artifactory. It’s good to make use of permissions and groups in your Artifactory instances for limiting access within the system. NIST also recommends integrating authentication and authorization capabilities with your organization’s directory services. This is going to be something like your company’s Active Directory services. For example, you can configure LDAP in Artifactory for authenticating credentials, but it can also be used to ensure that users are in the correct LDAP groups to be able to publish to some of your repositories. And finally, NIST recommends that reads and writes should be logged and audited.

 You need to make sure that users who publish packages continue to need that authorization. For example, if a user changes roles in your organization, that person may no longer need access to produce artifacts, and access should be revoked. In addition, where you may have sensitive data stored, anyone who is accessing that information and downloading it should also be logged and audited.
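For the download side of that audit trail, here is a minimal sketch, assuming Artifactory’s file statistics endpoint (GET /api/storage/{repo-path}?stats); the repository path, host, and token are hypothetical.

```python
import requests
from datetime import datetime, timezone

BASE_URL = "https://artifactory.example.com/artifactory"  # hypothetical host
TOKEN = "YOUR_ACCESS_TOKEN"                               # hypothetical credential

def download_stats(repo_path: str) -> dict:
    """Fetch download statistics (count, last downloader) for one artifact."""
    resp = requests.get(
        f"{BASE_URL}/api/storage/{repo_path}",
        params={"stats": ""},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    stats = download_stats("secure-local/reports/q3.tar.gz")  # hypothetical path
    if stats.get("lastDownloaded"):
        last = datetime.fromtimestamp(stats["lastDownloaded"] / 1000, tz=timezone.utc)
        print(f"Last downloaded by {stats.get('lastDownloadedBy')} at {last}")
    else:
        print("No downloads recorded.")
```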

Logging and auditing help ensure that everyone accessing the data is authorized, and they prevent unwanted access to the data and packages.

 Next, another one that seems like it goes without saying, but NIST includes in their best practices that connections to registries and repositories should take place over encrypted channels. This includes all of the development tools within your organization, as well as the orchestrators and runtime environments.
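On the client side, a minimal sketch of enforcing this might simply refuse plain-HTTP endpoints and keep certificate verification on; the function name is a hypothetical convention.

```python
import requests

def fetch_over_tls(url: str) -> bytes:
    """Download only over HTTPS, with certificate verification enabled."""
    if not url.startswith("https://"):
        raise ValueError(f"Refusing unencrypted endpoint: {url}")
    resp = requests.get(url, verify=True, timeout=30)  # verify=True is the default
    resp.raise_for_status()
    return resp.content
```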

Encrypted connections should also extend from your Artifactory instance to external and third-party remote sites. Each tool and capability will have different ways to be configured, but the key goal is to ensure that all data pushed to and pulled from Artifactory travels between trusted endpoints and is encrypted in transit.

So to recap what we reviewed today: first, we reviewed what NIST is. Remember, the acronym stands for the National Institute of Standards and Technology. Then we looked a little into the history of NIST. And finally, we reviewed some of the best practices and recommendations of NIST, and how they impact different aspects of your organization and its software delivery practices, including Artifactory. This takes us to the end of the presentation for today.

 I again want to thank everyone for attending my session and thanks to JFrog for giving everyone an opportunity to attend SwampUP virtually this year.

 Have a great rest of your day and enjoy the rest of the conference.

 
