The Road to Software Supply Chain Security Compliance
In this episode, host Paul Roberts chats with Steve Lasker, a former Azure Principal Program Manager with over 20 years of experience at Microsoft. Lasker drew on his industry experience to explain how the effort to secure software has evolved into what it is today.
EPISODE TRANSCRIPT
PAUL ROBERTS
Welcome to another episode of ConversingLabs. This is ReversingLabs' podcast, where we talk about all things threat hunting, malware intelligence, supply chain security, and infosec. And we are really pleased to have our guest with us today, Steve Lasker. Steve, to start off our conversation as we do, why don't you introduce yourself to our audience? And also, if you could, give us a sense of your journey into the information security space.
STEVE LASKER
Yeah, thanks. It's a pleasure to be here. Thank you for having me, Paul. I've been really impressed with the work ConversingLabs has been doing and the ReversingLabs products and services you guys are offering. So it's an interesting journey. For the last 20 years or so, I was working at Microsoft, and one of the last sets of things I had been working on was the Docker container tooling, for development through production. And then for the last six years or so, I had been working on the Azure Container Registry, running that service. Working in the industry for a while, we're seeing a reboot of the virtualization stack; that's where the Docker container ecosystem comes in. And it's not just a production thing, right? It's development through production. And it was interesting to observe the trends that were happening. There was the amazing productivity that came with this shift in the industry, to expect more for less. But there was also this other interesting phenomenon happening at the same time, where the way we're achieving this great productivity is pulling things in from the Internet, whether it be containers or, when I'm building my containers, pulling in the latest version of some package. And that was a pretty major shift from the days where we were pulling in software from vendors that we trusted, whether it be control vendors or libraries and so forth. They were doing great due diligence on their software because their company was at risk. So watching all of this, we realized that we really needed to apply the standards that exist, right? Companies have expectations for how their software should be verified and tested and meet certain compliance. And none of that was really being done. Around 2018, we started the effort around the Notary v2 work, where we said, look, the best practice is to bring the content you depend on to your environment.
There was no good signing technology that worked for that content at the time. It depended on the content being in the same place. So if I copied it to my registry, I no longer had that seal of integrity on it. So that's where we started that project: making sure that I can put identities around who built these particular packages. We were working with a great group of people across all clouds and vendors, and we had a great collaboration group there. So that's kind of how I got pulled into the space: seeing the need and the opportunity, not just for the Azure customers at the time, but because Microsoft is a software company, not just a cloud provider. How does Microsoft ship its software to run on its competitors' clouds and so forth? So it was a great opportunity to engage in the open source community and figure out how we bring the bar, the expected bar, back to this new generation of the way software is built, consumed, and distributed.
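The "seal of integrity" idea Lasker describes can be sketched in a few lines. This is a hypothetical, simplified illustration, not the Notary v2 design: real systems use asymmetric signatures and certificate-based identities, while this sketch uses an HMAC as a stand-in so it stays self-contained. The point it shows is that a detached seal over the artifact's digest travels with the content, so verification still works after the bytes are copied to a different registry.

```python
import hashlib
import hmac

# Placeholder key for illustration only; real signing uses asymmetric keys
# tied to a publisher identity (e.g. in Notary v2 / notation).
SIGNING_KEY = b"publisher-secret-key"

def seal(artifact: bytes) -> dict:
    """Produce a detached 'integrity seal' over the artifact's digest."""
    digest = hashlib.sha256(artifact).hexdigest()
    signature = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return {"digest": digest, "signature": signature}

def verify(artifact: bytes, s: dict) -> bool:
    """Re-derive the digest locally and check the seal.

    Because the check depends only on the bytes and the seal, it holds no
    matter which registry the artifact was copied to.
    """
    digest = hashlib.sha256(artifact).hexdigest()
    if digest != s["digest"]:
        return False  # bytes were tampered with in transit or at rest
    expected = hmac.new(SIGNING_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, s["signature"])

package = b"example-container-layer-bytes"
s = seal(package)
print(verify(package, s))                # True: seal holds after any copy
print(verify(package + b"tampered", s))  # False: contents changed
```

The design point is that the seal is detached from any one storage location, which is exactly what was missing when signing depended on "the content being in the same place."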
PAUL ROBERTS
So, I mean, you had a 20-year stint at Microsoft and ended up there as a principal program manager for Azure. A really interesting time period to be at Microsoft, from the early 2000s, right, the trustworthy computing memo era, and this big, organization-wide shift in focus on application security, and then, as you said, 20 years that really saw the evolution from desktop to cloud. If you could, talk about that experience and your journey, your time spent at Microsoft, and some of the changes you saw there. How do you see those playing out in the conversations we're having now about technology use and adoption and security?
STEVE LASKER
Yeah, it's interesting. So if you wind back quite a bit, even to the days when Bill Gates and Steve Ballmer were running the company, the industry was in a different space, right? You had the Microsofts, the Oracles, the SAPs competing for holistic platforms. All your apps and services should be built on Windows. All your apps and services should be built with Oracle apps, and so forth. And every win was a loss to the other. So it was a really different marketplace. Shift to today, and there's much more interoperability. I use a little bit of Azure, I use a little AWS. I might be using not just software vendors but open source projects as well, as a way to consume additional capabilities. So that paradigm shift is a very interesting one. And the ability to pull in anything from the Internet at any time really changed what's available to the industry, and also the expectations around how fast people need to get updates out. The days when we would build something and take two or three months to test and verify and roll it out before it's actually consumed, that's just laughed at now. So that really shifted the landscape. Where humans were doing testing, there was built-in stability to some extent, because software was coming from vendors whose businesses were at stake. That shifts now, where you can pull in anything from any place at any time, and we don't have time for humans to do the testing and verification. So I think that brought about a pretty significant change that just happened to occur while I was at Microsoft. The other thing that was interesting, because Microsoft is a big company and I was able to move around different divisions and portions of the company: when Azure was first getting started, I had the opportunity to work on its billing system. It was one of the largest billing systems in the world.
Telcos were second, but they were separated by country. And one of the areas I was working on was our payment gateway, considering the risk and fraud aspects of what it takes for Xbox consumers, because there's a huge risk and fraud situation there where gamers are very motivated, compared to corporate entities and what they are trying to do or who they are representing, because in that case they're actually companies as opposed to individual gamers. So it was another interesting area to see the levels of risk and fraud happening at a payment gateway, and how you check and verify these things. In a lot of places it's very similar to how we're trying to verify the sources of content: not just the source code we are consuming, but who are the entities behind that source code. And I'm specifically using the word entities because people come and go. Do I really want to trust an individual, or do I want to trust a collection of individuals that might shift over time? So it's just an interesting aspect: as I think about how we want to verify the software we're consuming, packages, completed software, operating systems, how do you assure that it wasn't tampered with? Is there some kind of integrity seal on it? And what is the supporting information around it, so that I can do some efficient checks to know that they already did some due diligence? They already checked that it doesn't have any vulnerabilities in it, but when did they last check? Because all of this is time-shifted information. If they checked it last month, that's great, but we learn a lot in a month. So how can I get updated information, or know when that work was stated to have been done, and who stated it? Because if I find some random piece of software on a USB stick, as was the old thing, or on some endpoint on the Internet, I don't know who built that thing.
If I find an SBOM for it, who actually assembled that SBOM? How do I know that SBOM actually comes from the vendor entity that built that piece of software or package? If I have a statement that says it meets some compliance standard or it was scanned, how do I know that entity is also genuine? Because otherwise I'm just finding pieces of paper on the floor that I'm supposed to trust. I want to be able to read some information about it and know that it comes from an identity that I trust for my environment. Because they might be trustworthy, but they might not meet the government-level or financial-services-sector requirements that I need for my environment.
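The two checks just described, who issued a statement and when it was last produced, can be sketched as a simple policy gate. This is an illustrative sketch only; the field names (`issuer`, `issued_at`) and the trust list are assumptions, not any real SBOM or attestation format.

```python
from datetime import datetime, timedelta, timezone

# Identities this environment has chosen to trust; the names are
# hypothetical placeholders.
TRUSTED_ISSUERS = {"vendor-a.example", "scanner-b.example"}

# Scan results are time-shifted information: "we learn a lot in a month,"
# so cap how stale a statement may be before it must be refreshed.
MAX_AGE = timedelta(days=30)

def statement_is_usable(statement: dict, now: datetime) -> bool:
    if statement["issuer"] not in TRUSTED_ISSUERS:
        # A statement from an unknown identity is just paper on the floor.
        return False
    age = now - statement["issued_at"]
    return age <= MAX_AGE

now = datetime(2023, 3, 1, tzinfo=timezone.utc)
fresh = {"issuer": "scanner-b.example",
         "issued_at": datetime(2023, 2, 20, tzinfo=timezone.utc)}
stale = {"issuer": "scanner-b.example",
         "issued_at": datetime(2022, 12, 1, tzinfo=timezone.utc)}
unknown = {"issuer": "random.example",
           "issued_at": datetime(2023, 2, 28, tzinfo=timezone.utc)}
print(statement_is_usable(fresh, now))    # True
print(statement_is_usable(stale, now))    # False: needs a fresh scan
print(statement_is_usable(unknown, now))  # False: untrusted identity
```

In a real deployment the issuer check would be backed by signature verification against the issuer's key, not a string comparison; the sketch only shows the policy shape.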
PAUL ROBERTS
Yeah, well, you mentioned government level too. And I mean, I think one of the things we've seen in the last 18 months, right, is that the Federal Government, the Biden administration, has kind of taken the lead on trying to answer some of these questions, at least for vendors who are selling into the federal government space, which of course is Microsoft and many other vendors, with the executive order and some of the follow-on guidance that's come out of the Office of Management and Budget and so on. What's your sense on whether that's going to move the needle at all? They're talking, like you said, about using software bills of materials, or having them. They're talking about these issues around software provenance and so on. Is that going to make a difference, move the needle, or not?
STEVE LASKER
Yeah, look, this is a very interesting space, because security and standards like this are not direct value to companies. Companies are selling their software. When they're trying to sell their widget, what differentiates their widget is not necessarily security. The assumption is they're doing best practices on security, but are they? So I kind of view this like taxes and insurance, where companies need to do it. The more mature the company, or the more mature the people at the company, the more they will remind them: you need to do this kind of stuff. Because a very healthy company can, in an hour, be out of business and have huge liabilities because something made it through. And it could have made it through because a human made a mistake, or a human was emotional, or a nation-state actor was involved.
PAUL ROBERTS
Business email compromise, right? I mean, that's tried and true, but it's a very effective way of getting your hands on corporate bank accounts.
STEVE LASKER
Yeah. So I think what's interesting around the government standards is that it's not just the U.S. government, right? The U.K. has set up standards, France has; every country is developing its own set of standards, because it's not just their government software. But I think it's the balance of what I refer to as the carrot and the stick. The stick holds requirements over you: you must do these things. And then it's the question of, all right, I must do these things, how easy is it to get it done? That's the carrot. The government, by setting a set of standards out there, says these are the levels of expectations. There are lots of software companies that want government contracts, so they're going to instrument their software to meet these compliance standards, and then it's up to the ecosystem to build the best tooling, the best products, the best services that continue to meet and exceed those standards. There's a minimum bar, just like when we drive cars in the U.S. there's a minimum requirement for insurance you have to have. There's going to be a minimum set of requirements, and then any vendor of any size will be smart enough to know you don't go for the minimum; you go with the companies doing better than that, because the liability is just far too risky. I point to the Tylenol incident back in 1982 as a really interesting moment. The standard at the time, which you now think is just absurd, was that we had medications and other food on the shelf where you could just open up the cap. In this case someone dropped in some cyanide-laced capsules, put the cap back on, and walked away. And several people died a pretty horrible death as a result. And in an instant, Johnson & Johnson's name was gone, evaporated, in addition to the liabilities. But that was the standard at the time.
PAUL ROBERTS
That's right. And as a result of that, there were changes, right, that Johnson & Johnson made, as well as pretty much every other drug maker: childproof caps and secure caps and packaging to prevent that type of tampering. You wrote a blog post on this, "Unsealed and Delivered," where you talk about that. You look back at analogs in the physical supply chain and attacks on them, and the lessons learned from how more traditional product vendors responded to those. At a high level, what is the takeaway from that? What do software publishers and the software industry have to learn from how other product manufacturers have responded to these types of threats?
STEVE LASKER
Yeah, absolutely. And I like making analogies to existing infrastructure that's already built; it's tried and tested, and there are always subtle nuances, but the way the physical supply chain works, whether it be food or the parts that go into cars that get assembled, is very similar to how software works. The idea that I could instantly pull something across the Internet into my production environment? Sure, it's possible, but how do I know, one, that the hundreds of thousands of connections between my production service and that endpoint are going to operate without fault? That stuff happens. How do I know that an update that was well intended doesn't actually break my environment? Right. The largest vulnerability cases are actually updates to packages. It's not the initial software; that's kind of known. You see "evil.com," you don't install the software. But our favorite, SolarWinds, or any of these others: they're usually updates that come through, and in those cases they were signed by the original company. So on to the best practices. We bring milk and food into our home. We put it in storage, it's there when we need it, we replenish when we need to. Manufacturers go out and get parts, and when they bring them in, they're doing quality control. They make sure the stuff they're bringing in meets their expectations. So the analogy here is: why aren't we doing the same thing for software? Let's leverage the Internet, bring in copies of the latest thing, do some testing on it, make sure it meets the... First of all, does it come from the entity we believe it comes from? That's the integrity seal. Is it still the same? Because it's not just one endpoint; we bounce things across dev, staging, prod, and so forth. Is it still original? Is it still authentic? When was it last checked? What are the timestamps associated with it, and with when the SBOMs and the scan results were produced? And then every update is a change, an intentional change.
Is that change one that will cause a break in my environment? So you want to do functional testing. Is there a change in there where a human made a mistake and some code could actually go awry and cause an outage or a vulnerability? Or did somebody, accidentally or intentionally, pull in a package that is doing bad stuff? Bad stuff could be a wide variety of things. So I think it's really a matter of discipline. Companies need to start putting those disciplines in place, and it doesn't mean that humans need to physically scan everything, right? We're not saying that. We're saying just apply good automation practices for testing and verification, as you do for your own build environments and deployments. So it's just a matter of writing more automation. I like to say nothing's free. So if you're getting some free software from the Internet, or wherever you're getting it from, invest in putting best practices in place, including scanning for bad patterns. And that's where ReversingLabs has got some amazing technology. It's not necessarily knowing that this particular package is known to be vulnerable, because that's just a matter of doing due diligence. It's catching stuff before you know about it. So what can be done to look for patterns, to know I shouldn't deploy this thing because there is a small piece of a bad pattern embedded in this software that hasn't been caught yet?
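The "bring it in, verify it, then use it" discipline described above can be sketched as a small quarantine gate. This is a hedged illustration: the package name, pinned digest, and both scanner functions are stand-ins invented for the example; in practice the scan hooks would call real tooling.

```python
import hashlib

# Hypothetical pinned digest recorded when the dependency was first vetted.
PINNED = {"left-pad-1.3.0.tgz": hashlib.sha256(b"left-pad contents").hexdigest()}

def scan_for_known_vulns(blob: bytes) -> bool:
    # Stub: stand-in for a real vulnerability scan of the artifact.
    return b"known-bad" not in blob

def scan_for_bad_patterns(blob: bytes) -> bool:
    # Stub: stand-in for behavioral / pattern analysis that catches
    # suspicious content before it is a known CVE.
    return b"eval(base64" not in blob

def promote_from_quarantine(name: str, blob: bytes) -> bool:
    """Only promote a dependency once it passes every automated check."""
    digest = hashlib.sha256(blob).hexdigest()
    if PINNED.get(name) != digest:
        return False  # not the artifact we pinned: fail closed
    return scan_for_known_vulns(blob) and scan_for_bad_patterns(blob)

print(promote_from_quarantine("left-pad-1.3.0.tgz", b"left-pad contents"))  # True
print(promote_from_quarantine("left-pad-1.3.0.tgz", b"swapped contents"))   # False
```

The design choice worth noting is failing closed: an unexpected digest blocks promotion even before any scan runs, which is what catches a well-intended but unvetted update, or a typosquatted substitute, at the door.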
PAUL ROBERTS
Right, and that is the sort of crawl-walk-run type situation. There's a tremendous amount to be done at the crawl and walk stages, just around making sure you haven't fallen for a typosquatting attack and accidentally pulled in a malicious package that was just named to look like the package you wanted. But then there's the SolarWinds scenario, right, at the run stage, where everything checks out except that your development environment got hacked by a nation state and there's a malicious behavior in there that you're unaware of. But you need to catch that, obviously, before you ship it out to your customers, because it's signed, it's coming from your update servers, and for all intents and purposes, for them, that means it's good to go. And what's interesting to me about the analogy you make in your blog post, between physical supply chains and software supply chains, is that it's similar to a point Jen Easterly at CISA made recently in a speech, where she talked about the auto industry, right, and auto safety: airbags and seatbelts, and all the improvements in auto safety in the last century that have turned automobiles from basically death traps, if you were unfortunate enough to get in an accident in them, to incredibly safe products. When you go back and look at that history, though, those changes were the result of regulations, right? First federal regulations that mandated seatbelts, and then state-level laws that mandated that people actually use the seatbelts. Right. And you and I are probably of that generation where the seatbelts were in the car but nobody wore them. Right. And there was this period from the 60s to the 80s where that was kind of the thing.
STEVE LASKER
It was a federal requirement, but it wasn't...
PAUL ROBERTS
But nobody had to use them, and culturally nobody did. But then there were these state laws, and people started to wear them. But we don't have that yet in the software space. And I'm wondering, we're talking a day after the Biden administration put out their cybersecurity strategy document, a really interesting and important policy proposal: what do you think is the right mix there of carrot and stick from the law and regulatory standpoint to really get companies moving? Because the automakers obviously did not want to put seatbelts in their cars. They fought it for years, successfully. So what is the right carrot and stick approach, I guess, to get the software industry moving in the same direction as the automobile industry, where now, in 2023, safety and airbags are selling features of vehicles? Right.
STEVE LASKER
And what's interesting is you have the carrot and the stick, and the other part is culture. Right. It became cool to... I don't know if it's cool to wear seatbelts per se, but some places it is. Right. You see people driving around with five-point harnesses in their car. They're regular day drivers; obviously, they're probably also on the track.
PAUL ROBERTS
You don't feel safe if you're not wearing it. I don't know about you; I feel that way. Like if I don't have it on, I'm like, I've got to get my seatbelt on.
STEVE LASKER
No, we talked about this at a different time. I swapped out the stereo in my car, back when you used to do that, and I had to pull the seats out to get the amp underneath and everything. And I'm in the car before I've bolted the seatbelt back in, just trying to see if it was stable. And I was literally just shaking in nervousness, because I just didn't have the safety pieces in place. Right, but that is an instilled culture that shifts it from fighting the evil regulatory standards to, no, this makes sense. Like, I have friends that have died, I have situations, I have companies that I know that have gone out of business. I know of these vulnerabilities that have happened. But what you also saw is this very interesting ecosystem of government and private sector iterating back and forth, where they keep on raising the bar. To some extent, look at the space industry, right? NASA had to initiate a lot of these efforts because it just wasn't cost effective at first, so they invested themselves. Now you have the private sector doing much better innovations to bring the industry forward. So you've got to do something, you've got to start somewhere. So I think having those regulatory standards there will be the bar. Again, this is taxes and insurance. Every company that's trying to do its business knows it has to do this, but what's the checklist? If I need to hand it to somebody, because it's not core to my product innovation, but I need to give it to somebody responsible for making sure we're adhering to the standards, what is that list? We saw this with COVID. Companies were like, look, just tell me what to do. And so the CDC would put out guidance, and forget all the details of how that played out in the sense of the politics of it. The point is, you still saw the need, right? You saw the companies saying, please just tell me what to do.
And for us traveling to conferences, we were going to cities and countries and conference centers, and the staff there couldn't make up their own standard, because they'd just get attacked; they could just point to whatever the government standard was for different countries. We went to conferences saying, look, we're just following the standards here, the local standard. The local, right? Well, that's actually a very good point. So I think this is critical: set the bar with the stick while the carrots are developing. And you bring up a really good point of how we do this across localities, localities being cities, states, countries... We're a global economy. At some point we'll be a cross-planetary economy. We'll have to figure out timestamps across multiple planets, which will be an interesting problem.
PAUL ROBERTS
Do you want to talk about UAPs? Let's talk about UAPs. That'll be great for the numbers, well, for the downloads, for ConversingLabs, if you delve into that topic.
STEVE LASKER
I know, there are just so many different fun topics. We'll have to come back and talk more about it.
PAUL ROBERTS
200,000 downloads.
STEVE LASKER
So we were talking about the security aspects of where soft... Wow, I was going off on this other tangent of time zones and so forth. Oh, I know: the standards across countries.
PAUL ROBERTS
Standards across countries.
STEVE LASKER
Look, the different countries are going to have slightly different variations on it. We need a way to have interoperability, so that the different companies, the different security companies, can evolve and still work together. So this is where I've spent more of my recent time, now focusing on some IETF standards around secure supply chain integrity, transparency, and trust: the SCITT effort. The goal is to create an ecosystem for security companies, ReversingLabs and other security companies, companies building SBOMs, companies distributing supply chain information, where there are some standards, so that we can see an ecosystem of multiple projects, products, and services with a level of interoperability. Because if we don't have standards, then everybody's just trying to build up from the same base, and we don't really bring the ecosystem up as a whole. So that's where I'm pretty excited to see that evolve, because it gives us a way for multiple countries to contribute to those common elements. The way we think about SCITT is as a way to provide verifiable, identity-based information. SCITT itself doesn't care about the actual content; it wants to make sure there are secure pipes, so the things that are put in can be trustworthy. And then it's up to the consumers to decide: is that information relevant to their environment? But you have this common way of communicating. And that might be the way ReversingLabs is looking for vulnerabilities, or the way they're producing information so that consumers can pull that in. So that's where I think it's really powerful: the various projects can continue to raise the bar, but they're raising the bar while still able to communicate with each other, so that there is this growing ecosystem, no different than how we plug into outlets in the walls and there are standards there.
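The "secure pipes for identity-attributed statements" idea can be sketched as an append-only, hash-chained log. To be clear, this is a very rough illustration of the concept, not the SCITT architecture: it omits COSE signing, receipts, and transparency services, and the class and field names are invented for the sketch. What it shows is that registration makes each statement tamper-evident, while the log itself stays agnostic to the statement's content.

```python
import hashlib
import json

class TransparencyLog:
    """Toy append-only log: each entry commits to the previous one."""

    def __init__(self):
        self.entries = []
        self.head = "0" * 64  # genesis value before any entry exists

    def register(self, issuer: str, statement: dict) -> str:
        # The log does not interpret the statement; it only binds it to an
        # issuer identity and a position in the chain.
        record = json.dumps({"issuer": issuer, "statement": statement,
                             "prev": self.head}, sort_keys=True)
        self.head = hashlib.sha256(record.encode()).hexdigest()
        self.entries.append((record, self.head))
        return self.head

    def verify_chain(self) -> bool:
        """Any edit to a past entry breaks the hash chain."""
        prev = "0" * 64
        for record, entry_hash in self.entries:
            if json.loads(record)["prev"] != prev:
                return False
            if hashlib.sha256(record.encode()).hexdigest() != entry_hash:
                return False
            prev = entry_hash
        return True

log = TransparencyLog()
log.register("vendor-a.example", {"artifact": "app:v1", "sbom": "sha256:abc"})
log.register("scanner-b.example", {"artifact": "app:v1", "scan": "clean"})
print(log.verify_chain())  # True: the log is internally consistent
```

Consumers then apply their own policy on top, deciding which registered issuers they trust for their environment, which is exactly the split Lasker describes: the pipes guarantee integrity, the consumer decides relevance.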
PAUL ROBERTS
So you mentioned Supply Chain Integrity, Transparency, and Trust, the SCITT project. IETF standards are always a somewhat lengthy process, but is there anything to report on that, or on when we might see some output or some guidelines from the IETF on this supply chain integrity issue?
STEVE LASKER
So it's true, standards do take time, because the intent is to get multiple parties to come to agreement. And it's not just agreement where they conflict; they're bringing in different perspectives. And that's a very valuable thing: how do you bring together a group of people collaborating in a positive direction, with different views and different experiences, so that you end up with a better solution? The IETF has been one of those standards bodies the Internet runs on, and there's been an amazing group of people we've been working with from all different corners of the industry, including multiple continents. We were debating how to spell artifacts, with an E or an I, because it depends on which side of the pond you're on. That said, the SCITT working group is, they've said, literally the fastest group that's worked through adoption in its history. And obviously there's a need; that's part of it. But there are some amazing people we're working with who have been in the IETF for years and know all the groups. It's, again, building on existing things. As we're having conversations in SCITT, we're talking about other working groups and other standards already in the IETF, so we don't need to reinvent everything from scratch. Why don't we leverage the other pieces that are there, including verifiable credentials, which is part of W3C, or other projects like SUIT and RATS and all kinds of other interesting project names. But I see it happening quickly. I'm seeing adopters, I'm seeing the specs evolve quickly with reference implementations. I think we'll see some things very quickly, without disclosing too much there, that will really set the bar for how companies can run a solution in their environment for what they need. Because I think a big part of security is how I run it in a secure environment as well as a public environment, because it's a mix of both.
PAUL ROBERTS
Right. Final question, Steve. There's often discussion of, I think Wendy Nather coined the term, the security poverty line, right? And I think sometimes the discussions about these types of issues, whether it's supply chain or what have you, tend to tilt towards more affluent, well-resourced companies. Microsoft, obviously, can spend whatever needs to be spent on supply chain. But there are so many software publishers out there, so many small companies, small businesses that don't have those resources, and yet they produce most of the stuff that's out there, most of the applications that you use, most of the software. As we're starting to deal in a more honest way with this supply chain issue and the threats and risks, do we need to be mindful of that? And how do we make these types of objectives and goals achievable for companies of all different sizes, not just the big, rich companies?
STEVE LASKER
No, absolutely. It's a great point, and I think this speaks exactly to what we've been doing with SCITT. There's the matter of every consumer. When I say consumer, I'm not talking about the end consumer, our friends and family who have nothing to do with tech and are just trying to get their apps on the phone. It's Microsoft consuming software. It's IBM consuming software to build its software, because everybody's packaging other stuff. Every time you're trying to consume something, what is the depth of analysis you're going to do? Are we going to do DNA analysis on every piece of software we look at? It's not just the cost. It's time. It's practicality. When you look at a product, you look at the side, you see a brand, you decide whether you trust that brand. You look at the ingredients on it. If I'm allergic to peanuts, then I'm going to make sure it doesn't contain peanuts. Do I know it doesn't contain peanuts? I'm trusting, right? I'm looking at just enough information. If I need gluten-free food, I'm going to look at the logos and so forth that are there. So it's a matter of building a sense of trust and being able to verify it, having enough information to make statements about the quality of the software. So now I get to smaller companies or smaller groups. There's a group in the IETF focused on emergency management, whether it be schools or fire, police, and hospitals and so forth. They don't have enough people to do high-tech evaluation. But what if there were an entity dedicated to making sure that all of the emergency management software, which is unique, is being tested and is stamped with an approval, because this entity is doing that work? Then every hospital or emergency management system is just making sure that the piece of software they're about to consume was tested within a certain time frame by an entity they're choosing to trust for their industry, and then they're good to go.
So that's how I see it scaling: from the really big financial firms that won't trust anybody, where the risk is so high that they're going to do their own testing because they can afford to, towards smaller groups which still want that but are delegating trust to another entity that does it on their behalf. So that's how I think this will scale. But part of it is that every one of those statements has to be trustworthy. So that's why we are focusing so hard on the verifiable identity work we're doing in SCITT, to make sure that when I pull something out, I can actually trust the information. Just like when I go to a notary. If you and I are exchanging the sale of a house, we have to go to a notary, and I have to prove my identity, and you prove your identity. The notary doesn't care about the contract; they're leaving that to somebody else. They're making sure that we are both who we say we are. Then, when somebody wants to analyze the contract, they can say, well, I know these are the people they say they are, because they were verified. Now they can just do the analysis on the actual contract.
PAUL ROBERTS
Right. Steve Lasker, it's been great having you on, and we look forward to having you on again and continuing to talk about supply chain security. Thanks so much for coming on ConversingLabs.
STEVE LASKER
Thank you for having me, Paul. ReversingLabs, I'm really impressed with the work you guys are doing, and it's an exciting space. Thanks.
PAUL ROBERTS
It's great talking to you. We'll do it again.