CSPS Virtual Café Series: A Conversation on Cyber Security, with Melissa Hathaway and Scott Jones (TRN5-V05)

Description

This event recording features a conversation with Melissa Hathaway and Scott Jones on the implications of events like the Stuxnet and 2014 Sony Pictures hacks, the consequences of paying versus not paying a cyber ransom, and what people and organizations can do to thwart cyber threats.

Duration: 01:08:54
Published: January 13, 2022
Type: Video

Event: CSPS Virtual Café Series: A Conversation on Cyber Security with Melissa Hathaway and Scott Jones



Transcript

Transcript: CSPS Virtual Café Series: A Conversation on Cyber Security, with Melissa Hathaway and Scott Jones

[The animated white Canada School of Public Service logo appears on a purple background. Its pages turn, opening it like a book. A maple leaf appears in the middle of the book, which also resembles a flag with curvy lines beneath. Text beside it reads: "Webcast | Webdiffusion." It fades away to a video chat panel. It's a close-up of a bald man with a trimmed beard and square glasses, Taki Sarantakis. He sits in a home library.]

Taki Sarantakis: Welcome to the latest instalment of the CSPS Virtual Café. I'm Taki Sarantakis, the president of the Canada School of Public Service.

[A purple title card fades in for a moment, identifying him as being from the Canada School of Public Service.]

Today we are talking about a very, very important issue, and that's cyberspace and, more specifically, cyber security. Like all of our virtual cafés, we try to make this not only educational, but also fun, because there is no law or rule that says that learning should be boring. In fact, it's the opposite. The more fun you can make something, the more prone you are to learn and to retain.

[Two more panels join the video chat, putting Taki's panel in the top left corner. On his right, a man with glasses and a grey suit, Scott Jones, sits in front of a Canadian flag and a logo reading "Canadian Centre for Cyber Security" in both English and French. In the bottom panel, a woman with a blond bob, glasses and a white pinstripe blazer, Melissa Hathaway, sits in a home office.]

Today, we have two very special guests to talk about cyber security, and they are two world-class experts. The first is Melissa Hathaway. Melissa is one of the top experts in cyber security in the world. She has done bipartisan work in the White House, working on cyber security for both the Bush and Obama administrations. She regularly speaks to elite audiences all over the world, including at Harvard University. She's currently the president of Hathaway Global Strategies. Melissa, welcome.

Melissa Hathaway: Taki, great to see you.

Taki: Our next guest... We're going to introduce our other guest indirectly. Melissa, this is actually a little bit of education for you. Now in the United States of America, the most famous bureaucrat in non-COVID times is typically a general, or an ambassador to the United Nations, or a long-standing head of the CIA. In Canada, the most famous bureaucrat is actually a gentleman named Dave Phillips, and he's a meteorologist in our environment department. I've always found it very Canadian that our most famous bureaucrat is somebody who tells us about the weather. I'm making a prediction today. I'm making a prediction that the person who succeeds Dave Phillips as the most famous bureaucrat in Canada will actually be the gentleman who is going to be opposite Melissa today. His name is Scott Jones. Scott, tell us what you do.

[Scott's panel fills the screen.]

Scott Jones: Thanks for that, Taki. I'm not sure I can live up to Dave's reputation and early morning media briefings, but I'm the head of the Canadian Centre for Cyber Security, which was created about two and a half years ago to be the front door for cyber security incidents and operations, and for advice and guidance for the federal government and across the country.

[All three panels return.]

Taki: Terrific. Today, we're talking about cyber security. In the bookcase behind me, I pulled out some books that are starting to pile up on my reading list.

[Taki shows off each book as he mentions their titles.]

I've got The Hacker and The State and I've got The Perfect Weapon. I've got- here's a scary title by a very brilliant man: Click Here to Kill Everybody. The Virtual Weapon. So Melissa, what's going on?

Melissa Hathaway: It's funny. I know Lucas and David. All of those authors are colleagues of mine.

[Melissa's panel fills the screen. A purple title card fades in for a moment identifying Melissa as being from Hathaway Global Strategies LLC.]

I think it's really important to put it into context. Over the last 30 years, we have connected our critical infrastructures, corporate networks, and everything to the Internet, to work toward a digital transformation. And it currently represents about 15 percent of the global economy- our digital economy. In 2020, we saw a rapid acceleration of this digital transformation because 85 percent of our companies had to prioritize becoming more digital.

[Taki and Scott's panels return to the screen.]

We had a 300 percent increase in remote work because we had the shutdown. We saw the turning to artificial intelligence and other things to increase customer interactions. We became even more dependent on the internet.

We see these as key initiatives for our countries: the digital economy and accelerating that digital transformation. I know that in Canada you're investing 180 billion dollars for the transformation of your core infrastructures. Here in the United States, we're making similar investments for health care and other things. Our countries are connecting a minimum of 127 new devices every second to accelerate the contactless society, because we're going to need to embrace technology even further as we go forward in the post-COVID era.

But the challenge with that is that many of these devices over the last 30 years were fielded with core vulnerabilities, with the principle of field it fast and fix it later. I'm kind of famous for coining the phrase 'Patch Tuesday leads to Vulnerable Wednesday'. Last year alone, there were 1,100 vulnerabilities patched by Microsoft and 1,600 by Oracle. It's a volume of vulnerability at the core of our everyday society that we just can't sustain.

In 2020 we had a 715 percent increase in ransom attacks, mostly going against our health care and the medical industry. We had an increase of 150 percent of distributed denial of service attacks, which knock you offline when you need to be online.

[Melissa's panel fills the screen.]

We saw a 600 percent increase in the Internet of Things attacks because-

[Melissa's screen dips through black.]

-those vulnerable devices that we're fielding every second are being exploited and hijacked to steal your personal information, to steal your intellectual property, to hijack your bank, etc.

The destructive and disruptive activities are increasing at a pace that is unsustainable for our governments and for our corporations. I worry about the intellectual property theft, the compromising of our medical research, the conducting of influence campaigns to undermine our democratic processes, the disrupting of our critical infrastructures and our core businesses, and then just the overall hacking of our international financial institutions and the like. 2020 accelerated many of these things because we embraced digital transformation and our governments continue to embrace that digital economy.

It's really essential now that we co‑invest in the resilience and that we try to draw down or buy down the risk that we've inherited and embraced over the last 30 years.

[Taki and Scott's panels return.]

I'm working with Scott and people all around the world trying to raise that awareness within our governments and within our corporate environments of what is going to be necessary in order to ensure resilience and an ability to harvest the opportunities of the digital economy as we go forward.

Taki: Thanks, Melissa. Scott, Melissa raised a lot of the issues that we're going to go through a little more systematically as we go forward. One of the big things that I think people intuitively get is that our economy is migrating online. We've seen COVID accelerate that. I think Melissa said that about 15 percent of the global economy is online right now.

[Melissa nods.]

It doesn't take a genius to see where that trajectory is going. It's probably not going to go down. It's probably going to keep going up. As our world and our economy starts to go online, are we ready, Scott?

Scott: Well, you know, I think we have to take a step back and look at this, because while you might not be going online when you're making a purchase, the company you're going to is online. Its logistics supply chain is online. It requires all these online systems to be there.

[Scott's panel fills the screen.]

It is probably in a building that has building systems that are online. That's what's accelerating now: the internet we don't see. Sometimes you see the Internet of Things, but we all think of that as things like smart speakers and smart light bulbs—things we know about. I'm talking about the things that control our buildings. I'm sitting in our very empty headquarters building right now, but every light here is controlled centrally and can be controlled. The heating system can be controlled remotely by the building managers. These are things that we don't see. And then there's a logistics chain.

When we looked at the National Cyber Threat Assessment, one of the things we wanted to talk about is, yes, there's all of these things we interact with on a daily basis, but most of our life is dependent on things we don't even know are connected to the internet.

[All three panels return.]

That's the huge challenge for governments right now. How do we start thinking about this? I haven't even touched on the data that's generated: the data generated about us, what we do as individuals.

[Taki nods vigorously as Scott speaks.]

A lot has been made out of some of the privacy changes that have been made by these big Internet companies where they're no longer searching through our emails to generate ads. It's because they don't have to. They have all the data they need to build profiles on us. They don't need to see us as individuals anymore because they've got us pegged.

We've got all of these challenges on the internet. Absolutely, there is a huge pressure that we're going to face as a nation. Are we ready? No, we're not. Are we trying to be ready? Absolutely. Are there things that are getting better? Yes. Organisations are understanding their systemic risk more. Boards of directors are talking about this. Cyber security requires investment.

[Scott's panel fills the screen.]

It means when we're going to purchase, we need to start thinking about security differently. We cannot be lowest-cost compliant for procurement, for example. We can't incentivize a CEO to cut spending on security for a product that is so integral to the management of our network infrastructure. I know we'll probably touch on some of the recent large-scale supply chain activities. So these are things- we've got to talk about incentives. We've got to talk about what we expect as an industry, and how we start to change what is arguably the market failing to address security as a real risk.

[All three panels return.]

Taki: Scott brings up some really good points that people really need to internalize as public servants, because sometimes you hear the words "cyber" and "cyber security" and you think, "that's not for me, that's wires and physical things. If something happens, my bank will reimburse me. I'm kinda good." Cyber security isn't that. It was that maybe 15 years ago. What it is now is exactly what Scott is talking about, which is that it's almost everything. It's the water that you drink. It's the traffic lights that are telling you stop and go. It is, in some cases, your front door locking and unlocking. It is a speaker listening to you and then taking a cue from that.

There are vulnerabilities everywhere. As Melissa said, the vulnerabilities are increasing daily because the number of things that are going online is just exponential. A couple of years ago, we started hearing the phrase the "Internet of Things." By the year 2030, a million devices will be hooked up to the internet. My God, by the year 2030, I will have a million devices hooked up to the internet alone. Everything is connected. And it's not just that everything is connected; that's only step one. Step two is when everything starts talking. When your car starts talking to your garage door opener, to your furnace, to your lights. This is important for every policy analyst in the Government of Canada.

Now, let's go back a little bit. The first time a lot of us started hearing this in a real way was as a result of a bad movie. A few years ago, there was a movie made by Seth...Gogen, Rogen, something like that. And then somebody got really, really mad at Sony. Scott, Melissa, who wants to walk us through what happened? Some people call it an attack. Some people call it mischief. Some people call it espionage. Some people call it an invasion of sovereignty. First, what happened? Then we'll talk about what it was.

Melissa: This really bad movie was produced by Sony Pictures, which is a subsidiary of Sony Corp. in the United States.

[Melissa's panel fills the screen.]

We have a Japan-headquartered company with an American subsidiary, which I think is important. The subsidiary in the United States was releasing this, uh- I think it's called The Interview, a very bad movie. It was making fun of the North Korean government and North Korea broadly. North Korea took offence at this and demanded that Sony Pictures not release the film. So let's say that that's the opening scene. Sony Pictures refused to obey or to meet the request of the North Korean government. The North Korean government and/or its proxy penetrated Sony Pictures, which had a very weak cyber security posture, infiltrated their corporate networks, was able to expose corporate e-mail, and tried to derail the launch of the movie by going into the corporate networks.

[All three panels return to the screen.]

Concurrently, they made a general threat against citizens if they went to the theatres to watch the movie. So let's say that's scene two, or the culmination. It became a policy conundrum. President Obama, at the time, called it an act of vandalism first. That was the initial reaction. Bad corporate security. Shame on you, Sony. You really needed to invest in that. This is not a national security issue, per se, because this is a movie and you're not a critical infrastructure.

But when there was a key threat to the citizens, the government had to really step in and think through that maybe this is not an act of vandalism, that this is something that really requires some policy decision‑making process. But the government was really not prepared for something like this. We're prepared for espionage. We're prepared for more destructive activities, but not something along these lines.

What's interesting is that the parent company, Sony, in Japan said they didn't have anything to do with it. It was Sony Pictures' responsibility.

[Melissa's panel fills the screen.]

The cross-geography or the different jurisdictions of the corporations was also interesting to watch from a corporate perspective and corporate risk management. They didn't see it as a real issue from that perspective. That's kind of the opening A, B and C scenes, or 1, 2, 3. But I don't think it really resolved very well. Sony Pictures said they increased their cyber security. At the end of the day, they hired a new CISO. I would argue that they still don't really invest in the security of their corporate networks. The risk was not too great as far as harm from lawsuits and other consequences. It was really embarrassment.

[All three panels fill the screen.]

Taki: Scott, you have this company in country A and this company is attacked by either country B or the proxies of country B. If that country had sent in stormtroopers, we would all get it. We would go, "oh, my God, we're under attack." Is this the responsibility of the government? You work in government, like I do. Is the government supposed to respond when somebody attacks your company?

Scott: That actually is the key question here, because I think a lot of us will argue, yes. Is it reasonable to expect a company to be able to defend against a nation state that has decided to use national power against them? To your point, we wouldn't allow a covert action team to operate on our territory. Why is that okay in the digital space?

[Scott's panel fills the screen.]

On the other hand, we would also expect that company to have had better physical security, to have thought these things through, and to have implemented some security measures.

If you leave your front door open, "is it breaking and entering or not?" is another part of the question. This is a really complex issue. The issue for us is, when you look at it, what it turned into is a pattern for the next generation of cybercrime. Hack, steal, blackmail. Because ransomware, we've gotten better at it. Companies have gotten better at preparing for that. This was a great pattern. Steal the information and leak a little bit that's really embarrassing—especially anything entertainment related to get on TMZ and a few other things. Then blackmail them to try to get them to pay. The blackmail in this case was, "Don't put this movie out that was embarrassing." But in the cybercrime case, it's, "Pay us or we're going to let this go and we're going to break the trust with your customers. We're going to break the trust with the people who have entrusted you with that data." There's a lot of pieces here. The fact that it's a nation state or a proxy is one of the hardest problems because it's hard to distinguish.

[All three panels return to the screen.]

You don't usually come out and put your country's flag on it and frankly, you can fake that. That is the whole issue around attribution that we face.

Taki: That's another key issue for people to keep thinking about as they manage programs, develop policies or react to events, that what's happening in the cyber world has analogies in the analogue world. There's theft. There's violence. There's attacks. There's good things. There's bad things. Do those rules transfer one-to-one? Is it assault when you click "I don't like?" Is it theft? Is it vandalism? Is it mischief? What is it? Because one of the things we know with cyber is that it makes it easier to engage in these activities. The scale and the transaction costs are pretty low relative to if you're creating stormtroopers to fly across the ocean and parachute and penetrate NORAD and NATO. Relatively speaking, a couple of clicks is pretty cheap.

Scott mentioned ransomware. We're going to get into ransomware in a few minutes because that's another big issue that's percolating in this world. Before we do, I want to move to what a lot of people, at least in the popular world, view as the next one after Sony. I always have trouble pronouncing this word. Stuxnet. Melissa talked to us about Sony. Scott, do you want to talk to us about what Stuxnet is? There's still a little bit of mystery around it. Maybe it's easier for you as a Canadian than somebody who's worked in the White House to talk about what Stuxnet is.

Scott: Sure. There's some pretty public reporting and speculation on this. That's what I'll be mostly going off of.

[Scott's panel fills the screen. His audio falls out of sync as he speaks.]

Stuxnet was a piece of software that was used to infiltrate Iranian centrifuges and cause them to have a physical, real-world effect, which was overspinning, meaning they spun so fast they spun themselves apart, essentially. It was a way of achieving a national outcome, but using a cyber means rather than sending in bombers or missiles or other types of things. Setting back uranium enrichment was the goal of this program.

It was a piece of software that was very good at propagating. It targeted the programmable logic controllers—the things that control how fast the centrifuges spin. And it was essentially told, in this case, "go and disable it." What people talk about, though, is that once you release something like this, other people can see it. They see how it works. They can take it and then they can reuse it. That's what a lot of the debate has been about: not the outcome of the initial operation, but how these tools get repurposed. Because once you release a cyber tool, others see it. Others see the vulnerability that was exploited. Others see the potential. And also, every time you move the line, and I would argue others are moving the line a lot more than our allies are, once you move the line, the line is set. Is it an act of war or not?

That's what a lot of the discussion is. Now, I'm not going to weigh in on that; I'm not an act-of-war expert. I'm a cyber security expert here. At the end of the day, to achieve that outcome, the kinetic options would have been the only thing prior to cyber. Those are some of the things that have to be discussed. That's what Stuxnet really was. It was propagating. It would move itself around. It was very targeted. It was written specifically for this, but it could be repurposed. That's the grosso modo summary of it from my perspective.

[Taki and Melissa's panels rejoin.]

Taki: Melissa, I'm going to turn to you in a second. In Sony—just for our viewers to get it—a digital attack resulted in digital coming back. It was files. You've got files, whether it was, "Who is Brad Pitt? Who does he like? Who does he not like? What is the movie?" In Stuxnet, a digital attack resulted in the physical world changing, where I think Scott said a turbine or a centrifuge or something spun faster than it would have or spun slower or whatever the case was. Something kinetic happened as a result of that attack. Melissa.

Melissa: I think that Stuxnet was really the first widely reported case of an attack on a critical infrastructure: Iran's nuclear program.

[Melissa's panel fills the screen.]

It affected one fifth of their nuclear centrifuges. It disrupted, let's say, 20 percent of their nuclear program and their ability to create nuclear weapons. That was the strategic outcome that was desired by it. What was interesting about it, maybe after it happened, is that it was widely studied by all of the security companies—Symantec, McAfee, and Kaspersky—and then engineering research institutions like CERN. They all published papers on what had happened and how it happened.

First, it was a blanket vulnerability in Microsoft that was exploited so that you could understand what was happening around the world. Second, it was targeting Siemens software, which was used for the engineering parts of the centrifuge and the like. They all published papers saying, "This is how this happened and this is basically how you could do it again."

[All three panels return to the screen.]

There we see the sons and daughters of Stuxnet. There is one called Shamoon. There are like 11 different versions of Stuxnet that are now being used against critical infrastructures around the world, because the security community widely evaluated it and published it. It's on Wikipedia. It's on WikiLeaks. Everywhere. That made it widely available to anybody, not just to nation states, militaries or intelligence services.

Taki: That's a huge point for people to remember. The tools of war, generally speaking, since civilization began, have been reserved for high-level organizations like states and empires, and things like that. The tools of war, now, in theory and in practice, can be available to a really smart 12-year-old in your neighbour's basement. It doesn't have to be a really rich 12-year-old. It just has to be a really smart 12-year-old.

The next one. A couple of things happened recently. There was an election. There was an insurrection. Something else happened that we'd probably be spending a lot of time talking about if there hadn't been an insurrection. I'm not sure we have a name for it yet. Most people are calling it SolarWinds. Melissa, what is SolarWinds and why should we care?

Melissa: I've done a lot of analysis on this one; SolarWinds is what I look at. It's called SolarWinds because SolarWinds is a company that has the largest market share in network management and monitoring. Companies and governments all around the world are using this software product to monitor their networks for security, optimize their networks for efficiency, and the like. SolarWinds is the best product. It is best in class, in the upper right of Gartner's Magic Quadrant. It's really the only product in its class for doing both the management and monitoring. They were very public about their customer base. They serve the Fortune 500. They serve many parts of the US government and other governments around the world. All of those entities were published on their website up until about December 15th.

I look at this as a comprehensive ICT supply chain attack. You're going after a company that has a large market share that can get you into a lot of other targets: nested targeting. What happened was that SolarWinds, through a series of decisions aimed at optimizing costs, was not paying enough attention to security. Its engineering was being done over in Belarus, the Czech Republic, and Poland. It didn't have very good security processes for its technology. It became a high value target.

Taki: Wait a sec, Melissa, just a moment ago you said this was best in class, Gartner top Quadrant. Now you're saying it didn't have very good security. What's-? There's a disconnect there.

Melissa: That's because it's a product that was looking at network management and monitoring. It wasn't actually protecting itself, or making sure that its product couldn't be manipulated or wasn't vulnerable. It was good for what it was selling. This is the typical thing that we're seeing across the whole ICT industry. Field it fast and fix it later. We're going to optimize our costs by outsourcing it to the engineers in Eastern Europe or in the former Soviet Union. We're not going to invest in the security of our own company because that's affecting my bottom line.

[Melissa's panel fills the screen.]

You see this across the board. The entire ICT industry, I would argue, is negligent in this manner. They became a high value target. It was reported by the security firms and the United States government that, at least initially, the company SolarWinds was targeted by the Russian government, and specifically two of their intelligence services, the SVR and the FSB.

I believe that the Russians did very good reconnaissance of SolarWinds. They initially implanted malware on SolarWinds' network to observe their development process—how they develop code. They were able to understand those processes well enough that they were able to implant code to enable back doors into their program. Now the Russians can actually update the code, and SolarWinds has no idea. That happened as early as September of 2019. Russia had already chosen this company long before that, because by then you're already doing your reconnaissance. They get into the company in September 2019. They start modifying the code undetected by SolarWinds, and then they embedded at least two versions of malicious software within the software updates.

So, if I'm connected to SolarWinds, I'm now going to get a software update, but I'm going to get that software with malicious software that is going to evade any detection mechanism, because it's been signed and it's legitimate in my network. It's now going to enable an outlet, or a back door, by which the Russians can get into whatever that nested target is, whoever the next company or government is. That all happened between October 2019 all the way through to June 2020. There are multiple aspects of the malware.
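
[Editor's note: As a side illustration of the update-integrity problem Melissa describes, the sketch below shows a basic check that defenders commonly apply: comparing a downloaded update against a digest published out of band. The file name and digest are invented for the example, and a check like this would not have caught SolarWinds precisely because the tampered build was the vendor's official, signed release.]

```python
# Illustrative sketch only: verify a downloaded update against a published
# SHA-256 digest before installing it. File name and digest are hypothetical.
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_update(path: str, expected_digest: str) -> bool:
    """True only if the file exists and matches the digest published out-of-band."""
    try:
        return sha256_of(path) == expected_digest
    except FileNotFoundError:
        return False

if __name__ == "__main__":
    update_file = "network-monitor-update.msi"        # hypothetical artifact
    published = "0e3f...placeholder-digest...9ab1"    # hypothetical value
    if verify_update(update_file, published):
        print("Digest matches the published value; proceed with change control.")
    else:
        print("Digest mismatch or file missing; do not install.")
```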

How was it discovered? That's the next important thing. The company FireEye, which is known for cyber security, penetration testing, and forensic analysis for many of our governments and many of our companies, had a phone call about one of its employees' credentials. It tipped them off to start looking: "That is really strange. We need to start looking at our credentialing mechanism, our active directory, and the identity and access management of our network."

They discovered that they had been breached and that they had lost intellectual property. At least 300 of their forensic tools were stolen, illegally copied. Then, after about another week of forensic analysis, and because I think FireEye has very good security, they found that they were a victim of the SolarWinds breach, because they were running SolarWinds on their network. They were one of 18 000 entities that were affected, that had the malware delivered to their networks, including many government institutions. That's how we found out. We didn't find out until December of 2020. The Russians had over a year of access into at least 18 000 targets. Maybe more than that, because there are more than 300 000 customers of SolarWinds. I can keep on going or I can stop there.

[All three panels return to the screen.]

Taki: We'll pause you there for a second. Scott, the Canada-US ratio is usually about ten to one. We've heard a lot about SolarWinds in the United States, but it's not, as far as we understand it, isolated to the United States. What's the fallout in Canada of SolarWinds?

Scott: There's a few things and that was an amazing summary of the event. The only thing I would add is the kudos to FireEye for being proactive in disclosing and having the courage. They could have held back until it was a mandatory reporting period. Other than doing it a few weeks before Christmas and kind of wrecking the entire security community's holiday season.

[Scott's panel fills the screen.]

They let us get access to deal with these things really quickly, and Microsoft also made a substantial contribution to this research, and others in the security community got involved as well. FireEye deserves a lot of credit for the courage to come forward. Not all companies do that.

If we go back to what the impact on Canada is: Canadian industry runs this. The Canadian government has some instances. Luckily, in many cases, they hadn't installed the vulnerable version that had been compromised in the supply chain attack. I sarcastically said, "Thank God the government's patching is bad sometimes." It actually paid off in this case. A lot of Canadian industry- I've talked to quite a few Canadian CISOs who said the same thing. Furthermore though, I think when you go back, it also goes to our investment in cyber security. There are ways, even with the supply chain compromise and this software being compromised, that you couldn't have been compromised even though you were running the vulnerable version, if you'd run it the way security best practices establish, which is something like this: you do not leave it connected to the internet; you zone it into a management network. So there are many ways this could have been stopped. Yes, SolarWinds had some things to do. But zoning this, running it properly, isolating it: there were Canadian companies that had the vulnerable version, but they had isolated their management network.
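
[Editor's note: Scott's zoning point can be illustrated with a small, hypothetical check. The sketch below takes a list of firewall rules for a "management" zone and flags any outbound allow rule that reaches beyond private address space; the rule format, zone name, and addresses are invented for the example and are not drawn from any real product.]

```python
# Illustrative sketch: flag firewall rules that would let a management zone
# (for example, where a network-monitoring server lives) reach the public
# internet. Rule format and addresses are invented for this example.
import ipaddress

PRIVATE_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_internal(destination: str) -> bool:
    """True if the whole destination block sits inside private (RFC 1918) space."""
    dest = ipaddress.ip_network(destination)
    return any(dest.subnet_of(net) for net in PRIVATE_NETS)

def risky_rules(rules):
    """Return allow rules out of the management zone that reach non-private space."""
    return [
        r for r in rules
        if r["from_zone"] == "management"
        and r["action"] == "allow"
        and not is_internal(r["destination"])
    ]

if __name__ == "__main__":
    example_rules = [
        {"from_zone": "management", "action": "allow", "destination": "10.20.0.0/16"},
        {"from_zone": "management", "action": "allow", "destination": "0.0.0.0/0"},
    ]
    for rule in risky_rules(example_rules):
        print("Management zone can reach the internet:", rule["destination"])
```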

The other thing is, this product was designed in a way where you had to disable parts of your antivirus and security protection just to make it work. And it had such pervasive access. It is the skeleton key to get into all of the things it manages. This is where we need to start looking at not just the supply chain, but also how companies are using this. There are ways to do it. They're well understood. It's well documented. I'm not talking about an air-gapped, classified network with expensive multi-million dollar encryption devices. I'm talking about separating it from the internet, disconnecting it so that nothing can connect back in. That alone protected some agencies. So we did not see- there were reports in the media of one compromise among Canadian companies. There was also a kill switch embedded in the software, which researchers found, that was sent out to a number of victims. It disabled the back door if the actor in question, and I won't say a country name here, given my role, wasn't interested. That was also something that was sent out as well.

There were a lot of things here. From my background, this was a very well-orchestrated intelligence operation. It was well thought out. Melissa hit on all the reconnaissance pieces. I don't have to go over any of those. But it does go to the vulnerability of the industry, in terms of: we need to start thinking about systemic risk. How do these things compound? And our IT departments, what are they facing?

Every IT department is about cutting costs, getting down to the lowest cost possible. Outsource this. Get this into the hands of the cheapest thing possible because that's not my bottom line. One of the things that somebody said, and I wish I could attribute the quote. I feel bad. I should. They said, "You either know you're a digital company or you're going to go out of business because you're going to drive yourself that way." That goes for government as well. That's the overall summary.

Canada didn't seem to be heavily targeted by the actor in this case. We did have the software. We were vulnerable. We could have been. We've got pretty good indications that there were no major compromises, mostly because we're probably not on the top of the target list this time.

[All three panels return.]

Taki: I'm an ordinary citizen. I'm looking at this and throwing my hands up. I'm like, "Well, doesn't sound like there's a lot I can do." Scott, I heard this first from you, that the biggest security weakness in every organization is that ultimately it's hooked up to a carbon based organism, i.e. a human being. In this case, the way Melissa described it, it sounds like most people, not SolarWinds, but most people outside of SolarWinds did the right thing. They were buying the top security system. They bought best in class. They were this, they were that, and yet still there were vulnerabilities. I'm hearing a little bit different from you.

Scott: Sorry, I would argue Melissa really hit it earlier. It is a best in class product for what it does. Everybody who bought it was looking for the best in class product for what it does. They weren't evaluating how it was built and how it was built to be secure. They weren't evaluating it for that function. That's where I would say the carbon based problem is. They were looking for the lowest cost. They weren't valuing security. They weren't looking at it from a threat actors' perspective.

What would it let somebody get access to if it was compromised? Because if they had, they would have deployed it differently. They would have invested in their IT security department to deploy it properly. They would have been saying, and demanding, things like, "Okay. I want this product to go through some sort of vulnerability testing. Let's get together as the industry who's going to use this and put it through proper testing." There are some features in there that look like the actor knew that this might happen. But there are things that we can do as an industry. Voting with your wallet and the power of procurement is super powerful. You need customers. That's where I would say the failure ultimately was: not emphasizing that part of this problem.

Melissa: I agree with that. I think it's really important: when you look at the key financial institutions that were using it, the key government entities that were using it, there should have been a demand for due diligence of the product based on their requirement for mission assurance, if you will. You can't afford for the banking institutions to have a vulnerability that could literally disrupt finance around the world. And when you look at it, their password for downloading the software updates was SolarWinds123, unencrypted. I mean, it was just, [stammering] the gross negligence, and I don't use that lightly, the gross negligence of SolarWinds on the security of their own infrastructure. I don't think they're going to survive this event, because we're going to rip and replace it out of our infrastructure. And this particular platform, Orion, which is their flagship platform, represented 45 percent of their revenue.

Taki: You've both directly and indirectly touched on something that we don't think about a lot, which is the supply chain of these things. Scott said that the carbon-based organism made a mistake because it didn't go back and check the supply chain across everything. Is that realistic? Do I have to, before I buy my next iPhone or my next Android or my IoT-connected coffee cup, my IoT-connected Braun toothbrush? Do I have to go back? First of all, what is the supply chain? Second, do I have to go back and do that myself? Melissa, first, what is the supply chain? We have chips. Where are chips made? We have fibre. Where is fibre made? This is a lot of stuff that comes together to make something IoT or something digital.

Melissa: We have a global supply chain. Even if the chip was designed in Seattle, it's usually produced over in China. When you put together your iPhone or my computer, it probably has 20 flags or more associated with it, as far as where that keyboard is made, where the silicon is made, where the screen is made, etc. When we start to make decisions based on the flag of where the production is, I think that's unhelpful.

[Melissa's panel fills the screen.]

What we do need to start to think about is how we bring all of these different piece parts together and architect for resilience, architect for safety, especially as we're moving more toward the automated and the autonomous. We need to be thinking about safety first. We need to be thinking about resilience, security, privacy. Those have to be key functional features that we demand, and they will likely have to be demanded at a governmental level, because the industry in its piece parts is not necessarily incentivized. But if I have an Android phone, which currently has a serious set of vulnerabilities, and it connects to something else and it propagates an infection, we have to address that.

There are a number of different- the Europeans just put out IoT security guidance for the design of the next generation architecture. California put a law in place last year that says you can't sell a device in California, and therefore basically the United States, if it's hardcoded with "admin, admin"; you need to be able to update the passwords, update the software and bring security in. The Japanese have a law also and are doing red teaming against the products before they're allowed to be fielded.
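
[Editor's note: As a side illustration of the hardcoded-credential problem Melissa describes, the sketch below flags devices still using factory-default credentials before they are allowed onto a network. The device records and the default list are invented for the example.]

```python
# Illustrative sketch: flag devices that still use factory-default credentials,
# the kind of hardcoded "admin/admin" pairing described above.
# Device records and the default list are invented for this example.
KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
}

def uses_default_credentials(device: dict) -> bool:
    """True if the device's username/password pair matches a known default."""
    return (device.get("username"), device.get("password")) in KNOWN_DEFAULTS

if __name__ == "__main__":
    inventory = [
        {"name": "lobby-camera", "username": "admin", "password": "admin"},
        {"name": "hvac-controller", "username": "ops", "password": "q3!vR8#kL"},
    ]
    for device in inventory:
        if uses_default_credentials(device):
            print(f"{device['name']}: still on factory-default credentials; block or rotate")
```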

We're just at the beginning. The problem is that we already have a great volume of insecure devices that have already been deployed. We still have to buy down the risk of yesterday's problem and correct today's and tomorrow's problems through consumer protection regulation. It's being addressed partially through some of the data privacy regulations, especially as you all update PIPEDA. California has just put pretty stringent laws in place about these things. We need to address it broadly. As you make the 180 billion dollar investment in updating your core infrastructures to accelerate the move to the digital economy, that money has to be used not only to get the functionality for accelerating modernization; resilience, safety, security, and privacy all have to be components of those decisions.

[All three panels return.]

That's where we have to bridge the gap of our decision makers. They are thinking first about the economic benefits and then we worry later about the security risks or the problems that those things bring to us.

Taki: Scott, so the good guy, so to speak, the good actor, it sounds like he or she has to be right all the time in a lot of the different areas that Melissa has laid out. The bad guy only needs to be right once, whether it's in hardware, or in software, or in physically compromising something, or in bribing a programmer, bribing a security guard to let you take a picture of something. Is it realistic to think that as our world migrates online, our world will be as secure as it was in the analogue world?

Scott: It very much is. The analogy for Canada is hockey. It's being the goalie. If you're on defence, if you're the goalie, you're the hero until the one puck gets in the net, and then you're the most vilified person in the city that you play for. You could have stopped a million shots on goal until the one got in. So there's that.

Is it reasonable for individuals to go through and do supply-chain analysis? Maybe some of us might do it, but no, it's not. You can't, because you won't get access to the data you need. A lot of it is company proprietary. It's confidential. So how do we start to deal with this? Part of it is a change in mentality, where we move from looking at individual devices needing to be secure to thinking about how we secure the system.

Assume that there will be data breaches, that there will be vulnerabilities, and design it to be contained. That's something that you hear: things like zero trust, etc. We're heading in that direction for the Government of Canada's infrastructure, because if not, it moves too slowly. How do we build it so that we can contain things very quickly? The other piece is, how do we protect the data? For example, you're giving an authentic command. Think of controlling the infrastructure in a city, and you're controlling the lights. What do you care about? Do you care that somebody can read what the command is? No, you don't. You don't need confidentiality, but you want to make sure that nobody turns both sets of lights green, and then you have the T-boning of the cars, type of thing.

[Scott rams his fists together.]

When you're doing a financial transaction, you know that everybody in the loop knows that you are at this store, buying this thing, whether through contactless delivery. What you want to make sure is that the vendor doesn't change your $10 purchase to a $1000 purchase. You care about the integrity of the message. Going back to some of the basic principles here matters, but also as consumers, we need to start thinking about, "I get that this light bulb is $8 on Amazon and I can get it. Where's it manufactured? Why is it only $8 but this one by another company is $28 and they do the exact same thing?" I guarantee you that they've cut a lot of corners.
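
[Editor's note: Scott's distinction between confidentiality and integrity can be illustrated with a short sketch using Python's standard hmac module. The command can be read by anyone, but it cannot be altered undetected without the key. The shared key and message format are invented for the example, and a real deployment would also need key management and replay protection.]

```python
# Illustrative sketch: authenticate a plaintext command with an HMAC tag.
# Anyone can read the command; nobody without the key can alter it undetected.
# The key and the message format are hypothetical.
import hashlib
import hmac

SHARED_KEY = b"demo-key-not-for-real-use"

def tag(message: bytes) -> str:
    """Compute an HMAC-SHA256 tag for the message."""
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, received_tag: str) -> bool:
    """Constant-time check that the message was not tampered with."""
    return hmac.compare_digest(tag(message), received_tag)

if __name__ == "__main__":
    command = b"intersection=5th-and-main;north_south=GREEN;east_west=RED"
    t = tag(command)

    print(verify(command, t))   # True: message arrived intact
    tampered = command.replace(b"east_west=RED", b"east_west=GREEN")
    print(verify(tampered, t))  # False: tampering detected
```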

If you're looking at, say, smart cities, they're going to be buying millions of sensors. A dollar saved per sensor starts to add up to major money, and those are going to last for a decade. That's how they're designed. They're designed to be low powered, bought in bulk, and to last for a long time. Which means: how do you update them? You're not buying a thing. You're entering a relationship with the vendor of that product, and that's something you need to be thinking through. Is this going to be there in three or four years, if you've installed that smart doorbell? Is it a company that still exists? Are you paying for updates? Should you have to pay for updates? That's a great policy question.

At the end of the day, I know there are some people listening to this in the Government of Canada going, "excellent, I love regulation. It's a great tool of the government." We are a market of 37 million people. Even with our American friends, we're a market of 400 million people. Combine us with Europe: 800 million people. We are still outpaced by India, China, most of Asia, etc. How do we work together internationally to make some of these things standards? That's going to be the question. It has to be a standard. It can't be government-by-government regulation. That'll be the interim. If we don't do it, we're not going to get the market we need.

Taki: Yeah.

Melissa: Yeah.

[Scott holds up his smart phone.]

Scott: I can pick this up no matter where I am, and I can import it into Canada. So you're not going to stop me the way you can with the things that the Canadian Standards Association approves for electrical equipment here. It's a different approach.

Taki: You've both been running a little bit counter to what I've been saying, which is good. I've been playing the Cassandra. The world is falling. I think I'm hearing both of you say that if you take security seriously, if you focus on security, you can actually avoid a lot of these problems. One of the industries that historically has taken security in this realm seriously has been the financial services industry. They got there before a lot of us, including before governments, because they saw the future a little bit, that this was all coming online and that information protection was incredibly important to their brands and to consumer confidence. I wonder if I could hear from each of you a little bit. What industries, not so much companies obviously, but what industries, grosso modo, are there, getting there, and not getting there, in Canada and the United States maybe? Melissa, is it pipelines? Financial services? Sewers? Cities? Who's good and who's not so good?

Melissa: Well, Taki, I think you said it. The financial services is really the gold standard and it dates back to 1994 when they started to move toward online transactions more broadly. Citibank had the first real grand theft of their institution, which was 10 million dollars from Russian proxies or whatever. That actually created the position of a Chief Information Security Officer and then the investment across that whole industry of ensuring that financial transactions could be secure across geographies, etc. and that they wouldn't lose money.

To be honest, there is really no other vertical sector that has embraced security along those lines, because for them it's more of an operational risk in whatever they're dealing with, not so much real money. Right? The banks see it as, "if I lose that, it's real money. It's my bottom line." Others see it as an operational risk—a legal risk that then leads to financial risk. We still have to look to the financial services sector.

But I would argue that I don't think we should necessarily be deciding that the primary function is security. I really think that we have to be designing so that the primary function is resilience, and that's different. That's a different engineering principle. When you get to resilience, the best industry in my mind is really power: electric, hydro, solar, etc. The power industry is held to an account of resilience and uptime. What if we started thinking about that: what's the mean time to restore? Mean time to recover? Put it in those terms. I think that's really where we need to head, because that'll change the investments and it'll also change the design of the base that's coming into these verticals.

Taki: That's a tremendous point. We teach that, we hammer that home at the Canada School. A lot. A lot. A lot. Stop thinking about risk. Think about resilience, which is to say stop pretending that you know a global pandemic is coming. Instead, think about what would happen. How quickly could I get my employees back online or connected if there was no power? If our IT centre went out? If. If. If. Don't think about the event that's coming. Think about how you respond to whatever event, whether it's weather or power lines going down or whatever the case. Scott, how would you answer the question? What industries are better prepared than others, if that's the case at all?

Scott: I think maybe this is where the US and the Canadian contexts, just because of the number of players, might be slightly different, but not substantially. I think that financial services is clearly the lead. There's no question. I think it's the maturity of their decision making.

If we look at our ATM cards and magnetic swipe, we had swipe-and-PIN technology, and then suddenly we all started getting chip-and-PIN technology. Why did that switch happen? Arguably, it's a much better technology. It's much more secure. But banks still held it back. It had been available for a long time, but the cost to them wasn't enough to justify spending on a security upgrade. They could make their customers whole, the customers would be content, and it didn't cost them more than it would to implement. And then it crossed the threshold. They understood what that threshold was. That's a very mature discussion.

I would say the contrary on the government side is that we accept zero. The answer for how much fraud the government is willing to accept is zero. We will spend 150 thousand dollars to prevent a dollar of fraud. Banks won't do that. They will understand it. They will contextualize it. They'll say, "Yeah, you know what? It was worth it." We've done that in the pandemic case, but we're not done yet. We'll see when the auditor general reports come out on some of the income support and see how the media plays it. That was probably one of the most controversial things I'll say this entire time.

But if you go and you look, I would say that the electrical sector, the critical infrastructure sector, is taking this really seriously, purely because they take a safety mindset.

Restoration: they've dealt with a number of events, whether from ice storms, hurricanes, sometimes terrorism. Very rarely. Sometimes it's trees touching power lines. They understand resiliency and they understand how to restore. Cyber, to them, is another risk to measure. They have a measurement framework, the same as the financial sector did.

Telcos are the other one where you do see a substantial amount of investment. But actually, I think Melissa nailed it. There's a lot of pressure because of the operations side, and so their security tends to be about how to protect the network so it can function. I do see a lot of changes there. They do invest a lot in security that Canadians just don't see. They do spend money. Some of it is fraud prevention, but most of it is just fundamental security pieces, and some of that is proactive. I don't want to discount them. They are spending money in Canada on this. We've been working with them for a number of years.

The other sectors are behind: municipalities. They just don't have the resources to tackle this problem. You mentioned water earlier. One of the things: we actually had to go to a municipality and say, "We've got a report that your chlorination system is online. You could actually increase the chlorination in the water." Could you have killed anybody? No. Could you have made them sick? Yup. You certainly would have made the water stink and be undrinkable. For them, it was, "hey, this just lets me manage this." Continuity of business was actually the reason they did it. If they couldn't get to the plant, they could continue to increase and deal with these problems. So, how we start to tackle this as a systemic risk is the real crux here. The sectors, for me, those are the ones. Finance is certainly way out there.

Taki: I want to talk about one last issue before I close with asking each of you for your advice for Canadians, and for public servants, and for our audience. My second-last area is ransomware. We're hearing a lot about ransomware. I think we all know what it is now. It's almost like a bank stick-up in the old world. In the old days, bad guys would go to a bank and say, "Stick 'em up." Now people aren't just going into banks. As we've heard, banks are kinda protected against this, or at least better protected. People are going into hospitals and saying, "Stick your hands up." People are going into school boards and saying, "Stick your hands up." People are going into municipal land record registries. They're not yet, as far as we know, going into pipelines and electricity grids, and things like that. Although I'm sure they probably are; we just don't know about it. Give us your thoughts on ransomware, each of you, just as we close. Is it something we should do? Is it something that's happening a lot more than we think? What should we know about ransomware, Melissa?

Melissa: It certainly was, I would argue, the other epidemic that was happening in 2020. A 700 percent increase in ransomware attacks around the world. What is ransomware? Ransom gangs are basically breaking in, exploiting a patch that you haven't applied or a vulnerable piece of software, to get into your enterprise, whether your enterprise is a hospital, a school system as you said, an accounting firm, an insurance company, manufacturing.

[Melissa's panel fills the screen, the purple title card fades back in for a moment, identifying her as Melissa Hathaway of Hathaway Global Strategies LLC.]

The list goes on and on. The victims last year were remarkable and there was no sector untouched. They break in through a vulnerability that you haven't patched.

That's hard to keep up with when you have so many that you're dealing with. It could be Cisco, Citrix, Microsoft, Oracle—pick your flavour. So they come in, and they are able to map your network. They come in undetected, they map your network, and they steal your data. They steal the intellectual property. They steal the PII. Whatever your special sauce is. And then, about a week or two later, they encrypt all of the data and the systems. This makes it so you get the black screen of death or the skull and crossbones on your computer. You can't use anything. This has been really problematic at hospitals, because it's all the patient records and it's all the business systems. It's many of those things.

This is a test of whether or not you're resilient. Can your company restore those business-critical systems from backup data? From backup systems? The problem is that most companies cannot, and they certainly cannot do it in a mean time of 48 to 72 hours. They get issued a ransom demand: if you want your business-critical systems to be back up, you're going to pay us 10 000 bitcoins, 500 bitcoins, whatever it is, some kind of cryptocurrency. This is the dilemma, because you have to say, "I can't restore those systems. How do I pay?" Most organizations are not set up with a law firm or some place that can get bitcoin quickly. That becomes a problem.

Then the next problem is: am I actually funding a terrorist? We have published this in the United States. We've given guidance that you shouldn't be paying this, because many of these actors are actually on our entity list of people known to be affiliated with people that are supporting terrorism or things that we believe to be illegal. It becomes a corporate dilemma: do I pay the terrorists to restore my operations and the like? I just see this increasing, because all of these corporate vulnerabilities are, again, at an exponential level. It's easy pickings.

[All three panels return.]

It's easy money. It's easy money for those countries that could use proxies that are under major sanctions—Iran, North Korea, Russia—that are economically hurting. It's said that last year alone the North Koreans made over 2 billion US dollars in the ransom attacks that they conducted against the institutions around the world, because we're all paying.

Taki: Scott.

Scott: That's a terrific description, and I think one of the big changes that we've seen is this: ransomware used to just be about, they'd get in, they'd encrypt your hard drive, you'd get a little message, and then you'd pay. It was more targeted to individual users, and sometimes it would propagate in your network.

[Scott's panel fills the screen.]

It's now very much the more sophisticated activity, what we used to consider state-level activities: reconnaissance, understanding what's critical to you, going in, corrupting your backups.

Even if you have a good restore procedure, you can't use it. They make sure they go back in time so even your older backups are gone. Understanding who the administrators are, who to go after, etc., finding that critical thing and then taking the information out. If you choose not to pay, you're still suffering with this information out there, which might be your customers' data, something that's your critical business process, or some piece of intellectual property. If you don't pay, they're going to let it go. I wouldn't call it necessarily a bank robbery. I'd call it more like a kidnapping. Something that you want desperately back, and you're willing to do anything you need to as a business. These are business-ending events.

And if you look at the critical infrastructure side, imagine if the ransomware locked out the hydro companies- sorry, electrical companies. We call them hydro up here, which is, I know, weird. Imagine it locked up their ability to switch the transmission lines on and off. It's going to be -10° today, maybe -20° in Ottawa. Ugh. Soon. That's a major issue. You know there's a huge incentive to pay here. When we looked at the National Cyber Threat Assessment, one of the things we pointed out was that this only really works because of things like cryptocurrencies. There's a massive online market to buy these tools. You just have to have money to become a cybercriminal doing ransom. There are support organizations; any of us who have to call our IT support would kill to have the level of support that comes with cybercrime tools.

And so we have to find ways to break this market down. Law enforcement is at a complete disadvantage. Cyber defenders are getting better at blocking, but this is a whack-a-mole game.

[All three panels return.]

How do we deal with the policy implications of anonymous cryptocurrency that transferred this money? All the things about: are you funding terrorism? Are you funding criminals? You're funding the next generation of cybercrime tools. You're essentially paying for them to come back and attack you when you pay that ransom, and they know you're going to pay, by the way.

Taki: What an hour. There are so many more things that we could discuss. Maybe if we're lucky, we'll have you guys back in a few months to talk about a few other things. I want to give each of you a final word here. I want it to be a different final word from each of you. Melissa, the final word we'd love from you is to give governments some kind of free advice. Scott, I'll give you a heads up. The free advice we want from you is to give Canadians, as individuals, some free advice on what they can do in this area. Melissa.

Melissa: At a very tactical level, if I were advising President-elect Biden, I would say that the first thing that we really need to do, if I were to put SolarWinds aside, is understand that the ransomware problem is significant against our health care institutions.

[Melissa's panel fills the screen.]

We should pivot the operational capacity that we had for election security and work with our allies to really, really support the ability to deliver vaccines and support the storage facilities that are being ransomed as well as the hospitals. We cannot afford to have a second pandemic along with what's happening. We have to be able to support safety and life.

[All three panels return.]

The second thing- oh.

Taki: No, sorry, go ahead.

Melissa: The second thing I would say is that at a higher level we really need to have available, affordable, and reliable telecommunications infrastructure. I think this applies to Canada as well. It's not affordable, it's not available, and it's not reliable right now. I think that there's an opportunity, beyond the 5G conversation, to look at the delivery of internet from space. It's real and it's actually disruptive to the telecommunications carriers. I think it would actually really reach our outer territories, for all of us. Since we're working from home and we're learning from home, this is essential for the continuity of our economy. Telecommunication has to be affordable, reliable, and available.

The final thing I would say is that we really need to be thinking about where the technology is headed and our dependence on the digital economy, and demand that the functionality be driven by resilience, and not by low cost and field it fast, fix it later.

Taki: Scott, give us some free but profound advice for Canadians.

Scott: We have an entire site full of free advice for Canadians. First of all, the basics of cyber security still matter.

[Scott's panel fills the screen.]

We talked about some really big threats here and we talked about big systemic things. For Canadians: patch, update, keep things up to date, and when you're looking at a product, don't just look at the bottom-line cost. Look for companies that actually care about privacy and will talk about it. Not ones that superficially talk about it, but ones that actually say, "Here's your privacy settings. Here's what you can look into." I don't care what ecosystem you choose to be in, but think through privacy. And think, "Do I want this information available? Do I want it available in the place I'm putting it in my home?" Do you want that smart speaker in your bedroom? Do you want a camera on your TV pointed at your bed? Right? I realize I picked a couple of salacious examples, but think about it. Do you want that there?

There's a great cartoon where the 1950s person says, "Do I want to talk about this on the phone? The government might be listening," and the 2020 version of this is a smart speaker: "Let me tell you about my life. I'm going to give it to a couple of private companies." Notice how I didn't mention a specific smart speaker, because they all do this, right? Think about that. Just think about what you're doing. I'm not saying don't do it.

[All three panels return.]

There are a lot of great benefits that come with this technology. Just think about the benefits from it. We need to not be naive. We need to think about this. Your information's valuable. Do you need to put it there? The basics matter. Don't reuse passwords. Update your systems. To Canadians: you do that and you're already putting yourself a bar above the victims of most of the things, like cybercrime, that you're most likely to be targeted by.

Taki: Wow. What an hour. Melissa, Scott, thank you for spending this hour with us. Thank you for educating us and most of all, thank you for being friends of the Canadian Public Service. All our best. We'll see you again soon.

Melissa: Buh-Bye.

Scott: Buh-Bye.

[The chat fades to the animated white Canada School of Public Service logo on a purple background. Its pages turn, closing it like a book. A maple leaf appears in the middle of the book, which also resembles a flag with curvy lines beneath. The Government of Canada wordmark appears: the word "Canada" with a small Canadian flag waving over the final "a." The screen fades to black.]
