
Ep. 23: From TikTok to AI Regs – How U.S. Cyber Policy Can Guide Enterprises

Dec 16, 2024 | 21 min 35 sec

International cyber adviser Richard J. Harknett returns to the podcast with more guidance for enterprise leaders on today’s latest cyber concerns, including TikTok (yes, it’s a serious threat), inevitable AI regulations (they can actually spur business opportunity), and why he’s determined “to get rid of cybersecurity month.”

Summary

As a world-renowned scholar and one of the architects of the U.S. national cybersecurity policy, Richard J. Harknett regularly gives advice to government leaders, both here and abroad, on the cyber threats nations need to prioritize most. In our previous episode, he explained today’s new proactive approach to cybersecurity and how businesses can benefit from it.

That conversation continues here with specific advice for business leaders on some of today’s most pressing cyber threats. What follows is some frank talk about what brands really need to focus on – and what issues aren’t as troubling as they might think. His perspective on today’s headline-making cyber news might surprise you.

Host: Mike Curran, VP of global talent, Tanium
Guest: Richard J. Harknett, director of the Center for Cyber Strategy and Policy and co-director of the Ohio Cyber Range Institute, University of Cincinnati


Show notes

For more info on U.S. cybersecurity policies and how autonomous endpoint management can keep you active, not reactive, check out our articles in Focal Point, Tanium’s award-winning online cyber news magazine, and these other useful resources.

The following interview has been edited for clarity.

Most Americans are okay with being marketed to, right? But we don’t like being manipulated and we don’t like being controlled. In cyberspace, there’s a real fine line, Mike, between being marketed, being manipulated and being controlled.

Richard J. Harknett, director of the Center for Cyber Strategy and Policy and co-director of the Ohio Cyber Range Institute, University of Cincinnati

Mike Curran: Hi, I’m Mike Curran, VP of global talent here at Tanium, and today on Let’s Converge, we’re talking “defending your business forward.”

It’s the second of our two-part conversation with Richard Harknett, who helped design the U.S. national cybersecurity strategy.

Our national cyber policy of “defending forward” follows the doctrine of persistent engagement, a proactive stance that’s not just for nation-states. Businesses can be persistent and proactive too.

In our first episode, Richard and I discussed how businesses can adopt this strategy and use it as a calling card in contract negotiations and in making decisions about third-party suppliers. Richard is director of the Center for Cyber Strategy and Policy and co-director of the Ohio Cyber Range Institute at the University of Cincinnati. He was the first scholar-in-residence at the U.S. Cyber Command and the National Security Agency, and he’s co-written a book worth checking out, entitled Cyber Persistence Theory: Redefining National Security in Cyberspace, from Oxford University Press.

Let’s get back into it. When we left off, Richard was explaining how cybersecurity is, well, a lot like coaching girls’ soccer, which is something he knows firsthand.

Richard J. Harknett: I coach women’s high school soccer, and years ago I was blessed to help facilitate a state championship. If I gave them a book and said, here’s a book on how to play soccer – meet me at tournament time and let’s see if we can win a state title – that ain’t happening. Right?

Curran: Yup.

Harknett: It comes from learning the game by doing it, by playing it. And cyber is all about the doing. And so if you don’t have in your organization a continuous security protocol of getting people to be attentive to how they could be exploited, they’re going to get beat in this game by the adversary who is spending a hundred percent of their time thinking about exploitation.

So one of my goals in life, Mike, is to get rid of cybersecurity month. And you would say as a cybersecurity guy, why would you want to get rid of that focus? We need cybersecurity hour. I mean, if you buy my theory that it’s about persistence, you can’t carve out one month a year and ask employees to watch a 20-minute video that they watched a year ago and say, okay, we’re good to go.

Curran: I imagine, with the attack surface continuing to increase, that nation-states and bad actors are going after corporations with the same fury they were years ago. How does the government divide resources to address those two sides?

Harknett: So I’ll just be careful on that. I don’t work for the U.S. government; I consult with them. I help when the bat signal goes up. So I want to be careful – I’m not speaking on behalf of any U.S. government agency. Look, there’s an ebb and flow to this. I don’t want to put our adversaries on a 10-foot pedestal. Everybody has limitations in resourcing. So sure, those that want to impact elections gear up around election periods and then they filter off.

One of the things I’m sure a lot of your listeners would initially think about from an election security standpoint is: Can they get in there and change the vote? Can they manipulate the vote? We could have a whole podcast on that, but the quick answer is no, not substantively. There’s a whole bunch of reasons why that isn’t happening, and we’ve done some really good stuff since 2020, and since 2016 even, to make that better.

Our adversaries know that. And so where they’re spending their time is what we call cyber-enabled divisive information campaigns. So there’s a lot of talk in the media about disinformation and misinformation. The real thing that our adversaries are doing is what we call divisive information. They’re finding what divides us.

Unfortunately right now, most of us spend time on social media and in traditional media with only folks that we agree with. They just amplify that division.

Like you and I, if we were in the same family, brothers, we may disagree on something really significant, but if somebody came from the outside and started to intervene in our debate, we’d both look at ’em and say, “Hey, get the hell out of here. We may disagree, but we’re family. You’re not in on this.”

And so the argument that I’ve been making is that we’ve got to actually defend against these campaigns with that same sort of mentality.

These are really sophisticated actors, and there’s an undercurrent of stuff that the average user on social media doesn’t perceive. This is why we’re worried about TikTok. Because it does align with a particular state actor, China, that has a very different kind of government [and] looks at its population very differently. It’s not about individual empowerment, it’s about how the individual serves the state, and they don’t really agree with our way of life. And so having a megaphone into us….

Some of your older listeners may remember Voice of America during the Cold War. We sent radio behind the Iron Curtain, convincing them that, hey, democracy is a better idea than what you guys have got. For 70 years we did that, and the Russians and Soviets tried to block those signals. They would confiscate radios and all that, make it harder, but we kept throwing that information their way. When did that make a difference? When the Soviet Union collapsed, people didn’t go into a void. They actually said, oh, there’s an alternative, like this democracy thing. Why don’t we try that?

If I’m Chairman Xi [Jinping], if I’m heading up the Chinese Communist Party, and someone came to me and said, hey, Chairman, we’re in 150 million Americans’ pockets, on a device that some of them spend, cumulatively, over 10 hours a day on – it would be the height of irresponsibility as a leader for me not to use that device, if I actually think the American way of life is not in Chinese interests. So it’s a two-way device that is able to pick up how long I hover on a video – not whether I click on a video, whether I hover on it – and an algorithm then tailors what gets sent to me and starts to shape what I…look [at].

Most Americans are okay with being marketed to, right? But we don’t like being manipulated and we don’t like being controlled. In cyberspace, there’s a real fine line, Mike, between being marketed, being manipulated, and being controlled.

Curran: Nobody’s going to admit that they’re being manipulated, even if they are. So I guess it leads to the big discussion: Should TikTok be banned? What are your thoughts on that?

Harknett: Yeah, so remember I told you, I coach high school girls. So this is not a very popular view on their part. There’s too much evidence suggesting that this is a device similar to a propaganda device like Voice of America, but we don’t have a way to just interfere with that. We’re not going to go confiscate people’s phones. It’s not that those of us in this camp want to deny Americans anything; we’re saying, yeah, you know what, this thing needs to get restructured. We’re giving away way too much data to somebody who doesn’t really like the way we organize our society.

I am sure there’s a really clever entrepreneur out there that could develop an app that does something fairly similar of capturing short video and communicating types of things that would be more transparent about the code that they use, [and] more transparent about a protection of civil liberties than what this device is.

So what’s gone through Congress says, look, the U.S. government is okay if TikTok changes certain things, like transparency over its code, where this data is housed, who gets to see it, et cetera, et cetera. So the ball was thrown back to ByteDance, the parent company. It sounds really over-the-top, a super-governmental-intervention type of thing. But yeah, there’s a thing called world politics, and unfortunately countries go to war with each other over things, and you’ve got to protect the American public from what you’re seeing happening.

So yeah, it could be a painful transition for some 14-year-olds, but I think we could overcome that with entrepreneurial spirit.

Curran: Got it. Shifting over, I think it’d be remiss if we didn’t mention AI. Obviously it’s become very prevalent in the last 18 months. How has that influenced the strategy, the defend-forward strategy?

Harknett: There’s one thing that I agree with President Putin on. He said that whoever dominates artificial intelligence will dominate the second half of the 21st century, and I think he’s right.

My view of it is that we’ve got to stop thinking about it as code and tools and think about it from a decision-making model. So from a business standpoint, how do I deploy it within the context of my organization?

So what roles and responsibilities am I going to give it? Is it going to just collect data for me? Is it going to collect data and analyze that data for me? Is it going to collect, analyze, and organize that data so I can make a decision, or am I going to give it the ability to make the decision? Those are options you have if you understand it from a roles-and-responsibilities standpoint rather than from a code [and] technical-tools standpoint. So as much as I hope the work I’ve been helping with created a paradigm shift from deterrence to proactive cybersecurity, my next windmill to tilt at, Mike, is this: We’ve got to stop thinking about AI from a tool standpoint and understand it from a decision-making, roles-and-responsibilities standpoint.

Curran: Yeah, I would agree. I think we can all agree we’ve got to win the war. And as I look at it now, we can’t overregulate AI. Even though we don’t know what the potential outcomes are, we can’t regulate possible outcomes. We almost have to let AI grow and then say, okay, we’ve got to pull it back here, pull it back here, as opposed to trying to regulate it on the front end.

Harknett: Yeah, two quick thoughts on that. One, we’re going to have an energy problem here pretty quickly with this sort of rampant adoption of AI. We actually don’t have the electric grid to support the processing. So that’s an issue we have to think through, and one government and business have to get on top of fairly quickly.

And my concern about the “Let the horses run” [argument] is that there are some training environments that are going to go way too far. And then once they’re way too far, how do you pull ’em back?

I’m in a group right now – I’m sort of supposed to be the irritant in the room – with VP-level people from social media companies and telecommunications companies, and I keep trying to push them to get out of that bifurcated mindset that regulation is ipso facto constraint. Regulations are just parameters. They’re the rules of the road. In my old soccer metaphor, they’re how you play the game.

Now, I understand the argument: Well, look, the Chinese are going to play a different game under a different set of rules. The Russians are going to play under a different set of rules, et cetera. I say, great – our breakout is whoever finds the sweet spot of protecting civil liberties and maximizing economic opportunity. That’s the AI we need to develop. We don’t need to develop the AI for the authoritarians. We need to develop the AI for the liberal democratic states around [the world], because that’s where your market economies are – and the economies that want to be. So why not get the best? The authoritarians aren’t going to develop that.

So let’s not compete with them on the wrong-headed version of this. Everybody’s got the math; everybody’s got the supercomputing – or most of those who are going to be the leaders in this do. It’s the quality of the data and what you want the data to do in the training environment. Their advantage is that they’re sucking up everything, because they have no civil-liberties parameters. But that’s going to give them a particular trained algorithm that is still all about protecting state power, not individual empowerment.

So to me, it’s about modeling the training in a different way. In a market economy, how do you position and empower individuals within society? That’s a different algorithm than what the authoritarian algorithm is going to be. And that’s going to require us to live and work within some parameters. There’s certain levels of data that we’re not going to get at. That’s just a technical training challenge. And I think we should just take that on and say, “Hey, we can actually have really effective algorithmic agents who actually protect civil liberties and draw a line in terms of civil liberties.”

And we’ve got one example of that. We did HIPAA [the Health Insurance Portability and Accountability Act], the healthcare law, which basically said, we’re not going to allow people to monetize medical data. Everybody kind of came to an agreement: Yeah, that’s not a good [idea]; we don’t want what pills I take for what disease I have to be sold to advertisers and things like that. So the government came in with moderate regulation that carved out protection for individuals on the medical-data side of things. And the world didn’t end.

I think we can have our cake and eat it too. I think we can be extremely innovative in the AI space while at the same time bringing safety and security into it.

Curran: That would be ideal, as they say. But…

Harknett: Yeah. Well, I’m glass-half-full on most of this.

Curran: Yeah, I love that. I love that. Hey, wrapping things up today, Richard: You’ve guided nation-states, our allies, and the United States government on cyber strategy. If you were going to talk to a business – and specifically, let’s not talk about the giant companies with all the resources. Let’s say it’s a mid-size regional retailer, not somebody as big as Target.

What would be the initial steps in a proactive strategy they could take to get their foot in the door, to take that first one or two steps forward?

Harknett: I would say make your CIO and your chief information security officer as important as your VP for marketing and product development. They’ve got to be at the table. They can’t be the guys and gals down the hallway who are supposed to save you money through efficiency while you don’t give them the resources to protect everything – all of your intellectual property. I could give you a laundry list of technical things you could do, but from an organizational business-planning standpoint, it’s about making sure that those you’re entrusting with the cybersecurity portfolio are part and parcel of your business practices.

And then recognizing at the executive level and at the managerial level, it’s not an IT problem.

Cybersecurity is an economic, organizational, behavioral challenge in a technically fluid environment. That’s my bumper sticker. I can’t actually fit it on a bumper sticker – it’s too long. But if you treat it as a technical problem, you’re missing the boat. It’s about organization; it’s about behavior. And so you’ve got to rejigger your organizational flows.

And if you as a CEO don’t know what your anticipation and resiliency plan is in the face of a ransomware attack, then you need to go and find out what that is.

Curran: Yeah. Well, great advice, Richard. I really enjoyed the conversation…

Harknett: Yeah, I did too.

Curran: I’ve been talking with Richard Harknett, director of the Center for Cyber Strategy and Policy and co-director of the Ohio Cyber Range Institute at the University of Cincinnati.

If you’d like to learn more about proactive cybersecurity strategies, check out Focal Point, Tanium’s award-winning online cyber news magazine. We’ve got links to articles in the show notes, or visit tanium.com/p for publications.

To hear more conversations with today’s top business leaders and security experts, make sure to subscribe to Let’s Converge on your favorite podcast app. And if you liked this episode, please give us a five-star rating.

Thanks for listening. We look forward to sharing more cyber insights on the next episode of Let’s Converge.
