In Part 2 of his conversation with SDxCentral’s Matt Palmer, Nokia's Mike Bushong talks artificial intelligence for networking, among other topics affecting today's data centers.
What’s Next is a biweekly conversation between SDxCentral CEO Matt Palmer and a senior-level executive from the technology industry. In each video, Matt has an informal but in-depth video chat with a fellow thought leader to uncover what the future holds for the enterprise IT and telecom markets. The hook: each guest is a long-term acquaintance of Matt’s, so expect a lively conversation.

During the back nine of his conversation with Mike Bushong, VP of data center at Nokia, Palmer dug deeper into the technology topics of what's going on in data centers.

Editor’s note: The following is a summary of what Palmer and Bushong discussed in this second part of their conversation, edited for length. To hear the rest, be sure to watch the full video above. Watch Part 1 here.

Matt Palmer: In part one of our discussion we talked about networking for AI. The flip side of that question is AI for networking. I know you're well-versed in network automation and generative artificial intelligence (genAI) for that. What should people be thinking about concerning AI for networking?

Mike Bushong: So everybody went out and did the same thing first, right? ChatGPT shows up, and everyone's like, I can train an LLM on my documentation, I can provide contextual help, I can provide a natural language interface. Everyone has some version of that in their demo or in their product. Most of them are doing API calls to OpenAI on the back end. As that stuff starts to get monetized and throttled, it'll be interesting to see whether OpenAI is the thing, or whether you have to do something local. I think there are gonna be some interesting commercial decisions there. It depends on how valuable that is to people. If it's a gimmick or a thing they use some of the time, no one's gonna want to pay for it. If it's a thing they use all of the time, it's transformative. Then maybe you see some license-type models.

There are two use cases everybody flocks to. The first one was security: I wanna detect an anomalous flow. I've got billions of flows, and I wanna find the one that's rogue. What I discovered in a couple of previous jobs is that it turns out that's crazy hard to operationalize, because you don't have access to the training data. It's contextually specific, and it changes, so it's not enough even if you could do it once; you gotta stay current. What was an anomalous flow today is not an anomalous flow tomorrow. And how do you evolve? I don't think people thought through the operationalization stuff early on...

The next use case everyone goes to is bad optics. Okay, we're going to detect bad optics. Then you ask, okay, fine, what's next? Sometimes people will get to CPU: if we see CPU spikes. But that's basic thresholding, right? That's not AI, that's linear regression. But okay, it tells you when to do something, so I'll go with you. Okay, fine, what's the next use case? And then you start to get crickets, right? Nobody can really tell you.

Where I think things probably go is the copilot model, which I think is likely to end up dominating, at least for a while. I'm a little bit afraid of Skynet; I've seen the Terminator movies and stuff. So let's say I'm a little bit on the side where I don't want everything operating itself. But what you could do is track commands. Every network has workflows. If you actually trained on those workflows, then you'd have the ability to anticipate what the next command would be.
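(Editor's note: Below is a minimal, hypothetical sketch of the kind of next-command suggestion Bushong describes, built from nothing more than frequency counts over logged command sequences. The commands, workflows and helper names are illustrative assumptions, not Nokia's implementation.)

```python
# Illustrative only: suggest the next troubleshooting command based on
# frequency counts over previously logged workflows. All command names
# and workflows here are hypothetical.
from collections import Counter, defaultdict

# Hypothetical logged workflows: command sequences senior engineers ran.
workflows = [
    ["show log", "show interfaces", "show bgp summary"],
    ["show log", "show interfaces", "show lldp neighbors"],
    ["show log", "show system cpu", "show processes"],
]

# Count which command most often follows each command.
next_counts = defaultdict(Counter)
for wf in workflows:
    for current, following in zip(wf, wf[1:]):
        next_counts[current][following] += 1

def suggest_next(command):
    """Return the most frequently observed follow-up command, if any."""
    counts = next_counts.get(command)
    return counts.most_common(1)[0][0] if counts else None

print(suggest_next("show log"))         # "show interfaces"
print(suggest_next("show interfaces"))  # e.g. "show bgp summary" (two follow-ups tie here)
```

The same counts could just as easily be keyed on the triggering syslog message type rather than the previous command, which is closer to the flow Bushong outlines next.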
Bushong continued: I see a syslog of a certain type, maybe I'm gonna start my troubleshooting with a command to see what's going on. I see you've issued that command, do you wanna issue this next one? What you start to do is take workflows that typically live in the heads of your most senior engineers, codify those, and use the software to capture them. That's valuable for two reasons, right? Or maybe three. One: getting it out of people's heads is a great way to mitigate the risk of people aging out, leaving, winning the lottery, whatever. If you can take the skills of your 20-year vet and put them in the hands of your two-year new college grad, I think you start to get more capability. And if I can codify the workflows and the knowledge required to manage the architecture or the infrastructure, I am no longer wholly dependent on certification as a proxy for capability.

Watch the full video for the rest of the conversation between these old friends and colleagues, who also happen to be tech visionaries... you can watch Part 1 here.