The EDGECELSIOR Show: Stories and Strategies for Scaling Edge Compute

The Secret World of Windows IoT: Demystifying Edge Computing with Microsoft's Joe Coco

August 30, 2023 Pete Bernard Season 1 Episode 5

We're taking a deep dive into the fascinating world of edge computing with none other than Joe Coco, the Group Partner Program Manager for Microsoft's Edge investments. With an impressive 25-year tenure at Microsoft, Joe shares his unique experiences and invaluable insights about the evolution of edge computing and the role Microsoft is currently playing in this dynamic space. We're talking about everything from industrial equipment to self-checkout devices to tiny sensors, and how Windows IoT is revolutionizing the value proposition of Windows in this context.

Imagine if you could control edge devices from the cloud, with the right software and policies in place. How about unlocking new business opportunities, optimizing production processes, and replicating these processes in other factories? This isn't just a vision, but a reality that Microsoft is making possible with its Azure IoT investments. Joe helps us peel back the layers of this groundbreaking technology, shedding light on the layered security between the cloud and edge devices, and the importance of having a unified way of writing code and orchestrating workloads.

We also dive into the must-have conversation about business model innovation in the realm of edge computing. This involves constructing models that meet customer needs, integrating products to create a value proposition, and the transition from per-unit to subscription-based models. And of course, what would a discussion about edge computing be without touching on the implications of AI, such as network traffic optimization and operational challenges? So, buckle up for an enlightening ride into the heart of edge computing, its applications, and its future!

Want to scale your edge compute business and learn more? Subscribe here and visit us at https://edgecelsior.com.

Speaker 1:

When you ask people what Edge Compute is, you get a range of answers: cloud compute and DevOps, with devices and sensors, the semiconductors outside the data center, including connectivity, AI and a security strategy. It's a stew of technologies that's powering our vehicles, our buildings, our factories and more. It's also filled with fascinating people that are passionate about their tech, their story and their world. I'm your host, Pete Bernard, and the EDGECELSIOR Show makes sense of what Edge Compute is, who's doing it and how it can transform your business and you. So let's get started. Well, we can get started. I wanted to welcome Joe Coco here. And Joe, correct me if I'm wrong, you are the Group Partner Program Manager for... well, actually, why don't you explain what your title is, because I'm not going to get it right.

Speaker 1:

Joe Coco from Microsoft. But why don't you give us a little background on your title and kind of what's currently keeping you busy?

Speaker 2:

Sure. Hey, well, thanks for having me, Pete. My responsibility is I lead the product and program management team for a couple of different areas, focused on our Edge investments at Microsoft. So, specifically, I lead the Windows IoT operating system, along with a set of related Edge services for processing data, collecting data and transmitting data to the cloud, and we can get into what the goal of all that is in our discussion.

Speaker 1:

Cool, cool. And so, full disclosure, we've known each other for a while, so sometimes I'll have guests on here that I've never met before, but in your case, obviously, we've met many times. So give me a little background, or give our listeners a little background, on how you got to Microsoft. You've been at Microsoft, what, 20 years now or something?

Speaker 2:

Yeah, actually 25, so long time.

Speaker 1:

You get one of those big, what are those crystal things? Giant crystal.

Speaker 2:

Yeah, giant crystal. It has its own carrying case.

Speaker 1:

It's pretty impressive, pretty heavy as well. Fantastic, 25 years, so that's a good chunk of time. I remember when I started at Microsoft, my longest stint up to then had been nine years at Phoenix Technologies, which I thought was forever, and then I did a couple of years in one place and three, four years in another. And when I got to Microsoft, I thought, you know, I'll be here for some period of time, and then, before I knew it, it was 18 years later.

Speaker 1:

So I don't know, is that it for you? Did you come into Microsoft thinking, oh, this is my lifetime gig? Are you surprised that you've been there that long?

Speaker 2:

Yeah, no, I didn't intend to be at Microsoft as long as I have been. I had worked at three different companies before Microsoft, on the East Coast, in the Northeast. And you know, I was not really a Microsoft person back then. I wasn't a Windows person; I was doing things with Unix at the time. And it turns out that I ended up at a startup before Microsoft.

Speaker 2:

It was right at the beginning of the browser, really the beginning of the big internet wave, and Microsoft decided to buy the company. And the deal was dependent on most of us technical people moving out to the Pacific Northwest. I was a developer at the time, and so a bunch of us, maybe 20 of us, moved across the country. And I thought I'd be at Microsoft three to five years, then go back to what I was familiar with, go back to friends and family. But it was a good gig, right? Good living in the Pacific Northwest, and Microsoft had a lot of opportunity and a lot of interesting things going on, and so I just kind of decided to stay.

Speaker 1:

It must have been kind of fun to have that cohort, like the whole company, move to the Pacific Northwest. I mean, you were all transplanted sort of en masse, right, from one side of the country to the other, culturally and socially and professionally. I mean, it was pretty disruptive, but it must have been kind of fun too, I think.

Speaker 2:

Yeah, it was a very interesting and unique experience. I will say that, yeah. And then over time, people got dispersed.

Speaker 1:

Sure.

Speaker 2:

Things happened. I think I might be the only one left.

Speaker 1:

Well, 25 is a good chunk of time. Which part of the East Coast were you in? I forget.

Speaker 2:

I was in Boston at the time, working in Cambridge. The startup that was acquired was called Firefly. It was one of the leading internet startups of its time, which was a fascinating experience in and of itself. It's a real amazing experience to go through such a transformation and to be part of that.

Speaker 1:

Yeah, no, whenever I'd meet early-in-career folks, especially at companies like Microsoft, I'd say it's important for people to get that startup experience too. It's a different world where basically you're doing everything, like there's no one else.

Speaker 1:

You're doing everything, you're taking out the garbage, you're writing the code, whatever it takes, and there's a little drama on funding and all that good stuff, and everyone's kind of watching their costs like you should. And if you don't go through that experience and you just end up at a big company, then sometimes you don't have that empathy and don't have that context about how this really needs to work, and everyone needs to work together. So, yeah, I was at a startup. We started with a few people and got to about 75 people, and unfortunately, we were not acquired by Microsoft.

Speaker 1:

I did work with Microsoft a little bit on some color correction stuff. But yeah no, it's good to have that context, but now, of course, that's pretty far in the rear view mirror for you, 25 years on.

Speaker 2:

Yeah, the interesting thing I will note, though, about that time, to your point about what it's like at a startup: the thing that I didn't appreciate until I was at Microsoft for a few years is the ability to have a global impact with the products you're building, and the reach and the scale that you can attain because of the infrastructure at many large companies, whether it's the field sales organization or the legal team support or the marketing support. When you're at a startup, to your point, you don't have those things, right?

Speaker 1:

No, you can look out the window. That's about as far as you can reach.

Speaker 2:

Getting adoption of your product is really hard because you just don't have that scale, and also you don't have customers who will trust you. They don't know who you are as a startup; they don't have a track record working with you. You know, at Microsoft we've been able to build that over many years. We have many customers and partners. Yeah, you get that huge ecosystem.

Speaker 1:

Yeah, no, I find it interesting, somewhat ironic. I would find people at Microsoft who kind of wanted to do their own thing in their own product group, and actually not leverage the rest of Microsoft. You know, they're like, well, we're going to show the rest of Microsoft how to do this right by doing it ourselves, with no dependencies. And I always thought, why? You're really not getting it, because you're not leveraging

Speaker 1:

the reason you're at Microsoft, which is that you have this huge worldwide field sales organization and all this brand equity and all this trust, like you said. And the whole point of being in a big company is taking these ideas and then, you know, using the company to scale them out. Which, like you said, as a startup, these days there are more distribution platforms, more ways maybe to flatten that out a little bit than there used to be, so you can get a little more worldwide presence as a startup, depending on your business model. But if you're going to be at a big company, you've got to figure out how to scale with the company. That's the whole point of being there, I think.

Speaker 2:

But right, and to take advantage of all the other things other people in the company are building, that you can build on top of or alongside of right.

Speaker 1:

Yeah, yeah, accountabilities and dependencies. That's a big one that people end up learning at some point in their career, but it's probably better to learn about that sooner rather than later.

Speaker 2:

Right, yeah, if you're going to pay the tax of being at a large company that moves more slowly than the startup, at least take advantage of the benefits. Yes, exactly, exactly.

Speaker 1:

So you mentioned you're busy with a lot of things on edge. You know one of the topics is around this term edge computing. Even that it's a bit of a Rorschach test, right? Because you say it to somebody and, depending on who you're talking to, they ascribe different meanings to it, right? You know the network edge or the tiny edge, or you know sensor edge or heavy edge. When someone says edge computing or someone asks you at a cocktail party, what is edge computing? What is your answer?

Speaker 2:

Yeah, it's a very interesting question. I think the easiest way for me to describe it is to tell folks it's the things they encounter in their everyday life and they don't even realize they're encountering them. So, whether it's an MRI in a hospital, whether it's a self-checkout device at a grocery store, whether it's a digital sign in an airport. Of course, a lot of us don't see the industrial equipment, but that's a key part of it, right? But if you work in a warehouse and you're working with robots, or in factories, all the manufacturing equipment, the industrial control systems, the sensors, the robots, all of that collectively.

Speaker 2:

You know that represents the edge. Of course, there's a lot of other pieces. There's little sensors or servers that you know collect and process data. You know there's all kinds of different devices. There's gateways that send data to the cloud, et cetera.

Speaker 1:

Right, yeah, there's an old taxonomy out there, you know. You can call it tiny edge, right, or Cortex-M based. There's light edge, sometimes people say medium edge, sometimes they say heavy edge. But, like you said, there's all these kinds of workloads or compute functions that happen outside of the cloud, right, that get jobs done, basically get things done.

Speaker 1:

I think when I think of edge, I also think that it's inherently connected to the cloud as well. It's very rare to find things that are not connected to the cloud. Now we think about cloud-plus-edge and cloud-to-edge solutions, whether it's just for software updating or AI model updating or telemetry back or something like that. So there's usually some sort of connection. But it's all the stuff that's happening outside the data center, right, on these different types of devices, like you said.

Speaker 1:

I mean, self-checkout, wow, that's something that we all deal with on a daily basis. But you think back 10 years ago, self-checkout was kind of a crazy phenomenon. I mean, it was like science fiction, right, and now it's just, you know, de rigueur. So it's interesting, the things that we encounter in everyday life. And so when people think of Microsoft, obviously there's Azure and the giant hyperscale cloud. And then they think of Windows, because everyone has Windows PCs, although I am using a MacBook Air M2 right now, so I will defer. But Windows IoT is an interesting sort of hybrid, right? It's taking a lot of the value prop of Windows and the stability of that, but it's putting it in the context of some of these edge scenarios like you were talking about, right?

Speaker 2:

Exactly.

Speaker 2:

Yeah, I mean, we've been in what I will call the embedded business at Microsoft for, you know, as long as I've been at the company, more than 25 years now actually. And so we continue to be committed to that market, and it's a fascinating market, and now it's really exploding with, like you say, the connection to the cloud.

Speaker 2:

Essentially, our goal is to bring Azure to the edge, and to bring the cloud-native, modern programming metaphors, you know, microservices, containerization, Kubernetes, GitOps, et cetera, which really started in the cloud, to the edge. And so when we think about Windows IoT today, or actually more broadly what we're doing with our edge offering, we're embracing all of that, whether it's Linux, whether it's Windows; all these things are often virtualized, one running on top of another. So, as many people know, we have our own Linux distribution called Mariner that we use internally. I think we may have renamed it now to Azure Linux, in fact, as the official name, so I should probably use the new name.

Speaker 1:

Um, but we can edit. We'll edit that out.

Speaker 2:

Yeah, we use it in the cloud, but we also use it for our Kubernetes offering, which is AKS. So it's a Linux implementation of the CNCF Kubernetes standard, and we use that in our cloud. But we've also brought it to the edge, and we brought it to our server-class devices on the edge, whether that's Windows Server or Azure Stack HCI. But we've also brought a smaller version of that to Windows IoT-class small devices that cost maybe $200 to $3,000. And so running Linux workloads on top of Windows is a very common scenario now.

Speaker 2:

And of course, at the same time, we're embracing the Linux host operating systems that are already out there as well and connecting them to Azure, because that's a huge part of the market. And so, you know, we're operating system agnostic. I happen to be responsible for Windows IoT, but all the work I'm doing in the other areas that are in my domain, you know, I mentioned the data services earlier, that's all Linux-based investments that run on Kubernetes, Linux containerized services.

Speaker 1:

Right, right. Some people may not totally grok what you were saying, and not to get too technical here, but there are kind of instances of operating systems that can run inside of a hypervisor framework, right? And so you can run Linux workloads just like you'd run on a bare-metal Linux box, but you're running it as a workload even though it's running sort of inside of Windows, right? And so you can have all kinds of virtualization platforms.

Speaker 1:

You can run an RTOS in an instance, you can run Linux in an instance. I'm sure you can even run Windows in an instance, which is actually what Azure is, right? You're spinning up Windows VMs in the cloud. And so that's kind of an interesting thing: as there's more horsepower in some of these devices on the edge, now you're looking at pretty cool virtualization schemes where, in this case, you get Windows as kind of your base OS that's managed and updated and secure, and then you can plop other workloads on top in different environments, right? And does that include an RTOS too? Can you run an RTOS environment in Windows?

Speaker 2:

You know, typically we don't see customers running an actual RTOS operating system.

Speaker 1:

Like deterministic, yeah. Right, right, because it's not deterministic. I mean, you can't really get that.

Speaker 2:

Yeah, well, there's a bit of nuance here. So, in the case of Windows itself, there are a number of what we call soft real-time features in the operating system, and a number of our customers who build very interesting, complex equipment take advantage of those features. They're not very well known, but they exist and they work quite well.

Speaker 2:

But people who typically use an RTOS itself, they put it into, typically, a microcontroller-based type of device that has that harder real time or determinism that you talk about, and so they typically run that, quote, bare metal on a small device very often. Now, actually, what you see is a lot of this equipment, let's say it's a piece of equipment that makes semiconductors, or an MRI machine, or a robot, they often run multiple operating systems, right? So you might have Windows or Linux, like Ubuntu Linux or Windows IoT, running as the master brain of the thing, or maybe the user interface of the thing, and then you have multiple microcontrollers that are running an RTOS side by side with it. And so that's a very common environment.
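The "master brain plus microcontrollers" split described here can be sketched in a toy model: a supervisory application on the Windows or Linux side polls RTOS nodes over a simple framed protocol. Everything in this sketch, the frame layout, field names and status codes, is invented for illustration; real equipment uses vendor-specific buses and protocols.

```python
import struct

# Hypothetical frame an RTOS microcontroller might send upstream each
# polling cycle: node id (u8), status code (u16), sensor reading (f32),
# little-endian. The layout is invented for this illustration.
FRAME_FMT = "<BHf"

def encode_status(node_id: int, status: int, reading: float) -> bytes:
    """What an RTOS node sends upstream in this toy protocol."""
    return struct.pack(FRAME_FMT, node_id, status, reading)

def decode_status(frame: bytes) -> dict:
    """What the supervisory 'brain' application does with each frame."""
    node_id, status, reading = struct.unpack(FRAME_FMT, frame)
    return {"node": node_id, "ok": status == 0, "reading": round(reading, 2)}

# The brain keeps the latest report from every MCU it supervises.
reports = [encode_status(1, 0, 21.5), encode_status(2, 3, 98.6)]
latest = {decode_status(f)["node"]: decode_status(f) for f in reports}
```

The point of the split is that the MCU side stays tiny and deterministic, while the brain side aggregates, renders a UI, and talks to the cloud.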

Speaker 1:

Yeah, yeah, actually. So just to kind of riff on that a little bit, you mentioned that Windows IoT now supports Kubernetes, kind of containers, right? That's the AKS. Is that the AKS?

Speaker 2:

Yeah, AKS, so exactly. We have AKS-HCI, which is our Kubernetes for larger, more powerful, kind of server-class infrastructure. And then we have AKS Edge Essentials, which is more for the lightweight, kind of Windows IoT-class, what I would say client-class devices. This is typically running the hardware you might have in a laptop, right, but in some sort of industrial equipment or hospital equipment or retail equipment.

Speaker 1:

Hundreds of dollars, or maybe single thousands. Yeah, exactly.

Speaker 2:

Yeah, and they both support Kubernetes, or K8s, but we also support K3s, which is a smaller, lighter-weight version of Kubernetes, also a CNCF project. And so we embrace both K8s and K3s.

Speaker 1:

Yeah, and that's kind of the frontier. I mean, it's one thing to orchestrate Kubernetes workloads and things in the cloud, and now you're doing it on a heavy edge. But the frontier is how light can you go in terms of extending that Kubernetes-based orchestration framework. I mean, if you want to kind of cloud-DevOps all the way to the edge, the edge of the edge, right, how low can you go? And it sounds like K3s currently is kind of the edge of the world. Or is there, I know you're working on, there's this thing called Akri. I don't know if you want to talk about Akri.

Speaker 2:

Yeah, there's a few other things going on. So right now, you're right, I mean, Kubernetes really started in the cloud and got a lot of traction in the cloud, and then it slowly moved to the server-class devices on the edge, or these kind of onsite data center machines, maybe you'd call them, and then it moved to these lighter-weight embedded devices, sometimes people call them light edge devices. But there's also other devices out there that can't run that full Kubernetes platform as we know it today, and there are a couple of efforts to integrate those smaller microcontroller-based devices into these Kubernetes environments, so you can run containerized Linux workloads on the edge and also interact with these other devices, whether they're sensors or just older equipment, legacy equipment that you're not going to upgrade with new software. And so, the two interesting things there: one, as you mentioned, is Akri.

Speaker 2:

Akri, which is also a CNCF project. What it does is it allows you to integrate, let's say, a sensor that's not running Kubernetes into a Kubernetes cluster. It creates a Kubernetes resource for that device so that your workloads running on the Kubernetes cluster see it as any other Kubernetes resource, see it like a GPU, as an example. We're not talking about GPUs here, but GPUs are exposed in kind of a similar way. And so you can kind of interact with it and essentially do a protocol translation to whatever the native protocol is of that leaf node device.

Speaker 1:

Right, like an ONVIF-based camera or something like that.

Speaker 2:

Exactly. An ONVIF or RTSP-based camera can be exposed as a Kubernetes resource to a Kubernetes cluster, so that workloads running on the Kubernetes cluster can get video streams from that camera, or tell it to zoom in, or whatever it might be.
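The protocol-translation idea can be sketched as a toy model. This is not the actual Akri implementation (which works through discovery handlers and broker pods); the class and method names below are invented purely to illustrate wrapping a device's native protocol behind a generic resource interface:

```python
# Toy model of the Akri idea: a device that speaks its own native protocol
# (here, a pretend RTSP camera) is wrapped so that consumers see only a
# generic "resource", the way Akri surfaces leaf devices to a cluster.

class RtspCamera:
    """Stand-in for a leaf device speaking its native protocol (RTSP)."""
    def __init__(self, url: str):
        self.url = url
    def rtsp_describe(self) -> str:
        return f"RTSP/1.0 stream at {self.url}"

class DeviceResource:
    """Generic, protocol-agnostic view a workload would consume."""
    def __init__(self, name: str, read_fn):
        self.name = name
        self._read_fn = read_fn
    def read(self) -> str:
        return self._read_fn()

def expose(camera: RtspCamera, name: str) -> DeviceResource:
    # The "broker" step: translate the generic read() call into the
    # device's native protocol, so consumers never see RTSP details.
    return DeviceResource(name, camera.rtsp_describe)

cam = expose(RtspCamera("rtsp://10.0.0.5/stream1"), "camera-aisle-3")
frame_info = cam.read()
```

A workload scheduled against `camera-aisle-3` only ever calls `read()`; the RTSP specifics live entirely in the translation layer.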

Speaker 1:

Yeah, and the Akri, is that a Kubernetes resource interface? Is that how it was? I don't remember the exact acronym.

Speaker 2:

Yeah, well, that's part of it. I believe it might also be a Greek island or something like that. I can't recall exactly, so I'd have to look it up. I can't remember how we came up with the name.

Speaker 1:

Okay, no, I thought it was just... I remember, and I'm going to date myself, but working at Phoenix, we used to have all these different CMOS editors, and the popular one was YACE, Y-A-C-E, which was Yet Another CMOS Editor, exactly. So I thought maybe this would be like YAKRI, Yet Another Kubernetes Resource Interface.

Speaker 2:

Yeah, it took a while to get everybody to coalesce around a name.

Speaker 1:

That's a nice name. Four letters, it's rational.

Speaker 2:

So that's one way of interacting with these smaller devices, or devices that can't run Kubernetes. Another effort underway is the WASM effort, the WebAssembly effort that's going on. Really, I'm not an expert in it, but I would describe it as being able to have a smaller container infrastructure. So when you have containers today, they're dockerized; basically, you're bringing part of the Linux operating system along with you when you bring your workload. And there are other interesting ways to do that that are smaller and more lightweight, and that's what this WebAssembly is about. And so you'll be able to have a Kubernetes cluster that's running these WebAssembly containers, which are smaller and lighter weight.
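One rough way to picture the packaging difference being described: a conventional container image bundles the application plus a slice of Linux userland, while a WebAssembly module ships roughly just the compiled application and leans on the host runtime for the system interface. The numbers below are made up purely for illustration; real image and module sizes vary enormously.

```python
# Toy size accounting for the two packaging models. All figures are
# hypothetical and exist only to show where the bulk comes from.

def oci_image_size(app_kb: int, base_userland_kb: int = 80_000) -> int:
    """A conventional container ships the app plus base-image userland."""
    return app_kb + base_userland_kb

def wasm_module_size(app_kb: int) -> int:
    """A WebAssembly module ships roughly just the compiled app; the
    host runtime supplies the system interface."""
    return app_kb

app = 1_500  # hypothetical 1.5 MB of application code
container = oci_image_size(app)
module = wasm_module_size(app)
ratio = container / module
```

The shape of the arithmetic, not the exact figures, is the point: the container's footprint is dominated by what it carries along, which is exactly what the WASM approach tries to leave behind on the host.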

Speaker 1:

Yeah. So this idea of portable containers for software has been around for a long time. I mean, go back to the Java days too. We used to have J2ME and things like that, where ideally you would have a programmatic interface across all these disparate platforms, and your code would be, I think they used to say, write once, run anywhere, which never really worked too well. But the concept was good. It was being able to find some sockets that you could talk to and put your code onto.

Speaker 2:

Right.

Speaker 1:

And it'll be interesting to see how that plays out.

Speaker 2:

Right, yeah. You were talking earlier about how all these edge devices are connecting to the cloud, and I think when people hear that, they think, well, they're just directly connected to the cloud, or, I don't buy that all these devices are going to directly connect to the cloud. Really, most of them won't be directly connected, but they will be connected indirectly, through these network topologies sometimes referred to as ISA-95 networks, or Purdue network models, where basically you have devices in different layers with security boundaries between them. But with the right software and the right security policies, those devices that are behind the security boundaries can be targeted by the cloud, can be controlled. You can deploy workloads, you can update devices.
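The layered ISA-95 / Purdue-style topology can be sketched as a toy model, where a command from the cloud reaches a low-level device only by hopping through a gateway at each security boundary, each layer accepting traffic only from the layer directly above it. The layer names and relay logic here are a conceptual illustration, not any real product's mechanism.

```python
# Hypothetical layer stack, highest (cloud) to lowest (device). Real
# Purdue-model deployments name and number their levels differently.
LAYERS = ["cloud", "dmz", "site-operations", "control", "device"]

def relay_command(command: str, target_layer: str) -> list[str]:
    """Return the boundary-by-boundary hops a command takes from the
    cloud down to the target layer; it is never delivered directly."""
    if target_layer not in LAYERS:
        raise ValueError(f"unknown layer: {target_layer}")
    hops = []
    for upper, lower in zip(LAYERS, LAYERS[1:]):
        hops.append(f"{upper} -> {lower}: forward '{command}'")
        if lower == target_layer:
            break
    return hops

# Deploy a workload to a device sitting four boundaries below the cloud.
path = relay_command("deploy workload v2", "device")
```

The takeaway is that "targeted by the cloud" does not mean "directly reachable from the internet"; each boundary stays closed except to its immediate upstream neighbor.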

Speaker 1:

Yeah, yeah, the ideal. The ideal is that all of the resources, whether those are sensors or cameras or gateways or robots or whatever, all look like resources from a cloud perspective that can be targeted with the right workloads at the right equipment at the right time, with the right networks and all that good stuff. So that's the dream, or that's the goal: to be able to develop and deploy solutions that look as homogeneous as possible from the developer or the operator perspective, right?

Speaker 2:

Right, exactly. And you know, I think when we first started this journey of bringing the cloud down to the edge several years ago, primarily with our Azure IoT investments, we really thought about it from the perspective of, let's get these devices connected so we can see what's going on with them, get some real-time insights, monitor them, right? And that's been very valuable for a lot of scenarios.

Speaker 2:

But as we go along, we want to add more value by now starting to optimize those assets on the edge, or optimize that production process to deliver better products to customers, or improve the quality coming off the assembly line, or whatever it might be. And then, beyond that, we get really into more of the full digital operations goal, which is, how do you identify new business opportunities? How do you reduce energy consumption? How do you replicate the processes at one factory in another factory, really to allow business transformation? And so it's a journey that I think we're all on collectively, our customers, our partners and us.

Speaker 1:

Right, right. Yeah, I saw some study, I think it was from Telstra, that they published, and it was sort of like, phase one is migration: I've got on-prem workloads and I put them in the cloud. Easy, and not easy, but you know, that's pretty basic stuff, right, the lift and shift.

Speaker 2:

Yeah, typical IT lift and shift, yeah.

Speaker 1:

Yeah, lift and shift, right, get it onto an elastic cloud resource, fantastic. Then the next part was, okay, I've got all this stuff out there, I need to keep an eye on it and measure it. So just get data into the cloud, right. So almost like a one-way.

Speaker 1:

You know, a data pump up from sensors and stuff. And now we're talking about, well, actually, these devices on the edge are a lot more capable. The semiconductors are getting more powerful, the networks are faster, so these things are a lot more capable of doing kind of low-latency, real-time stuff where the action is, and they really need to be considered a resource as part of an overall solution. And so that's where we say, okay, let's take all the cloud DevOps that people are used to, and I saw stats somewhere that there are something like 10 times as many cloud devs as there are embedded devs. I don't know if it's true, I don't have a source, but I was told that. So, hey, let's standardize on that so that people can build solutions including, like you said, the point of sale, right?

Speaker 2:

Yeah, I mean, customers are telling us they want the same metaphors whether they're using them in the cloud or on the edge. You're right. From a developer efficiency standpoint, having one way of writing code, one way of orchestrating workloads, is incredibly valuable, whether you're doing it in the cloud or on the edge, and creating that continuum from the cloud to the edge is super important to customers. But it's still harder to do on the edge today. So a lot of our investments, whether it's in Azure IoT, whether it's in Kubernetes with AKS, whether it's in Windows IoT, and a number of other things we're doing, that's all around, how do we make it easier for customers to use these cloud paradigms on the edge and really create this seamless fabric from the cloud all the way down to the edge? So that's where we're focusing our time, and it's an interesting time to be doing it.
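The "one way of orchestrating workloads" point can be made concrete with a sketch: because a cloud cluster and a K3s-class edge cluster both implement the same Kubernetes API, one Deployment spec can target either; only the cluster context (and perhaps the replica count) changes. The image and context names below are hypothetical.

```python
# One Kubernetes Deployment spec, built once, targeted at either a cloud
# cluster or an edge cluster. Both speak the same API, so only the
# context (which cluster you apply it to) differs. Image and context
# names are invented for illustration.

def make_deployment(name: str, image: str, replicas: int) -> dict:
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Same workload definition, two targets; the edge just runs fewer replicas.
cloud = make_deployment("telemetry-agg", "contoso/telemetry:1.4", replicas=5)
edge = make_deployment("telemetry-agg", "contoso/telemetry:1.4", replicas=1)
# In practice you'd serialize this to YAML and apply it to each cluster,
# e.g. with kubectl against a cloud context and an edge context.
```

The pod template, the part developers actually write and test, is byte-for-byte identical across both targets, which is the continuum being described.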

Speaker 1:

Yeah, no, it sounds fascinating. So what else is keeping you busy these days? That sounds like it would keep me busy, but what else is keeping you busy?

Speaker 2:

I would say there's some really interesting thinking we're doing around business models: how do we align with customer business models, so that our business model meshes well with their business model for some of these things? So that's an area where we're spending a lot of time talking to customers and thinking about how to structure things to make them cost-effective for them, to really help them, again, whether it's reduced energy consumption or improved products, whatever it might be; help them with their own business models. Because, as you know, we work with a ton of partners, so they take our solutions to their customers and they build on top of what we provide. And so how do we make sure that whatever we're providing from a business model standpoint works for their business model as they sell it to their customers?

Speaker 1:

Yeah, no, it's interesting that you bring that up, because I've had a number of shows here where I've talked to folks about business model innovation. Whereas maybe a few years ago people were talking about the tech and the chips and the networks and 5G, now it's like, well, what's the business model? How do I get to more of a usage-based model? How is the ecosystem incentivized? So there's a lot of discussion around, how do I deliver my part of the solution, because everyone's always just a part of the solution, so that it fits inside of this ROI envelope for the customer. So at the end of the day, a customer's got a problem, it's costing them X millions of dollars, and they want to solve it. So the business model has to meet the customer where they're at, to a certain extent.

Speaker 1:

So how they're paying for things, how they're actually even running their business, which is increasingly turning into much more of a SaaS-based world. Frankly, everything is more usage-based and on-demand these days; that's how people are used to paying for things. So we're seeing a lot of business models shifting, maybe torquing a little bit, trying to fit into this new framework.

Speaker 2:

Right, exactly. As we think about that business model innovation, part of it is product innovation to integrate with the business model. In other words, what are the key things we want to integrate together, and how do we construct a business model around that? Because the biggest problem customers have is that a lot of this stuff is just too darn complicated to do. How do I get my devices connected? How do I traverse the security boundaries on the network? How do I use Kubernetes? How do I make it small and fast? How do I build my solution in this new world? I've got legacy applications too, by the way, that run natively, let's say, in Windows. How do I run those on the same machine where I'm running a Linux VM with Linux containers? Bringing all of it together, and then building your own application and your own business value. When we integrate these pieces, how do we think about that integration as part of the broader business model as well? Business models only make sense if they can be related to the product value you're creating.

Speaker 1:

Yeah right, I don't know if you listen to Scott Galloway. I'm a big fan of Scott.

Speaker 2:

Galloway.

Speaker 1:

I was going to say Scott, I don't know him. Scott, yeah. But one of the things he talks about is the idea of going from a per-unit business model to a subscription model, where you can't just take the price and divide it by 12 and charge them on a monthly basis. There's a lot more involved there, especially when it comes to software. There's a lot more engineering involved in the architecture, and there has to be product truth that supports the business model. Exactly.

Speaker 1:

That's a key thing: you can't just go in there and say, oh, let's just divide by 12 and make it a monthly subscription.

Speaker 2:

Right.

Speaker 1:

So it's good that you guys are thinking about that. Yeah, it's a theme. I think a lot of folks who have been in this business a long time, who have gone from lift and shift to IoT to now edge computing, are maybe a little more attuned to making sure the business model works. The P&L has to pencil for these investments, because these investments can be huge and can take a long time, and I think people really want some confidence that they're actually, at the end of the day, going to not only solve their problem but do it in an ROI framework that makes sense.

Speaker 2:

You know, the other thing that we're spending a lot more time thinking about is AI and how AI will change edge computing, but also, in general, how we bring AI across this continuum, again from cloud to edge, in a way that's consistent and where we're doing the right things in the right place at the right time. So that's obviously a big push. Everybody knows and has heard about what we're doing with ChatGPT and other investments. So it's really about how we leverage these large language models to improve efficiency in manufacturing facilities or quality in hospitals, or whatever it might be. That's kind of the next big thing that we're really spending a lot of cycles on.

Speaker 1:

Well, we've seen a little bit of that in networking. I know there's more use of AI and predictive modeling for network traffic, routing packets in the right way at the right time for best-path optimization. So I can imagine that once people get systems in place where there are lots of different workloads running on lots of different devices, with lots of different network variables, AI can really help with the orchestration of all that in some form, right? Whether that's transformer neural networks or convolutional neural networks or something else, I don't know; that's beyond my pay grade. But we're talking about some pretty complex issues, and rather than kind of shipping an engineer in the box, hopefully there are going to be some AI capabilities that help people get through some of these operational challenges, because it's not only building it and shipping it, but then deploying it and maintaining it.

Speaker 1:

There's the support issue, and maybe some people don't realize that when some of these solutions get deployed, they're deployed for a decade or more. It's a long time, and sometimes they're in pretty unfriendly environments. You can't just wander out there with a USB stick and update something. That's the other thing people don't always think about. When you buy a PC, it's two or three years, you update it a few times, no big deal. Or your Xbox. These kinds of solutions are not only sometimes mission-critical but also designed to be deployed for an extended period of time, and the service and support is actually part of the business model. It has to be part of the business model as well.

Speaker 2:

You're absolutely right. These solutions are typically in place for many years. They get optimized for the tasks they're doing, some sort of chemical process control or something, and you don't want to change those once they're working efficiently. You want to slowly improve them, maybe slowly optimize them by gathering data and learning, and maybe there's some AI involved in that. But the equipment's there for many years, and the workloads might get updated from a security standpoint or get optimized, but the task that it's doing doesn't change.

Speaker 2:

Overall, these things don't get replaced very often because, like you say, they're mission-critical, whether it's a life-or-death scenario or just business continuity. Like your manufacturing facility: if something goes offline for even minutes, you're losing a lot of money. It's very disruptive. That is a huge part of the value proposition I think we try to bring: long-term support, long-term servicing, security. Increasingly, governments and industry bodies are requiring devices to be updated from a security standpoint to prevent catastrophic failures that can bring down, let's say, a mass transit corridor or disrupt energy delivery, or whatever it might be. With all the rogue actors that are out there, this is becoming increasingly important, and governments around the world, along with industry bodies, are starting to regulate security updating. That's also a very big focus of ours. That whole regulatory support is certainly something we think we can bring a lot of value around.

Speaker 1:

Yeah, no, for sure. I think about the regulatory requirements around that kind of manageability and updateability. I mean, we've seen so many solutions that are sort of flash-and-forget, and maybe that worked 10 years ago, but these days, as you were saying at the top of the show, these things are so integrated into our daily lives, whether we even realize it or not. There is, unfortunately, a lot more sophistication in some of the bad actors out there, so the ability to not only have the right security architecture and protocols, which is kind of a whole other podcast, but to update and even take these things offline proactively is going to be super important. Good. Well, Joe, we're running out of tape. No, I'm just kidding.

Speaker 2:

But no, this has been really fun. What's tape?

Speaker 1:

Yeah, exactly. No, it's been great chatting with you and reconnecting, and I appreciate your time. Stay busy, stay healthy.

Speaker 2:

Hey, Pete, thank you. You too. It's been fun chatting with you.

Speaker 1:

Thanks for joining us today on The EDGECELSIOR Show. Please subscribe, stay tuned for more, and check us out online to learn how you can scale your edge compute business. Make it fast.

Exploring Edge Compute and Microsoft's Role
Exploring Edge Computing and Windows IoT
Connecting Edge Devices to Cloud
Innovation and AI in Edge Computing