By 2023, everyone will be working in the fourth dimension, say analysts at OSIsoft. A four-dimensional model is a 3D simulation cross-referenced with time. Digital twins, a recent development enabled by the Internet of Things, are essentially the first 4D models. We spoke with Richard Beeson, CTO at OSIsoft, to learn more about this shift from 3D simulation to 4D.
DW: First of all, tell me a little bit about yourself and OSIsoft.
Richard Beeson: Okay, so a little about me. Well, actually the two stories are very tightly interconnected. [The company] was just about to celebrate the 40th anniversary of its founding, and I just celebrated my 30th year with the company. So it’s been an interesting journey. OSI is the maker of what we call data infrastructure: the information management layer for what is typically operations data. It used to be DCSs, PLCs, everything in the control world. We span that whole space, and then we take that up through the plant and across the enterprise.
As we’ve seen the evolution of IoT, we are now finding that our customers have a second network, really, this IoT world. One of the things they do with us is stitch those two worlds together into one coherent, consistent operational view. A lot of these IoT sensors, while they’re not intimately part of the control world, can have a big impact on how people understand what the control world is trying to control. So we help stitch all of those together. We call that our data infrastructure; it’s called the PI System. And we have technologies that run all the way from the edge, through the enterprise, and up to the cloud. In a nutshell, that’s it.
DW: Okay. So now tell me a little bit more about how 3D models will get replaced by 4D models. And what do you mean by 3D and 4D models?
Beeson: It’s actually fantastic that I got pulled into this conversation. The article was written by a former colleague, Michael Kanellos, and Michael and I were wonderfully able to disagree on just about everything. He’s a writer; he likes to spin words, tell stories, and get people’s attention. So I agree with the underlying thing he’s saying, but I don’t necessarily know that I would say it this way. We can go with it, though. My friends at Hoffman are probably cringing right now.
I’m going to try to make this short. In 2000, we had a product that did data reconciliation for refineries. Data reconciliation is really important in refining. It’s like balancing your checkbook: it’s got a financial component, but it’s really got an inventory component. The product tries to create that balance from the measurements of flows, the levels of tanks, the levels of ships, and what goes in and what goes out.
We’re not doing nuclear science here, so everything is conserved, but the reality is that when you add everything up, it usually doesn’t go to zero. So we had to apply advanced analytics to calculate which meters, which measurements, were off, and look at that over time to create a balance based on that information.
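[Editor’s note: a minimal sketch of the reconciliation step Beeson describes, assuming a single steady-state mass balance and a weighted least-squares adjustment. The flows, variances, and numbers below are invented for illustration, not from OSIsoft’s product.]

    import numpy as np

    # Measured flows around one balance boundary, in tonnes/hour:
    # two inlets (+1) and two outlets (-1), plus each meter's variance.
    measured = np.array([100.0, 52.0, 90.0, 60.0])
    signs = np.array([1.0, 1.0, -1.0, -1.0])
    variance = np.array([1.0, 0.25, 4.0, 1.0])   # less-trusted meters get more

    # Mass is conserved, so signs @ flows should be zero -- it rarely is.
    imbalance = signs @ measured                  # here: +2.0 t/h unaccounted

    # Weighted least-squares reconciliation: shift each meter in proportion
    # to its variance so the balance closes with minimal total adjustment.
    lam = imbalance / (signs @ (variance * signs))
    reconciled = measured - lam * variance * signs

    print(f"raw imbalance: {imbalance:+.2f} t/h")
    print("reconciled flows:", np.round(reconciled, 2))
    print(f"check: {signs @ reconciled:+.1e}")    # ~0 after reconciliation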
To do all of this, we had to start building up a notion of the various assets within a refinery and the material and energy flowing through them. For us, that was the beginning, back in 2000, of what the world eventually started calling digital twins: we needed a way of representing these things so that people could build these models. And these are process flow models; they’re not necessarily physical models as such.
So even in that world, we were already doing what Michael calls 4D, because we were taking this digital representation of the physical world to create these asset structures, and we were merging that with what I call the digital information space of the operation associated with those models. You take this digital operational information space and you intersect it with the digitalization of the physical world.
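[Editor’s note: one way to picture the asset structures Beeson mentions is a simple hierarchy that binds process equipment to its sensor tags. The classes and tag names here are hypothetical illustrations, not the PI System’s actual model.]

    from dataclasses import dataclass, field

    @dataclass
    class Asset:
        """A node in a plant hierarchy: equipment plus its sensor tags."""
        name: str
        sensors: dict = field(default_factory=dict)   # role -> tag ID
        children: list = field(default_factory=list)

    plant = Asset("Refinery", children=[
        Asset("CrudeUnit", children=[
            Asset("Pump-7", sensors={"discharge_flow": "FI-101",
                                     "motor_current": "II-033"}),
            Asset("HeatExchanger-2", sensors={"shell_temp": "TI-204"}),
        ]),
    ])

    def find_tags(asset, role):
        """Walk the hierarchy, collecting every tag that plays a role."""
        hits = [(asset.name, tag) for r, tag in asset.sensors.items()
                if r == role]
        for child in asset.children:
            hits += find_tags(child, role)
        return hits

    print(find_tags(plant, "discharge_flow"))   # [('Pump-7', 'FI-101')]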
That gives you this temporal, physical reality. In the world of operations, where everything is live and moving, combining those two was absolutely necessary. That, at least, is my concept of 4D. What I think we’re seeing now is really three layers of things getting digitized and merged. The first I would call the 3D modeling of physical reality: the as-designed and as-built models that mechanical and civil engineers use to construct the physical world, and/or to reverse-engineer a representation of the physical world for things that were built before those technologies existed. That’s one, and it’s a pretty high-fidelity construct linking the physical world and its digital representations.
It’s like what architects use in designing houses these days. The second layer is the digital representation of what those things are from different points of view. A civil or mechanical engineer sees the world one way, but a process engineer sees the relationship between a pump, a tower, and a heat exchanger, and how those all interplay as a dynamic model of a process. So that’s yet another kind of digital representation, one that serves a specific purpose.
The third layer: you overlay the digital information space of all the sensors, all the activities, and all the dynamics happening within those assets. You can then start feeding this operational data, this information space, together with the models of how these things interact, into the physical 3D models, if you will.
You’ve got this context and you’ve got the sensor data, and you can now start feeding all of that into AI, ML, and some of these advanced technologies. We’ve got these amazing number-crunching abilities with GPUs and everything that can really chew through all of this information.
Now step way back. Our customers have had some pretty good technologies for stitching together this operational digital information space, and the ability to apply streaming analytics and the like, since the mid-’80s. But what we found, as AI and the big data crunchers became more powerful, is that as soon as you could bring in this contextual information, the digital twin information, the as-built information, and overlay that contextual space on this operational process information space, you give these analytics something significant to chew on. And you take away a lot of the work that data scientists used to have to do in stitching this all together and providing the labeling and everything else that gives this information space the ability to drive analytical insights.
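[Editor’s note: a toy example of the labeling work that context removes. Raw tag IDs mean little on their own; one join against an asset model turns them into analysis-ready, labeled data. All names and values here are invented.]

    import pandas as pd

    # Raw operational data: opaque tag IDs with timestamped values.
    readings = pd.DataFrame({
        "tag": ["FI-101", "TI-204", "FI-101"],
        "time": pd.to_datetime(["2021-03-01 08:00", "2021-03-01 08:00",
                                "2021-03-01 08:01"]),
        "value": [412.0, 88.5, 415.2],
    })

    # Contextual information: what each tag measures, where, in what units.
    context = pd.DataFrame({
        "tag": ["FI-101", "TI-204"],
        "asset": ["Pump-7", "HeatExchanger-2"],
        "measurement": ["discharge_flow", "shell_temp"],
        "units": ["m3/h", "degC"],
    })

    # One merge stands in for the hand-labeling a data scientist
    # would otherwise have to do before any model training could start.
    labeled = readings.merge(context, on="tag")
    print(labeled)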
So I’m going to stop for a second because I just dumped a whole bunch of stuff out there.
DW: Let me ask you a couple of questions here. If I’m understanding this, the Internet of Things has lately been more about data gathering, and the analysis of that data has always been a sticking point. This is a way of resolving that sticking point, or at least providing a way to take all of this data, look at it, and potentially gain some insights from it?
Beeson: Yeah. There are two things: what’s happening in the world, and then our system. Our system does two big things to help drive the outcomes you’re talking about. One, we provide a platform that stitches together all the different data measurements across these disparate systems, whether from the control world, from IoT land, or from basically anything else. Two, we stitch that together with what we call contextual information, where we effectively label the data: we create the associations between all these sensors and how they map to things in the physical world. By creating that binding, both bridging the silos of all this sensor data and then overlaying it with this contextual information, yes, you accomplish exactly what you’re talking about. You really unleash some amazing analytics.
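[Editor’s note: a hypothetical sketch of bridging two sensor-data silos. A control-system historian row and an IoT-platform message arrive in different shapes; normalizing both to one (tag, time, value) form is the step that precedes the contextual labeling above. All field names are invented.]

    from datetime import datetime, timezone

    def from_historian(row):
        """Control-world historian rows: (tag, ISO timestamp, value)."""
        tag, stamp, value = row
        return {"tag": tag, "time": datetime.fromisoformat(stamp),
                "value": value}

    def from_iot(msg):
        """IoT-platform messages: JSON with epoch-millisecond timestamps."""
        return {"tag": msg["deviceId"],
                "time": datetime.fromtimestamp(msg["ts"] / 1000,
                                               tz=timezone.utc),
                "value": msg["reading"]}

    # Two silos, one consistent operational stream.
    stream = [
        from_historian(("FI-101", "2021-03-01T08:00:00+00:00", 412.0)),
        from_iot({"deviceId": "vib-sensor-9", "ts": 1614585600000,
                  "reading": 0.31}),
    ]
    for record in stream:
        print(record)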
DW: Okay. So now what does a design engineer need to know about all of this in order to make use of it, especially when they’re developing a design for a system or equipment that’s going to go into some kind of an automation application?
Beeson: Okay, so this is great. The world you’re talking about is not necessarily the world we live in, but I’ve had some conversations, with people at Equinor and some others. First of all, if you want to learn and get insights about your processes and your designs, you have to have lots of history. It’s like a child; it’s like anything else: you only learn from experience. The more data you can collect and have available to learn from, the more informed these insights will be.
For example, it’s sometimes very difficult to predict something if you’ve never had that experience, or a similar enough experience, before. You might say, “Okay, we’re getting into territory that we’ve never seen before. We have no idea whether this is good or bad,” as opposed to, “Yes, we think this is most likely what’s going on, because we’ve seen this before.”
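[Editor’s note: a minimal illustration of that “territory we’ve never seen before” check. Before trusting a prediction, test whether the new operating point falls inside the envelope of the history the model learned from. The data here is synthetic.]

    import numpy as np

    # Synthetic history of two process variables (e.g., flow and pressure).
    rng = np.random.default_rng(0)
    history = rng.normal(loc=[50.0, 3.0], scale=[5.0, 0.4], size=(1000, 2))
    lo, hi = history.min(axis=0), history.max(axis=0)

    def in_known_territory(point):
        """True only if every variable lies within its historical range."""
        return bool(np.all((point >= lo) & (point <= hi)))

    print(in_known_territory(np.array([52.0, 3.1])))  # typical -> True
    print(in_known_territory(np.array([80.0, 3.1])))  # unseen  -> False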
I promise I’m pulling this back to your question. One of the things that’s fascinating about the data science here is that most of what people are doing in our space is predictive maintenance. They’re trying to anticipate failures and things like that. And that’s fantastic, because there’s a huge upside to getting the most efficiency and effectiveness out of these existing deployments. But what I’ve argued for years now is that the really significant transformation will happen when those insights, the analytics being applied based on the history, based on what people are doing today, can start informing not only how you make your machines last longer, but how you might refactor or reconsider how you design your process in the first place.
When it starts giving you those kinds of insights, say, if you just change the way you’re mixing upstream by doing X, Y, Z, then all of a sudden not only can you predict when there’s a downstream failure, but you can set things up so that you avoid most of the stresses driving that downstream failure in the first place. That takes somebody who understands the process, who understands what those insights are suggesting, and who has the ability to go change the physical operation, the physical design of that process or that equipment, to get this outcome.
So to me, like I said, predictive and prescriptive maintenance and all that stuff, that’s fantastic. I don’t knock it; there’s still so much more to go there. But this next step, actually being able to build things that are designed for better outcomes from the start, that’s going to be really, really game-changing.
DW: To me, predictive maintenance has been around for a very long time, and it looks like we’re revisiting it again, just with different technology. And I always thought the Internet of Things was really more about what you just spoke about: the ability to take this information and alter things in a way that could actually function better. So how close are we to getting to that point?
Beeson: Obviously, as I mentioned before, you first need the history. You need to have learned from something. Second, you need people, at least for now; maybe intelligent machines will get there at some point. But for now you need people who understand what the intention is. You have to know what outcomes you’re trying to achieve, so you really need to connect the analytical insights back with somebody who understands the process, or the design and the intentions of that design, to be able to make that leap. So I think everything is there. I think it’s a human skill, and there are some amazing people, but it’s a human skill to be able to say, “Okay, given our intentions, what we’re trying to accomplish, and given these insights that we’re seeing, I understand what that implies in terms of building a better meal,” or whatever that is.
DW: So it’s not something that’s-
Beeson: But I think it’s all there.
DW: But it’s not something as simple as an executive saying, “I want you to improve this process by 20%”?
Beeson: An executive saying “improve the process by 20%” may fund an effort. But in a lot of these worlds, the process engineers and the operators are literally just trying to keep everything from falling apart, minute to minute. I’m sure you’ve walked into these sites, or you’ve lived this world. It is brutal. I was fortunate enough to grow up with a father who was in manufacturing, and he was very inclusive, so I got to see this. This was back in the ’60s, ’70s, ’80s, and I watched how brutal it is. There was still innovation, there was all that stuff going on.
But many brilliant people just have to spend their days keeping the lights on. So yes, suppose the executive says, “This is important enough that we’re going to fund you to carve yourself out of that insanity, look at the data, look at the process, and rethink all of this based on the historical insights.” Then, and this is what gets really cool with some of this technology, once you have those histories, and once you have those models developed using AI, ML, whatever technology you choose, once you have the 3D and, staying with the history, the 4D digital models, you can actually start to run simulations. Someone who understands these technologies can, as long as they don’t go too far out of bounds, start running very realistic simulations, based on those models, of what the various outcomes may be.
This is where it’s like Google’s Go engine and some of those other things. This is where you can actually say, “Okay, now we understand the physics,” if you will; the physics of Go is one thing, the physics of a real process is another. “We understand the physics of this process, because we’ve built up this contextual representation of all the parts and pieces. We understand the models of the chemistry, the models of the energy, the models of the failures, and we’ve got all this history, so we’ve got all of that together.”
Now this person who is off trying to eke out that 20% can actually start running realistic simulations based on all that history and context, start tweaking it, and run it digitally, in the virtual world, to see what has a very good chance of giving you that 20 or 50 or a thousand percent improvement, or whatever it’s going to be.
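[Editor’s note: a hypothetical version of that digital what-if loop. A surrogate model is fitted to history, then candidate upstream settings are swept in the virtual world, inside the bounds of that history, to find the one the model predicts will cut downstream stress. The process, data, and relationship below are invented.]

    import numpy as np

    # Synthetic history: an upstream mixing speed and the downstream
    # stress it produced, with the sweet spot hidden in the noise.
    rng = np.random.default_rng(1)
    mix_speed = rng.uniform(20.0, 80.0, 500)
    stress = 0.02 * (mix_speed - 45.0) ** 2 + rng.normal(0.0, 0.5, 500)

    # Surrogate: a quadratic fit standing in for a trained ML model.
    surrogate = np.poly1d(np.polyfit(mix_speed, stress, deg=2))

    # "Run it digitally": evaluate candidates without touching the plant,
    # staying inside the historical range so the model isn't extrapolating.
    candidates = np.linspace(mix_speed.min(), mix_speed.max(), 200)
    best = candidates[np.argmin(surrogate(candidates))]
    print(f"predicted lowest-stress mixing speed: {best:.1f}")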