July 2017
AutomatedBuildings.com

Who You Gonna Call? — Digital Twins!

A digital twin is a dynamic software model of a physical thing or system.


Anno Scholten, President, Connexx Energy


Every piece of infrastructure, sensor, personal mobile device, and business process in a building today is a potential source of valuable data for improving operations and user experience. Insightful facilities project teams are beginning to direct that data toward the creation and maintenance of digital twins. A digital twin is a dynamic software model of a physical thing or system.

The concept of building and maintaining a digital twin is a new frontier in the industrial art of digital modeling. We are entering a time when everything is getting connected, computers are ubiquitous, and the amount of data that can be collected, aggregated and analyzed is practically limitless due to cloud architectures. It is now within reach to create a full proxy of a building in the cloud. The digital modeling world has been working toward this moment since the first computer-aided design (CAD) tools for drawing symbols and geometries were introduced in the 1960s. Early CAD led to the very sophisticated building information models (BIM) that performance design engineers working in architecture & engineering firms use today to analyze and optimize systems.

The big advancement that distinguishes a digital twin is that it encompasses not just predictive design-phase data, but also time-series data captured from an occupied and operating building. Digital twins can serve as repositories of data from BIM, building automation systems (BAS) and sensor networks associated with lighting, physical security or other infrastructure. The replicas will come alive as they are fed time-series data from actual operations. A range of analytics packages will be run against the real-time data to glean insight into operations on a continuous basis or on demand by users. The information contained will become more granular, as more and more data is accumulated, organized and interpreted. Ultimately, anytime anyone has a query about the building, they’ll start by consulting its digital twin.

The term Digital Twin is not unique to buildings. In fact, the IT research firm Gartner identified it generally as one of its Top 10 Strategic Technology Trends for 2017. Per the report, digital twins of physical assets will be combined with digital representations of the people, businesses, and processes that comprise facilities and environments. Gartner predicts that billions of things will be represented by digital twins within three to five years. It sees digital twins replacing traditional monitoring devices and controls and augmenting the roles of skilled technicians. The report’s authors acknowledge that the proliferation of digital twins will require “a cultural change, as those who understand the maintenance of real-world things collaborate with data scientists and IT professionals.”

The buildings industry has been in the throes of this cultural change for years. The challenge of marrying IT and operations expertise has been taken up by professionals from all corners including building engineers deploying monitoring-based commissioning methods, equipment makers introducing predictive and conditional maintenance programs, and operations & maintenance managers striving to satisfy occupants that want personalized, responsive spaces. While they are complex to model, buildings may be closer to having digital twins than many other asset classes.

Consider, for example, representing a building’s chiller in software. The model might start as a simple block diagram showing component parts like condenser, motor, pipes, etc. As you add chiller performance data, the virtual twin becomes more information-rich, like a 3D wireframe view. By adding IoT sensor data, you can get more granular information about aspects of chiller operation of particular interest. Metaphorically, you’re adding detail, shape, and color to the digital twin. As you pull in more data, you can make it more and more like the physical chiller. The digital twin can also include equipment documentation, with links to online resources.
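In software terms, the early, sparse version of such a twin can be quite simple. The Python sketch below is purely illustrative; the class names, point names and document links are assumptions made for the example and are not tied to any particular product or data model:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Minimal sketch of a chiller digital twin: a component hierarchy that
# accumulates time-series readings and links to documentation. All names
# here are hypothetical, chosen for illustration only.

@dataclass
class Component:
    name: str                                       # e.g. "condenser", "compressor motor"
    attributes: dict = field(default_factory=dict)  # nameplate / design data

@dataclass
class Reading:
    point: str                                      # e.g. "chw_supply_temp"
    timestamp: datetime
    value: float
    unit: str

@dataclass
class DigitalTwin:
    asset_id: str
    components: list = field(default_factory=list)  # Component instances
    readings: list = field(default_factory=list)    # Reading instances
    documents: dict = field(default_factory=dict)   # document name -> URL

    def add_reading(self, reading: Reading) -> None:
        """Feed operational data into the twin; detail accumulates over time."""
        self.readings.append(reading)

# The twin starts as little more than a block diagram of parts...
chiller = DigitalTwin(
    asset_id="chiller-01",
    components=[Component("condenser"), Component("evaporator"),
                Component("compressor motor", {"rated_kw": 350})],
    documents={"O&M manual": "https://example.com/chiller-01-manual"},
)

# ...and becomes richer as sensor data arrives.
chiller.add_reading(Reading("chw_supply_temp", datetime.now(), 6.7, "degC"))
```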

Value can be gleaned from the digital twin from the earliest phases of its evolution. Equipment fails in well-understood ways that can be described with a few rule-based algorithms and tested against trend data that records just the relevant parameters. So, fault detection and diagnostics (FDD) for specific equipment, like chillers, can be run against a relatively sparse ‘young’ digital twin. When there is a need for more granular data on specific aspects of operations, wireless sensors can be placed to gather the information of interest. For example, a hot/cold call from an occupant may trigger interest in air supply temperatures at a handful of points. There is no necessity to bring every point captured by a sensor system into a BAS. Likewise, there’s no reason not to keep populating a digital twin with the information. With today’s cloud architectures, the added cost to store and manage the additional data is minimal, and you don’t know what new use for the data will arise in the future.
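To make that concrete, a rule of the kind described above can be expressed in a few lines of code. In the Python sketch below, the point names, the 3 degC delta-T threshold, and the trend values are all illustrative assumptions rather than a reference implementation:

```python
# Minimal sketch of rule-based fault detection run against sparse trend data.
# The point names, the 3 degC threshold, and the sample values are assumptions.

def low_delta_t_fault(supply_temp_c, return_temp_c, chiller_on, min_delta_t=3.0):
    """Flag intervals where the chiller is running but produces too little cooling."""
    faults = []
    for i, (sup, ret, on) in enumerate(zip(supply_temp_c, return_temp_c, chiller_on)):
        if on and (ret - sup) < min_delta_t:
            faults.append(i)
    return faults

# Trend data for just the relevant parameters, one sample per interval.
supply = [6.5, 6.7, 9.8, 10.1, 6.6]      # chilled water supply temperature, degC
ret    = [12.0, 12.1, 11.0, 11.2, 12.2]  # chilled water return temperature, degC
on     = [True, True, True, True, True]  # chiller run status

print(low_delta_t_fault(supply, ret, on))  # -> [2, 3]: two suspect intervals
```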

FDD analytics is an important tool in the arsenal of anyone involved in optimizing building operations, but it is not the only tool. To optimize chiller operations, for example, you want to be able to query whether the chiller is running optimally in terms of the observed heat curve, then adjust the Sequence of Operations (SOO) programming accordingly. Today there are many commercial off-the-shelf statistical programs that do curve fitting, and you want to maintain your option to choose the best among them.
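As an illustration of that kind of query, the Python sketch below fits a quadratic efficiency curve to made-up chiller trend data using an off-the-shelf least-squares routine and flags an observation that drifts from the fitted curve; the data and the 10% tolerance are assumptions chosen only for the example:

```python
import numpy as np

# Sketch: fit chiller efficiency (kW per ton) as a quadratic function of
# part-load ratio with an off-the-shelf least-squares routine, then check a
# new observation against the fitted curve. Data and tolerance are made up.

part_load  = np.array([0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
kw_per_ton = np.array([0.82, 0.74, 0.68, 0.64, 0.62, 0.63, 0.66, 0.71])

coeffs = np.polyfit(part_load, kw_per_ton, deg=2)  # least-squares quadratic fit
model  = np.poly1d(coeffs)

# Query: is the chiller running on its expected curve right now?
observed_load, observed_kw_per_ton = 0.65, 0.75
expected = model(observed_load)
if observed_kw_per_ton > expected * 1.10:          # more than 10% off the curve
    print(f"Efficiency drift: {observed_kw_per_ton:.2f} vs expected {expected:.2f} kW/ton")
```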


A growing category in operational analytics for buildings is model-based predictive and prescriptive control algorithms. Fed historical and real-time trend data, these tools look for patterns to predict what will happen next. If predicted performance would result in energy waste or another undesired outcome, they can prescribe actions to course correct, and sometimes effect the necessary adjustments themselves, such as changing variable-speed motor settings. These analytics packages are leading the buildings industry closer to machine learning and AI. Project teams will want to plan for the eventuality of running this type of analytics against the data stored in their digital twin.
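Stripped to its essentials, the predict-then-prescribe pattern looks something like the Python sketch below. The zone-temperature values, the comfort limit, and the prescribed fan adjustment are illustrative assumptions, not a description of any specific product:

```python
import numpy as np

# Sketch of the predict-then-prescribe pattern: extrapolate a linear trend
# from recent readings, and prescribe a corrective action if the forecast
# crosses a limit. Values, limit, and the prescribed action are hypothetical.

def predict_next(values, horizon=1):
    """Forecast by extrapolating a least-squares linear trend ahead in time."""
    t = np.arange(len(values))
    slope, intercept = np.polyfit(t, values, deg=1)
    return slope * (len(values) - 1 + horizon) + intercept

recent_zone_temps = [22.0, 22.4, 22.9, 23.3, 23.8]   # degC, last five intervals
forecast = predict_next(recent_zone_temps)

if forecast > 24.0:
    # Prescribe a course correction; a real system would push this to the BAS.
    print(f"Forecast {forecast:.1f} degC exceeds limit; increase supply fan VFD speed")
```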

In short, a digital twin platform should accommodate tools we know today, and those we have not heard of yet. Who knows what new analytics will emerge from the minds of next-generation data scientists? Certainly, the digital twin should not be tied to any specific analytics type or brand. The architecture should feature security as well as open, low-friction data interoperability at each level. Software stacks supported by vibrant open-source communities are considered to provide the safest future growth path today. A digital twin should be designed to scale, evolve and reincarnate for the lifespan of the building it represents.

How much time-series data would a building project team need to feed its digital twin? If trend data were collected for 50,000 points over five years, about 4.2 terabytes of time-series data would be created. An obvious follow-on question is how to navigate such an enormous data store. This is where metadata tagging systems like Project Haystack come in. A well-defined reference architecture and standard metadata tags, as defined by Project Haystack, are needed to bring order to the terabytes of time-series data. As an estimate, about 200MB of Haystack metadata would be sufficient to navigate the 4.2TB of time-series data collected from a 50,000-point building.
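For readers who want to check the arithmetic, the 4.2TB figure is consistent with roughly one-minute sampling and about 32 bytes per stored record (timestamp, value and storage overhead); both figures are assumptions used here only to show how an estimate of that size can be reached:

```python
# Back-of-the-envelope check of the storage estimate above, assuming one
# sample per minute and roughly 32 bytes per stored record (timestamp,
# value, and storage overhead). Both figures are assumptions for this sketch.

points = 50_000
years = 5
samples_per_point = years * 365 * 24 * 60  # one sample per minute for five years
bytes_per_sample = 32

total_bytes = points * samples_per_point * bytes_per_sample
print(f"{total_bytes / 1e12:.1f} TB")      # -> 4.2 TB of time-series data
```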

In addition to all the actual time-series data that is collected once a building project is operating, another big data store is all the predictive performance and energy modeling data that design engineers produce before a new construction or major retrofit project is built. Ideally, a digital twin would integrate all this data as well. Architecture and engineering firms working on high-performance buildings like net zero projects know that they must stay engaged with facilities teams to meet design goals. Some have launched building optimization practices to close the feedback loop between the design/construction and operations phases of a building's life cycle. These practices are led by commissioning experts and systems engineers with a hunger to tap into all the insight available from collected building performance data, in other words, from a digital twin. They are in a good position to advise building owners on the practicality of investing in a scalable, robust platform that will serve the building well into the future.

Inherent to the digital twin concept is the idea that its value increases over time. As the information contained gets more granular, you will get more meaningful and reliable results from the analyses run against it, and the what-if scenarios you run through it can start to get more complex. Consider the challenges of an engineer overseeing chiller operations for four geographically dispersed resort hotels. Someone in this role would typically have nagging questions like "Just because the consulting engineer said that the Sequence of Operations should be programmed a certain way 10 years ago, is this still the best way right now?" Perhaps the chiller operator in one geography has discovered a better way, a new sequence of operations. Should the head engineer institute the updated SOO at all properties? Testing the proposed changes in the real world introduces risk and effort that he may not want to incur. But if he could run simulations on the digital twin from the comfort of his chair, he could reduce that risk and make a stronger case for the proposed changes to each operator. Innovation and progressively greater efficiency would happen a lot faster if there were a digital twin to consult. Likewise, should something go wrong with a chiller at one location, the digital twin would be a means to do forensics and support decisions about safeguarding the chillers at the other properties from the same problem.
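A toy version of that what-if exercise might look like the Python sketch below: replay one day of historical cooling load through a crude two-chiller staging model and compare energy use under the existing and proposed sequences. The load profile, the kW/ton figures and the staging thresholds are all invented for illustration; a real study would use the twin's actual trend data and a calibrated plant model:

```python
# Toy what-if on the twin: replay a day of historical cooling load through a
# crude two-chiller staging model and compare energy under two sequences.
# Loads, kW/ton figures, and thresholds are invented for illustration.

hourly_load_tons = [180, 220, 300, 420, 480, 450, 380, 260]  # historical profile

def energy_kwh(loads, stage_up_threshold):
    """Run two chillers above the threshold, one below; simple kW/ton model."""
    total = 0.0
    for load in loads:
        if load > stage_up_threshold:
            total += load * 0.60   # two machines sharing load: ~0.60 kW/ton
        else:
            total += load * 0.70   # one machine near full load: ~0.70 kW/ton
    return total

existing = energy_kwh(hourly_load_tons, stage_up_threshold=400)  # current SOO
proposed = energy_kwh(hourly_load_tons, stage_up_threshold=300)  # operator's idea

print(f"existing SOO: {existing:.0f} kWh, proposed SOO: {proposed:.0f} kWh")
```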

We also must expect that the digital twin trend is going to accelerate technology disruption and the remaking of many business and industrial processes. Nevertheless, the most forward-thinking facilities project teams are going to embrace the concept. Users of the Connexxion® Platform are already on their way toward creating digital twins. They rely on this scalable, secure data management and data visualization platform to transform and unify disparate data sources, to bridge heterogeneous networks, and to quickly deploy the analytics and other applications that various stakeholders are demanding. With Connexx Energy as their partner, these users are leading the way into the ‘digital twin’ era of smart building operations.

