
October 2019
AutomatedBuildings.com

Transforming Automation User Experience with Deep Digital Twins

Deep Digital Twins enable a new type of automation platform, built on the fundamentals of physics, with the guarantees that come with it.

Troy Harvey,
CEO
PassiveLogic

User experience or ‘UX’ is the fundamental pursuit of all human-interactive technology. There is no larger market for human-centric technology than the world’s buildings infrastructure.

The human race first walked out of caves laboring to build structures, places of habitation, environments of assembly, marketplaces for commerce, and the cities to house them — all in the pursuit of a better quality of life. Indeed, the very nature of the built environment and study of architecture is fundamentally the oldest pursuit of “user experience.”

UX in Decline
Today, the buildings industry has a user experience crisis on its hands. We essentially hit “Peak UX” decades ago, and by some metrics, we are in UX decline. A broad survey of “smart building” products shows that our technologies are actually asking more of us — more of our attention and time — not less. However, our professional and personal lives demand systems that “just work” — our buildings technologies should act as agents on our behalf, not vie for more of our limited resources. But in fact, we are more likely than ever before to be spammed by poor quality notifications, disruptive queries into the state of our comfort, and continuous data streams that we have little time to parse. All the while our building control infrastructure hasn’t truly evolved beyond its 100-year-old ancestors — still largely a rag-tag collection of PID loops, thermostatics, static procedural programming, and perhaps a web interface exposing some of this disorganized complexity to the user.

While the building occupant and owner’s user experience is important, the industry is at least attempting to engage with it, even if the results are sometimes misguided. What we don’t talk about enough in the industry is the UX needs of the user experience creators — the de facto customers of smart building technologies — the contractors. Contractors usually choose and install the product, and they are the ones who interact most deeply with and support building control and automation technologies. Yet, due to the unique inversion of our industry’s value chain, where the money at the top rarely has a lens into the hands-on needs of actual installers, our market has underinvested in the on-the-ground installer user experience. This poor contractor user experience, in turn, undermines the end goals of building technology — better buildings and more optimized control.


Black-Box Automation is Out of Gas
Today’s building technology approach is aptly named “black-box” automation. These platforms fundamentally know nothing about buildings, the systems they’re controlling, the physiology of human comfort, or the underlying purpose behind any control action. As such, “black-box” automation is not introspectable — nor can it reason, query, adjust, optimize, or commission. Yet these are the key building blocks of next-gen automation UX — systems that work independently when we are not looking (the end goal), and can meaningfully express themselves when we are paying attention.


While we’re seeing a constant stream of band-aid solutions that bolt on to our automation stack in hopes of fixing this old “black box” foundation, this approach is unlikely to succeed. Whether you are integrating analytics, a new communication protocol, comfort systems, or cloud-based learning — two facts should be crystal clear. First, today’s “black box” systems know nothing about buildings themselves and have no underlying systems knowledge to extract or build upon, leaving us with no foundation to support our future needs. Which leads to the second fact: adding this new functionality, limited by the weak foundation, requires laborious effort — thus we call it “integration,” not “installation!” The combination of a weak foundation, a low technology ceiling, and laborious effort is a core source of the dissatisfaction in the marketplace.

Impossibility of Success
Our industry is in collective denial that what we are trying to do in buildings with today’s platforms is fundamentally impossible. Applying computer science theory could help us all take a step back. Automata state complexity tells us that if we want to control a building (i.e., a non-deterministic system) using deterministic, finite procedural programming, we would need on the order of 2^n sequence states, where n is the number of states in the underlying system.

But that is academic — the more urgent question is: what does this mean for the automation industry, and for building occupants and owners? For the average building, you’d have to write trillions of sequences to achieve high-quality UX and optimal control. Effectively impossible, right? It is what we call an “intractable problem.” These types of computer science challenges are theoretically solvable — but your team would need to write sequences for the next billion years to cover the optimal control “state space” and therefore provide satisfactory UX under all conditions.
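To make the scale concrete, here is a back-of-envelope sketch of the argument, under the simplifying (and hypothetical) assumption that each control point contributes one binary state variable:

```python
# Back-of-envelope sketch of the "intractable state space" argument.
# Assumption (illustrative, not from the article): each control point adds one
# binary state variable, so n points yield 2**n distinct system states, each of
# which a deterministic sequence set would have to handle explicitly.

points = 42                       # hypothetical point count for a mid-size building
states = 2 ** points
print(f"{states:,} states")       # 4,398,046,511,104 states (trillions)

# Even authoring one sequence per second, around the clock, coverage would take:
years = states / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years of sequence writing")   # roughly 139,461 years
```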

Oversimplification is Actually Causing Complexity
The underlying problem is that we have a huge mismatch between the complexity of the systems we need to control and the automation tools readily available to model those systems. Take a typical VAV controller using PID-loop thermostatics. On the one hand, it is conceptually simple. On the other hand, as any automation installer tasked with fully commissioning a building to uncompromising expectations can attest, in reality it turns out to be intractably complex to optimize for all conditions.

Why is this? A PID can be thought of as modeling the mapping from indoor-outdoor state to zone damper actuation. However, a PID can only model a simple curve, and zone states are impossible to capture with such a simple regression. The thermodynamics of even the simplest single-zone building must account for construction assemblies, variable solar gain, thermal mass, dynamic occupant behavior, and the internal and external thermal drivers of weather and energy use. This system complexity couldn’t be accurately modeled by even 100 PIDs. (And try tuning that!)
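For readers outside controls, a minimal sketch of what a PID actually computes makes the mismatch plain: it is a weighted sum of a single error signal and its history, with no model of the physics behind that error. The gains and scaling below are hypothetical.

```python
# Minimal PID sketch (illustrative only): one error signal in, one command out.
# Nothing here knows about solar gain, thermal mass, occupancy, or weather.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = measurement - setpoint            # cooling convention: positive when too warm
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # A weighted sum of the error, its accumulation, and its rate of change:
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a VAV damper command from zone temperature alone (hypothetical gains).
loop = PID(kp=20.0, ki=0.05, kd=0.0, dt=60.0)
raw = loop.step(setpoint=22.0, measurement=23.4)      # degrees C
damper_percent = max(0.0, min(100.0, raw))            # clamp to 0..100 %
print(damper_percent)
```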

A single zone function is only the tip of the iceberg. The complexity compounds exponentially when modeling the zone interactions between 100 interconnected zones, or the interrelationships between equipment and subsystems, distribution and ventilation, occupancy and loads, energy prices, and the occupants’ comfort.

As it turns out, the current method of applying an overly simplified model to real-world systems is actually causing more complexity, not less. This added complexity gets piled on for contractors to juggle — and at the end of the day, hinders profitability.

Fundamentally, every project is chasing its tail trying to keep a building properly tuned. Recently I visited a large factory that spends $90,000 every year on automation consultants. A programmer will come in the spring to tweak, tune, and adjust, and by summer, the building needs to be commissioned again. This repeats season after season, year after year. As any commissioning agent will tell you, this story is not unique; it is the story of all buildings.

Why AI (alone) Won’t Save Us
While artificial intelligence (AI) will play a large role in the next wave of automation systems, it cannot save us on its own.

The foundations of the next automation revolution won’t be built on AI magically “un-dumbing” today’s dumb boxes. It’s not fundamentally possible, for a myriad of reasons:

  1. Every building has a unique design and topology, and a complex set of rules.
  2. If a building’s topology can’t be introspected from “black box” automation, then how can we learn to control the building?
  3. We cannot reliably control systems beyond Level-1 autonomy (i.e., adaptive set-points and PID) without an underlying knowledge of systems operation.
  4. Machine learning can’t self-assemble from information that isn’t there, or labeling that doesn’t exist, or an ontology that isn’t defined.
  5. Without understanding the thermodynamic interconnectivity and fusion of these building objects, sensors, and IoT, we will never be able to solve the real-world complexity of control.

There is no one set of rules that can be trained and statically deployed to all buildings. We can visualize the scale of the problem with the closest AI analogy: a person. It would be like putting a child (who is far smarter than current AI) in charge of controlling the building without the necessary context of building design, system topology, understanding of mechanics or physics, metrics of cost and performance, or knowledge of operational outcomes and conditions — nor the ability to inspect or discover this information from the outside looking in. The reality is that whether it’s an AI, a child, or the world’s smartest physicist, the problem is unsolvable from the outside — you need something more.

A Paradigm Shift: Deep Digital Twins
How do we change the approach to automation to reflect the reality and complexity of real-world systems? How do we solve our site integration challenges? How do we address the UX needs of both occupants and installers, while also enhancing the UX of each actor in the market value-chain? Finally, how do we enable a user interface to our platform that provides self-managing, autonomous deep insights?

I’m going to introduce a new concept: Deep Digital Twins. You may have heard of digital twins, a conceptual evolution of CAD and BIM — basically 3D models of systems or buildings that carry some degree of labeling, or object-level identification.
 
What are Deep Digital Twins? They build on the base concept of digital twins, embedding much ‘deeper’ information about the systems and objects being described. Functionally, they act as virtualized analogs of real-world objects, like zones, equipment, systems, and the physiological agents of human comfort. Because they are built on a physics-based ontology, these analogs aren’t just labeled, but actually understand what ‘kind’ of thing they are. The term ontology comes from the philosophical study of ‘being,’ and is used by computer scientists to describe computing systems that can introspect. An ontology is a framework that comprises the technological ‘nature of existence’ for an object in the world. The ontology, for example, provides a control system with an understanding of the fundamental physics of operation, how that operation interacts with the world around it, how its internal physics is organized, how the object interfaces with controls, the physical parameters of operation, and the meta-semantics of operation. The ‘meta-semantics’ of operation is the ontology translated into language or protocol.
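As a rough illustration of the idea (the class names, fields, and values below are hypothetical, not PassiveLogic’s actual ontology or file format), a physics-based twin might describe a boiler by what it is and how it behaves, rather than by a control sequence:

```python
# Hypothetical sketch of a physics-based twin description; class and field
# names are illustrative, not PassiveLogic's actual ontology or format.
from dataclasses import dataclass, field

@dataclass
class Port:
    """A physical interface where energy or mass crosses the twin's boundary."""
    name: str
    medium: str          # e.g. "hot_water", "natural_gas", "electricity"
    direction: str       # "in" or "out"

@dataclass
class BoilerTwin:
    """A condensing boiler described by what it is, not by a control sequence."""
    kind: str = "heat_source.boiler.condensing"    # ontological identity
    rated_output_kw: float = 80.0                  # physical parameters
    efficiency_curve: dict = field(default_factory=lambda: {0.3: 0.97, 1.0: 0.88})
    ports: list = field(default_factory=lambda: [
        Port("supply", "hot_water", "out"),
        Port("return", "hot_water", "in"),
        Port("gas", "natural_gas", "in"),
    ])
    control_points: dict = field(default_factory=lambda: {
        "enable": "binary_output",
        "supply_setpoint": "analog_output",
        "supply_temp": "analog_input",
    })

    def output_kw(self, part_load: float) -> float:
        """First-principles relation the controller can introspect and simulate."""
        return self.rated_output_kw * max(0.0, min(1.0, part_load))

twin = BoilerTwin()
print(twin.kind, twin.output_kw(part_load=0.5))    # introspectable identity and physics
```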


Physics-Based Automation
Deep Digital Twins enable a new type of automation platform, built on the fundamentals of physics, with the guarantees that come with it. They solve real-world control challenges by directly modeling the physical complexity of actual systems — without oversimplification. And yet using Deep Digital Twins is simple precisely because each twin maps directly to the physical object we want to control without the heavy abstraction of PID, pre-canned algorithms, and static sequences.

Because Deep Digital Twins directly model buildings and systems, they solve today’s control-theory mismatch. Each twin models a single building component or piece of equipment on a one-to-one basis, creating a more usable model that is highly immune to the inherent instability of oversimplified algorithms like PID, which amount to poorly fit regressions of the real world.

If your system has a boiler, you just add a boiler twin to your system design — no further algorithm development required. The digital boiler twin is defined by its own physics, as are its automation requirements — enabling fully autonomous control. When multiple twins are linked to each other in a schematic diagram, these digital analogs model not only the real complexities of systems but also their emergent behavior. As such, they can automatically infer behavior and introspect results, failures, and aging — even when sensors don’t exist. Using this inference ability, an autonomous control platform built on Deep Digital Twins can self-commission, automate point-mapping, validate wiring, and provide continuous system measurement and verification against its original design.
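As a toy example of that kind of inference (the schematic, numbers, and energy balance below are simplified assumptions, not PassiveLogic’s implementation), linking a boiler twin to a zone lets the platform recover an unsensed supply temperature from physics alone:

```python
# Illustrative inference from linked twins: no supply-temperature sensor exists,
# but the energy balance Q = m_dot * cp * (T_supply - T_return) implies its value.

CP_WATER = 4.186  # kJ/(kg*K), specific heat of water

# Hypothetical schematic edge: the boiler's supply port feeds the zone's coil.
schematic = {("boiler.supply", "zone1.coil_in")}

def infer_supply_temp(return_temp_c, flow_kg_s, heat_kw):
    """Solve the energy balance for the unsensed supply temperature."""
    return return_temp_c + heat_kw / (flow_kg_s * CP_WATER)

# Known: a return-temperature sensor, a flow estimate, and the boiler twin's modeled output.
print(infer_supply_temp(return_temp_c=40.0, flow_kg_s=0.8, heat_kw=50.0))  # ~54.9 degC
```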

When Deep Digital Twins are used to simulate the physics of buildings, the building automation system can test the future outcomes and costs of each control decision. When doing so, it forms a new type of control loop based on the virtual “beta-testing” of real-time sequences before applying them to the building itself. This “future-forward” control loop enables buildings to think, then act — rather than simply react.
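A minimal sketch of such a loop, assuming a toy first-order zone model and made-up cost weights (none of this is the article’s actual algorithm): simulate each candidate, score the predicted outcome, and only then act.

```python
# "Think, then act": virtually beta-test candidate control settings against a
# toy zone model, score predicted energy and comfort, and apply the best one.
# The model coefficients, candidates, and weights are illustrative assumptions.

def simulate(zone_temp_c, outdoor_temp_c, heat_kw, hours=4):
    """Toy first-order zone model; returns (energy_kwh, comfort_error_deg_hours)."""
    energy_kwh, comfort_error = 0.0, 0.0
    for _ in range(hours):
        zone_temp_c += 0.1 * (outdoor_temp_c - zone_temp_c) + 1.0 * heat_kw
        energy_kwh += heat_kw
        comfort_error += abs(zone_temp_c - 21.0)    # distance from a 21 degC target
    return energy_kwh, comfort_error

def choose_heat_setting(zone_temp_c, outdoor_temp_c):
    candidates = [0.0, 1.0, 2.0, 3.0, 4.0]          # kW heat-output options
    def predicted_cost(heat_kw):
        energy, discomfort = simulate(zone_temp_c, outdoor_temp_c, heat_kw)
        return 0.15 * energy + 1.0 * discomfort     # assumed energy-vs-comfort weights
    return min(candidates, key=predicted_cost)      # act only after simulating

print(choose_heat_setting(zone_temp_c=19.0, outdoor_temp_c=5.0))  # 2.0 kW for these toy numbers
```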

Systems-Based Intelligence
An autonomous building platform utilizing Deep Digital Twins is built on systems-level control theory. At its core, this type of platform works on the sensor-fusion and control-fusion of digital models. It understands the interconnectivity of zones, equipment, distribution, systems, energy, and the sensors connected to these components. In today’s world, we are seeing exploding growth of IoT and sensor tech, smarter devices, and connected equipment. Yet what we lack is the organizational intelligence to make use of it all — Deep Digital Twins solve this challenge.
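As one small piece of what sensor-fusion means in practice (the numbers and the inverse-variance rule below are a generic statistical example, not PassiveLogic’s method), two noisy views of the same zone temperature can be combined into one better estimate:

```python
# Generic inverse-variance fusion: combine a wall sensor reading with a
# model-based estimate from the twin; the less noisy source gets more weight.

def fuse(estimate_a, var_a, estimate_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)                 # fused value and its variance

# Wall sensor: 22.6 degC (std 0.5); twin's physics estimate: 21.9 degC (std 1.0).
print(fuse(22.6, 0.5 ** 2, 21.9, 1.0 ** 2))         # about (22.46, 0.2)
```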

Changing Automation UX
Deep Digital Twins enable a user experience revolution. They make previously impossible control solutions possible. For installers, they vastly simplify how we design, program, automate, install, commission, maintain, and manage buildings — while providing stronger guarantees that deployments match physical reality.

These digital analogs enabled us at PassiveLogic to reinvent the UI/UX for automation engineers and contractors, allowing us to use familiar building and system drawings (documentation every contractor likely has, already draws, or can easily produce) to “program” our building automation engine. Introspecting the physical requirements, the system can automate the process of automation. The platform then self-assembles, generating a control system design for the building, wiring and point-mapping, user interfaces, analytics, and system-wide self-commissioning — as well as real-time autopilot control. 


For building occupants, Deep Digital Twins manage comfort based on actual human physiology, not solely the air temperature at the nearest thermostat — solving for each individual. With inherent systems-based knowledge at its core, the automation engine can also express insights and data fusion in human terms, without relying on manual interface development designed to “interpret” incomplete data streams — because it fundamentally “understands” the building.
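To see why air temperature alone is not enough, consider operative temperature, a standard low-airspeed approximation that blends air and mean radiant temperature (the comfort band below is a rough assumption for illustration, not the article’s comfort model):

```python
# A thermostat reading of 22 degC can still feel cold next to a chilly window,
# because comfort tracks operative temperature, not air temperature alone.

def operative_temp_c(air_temp_c, mean_radiant_temp_c):
    """Low-airspeed approximation: average of air and mean radiant temperature."""
    return (air_temp_c + mean_radiant_temp_c) / 2.0

def comfortable(air_temp_c, mean_radiant_temp_c, band=(20.0, 24.0)):
    """Assumed winter comfort band on operative temperature (illustrative only)."""
    t_op = operative_temp_c(air_temp_c, mean_radiant_temp_c)
    return band[0] <= t_op <= band[1]

print(comfortable(air_temp_c=22.0, mean_radiant_temp_c=16.0))  # False: t_op is 19.0 degC
```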

The UX of Everybody: Shifting The Value Chain
The UX revolution opportunity extends well beyond the users who directly touch the building. The building value chain has a wide array of external ‘users’ who benefit from Deep Digital Twins, starting with the architect, designer, and engineer, who all want guarantees that the building will preserve their design intent. The contractor gets a worksite that digitally checkpoints the technician’s wiring, ensuring proper point mapping. Commissioning agents get built-in tools to validate the system. Managers and maintenance teams automatically receive detailed notifications, insights, and analytics of the underlying operation without integration effort. And utilities get a true demand-response system and auto-validated buildings for demand-side management programs.
 
Because Deep Digital Twins are a foundational concept with pervasive industry implications, the U.S. Department of Energy invested this year in PassiveLogic to develop an industry-standard format for Deep Digital Twin interchange. These twins live with our buildings from cradle to grave, ensuring buildings are not only optimally controlled but also meet our design and verification expectations and communicate in a universal language. This will allow every building user to reap the benefits — whether your role is installer, contractor, architect, engineer, REIT, ESCO, utility, or occupant.



About the Author

Troy Harvey is the CEO of PassiveLogic, a company developing the future of automation, built on digital twins from the ground up.

