The evolution of building and IoT automation is placing growing demands on installers, engineers, and equipment manufacturers. Rising customer requirements, and the fast-moving technology interplay between buildings, occupants, energy, and the business processes within these structures, are adding to the already heavy burden on automation systems.
We as an industry need to come to grips with the fact that building systems are the world’s most complex automated systems. Once we do, we can address our systemic problems. Even the smallest buildings easily have thousands of I/O points — or what we’d call degrees of freedom in robotic analysis. In large buildings the I/O points can exceed hundreds of thousands, and with the growth of the IoT industry, the complexity is only increasing. Only once we give buildings their due respect alongside comparable cyber-physical systems like autonomous vehicles, Mars rovers, or industrial robotics can we start the conversation about how to address that complexity.
In addition to managing this rising system complexity and evolving customer demand, there is exponential growth in the diversity of applications and use cases. We are exhausting our tools with workarounds to solve this exploding complexity. We are asked to model not only the HVAC systems, but the architectural and engineering workflow. We need more than tags, more than labels, more than interconnections. We need to describe not only the hydronic and air flows between mechanical equipment, but the data flow within and between IT and IoT systems. We need to connect the building systems not only to the structural elements, but also to the interconnected business systems within — whether that is the processes of occupants, logistics, manufacturing, energy, or any of the myriad services we are currently being asked to integrate with the building.
What we need, and why we need it now
The Quantum Digital Twin standard introduces the market’s first digital twin approach that addresses not only the straightforward use cases of describing equipment, their interconnections, and their control interfaces — but also provides a pathway to fully autonomous buildings. Here at PassiveLogic, we are primarily focused on full autonomy, and we required Quantum to usher in this next generation of technology. Quantum’s investors at the Department of Energy needed a unified building description model for interchange and smart grid systems, while many of our industrial partners simply needed to wrangle existing IoT networks or analytics systems with a complete model. None of this is doable with any of today’s semantic standards, or with other digital twin approaches that largely resemble “BIM… but in the cloud”.
Quantum addresses these market use cases (and more):
- A complete building, systems, and HVAC description language
- A model-based control description language
- A single workflow spanning design, engineering, installation, commissioning, operation, and management
- An analysis definition for retrofit, historical, and new systems
- A generalized AI definition language for real-time automation and analysis
Most importantly, Quantum balances the demands of complexity against the requirements of implementation simplicity, democratizing automation for a broader set of users. To support the workflow, we’ve built graphical tools for building Digital Twins, commissioning systems, discovering existing site topologies, comparing designs, and building custom autonomous systems. These tools are currently in private beta, but will become available for use later in 2021 — for free.
Semantics vs. Ontology
In the first part of our series, we set apart the approach of semantic systems from true ontologies. This is worth decomposing, because the history of the terminology is messy. Tim Berners-Lee had a vision of the “semantic web” back in the 1990s, built on concepts from formal logic and linguistic theory. The concepts were strongly related to the machine learning technology of the era: knowledge graphs, Prolog, and Lisp — things that, in a post-deep-learning era, we now call “traditional machine learning.” While valuable in defining formal semantic systems, this early approach conflated the terms “semantics” and “ontology”, in part due to the reference technologies of the time. After all, the state-of-the-art ML of the 1990s was expert systems, largely built on semantic concepts.
While this history is fruitful for endless nerdy discussions, its semantic orientation has shaped the course of technology development by setting the framework and goals of many standards. The underlying frameworks of existing and currently proposed building standards are largely semantic. They ask the question “what is my name?” In contrast, a true ontology asks “who am I?” One is a linguistic question, the other an existential one.
So
what? Simply put, if I know you have a “pump” in English, I can label
it. If we both agree that pumps are labeled “pump” and have a format, I
can tag it (e.g. Haystack). If we agree on an interconnect scheme I can
define a system topology (e.g. Brick). Yet for all this effort, my system still doesn’t know what a “pump” actually is or what it does. And without that knowledge, you can’t autonomously control it, optimize it, or learn from it.
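To make the distinction concrete, here is a minimal sketch in Python of the difference between tagging a pump and describing it ontologically. The field names and values are hypothetical illustrations, not the actual Haystack, Brick, or Quantum schemas.

```python
# Semantic tagging (Haystack-style): the system knows the pump's *name*.
pump_tags = {"id": "AHU1-P1", "tags": ["pump", "hotWater", "equip"]}

# Ontological description: the system knows what the pump *is* and *does*.
# (Hypothetical structure, shown only to illustrate the contrast.)
pump_ontology = {
    "id": "AHU1-P1",
    "actor_role": "transport",   # its role in any system: it moves substance
    "quanta": "liquid",          # the substance it operates on
    "physics": {
        # Governing behavior, e.g. an illustrative linear pump curve.
        "flow_m3_per_h": lambda speed: 12.0 * speed,
    },
}

# Tags let you *find* a pump; the ontology lets a controller *reason* about it:
# "this actor transports liquid quanta at a rate governed by its physics."
```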
Physical Fundamentals
In our previous article we introduced the concept of the “Gravi-Keister-limitor”: the idea that a chair is not a “chair”, but a device to keep your butt from colliding with the ground due to the force of gravity. The concept of the Gravi-Keister-limitor describes an object in existential terms. It
doesn’t matter if its semantics are “chair”, “throne”, “chaise”, or
“bench” — they all play the same role in the universe. The Quantum
standard defines a component’s existential purpose by coupling its
actor roles and substance quanta, supporting this coupling with the
physics governing them. If you want to know anything’s purpose or how
to control it, physics is the meta-language from which most other questions can be derived.
The importance of Actors
So
while Quantum includes familiar concepts of assemblies, equipment,
components, and properties, it adds the important concept of Actors.
Actors are the role a piece of equipment takes in any system. It turns
out, there are only 9 roles in any describable system. Take a buffer
tank. It’s a store. So is a battery, a sand bed, and a flash drive. A
transport on the other hand, moves substance from one place to another.
Pumps move water, fans move air, and conveyors move boxes — yet they all play the same role within their respective systems.
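As a hedged sketch of the idea (the role names below are illustrative; the standard’s full set of 9 roles isn’t reproduced here):

```python
from enum import Enum

class ActorRole(Enum):
    STORE = "store"          # holds substance over time
    TRANSPORT = "transport"  # moves substance from one place to another
    # ... the remaining roles of the standard's 9 are omitted here

# The same role spans wildly different equipment:
actors = {
    "buffer tank": ActorRole.STORE,
    "battery": ActorRole.STORE,
    "sand bed": ActorRole.STORE,
    "flash drive": ActorRole.STORE,
    "pump": ActorRole.TRANSPORT,
    "fan": ActorRole.TRANSPORT,
    "conveyor": ActorRole.TRANSPORT,
}
```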
If
a system understands what Actors are, it can discern the purpose of any
equipment, and how to orchestrate a system. More importantly, the
system can do this regardless of application or system complexity.
Using an underlying systems theory, one can define how any set of actors in any configuration can be operated and controlled. But perhaps
more importantly, it answers the key question for equipment (or its
proxy): “who am I?” If equipment can answer this existential
“machine-2-self” question, it can also answer the simpler ones of
machine-2-machine and machine-2-human. Those other interactions become
just semantic downcasting.
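To illustrate what “semantic downcasting” might look like in code (a hypothetical sketch; the mapping below is ours, not part of the standard): once equipment can answer “who am I?” as a role plus quanta, a human-facing label is just a lossy projection of that answer.

```python
def downcast_to_label(role: str, quanta: str) -> str:
    """Project an ontological identity onto a human-facing English label."""
    labels = {
        ("transport", "liquid"): "pump",
        ("transport", "air"): "fan",
        ("store", "liquid"): "buffer tank",
    }
    return labels.get((role, quanta), f"{quanta} {role}")

print(downcast_to_label("transport", "liquid"))  # -> "pump"
```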
The role of Quanta
The counterparts to actors are quanta. Quanta are the packets of substance exchanged between actors. They are quantized so they can be operated on. Quanta can be thought of as packets of continuous flows, or as discrete packages. To expand our earlier example of transports: what makes a pump unique is that it deals in the flow of liquid quanta, a fan transports air quanta, and a conveyor belt deals in box quanta. Actors are the processors, and quanta are their currency.
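A minimal sketch of the actor/quanta coupling, using hypothetical types rather than the Quantum schema:

```python
from dataclasses import dataclass

@dataclass
class Quantum:
    substance: str   # "liquid", "air", "box", ...
    amount: float    # packet size, e.g. in kg or m^3

def transport(handles: str, packet: Quantum) -> Quantum:
    """A generic transport actor. What specializes a pump vs. a fan vs. a
    conveyor is simply the substance of the quanta it processes."""
    if packet.substance != handles:
        raise ValueError(f"this transport only handles {handles} quanta")
    return packet  # moved downstream, unchanged in substance

pump_out = transport("liquid", Quantum("liquid", 0.5))
```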
Quantum Create is the tool for building custom Digital Twins in the Quantum format
The evolution of AI
Today
semantics only provide labeling. You can collect labeled data in data
lakes. But then the long slow process begins to mine that lake of data
for information. Machine learning is an afterthought for someone to
figure out in the future. After all, the data is labeled… right? The problem is you can never reconstruct what was not there in the first place. Today’s data from BMS and control platforms is so sparse
that there is little to be learned or gained no matter how good the
model. Even leaving aside the question of how you are going to label
the data in the first place (a subject for the next article in this
series), the utility will be limited.
Unlike the traditional ML of the semantic web days, we are now in a post-deep-learning world, a radically altered technological landscape from that of the early semantic web. Yet deep learning has been of limited utility in the generalized control world, beyond simple schemes to optimize algorithmic control.
At PassiveLogic we’ve built a new AI technology called Deep Physics that reimagines neural nets as a heterogeneous network of Actors interconnected by Quanta. In this new world, Quantum literally is the AI, directly computable by a Quantum AI Engine. We hit a new milestone in March, benchmarking Quantum at 10 million times faster optimization than EnergyPlus with GenOpt.
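As a loose sketch of the idea (our illustration, not PassiveLogic’s implementation): if the twin is a heterogeneous graph of actor nodes whose edges carry quanta, then evaluating the twin is just stepping that graph, which is what makes it directly computable.

```python
def simulate(actors, edges, state, steps):
    """actors: {name: step_fn(state, inputs) -> (new_state, output)}
       edges:  (src, dst) pairs along which quanta flow."""
    outputs = {name: 0.0 for name in actors}
    for _ in range(steps):
        new_outputs = {}
        for name, step_fn in actors.items():
            # Each actor consumes the quanta its upstream actors emitted
            # on the previous step, then emits its own.
            inputs = [outputs[src] for src, dst in edges if dst == name]
            state[name], new_outputs[name] = step_fn(state[name], inputs)
        outputs = new_outputs
    return state

# Example: a pump (transport) feeding a buffer tank (store).
actors = {
    "pump": lambda s, ins: (s, 2.0),             # emits 2 units of liquid quanta
    "tank": lambda s, ins: (s + sum(ins), 0.0),  # stores whatever arrives
}
print(simulate(actors, [("pump", "tank")], {"pump": None, "tank": 0.0}, steps=3))
```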
Autonomous systems, topology definition, and interchange — with one format
Quantum was designed to scale across the diverse set of use cases we encounter in buildings, the business processes within them, and the energy networks that interconnect them. It scales down to simple use cases, yet maintains the descriptive power to enable the next generation of applications in the largest interconnected IT and IoT control systems.
In the simplest deployments, projects may only need to structure data for an IoT network. In retrofit installs, you may need to capture and control an existing topology — whether a 30-year-old analog system or a pure IP network. In new construction you might need to automate a whole workflow including engineering, installation, commissioning, and analytics. And in innovative edge AI systems, you need the complete descriptive power to achieve full autonomy.
But
the world is more than automation and IoT networks. The digital
building is evolving beyond building systems. We must also model
business processes, manufacturing, process control, and the like.
Today, we are being asked to co-manage and monitor the digital and
physical goods and services flowing in and around the built environment
— as a system. And beyond the walls of the building, Quantum was also
developed to be a portable format for smart grid systems, district
systems, demand-response, and real-time peer-to-peer energy grids.
But
given all these requirements, how do you enable a larger audience in
building systems? In a word: tools. Bridging the divide between
simplicity and fully featured AI solutions can only happen with high-quality tools.
We
built Quantum Create as a visual application suite that integrators,
engineers, manufacturers, and service providers can use to build custom
Digital Twins in the Quantum format. Currently in early private beta,
Quantum Create will be freely available later this year.
With Quantum as the backbone, complex building topologies are simple to design, build, operate, maintain, and manage.
Stay on the lookout for part 3 of this series, where we will dive into
the different applications of Autonomy Studio in greater detail.