Recipe to Create Smarter Buildings?
Did I just come up with the recipe to create Smart Buildings?
The Building Whisperer
Creating smart buildings from existing buildings and letting others innovate with them in a secure, standardized and easy way. Isn't that what it's all about?

Step one. Scan the asset (building, ship, car, etc.) in order to create the shell of the 3D copy to hang up the objects in a contextual database to understand relations. Mark up where the sensors go in the digital twin, install them in the asset.

Turn the sensors on, and you'll have neurons which can deliver information through neural pathways, straight into the digital twin database. The brain can exist both in the building and outside (edge and cloud).

Step two is all about connecting existing equipment, with a bolt-on, auto-deployable connector library, just sticking them in the ground like planting seeds, and that will populate the digital twin even more with data from existing equipment in millions of messages per second. Schema, tagging, can be added on the fly.

This is part of the IoT/IoB first strategy because existing data is hard to get out from existing equipment for various reasons. Not necessarily technical. This approach ensures smart city inclusiveness and the definition of sustainable buildings.

What do you think?
This was originally posted on Linkedin in my personal feed on the 25th of August. Since then, it has generated about 5,000 views, 42 interactions (likes, etc.) and over 60 comments and counting. It has sparked great discussions, and I will try to interpret the findings from the comment section in this article. And if you have some minutes over, I think it's well worth it to go in and see some of the comments that I got. Please weigh in with your interpretations and help me become better! But here goes -
Creating smart buildings from existing buildings and letting others innovate with them in a secure, standardized and easy way. Isn't that what it's all about?
The reason why I want to come up with a smart building recipe for existing buildings is fairly simple. I think most of us have heard the "45% of the world's total energy consumption" phrase, and that we spend 90% of our time indoors, where lighting, IAQ, noise, humidity, indoor climate in general and tenant well-being are super important. Yes, all of this might happen if we build smarter, and it's much easier to get money for initiatives in the construction phase. But I don't care that much, since I'm all about working in the present. I want to change all the buildings in the world NOW.
And all of this should be done with help from others, in a secure, standardized and easy way. Collaboration, open standards, the New Deal for Buildings, focusing on tenant experience and users as the north star, all this stuff. Isn't that what it's all about? I think so.
Step one. Scan the asset (building, ship, car, etc.) in order to create the shell of the 3D copy to hang up the objects in a contextual database to understand relations. Mark up where the sensors go in the digital twin, install them in the asset.
This was probably the most controversial one of them all. The critique
was mainly around two things.
I had a great exchange here with Rune Winther who’s also written an
amazing thought piece regarding 3 types of
characteristics and 6 types
of components for smart building descriptions. I think it will be
published in some shape or form here at automatedbuildings.com as well.
Rune questioned whether the first step should really be to deploy technology if you don't have an idea what to use it for. And sensors put in one place might not serve a business case for the tenants, but they might be in the right place for the cleaning department, or for someone else. So basically, what value are we creating, and for whom?
I agree with this a lot. I've been in dozens of workshops and proofs of concept, and I've also played the waiting game for technology, money, business case, "others should do it first," etc. But I also think that seeing is believing. One of the major hurdles when it comes to technological initiatives (IT projects in general) is that it's difficult to get management buy-in. Where is the ROI? Where is the pre-study? And this is tricky, and an article on its own.
My conclusion to this conundrum was to say that it depends. And I would recommend a combined approach: include technology in the workshops, deal with live data from sensors, and why not just scan a portion of a building to have something to show the participants of what can be done. If data is coming through and someone from the tenant stakeholder corner says "But this doesn't show me how warm it is by my desk" - you've gained an insight based on data, not speculation. And if someone says, "this is what I need exactly," then you know that you are on to something. And everyone will see that it's so damn easy today to get started with something that doesn't involve digging into existing systems.
This one is also very interesting. If you haven’t had the time or haven’t seen it, I was just recently in a podcast discussing some of my thoughts regarding Digital Twins and my overall perspective on all things smart. You can find it here in my Linkedin feed and also here at the Spaces and Places podcast from Site1001.
Starting from the end, let's say 10 years from now, I hope that most commercial real estate portfolios will be digital twin enabled. Judging from the go-to self-written guide of the BB-Cycle, I think that it will take more than that, but I do think it will be the end-game, so why not start with it right now. What I mean by the scan is just to go in, 3D scan the floor, the entire building, the room, and get the point cloud together. That's fairly cheap and will only take a couple of hours if you keep it defined.
This is of course but a fraction of the true digital twin, where everything is made into 3D objects, stacked with metadata, and acting as the universal database for everything regarding the building. It's not that, by far. But getting a 3D copy of the physical space will allow others to understand the lay of the land much more easily.

I just want all of this data to be contextualized in a 3D shell so that everyone can relate to it and understand where the sensors are from a "normal" perspective, instead of having to understand it from a technical one.
1. Image of traditional dashboarding without digital twin thinking
2. Overlay of existing data in a digital twin
The two pictures that I got from Digital Twin expert Ken Olling at Sekai show the exact same things. I find it much easier to understand picture number two, and it also has the added benefits of faster troubleshooting, probably also education with AR/VR overlays, and finding root causes in a much simpler way. Of course, all of this is built upon a massive database in the future. But to start with, I just want to scan the asset and get the stuff in there, because I think it will lead to easier management buy-in.
A digital twin is much more than just these pictures, and it requires an investment that might not have an immediate ROI equation today. But seeing is believing.
Turn the sensors on, and you'll have neurons which can deliver information through neural pathways, straight into the digital twin database. The brain can exist both in the building and outside (edge and cloud).
Invite a sensor company: they could go in, mark up the places where the sensors should go, ship them on-site, and someone could just drop them in and have them communicate with the local control gateway, with data streaming to a visualization platform immediately.
Dave Lapsley from the UK company Econowise Group put it nicely in one of his comments to another Linkedin post, from Phillip Kopp at Conectric, discussing the need for 2100x the amount of data that is usually collected today.
I paraphrase a bit, but it’s a great comment:
"Just walk into a building with a box of sensors, fit them where required. Switch on a tablet, and it becomes an edge server for BACnet, Modbus, M-Bus and sensors. Drag and drop to create visually stunning dashboards complete with real-time query engine, edge fault detection and custom reporting."
This is quite easy to do today, and with the benefits of open standards and defined APIs you don't really have to be locked into a vendor unless you really want to. There are pros and cons with everything: open doesn't necessarily mean better, or insecure, and proprietary doesn't necessarily mean worse, or secure. It always depends.
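As a rough sketch of what such an edge layer could do, here is a minimal, hypothetical normalization step. Every name in it is invented for illustration and is not tied to any vendor's product or to the actual BACnet/Modbus libraries; the point is simply that raw register reads from different protocols can be turned into one uniform, tagged event format at the edge:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorEvent:
    """A protocol-agnostic event, as an edge gateway might emit it."""
    source: str           # e.g. "modbus/unit3/reg40001" (hypothetical naming)
    value: float
    unit: str
    tags: dict = field(default_factory=dict)
    timestamp: str = ""

def normalize(protocol: str, address: str, raw: int,
              scale: float = 1.0, unit: str = "", **tags) -> SensorEvent:
    """Turn a raw register read into a tagged, timestamped event.

    Schema and tags are attached on the fly, so downstream consumers
    (dashboards, the digital twin database) see one uniform format no
    matter whether the data arrived over BACnet, Modbus or M-Bus.
    """
    return SensorEvent(
        source=f"{protocol}/{address}",
        value=raw * scale,
        unit=unit,
        tags=tags,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

# A Modbus temperature register holding tenths of a degree Celsius:
event = normalize("modbus", "unit3/reg40001", 215, scale=0.1, unit="degC",
                  floor="2", room="2.14", kind="temperature")
print(event.source, round(event.value, 1), event.tags["room"])
# → modbus/unit3/reg40001 21.5 2.14
```

The design choice worth noting is that the contextual tags (floor, room, kind) travel with every event, which is what lets the digital twin place the reading without consulting the source system.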
Step two is all about connecting existing equipment, with a bolt-on,
auto deployable connector library, just sticking them in the ground
like planting seeds, and that will populate the digital twin even more
with data from existing equipment in millions of messages per second.
Schema, tagging, can be added on the fly.
This was met with some skepticism, and the suggestion that I probably missed a step or two in between. And I agree to some extent. The challenge, as I see it, is to connect to the existing and make the new and the old work in harmony.
Am I talking about technology? Yes.
Am I also talking about the new and the old as in OT and IT (and IoT)?
And am I also talking about the old(er) generation working alongside the young(er) generation? Yes.
And am I being overly annoying right now? Yes.
But this is the thing. How can we enable these worlds to come together, right now, and utilize the best of all worlds in order to move faster to value creation in a collaborative way? To play to each other's strengths and work with ecosystem thinking and joint ventures? But
I wrote all about it in my last article, discussing the need for streaming platforms, based on connectors to all of the existing data sources. It's the last segment before the conclusion. Basically, this allows anyone to connect to proprietary data sources and databases and get the data out in an event-streaming way. You can sit behind SCADA systems, OPC UA, Modbus, etc., and get the data out in a way that meets the most rigorous IT standards today. Linkedin, Netflix, financial institutions, healthcare: everything today is being built with Kafka connectors and event streaming in mind. So why should building automation be any different?
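The connector pattern described above can be sketched in miniature. This is a toy, in-memory stand-in, not Kafka's actual API (a real deployment would use something like Kafka Connect); all class and function names here are made up for illustration:

```python
from typing import Callable, Iterator

class EventStream:
    """A toy stand-in for a streaming-platform topic (e.g. a Kafka topic)."""
    def __init__(self) -> None:
        self.events: list[dict] = []
        self.subscribers: list[Callable[[dict], None]] = []

    def publish(self, event: dict) -> None:
        self.events.append(event)
        for handler in self.subscribers:
            handler(event)

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self.subscribers.append(handler)

def legacy_source_connector(poll: Callable[[], Iterator[tuple[str, float]]],
                            stream: EventStream) -> None:
    """Pull data out of a proprietary source and push it onto the stream.

    `poll` hides the vendor-specific part (SCADA, OPC UA, Modbus...);
    everything downstream only ever sees uniform events.
    """
    for point, value in poll():
        stream.publish({"point": point, "value": value})

# A fake "legacy BMS" that we can only poll, not subscribe to:
def fake_bms_poll():
    yield ("AHU1/supply_temp", 18.2)
    yield ("AHU1/fan_speed", 0.65)

topic = EventStream()
received = []
topic.subscribe(received.append)          # e.g. a digital twin ingester
legacy_source_connector(fake_bms_poll, topic)
print(len(received))  # prints 2
```

The key idea is the inversion: the connector adapts to the legacy source once, and any number of consumers (dashboards, fault detection, the digital twin) subscribe to the stream without ever touching the proprietary system.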
This is part of the IoT/IoB first strategy because existing data is
hard to get out from existing equipment for various reasons. Not
necessarily technical. This approach ensures smart city inclusiveness
and the definition of sustainable buildings.
The IoT (direct to the cloud) is something that I don't believe much in. I do believe in wireless sensors and the Internet of Buildings. And the Internet of Things will be there, in the sense that everything will be connected, but not directly to the internet. Local control, with security in mind: that's where it should be.
I've found it hard to get data out from existing assets due to people, vendor lock-in, closed mindsets, etc. Maybe I've just been working with the wrong people, who knows. But I want to have the control. Or at least, be in a position to have alternatives and to allow customers to own their own data without having to beg, pay, ask, search for it, or some combination thereof.
I believe in the one-API approach, in that it is the real estate owners who should be in control and have the keys to their buildings. Not only the physical keys but also the digital keys when it comes to allowing and enabling access to the digital twin. Security means controlling who goes in and out of the digital twin and who can access the BMS systems. This is all part of sustainable buildings.
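To make the "digital keys" idea concrete, here is a rough, hypothetical sketch (not any real product's API): the owner holds a keyring and hands out scoped, revocable access per stakeholder. All names are invented for illustration:

```python
class DigitalKeyring:
    """Hypothetical sketch: the building owner holds the keys and grants
    scoped, revocable access to the digital twin."""

    def __init__(self, owner: str) -> None:
        self.owner = owner
        self.grants: dict[str, set[str]] = {}   # party -> allowed scopes

    def grant(self, party: str, *scopes: str) -> None:
        self.grants.setdefault(party, set()).update(scopes)

    def revoke(self, party: str) -> None:
        self.grants.pop(party, None)

    def allowed(self, party: str, scope: str) -> bool:
        return scope in self.grants.get(party, set())

keys = DigitalKeyring(owner="RealEstateCo")
keys.grant("cleaning-vendor", "read:occupancy")
keys.grant("analytics-startup", "read:energy", "read:iaq")

print(keys.allowed("cleaning-vendor", "read:occupancy"))   # True
print(keys.allowed("cleaning-vendor", "write:bms"))        # False
keys.revoke("analytics-startup")
print(keys.allowed("analytics-startup", "read:energy"))    # False
```

The point of the sketch is the asymmetry: third parties innovate against the twin, but the owner can grant or pull any key at any time without touching the building's systems.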
Because as I've written before, a building and its solutions have to be robust, useful and attractive for its users at all times. And when the scope grows to people and companies from the IT realm, and possibly stakeholders that haven't been included in the BAS/BMS conversations before, products need to shift accordingly and be equally robust, useful and attractive for the new stakeholders.
Robust in this sense, might mean modular to pass the test of time, and
to ensure smart city inclusiveness in that products, and buildings, are
part of a bigger whole. It’s about technical interoperability, semantic
interoperability, organizational interoperability, legal
interoperability and people, processes, culture, hierarchy and
IT/OT/IoT somewhere in the middle.
It's indeed a paradigm shift we see here. I recently listened to the ControlTrends podcast, episode 315, featuring Jim Young from the Realcomm conference group. If you haven't listened to it/watched it, do it now: fantastic insights into the past, the present and the future. Is it easy to do all of this on your own? No. Impossible. Is it easier if you go at it together with others? Most likely. It's a joint effort from a lot of different disciplines, and it will be an era of ferment for many in the years to come!
Summary and Conclusion
So what can I say? Go agile, work in sprints, collaborate, show value quickly and fail fast. It's better to do something and be wrong than just to sit around waiting for the right technology or stages in a vetting process. It should be more about PEOPLE, and about understanding that technology is just technology.
It's so easy to get stuff going today, and there's so much value to be created that it's mind-boggling. I love this industry and the people in it, and I'm looking forward to the coming years. Just go out there and do it, and start listening to what the buildings are telling you. And if you don't know how, just let me know and I'll try to help out!
/The Building Whisperer –