December 2013
Interview

AutomatedBuildings.com


EMAIL INTERVIEW: Darrell Smith and Ken Sinclair

Darrell Smith, Director of Facilities & Energy, Microsoft

Darrell Smith is the Director of Energy and Building Technology for Microsoft’s Real Estate and Facilities group. Darrell oversees Microsoft’s Global Energy Strategy and Programs for Microsoft’s 34 million square foot real estate portfolio. In addition, Darrell is accountable for programming and deploying the Smart Building technology at Microsoft’s 15 million square foot headquarters campus in Redmond, Washington, and is chartered with broader deployment across the global portfolio. Darrell has 15 years of industry experience in facilities, data center operations, energy management, and manufacturing.


Redmond Operations Center (ROC)

We are connected to two million data “points” across 35,000 building assets, and we collect 500 million data transactions every day.



Sinclair: We have heard of Microsoft’s Smart Building effort at your HQ campus. Can you provide more context on what this is about?

Smith: Our HQ campus in Redmond, Washington is on the scale of a small city. The campus consists of 15 million square feet, 125 buildings, and 58,000 personnel. We are connected to two million data “points” across 35,000 building assets, and we collect 500 million data transactions every day. Historically we have not had the ability to leverage this “Big Data” to optimize our campus. In fact, our energy and engineering reporting had been a manual process (clipboards and spreadsheets).

Instead of a costly, intrusive retrofit, our project was anchored in using software to extract the information across multiple disparate building systems (multiple protocols) to make data-driven decisions. Aggregating this data onto a common platform was seamless to the occupants, and we added no hardware or sensors. We are using the operational information from the building systems to optimize our portfolio. With the data in hand, my team can take action to reduce energy consumption, optimize building assets, and improve labor efficiencies (reduce windshield time).

After testing multiple vendors’ solutions across 2.6 million square feet for a year, we were excited to see that the technology could meet our requirements and that there was a significant return on investment. While the vendors performed well, we selected the vendor that best met our requirements, Iconics, and we are currently at the conclusion of the deployment. We are in a private cloud (running on Hyper-V servers), and this allows us to optimize the IT infrastructure and further drive down our energy consumption (and carbon footprint), which supports Microsoft’s commitment to carbon neutrality.
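As a rough back-of-the-envelope check (the figures come from the interview; the arithmetic is ours), 500 million transactions per day across two million points works out to roughly 250 readings per point per day, or about one reading every five to six minutes per point on average. A minimal sketch:

```python
# Back-of-the-envelope check on the reported data volumes (illustrative only;
# the figures are from the interview, the arithmetic is not).
transactions_per_day = 500_000_000
points = 2_000_000

readings_per_point_per_day = transactions_per_day / points          # ~250
minutes_between_readings = 24 * 60 / readings_per_point_per_day     # ~5.8 minutes

print(f"~{readings_per_point_per_day:.0f} readings per point per day")
print(f"~one reading every {minutes_between_readings:.1f} minutes per point")
```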

Sinclair: What are the complexities of an integration at this scale?

Smith: If we could not expose the data across ALL our building systems, our Smart Building effort would have had less impact. I think of two activities with regard to building data: encapsulation and conversion. Encapsulation is about transporting the data, and conversion is translating the different languages (protocols) to a common language. The majority of our building systems were BACnet, and we were able to consume the data in a secure way without the need for any conversion. However, with one system, there was some effort required. In the past, when systems were installed, there was not a need (or tools) to use the building data. In fact, building owners may not be aware of what specifications to ask for with regard to building management systems (BMS). While I would say the industry is moving to open standards, our pilot required the smart building vendors to prove they could consume data from a variety of protocols.

Another important area of integration is the building assets’ naming convention. We needed to do clean-up on the naming convention for a couple of buildings. Luckily, the majority of our naming convention was standardized early on, and we have a facilities system business rules (FSBR) process that includes reviewing all projects to assure our standards are met. Within this process, we have a CMAT table that informs the contractors of which system points we want to control, monitor, alarm, and trend. This investment assures consistency across our portfolio and proved to be an invaluable part of the Smart Building integration.
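Purely as an illustration of the idea (the point names, pattern, and fields below are invented; they are not Microsoft’s actual FSBR/CMAT standard), a naming-convention check paired with a CMAT-style table might look like this:

```python
# Hypothetical sketch of a point-naming check plus a CMAT-style table
# (control, monitor, alarm, trend). All names, patterns, and flags are
# invented for illustration, not Microsoft's actual FSBR/CMAT standard.
import re

# Example convention: BUILDING-SYSTEM-ASSET-POINT, e.g. "B16-AHU-03-SAT"
POINT_NAME = re.compile(r"^B\d{1,3}-[A-Z]{2,4}-\d{2}-[A-Z]{2,5}$")

# CMAT flags per point type: which points we control, monitor, alarm, trend
CMAT = {
    "SAT":  {"control": False, "monitor": True,  "alarm": True,  "trend": True},  # supply air temp
    "DMPR": {"control": True,  "monitor": True,  "alarm": False, "trend": True},  # damper position
    "ZNT":  {"control": False, "monitor": True,  "alarm": True,  "trend": True},  # zone temp
}

def validate(point_name: str) -> bool:
    """Return True if the name matches the convention and its type is in the CMAT table."""
    if not POINT_NAME.match(point_name):
        return False
    point_type = point_name.split("-")[-1]
    return point_type in CMAT

print(validate("B16-AHU-03-SAT"))    # True
print(validate("ahu3 supply temp"))  # False -> flagged for clean-up during onboarding
```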

Sinclair: What were the elements of the onboarding process?

Smith: There are three steps to the onboarding process: integration, configuration, and adding our fault rules. This project has given me an appreciation of the integration practice; in a standard deployment, the integration role is typically provided by a third-party integrator. We leverage CB Richard Ellis (our facilities management firm) to provide the integration function. The overall process consists of completing a load sheet, and the software vendor (Iconics) provides the configuration. The next step is our engineering team adding the fault rules and cost savings algorithms. What may be surprising is that the initial deployment team consisted of five individuals plus the vendor. The team members included a project manager, two controls engineers, a mechanical engineer, and an IT manager. As we went through the process, our rhythm was onboarding two buildings a week. We are now focused on operationalizing the solution.
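The interview does not describe what the load sheet contains; as a hypothetical sketch only, the hand-off between the three steps could be modeled as one row per asset, which the vendor uses for configuration and engineering later attaches fault rules to:

```python
# Hypothetical load-sheet row used during onboarding (illustrative only;
# the actual fields in Microsoft's load sheets are not described in the interview).
from dataclasses import dataclass, field

@dataclass
class LoadSheetRow:
    building: str            # e.g. "B16"
    asset_id: str            # e.g. "AHU-03"
    asset_type: str          # e.g. "air handler"
    protocol: str            # e.g. "BACnet"
    points: list = field(default_factory=list)  # standardized point names for this asset

# Step 1 (integration): the integrator fills in the sheet from the BMS.
row = LoadSheetRow("B16", "AHU-03", "air handler", "BACnet",
                   points=["B16-AHU-03-SAT", "B16-AHU-03-DMPR", "B16-AHU-03-FANST"])

# Step 2 (configuration): the vendor maps these rows into the platform.
# Step 3 (fault rules): engineering attaches fault rules and cost algorithms per asset type.
print(row)
```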

Sinclair: What capabilities did this provide that you lacked before?

Smith: I’m excited about the vendors’ ability to build on Microsoft technology and the ability of the software platform to provide new and exciting capabilities. For example, the vendors are building on .NET, SQL, Windows Server, Office 365, Azure, and Bing Maps, and as Microsoft comes out with new features, the vendors can build new capabilities into their products. It’s important to note that while we are providing a software layer across multiple building systems and bringing the data to a common platform, I still need the controls OEMs to build solid solutions.

That said, one of the capabilities driving down a significant amount of energy consumption is fault detection. Fault detection has been around for several years, and we are using the software to “cast a net” across the campus and detect when equipment is either not running as designed or when system set-points have been changed outside our standards. When we integrated the first building, we fixed all the anomalies we found. We then realized that, at this rate, it would take us 10 years to complete the deployment. We shifted to the 80/20 rule and focused on the faults that either had the highest energy savings or posed the largest risk to the business. From a work stream standpoint, the value of creating “rules” and cost savings algorithms is to prioritize our work. It was not about hiring more labor to address the faults we were finding; the “tool” allowed us to prioritize our work based on energy wasted (costs) or business impact (unplanned downtime), and 48% of the faults we were finding could be corrected in 60 seconds or less. For the other 52%, we had to physically make a repair.

As an example of a common fault, a terminal unit has its damper stuck open and is overcooling the space, and the heating side comes on to compensate for the cooling. This condition is known as “simultaneous heating and cooling.” In most cases, the occupant is unaware because the space is within the designed set-point, and we (maintenance) are unaware because this is not an “alarm” condition. These types of faults are invisible and can occur frequently. When faults occur in larger assets, they can lead to higher energy waste.

Another key benefit is the reporting capability. While some of the systems did have the ability to create trend logs and extract the data, it was not consistent across all systems and was typically a manual effort. With the software, we can automate asset performance reporting. Our air-handler (AHU) report looked at how the system performed over time and across seasons, and it could take 4-6 hours to create. Now the same report can be done in two minutes, and we have 750 AHUs across the campus. At 4-6 hours per report, completing the reporting for all units across the campus would have taken years; now it can be completed in hours.
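As a minimal sketch of the kind of rule described above (the point names, thresholds, and cost model are assumptions for illustration, not the rules actually deployed at Microsoft), a simultaneous-heating-and-cooling check with cost-based triage could look like this:

```python
# Minimal sketch of a simultaneous-heating-and-cooling fault rule with
# cost-based prioritization. Thresholds, point names, and the cost model
# are illustrative assumptions, not the rules deployed at Microsoft.
from dataclasses import dataclass

@dataclass
class TerminalUnitSample:
    asset_id: str
    cooling_damper_pct: float   # 0-100, cooling air damper position
    heating_valve_pct: float    # 0-100, reheat valve position

def simultaneous_heat_cool(sample: TerminalUnitSample,
                           damper_threshold: float = 80.0,
                           valve_threshold: float = 20.0) -> bool:
    """Flag a fault when the unit is cooling hard while reheat is also active."""
    return (sample.cooling_damper_pct >= damper_threshold
            and sample.heating_valve_pct >= valve_threshold)

def estimated_daily_cost(sample: TerminalUnitSample,
                         kw_wasted_at_full_reheat: float = 3.0,
                         dollars_per_kwh: float = 0.08) -> float:
    """Very rough cost estimate, used only to rank faults in the work queue."""
    kw = kw_wasted_at_full_reheat * sample.heating_valve_pct / 100.0
    return kw * 24 * dollars_per_kwh

samples = [
    TerminalUnitSample("B16-VAV-112", cooling_damper_pct=95, heating_valve_pct=60),
    TerminalUnitSample("B25-VAV-031", cooling_damper_pct=40, heating_valve_pct=0),
]

faults = [s for s in samples if simultaneous_heat_cool(s)]
# Prioritize the work queue by estimated energy cost (the "80/20" triage).
for s in sorted(faults, key=estimated_daily_cost, reverse=True):
    print(f"{s.asset_id}: ~${estimated_daily_cost(s):.2f}/day wasted")
```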

Sinclair: What was the business case?

Smith: The “tool” is changing how we manage our campus. Having this data at our fingertips allows us to commission (or “tune”) the buildings in real time. This lets us save energy immediately and focus our labor more effectively. Before, our rhythm was to retro-commission (or tune) each building once every five years. This was a manual process that covered 20% of the campus each year, tuned 200 assets, and resulted in roughly $250K of energy reduction per year. Now, because we have the data at our fingertips, we can automate the process and commission the entire campus in one year, and instead of touching 200 assets, we can address 35,000 assets. Once a building is commissioned, we have optics in real time, and if the building’s assets (mechanical systems) start to run out of “tune,” we catch it as it occurs.

The power of software and leveraging our partner eco-system provides tremendous efficiencies and, I feel, is positioned to change the facilities management industry. The result has been a $2M reduction in energy in the first 12 months after the deployment was completed, an ROI of less than 18 months for the program. Note: Washington State has the third-lowest energy cost (per unit) in the US. If we can achieve an 18-month return here, imagine the return in a state with a higher cost of energy. The tool also provides persistence. After we “tune” a building, if set-point changes are made that degrade performance, they are flagged and corrected. We don’t wait five years for the next retro-commissioning cycle, and this is huge for me.
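For readers who want the arithmetic behind these figures, a simple payback calculation is sketched below. The annual savings come from the interview; the program cost is not disclosed there, so it is a hypothetical input used only to show the formula:

```python
# Simple-payback sketch for the figures quoted above. The annual savings are
# from the interview; the program cost is NOT stated there, so the value below
# is a hypothetical placeholder chosen only to demonstrate the formula.
annual_energy_savings = 2_000_000   # $2M in the first 12 months (from the interview)
program_cost = 2_500_000            # HYPOTHETICAL: not disclosed in the article

payback_months = program_cost / (annual_energy_savings / 12)
print(f"Simple payback: {payback_months:.1f} months")  # <18 months whenever cost < $3M

# Coverage comparison quoted in the interview: the old manual cycle touched
# ~200 assets per year; the software addresses 35,000.
print(f"Asset coverage ratio: {35_000 / 200:.0f}x per year")
```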

Sinclair: How would you characterize the solutions on the market today, and what is your call to action?

Smith: When I first started my research in this area in 2009, my concern was that the solutions on the market were not mature enough and we would need to pause our procurement process until the market matured. After completing our gap analysis with Smart Buildings Inc, we applied our principles: the solution has to be beneficial, practical, scalable, and supported. The good news is our pilot project proved there are several off-the-shelf solutions that meet our principles.

For others who want to pursue this path, my recommendation is to select a strategy and solution that fits what you’re trying to solve for (one shoe may not fit all). There are Software as a Service (SaaS) solutions, like Ezenics, where the building system data is sent to a “cloud” and the vendor provides the engineering and analytics for their customers. There are also on-premises solutions, like Iconics, where the software is installed within the company firewall. Not only are there different network architectures, the depth of the solutions varies. For example, there are cost-effective solutions like SwitchAutomation, a cloud-based solution that provides analytics on utility and weather data. As part of our ongoing testing of market solutions, SwitchAutomation integrated over 180 of our utility power meters in two days, which is pretty impressive.

I’m very fortunate in that I have a 15 million square foot lab and the latitude to experiment. My call to action is for companies, building owners, and building operators to take action; do something. There is a range of solutions on the market today and consultants who can assist with setting strategies and tactics. We are at a time when taking no action is no longer acceptable.

Further related reading:

88 Acres: How Microsoft Quietly Built the City of the Future

http://www.newdaedalus.com/articles/2013/3/6/energy-and-the-microsoft-roc.html
http://www.greentechmedia.com/articles/read/microsoft-takes-a-step-toward-programming-the-city-of-the-future
http://www.realcomm.com/advisory/advisory.asp?AdvisoryID=579
http://smartcitiescouncil.com/article/microsoft-smart-campus-makeover-saving-millions-energy-costs

