May 2012
Article
AutomatedBuildings.com



Wasted Data

A wasted opportunity.

Peter Chipkin
Control System Specialist
Chipkin Automation Systems Inc


Introduction

By using the cloud to break down the boundary between building automation (usually a local concern) and the enterprise, we can increase the value of automation projects to the enterprise.

At the local level we care that the temperature and air quality in a lecture hall, for example, meet the requirements of the users, and that the system responds to local commands like changing the temperature set point.

At the cloud level we care about energy consumption, service interruptions caused by failures, managing consumables, policy monitoring and enforcement, and the aggregation of data for comparative and management purposes.

While the cloud goals will include the local goals, they will always be different and far more comprehensive. There is a huge investment in gathering data at the local level, but often all that is done is to lock this data down and keep it local.

This article looks at how enterprises can use this data, how they can benefit from this data and some technologies for addressing this issue.

There are three parts: 1) Why the cloud, 2) How – an example device, and 3) Webby jargon demystified.

CAS Data Clients

CAS Data Clients gather local data and serve it to the cloud for enterprise applications, or serve rich graphical web pages like the one below directly to the cloud and to local browsers.

The image represents a visualization, served from the cloud, of data gathered at a local installation. The cloud, enterprise and local views can all be the same. Image courtesy of dglogik.com, which powers the CAS Data Client web pages.

Part 1 – The uses and benefits of sending local data up the cloud chain

Centralized Monitoring and Operations Management

Using the cloud, enterprises can implement centralized support and operations systems because they can remotely monitor and command local installations. While many benefits are obvious, consider the effect of having operations staff learn from the experience of multiple sites rather than just their own, or the ability to trend issues across sites and discover that a problem is global rather than local, and more.
 
Energy Management

Depending on your locations, your enterprise might need to coordinate its energy consumption across multiple sites to optimize costs and to ensure that, as an enterprise, it does not exceed load allowances and peak limits. Without cloud data collection, how hard is it to compare energy consumption between locations, or to adjust for other factors like outside air temperature? So much of this data is already available at the local level that it is easy to provide it to your cloud.

Energy Management 

Image by Electrical-Equipment

References and links:
Virtual Building Energy Management – Automated Buildings.
Office of Energy Efficiency – Govt Canada 

Policy Monitoring and Enforcement: The cloud is the medium
 
Without the ability to compare variables at different locations you cannot identify the locations that provide examples of excellence, and you cannot easily identify which operations should be emulated elsewhere. Comparison allows the identification of outliers – both poor performers and centers of excellence – and outlier identification is a critical management operation for enterprises.

By providing centralized and aggregated data using the cloud, enterprises can ensure that local sites or subsidiaries are complying with operation and other requirements.

There are some things we know we don’t know
 
Fifteen years ago the profession of data mining did not exist; we didn’t have the data to mine. Tons of aggregated and historical data provide the opportunity to learn something new and to use it in new ways. Collation and correlation tools provide opportunities to see things in new ways and to observe patterns and trends that the enterprise may not even know exist.
 
Larger Sample of Data

Larger data samples allow enterprises to look at issues like Mean Time Between Failures for various equipment items used in local automation. Improved preventative maintenance policies can be developed. Procurement policies can be modified to exclude vendors with poor equipment.

Reference: Forbes - Cloud Computing Meets Energy Management
Reference: Powerit Takes Big Factory Energy to the Cloud

Part 2 – Technologies for gathering and serving data
 
This part of the article describes the features of one particular product that allows enterprises to aggregate data, and explains the various ‘webby’ technologies used to transfer this data up the cloud ladder.
 
Data Clients


A data client is a product that collects shop floor or field data, logs it, and makes it available for remote monitoring. Unlike a conventional protocol gateway, a data client is optimized for presenting data to cloud applications rather than to a local controller. These products specialize in logging, in pushing or serving data to applications and to SQL (and other) databases, and in providing some visualization. Typically a data client supports multiple protocols simultaneously.
 
CAS Data Clients use Modbus RTU, Modbus TCP, BACnet MSTP, BACnet IP, SNMP and other protocols to collect field data. They are configurable so that a specific set of data objects can be monitored. Data is served or sent to remote applications using protocols like HTTP, technologies like SOAP and markup languages like XML or JSON. Current data values and configuration are presented using HTML served via HTTP. With logging turned on, the devices capture 5,000 samples (configurable up or down) and store them in a FIFO table using SQL technology. Records that are pushed off the end of the table as new records are collected are stored in zip files which can be FTP’d off the device. The log file contains human-readable data.
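As an illustration of this kind of FIFO logging, the sketch below shows a capped sample table in Python using SQLite. It is not the CAS implementation; the table layout, cap handling and point names are invented for the example.

# Minimal sketch of a FIFO sample log; not the CAS Data Client implementation.
# The database file, table layout and point names are hypothetical.
import sqlite3

MAX_SAMPLES = 5000  # configurable up or down, as described above

db = sqlite3.connect("datalog.db")
db.execute("""CREATE TABLE IF NOT EXISTS samples (
                  id INTEGER PRIMARY KEY AUTOINCREMENT,
                  timestamp TEXT,
                  point_name TEXT,
                  value REAL)""")

def log_sample(timestamp, point_name, value):
    """Insert a new sample and drop the oldest rows beyond the cap."""
    db.execute("INSERT INTO samples (timestamp, point_name, value) VALUES (?, ?, ?)",
               (timestamp, point_name, value))
    # A real device would archive the displaced records (e.g. to a zip file
    # for FTP retrieval) before deleting them; here they are simply dropped.
    db.execute("""DELETE FROM samples WHERE id NOT IN
                  (SELECT id FROM samples ORDER BY id DESC LIMIT ?)""",
               (MAX_SAMPLES,))
    db.commit()

log_sample("2012-05-01 12:00:00", "LectureHall.ZoneTemp", 21.5)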
 
Often cloud systems are blocked from reading data from local sites because of firewalls and network security policies. The CAS Data Clients are able to push data out to a cloud server as well as supporting the conventional poll/response system. The data client can open an outgoing pipe connection to a remote monitoring system; once the pipe is established, the remote system can make conventional poll/response requests using conventional protocols like Modbus or even BACnet. A rough sketch of this pattern is shown below.
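The sketch below illustrates the reverse-connection pattern in Python: the local device dials out through the firewall, then answers poll/response requests over that established connection. The host name, port and line-based framing are invented for the example and are not the CAS protocol.

# Illustrative reverse-connection pattern only; the host, port and
# line-based framing are hypothetical, not the CAS protocol.
import socket

CLOUD_HOST = "monitoring.example.com"   # hypothetical cloud server
CLOUD_PORT = 9000                        # hypothetical listening port

def serve_requests(read_point):
    # Outgoing connections are usually allowed through local firewalls.
    with socket.create_connection((CLOUD_HOST, CLOUD_PORT)) as conn:
        conn.sendall(b"HELLO site-42\n")          # identify this site
        pipe = conn.makefile("rwb")
        for line in pipe:                          # each line is one poll
            point_name = line.decode().strip()
            value = read_point(point_name)         # e.g. read via Modbus/BACnet
            pipe.write(f"{point_name}={value}\n".encode())
            pipe.flush()

# serve_requests(lambda name: 21.5)   # stub point reader for illustration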
 
The CAS Data Clients are provided in two form factors: one as a physical device, the other as a virtual device that runs on a Windows platform.

A Typical Block Diagram is shown below.

Typical Block Diagram  


Part 3: Web(by) Protocols for remote data monitoring / data aggregation

This section of the article provides brief descriptions of some of the jargon used in communicating with the cloud.

HTTP: Hypertext Transfer Protocol. A client/server protocol. Uses TCP/IP. Transports text and binary data. Its primary purpose is to carry payloads for hypermedia information systems (the World Wide Web), but in practice it has been used for a wide variety of purposes. It is not involved in the use or rendering of the data. It can be used to carry HTML, XML, JSON and many other forms of data, including binary objects like files and images. This protocol is popular for diverse services because so many sites leave port 80 (the default HTTP port) open. Although it can carry a very diverse set of payloads, it is a pretty simple protocol.
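To make the request/response nature of HTTP concrete, here is what a single exchange might look like on the wire (the address, path and payload are invented for illustration):

GET /current_values HTTP/1.1
Host: 192.168.1.50

HTTP/1.1 200 OK
Content-Type: application/json

{"ZoneTemp": 21.5, "CO2": 550}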

XML: It’s a markup language that defines a set of rules for encoding data in both human- and machine-readable form. It is not a protocol; think of it as payload. Most often the HTTP protocol is used to transport XML data. A number of XML utilities are used by programmers to parse and process XML data, and a number of tools are used to render, validate and document it. Not everyone uses the language the same way: vendor A may encode data one way and vendor B another, even when they are dealing with the same type of data – e.g. an energy measurement (see the example below). To use XML you need to be sure that the protocol used to serve the payload is the same as the protocol used to request it (e.g. HTTP for both client and server) and that both the client and the server expect the same data schema. An XML schema is the structure of the data, whereas an XML payload is a set of XML data that conforms to the specified schema but contains a specific set/instance of data/measurements.
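For example, two vendors might encode the same energy measurement quite differently (both snippets are invented for illustration):

Vendor A:

<reading point="Main.Meter" value="10482.7" units="kWh" timestamp="2012-05-01T12:00:00"/>

Vendor B:

<EnergyMeasurement>
  <Meter>Main.Meter</Meter>
  <Quantity>10482.7</Quantity>
  <Unit>kilowatt-hour</Unit>
  <Taken>2012-05-01 12:00:00</Taken>
</EnergyMeasurement>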

AJAX: Not a protocol. Not a markup language. A combination of techniques used to update a web page without interrupting the display of data, i.e. a method of reading data in the background of a web page and using that data to update the foreground of the page without interrupting it. Originally: protocol = HTTP, payload = XML, parser = JavaScript. However, the XML data can be replaced by JSON data, and a number of other parsers are used. It’s a very generic term that implies few details.

JSON: JavaScript Object Notation

It’s a notation that defines a set of rules for encoding data in both human- and machine-readable form. It is not a protocol. The encoding format differs from XML, as the comparison below shows.

JSON vs XML:

JSON representation:

{"menu": {
  "id": "file",
  "value": "File",
  "popup": {
    "menuitem": [
       {"value": "New", "onclick": "CreateNewDoc()"},
       {"value": "Open", "onclick": "OpenDoc()"},
       {"value": "Close", "onclick": "CloseDoc()"}
    ]
  }
}}

The same text expressed as XML

<menu id="file" value="File">
  <popup>
    <menuitem value="New" onclick="CreateNewDoc()" />
    <menuitem value="Open" onclick="OpenDoc()" />
    <menuitem value="Close" onclick="CloseDoc()" />
  </popup>
</menu>
 
RPC: Remote Procedure Call. Not a protocol, but similar to one. Some vendors implement an RPC for data exchange. The RPC messages are transported using a protocol like HTTP. Instead of using a particular protocol, such as Modbus, to read data from a device, you send an RPC call with parameters (like the point address). The device responds with a payload of data which (vendor’s choice) may be encoded with JSON or XML. This is like a protocol within a protocol: HTTP carries the messages, and the messages are a series of requests and responses which must meet that vendor’s RPC rules (their RPC protocol). RPC is wide open – vendors choose the protocol, the payload, the methods and the formats.

In this example, JSON is used as the markup language to define the request. The request is sent using the HTTP protocol to the RPC URL on the device. The device unpacks the HTTP message and extracts the payload, presents the payload to a JSON parser which extracts the JSON elements, and these are presented as parameters to the remote procedure call logic in the device. The device processes them and forms a response. The response is marked up using JSON (in this example, but it could have been plain text or XML), packed into an HTTP packet and sent back.

Sample request:
{
  "version": "1.0",
  "proc": "GetPlantOverview",
  "id": "1",
  "format": "JSON"
}
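
A matching response might look like this (the field names are invented for illustration; each vendor’s RPC defines its own):

{
  "version": "1.0",
  "id": "1",
  "result": {
    "PlantStatus": "Running",
    "SupplyTemp": 6.5,
    "ReturnTemp": 11.2
  }
}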

WSDL: Web Services Description Language. It’s like RPC except that it is not wide open – the technologies used are specified. The protocol used to transport messages is HTTP, and the markup language for the payload is XML. A WSDL implementation defines the abstract (like a schema) and the specific: part of a WSDL implementation is the definition of the services and the data; the other part is the specific data being transferred for a specific request/response. A WSDL description of a web service provides a machine-readable description of how the service can be called, what parameters it expects and what data structures it returns. It is written in XML.

SOAP: Simple Object Access Protocol. It is a protocol. It adds almost no value and has become obsolete. SOAP adds a header, a body and a few rules to the XML payload inside an HTTP message. When a SOAP message is received, the HTTP driver unpacks the HTTP payload and presents it to the SOAP driver, which removes the SOAP header and other items and presents the XML payload to the application that will process it. Why use it? Without it, if your XML payload cannot be used or made sense of, there would be no service to send an error response. That’s one use.
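For reference, a SOAP message is just an XML envelope wrapped around the payload, along these lines (the body content is invented for illustration):

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Header/>
  <soap:Body>
    <GetPlantOverview/>
  </soap:Body>
</soap:Envelope>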

REST: Representational State Transfer. Uses the HTTP protocol to carry messages. It is really an architectural style rather than a protocol, and it is simple. Requests are made using HTTP GET or POST to specific URLs, and the specifics of the request are defined as HTTP parameters. The response is encoded any way the vendor chooses and is not necessarily XML. Each programmer who develops REST services on a server may make different choices about the URLs, the parameters and the format of the response.
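A REST-style request is nothing more than an ordinary HTTP request to a well-chosen URL. A minimal sketch in Python follows, assuming a hypothetical device URL, parameter names and JSON response:

# The URL, parameter names and response format below are hypothetical,
# chosen only to illustrate the request/response pattern.
import json
import urllib.request

url = "http://192.168.1.50/api/points?name=ZoneTemp&format=json"
with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))

print(data)   # e.g. {"ZoneTemp": 21.5}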

JAVASCRIPT: A programming language that has nothing to do with Java, despite the name. JavaScript is interpreted at run time rather than compiled, so it runs slower than most compiled applications. It is used in web pages and serviced by web browsers. Not all browsers support the same JavaScript functions, and they don’t all work exactly the same way in each JavaScript-enabled browser or application. The JavaScript in a web page is not seen by the user; it is used to implement methods and processes which sit behind the visible page. For example, JavaScript may be used to read data in the background and to update the visible page with the new data. Because it runs in a browser, the language has been deliberately restricted so that it cannot freely access the files and other resources on your computer.

SQL: Not a protocol. Not a markup language. Not related to the transfer of data. The term refers both to the SQL programming language and to the database systems which support it. These databases are relational and are optimized for speed and database size. A free implementation is known as SQLite. A SQL database needs to be stored on one system, so for large systems a front-end server handles the requests and distributes them to other SQL servers. Newer non-SQL databases can more easily be spread across multiple sites. A small example query is sketched below.
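As an illustration of the language side, a query that compares average energy consumption across sites might look like this (a minimal sketch in Python; the database file, table and column names are hypothetical):

# Hypothetical example: compare average daily energy use across sites.
# The database file, table and column names are invented for illustration.
import sqlite3

db = sqlite3.connect("enterprise.db")
rows = db.execute(
    """SELECT site_name, AVG(daily_kwh)
       FROM energy_log
       GROUP BY site_name
       ORDER BY AVG(daily_kwh) DESC""")
for site, avg_kwh in rows:
    print(site, round(avg_kwh, 1))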

JQUERY: A JavaScript library that is very widely used on the web. It contains a number of functions (subroutines) for things like animation, all built using JavaScript. Think of it as a collection of JavaScript shortcuts or higher-level methods.

FLASH: by Adobe. A multimedia platform that manipulates vector and raster graphics to provide animation of text, drawings and still images. It supports bidirectional streaming of audio and video, and it can capture user input via mouse, keyboard, microphone and camera. Flash contains an object-oriented language called ActionScript and supports automation via the JavaScript Flash Language (JSFL). Flash content may be displayed on various computer systems and devices using Adobe Flash Player, which is available free of charge for common web browsers, some mobile phones and a few other electronic devices (using Flash Lite). It does not run on iPhones.

Dashboard  

Image by: DGLogik

Enterprises use cloud-based energy monitoring and dashboards to inject feedback into a virtuous cycle of monitoring, reporting and reduction.


