Securing the Edges of the IoT
In cyberspace, you no longer know who your friends and enemies are.
On campuses, we used to worry about bored students. Today the threats include organized criminal gangs, and cities have had their power systems held for ransom. A recent report stated that "34 separate nations have serious, well-funded cyber-espionage teams targeting friends and foes alike."
Cloud-based IoT is pushing security identity requirements beyond operational capabilities. A key first component of security is knowing who is on the other side of a communication. Traditionally, this has relied on installing third-party certificates at each end of a conversation. Many IoT end nodes, however, sit on private networks and are unable to contact certificate providers. (A certificate vouches for identity. Proper use of certificates includes checking with the issuer to see whether a certificate has been revoked. An isolated system cannot contact the certificate-issuing authority without itself becoming less secure.) The California ISO has identified changing and updating certificates on the limited number of substations it communicates with as a constant source of risk. The local engineer is often not even aware of the certificates until CAISO calls to remind them that it is time for renewal.
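The renewal problem above is easy to sketch. The fragment below checks how many days remain before a certificate's "notAfter" date, using only the Python standard library; the substation certificate date and the reference "today" are illustrative values, not real data.

```python
# Sketch: flag a certificate nearing renewal from its "notAfter" field.
# The dates below are illustrative assumptions.
import calendar
import ssl
import time

def days_until_expiry(not_after: str, now: float) -> float:
    """not_after uses the OpenSSL text format, e.g. 'Jun 15 12:00:00 2026 GMT'."""
    expiry = ssl.cert_time_to_seconds(not_after)  # parsed as UTC
    return (expiry - now) / 86400.0

# Pretend "today" is 2026-05-01 00:00 UTC and the cert expires mid-June:
now = calendar.timegm(time.strptime("2026-05-01 00:00:00", "%Y-%m-%d %H:%M:%S"))
remaining = days_until_expiry("Jun 15 12:00:00 2026 GMT", now)
print(f"{remaining:.1f} days until renewal")
```

An isolated node can run a check like this locally, but note that expiry is only half the story: without a path to the issuer it still cannot learn whether the certificate has been revoked.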
Many leading-edge building control systems are growing into multi-computer systems. These autonomous networks rely on far more network protocols than does the traditional BACnet or SCADA system. The required protocols range from low-level time services (NTP) to directory services (LDAP or Active Directory) to system management interfaces. Some of these protocols have secure variants that themselves require certificates (e.g., LDAPS), once again adding to the cost of ownership. Security personnel must know each of these interactions to configure the modern corporate firewall, and far too often it takes months to get a full catalogue of this information from the developer or vendor.
Data is a growing barrier to integrating the IoT. Individual IoT messages appear very small, but the actual overhead of network handshaking, security negotiation, and routable packet transmission means that logging data to the cloud rapidly consumes more bandwidth than anticipated. Periodic batch processing of data greatly reduces the communication requirements. Intelligent local processing, including artificial intelligence (AI) techniques, can preserve timely response while sending unscheduled data only when it is useful.
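The bandwidth arithmetic is worth making concrete. The figures below are illustrative assumptions (a 16-byte reading, a few hundred bytes of per-message framing and security overhead), not measurements, but they show why per-reading uploads swamp the payload while hourly batching does not.

```python
# Sketch: per-reading sends vs. one batched send per hour.
# PAYLOAD and OVERHEAD are assumed, illustrative values.
PAYLOAD = 16              # bytes per sensor reading
OVERHEAD = 400            # assumed bytes of handshake/header cost per message
READINGS_PER_HOUR = 3600  # one reading per second

# One message per reading: overhead is paid 3600 times.
unbatched = READINGS_PER_HOUR * (PAYLOAD + OVERHEAD)

# One message per hour carrying all readings: overhead is paid once.
batched = READINGS_PER_HOUR * PAYLOAD + OVERHEAD

print(f"unbatched: {unbatched} B/h, batched: {batched} B/h, "
      f"ratio: {unbatched / batched:.0f}x")
```

Under these assumptions the unbatched traffic is more than 25 times the batched traffic, even before retries and TLS renegotiations are counted.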
Local data storage, though, requires local databases, which create a whole new set of problems. A local database is another system to patch, to secure, and to manage. Local data may document contract performance or system compliance. Local data may be lost when a system fails, and in the IoT it is likely not backed up.
The Purdue model of Computer Integrated Manufacturing (CIM) has long provided the canonical model for Industrial IoT (IIoT) security. That model does little, however, to address things interacting over wide areas. At NREL, Erfan Ibrahim's group described a Systematic Security Architecture for Grid Integration that defines the requirements for modern wide-area integration, but its reference implementation requires considerable knowledge and detail work to scale to very large systems.
The demands keep rising: more points, more security, and more data integrity are required, with fewer personnel and less work per node.
Intelligence is part of the answer. Sensors and systems will be able to learn the identities of their peers. These identities will replace the certificates used today. Identities, rights, and interactions will be stored in local databases. Even the right to access the network will be based on these identities.
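In its simplest form, such a local identity store is just a mapping from learned identities to granted rights, consulted before any interaction. The sketch below is a minimal illustration of that idea; all the identity names and right labels are invented for the example.

```python
# Minimal sketch of identity-based access held in a local store.
# Identities and rights here are illustrative, not a real schema.
rights_db = {
    "chiller-07": {"publish:telemetry"},
    "gateway-02": {"publish:telemetry", "read:history"},
}

def authorize(identity: str, right: str) -> bool:
    """Grant access only to a known identity holding the named right."""
    return right in rights_db.get(identity, set())

print(authorize("gateway-02", "read:history"))   # known peer with the right
print(authorize("intruder-x", "read:history"))   # unknown identity refused
```

The default-deny shape matters: an identity the system has never learned gets an empty rights set, so even network access can hang off the same lookup.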
These databases will use the technologies behind the cryptocurrencies, such as Bitcoin, or more properly the cryptoassets. The cryptoasset technologies that win will be distributed, so the loss of one or more systems will not cause the loss of data. The multi-system hashing that provides "proof of work" for cryptoassets will track the interactions and telemetry used in regression analysis by AI agents. If a remote party needs to share that data or to see proof of performance, that data will be requested from the immutable cryptoasset data store.
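The core mechanism behind such an immutable store can be sketched in a few lines: each record's hash covers the previous record's hash, so any later alteration of history breaks the chain. This is a deliberate simplification — real distributed ledgers add replication and consensus on top — and the record fields are invented for the example.

```python
# Sketch: a tamper-evident, append-only local log using chained hashes.
# A simplification of ledger ideas; real systems add consensus/replication.
import hashlib
import json

def append(chain: list, record: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps(record, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    chain.append({"prev": prev, "record": record, "hash": digest})

def verify(chain: list) -> bool:
    prev = "0" * 64
    for entry in chain:
        body = json.dumps(entry["record"], sort_keys=True)
        if entry["prev"] != prev or \
           entry["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
            return False
        prev = entry["hash"]
    return True

log = []
append(log, {"sensor": "vav-12", "temp_c": 21.4})
append(log, {"sensor": "vav-12", "temp_c": 21.6})
print(verify(log))                  # True: chain intact
log[0]["record"]["temp_c"] = 99.9   # tamper with recorded history
print(verify(log))                  # False: alteration detected
```

A remote party asking for proof of performance would re-run exactly this kind of verification over the records it is handed.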
The cryptoassets that will thrive in this environment are those that do not require the cloud to operate, because writing crypto-transactions to the cloud would merely produce a less performant, more expensive cloud database. They will be self-assembling, able to adapt as system participation changes over time. Where today's cryptoassets are designed to work hard, to limit coin generation, these future cryptoassets will use lighter-weight models in which systems as small as a Raspberry Pi can participate easily. And they will, of course, be open source, both to enable wide multi-party adoption and because no one will trust their security if they are not.