Synergies

Aligning Time, Space, and Energy

Whenever time, space, and energy are misaligned, you are spending too much. NBIMS (the National Building Information Model Standard), now under buildingSMART, is a standard for the definition and creation of space for work and life. The functions inside buildings are the largest users of energy in North America. Time is the missing component, the piece which is never aligned.

When we use energy at the wrong time, we require too much from the power grid. When we do not know the scarcity or plenty of energy, we will not use energy at the right time. Energy availability has seasons, as do other perishable commodities such as fruits and vegetables. Where the seasons of produce are the seasons of the year, the seasons of energy are the hours of the day. Using energy in the afternoon is like consuming raspberries in February, a profligate act. To know the scarcity or abundance of energy, we must have pricing that varies during the day.
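The seasonal-pricing analogy can be made concrete with a small sketch. All tariff figures below are hypothetical, invented only to illustrate how time-of-day pricing signals scarcity:

```python
# Hypothetical time-of-day tariff: the rates and hour bands are illustrative
# assumptions, not any real utility's schedule.
def price_per_kwh(hour):
    """Return a hypothetical $/kWh rate for the given hour (0-23)."""
    if 14 <= hour < 19:                      # afternoon peak: energy is scarce
        return 0.30
    if 7 <= hour < 14 or 19 <= hour < 22:    # shoulder hours
        return 0.12
    return 0.05                              # overnight: energy is plentiful

def run_cost(kw, start_hour, hours):
    """Cost of running a `kw` load for `hours` hours starting at `start_hour`."""
    return sum(kw * price_per_kwh((start_hour + h) % 24) for h in range(hours))

# Running the same 3 kW load for 4 hours, at two different times of day:
afternoon = run_cost(3, 14, 4)   # raspberries in February
overnight = run_cost(3, 1, 4)    # in season
```

With varying prices, the same task costs six times as much in the afternoon as overnight; with a flat tariff, the consumer never sees that difference.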

When buildings are uncommunicative, they cannot respond to our needs for space. They cannot prepare space to be optimal when we need it. They cannot release their claim on the energy needed to prepare space when we do not need it. If we cannot communicate with buildings whether one person or twenty will use space, then the building must lay claim to the same energy for either case. When we can let buildings know what our space needs are, then they can use energy by preparing just what we need.
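As a hedged sketch of the one-person-or-twenty point, suppose a zone's ventilation claim scales with expected head count; every constant and name below is an illustrative assumption, not a real design value:

```python
# Illustrative sketch: a building that knows expected occupancy can claim only
# the ventilation energy it needs; an uncommunicative one must assume the worst.
# All constants here are hypothetical.
DESIGN_OCCUPANCY = 20     # assumed worst-case head count for the zone
CFM_PER_PERSON = 15       # assumed outdoor-air requirement per person
FAN_KW_PER_CFM = 0.0005   # assumed fan power per cfm delivered

def ventilation_kw(expected_occupancy=None):
    """Fan power claimed for a zone; with no forecast, claim the worst case."""
    people = DESIGN_OCCUPANCY if expected_occupancy is None else expected_occupancy
    return people * CFM_PER_PERSON * FAN_KW_PER_CFM

uncommunicative = ventilation_kw()   # must prepare for twenty people
informed = ventilation_kw(1)         # one person booked the room
```

The twenty-fold gap between the two claims is the energy released simply by telling the building who is coming.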

If we knew the scarcity of energy, would we change how we move through time and space? Human Resources may tell me to stay at home for a snow emergency. Why not tell me to stay at home for an energy pricing emergency? It would certainly save me time, and a lot of energy, to not commute to a building that is temporarily over-priced.

Space needs accurate time. Too many building systems have no way to update time, no way to calibrate their clocks. To be responsive to time signals, building systems must pick up network time, as all other networked systems do.
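Picking up network time need not be elaborate; a minimal SNTP client fits in a few lines. This is a sketch, assuming the system can reach a public pool such as `pool.ntp.org` over UDP port 123:

```python
import socket
import struct

NTP_DELTA = 2208988800  # seconds between the NTP epoch (1900) and Unix epoch (1970)

def parse_sntp_transmit_time(packet):
    """Extract the server's transmit timestamp (Unix seconds) from a 48-byte SNTP reply."""
    seconds = struct.unpack("!I", packet[40:44])[0]  # transmit timestamp, seconds field
    return seconds - NTP_DELTA

def fetch_network_time(server="pool.ntp.org"):
    """Query an SNTP server (assumes outbound UDP/123 is allowed)."""
    request = b"\x1b" + 47 * b"\0"  # LI=0, version=3, mode=3 (client)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(request, (server, 123))
        packet, _ = sock.recvfrom(48)
    return parse_sntp_transmit_time(packet)
```

A building controller that runs something like this daily never drifts far enough to mishandle a time-of-day price signal.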

Managing space is old hat.

Managing energy is so ‘70s.

Managing time is in every self help book.

But to manage all three together requires standards and interoperability.

Extreme Integration

New trends in building integration are coming soon, in which systems respond not only to their own internal operations, but also to their tenants and to the operations of the businesses housed within the building. Building systems will also interact with other systems in the building, not only the business systems, but also the other embedded systems with different missions. Buildings will also gain situational awareness of the world around them, starting with Demand-Response, but soon including greater awareness of weather and interactions with first responders. I like to think of this trend as extreme building integration.

Extreme building integration has been around for some time. It was difficult and time consuming. Only those with special needs or special obsessions bore the expense to acquire such systems. There was a certain cool factor to them; owners would show them off. But to date, they have had little effect on markets.

Three factors will increase the pace and penetration of extreme building integration. First, the enabling infrastructure will increasingly be in place, as attention to Demand/Response grows with mounting energy prices and carbon concerns gaining ascendancy in the public mind. Second, for Demand/Response, which requires internet-scale integration of control systems, to flourish, building system integrators will have to learn the lessons of encapsulation and isolation that are at the heart of service orientation. Third, service orientation will require situationally aware security to replace simple perimeter-based control.

These changes in system integration will both drive and be reinforced by three imperatives: information availability, a decreasing size and scope of each system, and interoperability. These three forces are mutually reinforcing, and will drive innovation for some time.

All organizations want actionable information abstracted from the raw data of their business operations. As enterprises discover such information in their Demand/Response systems, they will no longer look on their building systems as invisible and uncontrollable. They will begin to wonder what other business information can be gleaned from these systems. Can it decrease maintenance costs? Can it improve tenant loyalty or increase rents? A little information begets requests for more.

As systems learn to communicate in more standard ways, they open up interactions with more elements of the enterprise. More communications will encourage smaller systems with a single purpose. More complex interactions outside each system will encourage reducing the size of each system to control complexity. Smaller single-purpose systems will open up greater competition on performance of that function.

Interoperability is the largest aspect of these forces, unseen, perhaps, like the underwater portion of an iceberg, but keeping the other two afloat. Interoperability enables building owners and tenants to swap out building systems to meet special needs. Competition on quality of performance and subtlety of operation becomes possible. Interoperability requires the discipline to keep extraneous details out of the interface. Interoperability requires abstraction to enable new interactions not anticipated at design time. Interoperability enables and drives innovation. Interoperability allows not only extreme integration, but also agile integration, responding to the needs of the home or enterprise in different ways as needs change.

Soon, such integration will not seem extreme any more.

The Case for oBIX in Laboratory Systems

Well, if not oBIX, something like it.

Most data in modern research is collected by automated systems. Computers assess, quantify, and print out data. Some may be able to produce spreadsheets. Some produce graphs. (I remember measuring graphs with great care to turn them back into numbers in a previous career.) Some may extrude CSV (comma-separated values) files, to be imported into databases. But almost everything starts with machine measurement and tabulation.

Often the information you need to understand the experiment has been recorded, but it is only available in a nearby system that the researcher either has no access to or does not know he can get to.

The biologist who recently asked for access to our operations data is one example. He works with plants in greenhouses. Greenhouses strive for, but do not always produce, specialized conditions. Understanding plants, and small differences in growing plants, often involves understanding precisely the conditions they grew in. We have a system that tracks minute by minute the temperature and humidity of the areas it monitors. In areas in which natural lighting is being used and augmented, light levels are also tracked. This researcher asked for access to the minute-by-minute temperature, humidity, and light for each zone in the greenhouses. What is wonderful about this request is that we can provide it essentially for free.
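Providing it really is nearly free: the data is already being logged, so exposing it is a query plus a serializer. A sketch, with assumed field names, of handing the researcher his zones as CSV:

```python
import csv
import io

# Illustrative sketch: the monitoring system already logs these readings; the
# column names and tuple layout are assumptions for the example.
def zone_conditions_csv(rows):
    """Serialize (timestamp, zone, temp_c, rh_pct, light_lux) tuples as CSV."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp", "zone", "temp_c", "rh_pct", "light_lux"])
    writer.writerows(rows)
    return buf.getvalue()

sample = zone_conditions_csv([
    ("2024-05-01T09:00", "GH-2", 24.1, 61, 18000),
    ("2024-05-01T09:01", "GH-2", 24.2, 60, 18200),
])
```

The marginal cost is the export routine; the measurement, the expensive part, was already paid for.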

There are other systems, specialized systems that researchers work around. Back when I worked in a biochemistry lab, we had large variances in the reactivity of the materials we worked with. After half a year, I guessed that these problems were caused by variances in the level of liquid nitrogen in the large carboys we stored samples in. Today, those carboys are replaced by ultralow freezers. While I never could prove this, I got outstanding results by working straight through (a 50-hour shift every other week) and thus eliminating the variability. You can find the results on the web if you are interested in quassinoids.

There is a large cancer research center on campus. As part of the background for each grant that it submits, it includes general material on the quality of the facilities and how they enhance the research performed therein. One of the pillars of quality is an ultralow freezer tracking system.

This system monitors all the laboratory freezers in the building. Data on the quality of the freezer systems is carefully monitored. Each freezer has its tolerance, and it can be documented that each system stays within its tolerance. This documentation is part of the overall facilities quality report. If your sample is stored here, it will not be accidentally thawed out.

What is not available is any easy way for researchers to access the same information. If they ask the right person, they can get a spreadsheet describing the details of the performance of a particular freezer. They can request these periodically. This system, like most building monitoring systems, has no way to let researchers pull direct feeds of data from the freezer monitoring system. There is no way for one of several researchers using the same freezer to set personal alarms on conditions that matter to him. Most of the researchers are unaware that they can ask for anything.
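The personal alarm described would be simple if a direct feed existed. A sketch, with hypothetical freezer IDs and tolerances, of per-researcher threshold alarms:

```python
# Illustrative sketch of a personal alarm a researcher could set if the freezer
# monitoring system exposed a direct data feed. All IDs and limits are invented.
def check_alarms(readings, setpoints):
    """Return alarm messages for any freezer outside a researcher's tolerance.

    readings:  {freezer_id: latest temperature in deg C}
    setpoints: {freezer_id: (low, high) limits this researcher cares about}
    """
    alarms = []
    for freezer, temp in readings.items():
        low, high = setpoints.get(freezer, (float("-inf"), float("inf")))
        if not (low <= temp <= high):
            alarms.append(f"{freezer}: {temp} C outside [{low}, {high}]")
    return alarms

# Two researchers share ULT-3; only one has asked to hear about drift above -65 C.
alarms = check_alarms({"ULT-3": -62.5}, {"ULT-3": (-86, -65)})
```

Each researcher keeps a private `setpoints` table, so the same shared feed serves everyone without anyone touching the freezer's own controls.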

Every automated laboratory system produces its own special data format. There is an effort underway, UnitsML, that hopes to establish a data standard for every measurable physical condition. All testing runs, all data, will be able to be delivered in a standard UnitsML format.

When the researcher can freely get to information on conditions, whether from laboratory equipment, from specialized laboratory infrastructure, or from buildings, in self-describing formats over the internet, then better analysis is possible. When students can use this information as well, without too many people interfering with experimental conditions, then education is improved.

I could go on…but the motto of our building system integrator seems appropriate here: “No Data Left Behind!” These words are good for research as well as for building operations.

Shedding Old Habits Is the Hardest Task

We can't replace decades of "natural monopoly" regulation overnight without some effort to create new markets. We can’t replace decades of monolithic systems design without constantly re-examining our assumptions. It is going to be hard work to move to agent-based integration at every level. It will take continuing hard work at each transaction surface. The hardest part, though, will be changing the habits of thought.

This is why many of the “lessons” learned from the last round of electrical de-regulation are not very useful. A little de-regulation of wholesale power markets without allowing new market entrants as buyers accomplishes just a little. A mix of regulated and un-regulated markets just creates gray markets for the pre-existing products. Today’s technologies, when brought to bear on power generation, distribution, and consumption, give us new opportunities, opportunities for markets and opportunities for innovation.

Today, at every surface of electrical transaction, the technology could easily handle open markets. The surfaces are the borders between Generation / Transmission / Distribution / Consumer. With today’s technology, there is no reason why individual consumers should not be able to contract with any generator they wish, whether individually, or through third party aggregators, the equivalent of mutual funds. With today’s technology, there is no reason for local distribution to be owned by the same company that owns the transmission lines – and some good opportunities to be had by separating them.

In the early PC era, common practice and ITS guidance were that computer equipment must be fully depreciated over 5 years. Looking backward at cost instead of forward at value prevented companies from realizing benefits from technological change. By some accounts, the moment when Microsoft seized control of the PC industry from IBM was when Gates realized that IBM was maintaining 286-based systems until fully depreciated. Utilities, with their focus on regulated cost recovery and static efficiency, are stuck in the same trap. Until this changes, the electricity industry in the US will never be focused on the value creation and dynamic efficiencies that are the hallmark of every other engineered business in today’s world.

The old model for power markets was vertical integration from generation to consumer, regulated as a natural monopoly. Economical microgrids integrating heterogeneous sources and storage methods cast doubt on how natural the monopoly is. Live metering technology with two-way communications enables a market at each transition. Intelligent building systems let end users manage their power consumption in response to their internal needs and to live pricing from the grid. The operational assumptions based around integrated operation, dumb metering, and lack of information are no longer valid.

We will recognize true deregulation of power markets by an increase in end-user autonomy and an accompanying increase in innovation. It will enable market models that are nimble, looking to future value rather than back to sunk cost. Future value is dynamic, as it reflects changing consumer tastes in environmental policy, in social environment, and in technology. True markets will let new entrants in, seeking to create value in novel ways. The last round of deregulation focused more on freeing up rent seekers and, for some, on escaping from poor [nuclear] plant decisions than on creating any actual markets.

The technology is hard, and will remain hard. Skilled engineering will be needed to find the value in new approaches. Great patience and public communications will be required to convince the public and utilities commissions to stay out of the way. But what we will need most is nimble vision to discover the new business models that will unlock innovation.

And the lens that clouds the vision most is seeing the way we have always done it.