Are you looking ahead to the Grid-Interop?

GridWise looks to transform the production, delivery, and consumption of energy by adopting an open, standards-based architecture across the entire power grid. GridWise makes no assumption that future power markets will look as they do now. GridWise applies the latest approaches of Information Technology to the problem of electrical distribution. GridWise anticipates that opening up the interfaces to each business activity of the Grid will open electric power markets to innovation.

Interoperability is key to successful markets. Every level of the electric market, from generation to transmission to local distribution to the customer, will support interoperability through open interfaces. Interoperability opens markets, and GridWise will use it to open markets and create business opportunities across the electric power industry. Renewable power, differential pricing, near-grid buildings, and many other technologies addressing the most pressing issues of our time will be set free by interoperable standards.

So what does IT bring to the Grid?

  1. By working at the surface of Power Generation, we hope to create a live NASDAQ of power purchases, augmented by a formal ontology of traditional values like reliability and emerging values like sustainability.
  2. By working at the interface between transmission and distribution, we anticipate enabling the development of micro-grids offering superior reliability and supporting local generation.
  3. By working at the end of the distribution grid, we will deliver real-time usage and live pricing information to the home and office, offering incentives for more efficient energy use and storage (see the sketch after this list).
  4. By working on the outside of the home and office, we anticipate new value propositions such as orchestration of on-site production and storage as well as competitive third-party management of power.
  5. Federated Identity Management, which every one of items 1-4 will require.
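To make item 3 concrete, here is a minimal sketch of what a machine-readable price signal at the customer interface might carry. The PriceInterval structure and the should_defer_load rule are my illustrations of the idea, not any GridWise or market specification.

```python
# A minimal sketch (not any GridWise specification) of a machine-readable
# price-and-attributes signal delivered at the customer interface.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PriceInterval:
    start: datetime            # beginning of the pricing interval
    duration: timedelta        # length of the interval
    price_per_kwh: float       # live market price, in dollars
    reliability_tier: str      # traditional value, e.g. "firm" or "interruptible"
    grams_co2_per_kwh: float   # emerging value: emissions intensity of the supply mix

def should_defer_load(interval: PriceInterval, price_ceiling: float, co2_ceiling: float) -> bool:
    """A home or office controller could defer flexible loads (storage charging,
    pre-cooling) whenever the live signal exceeds the occupant's preferences."""
    return interval.price_per_kwh > price_ceiling or interval.grams_co2_per_kwh > co2_ceiling

if __name__ == "__main__":
    now = PriceInterval(datetime.now(), timedelta(minutes=15), 0.32, "firm", 510.0)
    print("Defer flexible loads this interval:",
          should_defer_load(now, price_ceiling=0.25, co2_ceiling=400.0))
```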

The opportunities are large. The goal is to enable the creation of whole new markets, ones that will offer incentives to drive innovation instead of throttling it.

Many were first introduced to the principles of GridWise at GridWeek in Washington last spring. Grid-Interop is a more technical discussion of interoperability and the energy markets. It is an opportunity for the executive or the technical staff to learn more about how interoperability will open up new markets and enable innovation across the whole electrical supply chain.

Keep an eye out for details:

http://www.gridwiseac.org/interop/gridinteropforum.stm

New Daedalus says “Check it Out!”

Can you afford to not require building models?

Many of the best builders use ephemeral models today. Contractors generate their own building models. They create these models prior to the bid, to address the inadequacies of planning with the traditional building system drawings. When he wins the project, the contractor uses this model throughout the construction process. The model is then discarded, as it is not specified as part of the project deliverables and could create additional liability. It would be far better, and far more efficient, if these models were based upon the designer’s models and were included as project deliverables at building turn-over.

Without a design process that actually includes the mechanical systems and their controls, there is no underlying operational model for the building. Without an underlying model, ongoing system maintenance is based upon guesses. Without live performance metrics, including instant access to energy metering, linked to that model, building system operations are based upon experience and guesswork. When the system is green and non-traditional, you can eliminate experience, leaving only guesswork to operate the building and to tell whether it is staying in tune or falling out of control.

The solution to these problems is an integrated data model for the building whose life extends as long as the life of the building. The data model starts with the capture of the design intents. Building designs should be models, not drawings, and should be standards-based. The energy model, for example, should run directly off the building model and could be compared to the design goals. Changes to the design, especially during value engineering when many innovative features are eliminated, could be automatically reflected in updated energy models.
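As a thought experiment, a persistent building model might look something like the sketch below. The DesignIntent and BuildingSystem names are mine, not NBIMS or any other standard, but they show how captured intents stay attached to named systems as the design changes.

```python
# A minimal sketch, not NBIMS itself, of a building model that carries design
# intent forward so energy models and commissioning can refer back to it.
from dataclasses import dataclass, field

@dataclass
class DesignIntent:
    description: str        # e.g. "supply air temperature"
    target_value: float     # numeric performance target
    units: str              # units of the target

@dataclass
class BuildingSystem:
    system_id: str          # the name assigned at design time, kept for life
    system_type: str        # e.g. "air_handler", "chiller"
    intents: list = field(default_factory=list)

@dataclass
class BuildingModel:
    building_id: str
    systems: dict = field(default_factory=dict)   # keyed by system_id

    def add_system(self, system: BuildingSystem) -> None:
        self.systems[system.system_id] = system

# Example: a value-engineering change updates the model, so any energy model
# driven from the model sees the change automatically.
model = BuildingModel("HQ-01")
model.add_system(BuildingSystem("AHU-3", "air_handler",
                                [DesignIntent("supply air temperature", 13.0, "degC")]))
model.systems["AHU-3"].intents.append(DesignIntent("fan power", 4.5, "kW"))
print(model.systems["AHU-3"])
```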

This building model should be available electronically to each bidder and used throughout the construction process. The increased accuracy of the bid package and the reduction in change orders during construction would reduce costs and result in as-built models that match the initial design. These accurate designs would include full identification of the internal systems, their components, and their performance expectations.

With delivery of the as-built models, using system identifications consistent with the initial design documents, building commissioning becomes validation of performance against the design. In the case of energy systems, commissioning becomes validation against, and alignment with, the energy model. This, at last, becomes a significant improvement over the traditional standard, described, only half in humour, as “no sparks”.
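Commissioning as validation against the energy model can be as simple as the comparison sketched below. The system names, consumption figures, and the 10% tolerance are illustrative only.

```python
# A minimal sketch of commissioning as validation against the energy model:
# compare metered consumption to the modeled value for each named system and
# flag anything outside a tolerance.

def commissioning_report(modeled_kwh: dict, metered_kwh: dict, tolerance: float = 0.10) -> dict:
    """Return each system's deviation from the energy model; anything beyond
    the tolerance fails commissioning and needs alignment or explanation."""
    report = {}
    for system_id, expected in modeled_kwh.items():
        actual = metered_kwh.get(system_id)
        if actual is None:
            report[system_id] = "no meter data"     # worse than "no sparks"
            continue
        deviation = (actual - expected) / expected
        report[system_id] = "pass" if abs(deviation) <= tolerance else f"fail ({deviation:+.0%})"
    return report

print(commissioning_report(
    modeled_kwh={"AHU-3": 1200.0, "CH-1": 5400.0},
    metered_kwh={"AHU-3": 1260.0, "CH-1": 6600.0},
))
```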

This persistent design model would become the basis of maintenance and operations decisions. Maintenance staff would have ready access to design and commissioning documents keyed with the same system identifications. Field notes and best practices discovered for one system could be made automatically available to all similar systems using the information model. The facts necessary to support innovative systems would be available to maintenance and operations throughout the life of the building.
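One small sketch of that promise: if the information model knows which systems share a type, a field note recorded against one system surfaces automatically for its siblings. The dictionaries and function names here are illustrative, not any particular model format.

```python
# A minimal sketch of field notes shared across similar systems via the model.

SYSTEM_TYPES = {"AHU-1": "air_handler", "AHU-3": "air_handler", "CH-1": "chiller"}  # from the design model
FIELD_NOTES = []  # list of (system_type, note) pairs

def record_field_note(system_id: str, note: str) -> None:
    """File the note under the system's type so it surfaces for its siblings."""
    FIELD_NOTES.append((SYSTEM_TYPES[system_id], note))

def notes_for(system_id: str) -> list:
    """Everything learned about any system of the same type."""
    wanted = SYSTEM_TYPES[system_id]
    return [note for system_type, note in FIELD_NOTES if system_type == wanted]

record_field_note("AHU-1", "damper linkage slips when outside air is below freezing")
print(notes_for("AHU-3"))   # the sibling air handler inherits the lesson
```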

How far off are Self-Maintaining, Self-Repairing Facilities?

It is easy to fall into thinking that Energy and its twin Sustainability are the only benefits of the new style of intelligent buildings we are starting to call Buildings 2.0. A significant benefit of these new approaches will be a reduced cost of ownership, and in particular a reduced cost of maintenance, while providing higher levels of service.

FIATECH (see side bar) has called this the Self Maintaining, Self Repairing facility. This does not yet mean that buildings will have internal nanobots responding to structural stress sensors by reinforcing beams. For now, it aims at the humbler goal of each building being able to recognize the needs of its systems and prepare timely and accurate requests for their upkeep.

This starts with improved life-cycle data management for each building. Today, it is hard for the facility operator to find even the recommended maintenance and spare parts for all the equipment in a newly built or renovated building. Such information as there is sits on a large bookshelf in a locked room in the basement. Spare parts requirements must be carefully copied out of the books – assuming no mistakes are made.

More recently, some hand-over and commissioning firms have begun scanning these big books and delivering them to the owner on a CD full of PDF (Adobe’s Portable Document Format) files. This does improve access, to the extent that both maintenance and procurement can now reach the same documents. I am not convinced that a CD with several hundred PDFs is really a usable library. Even if you do find the correct PDF, there is still the problem of transcribing its information into other systems.

The life-cycle data standard for buildings (NBIMS), which holds all information for a building, includes a specification for handing over this information (COBIE) from the contractor to the owner in a machine-readable format. Ready access to that information alone, it is estimated, can reduce lifetime Operations and Maintenance costs by up to 20%.
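Here is a rough sketch of what machine-readable hand-over buys you. The column names below are an assumed CSV layout, not the actual COBIE schema; the point is that parts and maintenance data arrive as data rather than as scanned pages.

```python
# A minimal sketch of querying machine-readable hand-over data. The CSV layout
# is assumed for illustration, not taken from COBIE.
import csv
import io

HANDOVER_CSV = """system_id,component,spare_part,maintenance_task,interval_hours
AHU-3,supply fan,belt A-42,replace belt,4380
AHU-3,filter bank,MERV-13 24x24,replace filters,2190
CH-1,compressor,oil filter OF-77,change oil filter,8760
"""

def load_handover(text: str) -> dict:
    """Index the hand-over rows by system so maintenance and procurement can
    query them directly instead of transcribing from a binder."""
    by_system = {}
    for row in csv.DictReader(io.StringIO(text)):
        by_system.setdefault(row["system_id"], []).append(row)
    return by_system

handover = load_handover(HANDOVER_CSV)
for row in handover["AHU-3"]:
    print(row["component"], "->", row["spare_part"], "every", row["interval_hours"], "run hours")
```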

But that is still not yet the Intelligent Building; that is just Building Intelligence.

In Buildings 2.0, systems will know their name, because they were named in the initial design. Systems will know their target performance characteristics because the design intents and energy models are part of the design history in NBIMS. When a system needs maintenance, the system will request it, using the name assigned to that system during design. Maintenance personnel will be able to find the requesting system because the name will be the one in the searchable on-line plans.
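A maintenance request from such a system might look like the following sketch. The JSON fields are illustrative, not a published schema, but notice that the request is keyed by the name assigned at design time.

```python
# A minimal sketch of a maintenance request a Buildings 2.0 system might emit,
# keyed by its design-time name so staff can find it in the on-line plans.
import json
from datetime import datetime, timezone

def maintenance_request(system_id: str, condition: str, design_target: str) -> str:
    """Build a machine-readable request that references the design-time name."""
    return json.dumps({
        "system_id": system_id,                          # same name as in the design model
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "condition": condition,                          # what the system observed about itself
        "design_target": design_target,                  # the intent it is failing to meet
    }, indent=2)

print(maintenance_request("AHU-3",
                          "supply air 3 degC above setpoint",
                          "supply air temperature 13 degC"))
```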

How far out is this? Not very. Today most routine maintenance is scheduled on the calendar, whether it is not yet needed or is long overdue. Think of this as changing the oil every three months, whether you have kept the car in the garage or driven great distances. If the maintenance comes too soon, it is wasted. If it comes too late, the equipment has already been damaged. Systems that know only how long they have run, and make that information accessible over the network, will already be a big step in this direction.
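The run-hours point fits in a few lines; the numbers below are invented, but the logic is the whole idea.

```python
# A minimal sketch of the "oil change by miles driven, not by the calendar"
# point: maintenance falls due on accumulated run hours the system reports,
# not on a fixed schedule that is either too early or too late.

def maintenance_due(hours_run_since_service: float, service_interval_hours: float) -> bool:
    """True only when the equipment has actually run long enough to need service."""
    return hours_run_since_service >= service_interval_hours

# The garage-kept car: barely run, so no service yet. The hard-driven one: due.
print(maintenance_due(hours_run_since_service=120.0, service_interval_hours=4380.0))   # False
print(maintenance_due(hours_run_since_service=5100.0, service_interval_hours=4380.0))  # True
```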

If the systems get just a little smarter, or make their information accessible to other systems for remote analytics, they can move beyond failure detection to failure prediction. Add in some energy cost awareness, and the maintenance organization will receive notification of problems before tenants are aware, complete with instructions as to what needs to be done and the estimated cost per month of failing to make the repair.
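Adding energy cost awareness can be equally simple. The figures below are invented, but they show the kind of number the notification could carry.

```python
# A minimal sketch of estimating what each month of deferring a repair costs in
# wasted energy. All figures are illustrative.

def monthly_cost_of_deferral(extra_kw: float, run_hours_per_month: float, dollars_per_kwh: float) -> float:
    """Estimate the energy penalty of running a degraded system for another month."""
    return extra_kw * run_hours_per_month * dollars_per_kwh

# e.g. a fouled coil forcing the fan to draw an extra 1.5 kW, 400 run hours per month
print(f"${monthly_cost_of_deferral(1.5, 400.0, 0.12):.2f} per month if the repair is deferred")
```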

For now, that will have to count as a self-maintaining, self-repairing facility. Even without nanobots, that will be a substantial improvement in operational cost and quality.

Invisible and Uncontrollable

Recently a well-respected engineer and leader of a well-respected engineering organization lashed out at my comments. “My customers do not want to pay $120 for a controller on the roof, they will never pay for [the interfaces you advocate]!” This echoed my conversations the month before with my brother CJ, who has been programming high-performance embedded systems in high-risk environments for his entire professional life. I asked him what it would take to engage a wider audience. CJ defined the barrier without hesitation: “It’s because people see these systems as invisible and uncontrollable.”

In defense, the engineer would have talked of the enterprise control systems his company offers. He would have pointed out that they had been leaders in developing products offering web services. These just add another trait to CJ’s apt description: inscrutable.

Have you ever sat on the edge of a conversation between two experts in a jargon-filled field not your own? You recognize that the conversation is in English. You do not recognize all of the words. Some of the words sound familiar but seem to have non-standard meanings. Unless you are really motivated, you soon stop paying attention.

That is where building systems are today. LON, BACnet, KNX, and others might as well be in Mandarin as far as Enterprise IT is concerned. When Building System providers produce Web Services, it is as if they switched to English, but highly technical English, laced with jargon, and demanding deep domain knowledge to understand. Controls companies say “We tried, and it failed.” It would be more accurate to say that they did the quickest, dirtiest translation they could.

None of these companies would dream of using an on-line internet language translator for their marketing brochures. Yet that is all they have done with their web services. These systems need to go beyond translating their low-voltage protocols to XML; they need to translate their engineering processes into business services.
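To illustrate the distinction, compare the two hypothetical service calls sketched below. Neither is any vendor's real API; the first merely exposes the protocol, while the second exposes a business service.

```python
# A minimal sketch of the difference between translating a protocol and
# translating a process. Both functions are illustrative stand-ins.
from datetime import datetime

def write_point(point_address: str, value: float) -> None:
    """Protocol-in-XML thinking: the caller must already know the point map,
    the units, and the control sequence. Inscrutable outside the domain."""
    print(f"wrote {value} to {point_address}")

def request_comfort(zone: str, occupied_until: datetime) -> None:
    """Business-service thinking: the caller states an outcome, and the
    building system decides which points to write to deliver it."""
    print(f"zone {zone} scheduled comfortable until {occupied_until:%H:%M}")

write_point("AHU-3.SAT-SP", 12.5)                                      # engineering process exposed as-is
request_comfort("Conference 210", datetime.now().replace(hour=17, minute=0))  # process translated to a service
```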

My friend Keith Gipson has all the retro-commissioning business he can handle right now, driven by Energy Companies in Southern California. He will launch into descriptions of the last generation of “Enterprise Systems” that are both chilling and hilarious. Somewhere in the narrative there is always a paragraph similar to: “So we found the three-year-old control system in a locked closet on the third floor, running Windows 98…we think it had been frozen up for months.”

I cannot imagine ever considering a computer running Windows 98 to be enterprise-ready. I don’t think even Microsoft ever marketed Windows 98 as an enterprise operating system. The controls companies that used Windows 98 until recently did so because it was the most recent operating system they could find with no standards for security or system protection that might change the way they had always written programs. And as Windows 98 was never securable, its owners had no choice but to isolate it from the network and lock it in the closet.

Today’s Building Systems use web services the way they used Windows 98. They have made some sort of pro-forma nod toward mainstream systems, but it is neither effective nor useful. Without security, the Windows 98 PC could not safely be connected to any network with other systems. Without service-oriented abstraction and the useful security models it enables, the new web services cannot interact effectively with enterprise systems.

Despite it all, they remain invisible and uncontrollable because they are inscrutable. No one wants to pay for that.