I have been thinking about security and parsimony lately. Security is not merely about confidentiality or even identity. It is about predictability and integrity. Challenges to predictability and integrity come not only from malefactors, but from those who develop, test, and maintain systems. Even interoperability is part of security: introducing new sub-systems, or upgrading old ones, can create unanticipated interactions and failures.
Any interface that is not actually required introduces new attack vectors and increases the complexity of testing. An interface that is used only rarely is tested only rarely. A non-essential interface is the work that gets delegated to the junior developer; the primary interfaces will be tested fully while the non-essential one becomes a back door.
DOS/Windows is the poster child for security and reliability problems. Its upgrade problems and incompatibilities were legendary. Many of these arose when little-used and long-deprecated interfaces were eliminated or changed. Some interfaces existed only to support development and testing, and were never even documented; as thousands of developers competed for advantage, those interfaces got used anyway. In an ecosystem of systems with far more variety, we will be better served never to introduce such obscure interfaces in the first place.
The other challenge presented by DOS/Windows was the sheer number of interfaces. One bit of code might support a dozen of them. Code added to fix one problem would be replaced by code addressing another security issue, code built from an earlier fork, and the first problem would be reintroduced. Complex interfaces require complex maintenance.
An oft-heard and little-understood truism is that security must be designed in. This can be interpreted to require planning for encryption and isolation at every interface. That task can be fiendishly complex, and can demand that only the most sophisticated programmers work on each system. Since we know we cannot guarantee such attention, this is poor security design. Complex procedures embrace their own failure. Better security design offers fewer chances for missteps and a better chance of sustained success.
For the smart grid, this means fewer interfaces and simpler testing. The smart grid has defined inflection points: places where responsibility, ownership, or business processes change hands. Such inflection points define business processes with specific requirements for shared identity and authority. These business interfaces define the risks and the costs of failure. They should be few, simple, and well-defined, as in the sketch below.
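To make that concrete, here is a minimal sketch in TypeScript of what a business interface at an ownership inflection point might look like. All of the names (MeterReading, SettlementInterface, submitReading) are hypothetical illustrations, not drawn from any smart grid standard; the point is only how small the exposed surface can be.

    // A hypothetical business interface at an ownership boundary.
    // Only the operations the business process requires are exposed.

    interface MeterReading {
      accountId: string;     // shared identity across the boundary
      intervalStart: Date;   // start of the metering interval
      kWh: number;           // energy delivered during the interval
      signature: string;     // integrity: the reading is signed by the meter
    }

    interface SettlementInterface {
      // The entire surface area exposed across the boundary.
      submitReading(reading: MeterReading): Promise<void>;
      queryBalance(accountId: string): Promise<number>;
    }

Everything not on the interface is unreachable from the other side of the boundary, so it never has to be secured, tested, or patched there.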
We may anticipate that site-based generation is either PV or wind. We could, in theory, include wind speed as a required part of the wind-source interface for the grid. But a house in a tidal swamp may have some novel generation strategy that appears, in most respects, identical to energy generated by the morning and evening winds that characterize the weather between the California coast and the desert. By excluding wind speed, the same interface serves both the wind generation and the tidal-swamp generation, as sketched below. Simpler, sparser interfaces are what enable diversity and innovation. Coming back to security, the unused wind-speed field in the tidal generator, if mandatory, is the one that will go untested and eventually become a security hole.
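Continuing the hedged TypeScript sketches (GenerationSource, WindTurbine, and TidalGenerator are all invented names): by omitting wind speed from the contract, one interface serves wind and tidal generation alike, and the tidal implementation carries no mandatory, forever-untested field.

    // Hypothetical minimal generation interface: nothing wind-specific,
    // so any source that can report its output can implement it.
    interface GenerationSource {
      sourceId: string;
      currentOutputKw(): number; // what the grid actually needs to know
    }

    // A wind turbine implements the interface; wind speed stays internal.
    class WindTurbine implements GenerationSource {
      constructor(public sourceId: string, private windSpeedMps: number) {}
      currentOutputKw(): number {
        // Internal detail: output derived from wind speed, never exposed.
        // Toy model; real power scales with the cube of wind speed.
        return Math.min(5, 0.05 * this.windSpeedMps ** 3);
      }
    }

    // The tidal-swamp generator implements the same interface, with no
    // unused wind-speed member left behind to become a security hole.
    class TidalGenerator implements GenerationSource {
      constructor(public sourceId: string) {}
      currentOutputKw(): number {
        return 2.5; // simplified: steady output from tidal flow
      }
    }

The grid-side code then tests one interface, fully, rather than one well-exercised interface plus a rarely exercised variant.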
Anything we put on the smart grid will be there for some time. It will be upgraded numerous times and will coexist with other versions of itself. The interfaces should be few, because they will be there for a long time, implemented and patched by many programmers.