Be on notice: the term data historian doesn’t do justice to the pervasive role such systems are poised to play in today’s most dynamic industries, nor does it hint at the big part the technology will have in initiatives ranging from demand-driven manufacturing to plant energy management.
If data historians turn out to be the killer app nobody ever heard of, however, answers to how the systems will scale to accommodate applications involving millions of inputs and outputs—referred to as I/O or tags—are still emerging.
The answers, partial as they may be at this point, involve everything from how a data historian is bought and sold to who owns the metadata that makes it work.
It could be said that OSIsoft made the market for data historians, and after 20 years of effort the company today has annual revenues exceeding $100 million, while remaining privately held.
The OSIsoft PI System helps Microsoft manage energy use in the midst of a massive buildup of its data centers, needed to deliver broadband services to consumers. Bristol-Myers Squibb will use the PI System to tie together control and environmental systems in a greenfield, $750-million investment in large-scale cell-culture manufacturing. And the PI System is the corporate standard for data historians at biotech manufacturer Amgen’s six production plants.
Since at least early 2005, Alison Smith and Colin Masson of Boston-based AMR Research have been calling attention to the big role data historians will play in an era where multi-plant global operations are increasingly common.
According to AMR, data historians originated as special-purpose data repositories for environments in which vast quantities of time-series or temporal data are acquired and stored at very high rates. Historians allow users to archive and retrieve years of data in a fraction of the time of a conventional relational database.
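The storage pattern AMR describes can be pictured in miniature: an append-only store keyed by tag, where samples arrive in time order, so retrieving a time range is a pair of binary searches rather than a scan of a general-purpose relational table. The sketch below is illustrative only; the class and tag names are invented and reflect no actual historian's internals.

```python
import bisect

class MiniHistorian:
    """Toy append-only time-series store: one sorted list of times
    (and a parallel list of values) per tag.

    Illustrates why a historian retrieves ranges quickly: samples
    arrive in time order, so a range query is two binary searches
    plus a slice, not a scan of a relational table.
    """

    def __init__(self) -> None:
        self._series: dict[str, tuple[list[float], list[float]]] = {}

    def append(self, tag: str, timestamp: float, value: float) -> None:
        times, values = self._series.setdefault(tag, ([], []))
        times.append(timestamp)   # assumes monotonically increasing time
        values.append(value)

    def query(self, tag: str, start: float, end: float) -> list[tuple[float, float]]:
        """Return (time, value) pairs with start <= time <= end."""
        times, values = self._series.get(tag, ([], []))
        lo = bisect.bisect_left(times, start)
        hi = bisect.bisect_right(times, end)
        return list(zip(times[lo:hi], values[lo:hi]))
```

A production historian adds compression, interpolation, and distributed archives on top of this idea, but the time-ordered append and range lookup are the core of its speed advantage.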
The term “historian” has become less appropriate, however, as the systems evolved to become a real-time data-delivery mechanism, performing mathematical operations that aggregate and transform acquired data into operational intelligence used for decision support.
Moreover, says AMR, a historian today “provides a data abstraction layer between the real-time process realm and the more ponderous world of transaction-oriented business and financial systems… [and] offers a platform for aggregating, consolidating, and recalibrating data from various systems in the production environment.”
This is the role of data historians, for instance, in plant service-oriented architecture (SOA)-based software suites for production and performance management. And data historians have a larger role in managing valuable resources in the energy, utility, oil & gas, chemicals, and other process industries.
Scale and fidelity
In his opening remarks at the recent OSIsoft user group meeting, Patrick Kennedy, CEO, president and founder, noted that PI System is today an enterprise system.
“We haven’t even begun to saturate the number of tags and fidelity of response that will be required from PI in the future: millions of points, and nanosecond response times. But in scaling you arrive at a point where a system that includes dozens of interfaces and millions of points becomes difficult to manage. We’re addressing that challenge.”
One aspect of the challenge, says Gregg Le Blanc, OSIsoft director of technical strategy, is how users find needed information in a system that size. One answer is to organize based on assets, rather than tags, with different views of assets based on role. Le Blanc says two-million-tag systems already exist today.
Just as important, says Kennedy, are agreements with customers that make it easy to deploy and scale PI System.
In practice, the PI System is a tool kit applied in environments where deep understanding of real-time data is required. While perhaps brought in for some specific purpose, companies tend to build a practice around it and identify more applications over time.
If, however, the contractual relationship with the vendor is based on tag numbers, expanding use of the system can be cumbersome, if for nothing more than the paperwork involved. The enterprise agreement makes it easier to incrementally expand use while receiving needed vendor support.
Since last year at least 12 OSIsoft customers in process and utilities industries—of more than 200 total—have signed enterprise agreements.
Amgen recently signed an enterprise agreement to expand and standardize its use of PI System across the company.
“Tag- and server-based pricing limited adoption of PI,” says Rob Gamber, principal engineer, Amgen. “The enterprise agreement streamlined procurement and helped avoid unnecessary cost justifications. We now have the dedicated resources of ‘managed PI,’ a governance structure, cross-site support, and, to the degree possible, standard architectures, configurations, and processes.”
Le Blanc says finding needed information in large-scale PI implementations depends on proper use of the metadata in the PI Analysis Framework. It is this marriage of metadata and time-series data that delivers information in context to the user.
“We’ve given users more entry points into the system,” says Le Blanc. “You can start from an asset, such as a motor, a control system, or from the perspective of energy use.”
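The asset-centric navigation Le Blanc describes can be sketched as a thin metadata layer over the raw tags: each asset names the tags behind it, so a user can start from a motor or from energy use without knowing any tag identifiers. The asset names, tag strings, and functions below are hypothetical and do not reflect the PI Analysis Framework's actual schema.

```python
# Hypothetical asset metadata layer: maps named assets onto the
# underlying historian tags. Illustrative only.
ASSETS = {
    "pump_01": {
        "type": "motor",
        "tags": {"speed": "P01.SPD", "power_kw": "P01.KW"},
    },
    "chiller_02": {
        "type": "chiller",
        "tags": {"power_kw": "C02.KW", "supply_temp": "C02.TS"},
    },
}

def tags_for_asset(name: str) -> dict[str, str]:
    """Entry point by asset: everything recorded about one piece of gear."""
    return ASSETS[name]["tags"]

def tags_for_attribute(attribute: str) -> dict[str, str]:
    """Entry point by concern (e.g. energy use): collect one attribute
    across every asset that records it, with no tag names required
    from the user."""
    return {
        name: meta["tags"][attribute]
        for name, meta in ASSETS.items()
        if attribute in meta["tags"]
    }
```

Starting from `tags_for_attribute("power_kw")`, for instance, an energy manager reaches the power tags of every asset at once, while a maintenance engineer starts from `tags_for_asset("pump_01")`; the same metadata serves both roles.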
The problem is defining a data model, or taxonomy, acceptable to a widening range of users.
But as AMR’s Smith has pointed out, “There are no ‘universal’ manufacturing data models to accommodate the detail and diversity of individual manufacturing environments.”
“Standards are available,” says Le Blanc, “including S95 and S88, and people use them. Generally, it’s really more of a meeting of minds kind of thing, though, and AMR may end up playing a significant role by defining a unified practice for data models.”