Six tips for designing a manufacturing database
Figure it out before you get to 40-million records. This practical advice on database design is an Online Extra to the Control Engineering, June 2002 article, "How to Get the Most from a Database."
Designing and maintaining a manufacturing database is a complex task. Many issues need to be considered before adopting and deploying a solution. You do not want to be re-thinking your database strategy after you have logged 40-million records into your online production database!
Here is a partial list of recommendations to think about when planning a database strategy.
Minimize your islands of data
Normalize your data
De-normalize your data
Use declarative referential integrity
Avoid "smart" primary keys
Plan early for archiving
Each of these items is examined briefly below.
1. Minimize islands of data.
Keeping data in one central repository has several benefits. First, centrally managed data stands a better chance of being properly administered and secured. Second, you will know where your data lives instead of having to search across various computers and applications. Finally, it is likely that you will be able to adopt new business rules and data analysis tools that span what once were disparate data sets.
2. Normalize the data.
Normalized data is a term that is familiar to data modeling engineers. Normalization is a set of rules that leads to efficient and non-redundant storage of data in a relational model. The common goal for many database designers is to achieve "third normal form," which requires three main steps:
A. First Normal Form - Eliminates duplicate columns in tables and assigns a primary key. A primary key is a column or set of columns in a table that uniquely identifies a single record.
B. Second Normal Form - Removes subsets of data that apply to more than one row of a table and puts these data sets into a separate table that links to the original.
C. Third Normal Form - Removes any data that is not dependent on the primary key of the table. There are additional normalization levels that exist, but these three are generally accepted as the foundation of good database design.
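The three steps above can be sketched in code. The example below uses SQLite as a stand-in for a production database engine, and the table and column names are purely illustrative: machine attributes, which depend on the machine rather than on the work order, are split into their own table so they are stored once instead of repeating on every work-order row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Third normal form: machine attributes depend on machine_id,
# not on the work-order primary key, so they live in their own table.
cur.executescript("""
CREATE TABLE machine (
    machine_id   INTEGER PRIMARY KEY,
    machine_name TEXT NOT NULL,
    line_number  INTEGER NOT NULL
);
CREATE TABLE work_order (
    order_id   INTEGER PRIMARY KEY,   -- First NF: every row has a key
    machine_id INTEGER NOT NULL REFERENCES machine(machine_id),
    quantity   INTEGER NOT NULL
);
""")

cur.execute("INSERT INTO machine VALUES (1, 'Press A', 3)")
cur.execute("INSERT INTO work_order VALUES (100, 1, 500)")

# A join reassembles the flat view when an application needs it.
row = cur.execute("""
    SELECT w.order_id, m.machine_name, w.quantity
    FROM work_order w JOIN machine m ON m.machine_id = w.machine_id
""").fetchone()
print(row)  # (100, 'Press A', 500)
```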
3. De-normalize your data.
Isn't this a direct contradiction to the above step? Yes. When and why would you want to do this? Well, there is sometimes a tradeoff in database design between efficient storage of data and efficient access of data. For example, certain reports that are run against a highly normalized database may be computationally expensive, because many of the normalized data tables need to be joined together to produce the final report. In fact, data warehousing techniques generally involve some level of data de-normalization for more efficient reporting and analysis.
You also should consider historically accurate data. A reason for storing data in de-normalized form is to capture a complete and accurate snapshot of relevant data, without worry that one of the normalized data sets may change the derived data sometime in the future. For example, when you purchase an item online and go back and look at your account in the future, you don't see your account statement modified just because the price of the item changed after you bought it.
Normalized and de-normalized data sets can co-exist peacefully in a data model. The two techniques generally apply to different sets of data in your system.
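The online-purchase example above can be made concrete. This is a minimal sketch of the snapshot pattern, again using SQLite with hypothetical table names: the price paid is copied into the order line at purchase time, so later changes to the catalog price do not rewrite history.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE product (
    product_id INTEGER PRIMARY KEY,
    price      REAL NOT NULL
);
CREATE TABLE order_line (
    line_id    INTEGER PRIMARY KEY,
    product_id INTEGER NOT NULL REFERENCES product(product_id),
    paid_price REAL NOT NULL   -- de-normalized snapshot taken at purchase
);
""")
cur.execute("INSERT INTO product VALUES (1, 9.99)")

# Copy the current price into the order line at purchase time.
cur.execute("INSERT INTO order_line "
            "SELECT 1, 1, price FROM product WHERE product_id = 1")

# Later, the catalog price changes...
cur.execute("UPDATE product SET price = 12.99 WHERE product_id = 1")

# ...but the historical statement is unaffected.
paid = cur.execute(
    "SELECT paid_price FROM order_line WHERE line_id = 1").fetchone()[0]
print(paid)  # 9.99
```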
4. Use declarative referential integrity.
Your database engine is a powerful piece of software that contains many features to help keep data safe and consistent. Most enterprise database engines offer declarative referential integrity as one of these features. Declarative referential integrity allows you to specify rules about relationships among data in various tables. For example, you can declare a rule that will prevent deletion of a record in a particular table if there are records in other tables that depend on it. This is called a restriction. Another option is, when deleting a record in a table, to delete records in any related tables that are dependent on the primary record being deleted (e.g. if deleting an order, delete all of its line items). This is called a cascade.
Many software systems will build these sorts of validation checks directly into the code, but declarative referential integrity will ensure that your data is safe at the database level, even if your software forgets to check one of the rules.
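Both the restriction and the cascade can be declared in standard SQL. The sketch below uses SQLite (which enforces foreign keys only when the pragma is enabled) with hypothetical tables: deleting an order cascades to its line items, while deleting a machine that still has test results is refused by the engine itself.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customer_order (
    order_id INTEGER PRIMARY KEY
);
CREATE TABLE line_item (
    item_id  INTEGER PRIMARY KEY,
    order_id INTEGER NOT NULL
        REFERENCES customer_order(order_id) ON DELETE CASCADE
);
CREATE TABLE machine (
    machine_id INTEGER PRIMARY KEY
);
CREATE TABLE test_result (
    result_id  INTEGER PRIMARY KEY,
    machine_id INTEGER NOT NULL
        REFERENCES machine(machine_id) ON DELETE RESTRICT
);
""")
cur.execute("INSERT INTO customer_order VALUES (1)")
cur.execute("INSERT INTO line_item VALUES (10, 1)")
cur.execute("INSERT INTO machine VALUES (1)")
cur.execute("INSERT INTO test_result VALUES (1, 1)")

# Cascade: deleting the order removes its line items automatically.
cur.execute("DELETE FROM customer_order WHERE order_id = 1")
remaining = cur.execute("SELECT COUNT(*) FROM line_item").fetchone()[0]
print(remaining)  # 0

# Restriction: the engine refuses to orphan the test results.
restricted = False
try:
    cur.execute("DELETE FROM machine WHERE machine_id = 1")
except sqlite3.IntegrityError:
    restricted = True
print(restricted)  # True
```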
5. Avoid "smart" primary keys
A significant part of creating a relational database is to define the relationships among the different tables in the system. These steps are usually done as part of the normalization process described above and involve defining primary keys and foreign keys. Again, the primary key is the column or set of columns in a table that uniquely identifies a record. A foreign key is a column or set of columns in a table that "point" to a unique record in another table.
Consider, for example, a manufactured unit tracked through production. There may be several tables in the data model that store particular information about the unit such as its current state, model type, operation history, test results, etc. One possible primary key could be the concatenation of the model type and serial number of the unit. This would provide uniqueness and additional information about the unit (its model type) in one field. While this seems to be an intuitive thing to do, it can cause problems later in consistency and performance. First off, it probably violates third normal form. But more importantly, what happens if the model of the unit changes during production? You then need to regenerate the primary key, which can be a troublesome and computationally expensive operation.
Okay, so let's say you remove the model type information from the primary key. What if the serial number changes? You still need to go back and change the primary key and all references to it in dependent tables. In this case, the best method is to use an internal arbitrarily unique number for the unit, and treat the serial number as merely an attribute of the unit. All foreign key references would use the internal unique number to link to the main table and would be indifferent to changes in the serial number, model type, or any other attribute of the unit.
This method can also reduce storage requirements and improve overall database performance. Storing a foreign key that is a unique integer (4 bytes) is much more efficient than storing a 32-character serial number (32 bytes). Also, when joining related tables together, the comparison used to find matching keys will generally perform better for the same reasons.
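A sketch of the surrogate-key approach, using SQLite and hypothetical tables: the internal `unit_id` never changes, so the serial number and model type can be updated freely without touching any foreign key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE unit (
    unit_id       INTEGER PRIMARY KEY,   -- internal, arbitrary, never changes
    serial_number TEXT NOT NULL UNIQUE,  -- merely an attribute of the unit
    model_type    TEXT NOT NULL          -- also just an attribute
);
CREATE TABLE test_result (
    result_id INTEGER PRIMARY KEY,
    unit_id   INTEGER NOT NULL REFERENCES unit(unit_id),
    passed    INTEGER NOT NULL
);
""")
cur.execute("INSERT INTO unit VALUES (1, 'SN-0001', 'MODEL-A')")
cur.execute("INSERT INTO test_result VALUES (1, 1, 1)")

# The serial number changes during production; no foreign keys
# need to be regenerated because they reference unit_id.
cur.execute("UPDATE unit SET serial_number = 'SN-0001-R2' WHERE unit_id = 1")

row = cur.execute("""
    SELECT u.serial_number, t.passed
    FROM test_result t JOIN unit u ON u.unit_id = t.unit_id
""").fetchone()
print(row)  # ('SN-0001-R2', 1)
```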
6. Plan early for archiving
You probably won't be able to keep all of your production data online forever. Planning how much data you need to keep online is critical for maintaining a good-performing database. Also, knowing the capabilities of your database engine may help you plan a strategy. For example, Oracle provides a powerful feature that allows you to take parts of a table offline for backup, archiving, or deletion while the rest of the table is online in production. This is useful for keeping a recent window of data online while clearing out older data to another system.
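Engines without Oracle-style partitioning can still implement the "recent window online" strategy with plain SQL. A minimal sketch (SQLite, hypothetical tables): rows older than a cutoff date are copied to an archive table and then deleted from the online table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE production_log (
    log_id    INTEGER PRIMARY KEY,
    logged_at TEXT NOT NULL,   -- ISO-8601 date, sorts chronologically
    value     REAL NOT NULL
);
CREATE TABLE production_log_archive (
    log_id    INTEGER PRIMARY KEY,
    logged_at TEXT NOT NULL,
    value     REAL NOT NULL
);
""")
cur.executemany("INSERT INTO production_log VALUES (?, ?, ?)",
                [(1, "2002-01-15", 1.0), (2, "2002-05-20", 2.0)])

# Keep a recent window online; move everything older to the archive.
cutoff = "2002-03-01"
cur.execute("INSERT INTO production_log_archive "
            "SELECT * FROM production_log WHERE logged_at < ?", (cutoff,))
cur.execute("DELETE FROM production_log WHERE logged_at < ?", (cutoff,))
conn.commit()

online = cur.execute("SELECT COUNT(*) FROM production_log").fetchone()[0]
archived = cur.execute(
    "SELECT COUNT(*) FROM production_log_archive").fetchone()[0]
print(online, archived)  # 1 1
```

In a real deployment the copy and delete would run inside one transaction (or use the engine's partition-exchange feature) so a failure cannot lose or duplicate rows.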
In summary, spending some time planning ahead while designing your data model can save you many headaches down the road. The important thing is to define a consistent model based on understanding how your data will be used by your applications.
Rockwell Software FactoryTalk provides a common language for integration
A common language that enables manufacturing applications to integrate seamlessly across the enterprise, Rockwell Software FactoryTalk provides a set of capabilities that span many functional and technical areas of the manufacturing enterprise. The FactoryTalk Data Model delivers a robust and consistent data model that can serve as the repository backbone for manufacturing data. The model allows applications to build upon standard data entities and eliminates duplicate configuration and storage of 'islands' of data. It also enables users to consistently apply business rules that cross application boundaries, ensuring that one application cannot violate constraints that another part of the system has defined. The FactoryTalk Data Model is also extensible, so that subscribing applications can independently add domain-specific entities and attributes while keeping the core entities consistent.
Another defining characteristic of the FactoryTalk Data Model is its adherence to standards. Where applicable, the model was built upon the S95 standard, which describes a common set of entities for enterprise to control system integration. This not only reinforces the robustness of the data model, but it also makes it easier for users and external applications to integrate with the system.
Creation of a data model is a huge step in building a robust repository. However, without domain applications and tools to interact with the data model, it would still require a huge amount of work from integrators and users to build a solution. Rockwell Automation provides a suite of modular solutions that provide application functionality on top of the FactoryTalk Data Model. These solutions are grouped into three categories:
1. Application products that provide out-of-the-box functionality
2. Product tools that allow users to build custom solutions
3. Integration tools that allow users to build extensive custom solutions and extensions
Products like RSBizWare PlantMetrics provide out-of-the-box application functionality that uses and builds upon the FactoryTalk Data Model without requiring the user to understand, or even be aware of, the back-end data model. Installed products register with the FactoryTalk Data Model and instantly can share information that may have been created with other Rockwell Software products.
For customized data collection solutions, RSBizWare RSSql provides robust transactional capability to allow users to connect control systems to relational databases and other enterprise systems. Custom data collection applications typically involve writing several pieces of code. First, data must be gathered from control devices such as OPC servers. Next, the raw data needs to be bundled into a transaction, often requiring extra processing such as averaging or other mathematical or logical operations. Then, triggering logic needs to be defined that specifies how and when this collection occurs. Finally, the transaction needs to be sent to the database by generating the proper SQL code for the database engine being used.
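This is not RSSql's implementation, but a rough sketch of the hand-written alternative it replaces. The `read_tag` function is a hypothetical stand-in for polling an OPC server; the four steps (gather, bundle, trigger, send) map to the modules described above.

```python
import sqlite3
import statistics

def read_tag(tag_name):
    """Hypothetical stand-in for polling an OPC server tag;
    returns canned temperature samples for illustration."""
    return [70.0, 72.0, 74.0]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE batch_log (batch_id INTEGER, avg_temp REAL)")

# 1. Gather raw data from the control device.
raw = read_tag("Furnace.Temperature")

# 2. Bundle into a transaction, with extra processing (an average here).
avg_temp = statistics.mean(raw)

# 3. Triggering logic: commit only when the collection condition holds
#    (a batch-complete flag in this sketch).
batch_complete = True
if batch_complete:
    # 4. Send the transaction to the database as SQL.
    conn.execute("INSERT INTO batch_log VALUES (?, ?)", (1, avg_temp))
    conn.commit()

row = conn.execute("SELECT avg_temp FROM batch_log").fetchone()
print(row[0])  # 72.0
```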
RSSql eliminates the need to write any of the above code by delivering a set of services to provide a flexible data binding mechanism that allows a user to bind source data from a control system to fields in a database entity such as a table, view, or stored procedure. This is all done from a graphical user interface that hides the back-end database connectivity and SQL from the user.
Because it is bi-directional, RSSql also can write data from a database entity to a control system. This is useful for downloading recipes or setpoints to a controller based on an event that can originate in the control system. Users also can distribute RSSql components across several computers for high-performance applications. RSSql also employs a data caching mechanism to prevent data loss when computers, databases or networks become unavailable. RSSql supports many database dialects and ships with generic ODBC, OLE-DB and Oracle connectivity for maximum performance in high-transaction solutions.
Looking forward, the next integration layer of the FactoryTalk Data Model that is currently being built will deliver the capability to develop highly customized applications and extensions that interact with the FactoryTalk Data Model. The next release of RSBizWare will feature a set of Web services that wrap the FactoryTalk Data Model with additional application and system logic in a secure environment. This will become the programming interface to the FactoryTalk Data Model, usable from any platform and toolset that can consume standard WSDL and SOAP-based Web services. This will enable applications to be built that run on a spectrum of devices, from Web-enabled handheld devices up through server-class business systems.