Data reconciliation is a technique that aims to correct measurement errors caused by measurement noise. From a statistical point of view, the main assumption is that no systematic errors are present in the set of measurements, since systematic errors can bias the results or reduce the power of the reconciliation. Data reconciliation can therefore be defined as the process of correcting these random errors so that the measurements are brought to their most likely true values.
Data reconciliation relies on the concept of redundancy in order to correct the measurements and satisfy the process constraints. Here, redundancy is defined differently from redundancy in information theory. Instead, redundancy arises from combining sensor data; this is referred to as analytical redundancy.
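The core idea can be sketched as a weighted least-squares problem: adjust the measurements as little as possible (relative to their uncertainty) while forcing them to satisfy the process constraints. The following is a minimal sketch; the flow network, measured values, and covariances are illustrative assumptions, not data from the text.

```python
import numpy as np

# Hypothetical splitter: stream 1 divides into streams 2 and 3, so the
# mass balance constraint is x1 - x2 - x3 = 0, written as A x = 0.
A = np.array([[1.0, -1.0, -1.0]])

m = np.array([101.9, 68.5, 32.1])    # raw measured flows (illustrative)
cov = np.diag([0.25, 0.16, 0.09])    # assumed measurement error covariance

# Minimise (x - m)' cov^-1 (x - m) subject to A x = 0.  Lagrange
# multipliers give the closed-form correction:
#   x_hat = m - cov A' (A cov A')^-1 A m
correction = cov @ A.T @ np.linalg.solve(A @ cov @ A.T, A @ m)
x_hat = m - correction

print("reconciled:", x_hat)
print("balance residual:", A @ x_hat)   # essentially zero after reconciliation
```

Note how the sensor with the largest variance absorbs the largest share of the correction, which is exactly the statistical intent of reconciliation.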
Redundancy can result from sensor redundancy, where sensors are duplicated so that the same quantity is measured more than once. Redundancy also arises when a single variable can be estimated independently in several different ways from separate sets of measurements.
Redundancy is directly connected to the concept of observability. A variable is observable if the models and sensor measurements can be used to uniquely determine its value or the system state. A sensor is redundant if its removal causes no loss of observability. Rigorous definitions of observability and redundancy, along with procedures for determining them, were established by Stanley and Mah.
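For a linear, steady-state balance model A x = 0, these definitions can be made concrete with a rank test on the columns of A that correspond to unmeasured variables. The sketch below uses a hypothetical two-node network and is an illustration of the idea, not Stanley and Mah's exact procedure.

```python
import numpy as np

def observable_unmeasured(A, measured):
    """Return True if every unmeasured variable is observable.
    Splitting the columns of A into measured (Am) and unmeasured (Au)
    parts, the unmeasured variables solve Au xu = -Am xm and are
    uniquely determined exactly when Au has full column rank."""
    unmeasured = [j for j in range(A.shape[1]) if j not in measured]
    Au = A[:, unmeasured]
    return int(np.linalg.matrix_rank(Au)) == len(unmeasured)

def sensor_redundant(A, measured, j):
    """A measured variable j is redundant if dropping its sensor
    leaves the system fully observable."""
    return observable_unmeasured(A, set(measured) - {j})

# Hypothetical network: x1 = x2 + x3 (splitter) and x2 = x4 (pipe).
A = np.array([[1.0, -1.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])

print(observable_unmeasured(A, {0, 2}))   # x2 and x4 follow from the balances
print(observable_unmeasured(A, {0}))      # x2, x3, x4 not uniquely determined
print(sensor_redundant(A, {0, 1, 2}, 0))  # sensor on x1 can be removed
```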
Data filtering denotes the process of treating measured data so that the values become meaningful and lie within the range of expected values. Data filtering is necessary before the reconciliation process in order to increase the robustness of the reconciliation. There are several known ways of filtering data, for example averaging several measured values over a given time period.
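The averaging approach mentioned above can be sketched as a simple moving-average filter; the readings below are made-up values for illustration.

```python
import numpy as np

def moving_average(values, window):
    """Moving-average filter: replace each reading by the mean of the
    last `window` readings, damping random measurement noise before
    the values are passed on to reconciliation."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(values, dtype=float), kernel, mode="valid")

raw = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4]   # noisy readings (illustrative)
print(moving_average(raw, 3))               # smoothed series, 4 values
```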
Advanced data validation and reconciliation is an integrated approach combining data reconciliation and data validation techniques, which is normally characterized by:
- Complex models incorporating, besides mass balances, also thermodynamics, momentum balances, equilibrium constraints, and hydrodynamics.
- Gross error remediation techniques to ensure the meaningfulness of the reconciled values.
- Robust algorithms for solving the reconciliation problem.

Gross errors are measurement anomalies. It is therefore important to identify these gross errors during the reconciliation process. Statistical tests can then be applied that indicate whether or not such an error exists in the set of measurements.
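One widely used statistical test of this kind is the global (chi-square) test on the constraint residuals. The sketch below reuses the hypothetical splitter balance from earlier; the network, measurements, and covariances are illustrative assumptions.

```python
import numpy as np

# Hypothetical splitter balance: x1 - x2 - x3 = 0.
A = np.array([[1.0, -1.0, -1.0]])
cov = np.diag([0.25, 0.16, 0.09])    # assumed measurement error covariance

# Chi-square critical value for 1 degree of freedom at the 5% level.
CHI2_CRIT_1DOF_95 = 3.841

def global_test(m):
    """Global (chi-square) test: if only random errors are present, the
    balance residual r = A m yields a statistic
        gamma = r' (A cov A')^-1 r
    that follows a chi-square distribution with rank(A) degrees of
    freedom; a value above the critical threshold flags a gross error."""
    r = A @ m
    gamma = float(r @ np.linalg.solve(A @ cov @ A.T, r))
    return gamma, gamma > CHI2_CRIT_1DOF_95

print(global_test(np.array([101.9, 68.5, 32.1])))   # small imbalance: passes
print(global_test(np.array([120.0, 68.5, 32.1])))   # biased sensor: flagged
```

When the test flags a gross error, the offending measurement is typically located (for example by serial elimination) and removed or corrected before the reconciliation is rerun.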