Any project aimed at improving data quality, whether through normalization or through the identification of redundancies, should be carried out using an appropriate, well-established method.

The approach adopted here involves a number of fundamental stages, which form the foundation of the entire process:

  • Initial analysis
  • Iterative refining processes
  • System knowledge supply
  • Quality consolidation
  • Quality maintenance

The initial analysis stage comprises a series of preliminary steps: analysing the data through sampling, verifying the initial level of quality, identifying the main error characteristics and types, defining the basic rules, and determining the desired level of quality.
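These preliminary steps can be sketched in code. The record structure, the basic rules, and the sampling scheme below are illustrative assumptions, not part of the method itself:

```python
import random

# Hypothetical basic rules: each maps a field to a validity check.
BASIC_RULES = {
    "name": lambda v: bool(v and v.strip()),
    "zip":  lambda v: v.isdigit() and len(v) == 5,
}

def initial_analysis(records, sample_size, seed=0):
    """Sample the data, measure the initial level of quality, and
    count the main error types per field."""
    random.seed(seed)
    sample = random.sample(records, min(sample_size, len(records)))
    errors = {field: 0 for field in BASIC_RULES}
    valid = 0
    for rec in sample:
        ok = True
        for field, rule in BASIC_RULES.items():
            if not rule(rec.get(field, "")):
                errors[field] += 1
                ok = False
        valid += ok
    # The measured quality is compared against the desired level
    # to decide how much iterative refinement will be needed.
    quality = valid / len(sample)
    return quality, errors
```

The error counts returned here are what drive the definition of the basic rules for the operating stage.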

All this brings the project to the operating stage: a series of iterative steps that improve the level of quality through successive refinements. Preparing the systems with the defined rules, processing the data, analysing the results and defining new recycling rules make up an iterative cycle that, on each pass, releases the data that have reached the specified level of quality and reprocesses the data requiring further treatment.
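The release-and-recycle cycle above can be sketched as follows. Representing each rule as a (check, fix) pair is an assumption made for illustration:

```python
def iterative_refinement(records, rules, max_cycles=10):
    """On each cycle, process the pending data with the current rules,
    release the records that meet all checks, and recycle the rest.
    `rules` is a list of (check, fix) pairs."""
    released, pending = [], list(records)
    for _ in range(max_cycles):
        if not pending:
            break
        still_pending = []
        for rec in pending:
            for check, fix in rules:
                if not check(rec):
                    rec = fix(rec)
            if all(check(rec) for check, _ in rules):
                released.append(rec)
            else:
                still_pending.append(rec)
        pending = still_pending
        # In the real process, analysing `pending` at this point would
        # yield new recycling rules to apply on the next cycle.
    return released, pending
```

Records still pending after the final cycle are exactly the cases that require additional treatment or new rules.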

During the iterative processes, the system is supplied with the applied rules; in this way the system ‘learns’ to manage the new cases that emerge during the quality implementation process. New rules are entered into the system by setting parameters, by supplying a supporting database, or by direct implementation. The gradual growth of the system’s knowledge is fundamental to subsequent quality maintenance.
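A minimal sketch of this growing knowledge base, with the three ways of entering rules that the text describes; the class and method names are hypothetical:

```python
class RuleBase:
    """Accumulates the rules the system 'learns' during the
    quality implementation process."""
    def __init__(self):
        self.rules = []

    def add_parameter_rule(self, field, allowed):
        # Rule entered by setting parameters: the field must take
        # one of the allowed values.
        self.rules.append(lambda rec: rec.get(field) in allowed)

    def add_lookup_rule(self, field, table):
        # Rule entered by supplying a supporting database: the value
        # must appear in the lookup table.
        self.rules.append(lambda rec: rec.get(field) in table)

    def add_code_rule(self, check):
        # Rule entered by actual implementation: an arbitrary check.
        self.rules.append(check)

    def validate(self, rec):
        return all(rule(rec) for rule in self.rules)
```

Because the rules accumulate rather than being discarded after each cycle, the same knowledge base can later be reused for quality maintenance.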

Once the iterative processes have been completed, the consolidation stage is activated. This makes the quality implementation process effective by replacing any ‘incorrect’ data with the correct values. Consolidation can act directly on the data source or support the migration of the data towards other procedures.
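Both consolidation modes can be sketched with one function; representing the source as a keyed store and the corrections as a key-to-record mapping is an assumption for illustration:

```python
def consolidate(source, corrections, migrate_to=None):
    """Replace 'incorrect' records with their corrected values,
    either in place on the source (migrate_to is None) or while
    migrating the data to another store."""
    target = source if migrate_to is None else migrate_to
    for key, record in source.items():
        # Use the corrected record where one exists, else keep the original.
        target[key] = corrections.get(key, record)
    return target
```

In the migration case the source is left untouched, which matches consolidation that supports moving data towards other procedures rather than rewriting it at origin.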

The system, equipped with everything it has learned during the quality implementation process, is then integrated into the user procedures in order to filter and correct the data at the source, ensuring long-term maintenance of the achieved level of quality without the need for renewed massive interventions.