High-quality master data benefits companies in many ways. It is necessary to meet strategic requirements, carry out meaningful analyses and make sound business decisions. It is also much easier to comply with regulatory and legal requirements if the master data is well maintained. In sales, correct and complete data enables a 360° view of the customer and opens up lucrative cross-selling potential. The harmonization, integration and standardization of business processes also place high demands on data quality. In short: bad data does not yield good information.
Goal: Keeping data quality at a permanently high level
After a data migration, the aim should be to maintain the achieved data quality in the long term. After all, the efforts made to improve data quality before the data migration should not have been in vain. It is advisable to use the experience gained from the three previous phases of data analysis, data cleansing and data migration to set up suitable processes for maintaining data quality.
Keeping data quality at a high level involves avoiding duplicates, ensuring that addresses are correct and regularly updated, and validating and enriching the data. A suitable starting point for working with master data is the definition of an appropriate set of rules. This includes the uniform recording of telephone numbers and e-mail addresses, the correct recording of bank details, the definition of what constitutes a duplicate and the maintenance of EANs in the correct format.
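To illustrate what such a rule set can look like in practice, the following sketch expresses a few checks in Python. The field names and the specific phone and e-mail patterns are illustrative assumptions rather than an SAP standard; only the EAN-13 check digit follows the fixed GS1 algorithm.

```python
import re

def valid_phone(value: str) -> bool:
    """Illustrative rule: uniform E.164-style format, e.g. +4969123456."""
    return re.fullmatch(r"\+[1-9]\d{6,14}", value) is not None

def valid_email(value: str) -> bool:
    """Simple pattern check; productive rules may be stricter."""
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value) is not None

def valid_ean13(value: str) -> bool:
    """EAN-13 check digit according to GS1: alternating weights 1 and 3."""
    if not re.fullmatch(r"\d{13}", value):
        return False
    digits = [int(c) for c in value]
    checksum = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
    return (10 - checksum % 10) % 10 == digits[12]

# Hypothetical master data record with the fields the rules refer to
record = {"phone": "+4969123456", "email": "info@example.com", "ean": "4006381333931"}
rules = {"phone": valid_phone, "email": valid_email, "ean": valid_ean13}

violations = [field for field, check in rules.items() if not check(record[field])]
print(violations or "record complies with the rule set")
```

A rule set of this kind can run as a check when records are created as well as in recurring quality reports on the existing data.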
Scenarios for data validation
There are various scenarios for validating data after a data migration that help to improve data quality:
- Data quality check in SAP S/4HANA when creating a business partner
- Master Data Consolidation of Active Records
- Address validation and duplicate check with SAP Data Services
- Validation and checks when using SAP MDG
- Enrichment of data with additional business information
Master Data Consolidation of Active Records
The "Master Data Consolidation of Active Records" tool consolidates the active data available in the system. The optimal data record (best record) is determined according to defined rules and with the help of additional information from the duplicates. There are three different scenarios for consolidation:
- Deletion of duplicates: The duplicates are marked for deletion and the key mapping is directed to the best record.
- Refinement of the best record: The best record is enriched with all the information that is to be retained; the duplicates and the key mapping remain unchanged.
- Enhancement of all data records: The duplicates are retained but become copies of the best record; the key mapping remains unchanged.
After selecting the data records to be checked for duplicates, the matching results are determined on the basis of the existing configuration. The configuration defines which criteria a data record must meet in order to be considered a duplicate. The determined best records are then reviewed: the data records are validated against the defined rules and the results are activated in the database.
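The configuration and matching themselves take place inside the SAP tool; the following sketch merely illustrates the underlying principle of determining a best record within a duplicate group and refining it with information from the duplicates. The fields, the duplicate group and the "most complete record wins" rule are assumptions for illustration.

```python
# Minimal sketch of the consolidation principle: within a duplicate group,
# pick a best record and enrich it with information from the duplicates.
duplicates = [
    {"id": "BP-100", "name": "ACME GmbH", "city": "Berlin", "vat_id": None},
    {"id": "BP-245", "name": "ACME GmbH", "city": None,     "vat_id": "DE123456789"},
]

def completeness(record: dict) -> int:
    """Illustrative rule: the record with the most filled fields wins."""
    return sum(v is not None for v in record.values())

best = max(duplicates, key=completeness)

# Refinement of the best record: fill gaps from the remaining duplicates
for dup in duplicates:
    if dup is best:
        continue
    for field, value in dup.items():
        if best.get(field) is None and value is not None:
            best[field] = value

# Key mapping: duplicates point to the best record (deletion scenario)
key_mapping = {dup["id"]: best["id"] for dup in duplicates if dup is not best}
print(best, key_mapping)
```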
The flexible options for creating rules entail a certain degree of complexity, meaning that companies need to address the issue carefully. It is important to determine the optimum settings in order to actually achieve the objectives associated with consolidation.
Address validation and duplicate checking with SAP Data Services
SAP Data Services is the ETL tool from SAP. However, the functionalities required to improve data quality are not included as standard, but require an additional license. Address cleansing is based on postal directories and also includes the addition of missing postal details. If required, individual cleansing can be carried out, for example to correct format errors. A duplicate check is also included in the scope of services. Once the quality improvement measures have been completed, the optimized data is imported back into the SAP system.
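SAP Data Services performs address cleansing and duplicate matching with its own transforms and licensed postal directories. The following standalone sketch only illustrates the underlying idea of normalizing addresses and flagging likely duplicates by similarity; it uses the Python standard library, not the actual Data Services functionality, and the normalization rules and threshold are assumptions.

```python
from difflib import SequenceMatcher

def normalize(address: str) -> str:
    """Crude normalization: lower-case, expand a common abbreviation, strip punctuation."""
    cleaned = address.lower().replace("str.", "strasse").replace(",", " ")
    return " ".join(cleaned.split())

def is_duplicate(a: str, b: str, threshold: float = 0.9) -> bool:
    """Similarity-based match; the threshold is an assumption to be tuned."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

print(is_duplicate("Hauptstr. 5, 60311 Frankfurt", "Hauptstrasse 5 60311 Frankfurt"))  # True
```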
Validation and checks with SAP MDG
SAP MDG establishes central ownership for all master data, which is processed via change requests. A flexible workflow concept makes it possible to set up a master data control process tailored to the company, including quality checks and authorizations. To increase the quality of the master data, the data is checked in the SAP MDG governance processes against the SAP business logic and against rules defined in BRFplus. Data quality checks from external services can also be integrated.
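The actual checks run inside SAP MDG against the SAP business logic and BRFplus rules; the sketch below only illustrates the governance pattern of gating a change request behind rule checks before activation. The rule, field names and statuses are assumptions for illustration.

```python
from typing import Callable

def rule_vat_required(data: dict) -> list[str]:
    """Illustrative rule: German business partners must carry a VAT ID."""
    errors = []
    if not data.get("country"):
        errors.append("country is missing")
    if data.get("country") == "DE" and not data.get("vat_id"):
        errors.append("VAT ID is required for German business partners")
    return errors

def process_change_request(data: dict, rules: list[Callable[[dict], list[str]]]) -> str:
    """Governance gate: the request is only approved if every rule passes."""
    errors = [msg for rule in rules for msg in rule(data)]
    return "approved for activation" if not errors else f"sent back: {errors}"

print(process_change_request({"country": "DE", "vat_id": None}, [rule_vat_required]))
```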
Enrichment/validation using additional information
Data quality can also be improved using additional information from external sources. For example, current address information can be retrieved from map providers such as Google Maps or HERE. Business directories such as Dun & Bradstreet or CDQ offer the possibility of supplementing the data with detailed information about companies. HR data or tax data, geo-information, telecommunications data, contact persons, industry information and product information can also be obtained from external sources.
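The enrichment pattern itself is simple: query an external service and merge the answer into the master data record, as the following sketch indicates. The endpoint, parameters and response fields are placeholders and do not correspond to the actual Google Maps, HERE, Dun & Bradstreet or CDQ APIs, each of which has its own authentication and schema.

```python
import requests

def enrich_with_geo(record: dict, api_url: str, api_key: str) -> dict:
    """Placeholder enrichment call: URL, parameters and response fields are hypothetical."""
    response = requests.get(
        api_url,
        params={"address": record["address"], "key": api_key},
        timeout=10,
    )
    response.raise_for_status()
    payload = response.json()
    # Take over only the fields that are still missing in the master record
    for field in ("latitude", "longitude", "postal_code"):
        record.setdefault(field, payload.get(field))
    return record

# Usage with a hypothetical geocoding endpoint:
# record = enrich_with_geo({"address": "Hauptstrasse 5, Frankfurt"},
#                          "https://geocoder.example.com/lookup", "API_KEY")
```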
Conclusion: Efforts for data quality are worthwhile
It is a worthwhile goal to keep data quality permanently high in the new system after the data migration has been completed. Establishing effective measures to maintain high data quality ensures that companies benefit in the long term from the efforts made to optimize data quality during the data migration project. At the same time, high-quality data makes a valuable contribution to operational efficiency, customer satisfaction, compliance with legal requirements, lower costs and a stronger market position. All in all, measures to increase data quality are investments that contribute to the long-term competitiveness and success of companies.