High data quality is a key success factor for data migration, but in practice companies often fail to achieve it. Statistics show that in around 40% of all data migration projects, poor data quality leads to higher costs and delays. Low-quality data manifests itself in different ways: missing values in certain fields, for example, make it impossible to generate correct data content, while different spellings and formal errors encourage the creation of duplicates.

 


 

We migrate your data – for high-quality and verified data

Request now

 


 

The right time for data cleansing

When companies carry out data cleansing, they enable a lean and reliable data migration. This also includes identifying data that is no longer required and getting rid of this unnecessary ballast. For reasons of efficiency, data cleansing should take place before the actual migration: the effort required for data cleansing in the new system, i.e. after the migration, is disproportionately higher and is therefore not recommended. The basis for data cleansing is a careful analysis of the existing data and its quality. Such an inventory, which we explain in detail in another blog post, determines the need for action and provides the starting points for the subsequent data cleansing.

 

Data cleansing in five steps

The goal of data cleansing is clear: The data should be correct, complete, consistent and of high quality so that it can be transferred to the new system. But how do companies achieve this goal? In practice, a five-step approach to data cleansing is well established:

  • Scope definition

  • Field mapping

  • Value mapping

  • Address cleansing

  • Duplicate check

 

Step #1: Scope definition

Based on the results of a previous data analysis, the scope definition determines the exact scope of the data cleansing and processing. This makes it possible to define whether and which data needs to be cleansed. Information may also be missing, meaning that the data must be enriched with the help of external sources. Data that is no longer required can be deleted.

 

The scope definition also involves selecting the appropriate tool and the optimal procedure for data cleansing. Ideally, data cleansing is largely automated, which brings speed and efficiency benefits. However, some manual intervention is almost always necessary. The proportion of manual work varies depending on the type of data to be cleansed.

 

Step #2: Field mapping

The aim of field mapping is to define the field assignments between the source system and the target system. It may also be necessary to take fields from non-SAP systems into account. You also need to check whether certain fields are no longer required in the new system or no longer exist at all.
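As a minimal sketch, such a field assignment can be captured in a simple lookup table. The source field names below follow SAP conventions, while the target field names are purely illustrative assumptions:

```python
# Hypothetical field mapping between a source system and a target system.
# A value of None marks a source field that no longer exists in the target.
FIELD_MAP = {
    "NAME1": "BP_NAME",      # business partner name
    "STRAS": "STREET",       # street
    "ORT01": "CITY",         # city
    "PSTLZ": "POSTAL_CODE",  # postal code
    "LAND1": None,           # dropped here; handled in value mapping
}

def map_record(source: dict) -> dict:
    """Translate a source record into the target field layout."""
    target = {}
    for src_field, tgt_field in FIELD_MAP.items():
        if tgt_field is not None and src_field in source:
            target[tgt_field] = source[src_field]
    return target
```

Fields without a target assignment are simply not carried over, which is exactly the decision the field mapping step is meant to make explicit.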

 

Step #3: Value mapping

Value mapping deals with the question of which values may need to be converted in order to be able to use them in the target system. A typical example is the adjustment from a one-character (D) to a two-character (DE) country code. Field lengths may change in the new system, making an adjustment necessary. In addition, it must be clarified whether there are different company codes that are to be consolidated into one company code in the target system.
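A value mapping such as the country code adjustment can be sketched as a simple conversion table; the entries shown are illustrative, not exhaustive:

```python
# Illustrative conversion from one-character country codes to
# two-character ISO 3166-1 alpha-2 codes.
COUNTRY_MAP = {"D": "DE", "A": "AT", "F": "FR", "I": "IT"}

def map_country(code: str) -> str:
    """Convert a legacy one-character code; pass through codes
    that are already in the target format."""
    code = code.strip().upper()
    return COUNTRY_MAP.get(code, code)
```

The pass-through default matters in practice: records that were already maintained with the new code must not be corrupted by the conversion.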

 

Step #4: Address cleansing

During the ongoing operation of a system, often nobody feels responsible for keeping the address data up to date, so it becomes outdated over time. As part of address cleansing, addresses are adapted to the official spellings and incorrect spellings are corrected. In addition, the focus is on removing additionally recorded information that is not relevant for postal delivery from the street and town fields. Official street renamings are also taken into account and may result in a correction of the address. It is also possible to enrich the address data with additional information such as geocodes.
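A very simplified sketch of such a normalization rule is shown below; the patterns are assumed examples, whereas real address cleansing relies on official postal reference data:

```python
import re

def normalize_street(street: str) -> str:
    """Expand a common abbreviation and strip additions that are
    not relevant for postal delivery. Illustrative rules only."""
    # Expand the abbreviation "str." (e.g. "Hauptstr." -> "Hauptstrasse").
    street = re.sub(r"(?i)str\.", "strasse", street)
    # Remove bracketed additions such as "(Hinterhaus)".
    street = re.sub(r"\s*\(.*?\)", "", street)
    # Collapse repeated whitespace.
    return " ".join(street.split())
```

Each rule corresponds to one of the cleansing goals above: standardized spelling and a street field that contains only postal information.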

 

Step #5: Duplicate check

Address cleansing and the resulting standardized representation of addresses increase the probability of identifying duplicates in the data records. The first step is to clarify which fields should be used to identify duplicates: for example, company and address combinations that occur more than once, duplicate tax numbers and bank details, telephone numbers, but also material numbers or material descriptions. In addition, companies must define the degree of matching above which a data record is identified as a duplicate.
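The matching-degree threshold can be sketched with a standard string similarity measure; the 0.9 threshold below is an assumed example value that would be tuned per project:

```python
from difflib import SequenceMatcher

def is_duplicate(a: str, b: str, threshold: float = 0.9) -> bool:
    """Flag two normalized name-and-address strings as duplicates
    when their similarity ratio reaches the configured threshold."""
    ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return ratio >= threshold
```

A higher threshold reduces false positives but lets more real duplicates slip through; choosing it is exactly the business decision described above.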

 

Quality-assured data at a fixed price

As a proven expert in data migration projects, IBsolution offers address validation and duplicate cleansing at a fixed price. For companies facing a data migration, this offer comes with various advantages: They receive an overview of the existing data quality and do not have to spend their own personnel and time on data cleansing. The fixed price offer ensures budget security. Well-established processes and experience from countless data quality projects guarantee short lead times. Data handling in accordance with the European General Data Protection Regulation (GDPR) is also ensured.

 

Conclusion: Get started with clean data in the new system

Data cleansing is a crucial step in a data migration project. It improves data quality and ensures that no obsolete data is transferred to the new system. When companies carry out data cleansing before the actual migration, they ensure that the project runs smoothly, efficiently and reliably. This not only lays the foundation for a successful migration, but also sets the course for long-term data integrity in the new system.

 


 
