Duplicate Data Demystified

Hello! Looking at the myths surrounding duplicate data, and understanding how misleading they are, will help you get to grips with your data management and data quality.

And we hope this Data Diary article will help. We will explain the causes and effects of duplication, including legislation and compliance issues; how you can reduce the risk of adding duplicate data; the processes you can put in place to help reduce it; and how to remove it.

Duplicate data management starts with policies and procedures. With a little help from technology, you can deal with this issue while improving user adoption, efficiency, and customer satisfaction.

Duplicate Data Issues

The majority of CRM systems contain duplicate data to one degree or another; that is no mystery! Experian states that 95% of businesses have seen impacts related to poor-quality data and that a third of their customer and prospect data is inaccurate in some way. In addition, only 50% believe their CRM/ERP data is clean and can be fully leveraged.

Two decades of experience with business systems and sales force automation shows that the prime issues have always been poor-quality data and poor data management, whether that stems from the adoption of legacy data, general data entry, or a lack of data governance and standards.


Unbelievably, the situation today is little different. CRM and other business systems still do a poor job of giving users a great experience when finding and entering data.

As a result, it is not surprising that users add duplicate records, but users are not the only source of this issue.

Let’s look at some of the common myths around duplicate data.


MYTH 1: DATA QUALITY IS IMPROVED BY ENHANCING USER EXPERIENCE

False! Whilst user experience is important, businesses must first address what defines unique data within their systems to establish a baseline for data quality.

So, the key question is: what defines unique data within your system? This isn’t as easy as it sounds, as the definition of duplicate data differs from business to business. This is where corporate data governance and policies play their role, ensuring users follow guidelines in their daily tasks to maintain and ensure data quality. User experience is then enhanced as a natural consequence of better data quality, structured data, and formalized processes.
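As an illustration only, here is a minimal sketch of how a business might encode its own definition of a "unique" contact as a normalized match key. The field names and normalization rules (email, last name, postcode) are hypothetical assumptions, not a prescription from any particular CRM product.

import re

def contact_match_key(record: dict) -> tuple:
    """One possible business definition of a 'unique' contact:
    match on normalized email if present, otherwise on last name + postcode.
    The field names and rules here are illustrative assumptions."""
    email = (record.get("email") or "").strip().lower()
    if email:
        return ("email", email)
    name = re.sub(r"\s+", " ", (record.get("last_name") or "").strip().lower())
    postcode = re.sub(r"\s+", "", (record.get("postcode") or "").upper())
    return ("name_postcode", name, postcode)

# Two entries this policy would treat as the same contact:
a = {"email": "Jane.Doe@Example.com"}
b = {"email": " jane.doe@example.com "}
print(contact_match_key(a) == contact_match_key(b))   # True under this definition

A different business might legitimately choose a different key (for example, company registration number for accounts), which is exactly why the definition belongs in your data governance policy rather than in any one tool.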


MYTH 2: THE BIGGEST CAUSE OF DUPLICATE DATA IS FROM DATA ENTRY

False! Duplicate data coming from data entry is largely a result of users being unable to identify the presence of data already in the system, resulting in them creating costly duplicates. Therefore, the main issue is the users’ inability to successfully find existing data.

Yes, it is fair to say that data entry is a vulnerable element of the data quality process, and without appropriate safeguards and data governance policies in place, it can be a serious cause of duplicate data. However, if users are given effective means to search for and find existing data, they are much less likely to introduce duplicates, and they avoid wasting precious time and effort keying information that is already there. Preferably, systems should also adopt a data quality firewall process at the point of entry, providing systematic governance and an early warning as data is being entered.
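To make the data quality firewall idea concrete, here is a minimal sketch of a point-of-entry check. It assumes a simple in-memory index keyed on a normalized email; a real firewall would query the CRM's own search or duplicate-detection capabilities before a record is created, and the matching rule shown is purely an assumption for illustration.

class DuplicateWarning(Exception):
    """Raised when an incoming record looks like one already in the system."""

def email_key(record: dict) -> str:
    # Hypothetical matching rule: same contact if the email matches, case-insensitively.
    return (record.get("email") or "").strip().lower()

class DataQualityFirewall:
    """Point-of-entry duplicate check (illustrative only)."""

    def __init__(self, key_fn=email_key):
        self.key_fn = key_fn
        self.index = {}                      # normalized key -> existing record id

    def load_existing(self, records: dict) -> None:
        for rec_id, rec in records.items():
            self.index[self.key_fn(rec)] = rec_id

    def admit(self, record: dict) -> dict:
        key = self.key_fn(record)
        if key and key in self.index:
            raise DuplicateWarning(f"possible duplicate of record {self.index[key]}")
        return record                        # safe to create in the production system

firewall = DataQualityFirewall()
firewall.load_existing({"CRM-001": {"email": "jane.doe@example.com"}})
try:
    firewall.admit({"email": "Jane.Doe@example.com"})
except DuplicateWarning as warning:
    print(warning)                           # early warning before the duplicate is created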


MYTH 3: DATA SHOULD BE DEDUPLICATED BEFORE IMPORTING INTO SYSTEMS

False! The process of bringing new data into systems is not always simple or linear: some of the information may already be present, while other details may be new or updated. Deduplicating before import would risk losing important information.

Cleaning up your data outside of your production systems and avoiding the introduction of duplicate data into those environments might seem like the logical thing to do, i.e. keeping the CRM system clean of unwanted duplicates.

However, whilst we agree that data validation, data correction, and avoiding data duplication are all very important considerations, managing and suppressing duplicate data outside of the CRM system can have a significant number of drawbacks.

Bureau-Sourced Data - Firstly, if the external data was sourced from a data bureau service, what definition did they use to match their data against yours, and how accurate is that duplicate identification? Their flagging of potential duplicates could be grossly inaccurate, producing false positives and false negatives that mean duplicates are wrongly identified or wrongly missed.

Golden Record - Secondly, when external data is not present in the production system, it will always be classed as the duplicate and, as such, the unfavoured item in any master/duplicate conflict. Often it is preferred that the existing data always takes precedence over any external source; however, this rule can never be challenged if the external data is always suppressed from ever entering the production system.

Lost Data - Thirdly, and perhaps most important, is the consideration of potentially lost information. Suppressing a potential duplicate from ever entering the production system means that each duplicate item is discarded in its entirety, including all associated attributes. So yes, whilst external data may well cause duplicate conflict with existing data, it may also carry additional attributes that are not yet attributed to its existing counterpart. When permitted to enter the CRM system, the duplicates can be identified and this new data can be captured and reassigned (merged) into the existing surviving master record - thus forming a Master-Golden-Record and Single-Customer-View (SCV), as sketched below.
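As a rough illustration of that merge step, the sketch below folds a duplicate's attributes into the surviving master record so that nothing the duplicate carried is lost. The field handling is a deliberately simplified assumption; a real merge in a CRM would also reparent related activities, opportunities, and cases.

def merge_into_master(master: dict, duplicate: dict) -> dict:
    """Fold a duplicate record into the surviving master record.
    Existing master values take precedence; attributes present only on the
    duplicate are captured rather than discarded (simplified assumption)."""
    merged = dict(master)
    for field, value in duplicate.items():
        if value not in (None, "") and not merged.get(field):
            merged[field] = value            # new information carried over
    return merged

master = {"id": "CRM-001", "name": "Jane Doe", "phone": "", "email": "jane@example.com"}
incoming = {"name": "Jane Doe", "phone": "+44 20 7946 0000", "job_title": "CFO"}
print(merge_into_master(master, incoming))
# {'id': 'CRM-001', 'name': 'Jane Doe', 'phone': '+44 20 7946 0000',
#  'email': 'jane@example.com', 'job_title': 'CFO'}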


Paribus 365 is revolutionizing the way Microsoft Dynamics 365 users manage their data. As the proven DQ for Dynamics solution, it is your data quality guardian for Microsoft Dynamics 365, saving users precious time and empowering organizations to finally realize the true potential of their customer data.

FIND IT, TRUST IT, ENGAGE IT!

See Paribus 365, the Dynamics 365 data deduplication solution, in action by requesting a free 30-day trial, or contact us for a demo.