Improve your data quality

Data Quality Management to improve and control your data

Within an organisation, adequate data quality is vital for transactional and operational processes. Data quality may be affected by the way in which data is entered, handled and maintained. Quality therefore always depends on the context in which the data is used, so there is no single, universally valid quality benchmark.

PwC’s Enterprise Data Quality (EDQ) framework offers an inside-out perspective to assess an organisation’s key data quality capabilities and to design a comprehensive data quality roadmap that fast-tracks the adoption of key quality initiatives:

  • How do we provide definitions and standards for our data?
  • How do we define, measure and report data quality across transactional, reference and master data? (A minimal measurement sketch follows this list.)
  • How and where is data remediated?
  • How do we provide proper context, relationships and lineage for data?
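
To make the measurement question concrete, here is a minimal sketch, in Python with pandas, of how completeness, validity and uniqueness checks might be expressed over a customer table. The column names, regex rule and sample data are illustrative assumptions for this sketch, not part of the EDQ framework itself.

```python
import pandas as pd

# Illustrative customer records; the column names are assumptions.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 104],
    "email":       ["a@x.com", None, "b@y.com", "not-an-email"],
    "country":     ["IN", "IN", "US", None],
})

def completeness(series: pd.Series) -> float:
    """Share of values that are non-null."""
    return series.notna().mean()

def validity(series: pd.Series, pattern: str) -> float:
    """Share of non-null values matching a regex business rule."""
    non_null = series.dropna()
    return non_null.str.match(pattern).mean() if len(non_null) else 1.0

def uniqueness(series: pd.Series) -> float:
    """Share of rows whose key value appears exactly once."""
    return (~series.duplicated(keep=False)).mean()

print(f"email completeness : {completeness(df['email']):.0%}")
print(f"email validity     : {validity(df['email'], r'[^@]+@[^@]+\.[^@]+'):.0%}")
print(f"customer_id unique : {uniqueness(df['customer_id']):.0%}")
```

The same checks generalise across transactional, reference and master data; only the rules and key columns change.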
Data Quality - Overview

Our Services

  • Identify and suggest areas of focus based on enterprise data quality, using in-house tools and accelerators such as Marcopolo and DQLite
  • Help organisations establish a data quality framework that fits their business model and future goals – EDQ versus traditional DQ
  • Assist in developing the metrics, thresholds, policies, processes, people and roles needed to measure and track change over time
  • Accelerate setup through a tried-and-tested model, using AI/ML-driven automation to help organisations launch data quality programmes faster
  • Visualise data quality scorecards to measure data health and act faster on generated insights (a minimal scorecard sketch follows this list)
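
As an illustration of the scorecard idea, the sketch below aggregates per-check scores into a simple red/amber/green health view. The check names, weights and cut-offs are assumptions for the example, not PwC’s actual scoring model.

```python
# Aggregate per-check scores (0.0-1.0) into a simple RAG scorecard.
checks = {
    "email completeness": 0.92,
    "email validity":     0.81,
    "customer_id unique": 0.99,
}

def rag_status(score: float, amber: float = 0.90, green: float = 0.97) -> str:
    """Map a score to red/amber/green using fixed, assumed cut-offs."""
    if score >= green:
        return "GREEN"
    return "AMBER" if score >= amber else "RED"

overall = sum(checks.values()) / len(checks)  # unweighted average health
for name, score in checks.items():
    print(f"{name:<22} {score:6.1%}  {rag_status(score)}")
print(f"{'overall health':<22} {overall:6.1%}  {rag_status(overall)}")
```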
Our Data Quality Framework
  • DQF1 – Define Data Quality Scope and Approach
  • DQF2 – Define Data Quality Organisation
  • DQF3 – Define Data Quality Process
  • DQF4 – Develop Data Quality Technical Infrastructure
  • DQF5 – Provide Training
  • DQF6 – Rollout Data Quality Programme

Case studies

Leading Bank in India – Data Quality solution implementation

Challenge

The bank collects customer information from various sources such as core banking, credit cards, loans, demat accounts and third-party mutual funds.

The primary requirement was a single customer view in the existing data warehouse (DW) after de-duplication and clustering. House-holding, standardisation and some data augmentation were also part of the solution.

Approach

The PwC team implemented a data quality solution using SAS technology, through which customer information passes before being stored in the data warehouse. The approach included:

  • Fine-tuning the Quality Knowledge Base (QKB) in DataFlux (the DQ product) to suit the data provided by the bank
  • Applying this QKB within the overall solution framework implemented using SAS ETL Studio (an illustrative de-duplication sketch follows this list)
  • DQ Framework used: DQF1, DQF3
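
The engagement’s matching logic lived in the DataFlux QKB and SAS ETL Studio; as a language-neutral illustration of what de-duplication for a single customer view involves, here is a minimal Python sketch. The records, normalisation rule and similarity threshold are assumptions for the example.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Illustrative customer records drawn from different source systems.
records = [
    {"id": 1, "name": "Rahul Sharma",  "phone": "9820011122"},
    {"id": 2, "name": "Sharma, Rahul", "phone": "9820011122"},
    {"id": 3, "name": "Priya Nair",    "phone": "9830022233"},
]

def normalise(name: str) -> str:
    """Crude standardisation: lower-case and reorder 'Last, First' names."""
    parts = [p.strip() for p in name.split(",")]
    return " ".join(reversed(parts)).lower() if len(parts) == 2 else name.lower()

def is_match(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Treat records as one customer if phones match and names are close."""
    name_sim = SequenceMatcher(None, normalise(a["name"]), normalise(b["name"])).ratio()
    return a["phone"] == b["phone"] and name_sim >= threshold

for a, b in combinations(records, 2):
    if is_match(a, b):
        print(f"duplicate candidates: records {a['id']} and {b['id']}")
```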

Outcome

  • A single view of the customer, plus a GUI for user-defined field management
  • Cleansed data, ensuring a single version of the truth across customer information reporting
  • A foundation for customer data analytics

Financial Services Group – Data Quality tool evaluation and EDW Roadmap

Challenge

The organisation wished to review its customer access and targeting processes and systems across its business lines (AMC, insurance and distribution), with the goals of revenue uplift and operational efficiency, especially in front-line processes, i.e. sales, service and marketing.

PwC was engaged to define functional requirements, analyse gaps with respect to current application systems, design the technical and operational architecture, and recommend tools/solutions (BI, ETL, MDM, CRM and portal) for implementation in a phased manner.

Approach

  • Scan – PwC scanned the business and system environment, captured management’s vision and objectives, and gathered functional requirements from users in each line of business.
  • Focus – PwC studied current systems, performed gap analysis against requirements and identified feature gaps.
  • Act – PwC separated out the requirements that could be met by customising existing systems, developed the operational and technical architecture, and recommended an implementation blueprint for new applications and solution areas.
  • PwC also built the business case (ROI and cost-benefit analysis), including expected hard and soft benefits, for the proposed implementation.
  • DQ Framework used: DQF1, DQF4

Outcome

  • A scalable technology platform for a growing financial services business
  • Standardised deliveries to channel partners and customers
  • Measurable, improved productivity of teams and distribution
  • Offerings differentiated from the competition

Government ministry – Data Profiling Strategy & Implementation

Challenge

The client was looking to set up an enterprise data quality framework focused on building an accurate picture of its data, since data profiling promotes good data governance. It wanted a comprehensive data quality model that would eventually provide accurate metadata and complete metrics for understanding the data as it actually is, rather than as it was designed years ago.

There were a number of challenges while building the strategy for formulating this framework:

  • There were numerous data issues, including:
    • Duplicate data
    • Null values and outliers
    • Inconsistent attributes
    • Inaccurate data
  • Sample data from different lines of business needed to be collated sector-wise or per source system.
  • Source data were in Arabic

Approach

PwC applied its Enterprise Data Governance Framework 2.0 (EDG) to help the client drive the vision for data profiling and data quality checks and to examine the characteristics of its data. The approach comprised:

  • Translating source data from Arabic to English in bulk
  • Defining the data profiling measures (KPIs) to be examined and performing a data quality assessment
  • Discovering metadata (inferred and documented) and assessing its:
    • Accuracy
    • Duplication
    • Completeness
    • Consistency
  • Performing inter-table analysis and inferring relationships between tables
  • Providing best-practice guidance on data profiling tools, together with a detailed, phased implementation approach and roadmap for data governance initiatives (an illustrative profiling sketch follows this list)
  • DQ Framework used: DQF1, DQF2, DQF3, DQF4
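
As an illustration of the profiling steps above, here is a minimal Python sketch using pandas: per-column null, distinct and duplicate counts, plus a simple inter-table key check. The table and column names are invented for the example; the real sources were Arabic-language extracts.

```python
import pandas as pd

# Illustrative tables standing in for the translated LOB extracts.
citizens = pd.DataFrame({
    "national_id": ["A1", "A2", "A2", None],
    "sector":      ["health", "health", "health", "education"],
})
permits = pd.DataFrame({"national_id": ["A1", "A3"]})

# Column-level profile: completeness, cardinality and duplication.
profile = pd.DataFrame({
    "nulls":      citizens.isna().sum(),
    "distinct":   citizens.nunique(),
    "duplicates": citizens.apply(lambda col: col.duplicated().sum()),
})
print(profile)

# Inter-table analysis: what share of permit keys resolve to a citizen?
overlap = permits["national_id"].isin(citizens["national_id"]).mean()
print(f"permits.national_id found in citizens: {overlap:.0%}")
```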

Outcome

  • Leveraging the data governance framework to streamline the resolution of data issues
  • Creating sector-wise dashboards highlighting KPIs and basic data profiling metrics
  • Creating a statistics report covering descriptive statistics such as min, max, count, sum and length, plus recurring patterns and identification of data domains

Large Healthcare corporation USA – Data Quality Assessment

Challenge

The healthcare client distributes healthcare systems, medical supplies and pharmaceutical products. Additionally, it provides extensive network infrastructure for the healthcare industry.

The client actively used master data, such as vendor, customer and material masters, across various ERP and CRM systems for its master data management operations. It faced issues with incorrect details, inaccurate data, duplicates, inconsistent data and the integration of various database interfaces.

Approach

  • In-depth profiling of the client’s customer and product data, adhering to metadata business rules set forth by the client.
  • PwC used the Trillium Data Quality tool for profiling the data. Business rules were also defined for cleansing customer data and determining duplicate customer master records.
  • Customer addresses were validated against the USPS database within Trillium; invalid records were cleansed and valid addresses produced as output (an illustrative validation sketch follows this list).
  • DQ Framework used: DQF1, DQF3
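
Trillium and the USPS reference data are proprietary, so the following standalone Python sketch only approximates what rule-based address validation looks like. The state list, ZIP pattern and sample records are assumptions for the example.

```python
import re

# Illustrative US address records awaiting validation.
addresses = [
    {"street": "123 Main St", "state": "CA", "zip": "94105"},
    {"street": "",            "state": "XX", "zip": "1234"},
]

VALID_STATES = {"CA", "NY", "TX"}  # deliberately truncated for the example
ZIP_RE = re.compile(r"^\d{5}(-\d{4})?$")

def validate(addr: dict) -> list[str]:
    """Return the list of business rules this record violates."""
    issues = []
    if not addr["street"]:
        issues.append("missing street")
    if addr["state"] not in VALID_STATES:
        issues.append("unknown state code")
    if not ZIP_RE.match(addr["zip"]):
        issues.append("malformed ZIP")
    return issues

for addr in addresses:
    print(addr, "->", validate(addr) or "valid")
```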

Outcome

  • The results delivered by PwC helped the client determine whether source system data satisfied the data requirements, and assisted in defining cross-system update rules
  • The work surfaced data inconsistencies across the client’s different data sources
  • The client’s data compared well with industry standards, with the exception of a single source system that produced a high volume of duplicates

Petroleum corporation USA – Data Quality Assessment

Challenge

The client is a United States-based refiner, transporter and marketer of transportation fuels, lubricants, petrochemicals and other industrial products. It sought profiling of its customer data to establish a data quality baseline for critical fields, identify data quality issues and validate relationships between relevant tables.

Approach

  • Profiling of the client’s customer data, adhering to metadata business rules set forth by the client and the engagement team.
  • Business rules were defined for cleansing customer data and determining duplicate customer master records.
  • PwC used the Trillium Data Quality tool for profiling the data and validating business rules.
  • Customer addresses were also validated against the USPS database within Trillium; invalid records were cleansed and valid addresses produced as output.
  • DQ Framework used: DQF1, DQF3, DQF4, DQF5

Outcome

  • Technical profiling provided both high-level statistics and detailed results
  • Customer addresses were cleansed, yielding accurate address records
  • Merge candidates were identified and records merged accordingly, helping the client identify disparate data across its source systems

Leading Telecom Major in Australia – Data Quality & MDM Implementation

Challenge

The client had undertaken a massive data transformation project. The objective was to migrate data from nearly 200 existing billing and customer systems to the Siebel CRM application and the Kenan billing application.

In the process, the customer, service, product and provisioning information had to be cleansed and a master data set had to be maintained.

Approach

  • Investigate the in-scope legacy systems holding information about the customers targeted for migration and their accounts, products and services.
  • Understand the subject areas (e.g. entities and attributes, or tables and columns) in the legacy systems, including structure, definitions and business rules.
  • Define transformation rules, covering parsing the data, standardising the parsed data and enriching the standardised data.
  • Structure source data to conform to the requirements of the target staging model.
  • Where there is no clear identical match, work out and document the transformation rules to be applied, or work with the appropriate business representatives and vendor support to identify the appropriate source.
  • Where more than one system holds the required data, determine the survivorship rule (an illustrative survivorship sketch follows this list).
  • DQ Framework used: DQF1, DQF3, DQF4
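
To show what a survivorship rule can look like, here is a minimal Python sketch: prefer the most recently updated record, then back-fill missing fields from the others. The field names and the recency-first policy are assumptions for the example; real engagements also weigh source trust, completeness and business rules.

```python
from datetime import date

# The same customer as held in two legacy systems; field values differ.
candidates = [
    {"system": "billing", "email": None,            "phone": "0299991111",
     "updated": date(2007, 3, 1)},
    {"system": "crm",     "email": "r@example.com", "phone": None,
     "updated": date(2008, 6, 9)},
]

def survive(records: list[dict]) -> dict:
    """Pick the most recently updated record, then back-fill any missing
    fields from older records (one simple survivorship policy of many)."""
    ordered = sorted(records, key=lambda r: r["updated"], reverse=True)
    golden = dict(ordered[0])
    for rec in ordered[1:]:
        for field, value in rec.items():
            if golden.get(field) is None and value is not None:
                golden[field] = value
    return golden

print(survive(candidates))  # CRM email survives; billing phone back-fills
```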

Outcome

  • Successful migration of all customers from the various legacy systems
  • Reduced overhead from maintaining multiple legacy systems for various applications
  • A master data set for all customer, billing account, service and product information in the target system


Contact us

Sudipta Ghosh

Leader, Data & Analytics, PwC India

Tel: +91 22 6669 1311

Mukesh Deshpande

Data Management Leader, Consulting, PwC India

Tel: +91 98 4509 5391

Amit Lundia

Data Governance Leader, Consulting, PwC India

Tel: +91 98 3692 2881

Mohua Sen

Data Quality & MDM, Consulting, PwC India

Tel: +91 94 3330 1678
