Jayita Paul
Senior
Data Quality, Data Migration, Data Governance
United Kingdom
Experience
Other titles
Skills
I'm offering
Senior Technology professional with 14 years of overall experience, specializing in Data Migration, Data Quality & Data Governance. Extensive design, consulting and execution experience in data governance and quality management across Retail, Insurance, Banking & Financial Services. Experienced in delivering large data projects in waterfall and agile from onshore, managing cross-functional distributed teams.

Expertise in identifying, evaluating, and designing systems and procedures to meet strategic data transformation and business requirements in consensus with senior stakeholders. Experienced in requirements gathering, documenting BRDs, FSDs and change requests, and customer coordination.

Areas of expertise:
• Data Quality and Data Governance
• Data Integration
• Data Migration
• Data modelling
• Collibra - L1/LII/LIII Certified
• IDQ Certified - v9 and 10.x
• ETL - DataStage, Informatica
• DQ tools - IDQ, Information Analyser
• DG tools - Collibra, Axon, EDC
• PL/SQL
Markets
United Kingdom
Industries
Language
English
Fluent
My experience
2019 - Present
freelance
Data Governance & Data Quality Consultant
UK Retail Giant.
Aug 2019 - Present
Next-generation BI platform on cloud to sunset the on-premise SAS stack and re-platform the existing EDW from the SAS suite to a Microsoft Azure suite, including migration of existing reports to Power BI. It involved carrying out an assessment of Data Quality issues and Data Governance processes to enable the client on the new platform.
Key Responsibilities:
• Work closely with the Business to develop and implement the Data Governance Council's forward roadmap and work programme
• Ensure an effective data governance process is applied and continuously improved within Programme and Change Management
• Ensure good-quality data flows within the Data Council and between the Data Council and senior management
• Develop metrics in discussion with the Business and publish a suite of metrics for informed decision making
• Responsible for defining the data quality framework
• Responsible for Data Profiling, Data Transformation and standardization using Informatica IDQ
• Sprint planning with business stakeholders and the technical team
• Daily scrum with the engineering team; coaching, technical support and oversight
• Quality planning and validation of the technical deliverables
• Customer presentations for acceptance during sprint reviews
Benefits delivered:
• 25% reduction in data anomalies through implemented data quality rules
• Consolidation of multi-departmental spreadsheet reports into standard interactive Power BI reports across the organisation, with a 60% reduction in the number of reports
• Definition and adoption of uniform business terms throughout the organisation and implementation of data lineage through Collibra
2018 - 2019
job
Data Architect for UK-based Media Company
Migration of legacy data from Maconomy Insight to Maconomy Core, with data profiling, validation, standardization and transformation to create error-free records across the organisation. Implementation of rule-based profiling and remediation report generation for the business, for better decision making.
Key Responsibilities:
• Responsible for building the data pipeline, consisting of ETL and DQ design
• Defined strategies and processes for manual and automated cleansing rules
• Data Profiling and Data Mapping of legacy applications
• ETL architecture definition for Talend jobs
• Collaboration with offshore and onshore ETL development teams and senior developers to develop architectural requirements that meet customer needs
• Assessed requirements for completeness and accuracy
• Facilitated weekly workshops with client stakeholders to explain the current DQ trend and issues; maintained a DQ checklist, set data quality objectives, and checked the efficiency and functionality of these processes to make them fully or semi-automated
Benefits delivered:
• Improved data quality management and monitoring with an automated data pipeline
• Increased data accuracy by 15% in the target system
• Delivered the KPI for the data quality trend
2016 - 2017
job
Data Quality Architect
unknown.
Migration of the risk management and control framework used for monitoring, managing and reporting solvency risks. The project standardizes the data to improve reliability across systems during migration from R/3 to S/4 HANA.
Key Responsibilities:
• Responsible for defining the data quality framework
• Responsible for Data Profiling, Data Transformation and standardization using Informatica IDQ and Power BI
• Managing tasks and deadlines for IDQ teams offshore
• Point of contact for IDQ teams for reporting, testing, QA, project status and issues
• Gathering the functional requirements specification from the customer/business user, understanding it, and breaking the work into a task list based on a simple/medium/complex methodology
• Reviewing the high-level and detailed designs; weekly status reporting to the customer
• Performing thorough analysis of data to find anomalies such as data duplication and null values
• Developing IDQ mapplets for cleansing and standardizing fields such as names, phone numbers, SSNs and emails, and creating scorecards
• ETL layer creation, involving creation of workflows, sessions, command tasks, email tasks, decision tasks and various transformations as per the design requirements
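The anomaly checks described above (duplicate detection, null checks) were built in Informatica IDQ. As an illustration only, the same kind of rule-based profiling can be sketched in pandas; the column names and sample records here are hypothetical, not from the actual project:

```python
import pandas as pd

# Hypothetical sample of records to profile
df = pd.DataFrame({
    "ssn":   ["123-45-6789", "123-45-6789", None, "987-65-4321"],
    "email": ["a@x.com", "a@x.com", "b@x.com", None],
})

def profile(df: pd.DataFrame) -> dict:
    """Rule-based profiling: per-column null rate and full-row duplicate count."""
    return {
        "null_rate": df.isna().mean().to_dict(),      # fraction of nulls per column
        "duplicate_rows": int(df.duplicated().sum()), # rows identical to an earlier row
    }

report = profile(df)
# One fully duplicated row; one null in each of the two columns
```

A real IDQ implementation additionally attaches remediation rules and scorecards to these checks; the sketch covers only the detection step.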
2014 - 2016
job
Data Migration Lead
unknown.
Cleaning, transformation and migration of semi-structured dealer information into golden records for a large US consumer bank.
Key Responsibilities:
• Leading the project from offshore and deploying it onshore; communicating with the client to analyse business needs, provide the best integration solution and manage expectations
• Engaged in Design, SIT, UAT and performance testing
• Involved in code reviews
• Work consisted of change requirements, analysis, testing and implementation
• Responsible for creating and maintaining project-related technical documents
• Developed DataStage jobs and sequences using stages like XML, Filter, Join, Lookup, Remove Duplicates, Sort and Transformer, and a few activities in sequences
IBM Data Analyst for US Retail Giant
2008 - 2011
job
Datastage Developer
Target.
Apr 2008 - Oct 2011
DataStage SAP R/3 PACK and SAP BW PACK are used to integrate with the ECC systems. TGT$100 is the business and technology effort to transform financial applications and related processes with a Finance Integrated Network that will support Target as a $100+ billion company.
Key Responsibilities:
• Developed DataStage jobs and sequences using stages like Oracle/DB2 Source, Filter, Join, Lookup, Remove Duplicates, Sort and Transformer, and a few activities in sequences
• Worked on packages, procedures, shared containers and generic jobs; extracted data from SAP and legacy systems, transformed it using business logic, and loaded it into the target SAP system; understood the MSDs (Mapping Spec Documents) received from the client and developed the TS for the same; involved in unit testing and integration testing
• Involved in implementation and migration of code to the production environment; faced the onshore team for daily status and updates; kept the object tracker updated; reviewed code; understood/analysed business requirements; assisted the PL in estimating effort; conducted code reviews, performance evaluation and application performance tuning
My education
2000 - 2003
National Institute of Technology
Master's, Computer Applications
Jayita's reviews
Jayita has not received any reviews on Worksome.