Enterprise Architect
Dilip Pungliya, London, United Kingdom
I'm offering
● Twenty-two years of professional experience, with a core competency of twelve years as a solution and data architect.
● Proven track record of delivering end-to-end data solutions, including ERP/CRM systems, covering data modelling, data dictionaries, enterprise metadata lineage, data quality, and governance for multiple business areas in the challenging and continuously evolving domain of investment and commercial banking.
● Expertise in data analysis, working with data stewards to produce taxonomies, definitions, groupings, and mappings for multiple investment-banking data domains such as reference data, trades (Equities, FX, Fixed Income, and Derivatives), pricing, and finance.
● Created processes to on-board vendor software within the data domain, including shortlisting, RFI, RFP, and workshops built around the organisation's real use cases to check fit against current business scenarios and future strategic goals.
● Developed, led, and delivered feasibility studies, strategy development, roadmaps, designs, operating and target operating models, and standards-compliant architecture selection, on time and within budget, using agile, iterative, and waterfall methodologies for business areas in investment and commercial banking.
● Excellent communicator with proven skills in managing swift interaction between business, IT services, and senior management for clients across the globe.
SKILLS AND EXPERIENCE
Analyst: Requirements, specifications, acceptance criteria, definition of done, implementation strategy and phases, expectation management, and situation management.
Architecture: Technical and solution architecture (cloud, on-prem, or hybrid), target operating models, business process models (BPM), capability models, and conceptual and logical models with the Zachman or DAMA frameworks, using tools such as Sparx Enterprise Architect, PowerDesigner, ER/Studio, Erwin, and Visio; user stories using UML, ERDs, and XSDs.
Data governance: Metadata lineage, data quality rules, business glossaries, and data dictionaries using tools such as Collibra and Solidatus.
Vendor software tools: Solidatus and Collibra for data governance; GoldenSource for enterprise data management (EDM).
Databases: MS SQL Server 2017, Oracle 12c, and MongoDB 4.
BI and data warehouse tools: SSIS, SSAS, OLAP (MOLAP, ROLAP, HOLAP), SQL Anywhere, and SSRS, using the Kimball and Inmon approaches.
Programming: SQL, Python, XMLSpy, .NET scripting, and PowerShell scripting.
Regulatory: Hands-on experience with GDPR and BCBS 239.
Markets
United States
United Kingdom
Germany
Lithuania
Denmark
Norway
Sweden
Finland
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Available
My experience
2017 - ?
Solution and Data Architect
ICBC Standard Bank.
As part of the enterprise architecture team, worked to deliver solution architecture for:
• A data governance programme that delivered transparency to the businesses and management team over data usage, data flows, and ownership, using the Solidatus lineage tool. The programme also delivered capabilities for regulatory requirements, i.e. GDPR and BCBS 239.
• Documented data flows with various stakeholders to provide data lineage, which improved IT and business processes and the change-impact cycle.
• Standardised the business process model and terms with business owners, documenting and agreeing definitions, synonyms, and associations, and implemented them against the underlying technical data flows.
• Defined data quality rules to establish a process for managing business-rule checks on the data.
• Created a process for on-boarding vendor software to validate how well the software delivered the capabilities the vendor claimed, and optimised the process for quicker vendor on-boarding.
• Delivered the enterprise data management (EDM) programme to live as a centralised repository: a one-stop shop for the golden copy of referential and trade data.
Responsibilities:
• Built the capabilities required by the bank to manage and govern data using the Solidatus lineage tool.
• Built an end-to-end vendor software evaluation process: defining the required capabilities, comparing capabilities available in the market as initial screening, running the RFI and RFP processes, and building checks against the organisation's real use cases.
• Produced solutions covering scoping, target operating model, implementation strategy, approach, acceptance criteria, and definition of done.
• Worked with the governance council and business owners to standardise and build the data dictionary, business glossary, and data quality rules, and worked with the various owners to put the changes into practice.
• Analysed various trade message and reference data formats, such as FpML, Murex MXML, and in-house trade messages, to understand trade flows and transformations.
• Worked with various system owners to produce end-to-end lineage for metadata and data flows, then mapped it to business terms. Changed the application release cycle to include these mapping updates in each release, keeping the lineage relevant.
• As an architect on the EDM programme, produced the solution design and model for party data, and the message delivery formats for party, product, and instrument.
Tools and technologies used: Oracle, MS SQL Server, Azure and on-prem infrastructure, Solidatus, GoldenSource, PowerDesigner 16.5, and Visio.
2012 - 2017
Solution and Data Architect
BNP Paribas.
As part of the commodity front-office trade booking team, worked to deliver the solution and data architecture for:
• A data warehouse system delivering various real-time reports for valuation, risk, profit & loss, EMIR regulatory, and accounting purposes.
• An emission valuation engine: an algorithm to calculate the weighted average of the emission certificates required by the finance team to produce the company's overall position.
• Performance tuning and automation of functionality such as instrument creation based on complex present-value and risk formulae, and physical delivery of oil and gas for US trading.
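The weighted-average calculation behind an emission valuation engine of this kind can be sketched in a few lines; the function, field layout, and lot values below are hypothetical illustrations, not the actual engine.

```python
# Illustrative sketch of a quantity-weighted average price, the core of a
# weighted-average position calculation. All names and values are made up.

def weighted_average_price(positions):
    """positions: iterable of (quantity, price) pairs.
    Returns the quantity-weighted average price."""
    total_qty = sum(qty for qty, _ in positions)
    if total_qty == 0:
        raise ValueError("total quantity is zero")
    return sum(qty * price for qty, price in positions) / total_qty

# Example: three hypothetical certificate lots bought at different prices.
lots = [(100, 24.50), (250, 25.10), (150, 24.80)]
avg = weighted_average_price(lots)  # quantity-weighted price of the position
```

In practice the quantities and prices would come from the trade store rather than literals; the aggregation itself stays the same.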
Responsibilities:
• Worked with various business and system owners to produce the required capabilities and functionality for the commodities enterprise warehouse system.
• Produced the solution design, target operating model, implementation strategy, infrastructure requirements, bus matrices, data model, and metadata lineage for the enterprise data warehouse system.
• Produced end-to-end lineage for the data flows produced and consumed by the commodities system across various formats and sources, such as Oracle and SQL Server databases, web services, and XML.
• Designed and modelled the enterprise data warehouse using the dimensional modelling recommended by Kimball's methodology.
• Worked with the enterprise architecture team on various database models, performance tuning, infrastructure restructuring, security, and standardisation for different commodities trading systems.
• Worked to define data migration, storage capacity, partitioning, compression, and related policies.
• Used the Scrum agile development methodology for project deliverables.
Tools and technologies used: MS SQL Server, C# scripting, SSIS, SSAS, SSRS, Power BI, PowerShell scripting, Sparx Enterprise Architect, Collibra, and Team Foundation Server.
2010 - 2012
Data and Solution Architect
HSBC PLC.
As part of the asset management team, worked to deliver the solution and data architectures for:
The asset management reconciliation system, whose two principal use cases were:
• To compare NAVs, dividends, performance, and synthetic risk data between the internally calculated and externally published systems, ensuring the external system is not in breach through investment-related false recommendations to the end customer.
• To provide hedging recommendations against a fund's base currency when it is traded in a different currency on a different exchange.
The research IT system, whose principal use cases were:
• To help analysts capture research information and produce research reports, based on the company's standard wording, that guide investors in their investment decisions, supporting markets such as equity, FI, and FX products.
• To help management verify report accuracy by checking the ratings and comments provided by end customers on published research reports.
Responsibilities:
• Worked with various business and technical owners to produce requirements, use cases, and user stories.
• Worked to produce the current operating model, the target operating model, and the impact on the business.
• Worked to build the solution architecture, implementation plan, scope, data model using a star-schema design, and process design.
• Developed the data mart cube, analytical dashboards, and canned reporting, and managed the development team.
Tools and technologies used: MS SQL Server, .NET C#, SSIS, SSAS (MOLAP), SSRS, SQLXML, XML, and Visio 2003 architect edition.
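A star-schema data model of the kind used for this reconciliation work can be sketched with one fact table joined to dimension tables on surrogate keys. This is a minimal Kimball-style illustration using an in-memory SQLite database; every table, column, and value here is hypothetical, not the actual bank schema.

```python
import sqlite3

# Minimal star schema: a central fact table referencing dimension tables
# via surrogate keys. All names and figures are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_fund (
    fund_key      INTEGER PRIMARY KEY,
    fund_name     TEXT,
    base_currency TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- e.g. 20240131
    calendar_date TEXT
);
CREATE TABLE fact_nav (
    fund_key     INTEGER REFERENCES dim_fund(fund_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    internal_nav REAL,   -- internally calculated NAV
    external_nav REAL    -- externally published NAV
);
""")
conn.execute("INSERT INTO dim_fund VALUES (1, 'Global Equity Fund', 'GBP')")
conn.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31')")
conn.execute("INSERT INTO fact_nav VALUES (1, 20240131, 102.40, 102.45)")

# A typical reconciliation query: join the fact to its dimensions and
# report the internal-vs-external NAV difference per fund and date.
row = conn.execute("""
    SELECT f.fund_name, d.calendar_date,
           n.external_nav - n.internal_nav AS nav_diff
    FROM fact_nav n
    JOIN dim_fund f ON f.fund_key = n.fund_key
    JOIN dim_date d ON d.date_key = n.date_key
""").fetchone()
```

The design choice is the Kimball one: descriptive attributes live in the dimensions, measures in the fact table, so reconciliation reports reduce to simple joins and aggregations.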
2007 - 2008
Developer and Database Architect for a marketing data project
MRM Worldwide.
2004 - 2006
Team Lead, Database Architect, and Developer
Ness Technologies India Ltd.
For various projects for clients across Europe and the USA.
1999 - 2001
Module Lead and Developer
Sodhani Securities.
For various projects in the area of equities trading and securities clearing.
1997 - 1999
Developer, Marketing
Babulin Pharma Pvt. Ltd.
For commissions and payrolls.
My education
Amravati University
Bachelor's, Electrical & Electronics (Power Systems)
Dilip's reviews
Dilip has not received any reviews on Worksome.