Senior
Data Engineer
Hardik Talati
Slough, United Kingdom
Experience
Other titles
Skills
I'm offering
Experienced in delivering data engineering solutions for online platforms, data management and data science projects.
• Experience working with the Hadoop ecosystem: HDFS, HBase, Hive, Pig, Spark (Spark SQL, PySpark), Oozie, etc.
• Experienced Python developer.
• Experience working on cloud platforms such as Azure and GCP.
• Certified Base & Advanced SAS Programmer for SAS 9.
• Expertise in building scalable, robust and fault-tolerant data pipelines. Built and maintained a star-schema data warehouse through ETL jobs using Spark, Hive, Teradata, SAS and SQL.
• Experience using version control tools such as Git.
• Experience using and building CI/CD pipelines with tools such as Jenkins and Azure DevOps.
• Exposure to working in Agile environments and Scrum teams.
• Experience leading projects and delivering successful solutions on the data engineering side.
• Good knowledge of and exposure to Kubernetes and Docker.
• Used statistical analysis procedures such as ANOVA, t-tests, chi-square and regression to generate periodic business analysis reports.
• Excellent problem solving, data analysis and complex report generation skills.
• Quick learner and excellent team player; consistently meets deadlines and works well under pressure.
• Highly motivated individual with excellent organizational and interpersonal skills.
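The star-schema warehousing mentioned above can be sketched in miniature. This is an illustrative toy only, using Python's built-in sqlite3 so it runs anywhere; the actual pipelines described in this profile use Spark, Hive and Teradata, and all table and column names here are invented.

```python
import sqlite3

# Toy star schema: one fact table referencing two dimension tables.
# Table and column names are invented for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE dim_date    (date_id    INTEGER PRIMARY KEY, iso_date TEXT UNIQUE);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
""")

# "Extract": raw rows as they might arrive from an upstream feed.
raw = [("Widget", "2019-01-01", 10.0), ("Widget", "2019-01-02", 5.0)]

# "Transform/Load": upsert dimensions, then load facts with surrogate keys.
for name, iso_date, amount in raw:
    cur.execute("INSERT OR IGNORE INTO dim_product(name) VALUES (?)", (name,))
    cur.execute("INSERT OR IGNORE INTO dim_date(iso_date) VALUES (?)", (iso_date,))
    pid = cur.execute("SELECT product_id FROM dim_product WHERE name = ?",
                      (name,)).fetchone()[0]
    did = cur.execute("SELECT date_id FROM dim_date WHERE iso_date = ?",
                      (iso_date,)).fetchone()[0]
    cur.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", (pid, did, amount))

# Analytical queries join the fact table back to its dimensions.
total = cur.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name
""").fetchall()
print(total)  # -> [('Widget', 15.0)]
```

The same pattern (surrogate keys in dimensions, narrow fact table, aggregation via joins) is what a Spark/Hive star-schema job does at scale.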
Markets
United Kingdom
Industries
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Full time contractor
Available
My experience
2019 - ?
job
Data Engineer
Mars Wrigley.
- Work as lead engineer on various data science projects for supply chain. Train and upskill junior engineers and vendors.
- Responsible for design and architecture of end-to-end data engineering on the project.
- Build scalable engineering patterns that can be reused widely across projects.
- Work with vendors/consultancies on building a support model for the supply chain applications.
- Heavy use of Python and PySpark on Azure Databricks for data engineering.
- Use databases such as Hive, Cosmos DB and SQL for storing reusable data assets as needed.
- Deployed Elasticsearch and Kibana on an AKS cluster for application logging and monitoring.
- Used Sphinx and other auto-documentation frameworks for creating technical and business documents.
- Used Power BI, Tableau and web apps for building front-end applications.
- Work on POCs for multiple cloud platforms.
Spark, Frameworks, Patterns, Kibana, Monitoring, Support, Hive, Data engineering, Design, Engineering, Architecture, Elasticsearch, Tableau, Cloud, Power BI, Azure, Data Science, Python, SQL
2018 - 2019
job
Data Science-Data Engineer
Lloyds Banking Group.
• Leading data engineering for various data science projects (mortgages, insurance, income verification, customer segmentation, fraud detection, etc.).
• Designing data models and data pipelines using big data technologies.
• Building scalable, automated data pipelines from structured, semi-structured and unstructured data using Spark (PySpark).
• Extensively used Hive and HDFS for storing and managing data.
• Used different file formats and compression in HDFS and Hive for efficient data management.
• Used HBase for storage, integrated with Hive, for accessing versioned data and performing updates and deletes.
• Also used Google Cloud Storage for data science experimentation work.
• Using Git for version control.
• Used Python libraries such as pandas and NumPy.
• Used Pytest for unit testing.
• Also built Jenkins pipelines with support from a DevOps engineer.
• Ensuring data quality and data collection from the data lake and other structured and unstructured sources.
• Using the big data platform along with SAS and Teradata platforms for building high-quality data stores/marts.
• Mentoring junior engineers on different analytical tools.
Spark, PySpark, Google Cloud Storage, Data collection, Testing, Insurance, Teradata, Support, Hive, Data quality, Data engineering, Python, Engineering, Integration, SAS, Management, Cloud, Mentoring, DevOps, Jenkins, Big Data, Data Science, Git
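The Pytest unit-testing practice mentioned above can be illustrated with a minimal, hypothetical pipeline step. The function and field names are invented for the sketch; the real pipelines transform Spark DataFrames, but the same test shape applies.

```python
# Hypothetical pipeline step: normalise raw customer records before loading.
# In the real pipelines this would be a PySpark transformation; plain Python
# keeps the sketch self-contained and runnable.

def normalise_record(record: dict) -> dict:
    """Trim whitespace, lower-case emails, and default a missing segment."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
        "segment": record.get("segment") or "unknown",
    }

# Pytest discovers functions named test_*; each bare assert becomes a check.
def test_normalise_record_defaults_segment():
    out = normalise_record({"name": " Ada ", "email": "ADA@X.COM "})
    assert out == {"name": "Ada", "email": "ada@x.com", "segment": "unknown"}

def test_normalise_record_keeps_existing_segment():
    out = normalise_record({"name": "Bo", "email": "b@x.com", "segment": "retail"})
    assert out["segment"] == "retail"
```

Saved as e.g. `test_normalise.py`, these run with a bare `pytest` invocation; isolating each transform behind a small pure function is what makes pipeline code testable this way.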
2017 - 2018
freelance
SAS Consultant
Business Data Partners.
Designed and developed data marts for a large retail bank using Hadoop and SAS (credit card analytics).
• Developed data pipelines using Hadoop for data processing and storage, and used SAS VA for the reporting solution.
• Developed and designed reports and dashboards using SAS Visual Analytics for various credit card functions such as fraud, collections, applications and portfolio performance.
• Also developed reports using SAS Add-Ins for external stakeholders.
• Mentoring junior analysts on Hive, HDFS, SAS tools, and the design of reports and marts.
Tesco Underwriting Ltd - Data Developer
September 2015 - August 2017
• Responsible for the timely and accurate production of internal (daily, monthly and quarterly MI for underwriting, pricing and actuarial) and external (quarterly, half-year and yearly MI for reinsurance, ABI and others) management information to support the reporting requirements of the business, and performed analytical project work.
• Delivery of data to assist with underwriting/pricing analysis and related projects aimed at improving the delivery of management information to the business to assist with business decision making.
• Worked with the Data/MI team to develop, design, maintain and enhance a comprehensive data warehouse for home insurance in SAS EG, ensuring all data is clearly defined and consistent throughout the organization.
• Developed, designed and automated various pricing/underwriting projects with SAS Base/SAS EG for motor and home insurance.
• Mentoring and training junior analysts on different SAS tools (SAS Base, SAS EG and SAS VA).
• Interrogating all data sources to provide reports and ad hoc data requirements using SAS Visual Analytics/SAS EG.
• Performed SAS administration tasks on the Linux platform via Linux scripting and SAS Management Console.
• Producing regular business performance MI for analysis, executive and governance purposes in a consistent, accurate and timely manner.
• Identified and recommended opportunities to optimise profitable growth through appropriate MI analysis.
• Ensured workload is managed in order to meet deadlines for MI requirements.
• Worked with the business to ensure that appropriate MI is captured and developed to support all new initiatives.
• Ensuring data reporting and manipulation is compliant with the TU Data Control Policy; maintained all documentation and data definition schedules where appropriate.
Analytics, Processing, Production, Underwriting, Storage, Insurance, Analyst, Support, Growth, Hive, Developer, Design, SAS, Management, Hadoop, Data Warehouse, Mentoring, Scripting, Training, Retail, Administration, Linux
2015 - 2015
job
Business Information Analyst
West London Mental Health NHS Trust.
Responsible for high-level analysis, interpretation, synthesis and presentation of information reports to meet specific objectives and ensure managers are advised of any potential problem areas.
• Worked as an effective and integral part of the CSU Business and Performance function to ensure information meets service and commissioning needs.
• Identified and highlighted any areas of concern with regard to information quality and accuracy, and undertook appropriate action as required.
• Ensured all data extracted from information systems balanced back to core applications, and supported and led any necessary corrective actions.
• Acted as the core link between the CSU and the Business Intelligence team, supporting any information requirements for report development, amendments and data quality issues.
• Led on the provision of detailed analysis of performance against SLAs for each service.
• Used SQL Management Studio to extract data and develop reports.
• Constructed SQL queries to generate efficient reports for NHS England and commissioners.
• Ensured consistency of data standards is applied, and supported the implementation of new or changed data standards or reporting requirements.
SQL, Business Intelligence, Management, Service, Data quality, Implementation, Analyst, Development, SQL Management Studio
2014 - 2015
job
SAS Data Developer
3P BI solutions Ltd.
As a voluntary SAS analyst, I was involved in a project that primarily required developing ETL jobs, validating tables and analysing datasets.
• Developed, automated and documented new data marts by building ETL jobs in SAS DI Studio.
• Registered and updated metadata for tables, libraries, jobs and transformations in SAS DI Studio.
• Generated OLAP cubes as per the hierarchy levels and measures requirements, for use in different web tiers.
• For data transformation and manipulation, extensively used procedures such as SQL, TRANSPOSE, TABULATE, COPY, SORT, DATASETS, etc.
• Extensively worked on PROC REPORT and PROC TABULATE to produce descriptive statistical reports for validation purposes.
• Involved in writing code using DATA step programming and macros to extract, clean and validate data from external sources.
• Data merging and data subsetting with PROC SQL, MERGE and SET statements.
• Imported and exported data files to and from SAS using PROC IMPORT and PROC EXPORT, from Excel and various delimited text files such as .TXT (tab-delimited) and .CSV (comma-delimited), into SAS datasets for analysis.
Sql, Excel, Writing, ETL, SAS, Developer, Transformation, Analyst, Web, OLAP
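The SAS-style merge and subset described above (MERGE with a BY key, or a PROC SQL join plus WHERE) is a general pattern; here is a hedged sketch of the same key-merge in plain Python, with invented dataset and variable names.

```python
# SAS-style MERGE: combine two datasets on a shared BY key.
# Dataset and variable names are invented for illustration.
left  = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Ben"}]
right = [{"id": 2, "score": 90},   {"id": 3, "score": 75}]

def merge_by(left, right, key):
    """Full outer merge on `key`, like MERGE left right; BY key; in a DATA step."""
    keys = sorted({row[key] for row in left} | {row[key] for row in right})
    l = {row[key]: row for row in left}
    r = {row[key]: row for row in right}
    merged = []
    for k in keys:
        row = {key: k}
        row.update(l.get(k, {}))  # contribute left-side variables, if present
        row.update(r.get(k, {}))  # right side wins on overlapping variables
        merged.append(row)
    return merged

merged = merge_by(left, right, "id")

# Subsetting, as with a WHERE clause in PROC SQL:
passed = [row for row in merged if row.get("score", 0) >= 80]
print(passed)  # -> [{'id': 2, 'name': 'Ben', 'score': 90}]
```

Unmatched keys survive with missing variables, mirroring how a SAS MERGE without IN= filtering keeps non-matching observations.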
My education
Staffordshire University
MSc, N/A
University Of Liverpool
BSc, Psychology & Forensic Biology
n/a
Secondary, Physics, Chemistry, Biology & English