Senior Data Engineer specializing in AWS
John Beeston, Buckingham, United Kingdom
I'm offering
A Senior Data Engineer with 19 years' experience in a variety of roles within the field. Successfully delivered projects for blue-chip and other clients across multiple sectors: Banking, Insurance, Telecoms, Retail, Travel, Betting, Hospitality and Public. Full-lifecycle knowledge of both waterfall and agile methodologies. Comfortable using many ETL, database, cloud and reporting technologies. An articulate, well-rounded and adaptable person, capable of fitting into or leading any team, or working independently as needed.
Markets
United Kingdom
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Full time contractor
Available
My experience
2019 - 2020
freelance
Senior Data Engineer
Sporting Index.
Contract/Permanent: Contract, 1 Extension
Skills: AWS, Python, Snowflake, Oracle, Lambda (Serverless), Boto3, RESTful APIs,
Step Functions, DMS, S3, Secrets Manager, VPC, IAM, KMS, CloudWatch, SNS,
SQS, CloudFormation, CodePipeline, GitHub, Cloud9, Pandas, PL/SQL
● Sporting Index required their legacy Oracle Management Information System to be migrated to Snowflake. I
evaluated different options and toolsets, and reverse-engineered the legacy PL/SQL code
● Designed and built the historical data migration system using DMS. Rewrote the existing PL/SQL in AWS/Python/Lambda
● Worked as the Data Architect:
○ Gathered requirements for the system, determined scope
○ Evaluated different options and toolsets
○ Specified and built the Proof of Concept
○ Designed and documented the final solution
○ Communicated progress with stakeholders
● Was the sole developer on the project, building a system with the following capabilities:
○ Near real-time data integration from Oracle databases, utilizing RabbitMQ writing to S3, triggering events
managed by SQS
○ Batch loading for historical datasets using DMS
○ Set up of the Snowflake database
○ Event driven, serverless, data loading (ETL) into Snowflake using S3, Lambda, Step Functions, Python
and Pandas
○ Logging in CloudWatch & Step Functions
○ Secure access to databases & applications using IAM and Secrets Manager
○ Alerts on failures using SNS
○ Continuous integration, deploying code and infrastructure in minutes, set up using CodePipeline,
CodeBuild, CodeDeploy, CloudFormation and GitHub
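The event-driven loading pattern described above (S3 events queued in SQS, triggering serverless loads into Snowflake) can be sketched roughly as follows. This is a hypothetical illustration, not the actual Sporting Index code: the state-machine ARN and event wiring are invented.

```python
import json


def parse_s3_event(event):
    """Extract (bucket, key) pairs from an S3 event notification.

    Handles both events delivered directly to Lambda and events wrapped
    in SQS messages, as in the S3 -> SQS -> Lambda pattern above.
    """
    records = []
    for rec in event.get("Records", []):
        # SQS-wrapped events carry the S3 notification JSON in the message body
        body = json.loads(rec["body"]) if "body" in rec else rec
        for s3_rec in body.get("Records", [body]):
            s3 = s3_rec.get("s3")
            if s3:
                records.append((s3["bucket"]["name"], s3["object"]["key"]))
    return records


def handler(event, context):
    """Lambda entry point: start one Step Functions execution per landed file."""
    import boto3  # imported lazily so the module loads without AWS dependencies
    sfn = boto3.client("stepfunctions")
    for bucket, key in parse_s3_event(event):
        sfn.start_execution(
            # hypothetical state machine that runs the Snowflake load steps
            stateMachineArn="arn:aws:states:eu-west-1:123456789012:stateMachine:load-snowflake",
            input=json.dumps({"bucket": bucket, "key": key}),
        )
```

The pure parsing helper is kept separate from the AWS calls so it can be unit-tested without mocking boto3.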
2019 - 2019
freelance
Senior Data Engineer
Vodafone.
Contract/Permanent: Contract
Skills: AWS, Athena, Python, Lambda (Serverless), Boto3, Step Functions, S3,
CloudWatch, SQS, Cloud9, CloudFormation, SAM
● Designed and built the Ingestion, Storage and Extraction application for the provision of mobile phone sightings
data to various data processors.
● Migration from Oracle Exadata transformed an expensive system into one that costs only £125/month, saving
the business approximately £1 million per year.
● Worked as the Data Architect (as above)
● Also did the hands-on development on the project, building a system with the following capabilities:
○ Near real-time data processing from the S3 landing area, performing Athena DDL, conversion to Parquet
format, partitioning, decryption, compression, de-duplication and filtering of data using Python (Pandas dataframes).
○ All controlled using serverless code written in Lambda, with the queue managed via SQS, and Step
Functions for workflow/orchestration.
○ Extraction was written in Lambda utilizing Athena SQL via the boto3 API.
○ Data load was highly parallel using Lambda, allowing 50TB of historical data to be ingested in a couple of
days; the old Exadata loading approach would have taken weeks.
2018 - 2019
freelance
Senior Data Engineer
Veeve.
Contract/Permanent: Contract, 2 Extensions
Skills: AWS, Python, Redshift, Aurora, MySQL, MariaDB, SQL Server, Lambda
(Serverless), Boto3, Step Functions, DMS, S3, EC2, RDS, Secrets Manager,
VPC, IAM, CloudWatch, SNS, CloudTrail, PowerShell, GitHub, Google
OpenRefine
● Evaluated different options and toolsets, and designed and built the Extract-Load-Transform (ELT) pipeline for the Veeve Redshift Data Warehouse
● Worked as the Data Architect (as above)
● Was the sole developer on the project, building a system with the following capabilities:
○ Near real-time data integration from MySQL RDS & SQL Server EC2 databases, utilizing Change Data
Capture (CDC) and DMS
○ Overnight batch loading for initial large datasets
○ Event driven, serverless, data loading of Redshift using S3, Lambda, Step Functions & Python
○ Dynamically generated changes to table structures (DDL) and ELT, allowing new data sources to be added
without writing more code
○ Automatic retention of temporal data to enable analysis of historical data (changes)
○ Dynamically generated workflows in Step Functions based on JSON configuration files
○ Cleansing & standardization of data using Google Open Refine
○ Logging in CloudWatch & Step Functions
○ Secure access to databases & applications using IAM and Secrets Manager
○ Alerts on failures using SNS
2018 - 2018
job
Head of Development (Business Intelligence)
SSP
Contract/Permanent: Permanent
Skills: Scrum, Kanban, Agile Coaching, Line Management, Resource Planning,
Stakeholder Management, Management Reporting
● Ran workshops for 30 people to evangelize agile methods, which were successfully adopted
across multiple delivery teams
● Constructed the IT delivery roadmap with input from Product Owners
● Re-shaped the IT org structure to align to products and ensured teams were sufficiently resourced to deliver the roadmap
2014 - 2018
job
Senior Data Engineering Consultant
Agile Solutions.
Contract/Permanent: Permanent
Tech Skills: SSIS, Tableau, SQL Server, TSQL, Informatica PowerCenter, Oracle, TDD, VB.NET
● Accountable for the successful delivery of Data Engineering and Business Intelligence projects across 12
different client sites. Planned projects and wrote statements of work; managed risks and issues; performed
project governance and stakeholder-management duties as the first point of escalation; evangelized agile
methods through workshops; supported pre-sales; wrote the Agile Information Management (AIM) delivery
methodology; line-managed the Southern practice
● As well as management, continued to do hands-on billable development work for clients
Project 1: Events management client
○ Developed ETL mappings in SSIS to populate a new finance data mart, and created Tableau
dashboards / visualisations for presenting Profit & Loss Accounts & Exhibition Rental reports for hall
space contracted and sold for Exhibitions/Events. The reporting functionality enabled efficiency
improvements allowing all finance teams to focus on business decision support.
Project 2: Public sector client
○ Was a developer on a team of 3 developers that built SSIS mappings to migrate data from a legacy
data model to a new data warehouse in SQL Server.
Project 3: Blue chip Insurance client
○ Multifaceted role as Scrum Master, Solution Architect and Developer - wrote the high-level design,
designed the data model, defined development standards, designed batch processing system,
developed Informatica mappings, wrote test scripts, and peer reviewed the work of other developers.
○ In the first phase, delivered a complex regulatory reporting project in 12 weeks, with 4 other developers.
The team delivered 78 Informatica Powercenter mappings, 2 workflows, 5 Business Objects reports, and staging workflows were migrated from Oracle to SQL Server. Project was genuinely delivered on
time and to budget. This led to 2 further phases of repeat business for this project at the same client.
○ Defined and successfully adopted test-driven development, resulting in defect free Production
releases.
Project 4: Blue chip Banking client
○ On an Informatica Data Quality project, wrote the functional spec and obtained sign-off.
Project 5: Payroll SME
○ Short 2-week project building Informatica PowerCenter mappings, including Java transformations.
2010 - 2014
job
Data Engineering Team Lead
TUI Travel.
Contract/Permanent: Permanent
Tech Skills: Informatica PowerCenter, Oracle, TDD, CI, Unix, Perl
● Successfully delivered a new Data Warehouse following the Kimball methodology for the Phoenix Program as
a developer and technical lead, whilst in parallel delivering Business as Usual (BAU), and other projects for the Airline and Cruise parts of the business.
● Management of a development team of 11 developers. Evangelized and implemented an agile methodology for
the MI and Integration team. Performed the Scrum Master role in addition to the technical lead role.
● Wrote the requirements for a Test-Driven Development application, purchased through a 3rd party vendor.
● Designed and developed continuous integration using Hudson, Perl, DBMaestro and SVN.
● Wrote the development standards for the MI and Integration team covering Informatica, Oracle and Unix.
Designed the "Batch Controls" part of the new data warehouse to ensure robustness and code re-use.
● Ensured infrastructure and licence agreements were fit for purpose and current.
● Won an award for "Technical Excellence and Innovation".
2004 - 2006
job
Senior Data Analyst
Trafalgar Tours.
Contract/Permanent: Permanent
Tech Skills: Oracle, PL/SQL, QAS, Dataflux, VB, Oracle Warehouse Builder, Discoverer
● Technical Lead. Estimated times of tasks, project planning. Mentored junior developers.
● Gathered requirements and then profiled existing data using Dataflux in order to help define Business Rules.
● Designed, developed and supported the data warehouse using Oracle 10g as the database, Oracle
Warehouse Builder for ETL, Dataflux and QAS for address cleansing, and VBScript for batch automation.
● Performance tuned the SQL & PL/SQL used in the ETL process to provide a 24-hour turnaround from data-load
to OLAP reporting in Discoverer Plus.
2004 - 2004
job
Data Developer
C & C Group.
Placed at: Cisco Systems
Contract/Permanent: Permanent
Tech Skills: Oracle, PL/SQL, SQL, MS Access, Microstrategy
1999 - 2004
job
Senior Data Processing Executive
TNS.
Contract/Permanent: Permanent
Tech Skills: Oracle, PL/SQL, SQL, VB, Microstrategy
● Designed the data model, developed and supported Oracle databases for clients (e.g. Burgerking and JNLR).
Developed associated ETL to store data.
1998 - 1999
temp
Various Temporary Office Roles, Purchase Ledger & Credit Control
My education
1995 - 1998
UMIST
Bachelors, Civil Engineering