Expert
0 jobs
Data Engineer
Olakunle Kuye
Birmingham, United Kingdom
Experience
Other titles
Skills
I'm offering
I have contributed to positive business results through effective organisation, prioritisation, and follow-through on key organisational projects. My strengths and qualifications are an ideal match for a Data Engineering role and will bring immediate value to any firm.
In my previous roles, I took a calculated and systematic approach to problem-solving. While I am independently motivated, I value collective effort and collaborate productively in group settings. Among other skills, I am competent in data analysis and process development, with particular proficiency in process improvement.
My problem-solving, critical thinking, and motivation will support your continued organisational efforts.
To illustrate the scope of my career history and professional competencies, my roles are detailed below.
Markets
United Kingdom
Industries
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Full time contractor
Available
My experience
2019 - 2020
job
Business Intelligence Developer
NCP.
• Designing, building, testing, and maintaining data pipelines connecting operational systems and data for analytics and BI systems.
• Using Azure Data Factory, Cosmos DB, Azure DevOps and Power BI.
• Designed, maintained and optimised data warehouse and ELT/ETL pipeline solutions to maximise performance, using Pentaho Data Integration (Kettle and Spoon).
• Designing and maintaining on-premises, transactional and analytical MS SQL Server infrastructure.
• Designing and developing data warehouse solutions.
• Developed Azure Function Apps in C# that, among other functions, consumed REST APIs.
Data Warehouse, Power BI, SQL Server, Testing, Infrastructure, Analytics, Integration, C#, REST APIs, ETL, DevOps, Azure, Business Intelligence
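The role above centres on building ETL pipelines that move data from operational systems into a warehouse. As a loose illustration only (not the actual Pentaho or Azure Data Factory pipelines from this role), here is a minimal stdlib-Python sketch of the extract-transform-load pattern; the table, column names and sample data are all hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical sample standing in for an operational system's CSV export.
RAW_CSV = """order_id,amount,currency
1001,250.00,GBP
1002,99.50,GBP
1003,,GBP
"""

def extract(text):
    """Extract: read raw rows from the CSV export."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop incomplete rows and cast amounts to numbers."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["amount"]  # skip rows with a missing amount
    ]

def load(rows, conn):
    """Load: write the cleaned rows into a warehouse staging table."""
    conn.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO stg_orders VALUES (:order_id, :amount)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM stg_orders").fetchone()
print(total)  # the row with the missing amount is filtered out
```

In a production pipeline each stage would be a separate, scheduled, monitored step; the point here is only the staged extract → transform → load flow.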
2018 - 2019
job
Data Engineer
WEJO.
• Engineering the company's data platforms for scale, performance, reliability and security.
• Work with other members of the Data Engineering team to design and build significant data streaming capabilities using AWS Data Pipeline, S3, SQS, SNS, EMR and Lambda, as well as leveraging technologies like Scala, Spark, Pulsar and Kafka.
• Work with product owners and business analysts to analyse business requirements, design and implement data processing pipelines and the associated data and database structures, and fine-tune performance to meet those requirements.
• Review new external data sets and open data sources to understand potential usage.
• Work with Infrastructure and DevOps teams to release and maintain live products.
• Process large datasets using Scala/Spark running as a transient cluster on AWS EMR.
• Design, implement and test all data processing systems.
• Participate in establishing processes and best practices around development standards, version control, quality control, deployment, maintenance and change management.
Data Engineering, Streaming, Lambda, Spark, Scala, Kafka, Security, Infrastructure, Databases, Deployment, DevOps, AWS, Change Management
2018 - 2018
job
Scala Developer
Shop Direct Group.
• Spark/Scala coding, unit testing and system testing.
• Involved in designing and building infrastructure and an application for an API using AWS ALB, EC2, Cassandra and Scala (Akka framework).
• Jenkins deployment using blue/green deployments.
• Involved in building a Kafka cluster using Puppet and CloudFormation.
• Involved in building streaming jobs using Kafka Streams, with a number of source and sink connectors used in the clusters.
• Agile backlog grooming.
• Used Docker to deploy Python packages as AWS Lambda layers.
• Defined acceptance criteria for QA and business analysts.
• Writing technical design documentation (high and low level) as required.
• Liaising with the QA team to ensure that documentation is fit for purpose.
• Working with the systems team to perform load, performance and destructive testing.
• Developing CI/CD pipelines for production and pre-production environments, using primarily AWS services such as S3, SQS, SNS, EMR and Lambda, and leveraging Kafka and Cassandra.
Kafka, CI/CD, Akka, Lambda, Streaming, Testing, Infrastructure, Cassandra, Spark, QA, Scala, Deployment, Agile, Jenkins, AWS, Docker, API, Python
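The streaming work above follows the Kafka Streams pattern of a source, a stateful transform, and a sink. As a rough stdlib-Python illustration only (not the actual Kafka Streams topology from this role, and without any Kafka dependency), here is the shape of a per-key running count; the event data is hypothetical.

```python
from collections import Counter

# Hypothetical event stream standing in for records consumed from a topic.
events = [
    {"vehicle": "A", "signal": "brake"},
    {"vehicle": "B", "signal": "brake"},
    {"vehicle": "A", "signal": "accelerate"},
    {"vehicle": "A", "signal": "brake"},
]

def source(records):
    """Source connector: yield records one at a time, as a consumer would."""
    yield from records

def count_by_key(stream, key):
    """Stateful transform: maintain a running count per key,
    analogous to a KTable materialising an aggregation."""
    counts = Counter()
    for record in stream:
        counts[record[key]] += 1
        yield dict(counts)  # emit the updated state downstream

# Sink: here we simply keep the final materialised counts.
final_counts = None
for state in count_by_key(source(events), "vehicle"):
    final_counts = state

print(final_counts)
```

In a real topology the state store is fault-tolerant and partitioned by key; this sketch only shows the record-at-a-time update flow.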
2014 - 2017
job
Architect / Analyst Developer
Cap Gemini HMRC.
• Designed and developed web applications using Django that parsed Solr queries based on parameters chosen at runtime.
• Used Sqoop and Flume to move data through various landing stages in the Hadoop ecosystem once it had been through a shell-script cleansing process, placing Hive tables over the data where required.
• Developed MapReduce programs in Java to parse raw data, and used Morphlines to perform ETL operations on data before indexing it for Solr.
• Used Pentaho PDI and Informatica to transform data.
• Used Spark and Scala for program development using TDD methods, and for data analysis.
• Used AWS for proof-of-concept development and deployments.
Scala, Web, Solr, Hive, Spark, TDD, Data Analysis, Hadoop, ETL, Django, AWS, Java
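The MapReduce programs mentioned above were written in Java against the Hadoop framework; as a loose stdlib-Python illustration of the underlying map → shuffle → reduce pattern only (not that actual code), here is a word count, the canonical MapReduce example; the input lines are hypothetical.

```python
from collections import defaultdict

# Hypothetical raw lines standing in for HDFS input splits.
lines = ["error disk full", "ok", "error timeout", "ok ok"]

def map_phase(line):
    """Mapper: emit a (key, 1) pair for every token in the line."""
    for token in line.split():
        yield (token, 1)

def shuffle(pairs):
    """Shuffle: group intermediate pairs by key (the framework's job)."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    """Reducer: sum the counts for one key."""
    return key, sum(values)

pairs = [p for line in lines for p in map_phase(line)]
result = dict(reduce_phase(k, vs) for k, vs in shuffle(pairs).items())
print(result)
```

Hadoop parallelises the mapper over input splits and the reducer over key partitions; this sketch runs the same three phases sequentially in one process.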
2002 - 2014
job
Lead Application Architect/Developer
Williams Grand Prix Engineering IT.
• Developed Scala programs using TDD methods.
• Used AWS in development and proof of concept programs.
• Shared responsibility for the administration of Hadoop.
• Created Hive queries that helped with the comparison of car data models
and historical metrics.
• Developed MapReduce programs to parse raw data, populate staging
tables and store refined data in Hive tables.
• Produce high- and low-level designs for numerous multi-threaded applications across desktop, web and mobile, utilising approaches such as TOGAF, UML, OOD and Agile.
• Development of multiple applications utilising technologies such as C#, C++, J2EE, WPF, Windows Forms, Visual Studio, DevExpress, Dundas and SOA.
• Translate business requirements into functional specification highlighting
interactions between system interfaces and application functionalities.
• Technical design authority, ensuring that systems are developed following policies and standards whilst promoting usability of components.
• Liaison with stakeholders including C-level to elicit, analyse, communicate
and validate requirements for changes to business processes, policies and information systems.
• Co-ordinate and contribute to RFP, RFQ and vendor selection processes.
• Translate business requirements into a business process and system
processes using BPM.
• Led various system integration and report-authoring projects utilising technologies including XSDs/WSDL, Oracle, Crystal, Cognos, Perl, C# and MS SharePoint.
• Manage logical design and physical implementation of databases and data-warehouses using Microsoft and Oracle technologies.
• Ensure that functional and non-functional requirements are analysed,
captured, prioritised and validated.
• Design and develop data extraction, transformation and load strategy from
disparate sources into centralised data warehouse using Oracle and Microsoft BI technologies.
• Remove bottlenecks in business processes through transformation and managed upgrade and consolidation of IT infrastructures.
• Manage end-to-end system testing, including analysis of specifications, reviewing documentation, internal/client liaison, regression testing and OAT.
• Built programs that used large datasets and leveraged the Hadoop
ecosystem using MongoDB, where MongoDB was used as a real-time data
store and Hadoop was used for batch data processing and analysis.
• Involved in designing and improving component tracking system.
Testing, UML, TDD, SOA, Hive, Implementation, Transformation, RFP, Development, Web, WSDL, Hadoop, Administration, AWS, Oracle, MongoDB, SharePoint, Agile, Data Warehouse, C#, Design, Regression Testing, Scala, J2EE, Visual Studio, WPF, Perl, Windows Forms
My education
Cranfield University
Software Engineering for Technical Computing
Oxford University
Computing
Greenwich University
Bachelors, Business Studies
Olakunle's reviews
Olakunle has not received any reviews on Worksome.