Hitesh Kotamraju
Reading, United Kingdom
I'm offering
14 years of Information Technology experience.
- Strong data background including Big Data engineering, Oracle, BI and Big Data (Cloudera) spanning 14+ years
- 4 years of Big Data stack experience: the Azure Big Data ecosystem - Data Factory, Data Lake, Databricks, HDInsight - and the Cloudera ecosystem
- Strong experience building data pipelines using Azure Data Factory (ADF), Python and SSIS ETL
- Report building, dashboarding and KPIs using Spotfire and Power BI
- Good experience in programming languages: Python, Java/J2EE and C++
- Extensive knowledge of and experience with the Hadoop ecosystem and architecture (HDFS and YARN) and modern data platforms such as Cloudera
- Strong in Big Data: Hadoop, Hive/Impala and Spark, with good knowledge of Java and Scala
- Experience handling various file formats such as Avro, Parquet, ORC, XML and JSON
- Strong experience in data modelling from conceptual to physical: mapping, database and filesystem design and selection for both structured and unstructured data
- Strong experience in Linux: administration, shell scripting, system configuration and package management
- Solid understanding of SDLC, SDLC governance implementation, TDD and Agile development, using tools such as TeamCity, Atlassian tooling, Jenkins, REST APIs, Git and GitHub
- Understands the team's goals and objectives and how the team plan supports wider business goals
Markets
United Kingdom
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Available
My experience
2018 - ?
job
Senior Data Engineer/Developer for Shell Energy Europe Data Platform
Shell Trading & Supply.
* Built data ingestion pipelines from various external and internal sources using Python/ADF, moving data from on-prem to the Azure cloud - Data Lake Gen1 and Gen2, Blob Storage, Azure SQL Server DB/DW, Log Analytics, Azure Functions and SSIS ETL
* Worked with the business to deliver data ingestion of Exposures, Volatilities and PnL for options and futures for the Emissions Trading Desk using Python, delivering data requests from Reuters for various gas and carbon data points.
* Built pipelines to bring in data points such as call/put options for carbon and gas instruments, liaising with data vendors.
* Worked on various Azure tools including Azure Data Factory, Cosmos DB, Data Lake, Azure SQL Data Warehouse and databases, Azure Databricks, Service Principals and Power BI
* Worked on Azure DevOps - CI/CD integration with Visual Studio for Azure Functions and Azure Data Factory.
* Extensively worked on Azure Data Factory, building pipelines for cross-cloud data movement and integration with Azure Functions running in the public cloud.
* Extensively worked on Azure Databricks (Spark), developing in PySpark with Spark SQL and Databricks Delta
* Good experience of the Azure cloud platform through setting up various resources as part of its SaaS and PaaS offerings
SSIS, Azure, Azure SQL, SQL Server, Azure Storage, Spark, Python, Power BI, DevOps, Visual Studio, SaaS, ETL, Cloud, Analytics, Integration, Development
2018 - 2018
freelance
Big Data Platform consultant
Vodafone Group Services.
For the Cloudera big data strategic solutions RTMT and Newbill
* Engineered the solution integrating various Cloudera CDH components into third-party products for a real-time network monitoring solution, including data ingestion pipelines using Python, Azure Data Factory and Azure Functions (Python)
* Liaised and worked with Cloudera technical support to resolve various problems hindering project progress, related to HDFS, YARN, Kerberos integration and Azure Key Vault.
* Worked on Azure VSTS CI/CD for Azure Functions for data extraction into a hybrid cloud
* Worked on Azure resources including Azure Container Registries and Docker integration.
Python, Docker, Azure, Big Data, Cloud, Integration, Network, Engineering, Support, Monitoring, Hybrid, VSTS
2015 - 2017
job
Developer
Deutsche Bank.
Canary Wharf, London, May 2015 - December 2017
* Worked on configuration and setup of the Cloudera cluster and helped the DevOps team to automate cluster and node deployment.
* Worked on Cloudera CDH components, parcels, Kerberos integration, LDAP, commissioning and decommissioning of nodes, services, role-based access and security configuration
* Extensively worked on building data ingestion pipelines using custom development on AWS - Python, Sqoop and Informatica
* Worked with business owners on requirements analysis, mapping deliverables to business strategy outcomes for optimizing compliance and minimizing regulatory fines
* Data acquisition and extraction, helping applications archive the right data to produce information when required, and applying retention policies for the regulators using traditional ETL and the big data stack
* Developer for the Data Archive platform - a bank-wide strategic archiving solution to collect, cleanse and validate data from the single source of truth and ingest both structured and unstructured data into the Archive, compliant with existing and upcoming financial regulations such as Dodd-Frank, MiFID II and GDPR.
* Worked with the bank's Big Data platform (Cloudera) initiative to leverage its infrastructure for hosting Archive data for Big Data analytics and reporting - Hadoop, Spark, HBase & Hive
* Data acquisition and integration through Informatica ETL; ingested data into Hadoop HDFS leveraging Hive, HBase and Impala for real-time query and data management on the Cloudera platform
* Worked extensively on ETL design using Informatica for data extraction and ingestion into HDFS.
* Worked extensively on building a data archiving platform for active records and real-time analysis using Python and Spark.
* Led a PoC building a generic Data Lake for inactive data and a combination of mutable and immutable data - querying, analytics and reporting.
* Worked with CI/CD pipelines for Java development delivering real-time data ingestion into Hive
* Involved in various Scrum sessions and the practice of Agile, using Jira for Kanban boards
Management, MiFID, Regulatory, Retention, Development, Infrastructure, Hive, Hosting, Developer, Security, Spark, Compliance, Hadoop, Analytics, Kanban, Design, Integration, Deployment, ETL, GDPR, Data Management, Agile, DevOps, Big Data, Jira, AWS, Scrum, Java, Python, Business Strategy
2015 - 2015
job
Big data developer for Enterprise Data lake
Arqiva.
* Consolidated various reporting and analytics platforms and led the build of log file analysis using the ELK stack; worked on Logstash and Kibana
* Hadoop - Avro, JSON, XML; consolidation of data to establish a registered authoritative data source
* Technology and tooling consolidation, eliminating duplicate solutions for data processing and integration
* Data acquisition and integration through ETL; ingested data into Hadoop HDFS leveraging Hive, HBase and Impala for real-time query and data management on the Cloudera platform
* Delivered ETL design and an ETL tooling PoC for enterprise data acquisition for the data warehouse
* Took responsibility for change initiation and managed the delivery of changes from development to production.
Analytics, Processing, Production, Enterprise, Kibana, Development, Hive, Developer, Technology, Hadoop, Design, Management, Integration, ETL, Data management, Data Warehouse, JSON, XML, Big Data
2013 - 2014
freelance
BI consultant
HSBC Bank.
For the HR Reporting solution - a bank-wide solution for employee records management
* Handled various business requirements and provided solutions relating to infrastructure, applications and databases
* Researched, analysed and worked with Oracle technology support on a performance-optimized solution for the Oracle BI reporting tool
* Delivered detailed database design and ETL architecture for the data acquisition and cleansing system
* Managed technology architecture for Development, UAT and Production servers, setup of the Disaster Recovery environment, and testing of automatic switchover from Production to DR
* Provided a detailed service architecture document covering support, SLAs and workarounds for the solution
Design, Oracle, Database, ETL, Database Design, Management, Service, Architecture, Technology, Support, Infrastructure, Testing, Development, BI, Production
2012 - 2013
job
Data Engineer - detailed database design, Informatica design and implementation
Capgemini/EDF.
* ETL development using Informatica: administration, repository management and ETL migration
* Involved in design and configuration setup for Informatica 9 and the Oracle data warehouse; data mappings and ETL mappings using Informatica
* Helped build the data extraction pipeline using Informatica PowerCenter, creating process chains for the EDF data warehouse.
* Delivered detailed database design and ETL architecture for the data acquisition and cleansing system
Design, Administration, Oracle, Database, Data Warehouse, ETL, Database design, Management, Architecture, Implementation, Development
2012 - 2012
job
Lead for BI
Logica Consulting.
BAE Systems, March 2012 - September 2012
* Involved in data modelling, data analysis and reconciliation of migration data on the target database; built the database for ETL tables, Informatica, OBIEE, the OLAP schema and DAC schemas
* Responsible for development changes, post-testing bug-fix promotions and performance tuning
* As Lead for the BI implementation, involved in mentoring team members and functional users on the features of OBIEE 11g and OBIA 7.9.6 pre-built analytics.
* Involved in architecting and providing the solution for Single Sign-On integration across Oracle R12, OBIEE and ERP
Oracle, ERP, Database, ETL, Mentoring, Integration, Analytics, Implementation, Testing, Development, BI, Tuning, OLAP
2008 - 2012
job
BI Developer and Data Integration Analyst
Leicestershire County Council, Network Rail, BBC TVL, Gala Coral and RCUK SSC Ltd.
Worked for various customers including - Leicestershire County Council, Network Rail, BBC TVL, Gala Coral and RCUK SSC Ltd
Key Achievements
Integration, Network, Developer, Analyst, BI
2003 - 2008
job
Oracle DBA/Developer and ERP
TCS India Limited (Hyderabad, India & Philadelphia), Intelligroup Asia Limited (Hyderabad, India & Kuwait) and Apps Associates Pvt Limited (Hyderabad, India).
Key Achievements
* Career started with Apps Associates as Oracle Applications DBA for Oracle ERP 11i
* Worked on system maintenance, installation and configuration of Oracle E-Business Suite
* Helped various business users resolve issues related to General Ledger and Accounts processing in Oracle E-Business Suite.
* Travelled to the USA to the client site, IKON Office Solutions, to gather business requirements and carry out solution scoping and planning to customize and deliver the Oracle Applications ERP suite.
* Travelled to Kuwait to gather customer business information and align it with solution-building discussions, mapping technology to functional deliverables.
* Developed high-volume reporting using a data warehouse and a 7 TB transactional database, plus a high-volume reporting standby database and data warehouse database.
* Extensively worked on exp/imp and Data Pump to migrate data from transactional source databases to a data warehouse, running ETL to populate the OLAP schema.
* Extensively worked on managing the RDBMS and database objects, including indexes, external tables, integrity constraints, and PL/SQL procedures and packages
* Led a small team, providing technical leadership during the implementation and requirements analysis phases of the project
Business sector experience
Banking & Financial Services - Tier 1 investment banks: HSBC, Deutsche Bank
Public Sector - Leicestershire County Council & Research Councils, UK
Consulting - Logica, Capgemini, Oracle, NTT Data (Intelligroup), TCS
Key Achievements
* Career started with Apps Associates as Oracle Applications DBA for Oracle ERP 11i
* Worked on system maintenance , installation and configuration of Oracle E-business suite
* Helped various business users in resolving issues related to General Ledger, Accounts processing in Oracle E-business suite.
* Travelled to USA to the client site - Ikon office solutions for gathering business requirements, solution scoping and planning to customize and deliver Oracle applications ERP suite.
* Travelled to Kuwait to gather customer business information and align it to the solution building discussions, technology mapping with functional deliverables.
* Developed high-volume reporting using Data warehouse and transactional database of 7 TB size and high-volume reporting stand-by database, Datawarehouse database.
* Extensively worked on exp/imp and data pump to migrate data from transactional source databases to a Datawarehouse to run ETL for populating OLAP schema.
* Extensively worked on managing RDBMS , database objects including indexes, external tables, integrity constraints, PL/Sql procedures and packages
* Team lead a small team to provide technical leadership for implementation, requirements analyses phases of the project
Business sector Experience
Banking & Financial Services - Tier 1 investment banks- HSBC, Deutsche Bank
Public Sector - Leicestershire City Council & Research Councils, UK
Consulting - Logica, Capgemini, Oracle, NTT Data (Intelligroup), TCS
IT, Oracle ERP, Oracle DBA, Oracle E-Business Suite, Processing, OLAP, Public Sector, Implementation, SQL, PL/SQL, Banking, Technology, Leadership, Consulting, ETL, Data Warehouse, Database, ERP, Oracle, Research
My education
? - 2017
Stanford University
N/a, N/a
? - 2004
Acharya Nagarjuna University
Masters, Computer Science
? - 2002
Acharya Nagarjuna University
BSc, Mathematics & Statistics
? - 2001
n/a
Unspecified, CMC Ltd
Hitesh's reviews
Hitesh has not received any reviews on Worksome.