Experienced IT professional in Cloud Architecture
Martins Ainabe, Croydon, United Kingdom
I'm offering
A dedicated IT professional with 17+ years' experience and a proven track record of success in multiple industries. With the perfect combination of technical knowledge and business acumen, I seamlessly bridge the gap between IT and the business. An authoritative and enthusiastic 'go-getter', always prepared for a new challenge with a continuously positive attitude. Seeking a senior IT architecture or consultancy role within a dynamic and progressive organisation.
Markets
United Kingdom
Language
English
Fluent
Ready for
Larger project
Ongoing relationship / part-time
Full time contractor
Available
My experience
2018 - ?
job
Lead Cloud Data/Solution Architect
unknown.
Duration - 1+ years
Tel - +447961790527
Email - [email protected]
Linkedin - www.linkedin.com/in/martins-ainabe-0378a07
Job Description - Provided the technical architectural design for the migration of the legacy Reporting and Analytics platform onto Google Cloud Platform (GCP) and Amazon Web Services (AWS). Delivered real-time streaming, transformation and data visualisation capabilities. Architected HLDs, MLDs and LLDs to facilitate successful business-to-business collaborations between Sky and Apple, Netflix, BT and Vodafone. Created strategic architecture roadmaps for the implementation of advanced analytics and artificial intelligence. Created a 360-degree data view of the Sky customer, merging data across multiple source systems, CRMs (Salesforce, Chordiant, Kenan, MS Dynamics) and ERPs (SAP), to give the business intelligence teams the insight to perform accurate analysis on customers. Architected efficient data delivery across on-premise and cloud systems (hybrid cloud) to ensure improved data usage. Managed teams of architects, developers and testers to enable the build and release of solutions into production. Used Agile methodology with Scrum squads and Sprint iterations to ensure quick and efficient delivery of capability to the business.
Built an AWS data lake to capture data using Kinesis Firehose & Streams, loading the data into an AWS S3 bucket. Used AWS Glue for cataloguing and searching the data entities, using it to build a data landscape of the data lake and creating an API to expose the catalogue to the business. Exposed certain areas of the data catalogue to Experian to enable reference data to be exchanged dynamically. Performed data processing on Amazon Redshift. Created an enterprise data warehouse using a Kimball model to combine and enhance viewing data. Used Amazon Elastic MapReduce (EMR) to manage the dilapidated existing HDFS environments, replacing them with S3 buckets. Ran Tableau reports from the AWS Redshift and Hive data warehouse data marts. Secured the AWS data lake with AWS IAM, KMS, CloudTrail and CloudWatch to ensure that data is secured at rest and in transit, and ensured the security models for data prioritisation across the Sky landscape were maintained.
Architected elegant solutions to complex challenges spanning multiple systems. Shared strategic architecture artefacts with other departments. Worked with multiple teams across the company in different areas, collaborating with technical and business owners and solving complex challenges that have had a huge impact on Sky customers across Sky Mobile and Campaigns. Built high-performance, reliable systems in a complex, multi-tiered, distributed environment. Used computer science fundamentals in data structures, algorithms, object-oriented design and complexity analysis to architect solutions. Provided solution designs in Service-Oriented Architectures (SOA).
Streaming, Algorithms, Architecture, Security, SOA, Implementation, Transformation, Hive, Amazon, Web, Service, Google, Redshift, Science, Production, Enterprise, Kimball, Processing, Hybrid, Amazon Web Services, Scrum, API, Business Intelligence, AWS, CRM, SAP, ERP, Artificial Intelligence, Agile, Design, REST, Data Warehouse, Cloud, Salesforce, Tableau, Linkedin, Web Services, Analytics
2018 - 2018
job
Data Solution Architect
HMRC.
Duration - 4 months
Job Description - Created the Data Flow Diagrams (DFD), data models and Entity Relationship Diagrams to illustrate how data was transferred from on-premise Oracle, SQL and Netezza databases to the AWS data lake. Used Kinesis Firehose to load batch data. Designed the AWS data lake to store all of the data from the on-premise credit systems, resolving issues of scalability, flexibility and resilience. Used AWS Redshift to build the snowflake schema for the enterprise data warehouse. Created a live data-stream border alerting trigger using AWS Analytics to search through the border inventory tracking system and alert when certain items had been identified. Designed these AWS architectures to address customer business problems and accelerate the adoption of AWS services in the business. Designed one-off data loads for the transfer of historical data from deprecated on-premise systems onto AWS S3 storage. Used Amazon Elasticsearch to create extensive search logic for identifying keyword search strings for high-availability commodities. Defined and architected solutions using distributed architectures, including .NET, J2EE, CORBA, .NET web services, Axis Java web services and Globus Grid services. Gave presentations to key stakeholders on the AWS architecture proposed for the HMRC VAT post-Brexit upgrades on an AWS platform. Exceptional presentation skills, with a high degree of comfort speaking with executives, IT management, business stakeholders and developers. Excellent communication skills, with an ability to have the right level of conversation with stakeholders at all levels and to adapt to and learn new technologies quickly. Used common enterprise services (Directory Services/Active Directory, Information Assurance, Virtual Desktop, etc.) and products such as Oracle and SAP. Used and implemented IT frameworks such as ITIL, Zachman and TOGAF.
KOHI Engineering
Executive Director
London
J2EE, Enterprise, Scalability, Search, Redshift, Storage, Web, Amazon, Implementation, It, Net, Architecture, Engineering, It management, Design, Analytics, Management, Web Services, Word, Data Warehouse, Active Directory, ITIL, SAP, Oracle, AWS, Java, Sql
2017 - 2017
job
Business development
Total and Shell.
January 2017 - December 2017
Duration - 1 year
Job Description - Board-level decision making, targeting some of the world's largest oil and gas companies, such as ExxonMobil, Total and Shell. Business development and new business acquisition, performing presentations to potential clients and giving an insight into products such as 3D laser scanning and Big Data platform solutions.
Performed 3D laser scanning and virtual reality augmentation, creating a unique perspective on asset tagging for medium to large organisations. Introduced AWS architecture for large government projects to load data from disparate locations across the country onto a single platform, in order to accurately measure oil & gas, liquefied natural gas and other commodity levels tracked on disparate systems. Used the Kinesis Streams live-streaming capability to load data from different systems onto a single data lake, then used Amazon QuickSight to present reports back to senior management. The QuickSight dashboards were used to promote the business intelligence to the clients. Introduced new ETRM commodities trading capabilities to clients. Performed 3D laser scans of buildings, power plants, etc., converted them into virtual reality objects, and provided annotation with content-specific information with 24/7 availability via an online portal. Managed projects for clients to deliver digitised assets with to-the-millimetre accuracy. Created 3D and 4D virtual reality platforms for client use. Defined, created and maintained the Enterprise Architecture models and artefacts within Erwin. Created an Enterprise Continuum and the data and information architecture views. Managed project teams that created solution designs to personalise the customer experience of the virtual reality objects, and project teams that created Point Cloud, intelligent 3D models, UAV mapping, CAD services and GIS data 2D as-built/as-is documentation of assets. Excellent communication skills, building relationships with senior-level management and key stakeholders.
Department for Work & Pensions (DWP)
Enterprise Data Architect
Sheffield
February 2016 - December 2016
Duration - 10 months
Job Description - Introduced Data as a Service (DaaS) on AWS and HDFS (Cloudera) platforms; created the strategic & target data architecture, data principles, data governance, enterprise data modelling, digitisation, the architecture repository and Data & Analytics services. Designed an AWS big data lake, using an architecture that supported real-time and batch data loading with AWS Kinesis (Streams & Firehose), creating a landing area on several S3 buckets. Processed data using AWS Lambda for script-based calculations, for algorithms that would calculate pension payments for all the different permutations of people categories in the UK. Also used AWS Redshift and Hive to build enterprise data warehouse models using Inmon- and Kimball-inspired designs. Provided the design of a platform to house the department's Reporting and Analytics capabilities, using government-approved secure cloud providers. Used Agile methodology with Jira and Confluence to ensure a continuous integration pipeline for production releases at regular but controlled intervals, using Scrum and Kanban. Used detailed knowledge of Microsoft products to provide technical insights to accelerate the adoption of Microsoft workloads onto the AWS platform, and used technical knowledge of the investments in Microsoft technologies to advise the business on an architectural approach to move their workloads to the cloud. Provided advice as an enterprise architect to help the business understand the value proposition of the AWS platform and to influence key decision makers in shaping decisions on adopting Microsoft technology on the AWS platform. Responsible for accelerating the business's adoption of Microsoft workloads on AWS by informing their understanding of best practices for Microsoft technologies on AWS, and for presenting the technical benefits of Microsoft-on-AWS capability to internal stakeholders. Designed and managed large SQL Server, Oracle and SAS implementations. Knowledge of AWS services including Elastic Load Balancing, Amazon CloudFront, security management, ElastiCache and Amazon RDS.
Continuous integration, 3D, Security, Natural, Confluence, Implementation, Hive, Presenting, Development, Amazon, Load balancing, Streaming, Server, Redshift, Architecture, Lambda, Pension, 2D, Information Architecture, Production, Enterprise, Oil and Gas, Online, Performing, Calculations, Processing, Kimball, Portal, CAD, Business development, Sql, Scrum, Business Intelligence, SQL Server, AWS, Oracle, Jira, Big Data, Customer experience, Agile, Data Warehouse, Cloud, Design, SoMe, Linkedin, Management, SAS, Integration, Virtual reality, Kanban, Analytics, Content, Service, Gis, Algorithms, Technology
2015 - 2016
job
Lead Data Architect
Cooperative Bank.
Duration - 7 months
Job Description - Responsible for an enterprise-wide initiative to improve the data architecture footprint in the bank by implementing a Teradata & Netezza data warehouse using the industry-standard Financial Services Logical Data Model (FSLDM) as a guideline. Created the architecture framework and data governance processes to enable current and future projects to adhere to a common Subject Area Model, conceptual-level data model and logical-level data model. Responsible for the creation of architectural artefacts for the Product Review project, a supply-to-customer-level audit for the Financial Services Authority (PRA). Created enterprise-level Data Flow Diagrams to show data sources, data integration and data destination points, and CDM and LDM diagrams as supporting information to verify control points within the bank. Defined the data strategy for delivery of an enterprise data governance framework using metadata management tools. Data strategy and architecture principles were defined by specifying the fundamental must-haves for data within the bank, using the Zachman framework. Presented the data strategy to the major and minor stakeholders.
Audit, Data management, Data Warehouse, Linkedin, Management, Integration, Architecture, Teradata, Enterprise, Audit, Framework, Processes
2014 - 2015
job
Lead Data / Solution Architect
HSBC.
Duration - 10 months
Job description - Architected an Enterprise Architecture strategy covering Global Banking and Markets (GBM), Commercial Banking (CMB), Global Private Banking (GPB), Retail Banking, Wholesale and Retail Credit Risk, Pensions and Retail Banking Wealth Management (RBWM), across 37 countries at regional and group level, captured onto a single platform for stress testing. Reviewed and documented business requirements and technical specifications for MiFID II, eTrading and Fixed Income (Bonds), evaluating enterprise impact across all banking products and technology. Utilised HDFS tools such as Hive, MongoDB, Pig, Platfora, SAS and SAS Grid. Solutioned advanced analytics and visualisation tools like Tableau and R Studio to perform calculations for Projections, Actuals and IFRS 9 across all risk types and business sectors in the bank. Presented solution proposals to stakeholders for roadmapping and acquired sign-off from the key stakeholders to implement change. Designed dynamic automated data submission processes for financial regulatory (FDSF) data submissions for GST to satisfy government PRA & CCAR requirements. Created solution designs for dynamic calculations across Commercial Real Estate (Interest Cover Ratio) and Collateral Secured Percentage. Solutioned high-volume data transfer and data management to resolve data duplication, data cleansing, data consistency, data referencing and metadata management. Organised workshops to elaborate the architecture to key stakeholders. Performed a gap analysis on the current-state architecture and designed solutions for balance sheet reconciliation, regulatory model calculation and model creation using SAS models. Designed the data integration between the core stress-testing applications: Forecast RCT, Moody's, QRM, Mackraken & RERT. Created the solution design for the migration from the legacy Oracle platforms to Big Data platforms. Created the Architecture Definition Documents for the strategic architecture and for the Consolidation and Reporting Tool (CART). Created, reviewed and signed off the programme-level data flow diagrams and the conceptual, logical and physical data models.
Architecture, MiFID, Processes, Calculations, Forecast, Wholesale, Enterprise, Mongo, Regulatory, Testing, Hive, Workshops, Banking, Design, Technology, Analytics, Integration, SAS, Management, Tableau, R, Data management, Big Data, Oracle, Retail
2014 - 2014
job
Lead Data Architect
Barclays Bank.
Duration - 6 months
Job Description - Architected the data solution for Electronic Asset Management and Software Management. Designed a data warehouse to collate data from all systems for Software Asset Metering (SAM) and Hardware Asset Metering (HAM). Documented the architectural data landscape for the department. Designed conceptual, logical and physical data models of the existing databases. Troubleshot data duplication, excess data volumes and poorly performing queries on their Microsoft SQL databases. Documented, designed and performed advanced database tuning and optimisation on their databases. Performed a viability analysis for the adoption of metadata management & master data management, liaising with third-party vendors for the metadata and data quality zones: Trillium, Embarcadero, IBM InfoSphere and Informatica metadata management. Delivered presentations to key stakeholders to make decisions on the implementation. Integrated Software Asset Management for Barclays Africa using ServiceNow; created architectural designs, data flow diagrams, data mapping and functional specification documents. Delivered presentations to key stakeholders for sign-off of the tactical and strategic approach. Provided detailed solutions to database-specific issues such as database growth and server upgrades for the databases, managing very large databases of 7+ terabytes. Redesigned the SAM reporting dashboards to read off the data warehouses and data marts. Designed a strategic Big Data solution to resolve performance issues on high-data-volume, high-traffic databases. Managed teams of developers across multiple locations.
Master Data Management, Tuning, Server, Software, Hardware, ServiceNow, Data mapping, Implementation, Growth, Data quality, Sql, Asset Management, Management, Linkedin, Data Warehouse, Database, Microsoft SQL, Data management, Big Data
2014 - 2014
job
Lead Data / Solution Architect
Career Sabbatical.
London
May 2013 - December 2013
Duration - 8 months
Job Description - Liaised with business and technical stakeholders to deliver the Australia Prescription Based Services (PBS) architecture, producing High Level Designs (HLD) and Mid Level Designs (MLD). The solution was based on delivering large volumes of prescription, customer and marketing data from Netezza to MS SQL. Also translated the marketing algorithms defined by the statisticians into a component-based logical design, allowing customer-to-pharmaceutical-product geolocation algorithms to be calculated dynamically. Documented and defined business semantics: vocabularies, taxonomies and hierarchies. Presented and communicated diplomatically to mixed audiences on complex issues. Delivered an information management governance strategy for the implementation of a complex Business Intelligence programme using TOGAF methods. Implemented environment-level Change Management processes and incorporated them across the different environments: Testing, Integration, User Acceptance Testing and Production. Provided detailed Technical Architecture Documents (TAD), including business process flow and data flow diagrams. Created the Technical Specification Document from analysis of the business requirements. Delivered Low Level Design (LLD) documentation to build and deploy the SSIS ETL solution. Team Lead for developers across development, analytics and testing: managed a 15-member team of senior and junior .NET developers and senior MS SQL BI stack developers based in the UK, Australia and India. Used Agile to manage the teams in scrums. Troubleshot and unit-tested developer code using smoke and build tests.
Pharmaceutical, Processes, Production, BEE, Development, Testing, Implementation, Developer, Net, Ssis, Architecture, Marketing, Algorithms, Analytics, Integration, Management, ETL, Agile, Business Intelligence, Change management, Sql, Design
2010 - 2011
job
Technical Data Lead
Royal Bank of Scotland.
Duration - 1 year 4 months
Job Description - Technical Lead on a high-profile account-switching project to transfer customer accounts from RBS to Santander. Responsible for capturing and analysing user requirements, creating a gap analysis to represent the 'As Is' and 'To Be' status, and securing sign-off. Created Functional Specification and Technical Specification documents. Designed the solution architecture for the project and developed the end solution, using SQL 2008 and SSIS as the core system to migrate and transform data between CSV, text, SQL 2000, SQL 2008 and Sybase 12.5 databases. Used SSIS 2008 to load information from an SFTP location and transferred data from the Potter and Sapphire systems into a data mart. Transformed the information using derived business logic in SSIS to update Potter (the target system). Landed the data in the target WallStreet system, changing account details and consolidating trades accurately. Used Autosys as the scheduler for SSIS and Argon as the delivery mechanism to WallStreet. Used Subversion 6.5 for source code management. Created test cycles which ran through Development, SIT, QA and Production whilst adhering to the quality gates. Created metadata-driven reports to control the data transfer mechanism.
Sql, Solution architecture, Subversion, Management, Test, QA, Architecture, Ssis, Development, Sybase, Production, Logic, Tech lead, Sftp
2009 - 2010
job
Data Analyst
Royal Bank of Canada.
Duration - 10 months
Job description - Worked as a Data Analyst on a high-profile information security project to perform a company-wide software application attestation for delivery to the Financial Services Agency. Performed data analysis on 30 applications to create a High Level Design of how the attestation would work, then automated the data delivery process using Microsoft SSIS 2005 as the ETL tool and SSRS as the reporting tool for dashboards. Created the data warehouse to encompass the security structure of all 30 applications, normalising data to third normal form (3NF), and created a security model from the data. Created SSIS packages to extract data from source systems in multiple data formats (flat files, text, Excel, PDF and CSV) delivered onto an FTP location. Created VB scripts to extract data from the Global Directory and Active Directory into the data warehouse. Used SSIS components (SQL tasks, ForEach loops, Lookups, Multicasts, Aggregates, Pivot, Unpivot and Script components) to manipulate the data. Used Subversion for source code control and worked with a team of developers. Developed a Proof of Concept application to represent the attestation process.
Pearson Education
Data Analyst
London
September 2008 - November 2009
Duration- 1 year 3 months
Job Description - Led the business analysis and data transfer for a global implementation of the new web Campaign Management and CRM tool. Analysed legacy systems and databases for 12 countries. Determined data volumes, database capacity and costings. Performed gap analysis on existing business applications and the proposition going forward. Evaluated and documented business requirements and created functional requirements. Designed and maintained the architecture of the data mart: database, tables, indexes, constraints, views and stored procedures. Developed SQL 2005 SSIS packages for data migration to relocate data from legacy systems to the data mart. Used Agile methodology for development, performing weekly scrums, stories etc. Used advanced Transact-SQL (T-SQL) to dedupe, using stored procedures and cursors within the SSIS packages. Built advanced error handling into the SSIS packages. Created machine-independent packages and deployed them into Dev, QA and Production. Performed backups, restores, replication and data tuning. Data quality
assurance testing on the data mart. Created release documentation, technical specifications and detailed review presentations as an audit record for handover.
Software, Ssis, Security, Data quality, SSRS, Implementation, Analyst, PDF, Testing, Development, Architecture, Web, Campaign, Production, Audit, Tuning, Performing, Vb, Led, Database, Sql, Excel, Data Analysis, CRM, Business Analysis, Active Directory, Agile, Design, ETL, Information Security, Subversion, Linkedin, Management, T-SQL, QA, Stored procedures
2007 - 2008
job
MIS Database Analyst
Lehman Brothers.
Duration- 1 year
Job Description - Worked within the Management Information Services (MIS) team in the Mortgage and Insurance Services department. Maintained existing MS SQL and MS Access databases for the Commissions and MIS teams. Upgraded and migrated Access databases to SQL, and SQL to UDB data warehouses on UNIX. Migrated Business Objects XI R2 reports and the database from the test server to the live server. Automated the monthly transfer of CAIS data from Lehman to Experian and Equifax. Migrated monthly CAIS reports (flat files) to SQL, with a .NET web application for customer credit history. Automated the Delphi Customer Management (DCM) feed from Experian, transferring DCM data from flat files to a UDB database using Informatica. Worked with Lloyds TSB, Experian and Equifax to build customer credit scores, using internal calculations to create customer credit ratings and ensure accuracy. Used advanced SQL to create ad hoc reports. Implemented change control processes using JIRA, TAC and Remedy.
Dresdner Kleinwort Benson (Jersey, UK)
Database Administrator
London
June 2014 - November 2014
Duration - 6 months
Job Description - Maintained maximum uptime of all 180 SQL Server databases in the region. Responsible for all aspects of DBA administration: monitoring, backup (hot and cold backups), recovery, performance tuning, user roles, permissions, service pack application, documentation etc. Created and maintained Disaster Recovery plans and ensured maximum uptime of the DB Mail server and other priority 1 databases. Regularly updated security passwords. Troubleshot problems as they arose and resolved them. Created complex SSIS jobs for the migration of data from Sybase to MSSQL for the migration of HR CRM platforms. Wrote complex SQL statements to query data and manage data duplication across large volumes of data, including data partitioning and indexing. Replicated existing SQL ETL processes for the Development Team using deployment script routines and provided face-to-face liaison with clients when needed. Informed relevant Development Teams of issues affecting production processes. Ensured adherence to database best practices, using SQL*Plus, Toad and Oracle Enterprise Manager.
Server, Access, Analyst, Remedy, Monitoring, Development, Insurance, Sybase, Web, Net, Administrator, Production, Enterprise, Tuning, Calculations, Manager, UP, Processes, Database, .Net, Administration, CRM, Oracle, Jira, MSSQL, Deployment, Unix, Sql, ETL, Management, Test, Service, Ssis, Delphi, Security
2006 - 2007
job
Database Administrator
Royal MENCAP Society.
Duration - 1 year and 4 months
Job Description - Maintained maximum uptime of all SQL Servers, with all aspects of DBA administration: monitoring, backup, recovery, performance tuning, user roles, permissions, service pack application, documentation etc. Created and maintained Disaster Recovery plans and ensured maximum uptime of the DB Mail server and other priority 1 databases. Regularly updated security passwords. Troubleshot problems as they arose and resolved them. Created complex SSIS jobs for the migration of data from Sybase to MSSQL for the migration of HR CRM platforms. Wrote complex SQL statements to query data and manage data duplication across large volumes of data, including data partitioning and indexing. Replicated existing SQL ETL processes for the Development Team using deployment script routines and provided face-to-face liaison with clients when needed. Informed relevant Development Teams of issues affecting production processes. Ensured adherence to database best practices. Completed the Business Objects Universe Expert course in order to create and maintain the Business Objects universes for reporting.
Security, UP, Processes, Tuning, Production, Administrator, Server, Sybase, Development, Monitoring, Sql, Ssis, Service, Linkedin, ETL, Database, Deployment, MSSQL, CRM, Administration
2005 - 2005
job
Development Analyst
Taylor Francis and Informa Group.
Duration - 6 months
Job Description - Built complex ETL processes to manage customer data migrations from legacy systems onto Oracle 9i CRM and ERP systems. Used complex PL/SQL scripts, shell scripting, SQL*Loader and an enterprise data management tool on Oracle to load and store large volumes of data.
Reed Exhibitions
Database Executive
London
May 2004 - June 2005
Duration - 1 year and 1 month
Job Description - Extracted, loaded and transformed large volumes of customer data from marketing campaigns run for exhibitors. Sorted customers into different groups: B2B, B2C, exhibitors and visitors. Used Microsoft SQL 2005 to manipulate and dedupe large volumes of data, and worked with OLAP cubes to categorise data. Wrote complex SQL: INNER joins, LEFT OUTER joins, FULL OUTER joins, UNIONs, INTERSECTs, stored procedures, functions and variables.
Scripting, Processes, OLAP, Enterprise, Development, Analyst, Management, ETL, Database, Marketing, Data management, ERP, B2C, B2B, Oracle, CRM, Writing, Sql
2002 - 2004
job
Data Executive
Sample Answers Limited.
Duration - 2 years
Job Description - Collected and stored market survey information within databases. Ran simple MS SQL queries to extract lists that were sold to customers, and saved the lists as a record of what was sold. Used Excel to create complex formulas to dedupe large volumes of data.
Sql, Excel
My education
2001
-
2002
Brunel University
MSc, Information Systems and Computing
1998
-
2001
University of Kent at Canterbury
Bachelors, Relations and Human Resource Management
Martins' reviews
Martins has not received any reviews on Worksome.