SAS DI Developer and Senior SAS Programmer
Pavan Koppula
Leeds, United Kingdom
Experience
Other titles
Skills
I'm offering
• Extensive SAS experience on Unix, mainframes and Windows operating systems
• Around 14 years of IT experience using SAS BI Suite
• Strong experience in design, development of complex SAS code, testing and implementation of projects
• Excellent experience in liaising with Customers, Business analysts and external stakeholders
• Proficient in understanding requirements and translating them into technical specs/code; able to work independently
• Involved in peer review and sign-off of design, release notes, code
• Strong experience in impact analysis, debugging/analyzing issues and effort estimation for projects
• Seasoned SAS Mainframes programmer
• Strong numerical skills; scored 95% in academics. Adapts quickly to working across multiple technologies
• Security vetting: Security Check (SC) level, to work for HMRC (UK tax office)
• Worked in the financial sector (banking, credit cards, insurance, HMRC UK tax) as well as the automobile, manufacturing and telecom sectors
Markets
United Kingdom
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Available
My experience
2017 - ?
job
Ops SME / SAS DI Developer
Lloyds Banking Group.
Company: Lloyds Banking Group / Leeds
Technologies: SAS DI Studio 4.9, Enterprise Guide 6.1/7.1, Management Console 9.4, Flow Manager 8.1, SPDS Cluster, UNIX, ALM, JIRA, Base SAS, Macros, Teradata, SQL
Description:
* MBNA - Loans (IFRS9, BCBS, Capital & Impairment)
◦ Key person in analyzing the end-to-end impacts across the entire Lloyds Lemans estate, from data ingestion
into the platform through to the creation of reports/models by the BCBS, IFRS9, Capital & Impairment and FinRep systems
◦ Engaged with stakeholders to understand requirements and analyze complex data flows between systems
◦ Assessed the impacts to the raw data layer, loan account management marts and FinRep logical models
◦ Investigated replacing the customer data store with a strategic customer_exposure data source for C&I
◦ Worked on Loans reporting data pool jobs, MIF jobs, MOF jobs, and model jobs such as PDFL_TC (Probability of Default Forward Look Transfer Criteria), PD (Probability of Default: lifetime, 12 months, forward look), LGD (Loss Given Default),
NID (Not in Default), IDC (In Default Collections), IDR (In Default Recoveries) and EL (Expected Loss) models
◦ Once impacts were finalized, commenced build followed by route-to-live activities, business testing
and live implementation
◦ Worked on IFRS9 models and BCBS jobs
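The model names above map onto the standard expected-credit-loss decomposition used in IFRS 9 and Basel reporting, EL = PD × LGD × EAD. A minimal illustrative sketch (the figures are invented for the example, not data from these projects):

```python
# Standard expected-credit-loss decomposition: EL = PD * LGD * EAD.
# Figures below are illustrative only, not from the engagement described above.
def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """PD: probability of default, LGD: loss given default (fraction of
    exposure lost), EAD: exposure at default (currency units)."""
    return pd_ * lgd * ead

# A 25% default probability and 50% loss severity on a 10,000 exposure:
el = expected_loss(0.25, 0.5, 10_000)
print(el)  # 1250.0
```

The PD/LGD model jobs listed in the bullet each estimate one factor of this product for a segment of accounts.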
* OMDM Lex, Connect Bureau and Bureau Hub (Lex Auto)
◦ Worked on small change requests and implemented them into Live
◦ Worked as an Ops SME performing all route-to-Live activities, including signing off design documents, the Operations
Handbook and release catalogues, and making recommendations on design
* Credit Bureau DI Development/Impacting/Remediation (Experian/Equifax/Call credit)
◦ Performed analysis of the entire Lemans estate to plan and understand the run sequence and scheduling of jobs in Flow
Manager; also remediated impacted jobs
◦ Built CBR jobs to pull and transform data directly from the main source, in order to own the code/data, gain full
data processing control and avoid errors propagating from intermediate systems. The result was faster turnaround when applying fixes and the removal of undesired dependencies on intermediate systems
◦ Fixed defects and analyzed data flows to resolve issues; the jobs are highly complex, involving multiple sources
◦ Built DI jobs to pull SIRA fraud data for PCA, Cards and Loans
* Credit Bureau reporting (CBR - External Bureau)
◦ Worked on RTL implementation of around 22 CBR (DM9) jobs through CIT, UAT, SIT and Live
◦ First run of the CBR jobs completed successfully after Live implementation
◦ Reviewed documents and DI jobs to ensure they adhere to Lemans standards
◦ Removed descoped jobs from Flow Manager
◦ Performed analysis to identify mainframe files to be decommissioned from BAU
* CARS DM9 Remediation/Credit Risk (Collections and Recoveries)
CARS (Collections and Recoveries Simplification) is part of the E2E Simplification analysis for Lloyds Banking Group. The CARS programme aims to rationalize the Business Operational Model by consolidating the Collections and Recoveries debt management platforms from 14 down to a single strategic platform: the FICO product, Debt Manager 9 (DM9). On this programme, worked as a Developer building complex SAS DI jobs and as an Ops SME performing all Route to Live activities from dev through CIT, UAT, SIT and Live implementation. Responsible for reviewing and signing off design documents, release catalogues and the ops handbook.
◦ CARS is a very complex system; independently performed impacting of the entire Lemans SAS DI estate to identify
impacted systems/jobs for remediation. The project was delivered without any flaws
◦ Systems identified for remediation were BCBS, ulgd, Copenhagen indicator, Asset Impairment and DFTCAM
◦ Immediately picked up the development work for the Asset Impairment and ulgd changes, all of which went Live
◦ Worked closely with the AI PCA business, helping them analyze questions raised during UAT
◦ Involved in the design of lookup tables for router codes and plan types, enabling reverse lookup of DM6 from DM9
* CARS DM9/Credit Risk (Collections and Recoveries)
◦ Worked as Ops SME to review and perform RTL activities for more than 110 new CARS DM9 SCD2 jobs.
Performed comprehensive testing to identify issues (control tables, SCD tables etc.) and logged them in the defect tracker
◦ Created flows so the jobs can run iteratively without the need for loop jobs to load historic data
◦ Reviewed documentation (design, ops handbook, release catalogue, mapping) for signoff
◦ Prepared SIT scripts for more than 110 CARS DM9 jobs
2016 - 2017
job
Senior SAS Data Integration Developer
The Automobile Association.
Company: The Automobile Association
Technologies: Management Console 9.4, DI Studio 4.9, Oracle Exalytics, Platform process manager (Flow Manager), UNIX, materialised views, SVN, sql developer for Oracle, Python
Description:
Developed sales/adjustments mart tables from the Patrol & DSF sources. This project replaced the previous system of reading flat files sourced from PMID.
* Sales/Adjustment
◦ Responsible from requirements gathering through to Live go-live in production
◦ Produced job specs and release notes, and promoted metadata from dev to UAT1 to Live environments
◦ Developed snapshot, detail and mart layers, finally pushing through to Oracle Exalytics tables
◦ Prepared views, refreshed materialized views, and created the materialized view logs required for Exalytics tables
◦ Amended flows in Flow Manager (PPM), set up triggers and analyzed existing jobs
◦ Used various transformations such as merge and SCD2 (including copying a massive source SCD2 table to an SCD2 target), and user-written
transformations such as generate-key, pass-through proc sql, table loader, extract and sort
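The SCD Type 2 loads referenced above follow a standard pattern: when a tracked attribute changes, close the current dimension row and insert a new version with fresh effective dates. A generic sketch of that pattern (column names like `valid_from`/`valid_to` are hypothetical, not the actual DI Studio transformation):

```python
from datetime import date

# Minimal SCD Type 2 merge: close the current row when a tracked attribute
# changes, then insert a new version effective from `load_date`.
HIGH_DATE = date(9999, 12, 31)  # sentinel "open" end date

def scd2_merge(dim_rows, source_rows, key, tracked, load_date):
    """dim_rows/source_rows: lists of dicts; key: business key column;
    tracked: columns whose change triggers a new version."""
    current = {r[key]: r for r in dim_rows if r["valid_to"] == HIGH_DATE}
    out = list(dim_rows)
    for src in source_rows:
        cur = current.get(src[key])
        if cur and all(cur[c] == src[c] for c in tracked):
            continue  # no change: keep the current version
        if cur:
            cur["valid_to"] = load_date  # close the old version
        new = dict(src)
        new.update(valid_from=load_date, valid_to=HIGH_DATE)
        out.append(new)
    return out

dim = [{"acct": 1, "status": "OPEN",
        "valid_from": date(2024, 1, 1), "valid_to": HIGH_DATE}]
src = [{"acct": 1, "status": "CLOSED"}]
rows = scd2_merge(dim, src, "acct", ["status"], date(2024, 6, 1))
# rows now holds the closed OPEN version plus a new open CLOSED version
```

The DI Studio SCD Type 2 Loader transformation implements this same close-and-insert logic declaratively against the target table.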
* Training Skills Matrix
◦ Produced job specs, release notes and related scripts
◦ Developed data mart from Oracle training tables
◦ Built DI jobs and promoted jobs from dev to test to UAT1 to Live
◦ Built flows in LSF and loaded data onto Oracle Exalytics tables
* Ad hoc tasks
◦ De-duplicated production Oracle tables
◦ Worked on the design and build of the training skills database
◦ Fixed production tickets/issues
2011 - 2014
job
Senior SAS Analyst
RIS (Capgemini).
Company: Capgemini/HMRC
Technologies: UNIX, Enterprise Guide 5.1, BASE SAS, Macros, SAS Management Console 9.3, LSF Platform Process Manager, DI Studio 4.6, Batch Server configuration, Maestro, Stored Processes, SAS/Access for Oracle, Setting up SQL, Oracle and Z/OS DB2 servers in Management Console, SQL, Solaris to Linux Server migration, Data migration, SAS Visual Analytics(SAS VA) 5.1
Description:
Connect is a high-profile project for HMRC (Her Majesty's Revenue and Customs) which cross-matches over a billion
pieces of data to enable HMRC to segment taxpayers according to their behavior and past relationship.
Prepared high-level designs and release management notes, and deployed the components for the brand-new
projects below. Connect received the public-sector Project of the Year award in 2012. Involved in code and data migration for the SAS upgrade and server upgrade.
* VAT Network Build - DI Studio [ ETL ]
◦ Owned the development and design of DI Studio project
◦ Liaise with Detica and BI upstream to work around the development plan and timelines
◦ Used various transformations like user defined transformations, Control Loop, Lookup, Table loader, SCD
Type2 loader, file reader, file writer, data validation, sql, compare, and splitter
◦ Created, deployed and scheduled jobs using DI Studio, SMC and PPM
◦ Register metadata for Oracle tables, jobs, delimited files and SAS datasets
* Merchant Acquirers, Vision, Vies, Nova, Data Metrics, NOVA, TPSS - Enterprise Guide, SAS Batch
◦ Delivered brand new projects such as Merchant Acquirers, Vision, Vies, NOVA(Notification of Vehicle Arrivals), data metrics (management information), TPSS (The Pension Self Service)
◦ Proposed solutions, HLD, Maestro schedules for change requests and brand new projects
◦ Followed SDLC to design, develop, schedule and implement in LIVE
◦ Sampled full volume files down to 3% and 10% using logical data models
◦ Engaged with customers, Business Analysts, BI upstream during requirements gathering
◦ Worked on Enterprise guide, Unix, Maestro, SAS Batch
◦ Enhancements to output Module for Self Assessment (SA 2012) that include complex macro coding
* TRUCE (Transactional Risking Upstream in Connect Environment) - DI Studio [ ETL ]
◦ Owned the full development of this project and setting up of development environment
◦ Installed SAS Servers (Metadata and Compute Server on UNIX) and Client workstations
◦ Automated the development of over 1,000 file reader transformations in Data Integration Studio, which now run iteratively in a few minutes where manual development would otherwise have taken months
◦ Created jobs in DI Studio using various transformations
◦ Setup new project repositories to enable change management in DI Studio
◦ Developed complex SAS macros that ease the development of DI jobs involving file readers
◦ Worked in parallel on both management console and DI Studio in updating metadata, creating libraries in Management Console, setting up permissions/ACTs etc
◦ Setup users on metadata and workspace server using Management Console for access to Enterprise Guide/DI Studio
◦ Register metadata for Oracle tables, jobs, delimited files and SAS datasets
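Automating 1,000+ file reader transformations typically means generating repetitive reader code from file-layout metadata. A hedged sketch of that metadata-driven approach (the filenames, layouts and generated SAS text are invented for illustration, not the actual TRUCE artifacts):

```python
# Metadata-driven code generation, in the spirit of the automation above.
# Layouts and emitted SAS are hypothetical examples.
layouts = {
    "customers.csv": ["cust_id", "name", "postcode"],
    "accounts.csv": ["acct_id", "cust_id", "balance"],
}

def gen_reader(filename: str, columns: list[str]) -> str:
    """Emit a minimal SAS DATA-step reader for one delimited file."""
    ds = filename.split(".")[0]
    cols = " ".join(f"{c} :$64." for c in columns)
    return (f"data work.{ds};\n"
            f"  infile '{filename}' dsd dlm=',' firstobs=2;\n"
            f"  input {cols};\n"
            f"run;")

programs = [gen_reader(f, cols) for f, cols in layouts.items()]
```

Driving the generator from a layout table is what lets a thousand readers be rebuilt in minutes when a file format changes.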
* Real Time Information (RTI) - Management Console, LSF, DI Studio, Enterprise Guide
◦ Prepared accessibility test scripts to run JAWS on SAS Management Console 9.2
◦ Tested scripts and SAS code migrated from SAS 9.0 to SAS 9.1; also involved in testing components migrated from Solaris to Linux
◦ Created new schedule flows in SMC and added dependencies in LSF PPM(Load Sharing Facility)
◦ Added new queue to lsb.queues. Created new authentication domains
◦ Installed Oracle and integrated it with SAS to access Oracle tables from SAS EG/DI Studio
◦ Added new Oracle, SQL and DB2 for Z/OS libraries to the RTI compute server box
◦ Configured Active Directory user authentication. Installed SAS/ACCESS to SQL Server
◦ Added users, groups, roles and Access Control Templates in Management Console 9.2
◦ Set up new schedules using Scheduler Manager and PPM (Platform Process Manager)
◦ Enabled private user storage
* Log Parser (Management Information) - SAS Reporting, Enterprise Guide, Unix
◦ Flagship project for logging/monitoring user activity
◦ Used to audit workspace server, metadata server, stored process, OLAP, ARM and object spawner logs
◦ Co-designed the log parser project
◦ Produced graphs of CPU usage, disk usage and concurrency for the application/SAS BI tools
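A log parser of this kind boils down to matching a line pattern and aggregating per user or per resource. A toy sketch (the log format here is invented for illustration, not the actual SAS server log layout):

```python
import re
from collections import Counter

# Toy parser in the spirit of the Log Parser project above.
# The line format is hypothetical, not the real SAS log layout.
LINE = re.compile(r"^(?P<ts>\S+) (?P<user>\w+) (?P<action>\w+)")

def usage_by_user(lines):
    """Count matching log events per user."""
    counts = Counter()
    for ln in lines:
        m = LINE.match(ln)
        if m:
            counts[m.group("user")] += 1
    return counts

logs = [
    "2012-03-01T09:00:01 alice LOGIN",
    "2012-03-01T09:05:42 alice SUBMIT",
    "2012-03-01T09:06:10 bob LOGIN",
]
print(usage_by_user(logs))  # Counter({'alice': 2, 'bob': 1})
```

The same aggregation, applied to timestamps and resource fields, feeds the CPU/disk/concurrency graphs mentioned above.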
2011 - 2011
job
Technical Lead
SAS ETL & Data Quality (John Deere).
Technologies: SAS-Regular Expressions, SAS-VSAM, SAS/ACCESS for DB2, SAS-Encoding, PROC SURVEYSELECT, PROC DB2UTIL, PROC DB2EXT, JCL, COBOL, DB2 (EBCDIC/UTF), Triggers, Global temporary tables, DB2 Stored Procedures, Proc SQL, XML
Description:
John Deere, the leading manufacturer of agricultural machinery, is listed among the Fortune 100 companies. The project involved building brand-new data marts for MKC (Machine Knowledge Center), DTAC (Dealer Tracking System) and IKC (Information Knowledge Centre) from Fusion (the data warehouse). The project aims to integrate data spread across different divisions to support informed decisions.
Responsibilities:
◦ Engaged with customers, analysts during requirements gathering
◦ Built brand new data marts for MKC (Machine Knowledge Center), DTAC (Dealer Tracking System) and IKC (Information Knowledge Centre) data marts from fusion (data ware house)
◦ Produced technical design documents and data modeling documents; coded programs (SAS, COBOL, DB2, JCL)
◦ Good experience handling huge volumes of data using SAS. Replaced the full-load process with a net-change process, a huge saving for the company
◦ Migrated code from SAS on mainframes to SAS Enterprise Guide
◦ Produced pseudocode and flow charts to convert COBOL programs to SAS
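The saving from a net-change process comes from shipping only the delta between snapshots instead of reloading the full table each run. A generic sketch of delta detection (keys and rows are illustrative, not John Deere data):

```python
# Net-change load: emit only inserts/updates/deletes between two snapshots,
# instead of reloading the full table. Rows below are illustrative only.
def net_change(previous, current, key):
    prev = {r[key]: r for r in previous}
    curr = {r[key]: r for r in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys()
               if curr[k] != prev[k]]
    return inserts, updates, deletes

old = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
new = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
ins, upd, dele = net_change(old, new, "id")
# id 3 is new, id 2 changed, id 1 disappeared
```

When the daily delta is a small fraction of the table, downstream load time and I/O shrink proportionally, which is where the saving to the company came from.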
2009 - 2010
job
Associate IT Analyst
Santander Cards.
Technologies: SAS, Access for DB2, SQL, MVS-Oracle, DI Studio, Control-M, DataXpert, CA-Intertest, GIPIH, Banesto mainframes, Santander PCAS system (Partenon system)
Description:
Worked as lead developer, working closely with the business analyst; attended workshops to understand requirements, then prepared technical specs and unit test plans, and analyzed and fixed issues during unit test/UAT. The PROBE system stores the behavioral history of all PCAS customers, helping reduce losses to the business and offer promotions to privileged customers. Automated the import process for flat files.
* Probe (Risk/Credit Cards/Experian/PCA)
◦ As the sole consultant, developed the full PROBE system and delivered the project well ahead of the projected timelines
◦ Created data mart tables from warehouse tables held in Z/OS DB2 and Oracle databases
◦ Engaged closely with business analyst and customers to ensure the development is aligned to the requirements
◦ Owned the development of reporting to simulate existing GE mainframes system
2007 - 2009
job
Senior Software Engineer
Computer Sciences Corporation.
Role: Senior Software Engineer
Technologies: DI Studio 3.4, Enterprise Guide 4.0/4.1, SQL, Information Map Studio 3.1, Web report Studio, Oracle, SAS/BASE, Macros, SAS/ACCESS for Oracle/DB2, Stored Processes, Oracle, Teradata
Description:
Liaised with business analysts and users to gather requirements and prepared technical specs from the requirements
document. Produced reports in different formats such as PDF, Excel, RTF and MDB (Microsoft Access database).
* Zurich Farmers (Insurance)
◦ Register source/target oracle, DB2 tables. Generated various financial reports, reconciliation reports in PDF/Excel
◦ Involved in data cleansing and validation using SAS and Data Flux
◦ Created/tested new information maps, filters, new data items, additional data sources for use in Web report
◦ Created data marts from warehouse and generated PDF, RTF, html, Excel reports and Graphs
◦ Setup change management in DI Studio and deployed/scheduled batch jobs using SMC and PPM
◦ Experience working on different transformations like Control Loop, Lookup, Table loader, SCD Type2 loader, SCD
Type1 loaders, table loader, file reader, file writer, data validation, sql, compare, splitter, extract, library contents,
surrogate key generator, key effective date
* CAT/CAPTURE - AT&T - (Telecom)
◦ CAPTURE receives data from ~15 different systems in different formats, which are finally integrated into Oracle tables
◦ Prepared technical specs, test plans/results and effort estimates for user requests
◦ Innovatively implemented a process improvement for CAPTURE project that had annual savings of $50,000 to AT&T
◦ Converted Windows PC SAS programs into an Enterprise Guide project
◦ Extracted data from various sources like CSV, Excel, VSAM, MDB, DB2, Teradata into Oracle
◦ Created new information maps from SAS datasets/Cubes
My education
n/a
Bachelor's, Computer Science