RESUME	
	
Parthiban Ranganathan
Thousand Oaks, CA.
Phone: +1 8059907090
PROFILE SUMMARY
• Over seven years (7 years 3 months) of IT experience, primarily in Data Warehousing and
Business Intelligence; a Big Data aspirant with working knowledge of Hadoop.
• Worked on multiple assignments in the Healthcare [Client: Anthem Inc. - Membership, Claims and
Provider subject areas of EDWARD, one of the largest healthcare data warehouses] and Manufacturing
[Client: Cummins Inc.] domains.
• Solid understanding of OLAP, OLTP and data warehousing concepts.
• Strong knowledge of Entity-Relationship modeling, fact and dimension tables, slowly changing
dimensions and dimensional modeling (Star Schema and Snowflake Schema).
• Experience in data integration and ETL from relational sources such as Teradata, Oracle, DB2, Sybase and
SQL Server, and non-relational sources such as flat files (CSV, binary, COBOL VSAM, XML), into staging areas.
• Expertise in SQL scripting.
• Experience in Business Intelligence, including business modeling, report creation, scheduling and
publishing.
• Possess excellent analytical, problem solving, communication and interpersonal skills, with ability to
interact with individuals at all levels and work as a part of a team as well as independently.
• Capable of grasping new technologies, processes and tools quickly, and of mentoring junior team members.
TECHNICAL EXPERTISE
Type Particulars
Database Teradata, Oracle, MS SQL Server, DB2, Sybase
ETL Informatica Power Center, Teradata BTEQ, MS SQL Server TSQL
Stored Procedure, Oracle PL/SQL, SSIS
BI Tools Business Objects, OBIEE, Cognos, SSRS
Data Analysis SSAS, MS EXCEL
Scripting UNIX Shell Scripting, PERL
Data Modelling MS VISIO, ERWIN
Versioning Clearcase
Scheduling Tool WLM, Control M, Appworx, DAC
Other Tools RequestPro, JIRA, ClearQuest
Database Utilities Teradata MLOAD, FLOAD, FASTEXPORT and TPUMP scripts;
Oracle PL/SQL cursors, triggers, stored procedures, functions,
packages
Database Tool MS Management Studio, Teradata SQL Assistant,
Oracle – Toad, SQL Developer and SQL*PLUS.
Secondary Programming Skills C, C++, HTML, EXCEL MACRO.
Operating Systems UNIX, Windows 2000/2003/XP/7/8/10.
PROFESSIONAL	EXPERIENCE	
Designation Company Period
Associate Cognizant Technology Solutions Nov 2011 – Present
Programmer Analyst Cognizant Technology Solutions Nov 2009 – Nov 2011
EDUCATION	
Degree and Stream College/University Year of Passing
Bachelor of Engineering in
Electronics and Communication
Institute of Road and Transport Technology,
Erode (affiliated to Anna University)
2008
PROJECTS AND RESPONSIBILITIES
Duration Jan 2014 – Dec 2016
Project Title Anthem Provider Finder & Procurement Data Mart Projects
Role ETL and BI Tech Lead
Functional Area Healthcare & Data Warehousing, ETL, Business Intelligence
Client Anthem Inc.
Operating Systems Windows XP, UNIX
Tools Informatica, UNIX, Shell Scripting, MS SQL Server 2008/2014, T-SQL,
BCP, Teradata BTEQs, MLOAD, FLOAD scripts, WLM, ClearQuest,
RequestPro, JIRA, Business Objects, Infogix, MSBI (SSAS, SSRS).
PROJECT DESCRIPTION:
• Provider Finder is the 'Find a Doctor' tool in the WellPoint consumer portal, used to locate doctors,
facilities and pharmacies. It is a business-critical application for Anthem, covering multiple migration
and enhancement projects.
• Built the Procurement Data Mart to enable WellPoint to create enterprise-wide vendor reporting
using multiple data sources (SourcePoint, Planview, Fieldglass, SharePoint, D&B, etc.). The Data Mart
delivers reporting data via customized reports and dashboards to Global Vendor Management, IT
Vendor Management, and Business Process Outsource Risk Committee stakeholders, and also
caters to Procurement's audit findings.
RESPONSIBILITIES:
• Worked as a Tech Lead, involved in activities such as requirement gathering and analysis, project cost
estimation, design, coding and development, unit testing, system testing, user acceptance testing
support, implementation support, maintenance, checkout and warranty support.
• Responsible for understanding and analyzing requirements, interacting with SAs and business
stakeholders for clarification, and translating the requirements into code.
• Responsible for presenting the design to clients and obtaining sign-off.
• Involved in Data Model changes, working along with the Project Data Architect.
• Responsible for creating LOEs in the initiation phase and project budget reports, and presenting them
to project management.
• Active member of the Right In-Time team, responsible for analyzing and resolving real-time production
issues, bug fixing, and making load-related decisions.
• Experience in creating Business and Technical Requirements Documentation.
• Created High Level Design (HLD) and Low Level Design (LLD) documents, technical specifications,
attribute mappings, logical and physical data models, and data flow diagrams with MS Visio in the
design phase.
• Wrote UNIX shell scripts for file processing, file transfers, archiving, auditing, BO report scheduling,
error handling and Infogix controls.
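As a flavor of the file-processing scripts mentioned above, a minimal sketch of a landing-zone archiving step with an audit trail; the directory names, file layout and demo input are illustrative, not from the actual project:

```shell
#!/bin/sh
# Minimal sketch of an archiving step with an audit trail.
# All paths and the demo input file are hypothetical.
LANDING_DIR=./landing
ARCHIVE_DIR=./archive
AUDIT_LOG=./audit.log

mkdir -p "$LANDING_DIR" "$ARCHIVE_DIR"
printf 'id,name\n1,demo\n' > "$LANDING_DIR/members.csv"   # demo input file

for f in "$LANDING_DIR"/*.csv; do
    [ -e "$f" ] || continue                 # nothing to process
    rows=$(wc -l < "$f")                    # record count for the audit trail
    stamp=$(date +%Y%m%d%H%M%S)
    gzip -c "$f" > "$ARCHIVE_DIR/$(basename "$f").$stamp.gz"
    printf '%s|%s|%s|archived\n' "$stamp" "$(basename "$f")" "$rows" >> "$AUDIT_LOG"
    rm -- "$f"                              # clear the landing zone
done
cat "$AUDIT_LOG"
```

A real version would add file transfers (e.g. sftp) and error handling around each step.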
• Created Jobs/Jobsets for triggering Informatica Workflows and Shell Scripts in WLM/ Scheduling
Tables/Jobs in Control-M.
• Tuned mappings using PowerCenter Designer and changed the logic of existing code for maximum
efficiency and performance. Tweaked SQL queries in stored procedures to improve performance and
tuned Teradata SQL queries to resolve spool space issues.
• Created Use cases, Test Scenarios, and Test Results/Reports for Unit Testing and Provided Test Data
for System and User Testing. Experience in Clearquest, JIRA for Creating, Triaging, Deferring, Closing
and Canceling Defects.
• Supported DBAs, Informatica admins, UNIX admins, WLM/Control-M admins and BO admins in
migration/deployment of code changes, and created migration documents for all environments.
• Responsible for versioning the Informatica code (exported as XML) and other code with ClearCase.
• Experience in implementation support, business checkout, technical checkout, post-production
validation and testing support.
• Responsible for preparing and presenting project RAG status reports to clients.
• Responsible for implementing all project changes handled by my team in monthly release
activities, on average 6 to 7 projects/SSCRs every month from Sep 2014 to Dec 2016.
ACHIEVEMENTS:
• Successful migration of Provider data from EPDSV2 to EDWARD, which is built on Teradata.
Wrote complex BTEQ scripts and used MLOAD and FLOAD utilities to load Provider data from
EPDSV2.
• Automated the data validation (reconciliation, comparison of previous/current eligible data loaded, etc.)
involved in each production load step, with around 2,500 validation queries; created BO reports on the
validation data and scheduled them to run automatically, mailing decision makers a PASS/FAIL status
with the reports attached.
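In the same spirit as these validations, a toy reconciliation check that compares source and target row counts and emits a PASS/FAIL status line a report or mail step could consume. The counts here come from made-up demo files; in the project they came from Teradata queries:

```shell
#!/bin/sh
# Toy reconciliation check: compare source-extract and target row counts
# and write a PASS/FAIL status line. Demo files stand in for real queries.
printf 'r1\nr2\nr3\n' > source_extract.dat   # pretend source: 3 rows
printf 'r1\nr2\nr3\n' > target_loaded.dat    # pretend target: 3 rows

src=$(wc -l < source_extract.dat)
tgt=$(wc -l < target_loaded.dat)

if [ "$src" -eq "$tgt" ]; then status=PASS; else status=FAIL; fi
printf 'RECON|source=%s|target=%s|%s\n' "$src" "$tgt" "$status" | tee recon_status.txt
```

The status file is the hand-off point: a scheduler can grep it and trigger the mail/report step on FAIL.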
• Successfully delivered Data migration projects, changing the sourcing of data from Legacy systems
to EPDSV2 into Provider Finder data layer.
• Converted weekly loads to daily loads by changing the kill-and-fill load strategy to Change Data Capture
in the Landing Zone and Stage layers, improving the functionality of the system.
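A file-level illustration of the Change Data Capture idea behind this conversion: rather than reloading everything (kill-and-fill), derive only the new and changed records by comparing consecutive extracts. The key|value lines are made-up demo data:

```shell
#!/bin/sh
# Demo of deriving a CDC delta from two extracts with comm(1).
# prev/curr extracts are tiny made-up key|value files, sorted for comm.
printf '1|A\n2|B\n3|C\n' | sort > prev_extract.dat
printf '1|A\n2|X\n4|D\n' | sort > curr_extract.dat

# Lines present today but not yesterday = inserts and updates (the delta).
comm -13 prev_extract.dat curr_extract.dat > delta.dat
cat delta.dat
# prints the delta: 2|X and 4|D
```

Loading only the delta instead of the full extract is what makes a daily cadence affordable.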
• Decoupled Stage layer and Target layer loads eliminating the Dependencies between load steps
which involved 14 Source systems (WGS, SSB-CA,SSB-IN, CS90, FACETS, ACES, WEST, Pharmacy,
Dental, Vision, VA, BCBSA, CACTUS, GA) increasing the Maintainability of the system in case of
Invalid Data loads or errors in the data load.
• The Spider application took 4 hours to load a 1 GB XML source file into SQL Server, even though the
data had to reach the business as early as possible because it is business-critical. Reduced the run time
from 4 hours to 45 minutes by removing the file dependency, tweaking the SQL code and changing the
configuration of the Informatica sessions and workflows.
Duration Dec 2012 – Dec 2013
Project Title Cummins Enhancements and Projects
Role Senior ETL and BI Developer
Functional Area Manufacturing & Data Warehousing, ETL, Business Intelligence
Client Cummins Inc.
Operating Systems Windows XP, UNIX
Tools Informatica, UNIX Shell Scripting, Oracle 11g PL/SQL, Appworx & DAC
Scheduler, OBIEE 11g, Clearcase, MSBI (SSIS, SSAS, SSRS).
PROJECT DESCRIPTION:
• Captured bugs in Cummins applications such as EBU (Engine Business Unit) and Corp Apps
(Corporate Applications), and modified the applications to fix existing bugs and add functionality
and capability. Changes to the existing ETL code, database, etc. are designed and developed based
on Change Requests (CRQs). Delivered many CRQs, notably SSA, FLD, RPV and NAFTA BOM
Correction. The NAFTA (North American Free Trade Agreement) application creates reports such as
Bill of Materials, shipment details and engine component items. Built new functionality to ensure the
correctness of the reports created in the NAFTA application, and designed and built a new error
mechanism and report corrections in the MacTrack (Material Cost Tracking) application.
RESPONSIBILITIES:
• Worked as a Senior Informatica, PL/SQL and BI Developer, involved in activities including
Requirement Analysis, Design, Coding & Development (involves policy rule coding, sequence flow,
status flow, and dictionary entries), Unit testing, System Testing, UAT Support, Implementation,
Maintenance.
• Recorded data errors from the sources of the MacTrack application and created reports on them for the
clients. Built an error mechanism for the MacTrack system using Informatica mappings/workflows and
published reports to the business as CSV files, supporting decisions on loading the data.
• Developed Informatica mappings with reusable mapplets and transformations (Joiner, Sorter,
Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer
and Rank), and workflows with reusable tasks (Session, Command, Email) and non-reusable tasks
(Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
• Developed dashboards/reports in OBIEE and reports in SSRS. Analyzed and corrected the errors in
the Bill of Materials (BOM) reports of the Cummins Power Generation unit for North American Free
Trade Agreement (NAFTA) regions, and corrected the OBIEE dashboard for BOM.
• Added a Part Number column to the business model of the BOM in the repository and used it in
the aggregation of the BOM list report.
Duration Sep 2009 – Dec 2012
Project Title Anthem EDWARD Development Projects
Role Informatica & Teradata Developer
Functional Area Healthcare & Data Warehousing, ETL
Client Anthem Inc. (then Wellpoint Inc.)
Operating Systems Windows XP, UNIX
Tools Informatica Power Center 8.6.1, Teradata, UNIX Shell Scripting, Putty,
Clearcase.
PROJECT DESCRIPTION:
• EDWARD (Enterprise Data Warehouse and Reporting Depot) is a huge data warehouse with data
integrated from all the source systems of WellPoint, such as CS90, NASCO, ACES, WGS and FACETS,
and all subject areas, such as Membership, Claims and Provider. The target layer alone is 250+
tables with a huge amount of data, and thus there were hundreds of code components, including
Informatica workflows and Teradata BTEQs. This assignment was to perform functional testing of all
existing components, modify existing code as per the standards, and change the existing Informatica
mappings as part of EDWARD improvements.
RESPONSIBILITIES:
• Worked as a Senior Informatica/Teradata Developer, involved in activities including
Requirement Analysis, Design, Coding & Development (involves policy rule coding, sequence flow,
status flow, and dictionary entries), Unit testing, System Testing, UAT Support, Implementation,
Maintenance.
• Created Technical Design documents.
• Developed Informatica codes for loading the files in the Landing Zone tables.
• Implemented fixes/solutions to the Bugs encountered during the functional testing and Regression
Testing support.
• Developed BTEQ scripts involving complex queries to derive the benefits/coverage of a
member/spouse/dependent in cases where multiple insurance policies are held by the member, by the
spouse with another carrier, or by dependents.
• Created and documented test cases, test plans and test results. Unit tested the code to verify
that the data loaded into the target matched the logic and the requirements. Supported SIT,
UAT and implementation of the project.
ACHIEVEMENTS:
• One of the biggest projects I was involved in: wrote more than 40 BTEQ scripts (about 10K lines) to
implement Coordination of Benefits for the NASCO source system.
• Created a SQL parser with PERL text mining to validate BTEQ SQL scripting standards and the
cosmetics of the code.
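The PERL parser itself is not reproduced here, but the kind of standards check it performed can be sketched with awk; the two rules below (flag SELECT * and lowercase keywords) and the sample script are illustrative, not the actual project standards:

```shell
#!/bin/sh
# Sketch of a SQL coding-standards checker in the spirit of the PERL parser.
# The sample BTEQ script and both rules are made up for illustration.
cat > sample.btq <<'EOF'
SELECT * FROM member_dim;
select mbr_id FROM claim_fact;
EOF

awk '
    /SELECT[ \t]+\*/ { printf "line %d: avoid SELECT *\n", NR }
    /^[ \t]*select/  { printf "line %d: keywords should be uppercase\n", NR }
' sample.btq | tee findings.txt
```

A real checker would carry a rule per standard and report file name plus line number so findings can be triaged.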
• Pivoted Plan and corresponding Rates across various categories. Created a mapping with Normalizer
and Aggregator transformations to pivot Excel data, converting multiple rows into a single row,
concatenating attributes in the Plan spreadsheet and combining (using a Union transformation) Rate
data with multiple categories (e.g. family with one child, with two children) from another Excel sheet.
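A tiny sketch of the pivoting idea (implemented in the project with Normalizer and Aggregator transformations): collapse multiple rate rows per plan into one concatenated row. The plan|category|rate layout and demo data are made up:

```shell
#!/bin/sh
# Demo of pivoting many rows per key into one row, as the Informatica
# Normalizer/Aggregator mapping did. Input layout plan|category|rate is made up.
printf 'P1|Family+1child|100\nP1|Family+2child|150\nP2|Single|80\n' > rates.dat

awk -F'|' '
    { row[$1] = ($1 in row) ? row[$1] "," $2 ":" $3 : $2 ":" $3 }
    END { for (p in row) print p "|" row[p] }
' rates.dat | sort | tee pivoted.dat
```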
• Created reusable Excel templates with checkboxes to record the test results of each component,
reducing documentation time.
• Created SCD Type 2 mappings using Lookup and Update Strategy transformations to load plan,
pricing, member and other data into the ODS.
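A minimal file-based illustration of the SCD Type 2 pattern behind these mappings (the real implementation used Informatica Lookup and Update Strategy transformations; the key|value|from|to|flag layout and demo rows are hypothetical):

```shell
#!/bin/sh
# SCD Type 2 on flat files with awk: expire changed rows, version new values,
# append brand-new keys. Layout: key|value|eff_from|eff_to|current_flag.
printf '100|Gold|2015-01-01|9999-12-31|Y\n200|Silver|2015-01-01|9999-12-31|Y\n' > dim_plan.dat
printf '100|Platinum\n300|Bronze\n' > incoming.dat    # 100 changed, 300 is new

TODAY=2016-06-01
awk -F'|' -v OFS='|' -v today="$TODAY" '
    NR==FNR { new[$1]=$2; next }                 # pass 1: incoming key -> value
    {
        if ($5=="Y" && ($1 in new) && new[$1]!=$2) {
            print $1,$2,$3,today,"N"             # expire the old current row
            print $1,new[$1],today,"9999-12-31","Y"   # insert the new version
        } else print                             # keep unchanged rows as-is
        seen[$1]=1
    }
    END { for (k in new) if (!(k in seen)) print k,new[k],today,"9999-12-31","Y" }
' incoming.dat dim_plan.dat > dim_plan_new.dat
cat dim_plan_new.dat
```

The point of Type 2 is visible in the output: the old Gold row is closed out rather than overwritten, so history survives.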
