Ajith_kumar_4.3 Years_Informatica_ETL

OBJECTIVE
"Aimingtobe associatedwithaprogressive organization that gives me the scope and apply my knowledge
and skills to involve as part of team that dynamically works towards the growth of the organization".
PROFESSIONAL SUMMARY
Having 4.3 years of data warehousing experience using Informatica PowerCenter 7.x/8.x/9.x. Good
knowledge of data warehousing concepts such as Star Schema, Snowflake Schema, Dimensions and Fact
tables.
Extensively used Informatica client tools – Source Analyzer, Target Designer, Mapping Designer,
Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
Experience in developing mappings with the required transformations in Informatica, implementing
both simple and complex business logic using transformations such as Sorter, Aggregator, Lookup,
Expression, Filter, Router, Update Strategy, Joiner and Normalizer.
Good experience in performance tuning of mappings.
Implemented Change Data Capture (CDC) techniques such as Slowly Changing Dimensions (Type 1 and
Type 2) and simple pass-through mappings using Informatica PowerCenter; a minimal SQL sketch of the
Type 2 logic is given after this summary.
Experience in performance tuning of Informatica sources, targets, mappings, transformations and
sessions, identifying performance bottlenecks to improve performance and efficiency.
Responsible for successful validation of data against the functional specifications of the business as
part of design, development, testing and release.
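The Type 2 logic referenced above can be summarised in plain SQL. This is a minimal illustrative sketch, not the project mappings themselves: the dimension table, staging table, columns and sequence (dim_customer, stg_customer, dim_customer_seq) are assumed names, and in the projects the same behaviour was built with Informatica Lookup and Update Strategy transformations.

    -- Hypothetical SCD Type 2 sketch; all table, column and sequence names are assumed.
    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current row for changed and brand-new customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, segment,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           SYSDATE, NULL, 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');

A Type 1 dimension would instead overwrite the changed attributes in place with a simple UPDATE, keeping no history.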
PROFESSIONAL ACHIEVEMENTS
Skill Development and product Deployment
Multi-location and Multi-Culture Exposure
Quick learner with the ability to grasp new technologies
Good interpersonal skills
CAREER SUMMARY
Working as a Software Engineer at Tech Mahindra Ltd, an SEI-CMM Level 5 organization, from April 2011 till date.
AWARDS & RECOGNITIONS
Received the Best Team Member Award in June 2013 as a developer and supporter for the team.
Ajith Kumar Pampatti
pampattiajith@gmail.com
Ph.: +91 7799274411
PROJECT SUMMARY
Client : British Telecommunications plc (BT), UK
Project : PMP Datafactory
Location : Hyderabad
Role : ETL Developer. Mar 2014 – Till Date
Product Data COSMOSS is responsible for maintaining UK Private Circuit product data in a master database
called PDB (Portfolio Database), supporting the order journey in COSMOSS and billing on a set of billing systems.
PDB is an Informatica-based application on top of which product data is configured by data warehouse
processing. This data later cascades in CDV files to COSMOSS, EcoX and a set of billing systems such as GENEVA,
PCNBS and GLOSSI.
Thus PDB acts as a master database controlling the entire UK Private Circuit product data. PDB supports product
data for all BT LOBs (BT Wholesale, BTGS, BT Retail and Openreach).
Responsibilities:
Working on the Private Circuit product using Informatica Developer for all LOBs of BT.
Worked on Informatica Designer Tool’s components – Source Analyzer, Target Designer,
Transformation Developer and Mapping Designer.
Understanding customer requirements and analysing component designs in detail from the documents
received (generally from the component designers).
Created mappings using transformations such as Source Qualifier, Aggregator, Lookup, Joiner, Router
and Update Strategy. Involved in creating and managing sessions.
Extracted data from different source databases and flat files and loaded it into flat files and Oracle
databases.
Applied extensive transformation logic in the mapping process to complete development on time.
Provided valuable support to the downstream system, COSMOSS, in resolving issues related to the
data passed.
Resolved issues raised by the E2E testing team on time, understanding the business logic by
analysing the requirements.
Technologies / Tools:
Informatica PowerCenter 9.5, Oracle 10g and Windows Server 2003.
Client : British Telecommunications plc (BT), UK
Project : Smarts Solutions
Location : Pune
Role : ETL Developer. Sept 2012 – Feb 2014
A data sanity and integrity project to maintain data correctness across multiple components in an order
management life cycle.
In this project we maintain data for specific customers in a central repository named BFG and check the data
integrity between BFG and another database named NMDB. After data analysis we calculate the data quality,
that is, how much of the data is in sync. The required data is extracted and various transformation logics are
applied to pass the data to the target. To load correct data we prepare a DCMO template that provides the
corrected data for loading into BFG. Apart from this manual process, we write procedures to do this work. The
data is received from the customer in XML format; the XML files are extracted into the required format, the
transformation logic is applied (with procedures written where required) and the data is loaded into the
target, BFG.
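The "how much data is in sync" figure can be pictured with a simple comparison query. This is only an illustrative sketch under assumed names: bfg_circuit, nmdb_circuit, the circuit_id key, the status attribute and the nmdb_link database link are hypothetical, and the real checks were implemented against the actual BFG and NMDB schemas.

    -- Hypothetical data-quality check; all object names and the DB link are assumed.
    -- Percentage of BFG records that have a matching, consistent record in NMDB.
    SELECT ROUND(100 * SUM(CASE WHEN n.circuit_id IS NOT NULL
                                 AND n.status = b.status THEN 1 ELSE 0 END)
                 / COUNT(*), 2) AS pct_in_sync
      FROM bfg_circuit b
      LEFT JOIN nmdb_circuit@nmdb_link n
        ON n.circuit_id = b.circuit_id;

Records falling outside the matched-and-consistent set are the candidates that go into the DCMO template for correction.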
Responsibilities:
Analyze and understand the Technical Specification Document (Source to Target matrix) for all
mappings and clarify the issues with Data Architects.
Developed different mappings using transformations such as Aggregator, Lookup, Expression,
Update Strategy, Joiner and Router to load data into staging tables and then into the target.
Monitored transformation processes using Informatica Workflow monitor.
Designed mappings with Slowly Changing Dimensions (Type 1 and Type 2) to keep track of
current and historical data.
Involved in creation of Mapplets, Worklets, Reusable transformations, shared folder, and shortcuts
as per the requirements.
Developed ETL mappings, sessions and workflows for data cleansing and performed performance
tuning of mappings.
Wrote stored procedure logic to support accurate mapping processing; a minimal PL/SQL sketch is
given after this list of responsibilities.
Preparation of test cases and test data for each mapping.
Handled releases and communicated with process designers and solution designers to understand the
requirements and produce the deliverables on time.
Extracted data from different source databases and flat files and loaded it into flat files and Oracle
databases.
Resolved issues raised by the E2E testing team on time.
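The stored procedure work mentioned above was of the kind sketched below. This is a hypothetical, simplified example rather than an actual project procedure: the stg_customer and bfg_customer tables and their columns are assumed names used only for illustration.

    -- Hypothetical loader procedure; table and column names are assumed.
    CREATE OR REPLACE PROCEDURE load_bfg_from_stage AS
    BEGIN
      -- Apply a basic cleansing rule while moving staged rows into the target.
      INSERT INTO bfg_customer (customer_id, customer_name, created_dt)
      SELECT s.customer_id, TRIM(s.customer_name), SYSDATE
        FROM stg_customer s
       WHERE s.customer_id IS NOT NULL;
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_bfg_from_stage;
    /

In the project, such procedures supplemented the Informatica mappings, with the cleansing and loading rules taken from the Technical Specification Document.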
Technologies / Tools:
Informatica PowerCenter 8.6, Oracle 9i and Windows Server 2003.
Client: British Telecommunications plc (BT), UK
Project: Data Change Management Office
Location: Pune
Role: ETL Developer. June 2011 – Aug 2012
The main objective of Agora is to collect customer data stored in different source systems such as
Classic, Orion and Ipms and migrate the data into BFG, which is the database of British Telecom. Agora
cleanses the data coming from the source systems during extraction and loads consistent data into the target
system, BFG, by applying various transformation logics. Once the data is loaded into the target, BFG reports
are generated by discarding the unwanted data, and database housekeeping activities are maintained.
Deduplication:
Data is extracted from various source systems, and the same data may be repeated across them. If the data
were migrated as-is, all of these records would reach the target system, creating duplicate records. There has
to be a mechanism by which these similar records are identified and grouped together; this process of
identifying and grouping records is called deduplication. By applying the data-cleansing transformation logic,
these duplicates are removed (a minimal SQL sketch follows).
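One common way to express the grouping-and-discarding step in plain SQL is an analytic ROW_NUMBER over the matching columns. This is an illustrative sketch only: the stg_customer table, the customer_name/post_code matching columns and the last_updated ordering column are assumed names, and in the project the same effect was achieved through the cleansing transformation logic in the mappings.

    -- Hypothetical deduplication sketch; table and column names are assumed.
    -- Keep the most recently updated record per logical customer, drop the rest.
    DELETE FROM stg_customer
     WHERE ROWID IN (SELECT rid
                       FROM (SELECT ROWID AS rid,
                                    ROW_NUMBER() OVER (
                                        PARTITION BY UPPER(TRIM(customer_name)),
                                                     post_code
                                        ORDER BY last_updated DESC) AS rn
                               FROM stg_customer)
                      WHERE rn > 1);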
Responsibilities:
Analyze and understand the Technical Specification Document (Source to Target matrix) for all
mappings and clarify the issues with Data Architects.
Used most of the transformations, such as Source Qualifier, Expression, Aggregator, connected and
unconnected Lookups, Filter, Router, Sequence Generator, Sorter, Joiner and Update Strategy.
Used Workflow Manager for workflow and session management, database connection management
and scheduling of jobs to run in the batch process.
Monitored transformation processes using Informatica Workflow monitor.
Hands-on experience with Informatica administration activities, including providing access to users
and assigning privileges.
Developed ETL mappings, sessions and workflows for data cleansing and performed performance
tuning of mappings.
Tested and debugged mappings.
Technologies / Tools:
Informatica PowerCenter 8.6, Oracle 9i and Windows Server 2003
KEY SKILLS
ETL Tool : Informatica PowerCenter 7.x/8.x/9.x
Database : Oracle 9i/10g/11g
OS : Windows / UNIX
Languages : SQL, PL/SQL
Other Tools : TOAD, SQL*Plus
ACADEMIC HISTORY
Bachelor of Technology (B.Tech, 2010) in Information Technology (IT) from Jawaharlal Nehru
Technological University (JNTU), Hyderabad, with an aggregate of 64.19%.
PERSONAL DETAILS
Year of Birth : 1988
Marital Status : Single
Languages Known : English, Hindi, Telugu
Nationality : Indian
Current Company : Tech Mahindra Ltd.
Current Location : Hyderabad, India
