Resume: Quaish Abuzer

Informatica developer with 5.8 years of experience across Oracle, Teradata, Unix, and SQL Server, plus good exposure to Big Data (Hadoop).

QUAISH ABUZER
Email: abuzer.bit@gmail.com Mobile: +91-9619588402
Qualifications:
Degree: Master of Computer Application (MCA)
Institute: Birla Institute of Technology (BIT), Mesra, Ranchi
Year of Passing: 2010
Experience Summary:
• Having 5.8 years of experience in Informatica.
• Experience in analysis, design, development, and implementation of business applications using
Informatica PowerCenter.
• Strong design and development skills.
• Working experience on Informatica Power Center Tools – Designer, Workflow Manager and
Workflow Monitor.
• Excellent knowledge of data warehousing.
• Significant experience in ETL process in Data Warehouse lifecycle.
• Developed various mappings and mapplets using different transformations.
• Possess excellent learning ability and practical implementation of the same.
• Significant experience in preparing the Unit Test Cases and design document.
• Experience in Performance Tuning the mappings.
• Experience in Oracle SQL. Three years of experience in Teradata (multiset and volatile tables,
FastLoad, MultiLoad, TPT, TPump, diagnostic stats, join indexes, loader connections, Teradata
SQL Assistant, statistics, skew factor, DBC tables, export/import, RELEASE MLOAD, error
tables, Viewpoint, etc.).
• Performance tuning in SQL and Informatica.
• Hands-on experience with procedures, functions, and triggers.
• Error handling in Informatica.
• Worked with a broad range of Unix commands. Good exposure to shell scripting in Unix.
• Reviewing the code and documents prepared by the team.
• Can adapt to new technologies quickly; capacity to work and meet deadlines.
• Excellent logical, analytical & debugging skills.
• Good communication skills and interpersonal relations; hardworking and result-oriented, both as an
individual and in a team.
• A self-starter with a clear understanding of business needs, radical and tactical problem-solving
approaches, and excellent leadership and motivation skills.
• Client interaction experience.
• Good domain knowledge in Banking and Financial Services (BFS).
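The Teradata features listed above can be illustrated with a minimal sketch; the table and column names are hypothetical, not from any actual project.

```sql
-- A MULTISET table allows duplicate rows (unlike the default SET table).
CREATE MULTISET TABLE sales_stg (
    sale_id  INTEGER,
    sale_amt DECIMAL(12,2),
    sale_dt  DATE
) PRIMARY INDEX (sale_id);

-- A VOLATILE table lives only for the current session: useful scratch space.
CREATE VOLATILE TABLE sales_tmp AS (
    SELECT sale_id, sale_amt FROM sales_stg
) WITH DATA ON COMMIT PRESERVE ROWS;

-- Collect statistics so the optimizer can judge skew and choose join plans.
COLLECT STATISTICS ON sales_stg COLUMN (sale_id);
```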
Additional experience:
• Informatica administration: deployment/migration, code release, tool release, configuration, setup,
repository backup/restore, types of connections in Workflow Manager, etc.
• Good experience with Cloudera Big Data/Hadoop: HDFS, LFS, YARN, and the Hadoop ecosystem
(Pig, Hive, Sqoop, Flume, HBase, Oozie, ZooKeeper, MapReduce).
• Connecting Hadoop with Informatica to load data LFS→HDFS, HDFS→LFS, and HDFS→HDFS.
• Loading data from HDFS to Hive and from LFS to Hive using Informatica.
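The LFS-to-Hive and HDFS-to-Hive loads mentioned above can be sketched in HiveQL; the table name and file paths are illustrative assumptions.

```sql
-- Hypothetical Hive target table.
CREATE TABLE IF NOT EXISTS sales_hive (
    sale_id  INT,
    sale_amt DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';

-- LFS -> Hive: LOCAL makes Hive read from the local file system.
LOAD DATA LOCAL INPATH '/data/lfs/sales.csv' INTO TABLE sales_hive;

-- HDFS -> Hive: without LOCAL the path is resolved on HDFS, and the
-- file is moved into the table's warehouse directory.
LOAD DATA INPATH '/data/hdfs/sales.csv' INTO TABLE sales_hive;
```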
Technology:
Operating Systems: Unix, Windows
Databases: Oracle, Teradata, SQL Server, Ingres
Technologies: Informatica PowerCenter
Tools/IDE: Informatica, Toad, PuTTY, WinSCP, SQL Developer, HP Quality Center (HPQC)
Career Profile:
Dec 2012 – Present: Capgemini, Consultant
July 2010 – Dec 2012: Tata Consultancy Services, Systems Engineer
Projects:
The details of the various assignments that I have handled are listed here, in chronological order.
1. Capgemini:-
Project SIMS
Customer Schlumberger, Houston, Texas, United States
Period July 2015 – Present
Description The source systems are SAP and MS SQL Server. A BI
extractor comes into the picture when the source system is
SAP; for MS SQL Server sources we create mappings and call
procedures. Pulled data is loaded directly into an Oracle DB
work table. Work-table-to-staging-table loading is done after
the business requirements are implemented; staging to target
is a one-to-one mapping with some business-suggested
columns added. We maintain history tables for the work table
and the target table. Once the data is ready in the target table
(Oracle DB), a reporting team uses that table as its source.
An audit framework is maintained across all tables: Work
table → Staging table → Target table → History table.
Role • Client owner, responsible for tracking and following up
with the business for my clients.
• Developed code by interacting with the business to
understand the requirements, and implemented the
ETL logic.
• Implemented cross-reference (Xref) logic.
• Expertise in developing mappings using suitable
transformations as per the requirements.
• Involved in performance tuning.
• Understood the whole process and suggested changes
whenever required.
• Involved in analysis before acceptance.
• Provided guidance/training to new joiners as and
when required.
• Providing effective technical solutions among the
team.
• Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
• Also part of deployment; good knowledge of
Informatica administration.
Implemented the following scenarios as a shared
resource:
• Mapping variables and mapping parameters
• Global parameters
• PowerCenter utilities (pmcmd and pmrep commands)
• Target Load plan
• Indirect load
• Event Wait
• Dynamic parameter file
• Performance tuning:
Pushdown optimization
Key range partition
Pass through partition
Hash auto key partition
Environment Unix
Tools Informatica 9.6.1, SQL Developer, WinSCP, PuTTY, DVO
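The "dynamic parameter file" scenario above is typically driven from a Unix shell step that regenerates the workflow's parameter file before each run. A minimal sketch, assuming hypothetical file and parameter names (not taken from the project):

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file at run time.
# The file, section, and $$-parameter names below are illustrative.
RUN_DATE=$(date +%Y-%m-%d)
PARAM_FILE=wf_daily_load.param

cat > "$PARAM_FILE" <<EOF
[Global]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_DIR=/data/incoming
EOF
```

The workflow would then be started with pmcmd's -paramfile option so the session picks up the regenerated values.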
Project Banking Solution
Customer SNS Bank, Netherlands
Period December 2012 to June 2015
Description The project covers all types of banking models, such as net
banking, loans, mortgages, and risk management.
Work involves creating new models, changing existing code
through change requests (CRs), and fixing issues raised as
incident requests.
Role • Client owner, responsible for tracking and following up
with the business for my clients.
• Developed code by interacting with the business to
understand the requirements, and implemented the
ETL logic.
• Expertise in developing mappings using suitable
transformations as per the requirements.
• Involved in performance tuning.
• Understood the whole process and suggested changes
whenever required.
• Involved in analysis before acceptance.
• Provided guidance/training to new joiners as and
when required.
• Providing effective technical solutions among the
team.
• Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
• Mapping variables and mapping parameters
• Global parameters
• PowerCenter utilities (pmcmd and pmrep commands)
• Target Load plan
• Indirect load
• Event Wait
• Dynamic parameter file
• Performance tuning:
Pushdown optimization
Key range partition
Pass through partition
Hash auto key partition
Environment Unix
Tools Informatica 8.6, SQL Developer, WinSCP, PuTTY, Teradata, Morpheus
2. Tata Consultancy Services:-
Project # 1
Project Vehicle upload
Customer GE Capital, France
Period July 2012 – December 2012
Description GE has purchased a large number of vehicles in Europe,
which need to be classified by type and model. If a model
is changed, the vehicle is discontinued; a return indicator
then decides which vehicles go for servicing.
Data is loaded from flat-file source systems to the Revel
staging layer and then to the target, with all calculations
done at the transformation level.
Source files are received from the customer on business days.
Roles & Responsibilities  Developing and debugging the informatica
mappings to resolve bugs, and identify the causes
of failures.
 User interaction to identify the issues with the data
loaded through the application.
 Developed different Mappings/ Mapplets using
different transformations.
 Providing effective technical solutions among the
team.
 Responsible for handling changes.
 Responsible for code check-in.
 Participated in the design documentation.
 Enhanced the existing ETL mappings in Informatica
to meet the new requirements.
 Reviewing the code and documents prepared by
other developers.
 Extensively worked on performance tuning.
 Maintaining all the trackers.
 Taking project level initiatives.
 Mapping variables and parameters
 Incremental load
 Global Parameter
 Target Load plan
 Indirect load
Tools Informatica 8.6, Unix, Ingres DB
Environment Unix
Project # 2
Project NIKE
Customer British Petroleum (BP),UK
Period March 2011 – June 2012
Description The NIKE application used heterogeneous sources
from different countries; each source carries that
country's daily oil-sale invoices. We applied different
transformations as per the business requirements and
populated the target table (Oracle DB), from which the
business generates reports (invoices) in their standard
format.
Roles & Responsibilities  Developing and debugging the informatica
mappings to resolve bugs, and identify the causes
of failures.
 Providing effective technical solutions among the
team.
 Worked on Informatica tool – Source analyzer,
Target Designer, Mapping designer, Mapplet
designer and Transformation Developer.
 Created the partition for newly inserted records.
 Participated in the design documentation.
 The Mappings were Unit tested to check for the
expected results.
 Study and analysis of the mapping document
indicating the source tables, columns, data types,
transformations required, business rules to be
applied, target tables, columns and data types.
 Loaded data into the data warehouse; there were
two kinds of data loading processes (daily and
monthly), depending on the frequency of the source
data.
 Reviewing the code and documents prepared by the
team.
Tools Informatica 8.6, Toad, Putty
Environment Unix
Project # 3
Project Bulk Data Extracts
Customer PNC, US
Period Sept 2010 – Feb 2011
Description PNC Financial Services Group, headquartered in
Pittsburgh, PA, is one of the nation's largest financial
holding companies. The company operates through an
extensive banking network primarily in Ohio, Illinois,
Indiana, Kentucky, Michigan, Missouri and
Pennsylvania, and also serves customers in selected
markets nationally.
Bulk Data Extract: extraction of zipped files and
placement of the files in a directory from which
Informatica can pick them up and perform the desired
transformations for loading.
Roles & Responsibilities  Understanding the requirements and preparing the
Low Level design doc based on understanding.
 Involved in designing the mappings between
sources and targets
 Developing and debugging the Informatica
mappings to resolve bugs, and identify the causes
of failures.
 Involved in preparing the Unit Test Cases.
Tools Informatica 8.6, Toad
Environment Unix
Passport Details:
Name as on passport: Quaish Abuzer
Relationship: Self
Passport Number: J4168356
Place of Issue: Patna, Bihar
Onsite Experience:
Netherlands, SNS Bank: 03-Mar-2013 to 27-Apr-2013
Netherlands, SNS Bank: 10-Nov-2014 to 08-Dec-2014
Current Address:
1204, A-Wing, Shiv Om Tower, Chandivali, Powai, Mumbai 400072
Signature:
