N Venkatesh Babu
Email: nvenkateshbabu@yahoo.com
Mobile: +1 408 627 5740
Professional Summary
14+ years of IT experience in Database/Data Warehousing technology, spanning analysis,
design, development and implementation of business systems in various environments.
10+ years of experience developing and maintaining SQL and PL/SQL: stored procedures,
functions, analytic functions, constraints, indexes and triggers.
10+ years of experience with Oracle Database versions 11g/10g/9i/8i.
5+ years of experience working with the Greenplum MPP database.
5+ years of experience working with Big Data HDFS systems.
2+ years of experience working with the in-memory SAP HANA database.
Strong experience working with Hive, Sqoop and Pig.
Strong experience using Talend to move data from different source systems into an
HDFS cluster.
6+ years of experience in UNIX shell scripting.
Experience writing Python scripts.
8+ years of experience in ETL development using Informatica for data warehousing,
data migration, data integration and production support.
Extensive experience in Design, Development, Implementation, Production Support
and Maintenance of Data Warehouse Business Applications in CRM, ERP, Financial,
Banking and Telecommunication industries.
Experience in both Waterfall and Agile (Scrum) SDLC methodologies.
Involved in requirement gathering, design, development and implementation of data
warehouse projects.
Sound knowledge of relational and dimensional modeling techniques and data warehouse
(EDW/data mart) concepts and principles (Kimball/Inmon): star/snowflake schemas, SCDs,
surrogate keys, normalization/de-normalization, and BI/analytics.
Data modeling experience in creating Conceptual, Logical and Physical Data Models using
ERwin Data Modeler.
Experience with TOAD, SQL Developer database tools to query, test, modify, analyze
data, create indexes, and compare data from different schemas.
Expertise in designing and developing ETL Informatica Mappings, Sessions and Workflows
using Informatica 8.x, and 9.x (Designer, Workflow Manager and Workflow Monitor).
Extensive experience developing and testing Extract, Transform and Load (ETL) processes
using Informatica PowerCenter, including full life cycle data warehouse implementations.
Worked on Slowly Changing Dimensions (SCDs) to keep track of historical data.
Proficiency in data warehousing techniques for data cleansing, surrogate key assignment
and Change data capture (CDC).
Proficient in using Informatica tools like Informatica Designer, Repository Manager, Workflow
Manager and Workflow Monitor.
Expertise in implementing complex business rules by creating re-usable transformations,
Mapplets and Mappings.
Optimized solutions using various performance tuning methods: SQL tuning, ETL tuning
(optimal configuration of transformations, targets, sources, mappings and sessions), and
database tuning using indexes, partitioning, materialized views, procedures and functions.
Extensively performed Performance Tuning of sources, targets and mappings in
Informatica.
Extensively used Control-M for scheduling UNIX shell scripts and Informatica workflows.
Extensive knowledge in all areas of Project Life Cycle including requirements analysis,
system analysis, design, development, documentation, testing, implementation and
maintenance.
Strong analytical, verbal, written and interpersonal skills.
Proactive in learning new technologies and updating my skills.
Experience in applications data integration, data warehousing and BI tools
(Informatica, OBIEE 11g & Tableau), various databases (Oracle, Postgres, MS SQL Server,
MySQL), and various enterprise-level applications (EBS, SFDC, EDW).
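As an illustrative sketch of the SCD and surrogate-key techniques listed above (not code from any of the projects below; all names and structures are hypothetical), Type 2 slowly changing dimension logic can be expressed as:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended expiry date for the current row

def apply_scd2(dim_rows, incoming, today, next_key):
    """Apply one incoming record to a Type 2 dimension (list of dicts).

    Each row carries: surr_key, nat_key, attrs, eff_from, eff_to, is_current.
    """
    current = next((r for r in dim_rows
                    if r["nat_key"] == incoming["nat_key"] and r["is_current"]), None)
    if current and current["attrs"] == incoming["attrs"]:
        return dim_rows                      # no attribute change: nothing to do
    if current:                              # attributes changed: expire current version
        current["eff_to"] = today
        current["is_current"] = False
    dim_rows.append({                        # insert the new version with a new surrogate key
        "surr_key": next_key,
        "nat_key": incoming["nat_key"],
        "attrs": incoming["attrs"],
        "eff_from": today,
        "eff_to": HIGH_DATE,
        "is_current": True,
    })
    return dim_rows
```

Each change to a tracked attribute expires the current row and inserts a new version under a fresh surrogate key, so full history is preserved.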
Strong experience in development of mission-critical systems using BI, Oracle PL/SQL,
Informatica 9.1, D2K, OBIEE, Tableau, scripting (shell & Python) and Java technologies
(JSP/Servlets/XML APIs).
Ability to develop, implement, test and maintain software applications in a self-organized
team environment.
Expertise in designing and developing ETL jobs using Talend.
Expertise in maintenance and support of applications developed using Talend.
Strong experience in development of Big Data-related jobs using Talend.
Expertise in scheduling jobs using Talend TAC.
Highly motivated self-starter focused on meeting project deadlines without
compromising quality.
Strong experience delivering time-critical projects on schedule while ensuring
performance and adherence to quality standards.
Strong experience with Tableau Desktop and Tableau dashboards.
Strong experience with Tableau development, support and deployment to production.
Strong experience working with Python.
Gathered and analyzed business requirements and produced specification documents
for new project requests.
Developed ad-hoc queries and reports for clients and provided analytical support and
recommendations as necessary.
Acted as liaison between business units and report developers to develop, communicate and
modify existing reports.
Developed analyses for ops reviews, management reports, business cases, strategic
planning, and ad hoc quantitative and qualitative analyses.
Worked closely with the project management team to analyze and consolidate request
requirements, improving the decision planning process and meeting project deadlines.
Defined data requirements and report layouts for the business to review.
Created testing plans and scripts based on requirements.
Partnered with project managers, development managers and business owners to
proactively understand ongoing projects (business and technical changes) and identify
project risks.
Technical Skills:
ETL/Middleware Tools: Informatica PowerCenter 10.1/9.x, Talend, BODS, Hadoop HDFS, Big Data
Data Modelling: Dimensional data modelling, star join schema modelling, snowflake modelling, fact and dimension tables, physical and logical data modelling
Business Intelligence Tools: Tableau, OBIEE 11g/10.1.3.x
Big Data Technologies: Hadoop, HDFS, Hive, Sqoop, Pig
RDBMS: Oracle 10g/9i, MS SQL Server, MySQL, MS Access
MPP Databases: Greenplum
In-Memory Database: SAP HANA
Programming Skills: SQL, Oracle PL/SQL, Unix shell scripting, Python, Java
Modelling Tools: Erwin 4.1/5.0, MS Visio
Tools: TOAD, SQL*Plus, SQL*Loader, Quality Assurance, SoapUI, SharePoint
Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0
Scheduling Tools: Control-M, Informatica scheduler, Talend TAC
Methodologies: Data modelling (logical/physical/dimensional), star/snowflake schemas, fact and dimension tables, Software Development Life Cycle
Experience
Sr. Application Developer
Firsttech Credit Union, Mountain View, CA
Sep-16 to present
Responsibilities:
Analyze data requirements for the Firsttech EDW project to build Talend jobs.
Attend meetings with business users to gather business requirements and provide data
accordingly, without discrepancies.
Work with other team members to coordinate data flow into the EDW.
Develop and maintain Talend jobs for ETL Data load process.
Import data from different source systems into HDFS and load it into Hive tables
using Talend components.
Perform Quality Assurance on the ETL code, Database objects, and help migrate it to higher
environments like SIT and PROD.
Interact with vendors to set up SFTP (Secure File Transfer Protocol) jobs that land
flat-file sources in HDFS staging layers and, finally, in the downstream Enterprise
Data Warehouse.
Environment: Talend, Salesforce, SVN, Oracle, Hadoop HDFS, Sqoop, Hive, Talend TAC, Unix.
Sr. Application Developer & Lead
VMware Inc., Bangalore, India
Oct-11 to Sep-16
Responsibilities
Responsible for delivering BI solutions and successfully executing BI projects at VMware.
Helping cross-functional teams address business or systems issues, resolve complex
analyses and come up with leading-edge solutions to business problems.
Working with business teams to understand functional requirements and resolving issues
throughout the development, testing and deployment phases of the project lifecycle.
Mentoring the BI team and training resources on DW skills, BI tools and new technologies
such as Big Data.
Analysis of Business Requirement Documents (BRDs).
Creation of High Level Design (HLD) and Low Level Design (LLD) documents
Coding of new programs, enhancement of existing programs.
Unit testing, System integration testing, support to user acceptance testing.
Participation in emergency fix, move to production and warranty phases of Project.
Participation in architecture reviews to ensure that the solutions comply with standards and
use approved technologies
Design, development, configuration, integration and functional contributions across
horizontal domains such as Informatica, PL/SQL, Python & shell scripting, and vertical
domains such as data warehousing and business intelligence.
Delivered business-critical reports using OBIEE and Tableau.
Responsible for subject Areas related to Revenue Reporting, Renewals, Enterprise Data
Governance, Xlrate, Bookings, GreenPlum Migration, Advanced Analytics, and SAP HANA.
Designed the complete ETL flow, including the forward and backward flows.
Parameterized the whole process using parameter files for the variables.
Created profile files and shell scripts for creating dynamic parameter files.
Created the operations guide and SDS documents for the Informatica mappings and
workflows as part of end user training
Responsible for delivering mission-critical reports using Python, with the different
modules used as below.
1. Primarily, Python is used to deliver reports to business users involving huge data
sets.
2. Notifying users in case of any delays with loads.
3. Implementing a messaging service that keeps checking for new processing requests.
4. Processing data from Bugzilla into the EDW system.
5. Extracting data from Greenplum.
6. Implementing transaction processing for Greenplum requirements.
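The load-delay notification described in item 2 can be sketched minimally in Python; the grace period and load names are hypothetical, not taken from the actual system:

```python
from datetime import datetime, timedelta

def overdue_loads(expected, completed, now, grace=timedelta(hours=1)):
    """Return the names of loads that should trigger a delay notification.

    expected: {load_name: expected completion datetime}
    completed: set of load names already finished
    A load is overdue once `now` is past its expected time plus a grace period.
    """
    return sorted(
        name for name, due in expected.items()
        if name not in completed and now > due + grace
    )
```

In a scheduler, the returned names would feed an email or messaging step notifying the affected business users.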
Environment: Informatica, Talend, Salesforce, Perforce, Oracle, Hadoop HDFS,
Sqoop, Hive, Talend TAC, Unix, Greenplum, SAP HANA, Unix shell script, Python,
SQL Server, MySQL, Flat Files, Control-M, Toad, Erwin, Tableau & OBIEE.
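The dynamic parameter files mentioned in this role can be sketched as follows. The folder, workflow and variable names are illustrative assumptions, though the `[Folder.WF:workflow]` section header follows the general PowerCenter parameter-file layout:

```python
def build_param_file(folder, workflow, params):
    """Render one Informatica PowerCenter parameter-file section.

    folder, workflow and params are placeholders; real values would come
    from the deployment environment at run time.
    """
    lines = [f"[{folder}.WF:{workflow}]"]          # section header for the workflow
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

# Example: regenerate the file daily with the current run date.
text = build_param_file(
    "EDW", "wf_daily_load",
    {"$$RUN_DATE": "2016-09-01", "$$SRC_SCHEMA": "STG"},
)
```

A wrapper shell script would write this text to disk and pass the file to `pmcmd` when starting the workflow.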
Programmer Analyst
Warner Music,
London & Bangalore.
Oct-08 to Sep-11
Responsibilities
Consulting role in designing client applications at the client site.
Red side testing of the application
Production support activities.
Analysis of Business Requirements (BRDs).
Creation of High Level Design (HLDs)
Fully responsible for preparing Low Level Design (LLDs).
Fully responsible for System Integration Testing.
Fully responsible for supporting User Acceptance Testing.
Writing shell scripts (configuration scripts, variable scripts) and FTP/SFTP
configuration.
Performance tuning of developed code
Analysis of Business Requirement Documents (BRD).
Creation of High Level Design (HLD) and Low Level Design (LLD) documents
Coding of new programs, enhancement of existing programs.
Unit testing, system integration testing, support to user acceptance testing.
Participation in emergency fix, move to production and warranty phases of Project.
Participation in architecture reviews to ensure that the solutions comply with standards and
use approved technologies
Design, development, configuration, integration and functional contributions across
horizontal domains such as Informatica, PL/SQL, Perl & shell scripting, and vertical
domains such as data warehousing and business intelligence.
Environment: Informatica, Perforce, Oracle, Unix, Unix shell script, SQL Server,
Toad, Erwin.
Programmer Analyst
Proxy CP (British Telecom)
London, UK
May-06 to Sep-08
Responsibilities
Consulting role in designing client applications at the client site.
Red side testing of the application
Production support activities.
Analysis of Business Requirements (BRDs).
Creation of High Level Design (HLDs)
Fully responsible for preparing Low Level Design (LLDs).
Fully responsible for System Integration Testing.
Fully responsible for supporting User Acceptance Testing.
Writing shell scripts (configuration scripts, variable scripts) and FTP/SFTP
configuration.
Performance tuning of developed code
Analysis of Business Requirement Documents (BRD).
Creation of High Level Design (HLD) and Low Level Design (LLD) documents
Coding of new programs, enhancement of existing programs.
Unit testing, system integration testing, support to user acceptance testing.
Participation in emergency fix, move to production and warranty phases of Project.
Participation in architecture reviews to ensure that the solutions comply with standards and
use approved technologies
Design, development, configuration, integration and functional contributions across
horizontal domains such as Informatica, PL/SQL, Perl & shell scripting, and vertical
domains such as data warehousing and business intelligence.
Environment: Oracle, Unix scripts, Toad, SQL Server.
Kenan FX/BP – Telecom Billing Product
Delhi, India
Jun-05 to Feb-06
Responsibilities
Maintenance of a tool called ADEP, developed in C, Perl and shell scripts, which
automates the entire billing and invoicing operation.
Automation of BIP, HDP and discounts using Perl.
Installation of the Kenan FX/BP product.
Review & Validation of code.
Fixing code defects/bugs.
Performance tuning of developed code.
Responsible for the Move to Production phase (moving data from the development to the
production environment).
Environment: Oracle, Unix, Unix shell script & SQL Server.
Flexcube – Banking Product
Bangalore, India
Aug-03 to Sep-04
Responsibilities
Requirement Analysis.
Design and preparation of Functional specification.
Development, Support to Testing
DESCRIPTION: UFJ Bank, one of the leading banks in Japan, has incorporated FLEXCUBE
into its systems as a back-office system for its corporate banking operations.
Project Profile: The required enhancements were in the Loans and Deposits module. UFJ
wanted “Reversal and Revaluation of Payments using User Defined Accounts and GLs”.
These were complex enhancements, since the existing functionality of the Loans and
Deposits module had to be changed in a major way.
Academics and Certifications
- B.E (Mechanical)
- Attended training on CMM and the AMP middleware tool.