VENU_Hadoop_Resume

VENUGOPAL CHIPPA 
E-Mail: venuzone@gmail.com 
Contact No.: 9000134110 
Seeking a Hadoop Technical Lead position at a highly reputed organization where I can maximize my technical and management skills. 
PROFILE SUMMARY 
 4 years of experience in the IT industry, spanning Big Data (Hadoop) and Data Warehousing technologies. 
 Hands-on experience with Big Data core components and the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Flume, Oozie, HBase, Pig, Core Java). 
 Hands-on experience in configuring, developing, monitoring, debugging and performance tuning of Big Data applications on Hadoop. 
 Deep experience with distributed systems and large-scale non-relational data stores. 
 Experience in Informatica ETL processes. 
 Experience in handling enterprise data warehouses: development, support and maintenance. 
 Worked closely with customers to provide solutions to various problems. 
 Quick learner and self-starter, able to pick up any new technology. 
ORGANISATIONAL EXPERIENCE 
 Working with Tech Mahindra (formerly Satyam Computer Services Ltd) from Feb 2011 till date 
Education Details 
 B.Tech from JNT University in the year 2010 
Technical Skills 
 Big Data/Hadoop ecosystem : Hadoop, HDFS, MapReduce, Hive, HiveQL, Pig, Sqoop, HBase, Oozie, Flume, Splunk 
 ETL Tools : Informatica 9.1/8.x 
 Reporting tool : Business Objects 
 Databases : Oracle 11g/10g/9i/8i 
 Data Management tools : SQL*Plus, SQL Developer, Enterprise Manager, Toad 
 DB Languages : SQL 
 NoSQL : HBase, Cassandra 
 Operating Systems : Linux (CentOS), UNIX, Windows 98; VMware 
Professional Experience: 
Hadoop (Big Data) 
Type: Development 
Customer: GE Atlanta, USA 
Offshore: Mahindra Satyam Computer Services Limited, Hyderabad 
Tools/Platform: Big Data, Hadoop, HDFS, Hive, HiveQL, Pig, Sqoop, Flume, Oozie, Oracle 
Responsibilities: 
 Analyzed client systems and gathered requirements. 
 Designed technical solutions based on client needs and systems architecture. 
 Collected data from Teradata and pushed it into Hadoop using Sqoop. 
 Pulled logs from the log server and staged them on an FTP server hourly; pushed this data into Hadoop and then purged it from the FTP server. 
 Pre-processed data using MapReduce and stored it into Hive tables. 
 Took care of the implementation of the Big Data solution. 
 Extracted data from various log files (Apache weblog, Adobe log, hit-data log) and loaded them into HDFS. 
 Developed and built Hive tables using HiveQL (illustrative DDL after this list). 
 Developed HiveQL scripts for analysis. 
 Tuned Hive queries for best performance. 
 Wrote Apache Pig scripts for data analysis on top of HDFS files. 
 Extracted data from the Oracle RDBMS into Hadoop using Sqoop (a minimal command sketch appears after this list). 
 Used Hive on the RDBMS data for analysis and stored the results back to the database. 
 Used Oozie for scheduling. 
 Provided solutions to various problems in Hadoop. 
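The Sqoop imports referenced above followed this general shape. A minimal sketch, assuming hypothetical connection details and names (oracle-host, etl_user, PARTS_MASTER, staging.parts_master); the real project values are not part of this resume.

# Illustrative Sqoop import: pull an Oracle table into a Hive table.
# Host, user, source table and Hive database/table names are placeholders.
sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host:1521/ORCL \
  --username etl_user -P \
  --table PARTS_MASTER \
  --hive-import \
  --hive-table staging.parts_master \
  --num-mappers 4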
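The Hive work above (building tables with HiveQL, registering pre-processed data, and tuning queries) typically looked like the sketch below. The table name, columns, HDFS paths and partition scheme are assumptions for illustration, not the project's actual schema.

-- Illustrative HiveQL: external, partitioned table over pre-processed weblog output in HDFS.
CREATE EXTERNAL TABLE IF NOT EXISTS weblogs (
  ip     STRING,
  ts     STRING,
  url    STRING,
  status INT,
  bytes  BIGINT
)
PARTITIONED BY (log_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/weblogs/cleaned';

-- Register a day's output once the MapReduce pre-processing step has landed it.
ALTER TABLE weblogs ADD IF NOT EXISTS PARTITION (log_date='2013-01-01')
LOCATION '/data/weblogs/cleaned/2013-01-01';

-- Typical analysis query: top requested URLs for one day.
-- Filtering on the partition column keeps the scan small, the usual first tuning step.
SELECT url, COUNT(*) AS hits
FROM weblogs
WHERE log_date = '2013-01-01'
GROUP BY url
ORDER BY hits DESC
LIMIT 20;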
Hadoop: General Electric (GE) 
Type: Development 
Customer: General Electric, Cincinnati, OH, USA 
Offshore: Mahindra Satyam Computer Services Limited, Hyderabad 
Tools/Platform: Big Data, Hadoop, HDFS, Hive, HiveQL, Apache Pig, Sqoop and Oozie 
Responsibilities: 
 Developed Pig Latin scripts to extract data from web server output files and load it into HDFS (see the Pig Latin sketch after this list) 
 Developed Pig UDFs to pre-process the data for analysis 
 Developed an Oozie workflow to automate loading the data into HDFS and pre-processing it with Pig (illustrative workflow below) 
 Developed Hive queries for the analysts 
 Moved data using Flume 
 Provided solutions to various problems in Hadoop 
 Developed and built Hive tables using HiveQL 
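The Pig Latin scripts and UDFs mentioned above generally followed this pattern: load the raw web server output, drop bad records, clean fields with a custom UDF, and store the result in HDFS for Hive. The paths, field layout, jar name and the CleanUrl UDF are hypothetical placeholders.

-- Illustrative Pig Latin: clean web server output and land it in HDFS.
REGISTER 'weblog-udfs.jar';                    -- hypothetical jar holding custom UDFs
DEFINE CleanUrl com.example.pig.CleanUrl();    -- hypothetical cleansing UDF

raw     = LOAD '/data/weblogs/raw' USING PigStorage('\t')
            AS (ip:chararray, ts:chararray, url:chararray, status:int, bytes:long);
valid   = FILTER raw BY status IS NOT NULL AND status < 500;
cleaned = FOREACH valid GENERATE ip, ts, CleanUrl(url) AS url, status, bytes;

STORE cleaned INTO '/data/weblogs/cleaned' USING PigStorage('\t');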
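An Oozie workflow of the kind described above would chain such a Pig step behind the HDFS load. This workflow.xml is a sketch only; the app name, script path, parameters and node names are assumptions, not the production workflow.

<!-- Illustrative Oozie workflow: run the Pig cleansing script over newly landed data. -->
<workflow-app name="weblog-preprocess" xmlns="uri:oozie:workflow:0.4">
  <start to="clean-logs"/>
  <action name="clean-logs">
    <pig>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <script>clean_weblogs.pig</script>
      <param>INPUT=/data/weblogs/raw</param>
      <param>OUTPUT=/data/weblogs/cleaned</param>
    </pig>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Pig pre-processing failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>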
Title : Shop floor Systems 
Client : GE Energy 
Role : Informatica Developer 
Project Description: 
GE Energy is one of the 12 major businesses of General Electric Company, USA, and is the world's leading supplier of power generation technology, services and energy management systems. GE Energy serves customers globally through a network of local offices and service centers, with three regional headquarters: GE Energy Asia-Pacific in Hong Kong, GE Energy Europe in London and GE Energy Americas in Schenectady, NY. 
GE Energy includes 12 manufacturing centers, nearly 22,200 employees, 130 sales and service engineering offices, and 58 apparatus service centers worldwide. The application relates turbine data to part counts; the ETL loads run in Informatica and the reports are built with BO 6.5. 
Contribution: 
As an ETL Developer, I was responsible for 
 Designed the Extraction, Transformation & Load (ETL) environment 
 Involved in requirements study and understanding the functionality 
 Extensively worked on transformations like Filter, Expression, Router, Lookup, Update Strategy and Sequence Generator 
 Analyzed specifications provided by clients 
 Created various transformations such as Connected and Unconnected Lookups, Router, Aggregator, Joiner, Update Strategy, etc. 
 Created and monitored Informatica sessions 
 Prepared low-level mapping documents for the ETL build 
 Conducted knowledge-sharing sessions and helped the team understand the requirements clearly 
 Interacted well with the team and business users 
Environment: Informatica PowerCenter 8.1 and 9.1, Windows XP, Oracle 10g/11g, SQL, UNIX 
Title : MITTR01 
Client : GE Energy 
Role : Informatica Developer
Project Description: 
The primary objective of MITT is to bring together information from disparate legacy systems and put it into a format conducive to making business decisions. The data is usually fed from COPICS tables into an FTP server, from where it is picked up by scripts and loaded into the base tables and the rollup tables. MITT also fetches data from other source systems such as TIMES and COSDOM. 
Data from the various source systems is extracted primarily on a daily and weekly basis and is also used in Finance and Engineering team reports. The Functional and IM teams are currently notified by email if a weekend or daily load fails. 
Contribution: 
 Created mappings using Informatica 
 Analyzed data issues in the data marts and recommended solutions 
 Analyzed ETL mappings and made changes to optimize the loads 
 Experience with ETL development (Informatica) 
 Experience with relational databases (Oracle 9i/10g/11g) 
 Experience developing and supporting ETL applications 
 Responsible for delivering work on time and addressing customer challenges 
Environment: Informatica PowerCenter 8.1 and 9.1, Windows XP, Oracle 10g/11g, SQL, UNIX 
Personal Details: 
Name : Venugopal Chippa 
Email ID : venuzone@gmail.com 
Passport No : J1238807 (valid up to 20/10/2020) 
Location : Hyderabad 
Contact Numbers : +91 9000134110 
Date: 
Hyderabad (Venugopal)
