Bharath Hadoop Resume

Bharath Kumar Rapolu
Contact No.: +91-9885363653
E-Mail: bharathrapolu.kumar@gmail.com
Professional Summary:
• 4.6+ years of overall IT experience in application development with PL/SQL and Big Data (Hadoop).
• 1.5 years of dedicated experience with Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
• Extensive experience in setting up Hadoop clusters.
• Good working knowledge of MapReduce and Apache Pig.
• Wrote Pig scripts and Pig UDFs to reduce job execution time.
• Experience in creating Hive external and managed tables and writing queries against them.
• Wrote Hive UDFs for specific functionality (see the HiveQL sketch after this list).
• Experience importing and exporting data to and from RDBMSs using Sqoop CLI commands.
• Scheduled MapReduce, Pig and Sqoop jobs in Apache Oozie.
• Experience writing procedures, functions, triggers, indexes and packages in RDBMSs.
• Wrote SQL queries for DDL and DML operations.
• Experience importing and exporting data from text files and Excel sheets in SQL Server.
• Knowledge of fact and dimension tables in RDBMSs.
• Experience in performance tuning and query optimization in RDBMSs.
• Experience in requirement analysis and table design.
• Knowledge of Pentaho Report Designer.
• Effective team player with the ability to work under time constraints.
• Good interpersonal communication and technical documentation skills.
• Knowledge of using hints in performance tuning.
• Knowledge of Flume and NoSQL.
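As one concrete illustration of the Hive UDF work above, here is a minimal HiveQL sketch of registering and calling a custom UDF; the jar path, class, function and table names are hypothetical:

    -- Register a custom UDF packaged in a jar (hypothetical path and class),
    -- then call it like a built-in function.
    ADD JAR /home/hduser/udfs/string-utils.jar;
    CREATE TEMPORARY FUNCTION normalize_sku
      AS 'com.example.hive.udf.NormalizeSku';

    -- Use the UDF in an ordinary query (hypothetical table).
    SELECT normalize_sku(sku), price
    FROM   retail_prices;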
Professional Experience:
• Currently working as an IT Analyst at TCS (Tata Consultancy Services), Hyderabad, India, since Jul 2011.
Qualifications:
• Bachelor of Technology from SASTRA University, Thanjavur, Tamil Nadu, with a CGPA of 7.7/10.
Technical Skills:
Languages         : Core Java, SQL, PL/SQL, MapReduce, Pig, Hive, Sqoop, HBase
Servers           : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks        : Hadoop and .NET
IDEs              : Eclipse Europa 2008, PL/SQL Developer
Version Control   : Visual Source Safe (VSS)
Databases         : DB2 9.x, MySQL, SQL Server, Oracle (SQL: DDL, DML, DCL; and PL/SQL)
Operating Systems : Windows 7, Windows XP, 2000, 2003, Unix and Linux
Project Details:
PROJECT #3:
Project Name : BestBuy – Web Intelligence
Client       : BestBuy, Minneapolis, Minnesota, USA
Environment  : Hadoop, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL
Duration     : Nov 2014 to date
Role         : Hadoop Developer
Description:
This project involved rehosting BestBuy's existing competitive-intelligence application on the Hadoop platform. Previously, BestBuy stored its competitors' retail information (the crawled web data) in a MySQL database. Initially BestBuy tracked only four competing retailers, such as Amazon.com and Walmart.com, but as the number of competitor retailers grew, the volume of data generated by web crawling increased massively and could no longer be accommodated in MySQL. For this reason BestBuy moved the application to Hadoop, which handles massive amounts of data across its cluster nodes and satisfies the scaling needs of BestBuy's business operations.
Roles and Responsibilities:
• Moved all crawl-data flat files generated from various retailers to HDFS for further processing.
• Wrote Apache Pig scripts to process the HDFS data.
• Created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to move data between Pig and the MySQL database.
• Wrote script files for processing data and loading it into HDFS.
• Wrote HDFS CLI commands.
• Developed UNIX shell scripts to create reports from Hive data.
• Fully involved in the requirement analysis phase.
• Created two separate users (hduser for HDFS operations and mapred for MapReduce operations only).
• Ensured NFS was configured for the NameNode.
• Set up passwordless SSH for Hadoop.
• Set up cron jobs to delete Hadoop logs, old local job files and cluster temp files.
• Set up Hive with MySQL as a remote metastore.
• Moved all log/text files generated by various products into HDFS.
• Wrote MapReduce code that takes log files as input, parses the logs and structures them in tabular format to facilitate effective querying of the log data.
• Created an external Hive table on top of the parsed data (see the HiveQL sketch below).
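A minimal HiveQL sketch of such an external table over the parsed MapReduce output; the schema, table name and HDFS path are hypothetical, assuming the job emits tab-separated fields:

    -- External table over parsed log output (hypothetical schema and path);
    -- dropping the table leaves the files in HDFS untouched.
    CREATE EXTERNAL TABLE parsed_logs (
      event_time STRING,
      product    STRING,
      log_level  STRING,
      message    STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/user/hduser/logs/parsed';

    -- Example query enabled by the tabular structure.
    SELECT product, COUNT(*) AS error_count
    FROM   parsed_logs
    WHERE  log_level = 'ERROR'
    GROUP  BY product;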
PROJECT #2: SERP (Society for Elimination of Rural Poverty)
Client       : Govt. of Andhra Pradesh and Telangana
Project Title: SthreeNidhi
Environment  : Windows XP Professional, Windows 7
Duration     : Sep 2013 to Dec 2014
Role         : PL/SQL Developer
Team Size    : 12
Tools        : SQL Server Management Studio
Description: SERP's mission is to enable disadvantaged communities to perceive possibilities for change and to bring about the desired change by exercising informed choices through collective action.
- The disadvantaged communities shall be empowered to overcome all social, economic, cultural and psychological barriers through self-managed organizations.
SthreeNidhi Credit Cooperative Federation Ltd. is promoted by the Government and the Mandal Samakhyas to supplement credit flow from the banking sector and is a flagship programme of the Government. SthreeNidhi provides timely and affordable credit to poor SHG members as part of SERP's overall strategy for poverty alleviation.
SHGs can access hassle-free credit from SthreeNidhi as and when required using their mobile phones, and therefore see no need to borrow from other sources at usurious rates of interest. SthreeNidhi can extend credit to SHGs even in far-flung areas of the state within 48 hours, meeting credit needs for exigencies such as health and education as well as income-generation activities such as agriculture and dairy. As credit availability is linked to the grading of MSs and VOs, the community is keen to improve their functioning in order to access higher credit limits from SthreeNidhi.
Contribution:
• Performed DBA activities such as creating and maintaining tables.
• Exported and imported data from text, CSV and Excel files.
• Involved in designing tables for screens.
• Analyzed requirements and attended client meetings.
• Developed procedures, functions and views for report generation (a T-SQL sketch follows this list).
• Developed user-defined functions used throughout the solution.
• Created and maintained indexes and views for performance tuning.
• Developed fact tables for analysis reports.
• Resolved defects at production time.
• Applied hints for performance tuning.
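A minimal T-SQL sketch of the kind of report procedure this project involved; the procedure, table and column names are hypothetical:

    -- Hypothetical report procedure: SHG loan disbursements in a date range.
    CREATE PROCEDURE dbo.usp_DisbursementReport
        @FromDate DATE,
        @ToDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT shg_id,
               SUM(amount) AS total_disbursed
        FROM   dbo.LoanDisbursements
        WHERE  disbursed_on BETWEEN @FromDate AND @ToDate
        GROUP  BY shg_id;
    END;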
PROJECT #1: TCS iON Education Solution
Client       : TCS Internal
Customers    : Manav Rachna International University, SASTRA University and several other engineering colleges and universities
Environment  : Windows XP Professional, Oracle 11g
Duration     : Aug 2011 to Jun 2013
Role         : PL/SQL Developer
Team Size    : 6
Tools        : Pentaho Report Designer
Description: TCS iON, a cloud-based ERP solution, was conceptualized by TCS through close interaction with Small and Medium Businesses (SMBs) and relevant stakeholders.
- The iON Education Solution offers a wide range of cloud-based solutions whose footprint covers the entire value chain of the education ecosystem: K-12 schools, affiliated colleges, vocational institutes, boards and universities.
- It covers student life-cycle management, attendance management, fee management and academic operations management.
Contribution:
• Developed pre-configured and on-demand reports.
• Used Pentaho Report Designer to design report formats.
• Analyzed requirements and designed reports.
• Developed user-defined functions used throughout the solution (a PL/SQL sketch follows this list).
• Involved in customer conversations.
• Followed up with the QA team during the testing phase.
• Developed and maintained PL/SQL procedures and triggers.
• Created indexes and views for performance tuning.
• Documented good practices and logic for future reference.
• Developed fact tables for on-demand reports.
• Resolved defects at production time.
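A minimal Oracle PL/SQL sketch of a reusable user-defined function of the kind used across these reports; all names and the attendance schema are hypothetical:

    -- Hypothetical reusable function: a student's attendance percentage,
    -- callable from any report query.
    CREATE OR REPLACE FUNCTION fn_attendance_pct (
        p_student_id IN NUMBER,
        p_term       IN VARCHAR2
    ) RETURN NUMBER
    IS
        v_present NUMBER;
        v_total   NUMBER;
    BEGIN
        SELECT COUNT(CASE WHEN status = 'P' THEN 1 END), COUNT(*)
        INTO   v_present, v_total
        FROM   attendance
        WHERE  student_id = p_student_id
        AND    term       = p_term;

        IF v_total = 0 THEN
            RETURN NULL;
        END IF;

        RETURN ROUND(100 * v_present / v_total, 2);
    END fn_attendance_pct;
    /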
