Vishnu_HadoopDeveloper

Vishnu C
E-mail: vishnuch661@gmail.com
Mobile: +91-9550684116
Skills Summary
 5+ years of IT experience in application development in Java and Big Data Hadoop.
 Working knowledge of object-oriented programming.
 2+ years of exclusive experience in Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop and Oozie.
 Knowledge of setting up Hadoop clusters.
 Good working knowledge of MapReduce, Apache Pig and Sqoop.
 Good expertise and experience in web application development, with sound knowledge of Core Java, JDBC, JSP, Servlets, Struts and Hibernate.
 Well experienced in designing and developing both server-side and client-side applications.
 Good understanding of IBM WebSphere Application Server 10.3 and Apache Tomcat 6.0 in the areas of deployment and configuration.
 Working experience with MySQL and Oracle databases.
 Experience with IDEs such as Eclipse and NetBeans; worked efficiently with version control systems such as CVS and SVN.
 Knowledge of Flume, HBase, Spark and Scala.
 Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.
 Exceptional ability to learn new concepts; hard-working and enthusiastic.
Professional Experience
 Currently working as a Hadoop Developer at Capgemini, Hyderabad, from Nov 2013 to date.
 Previously worked as a System Engineer at Orange Business Services India Technology Private Limited, Mumbai, from Jan 2013 to Oct 2013.
 Previously worked as a Software Engineer at Ventech, Hyderabad, from June 2011 to November 2012.
Educational Qualifications
 MCA (Master of Computer Applications), 2011, from JNTU Kukatpally, Hyderabad, with an aggregate of 79%.
Technical Skills
Language : Java.
Big Data : MapReduce, Pig, Sqoop, Hive and HDFS.
J2EE : JDBC, Servlets, JSP.
Web Framework : Struts 1.0.
ORM Framework : Hibernate 3.3.
Java/JEE Framework : Spring (awareness).
Servers : Tomcat 6.0 and WebSphere 10.3.
Databases : Oracle 10g, MySQL.
IDEs : Eclipse 3.4, NetBeans.
Operating Systems : Windows 2000/XP, UNIX and Linux.
Tools : Log4j, SVN.
Web Programming : HTML, JavaScript, Ajax, jQuery, CSS.
Project Profile
Project#4
Organization : Capgemini
Project Title : Analyze Cost & Performance
Client : Element Financial Services, USA.
Team Size : 12.
Role : Hadoop Developer.
Environment : Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Oozie, Java, Unix, Struts 1.2, Hibernate 3.3, MySQL.
Period : Jan'14 to date.
Description:
Analyze is a web-based application. The aim of the project is to let users view dashboards, graphs and flow charts of vehicle cost and performance on a tier-wise basis. The cost of a vehicle is based on fuel costs, maintenance costs, accident costs, etc., and performance is based on metrics such as kilometres per litre.
Responsibilities:
 Moved all log/text files generated by various components into an HDFS location.
 Wrote MapReduce programs to process the HDFS data.
 Wrote Apache Pig scripts to process the HDFS data.
 Developed Sqoop scripts to move data between Pig and the MySQL database.
 Wrote script files for processing data and loading it into HDFS.
 Created Hive tables to store the processed results in a tabular format.
 Created external Hive tables on top of the parsed data.
 Involved in writing Oozie jobs.
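As an illustration of the per-record map logic described above, the following is a minimal standalone Java sketch (not code from the project): it parses one cost log record and sums its cost fields. The comma-separated layout and the field names are assumptions for illustration, and the class has no Hadoop dependencies so it can run without a cluster.

```java
// Hypothetical record layout (assumption, not the project's actual format):
//   vehicleId,fuelCost,maintenanceCost,accidentCost
public class CostRecordParser {

    /** Sums the three cost fields of one record; returns -1 for malformed lines. */
    public static double totalCost(String line) {
        String[] fields = line.split(",");
        if (fields.length != 4) {
            return -1; // a real mapper would instead bump a "bad record" counter
        }
        try {
            double fuel = Double.parseDouble(fields[1]);
            double maintenance = Double.parseDouble(fields[2]);
            double accidents = Double.parseDouble(fields[3]);
            return fuel + maintenance + accidents;
        } catch (NumberFormatException e) {
            return -1;
        }
    }

    public static void main(String[] args) {
        System.out.println(totalCost("V123,50.0,20.5,0.0")); // prints 70.5
    }
}
```

In an actual MapReduce job this logic would sit inside a `Mapper`, emitting the vehicle id as the key and the computed cost as the value for aggregation in the reducer.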
POC:
Sensex Log Data Processing
Environment : Hadoop, HDFS, MapReduce, Apache Pig, Sqoop, Java, Unix, Struts, Hibernate, MySQL.
Role : Hadoop Developer.
Hardware : Virtual Machines, UNIX.
Period : Nov’13 to Jan’14.
Description:
The purpose of the project is to store terabytes of log information generated by the Sensex website and extract meaningful information from it. The solution is based on the open-source big-data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs and Pig, and the processed data is exported to MySQL using Sqoop. Struts is then used to generate graphs from the data in the database.
Responsibilities:
 Moved all PDF data files generated from Sensex reports to HDFS for further processing.
 Wrote MapReduce programs to process the HDFS data.
 Wrote Apache Pig scripts to process the HDFS data.
 Developed Sqoop scripts to move data between Pig and the MySQL database.
 For the dashboard solution, developed the controller, service and DAO layers on the Struts framework.
 Fully involved in the requirement analysis phase.
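The count-style aggregation that such a MapReduce job performs over log lines can be sketched in standalone Java (again an illustrative sketch, not project code): the map phase emits a key per record and the reduce phase sums per key; here a `HashMap` stands in for the shuffle, and the "symbol timestamp price" record layout is an assumption.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Standalone sketch of map + reduce aggregation over log lines.
// Assumed record layout: "symbol timestamp price" (illustration only).
public class SensexCountSketch {

    /** Counts how many log records mention each stock symbol. */
    public static Map<String, Integer> countBySymbol(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {                 // "map": emit (symbol, 1)
            String symbol = line.split("\\s+")[0];
            counts.merge(symbol, 1, Integer::sum);  // "reduce": sum per key
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> lines = List.of(
                "TCS 2014-01-02 2150.5",
                "INFY 2014-01-02 3480.0",
                "TCS 2014-01-03 2162.0");
        System.out.println(countBySymbol(lines).get("TCS")); // prints 2
    }
}
```

On a cluster the per-key summation runs in parallel across reducers; the `HashMap` here only simulates that grouping for a single process.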
Project#3
Organization : Orange Business Services India Technology Pvt LTD.
Project Title : NBR Component
Client : WorldBank, US.
Team Size : 15.
Role : Application Developer.
Environment : JDK 1.7, JSP, Struts 2.0, Hibernate 3.3, Oracle, Tomcat 6.0, HTML, Ajax, JavaScript, jQuery, Eclipse.
Period : Jan’13 to Oct’13.
Description:
NBR Component is a web-based application. The aim of the project is to download World Bank users' recorded meetings from the WebEx cloud to a repository server, transfer the downloaded meetings from the repository server to a PDMZ server, and send a mail notification to the administrator and the owner of each recorded meeting at each step. The application also applies ACL permissions to the downloaded recordings, among other operations.
Responsibilities:
 Involved in client meetings.
 Responsible for writing action classes using Struts 2.0.
 Responsible for writing Hibernate configuration and mapping files.
 Involved in writing SQL queries.
 Responsible for writing business logic and DAO code.
 Responsible for server-side validations using the XWork validation framework.
 Responsible for client-side validations using JavaScript, jQuery and Ajax.
 Involved in developing persistence logic using Hibernate.
 Responsible for writing interfaces using JSPs, HTML and CSS.
 Involved in debugging.
Project#1
Organization : Ventech.
Project Title : Inter Communication.
Client : In-House Project.
Team Size : 10
Role : Team Member
Environment : JDK 1.6, Servlets, JSP, JDBC 3.0, MySQL, Tomcat 6.0, NetBeans.
Description:
Inter Communication is a web-based application built on Java technology for the employees of a company. The system is owned by the company and serves its people. It incorporates the requirements for interaction between project teams and the IT support team, so that project members' problems, such as software installations and access to other resources across the network, get resolved.
Responsibilities:
 Developed view components.
 Implemented JSPs with tag libraries.
 Involved in developing Java action classes using Servlets.
 Involved in writing client-side validations using JavaScript and jQuery.
 Involved in debugging.
PERSONAL PROFILE:
Name : VISHNU C
D.O.B : 25-MARCH-1986.
Languages Known : English, Hindi and Telugu.
Place : Mumbai
Date :
(Vishnu)
