Sudheer
Mobile: +91-9652133441
Email: sudheer.talluri2011@gmail.com
Professional Experience
• 3+ years of overall IT experience in application development in Java and Big Data Hadoop.
• 1.9 years of exclusive experience in Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
• Wrote Pig scripts to reduce job execution time.
• Extensive experience in setting up Hadoop clusters.
• Good working knowledge of MapReduce and Apache Pig.
• Involved in the design and development of applications for high-profile customers, delivering timely, high-quality solutions.
• Proficient in developing web-based applications using Java/J2EE.
• Well versed in Servlets, JSP, JDBC, Struts and web servers.
• Good understanding of IBM WebSphere Application Server, Apache Tomcat and WebLogic in the areas of development, deployment, configuration settings and deployment descriptors.
• Working knowledge of Hibernate, EJB and application servers.
• Working knowledge of the ZK Framework and web services.
• Quick to adapt to new technologies.
• Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.
• Knowledge of Spark, Scala and Kafka.
Experience Summary
 Worked as a Software Engineer at Lince Soft Solutions, Hyderabad, India, from September 2014 to date.
 Worked as a Software Engineer at HeiTech Padu, Malaysia, from June 2012 to August 2014.
Academic Background:
 B.Tech (IT) from Anna University, Trichy, India.
Technical Skills:
Languages : Java, JavaScript, HTML, XML, XSD, XSL, Web Services, MapReduce, Pig, Sqoop, Hive, HBase
J2EE Technologies : JSP, Servlets, JDBC and EJB
Servers : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks : ZK Framework, Struts, Spring, Hibernate, Hadoop.
Java IDEs : RAD, Eclipse.
Databases : DB2 9.x, Oracle, SQL (DDL, DML, DCL)
Version Control Tools : SVN, CVS, Visual SourceSafe
Operating Systems : Windows 7, 2000, 2003 and Linux
Project Details:
PROJECT #1:
Project Title : Target – Web Intelligence
Client : Target, Minneapolis, Minnesota, USA
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, MySQL
Team Size : 12
Duration : September 2014 to date
Role : Hadoop Developer
Description:
This project rehosted Target's existing system on the Hadoop platform. Previously, Target stored its competitor retailers' information (the crawled web data) in a MySQL database. Early on, Target had only four competitor retailers, such as Amazon.com and Walmart.com. But as the number of competitor retailers grew, the volume of data generated by web crawling also grew massively, beyond what a MySQL-style store could accommodate. For this reason, Target moved the system to Hadoop, which can handle massive amounts of data across its cluster nodes and satisfy the scaling needs of Target's business operations.
Roles and Responsibilities:
• Involved in requirements gathering, design, development and testing.
• Wrote Apache Pig scripts to process data in HDFS (a Pig sketch follows this list).
• Wrote script files for processing data and loading it into HDFS.
• Wrote HDFS CLI commands.
• Moved all crawl-data flat files generated from various retailers into HDFS for further processing.
• Created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to move data between the Pig output on HDFS and the MySQL database (see the Sqoop sketch below).
• Analyzed requirements for setting up the cluster.
• Ensured NFS was configured for the NameNode.
• Set up passwordless SSH for Hadoop (sketch below).
• Fully involved in the requirements-analysis phase.
• Set up cron jobs to delete Hadoop logs, old local job files and cluster temp files (see the cron entry below).
• Set up Hive with MySQL as a remote metastore (see the hive-site.xml excerpt below).
• Moved all log/text files generated by various products into an HDFS location.
• Wrote MapReduce code that takes log files as input, parses them and structures them in tabular format to facilitate effective querying of the log data (a mapper sketch follows).
• Created an external Hive table on top of the parsed data (DDL sketch below).
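
A minimal Pig sketch of the kind of script described above, assuming tab-separated crawl files with retailer, product and price fields; the paths, schema and aggregation are illustrative, not taken from the actual project:

    -- Load raw crawl data (tab-separated; the schema here is an assumption)
    crawl = LOAD '/data/crawl/retailers' USING PigStorage('\t')
            AS (retailer:chararray, product_id:chararray, price:double, crawl_date:chararray);

    -- Drop records without a usable price
    valid = FILTER crawl BY price IS NOT NULL AND price > 0.0;

    -- Average price per retailer and product
    grouped = GROUP valid BY (retailer, product_id);
    avg_prices = FOREACH grouped GENERATE
                 FLATTEN(group) AS (retailer, product_id),
                 AVG(valid.price) AS avg_price;

    STORE avg_prices INTO '/data/crawl/avg_prices' USING PigStorage('\t');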
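The Sqoop interaction between the Pig output on HDFS and MySQL could look like the following export; the connection string, credentials, table and directory names are placeholders:

    # Export Pig output from HDFS into a MySQL table (all names illustrative)
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/retail \
      --username etl_user -P \
      --table avg_prices \
      --export-dir /data/crawl/avg_prices \
      --input-fields-terminated-by '\t'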
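Passwordless SSH for the Hadoop user is typically set up as below; the hostname is a placeholder:

    # Generate a key pair with no passphrase, then copy it to each cluster node
    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    ssh-copy-id hadoop@datanode1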
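The cleanup cron job mentioned above might be a simple /etc/cron.d entry like this; the paths, user and seven-day retention window are assumptions:

    # Purge Hadoop logs and local temp files older than 7 days, daily at 02:00
    0 2 * * * hadoop find /var/log/hadoop /tmp/hadoop-local -type f -mtime +7 -delete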
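Pointing Hive at MySQL as a remote metastore is configured in hive-site.xml roughly as follows; the host names and database name are placeholders:

    <property>
      <name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://dbhost:3306/metastore?createDatabaseIfNotExist=true</value>
    </property>
    <property>
      <name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
      <name>hive.metastore.uris</name>
      <value>thrift://metastore-host:9083</value>
    </property>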
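A trimmed Java mapper illustrating the log-parsing MapReduce job; the log layout (space-separated, timestamp first) and the class name are assumptions, and the driver and reducer are omitted:

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    // Parses raw log lines into tab-separated (timestamp, level, message)
    // records so the output can be queried as columns from Hive.
    public class LogParseMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] parts = value.toString().split(" ", 3);
            if (parts.length == 3) {
                // key: timestamp; value: level and message, tab-separated
                context.write(new Text(parts[0]), new Text(parts[1] + "\t" + parts[2]));
            }
        }
    }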
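The external Hive table over the parsed output might be declared as below; the table name, columns and location are hypothetical. Because the table is EXTERNAL, dropping it leaves the underlying HDFS files intact:

    CREATE EXTERNAL TABLE parsed_logs (
      log_ts    STRING,
      log_level STRING,
      message   STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/logs/parsed';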
PROJECT #2:
Project Title : JPJ Revamp
Client : JPJ (Jabatan Pengangkutan Jalan), Malaysia
Role : Associate Developer
Team Size : 18
Duration : June 2012 to August 2014
Environment : EJB 3.0, JMS, ZK Framework, IBM WebSphere
Description:
This system was developed to manage all the activities of JPJ (Jabatan Pengangkutan Jalan, Malaysia). It covers the road transport operations of Malaysia; all activities related to road transport are included in this project.
It contains many modules based on the client's activities. There are four main modules:
1. Licensing
2. Vehicle
3. Enforcement
4. Revenue
It is an online application through which users can register and carry out operations online, including payment. These four modules contain all the user operations of JPJ. The first module handles the public's licensing operations, such as applying for a license, applying for an international license and renewing a license. The second module handles vehicle operations such as vehicle registration and transfer. The third module is used to check blacklisted users and handle law-related operations. The fourth module handles payment of fees and fines. All these modules have circular dependencies with two-way communication, which is done through JMS and COBOL (a JMS sketch follows this description). The application is used mainly by customers and JPJ employees. The project covered creating, renewing, appealing, deleting, printing/copying and replacing licenses online. Basic Information: this section displays customer information such as ID, contact numbers, driver information and status (waiting or dropped off), along with a section to key in the actual customer request.
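
A minimal JMS 1.1-style sender of the sort the two-way module communication could use; the JNDI names and payload are hypothetical, and WebSphere-specific configuration is omitted:

    import javax.jms.*;
    import javax.naming.InitialContext;

    // Sends a request message to another module's queue (JNDI names illustrative).
    public class ModuleRequestSender {
        public void send(String payload) throws Exception {
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/ModuleCF");
            Queue queue = (Queue) ctx.lookup("jms/LicensingRequestQueue");

            Connection conn = cf.createConnection();
            try {
                Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageProducer producer = session.createProducer(queue);
                producer.send(session.createTextMessage(payload));
            } finally {
                conn.close();
            }
        }
    }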
Roles and Responsibilities:
• Project analysis and development.
• Prepared the test plan.
• Unit testing, SIT and UAT.
• Involved in fixing bugs.
• Involved in deployments to SIT and UAT servers.