Sudheer
Mobile: +91-9652133441
Email: sudheer.talluri2011@gmail.com
Professional Experience
• 3+ years of overall IT experience in application development in Java and Big Data
Hadoop.
• 1.9 years of exclusive experience in Hadoop and its components, such as HDFS, MapReduce,
Apache Pig, Hive, Sqoop, HBase and Oozie.
• Involved in writing Pig scripts to reduce job execution time.
• Extensive experience in setting up Hadoop clusters.
• Good working knowledge of MapReduce and Apache Pig.
• Involved in the design and development of applications for high-profile customers,
delivering timely, quality solutions.
• Proficient in developing web-based applications using Java/J2EE.
• Well versed in Servlets, JSP, JDBC, Struts and web servers.
• Good understanding of IBM WebSphere Application Server, Apache Tomcat and WebLogic
in the areas of development, deployment, configuration settings and deployment
descriptors.
• Working knowledge of Hibernate, EJB and application servers.
• Working knowledge of the ZK Framework and web services.
• Quick to adapt to new technologies.
• Excellent communication, interpersonal and analytical skills, and a strong ability to
perform as part of a team.
• Knowledge of Spark, Scala and Kafka.
Experience Summary
Working as a Software Engineer at Lince Soft Solutions, Hyderabad, India from
Sept 2014 till date.
Worked as a Software Engineer at HeiTech Padu, Malaysia from June 2012 to Aug
2014.
Academic Background:
B.Tech (IT) from Anna University, Trichy, India.
Technical Skills:
Languages : Java, JavaScript, HTML, XML, XSD, XSL, Web Services, MapReduce, Pig,
Sqoop, Hive, HBase.
J2EE Technologies : JSP, Servlets, JDBC and EJB
Servers : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks : ZK Framework, Struts, Spring, Hibernate, Hadoop.
Java IDEs : RAD, Eclipse.
Databases : DB2 9.x, Oracle, SQL (DDL, DML, DCL)
Version Control Tools : SVN, CVS, Visual SourceSafe
Operating Systems : Windows 7, 2000, 2003 and Linux
Project Details:
PROJECT #1:
Project Title : Target – Web Intelligence
Client : Target Minneapolis, Minnesota, USA.
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, MySQL
Team Size : 12
Duration : Sept 2014 till date
Role : Hadoop Developer
Description:
This project involved rehosting Target's existing application onto the Hadoop
platform. Previously, Target used a MySQL database to store its competitor
retailers' information (the crawled web data). Earlier, Target had only 4 competitor
retailers, namely Amazon.com, walmart.com etc.
But as the number of competitor retailers grew, the data generated by web crawling
also increased massively, beyond what a MySQL-style data store could accommodate.
For the same reason, Target wanted to move to Hadoop, where massive amounts of data
can be handled across cluster nodes and the scaling needs of Target's business
operations can be satisfied.
Roles and Responsibilities:
• Involved in requirements gathering, design, development and testing.
• Wrote Apache Pig scripts to process HDFS data.
• Wrote script files for processing data and loading it into HDFS.
• Wrote HDFS CLI commands.
• Moved all crawl-data flat files generated from various retailers into HDFS for further
processing.
• Created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to enable interaction between Pig and the MySQL
database.
• Analyzed requirements to set up a cluster.
• Ensured NFS was configured for the NameNode.
• Set up passwordless SSH for Hadoop.
• Completely involved in the requirement-analysis phase.
• Set up cron jobs to delete Hadoop logs, old local job files and cluster temp files.
• Set up Hive with MySQL as a remote metastore.
• Moved all log/text files generated by various products into HDFS.
• Wrote MapReduce code that takes log files as input, parses the logs and structures
them in tabular format to facilitate effective querying of the log data.
• Created external Hive tables on top of the parsed data.
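As an illustrative sketch only (not the project's actual code), the core of such a log-parsing MapReduce job is a small function that turns one raw log line into tab-separated columns, the row format an external Hive table can read. The field layout here (timestamp, level, message) and the class name `LogLineParser` are assumptions for illustration:

```java
// Illustrative sketch: the field layout (timestamp, level, message) is an
// assumption, not the project's actual log format.
public class LogLineParser {

    // Converts one raw log line ("<timestamp> <level> <message>") into
    // tab-separated columns. Returns null for lines missing a field, so a
    // Mapper can simply skip malformed input instead of emitting it.
    public static String toTabular(String logLine) {
        String[] parts = logLine.trim().split("\\s+", 3);
        if (parts.length < 3) {
            return null; // malformed line; would not be emitted
        }
        return parts[0] + "\t" + parts[1] + "\t" + parts[2];
    }
}
```

Inside a Hadoop `Mapper`, each input value would be passed through `toTabular` and non-null results written out; the external Hive table would then be declared over the output directory with fields terminated by `\t`.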
PROJECT #2:
Project Title : JPJ-Revamp
Client : JPJ (Jabatan Pengangkutan Jalan), Malaysia
Role : Associate Developer.
Team Size : 18
Duration : June 2012 to Aug 2014
Environment : EJB 3.0, JMS, ZK Framework, IBM WebSphere.
Description:
This system was developed to manage all the activities of the JPJ (Jabatan Pengangkutan
Jalan, Malaysia). It is related to the road transport operations of Malaysia; all activities
related to road transport are included in this project.
It contains many modules based on the activities of the client. There are mainly 4 modules:
1. Licensing
2. Vehicle
3. Enforcement
4. Revenue
It is an online application through which a user can register and perform operations online,
along with payment. These modules contain all the user operations of the JPJ. The first
module is used for the licensing operations of the public, such as applying for a license,
applying for an international license, renewing a license, etc. The second module is used for
vehicle operations such as vehicle registration, transfer, etc. The 3rd module is used to
check blacklisted users and operations regarding the law. The 4th module is used for
payment of fees or fines. All these modules have circular dependencies with two-way
communication, which is done through JMS and COBOL. This application is mainly used by
customers and the employees of the JPJ. The project supported creating, renewing,
appealing, deleting, printing/copying and replacing licenses online. Basic Information: this
section displays customer information such as ID, contact numbers, driver information,
status (waiting or dropped off) and a section to key in the actual customer request.
Roles and Responsibilities:
• Project analysis and development.
• Prepared the test plan.
• Unit testing, SIT and UAT.
• Involved in fixing bugs.
• Involved in deployments to SIT and UAT servers.