Prasanna Kumar
Email: rajupr1990@gmail.com
Mobile: +91 7331108854/8884601978
Summary:
4 years of total IT experience in application and Hadoop development.
Expertise in the Hadoop framework and Big Data concepts (HDFS, MapReduce, Hive, Impala, Pig,
Phoenix, HBase, Sqoop, Flume, Oozie, Spark, Scala, Kudu).
Experience in writing Impala and Hive queries and Pig scripts.
Experience in writing Phoenix scripts and working with HBase.
Hands-on with the MapReduce programming model.
Good knowledge of Spark, Scala, and Kudu.
Experience importing and exporting data between HDFS and relational databases using Sqoop.
Experience in setting up Hadoop clusters, and in performance benchmarking and monitoring of
Hadoop clusters.
Good working knowledge of UNIX commands and shell programming.
Experience in Oracle ADF.
Experience across the SDLC, including creating and maintaining project documents.
Excellent communication, analytical, business, and interpersonal skills. Self-motivated, with a
proactive, resourceful approach to problem solving and the ability to work independently and to
lead or be part of a team.
Work Experience:
Worked as a Software Engineer at Tibco Software from Dec 2015 till date.
Worked as a Software Engineer at SLK Software Services from Apr 2014 to Nov 2015.
Worked as an Associate Software Engineer at 3i InfoTech from Sep 2012 to Mar 2014.
Educational Qualifications:
BE/B.TECH (ECE) from Sri Venkateswara University.
Technical Proficiencies:
Frameworks : Hadoop, Oracle ADF
Hadoop Ecosystem : HDFS, Hive, Impala, Phoenix, Pig, MapReduce, Sqoop, Flume
Programming skills : Java, SQL
NoSQL : HBase
J2EE Technologies : Oracle ADF
Servers : Tomcat, WebLogic
Scripting language : Shell Script
DBMS : Oracle and MySQL
Version Tools/IDE : SVN, Eclipse, JDeveloper 11g
Operating Systems : Windows, UNIX
Professional Experience:
Project #1:
Title : Data Science D2O
Client : Nielsen
Environment : Impala, Hive, Phoenix, HDFS, MapReduce, HBase
Role : Developer
Description: This project digitizes data science to align with the Nielsen vision of becoming more
digital. All Data Science tools are dependent on Global Factory Data being available in NDX.
NDX is uniquely positioned to deliver local and global insights into consumer behavior and product
sales across categories in nearly 100 countries. Nielsen's powerful combination of deep data and
insights arms clients with actionable intelligence for their business planning. Our tools provide
clients with timely, flexible analytics, presenting a holistic view of the marketplace. Infusing digital
into data science unleashes innovation and provides cutting-edge solutions for our clients. The major
implementations in Data Science are CVCalc and Replica.
CVCALC:
Implementation of a new, robust methodology for CV calculation based on a modelling approach.
Improved data stability through minimized sample changes during sample re-designs and
annual updates, and improved data precision through more robust variance calculations.
Integrated the tool with NDX and other tools within it, improving calculation efficiency.
REPLICA:
Replica is a tool for calculating Relative Standard Error using a variance engine based on
bootstrap principles. It simulates the variation associated with sample participation by forming
replicated samples in which sampling units are given variable rights to participate in the sample.
This tool allows standard decisions in compliance with our WBS in terms of sample design and
MBD reporting, and therefore translates into better quality in our client deliverables.
Responsibilities:
Interaction with the client to gather and understand the requirements.
Involved in writing Impala queries to load and process data in the Hadoop file system.
Involved in writing Phoenix scripts to load and process data in HBase.
Writing shell scripts that are invoked by Tibco BW.
Writing MapReduce code for bulk-loading data into Phoenix.
Involved in integrating Hive and HBase for the purpose of generating reports.
Involved in team-level meetings for knowledge transfer.
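The Hive/HBase integration for reporting mentioned above typically follows the standard storage-handler pattern: a Hive external table is mapped onto an existing HBase table so reports can be written as plain HiveQL. A minimal sketch follows; the table, column-family, and column names are illustrative assumptions, not taken from the project.

```sql
-- Sketch: Hive external table over an existing HBase table (names hypothetical).
-- Rows live in HBase; Hive reads them through the HBaseStorageHandler.
CREATE EXTERNAL TABLE cv_results_hbase (
  row_key  STRING,
  country  STRING,
  cv_value DOUBLE
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  'hbase.columns.mapping' = ':key,stats:country,stats:cv_value'
)
TBLPROPERTIES ('hbase.table.name' = 'cv_results');

-- Reports can then query the HBase data with ordinary HiveQL:
SELECT country, AVG(cv_value)
FROM cv_results_hbase
GROUP BY country;
```

The external table is only a metadata mapping, so dropping it in Hive leaves the underlying HBase table untouched.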
Project #2:
Title : Research and Analysis in Sales and Operations.
Client : Saint-Gobain
Environment : HDFS, MapReduce, Hive, Pig, Sqoop, Cloudera
Role : Developer
Description: The project is based on sales and operations information for business analysts. The
reports show current-month, prior-month, previous-year, and year-to-date sales information for
different business units and products. We receive the flat files from SAP BI.
Saint-Gobain is a French multinational company founded in 1665 in Paris. Originally a mirror
manufacturer, it now produces a wide variety of construction and high-performance materials.
Saint-Gobain is organized into four major sectors: Building Distribution, Construction Products,
Innovative Materials, and Packaging. Each sector is further organized into Business Units (BUs)
that serve specific markets within each sector.
Responsibilities:
Analyzed functional specifications based on project requirements.
Involved in loading data from the UNIX file system to HDFS.
Responsible for uploading datasets into the Hadoop cluster.
Supported MapReduce programs running on the cluster.
Involved in creating Hive tables, loading them with data, and writing Hive queries that run
internally as MapReduce jobs.
Created partitioned and bucketed tables using data from various regions.
Developed Sqoop scripts to move data between HDFS, Hive, and MySQL.
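The partitioned and bucketed Hive tables described above generally look like the following sketch. All table, column, and partition names here are hypothetical placeholders, assumed for illustration only.

```sql
-- Sketch: a Hive table partitioned by region and month, bucketed by product
-- (names hypothetical). Partitions prune scans; buckets aid sampling and joins.
CREATE TABLE sales (
  product_id    STRING,
  business_unit STRING,
  amount        DOUBLE
)
PARTITIONED BY (region STRING, sale_month STRING)
CLUSTERED BY (product_id) INTO 16 BUCKETS
STORED AS ORC;

-- Load one region/month partition from a staging table:
INSERT OVERWRITE TABLE sales PARTITION (region = 'EU', sale_month = '2015-06')
SELECT product_id, business_unit, amount
FROM sales_staging
WHERE region = 'EU' AND sale_month = '2015-06';
```

Queries that filter on `region` and `sale_month` then read only the matching partition directories instead of the full table.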
Project #3:
Title : Global Force Alliance Insurance
Role : Developer
Environment : Oracle ADF, Oracle 10g, JDeveloper 11g
Description: Global Force Alliance is one of the reputed insurance companies in the UK. It provides a
wide range of insurance policies such as Life Insurance, Fire Insurance, Motor Vehicle Insurance,
Accident and Healthcare Insurance, etc. The main modules contained in this project are
Production Processing, Production Definition, Customer Services, Customer Policy, Cashier, Billing,
and Underwriting. Production Processing is the entry point to start the business processes. The
insurance system automates the management of insurance activities: Branch Manager Details, Agent
Commission, Customer Policy Details, and Agent Details.
Responsibilities:
Interaction with the client to gather and understand the requirements.
Created Database structure and extended the existing objects for customizations.
Designed and implemented ADF Business Components using EOs, VOs, AMs, VLs, and Associations.
Developed User Interface using ADF Faces and ADF Task flows.
Developed Pages using ADF, JSF Components.
Application Testing, Debugging and Deployment.