MOPURU BABU
Mobile: +1-321-210-3823
E-mail: babu.m@sunshineconsulting.co
EXPERIENCE SUMMARY
Over 9 years of professional experience in software development in Java technologies, with 2 years
in Hadoop development using Hadoop, Hive, HBase, MapReduce, Pig, Sqoop, Oozie, shell scripting,
YARN, Scala and Spark, covering the design, development and deployment of n-tier, enterprise-level
distributed applications.
• Working as a Hadoop Developer for 3 years on various Hadoop platforms such as Cloudera and
Hortonworks, with over six years of experience in software development on Java and Spring
frameworks (Spring IoC/Core, Spring DAO support, Spring ORM, Spring AOP, Spring Security, Spring
MVC, Spring Cache and Spring Integration).
• Expert knowledge of J2EE design patterns such as MVC Architecture, Front Controller, Session
Facade, Business Delegate and Data Access Object for building J2EE applications.
• Designed & developed several multi-tier Web based, Client-Server and Multithread applications
using Object Oriented Analysis and Design concepts and Service Oriented Architecture (SOA)
mostly in cross platform environment.
• Excellent working knowledge of popular frameworks like Struts, Hibernate, and Spring MVC.
• Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts,
Hibernate, JAX-WS (SOAP)/JAX-RS (REST) web services and JMS.
• Good understanding of HDFS Designs, Daemons, federation and HDFS high availability (HA).
• Experience in developing Hive and Pig scripts.
• Extensive knowledge of creating different Hive tables with different file compression formats.
• Hands on experience in installing, configuring and administrating Hadoop cluster components
like MapReduce, HDFS, HBase, Hive, Sqoop, Spark, Pig, Zookeeper, Oozie and Flume using
Apache Code base.
• Experience in Managing scalable Hadoop clusters including Cluster designing, provisioning,
custom configurations, monitoring and maintaining using Hadoop distributions: Cloudera CDH.
• Good experience using Apache SPARK.
• Worked on a prototype Apache Spark Streaming project, converting an existing Java Storm
topology.
• Experience managing the Cloudera distribution of Hadoop (Cloudera Manager).
• Excellent understanding of NoSQL databases like HBase.
• Extensive working knowledge of setting up and running clusters, monitoring, data analytics,
sentiment analysis, predictive analysis and data presentation in the big data world.
• Hands-on experience working on structured and unstructured data in various file formats such as
XML, JSON and sequence files using MapReduce programs.
• Extensive experience with writing SQL queries using HiveQL to perform analytics on structured
data (a sketch follows this summary).
• Expertise in data load management, importing & exporting data using Sqoop & Flume.
• Performed various Pig operations and transformations to join, clean, aggregate and analyze
data.
• Excellent interpersonal and communication skills, creative, research-minded, technically
competent and result-oriented with problem solving and leadership skills.
• Expert in database and RDBMS concepts, using MS SQL Server and Oracle 10g.
• Expertise in working with web development technologies such as HTML, CSS, and JavaScript.
• Expertise in working with different methodologies like Waterfall and Agile.
• Proficient in using databases such as MySQL, MS SQL Server and DB2.
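As a sketch of the kind of HiveQL analytics referenced above, the following Java snippet runs an aggregation against HiveServer2 over JDBC; the endpoint, credentials and the offers table/columns are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveQlExample {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer2 JDBC driver (older driver versions are
        // not auto-discovered by the DriverManager).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Hypothetical endpoint and credentials.
        String url = "jdbc:hive2://localhost:10000/default";
        try (Connection conn = DriverManager.getConnection(url, "hive", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT country, COUNT(*) AS offer_count "
                   + "FROM offers GROUP BY country "
                   + "ORDER BY offer_count DESC LIMIT 10")) {
            while (rs.next()) {
                System.out.printf("%s\t%d%n", rs.getString("country"),
                        rs.getLong("offer_count"));
            }
        }
    }
}
```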
SOFTWARE SKILLS
Elements Particulars
Primary Skills Analysis, Design, Development, Implementation, Testing & Packaging
Languages Java
Big Data Skills Hadoop, MapReduce, Hive, Pig, Sqoop, Oozie, Scala, Spark
RDBMS MS SQL Server 2000/2005, DB2, Oracle 8.x/9i/10g
NoSQL HBase, MarkLogic
Internet Technology JSP, HTML, XML & CSS
Scripting Language JavaScript, JSON, AngularJS
Application Server WebSphere 6.0, WebLogic 8.1, JBoss 5.0
Web Server Tomcat 5.0
Frameworks Struts, Spring, Hibernate, Log4j
CM Tools IBM ClearCase, IBM Rational Team Concert (thick client & web version),
WinCVS, SVN, Git
Defect Tracking Tools IBM TSRM, IBM Rational Team Concert, IBM Rational Quality Manager
Build Tools Apache Ant 1.6.5
Testing Tools JUnit
IDE & GUI Eclipse 3.3, IBM RAD, IBM RSA & NetBeans
Operating System Windows 7, Windows 95/98/ME/NT/XP, Unix & Linux
UML Modeling Tools StarUML
Web Technologies Servlets, JSP
INDUSTRY EXPERIENCE
• UK Public Sector
• Finance Sector
• Retail Sector
Achievements at Workplace
• Received the Star Performer award at IBM.
• Was appreciated for committed and reliable work.
• Received Value awards and Deep Skill Adder awards at IBM.
• Received much appreciation for teamwork and for adapting to new technologies, along with many
client appreciations.
Interpersonal Competencies
• Ability to interact successfully with multiple teams across the global organization, including
services and support for all regions.
• Strong mathematical, analytical background with innovative thoughts and creative action.
• Sound technical skills and a quick grasp of business flow; a self-learning professional.
• Value-adding attitude.
• Zeal to learn New Technologies.
Project #1 : SPST (Service Pac Product Selector Tool)
Client : IBM, Raleigh, NC
Environment : Hadoop, Hive, Pig, Sqoop, MapReduce, Java (JDK 1.7), Linux, MySQL,
NoSQL, Cloudera, Spring
Duration : Apr '15 - till date
Description : This is a web-based online tool designed to help Offering and Sales
Managers and the global geos create, maintain and distribute Service Pac Product Offerings data
worldwide. Along with release enhancements and maintenance activities, business analytics were
also implemented.
Delivered multiple Big Data use cases to support custom sales offerings using Hadoop, delivering
KPIs and metrics such as the following (a MapReduce sketch for one of them follows the list).
Per country
- Which offer description occurred the least and the most times
- Which offer number has the least and the most occurrences
- Which offer type has the least and the most occurrences
- Which part pin has the least and the most occurrences
- Which offer pin has the least and the most occurrences
- What the highest and lowest costs are; whether they are repeated, and if so, how many times and
for which offer types
- List the top 10 offer descriptions
- List the top 10 offer types
- Which offer description has the highest price and which has the lowest
- What the difference is between the highest and the lowest price
- How many offers were released in each year
- Which part pins to suggest to the customer (suggest at least 3 categories)
- Which offer pins not to suggest to the customer (3 categories that should not be suggested)
- For how many records the offer is at a loss
- For the combination of the top offer description and the top offer code, how many part pins are
repeated
- How many offer numbers each country code has
- For each year, prepare a tab-separated file as below:
year maxopencost maxclosecost offernumber offercode country
Zone wide
- All of the above repeated for the entire zone
- How many country codes each zone has
Total
- All of the above repeated across all zones
- How many country codes there are in total and how many each zone has, output as a CSV-formatted
file
Ex: zone1,AM,300
zone1,AT,365
Different zones
- From each zone, which country has the most machine types
- From each zone, which country has the highest price, with complete details
- What the highest price in the zone is, with complete details
- What the latest announce date in a country and zone is
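To make one of these KPIs concrete, below is a minimal MapReduce sketch of the occurrence count behind the first metric; the most and least frequent offer descriptions are then read off the counted output with a sort or a second job. The tab-separated input layout and the column position of the offer description are assumptions for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class OfferDescriptionCount {

    public static class OfferMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text description = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Hypothetical layout: tab-separated export with the offer
            // description in the third column.
            String[] fields = value.toString().split("\t");
            if (fields.length > 2) {
                description.set(fields[2]);
                ctx.write(description, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "offer-description-count");
        job.setJarByClass(OfferDescriptionCount.class);
        job.setMapperClass(OfferMapper.class);
        job.setCombinerClass(SumReducer.class); // safe: sums are associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```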
As a Senior Team Member, was responsible for
• Installation & configuration of a Hadoop cluster along with Hive.
• Developed MapReduce applications using the Hadoop MapReduce programming framework to process
large data sets in parallel across the Hadoop cluster for pre-processing.
• Developed the code for importing and exporting data into HDFS and Hive using Sqoop.
• Responsible for writing Hive queries to analyse data in the Hive warehouse using the Hive Query
Language (HQL).
• Involved in defining job flows using Oozie to schedule and manage Apache Hadoop jobs as
directed acyclic graphs.
• Developed Hive User Defined Functions in Java, compiled them into jars, added them to HDFS and
executed them from Hive queries (see the sketch after this list).
• Experienced in managing and reviewing Hadoop log files.
• Responsible for managing data coming from different sources.
• Assisted in monitoring the Hadoop cluster.
• Dealt with high volumes of data in the cluster.
• Tested and reported defects from an Agile methodology perspective.
• Consolidated all defects, reported them to PM/leads for prompt fixes by the development teams
and drove them to closure.
• Installed Hadoop ecosystem components (Hive, Pig, Sqoop, HBase, Oozie) on top of the Hadoop
cluster.
• Imported data from SQL databases into HDFS and Hive for analytical purposes.
• Involved in developing the controller, service and DAO layers of the Spring Framework for the
SPST project dashboard.
• Attended business requirement meetings, supported UAT, and handled onsite-offshore
coordination and work assignment.
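A minimal example of the Hive UDF pattern described above, using the classic org.apache.hadoop.hive.ql.exec.UDF base class; the function itself (normalizing a country code) is invented for illustration.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Null-safe UDF that trims and upper-cases a country code.
public final class NormalizeCountryCode extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

After packaging into a jar and copying it to HDFS, it would be wired in from Hive with something like `ADD JAR hdfs:///udfs/normalize.jar;` followed by `CREATE TEMPORARY FUNCTION normalize_cc AS 'NormalizeCountryCode';` (paths and names hypothetical).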
Project #2 : Inventory Management
Client : Wal-Mart, Bentonville, AR
Environment : CDH4, Eclipse, HDFS, Hive, MapReduce, Spark, Spark SQL, Oozie, Sqoop, Pig
Duration : 22 Months (Jun’13 – Mar’15)
Description : The application's tasks include extracting data from various input sources
such as DB2 and XML into a Cassandra database. The incoming data relates specifically to Wal-Mart
store details such as address, alignments, divisions and departments. This data is filtered
according to the business logic and stored in the respective column families. Some services were
written to retrieve the data from Cassandra.
As a Senior Team Member, was responsible for
• Developed Spark code using Scala and Spark-SQL/Streaming for faster testing and processing
of data.
• Imported data from different sources such as HDFS and HBase into Spark RDDs.
• Developed a data pipeline using Kafka and Storm to store data in HDFS, and performed real-time
analysis on the incoming data.
• Automated the extraction of data from warehouses and weblogs by developing workflows and
coordinator jobs in Oozie.
• Developed Scala scripts and UDFs using both DataFrames/SQL and RDDs/MapReduce in Spark for
data aggregation and queries, writing data back to the OLTP system directly or through
Sqoop.
• Explored Spark to improve the performance and optimization of existing algorithms in Hadoop
using Spark Context, Spark SQL and Spark on YARN.
• Optimized performance when dealing with large datasets using partitioning, Spark's in-memory
capabilities, broadcasts, effective & efficient joins, and transformations, doing the heavy
lifting during the ingestion process itself (see the broadcast-join sketch after this list).
• Used the Spark API over Hortonworks Hadoop YARN to perform analytics on data in Hive.
• Performed transformations such as event joins, filtering bot traffic and some pre-aggregations
using Pig.
• Developed MapReduce jobs to convert data files into the Parquet file format.
• Developed business-specific custom UDFs in Hive and Pig.
• Configured Oozie workflows to run multiple Hive and Pig jobs that run independently based on
time and data availability.
• Optimized MapReduce code and Pig scripts, and performed performance tuning and analysis.
• Involved in the design, development and testing phases of the Software Development Life Cycle.
• Performed Hadoop installation, updates, patches and version upgrades when required.
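To illustrate the broadcast-join tuning mentioned above: the project code was written in Scala on CDH4, but a minimal sketch in Spark's Java Dataset API is shown here for consistency with this document's other examples. The table and column names are hypothetical.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.broadcast;

public class InventoryEnrichment {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("inventory-enrichment")
                .enableHiveSupport()
                .getOrCreate();

        // Large fact table and small dimension table (hypothetical names).
        Dataset<Row> events = spark.table("inventory_events");
        Dataset<Row> stores = spark.table("store_dim");

        // Broadcasting the small side ships it to every executor,
        // turning a shuffle join into a cheap map-side join.
        Dataset<Row> enriched = events.join(broadcast(stores), "store_id");

        enriched.write().mode("overwrite").parquet("/data/enriched/inventory_events");
        spark.stop();
    }
}
```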
Project #3 : DWP CIS (Customer Information System)
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, J2EE, JSP, Struts, web services, Altova XMLSpy, WAS 6.0, RAD,
Apache Ant, Oracle SQL Developer, RTC 3.0
RDBMS : Oracle
Duration : 19 Months (Nov’ 11 – May’ 13)
Description : CIS is the Customer Information System, which holds all the personal details of
the UK citizens. Along with the personal details, the benefits citizens are entitled to and
registered for are also stored in CIS. CIS is the primary system for personal data and a slave for
the benefits and award details stored in the database.
CIS was formed in 2004 by merging two systems, DCI (Department Central Index) and PDCS (Personal
Data Computer Store). These were two entirely different systems that never communicated with each
other.
Because of that, whenever a change happened in one system, it was not reflected in the other. As
part of Release 1, the CIS database was built by having information flow from DCI and PDCS to
CIS. PDCS was COBOL based, and all systems which used to talk to PDCS now talk to CIS through
batch and online interfaces. Dialogues were replaced by SEF screens and functions, which means the
mainframe dialogues were replaced by browser-based screens.
The CIS online functions provided as a replacement for DCI can be decomposed into a number of
discrete function types such as Primary Access, Secondary Access, Data Link Requests and
Inter-Service Access.
As a Senior Team Member, was responsible for
• Implemented the GUI as per requirements.
• Involved in creating the Functional Design and Technical Design.
• Involved in coding business logic, persisting data with Struts and JDBC, unit testing and
integration testing (see the sketch after this list).
• Developed reusable components which can be used in all modules.
• Involved in supporting UAT activities and production issues, fixing bugs and tracking defects.
• Involved in creating XSDs.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in updating the XSDs based on the business requirements.
• Involved in testing the web services and integrating with external vendors and internal
clients.
• Assisted developers in the technical design, construction and unit testing phases.
• Analyzed and understood the architectural requirements.
• Proposed new design solutions that exceed client expectations.
• Conducted peer code reviews and reviewed functional documents.
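As an illustration of the Struts-plus-JDBC persistence work above, here is a minimal sketch of a Struts 1 Action backed by plain JDBC; the NINO lookup, table, forward name and connection details are all hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

// Struts 1 action that looks up a customer by NINO and forwards to a JSP.
public class CustomerLookupAction extends Action {

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        String nino = request.getParameter("nino");
        request.setAttribute("customerName", findCustomerName(nino));
        // "success" must be declared as a forward in struts-config.xml.
        return mapping.findForward("success");
    }

    private String findCustomerName(String nino) throws Exception {
        String url = "jdbc:oracle:thin:@dbhost:1521:cis"; // placeholder
        try (Connection conn = DriverManager.getConnection(url, "app", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT full_name FROM customer WHERE nino = ?")) {
            ps.setString(1, nino);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString(1) : null;
            }
        }
    }
}
```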
Project #4 : E Referrals
Client : Public Sector, UK Government, Preston, UK
Environment : Java, HTML, JSP, J2EE, JDBC, EDS Tool
RDBMS : MySQL 5.5
Servers : Tomcat
Duration : 12 Months (Nov’ 10 to Oct’ 11)
Description : The eReferrals overpayments application provides a solution that allows
the user to complete online debt referrals. The service pre-populates, where possible, the debt
referral screens with client data held in CIS, retrieved using a keyed NINO for the case.
The application at its most basic will:
• Create an eReferral to Debt Manager via user input and a data scrape from CIS.
• Route the referral through the approval process.
• Collate daily approved referrals into a Debt Referral Batch file.
• Dispatch the file to Debt Manager.
• Hold a copy of the file on eReferrals for a 7-day period only.
Incoming data will consist of client data from CIS in XML format. Outgoing data destined for Debt
Manager will consist of an XML document made up of debt referrals, plus various reports in CSV
format.
As a Senior Team Member, was responsible for
• Involved in implementing GUI as per requirement.
• Involved in LLD, Functional Design and Technical Design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in servlets (see the sketch after this list).
• Implemented client-side validations using JavaScript.
• Implemented JDBC components.
• Preparing Unit Test Cases.
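A minimal sketch of the servlet controller pattern used here: the servlet validates input server-side (mirroring the client-side JavaScript checks) and forwards to a JSP. The parameter, field handling and JSP paths are hypothetical.

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Controller servlet for a referral form submission.
public class ReferralController extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String nino = req.getParameter("nino");
        if (nino == null || nino.trim().isEmpty()) {
            // Server-side check mirrors the client-side JavaScript validation.
            req.setAttribute("error", "NINO is required");
            req.getRequestDispatcher("/referral.jsp").forward(req, resp);
            return;
        }
        req.setAttribute("nino", nino.trim().toUpperCase());
        req.getRequestDispatcher("/confirmReferral.jsp").forward(req, resp);
    }
}
```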
Project #5 : E forms
Client : Public Sector, UK Government, Preston, UK
Environment : Core Java, HTML, JavaScript, JSP, Servlets, JDBC, RTC
RDBMS : MySQL 5.5
Servers : Tomcat
Duration : 9 Months (Feb’ 10 to Oct’ 10)
Description : E forms is a service available to all staff via the DWP intranet that allows
DWP staff to submit various personal forms online. It was initially developed to give staff
access to their pay information and to let them submit expense and overtime claims online.
However, the majority of this work has now been taken over by the strategic resource management
system; E forms continues to provide services for a few pay/expenses forms.
As a Senior Team Member, was responsible for
• Involved in implementing GUI as per requirement.
• Involved in LLD, Functional Design and Technical Design.
• Involved in creating the web pages using JavaServer Pages.
• Implemented controller logic in servlets.
• Implemented client-side validations using JavaScript.
• Implemented JDBC components.
• Preparing Unit Test Cases.
Project #6 : ACG Italy
Client : ACG VISION4, Italy
Environment : Java, JSON, Struts, Hibernate, WebSphere, RSA, ClearCase, DB2, JUnit, RQM
RDBMS : DB2
Duration : 18 Months (Aug ’08 – Jan’ 10)
Description : ACG Vision4 is an ERP product that covers all processes of the company. It
is organized by process, designed to be used easily by all the different types of users, and
flexible in its ability to integrate and interact with other systems inside and outside the
company; its functions and data are accessible from the most popular office automation systems.
It is data-independent, intuitive and simple.
It has modules such as Finance, Controlling, Supply Chain Management and SVM. ACG developed
Vision4 taking advantage of IBM's best technology and the most popular open-source technology on
the market. The database and platform infrastructure is built on IBM DB2 and IBM WebSphere, and
query, reporting and analysis are done with Cognos. The architectural construction, adhering to
the principles of SOA, was undertaken using proven industry standards such as Hibernate, Struts
and Dojo (Web 2.0); development was carried out entirely in Java, with everything installed on
multiple platforms: Linux, OS/400 and Windows.
As a Team Member, was responsible for
• Responsible for leading the project in various phases including design, development and Unit
testing of the application modules and management of a team.
• Worked as a Functional Group Leader for implementing the functional use cases.
• Developed reusable components which can be used in all modules.
• Developed proofs of concept for the ACG framework and provided technical solutions.
• Developed the UI using JSON.
• Implemented the Action classes using Struts.
• Wrote Hibernate components and conducted peer code reviews (see the sketch after this list).
• Performed requirement analysis with the business users and onshore counterparts.
• Involved in enhancements, debugging, troubleshooting and defect fixing.
• Involved in unit testing, white box testing and integration testing of the application.
• Customized and explored new things on the ACG framework.
• Involved in assisting juniors.
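A minimal sketch of a Hibernate component in the style described above, assuming a Hibernate version where Configuration accepts annotated classes (3.6+) and a hibernate.cfg.xml on the classpath; the Invoice entity and its table are invented for illustration.

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;
import org.hibernate.cfg.Configuration;

// Illustrative Hibernate entity plus a save in the session/transaction idiom.
@Entity
@Table(name = "INVOICE")
public class Invoice {

    @Id
    private Long id;
    private String customerCode;
    private double amount;

    protected Invoice() { } // no-arg constructor required by Hibernate

    public Invoice(Long id, String customerCode, double amount) {
        this.id = id;
        this.customerCode = customerCode;
        this.amount = amount;
    }

    public static void main(String[] args) {
        // Reads connection settings from hibernate.cfg.xml on the classpath.
        SessionFactory factory = new Configuration().configure()
                .addAnnotatedClass(Invoice.class)
                .buildSessionFactory();
        Session session = factory.openSession();
        Transaction tx = session.beginTransaction();
        session.save(new Invoice(1L, "ACG-001", 199.0));
        tx.commit();
        session.close();
        factory.close();
    }
}
```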
Project #7 : eCIS (electronic Customer Information System)
Client : Service Master & MerryMaids, United States
Environment : Java, XML, Spring, Hibernate, JMS, Web Services and Ext JS
RDBMS : MSSQL Server 2005, Oracle
Servers : Tomcat 6 & JBoss 5.4
Duration : 11 Months (Sep '07 – July ’08)
Description : The eCIS (Electronic Customer Information System) provides services to
customers and maintains the list of services for different branches and franchises. eCIS is a
web-based, distributed web application maintaining customer and employee information for
different services. The end user is the main active participant in the system.
The following are the list of modules for customer:
Customer / Employee / Service / Account Receivables / Utilities / Maintenance
As a Team Member, was responsible for
• Involved in analysis, effort estimation, design & development of the Employee Audit, Email
Reminders, Audit Rules and Workflow feature enhancements.
• Involved in analysis, design and development of Customization Framework enhancements such as
Allocation Verification on Save.
• Involved in debugging, troubleshooting and defect fixing.
• Involved in setting up and configuring the development and deployment of the application.
• Designed & developed customized Ext JS components for the application.
• Implemented Spring Security for the application, the Ext delegate/facade data object controller,
domain locking, and exception handling using Spring.
EDUCATION
• Bachelor of Computer Science (BSc) from S.V. University.
• Master of Computer Applications (MCA) from S.V. University.