Sunshine consulting Mopuru Babu CV_Java_J2ee_Spring_Bigdata_Scala_Spark

MOPURU BABU
Mobile: +1-321-210-3823
Email: babu.m@sunshineconsulting.co

EXPERIENCE SUMMARY

Over 9 years of professional experience in software development with Java technologies, including 2 years in Hadoop development using Hadoop, Hive, HBase, MapReduce, Pig, Sqoop, Oozie, shell scripting, YARN, Scala, and Spark, covering the design, development, and deployment of n-tier, enterprise-level distributed applications.

• Working as a Hadoop developer for 3 years on Hadoop platforms such as Cloudera and Hortonworks, with over six years of experience in software development on Java and the Spring framework (Spring IoC/Core, Spring DAO support, Spring ORM, Spring AOP, Spring Security, Spring MVC, Spring Cache, and Spring Integration).
• Expert knowledge of J2EE design patterns such as MVC architecture, Front Controller, Session Facade, Business Delegate, and Data Access Object for building J2EE applications.
• Designed and developed several multi-tier web-based, client-server, and multithreaded applications using Object-Oriented Analysis and Design concepts and Service-Oriented Architecture (SOA), mostly in cross-platform environments.
• Excellent working knowledge of popular frameworks such as Struts, Hibernate, and Spring MVC.
• Developed core modules in large cross-platform applications using Java, J2EE, Spring, Struts, Hibernate, JAX-WS (SOAP)/JAX-RS (REST) web services, and JMS.
• Good understanding of HDFS design, daemons, federation, and HDFS high availability (HA).
• Experience in developing Hive and Pig scripts.
• Extensive knowledge of creating Hive tables with different file compression formats.
• Hands-on experience in installing, configuring, and administering Hadoop cluster components such as MapReduce, HDFS, HBase, Hive, Sqoop, Spark, Pig, ZooKeeper, Oozie, and Flume using the Apache code base.
• Experience in managing scalable Hadoop clusters, including cluster design, provisioning, custom configuration, monitoring, and maintenance, using the Cloudera CDH distribution.
• Good experience using Apache Spark.
• Worked on a prototype Apache Spark Streaming project and converted our existing Java Storm topology.
• Experience managing the Cloudera distribution of Hadoop (Cloudera Manager).
• Excellent understanding of NoSQL databases such as HBase.
• Extensive working knowledge of setting up and running clusters, monitoring, data analytics, sentiment analysis, predictive analysis, and data presentation in the big data world.
• Hands-on experience working with structured and unstructured data in various file formats such as XML, JSON, and sequence files using MapReduce programs (see the sketch below).
• Extensive experience writing SQL queries in HiveQL to perform analytics on structured data.
• Expertise in data load management, importing and exporting data using Sqoop and Flume.
• Performed different Pig operations, joins, and transformations on data to join, clean, aggregate, and analyze it.
• Excellent interpersonal and communication skills; creative, research-minded, technically competent, and result-oriented, with problem-solving and leadership skills.
• Expert in database and RDBMS concepts using MS SQL Server and Oracle 10g.
• Expertise in web development technologies such as HTML, CSS, and JavaScript.
• Expertise in working with different methodologies such as Waterfall and Agile.
• Proficient in using databases such as MySQL, MS SQL Server, and DB2.
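As a rough illustration of the MapReduce work described above, the following is a minimal sketch of a Hadoop MapReduce job in Java. The job name, field layout, and paths are hypothetical, not taken from any of the projects below.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Minimal MapReduce job: counts records per key field in tab-separated input.
    public class RecordCount {

        public static class FieldMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text outKey = new Text();

            @Override
            protected void map(LongWritable offset, Text line, Context context)
                    throws IOException, InterruptedException {
                // Emit the first tab-separated field of each record with a count of 1.
                String[] fields = line.toString().split("\t");
                if (fields.length > 0 && !fields[0].isEmpty()) {
                    outKey.set(fields[0]);
                    context.write(outKey, ONE);
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> counts, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable c : counts) {
                    sum += c.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "record-count");
            job.setJarByClass(RecordCount.class);
            job.setMapperClass(FieldMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input dir
            FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not exist yet
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }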
SOFTWARE SKILLS

Primary Skills: Analysis, Design, Development, Implementation, Testing & Packaging
Languages: Java
Big Data Skills: Hadoop, MapReduce, Hive, Pig, Sqoop, Oozie, Scala, Spark
RDBMS: MS SQL Server 2000/2005, DB2, Oracle 8.x/9i/10g
NoSQL: HBase, MarkLogic
Internet Technologies: JSP, HTML, XML & CSS
Scripting Languages: JavaScript, JSON, AngularJS
Application Servers: WebSphere 6.0, WebLogic 8.1, JBoss 5.0
Web Server: Tomcat 5.0
Frameworks: Struts, Spring, Hibernate, Log4j
CM Tools: IBM ClearCase, IBM Rational Team Concert (thick client & web versions), WinCVS, SVN, Git
Defect Tracking Tools: IBM TSRM, IBM Rational Team Concert, IBM Rational Quality Manager
Build Tools: Apache Ant 1.6.5
Testing Tools: JUnit
IDE & GUI: Eclipse 3.3, IBM RAD, IBM RSA & NetBeans
Operating Systems: Windows 7, Windows 95/98/ME/NT/XP, Unix & Linux
UML Modeling Tools: StarUML
Web Technologies: Servlets, JSP

INDUSTRY EXPERIENCE
• UK Public Sector
• Finance Sector
• Retail Sector

Achievements at Workplace
• Received the Star Performer award at IBM.
• Appreciated for committed and reliable work.
• Received Value awards and Deep Skill Adder awards at IBM.
• Received much appreciation for teamwork and for adapting to new technologies, along with many client appreciations.

Interpersonal Competencies
• Ability to interact successfully with multiple teams across the global organization, including services and support for all regions.
• Strong mathematical and analytical background with innovative thinking and creative action.
• Sound technical skills, a quick grasp of business flows, and a self-learning professional.
• Value-added attitude.
• Zeal to learn new technologies.
Project #1: SPST (Service Pac Product Selector Tool)
Client: IBM, Raleigh, NC
Environment: Hadoop, Hive, Pig, Sqoop, MapReduce, Java (JDK 1.7), Linux, MySQL, NoSQL, Cloudera, Spring
Duration: Apr '15 – till date

Description: This is a web-based online tool designed to help offering and sales managers and the global geos create, maintain, and distribute Service Pac product offering data worldwide. Along with release enhancements and maintenance activities, business analytics were also implemented. Delivered multiple big data use cases to support custom sales offerings using Hadoop, including KPIs and metrics such as the following.

Per country:
• Which offer description occurred the fewest and the most times
• Which offer number has the fewest and most occurrences
• Which offer type has the fewest and most occurrences
• Which part PIN has the fewest and most occurrences
• Which offer PIN has the fewest and most occurrences
• What the highest and lowest costs are; whether they are repeated, and if so, how many times and for which offer types
• The top 10 offer descriptions
• The top 10 offer types
• Which offer description has the highest price and which has the lowest
• The difference between the highest price and the lowest price
• How many offers are released each year
• Which part PIN to suggest for the customer (suggest at least 3 categories)
• Which offer PIN not to suggest for the customer (3 categories that should not be suggested)
• How many records are running at a loss
• For the combination of the top offer description and the top offer code, how many part PINs are repeated
• How many offer numbers each country code has
• For each year, prepare a tab-separated file of the form:
  year  maxopencost  maxclosecost  offernumber  offercode  country

Zone-wide:
• All of the above, repeated for the entire zone
• How many country codes each zone has

Total:
• All of the above, repeated for all zones
• How many country codes there are in total, and how many each zone has, output as a CSV-formatted file, e.g.:
  zone1,AM,300
  zone1,AT,365

Different zones:
• From each zone, which country has the most machine types
• From each zone, which country has the highest price, with complete details
• Within the zone, what the highest price is, with complete details
• The latest announce date in a country and in a zone
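As an illustration only: a KPI such as the top 10 offer descriptions above can be computed with a single HiveQL aggregation. The sketch below runs such a query over HiveServer2's JDBC interface; the connection details and the table and column names (offers, offer_desc) are hypothetical stand-ins, not the project's actual schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Runs a HiveQL aggregation over HiveServer2's JDBC interface.
    public class TopOfferDescriptions {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            // Hypothetical HiveServer2 host and database.
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:hive2://localhost:10000/default", "hive", "");
                 Statement stmt = conn.createStatement()) {
                // Top 10 most frequent offer descriptions (hypothetical table/columns).
                ResultSet rs = stmt.executeQuery(
                        "SELECT offer_desc, COUNT(*) AS cnt " +
                        "FROM offers GROUP BY offer_desc " +
                        "ORDER BY cnt DESC LIMIT 10");
                while (rs.next()) {
                    System.out.println(rs.getString("offer_desc") + "\t" + rs.getLong("cnt"));
                }
            }
        }
    }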
As a Senior Team Member, was responsible for:
• Installation and configuration of a Hadoop cluster along with Hive.
• Developing a MapReduce application using the Hadoop MapReduce programming framework to pre-process large data sets in parallel across the Hadoop cluster.
• Developing the code for importing and exporting data into HDFS and Hive using Sqoop.
• Writing Hive queries to analyze data in the Hive warehouse using the Hive Query Language (HQL).
• Defining job flows in Oozie, which manages Apache Hadoop jobs as directed acyclic graphs of actions, for scheduling jobs.
• Developing Hive user-defined functions in Java, compiling them into JARs, adding them to HDFS, and executing them in Hive queries (see the sketch below).
• Managing and reviewing Hadoop log files.
• Managing data coming from different sources.
• Assisting in monitoring the Hadoop cluster.
• Dealing with a high volume of data in the cluster.
• Testing and reporting defects from an Agile methodology perspective.
• Consolidating all defects, reporting them to the PM/leads for prompt fixes by the development teams, and driving them to closure.
• Installing Hadoop ecosystem components (Hive, Pig, Sqoop, HBase, Oozie) on top of the Hadoop cluster.
• Importing data from SQL to HDFS and Hive for analytical purposes.
• Developing the controller, service, and DAO layers of the Spring Framework for the SPST project dashboard.
• Attending business requirement meetings, UAT support, onsite-offshore coordination, and work assignment.
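A minimal sketch of the kind of Hive UDF described above, written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the function name and its behavior (trimming and upper-casing a string) are hypothetical examples, not the project's actual UDFs.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical Hive UDF: normalizes a string column by trimming
    // whitespace and upper-casing it.
    public final class NormalizeText extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null;  // Hive passes NULL columns as null
            }
            return new Text(input.toString().trim().toUpperCase());
        }
    }

Compiled into a JAR, added to HDFS, and registered with ADD JAR and CREATE TEMPORARY FUNCTION, such a function could then be called from HiveQL as, e.g., SELECT normalize_text(offer_desc) FROM offers.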
Project #2: Inventory Management
Client: Wal-Mart, Bentonville, AR
Environment: CDH4, Eclipse, HDFS, Hive, MapReduce, Spark, Spark-SQL, Oozie, Sqoop, Pig
Duration: 22 months (Jun '13 – Mar '15)

Description: The application's tasks include data extraction from various input sources such as DB2 and XML into a Cassandra database. The incoming data relates specifically to Wal-Mart store details such as address, alignments, divisions, and departments. This data is filtered according to the business logic and stored in the respective column families. Wrote some of the services to retrieve the data from Cassandra.

As a Senior Team Member, was responsible for:
• Developing Spark code using Scala and Spark-SQL/Streaming for faster testing and processing of data (see the sketch below).
• Importing data from different sources such as HDFS and HBase into Spark RDDs; developing a data pipeline using Kafka and Storm to store data into HDFS; performing real-time analysis on the incoming data.
• Automating the extraction of data from warehouses and weblogs by developing workflows and coordinator jobs in Oozie.
• Developing Scala scripts and UDFs, using both DataFrames/SQL and RDD/MapReduce in Spark, for data aggregation and queries, and writing data back to the OLTP system directly or through Sqoop.
• Exploring Spark to improve the performance and optimization of the existing algorithms in Hadoop, using the Spark context, Spark-SQL, and Spark on YARN.
• Performance optimization when dealing with large datasets, using partitioning, Spark's in-memory capabilities, broadcasts in Spark, effective and efficient joins, transformations, and other heavy lifting during the ingestion process itself.
• Using the Spark API over Hortonworks Hadoop YARN to perform analytics on data in Hive.
• Performing transformations such as event joins, filtering bot traffic, and some pre-aggregations using Pig.
• Developing MapReduce jobs to convert data files into the Parquet file format.
• Developing business-specific custom UDFs in Hive and Pig.
• Configuring Oozie workflows to run multiple Hive and Pig jobs, which run independently based on time and data availability.
• Optimizing MapReduce code and Pig scripts, and performance tuning and analysis.
• Involvement in the design, development, and testing phases of the software development life cycle.
• Performing Hadoop installation, updates, patches, and version upgrades when required.
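The bullets above describe Scala and Spark-SQL work; as a rough, self-contained illustration (kept in Java for consistency with the other sketches here), the following reads a store extract into a DataFrame, runs a Spark-SQL aggregation, and writes Parquet. All paths, table names, and column names are hypothetical.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    // Hypothetical Spark-SQL job: aggregate store records by division
    // and persist the result in Parquet format.
    public class StoresByDivision {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("stores-by-division")
                    .getOrCreate();

            // Hypothetical tab-separated store extract on HDFS.
            Dataset<Row> stores = spark.read()
                    .option("header", "true")
                    .option("sep", "\t")
                    .csv("hdfs:///data/stores");
            stores.createOrReplaceTempView("stores");

            // Count stores per division with Spark-SQL.
            Dataset<Row> byDivision = spark.sql(
                    "SELECT division, COUNT(*) AS store_count " +
                    "FROM stores GROUP BY division");

            // Write the aggregate back out as Parquet.
            byDivision.write().mode("overwrite")
                    .parquet("hdfs:///warehouse/stores_by_division");
            spark.stop();
        }
    }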
Project #3: DWP CIS (Customer Information System)
Client: Public Sector, UK Government, Preston, UK
Environment: Core Java, J2EE, JSP, Struts, web services, Altova XMLSpy, WAS 6.0, RAD, Apache Ant, Oracle SQL Developer, RTC 3.0
RDBMS: Oracle
Duration: 19 months (Nov '11 – May '13)

Description: CIS is the Customer Information System, which holds all the personal details of UK citizens, along with the benefits they are registered and entitled for. CIS is the primary system for the personal details and a slave for the benefit and award details stored in the database. CIS was formed in 2004 by merging two systems: DCI (Department Central Index) and PDCS (Personal Data Computer Store). These were two entirely separate systems that never communicated with each other, so whenever a change happened in one system, it was not reflected in the other. As part of Release 1, the CIS database was built by having the information flow from DCI and PDCS into CIS. PDCS was COBOL-based, and all systems that used to talk to PDCS now talk to CIS through batch and online interfaces. Dialogues were replaced by SEF screens and functions, meaning the mainframe dialogues were replaced by browser-based screens. The CIS online functions that replace DCI can be decomposed into a number of discrete function types, such as primary access, secondary access, data link requests, and inter-service access.

As a Senior Team Member, was responsible for:
• Implementing the GUI as per requirements.
• Creating the functional design and technical design.
• Coding business logic, persisting data with Struts and JDBC, unit testing, and integration testing.
• Developing reusable components that can be used in all modules.
• Supporting UAT activities and production issues, fixing bugs, and defect tracking.
• Creating XSDs and updating them based on the business requirements.
• Debugging, troubleshooting, and defect fixing.
• Testing the web services and integrating with external vendors and internal clients.
• Assisting developers in the technical design, construction, and unit testing phases.
• Analyzing and understanding the architectural requirements.
• Proposing new design solutions that exceed the client's expectations.
• Conducting peer reviews of code and functional documents.

Project #4: E Referrals
Client: Public Sector, UK Government, Preston, UK
Environment: Java, HTML, JSP, J2EE, JDBC, EDS Tool
RDBMS: MySQL 5.5
Servers: Tomcat
Duration: 12 months (Nov '10 – Oct '11)

Description: The eReferrals overpayments application provides a solution that allows the user to complete online debt referrals. The service pre-populates, where possible, debt referral screens with client data held in CIS, retrieved using a keyed NINO for the case. At its most basic, the application will:
• Create an eReferral to Debt Manager via user input and a data scrape from CIS.
• Route the referral through the approval process.
• Collate daily approved referrals into a debt referral batch file.
• Dispatch the file to Debt Manager.
• Hold a copy of the file on eReferrals for a 7-day period only.
Incoming data consists of client data from CIS in XML format. Outgoing data destined for Debt Manager consists of an XML document made up of debt referrals, plus various reports in CSV format.

As a Senior Team Member, was responsible for:
• Implementing the GUI as per requirements.
• Low-level design, functional design, and technical design.
• Creating the web pages using JavaServer Pages.
• Implementing controller logic in servlets (see the sketch below).
• Implementing client-side validations using JavaScript.
• Implementing JDBC components.
• Preparing unit test cases.
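A minimal sketch of the servlet-plus-JDBC controller pattern described above; the URL, form fields, database schema, and credentials are hypothetical stand-ins, not the application's actual ones.

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical controller servlet: validates a referral form and
    // persists it over JDBC before forwarding to a confirmation JSP.
    public class ReferralServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String nino = req.getParameter("nino");      // hypothetical form field
            String amount = req.getParameter("amount");  // hypothetical form field
            if (nino == null || nino.isEmpty()) {
                req.setAttribute("error", "NINO is required");
                req.getRequestDispatcher("/referral.jsp").forward(req, resp);
                return;
            }
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:mysql://localhost:3306/referrals", "app", "secret");
                 PreparedStatement ps = conn.prepareStatement(
                         "INSERT INTO referral (nino, amount) VALUES (?, ?)")) {
                ps.setString(1, nino);
                ps.setBigDecimal(2, new java.math.BigDecimal(amount));
                ps.executeUpdate();
            } catch (Exception e) {
                throw new ServletException("Failed to save referral", e);
            }
            req.getRequestDispatcher("/confirmation.jsp").forward(req, resp);
        }
    }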
Project #5: E forms
Client: Public Sector, UK Government, Preston, UK
Environment: Core Java, HTML, JavaScript, JSP, Servlets, JDBC, RTC
RDBMS: MySQL 5.5
Servers: Tomcat
Duration: 9 months (Feb '10 – Oct '10)

Description: E forms is a service available to all staff via the DWP intranet that allows DWP staff to submit various personal forms online. It was initially developed to allow staff access to their pay information and to submit expense and overtime claims online. However, the majority of this work has now been taken over by the strategic resource management system; E forms continues to provide services for a few pay and expense forms.

As a Senior Team Member, was responsible for:
• Implementing the GUI as per requirements.
• Low-level design, functional design, and technical design.
• Creating the web pages using JavaServer Pages.
• Implementing controller logic in servlets.
• Implementing client-side validations using JavaScript.
• Implementing JDBC components.
• Preparing unit test cases.

Project #6: ACG Italy
Client: ACG VISION4, Italy
Environment: Java, JSON, Struts, Hibernate, WebSphere, RSA, ClearCase, DB2, JUnit, RQM
RDBMS: DB2
Duration: 18 months (Aug '08 – Jan '10)

Description: ACG Vision4 is an ERP product that covers all processes of the company. It is designed to be used easily by all the different types of users, is flexible in its ability to integrate and interact with other systems inside and outside the company, and makes its functions and data accessible from the most popular office automation systems; it is data-independent, intuitive, and simple. It has modules such as Finance, Controlling, Supply Chain Management, and SVM. ACG developed Vision4 taking advantage of IBM's best technology and the most popular open-source technology on the market. The database and platform infrastructure are built on IBM DB2 and IBM WebSphere, and query, reporting, and analysis are done with Cognos. The architectural construction adheres to the principles of SOA, using proven industry standards such as Hibernate, Struts, and Dojo (Web 2.0); development was carried out entirely in Java, with everything installed on multiple platforms: Linux, OS/400, and Windows.

As a Team Member, was responsible for:
• Leading the project in various phases, including design, development, and unit testing of the application modules, and management of a team.
• Working as a functional group leader implementing the functional use cases.
• Developing reusable components that can be used in all modules.
• Developing proofs of concept for the ACG framework and providing technical solutions.
• Developing the UI using JSON.
• Implementing the Action classes using Struts (see the sketch below).
• Writing Hibernate components and conducting peer code reviews.
• Requirement analysis with the business users and the onshore counterparts.
• Involvement in enhancements, debugging, troubleshooting, and defect fixing.
• Unit testing, white-box testing, and integration testing of the application.
• Customizing and exploring new things in the ACG framework.
• Assisting junior team members.
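A minimal sketch of a Struts 1-style Action class of the kind referenced above; the request parameter and the forward names (defined in struts-config.xml) are hypothetical.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.struts.action.Action;
    import org.apache.struts.action.ActionForm;
    import org.apache.struts.action.ActionForward;
    import org.apache.struts.action.ActionMapping;

    // Hypothetical Struts 1 Action: validates a request parameter and
    // forwards to a mapping defined in struts-config.xml.
    public class ViewOrderAction extends Action {
        @Override
        public ActionForward execute(ActionMapping mapping, ActionForm form,
                                     HttpServletRequest request,
                                     HttpServletResponse response) throws Exception {
            String orderId = request.getParameter("orderId");  // hypothetical parameter
            if (orderId == null || orderId.isEmpty()) {
                return mapping.findForward("failure");
            }
            // A real implementation would delegate to a service/DAO layer here.
            request.setAttribute("orderId", orderId);
            return mapping.findForward("success");
        }
    }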
Project #7: eCIS (Electronic Customer Information System)
Client: ServiceMaster & MerryMaids, United States
Environment: Java, XML, Spring, Hibernate, JMS, web services, and Ext JS
RDBMS: MS SQL Server 2005, Oracle
Servers: Tomcat 6 & JBoss 5.4
Duration: 11 months (Sep '07 – Jul '08)

Description: eCIS, the Electronic Customer Information System, provides services to customers and maintains the list of services for the different branches and franchises. eCIS is a web-based, distributed web application that maintains customer and employee information for the different services; the end user is the main active participant in the system. The modules for the customer are: Customer, Employee, Service, Accounts Receivable, Utilities, and Maintenance.

As a Team Member, was responsible for:
• Analysis, effort estimation, design, and development of the Employee Audit, Email Reminders, Audit Rules, and workflow feature enhancements.
• Analysis, design, and development of customization framework enhancements, such as allocation verification on save.
• Debugging, troubleshooting, and defect fixing.
• Setting up and configuring the development and deployment of the application.
• Designing and developing customized Ext JS components for the application.
• Implementing Spring Security for the application, the Ext JS delegate/facade data object controller, domain locking, and exception handling using Spring.

EDUCATION
• Bachelor of Science in Computer Science (B.Sc.) from S.V. University.
• Master of Computer Applications (MCA) from S.V. University.
