Ramesh Kutumbaka Resume

Ramesh Kutumbaka

Phone: +91 9502023450 | E-mail: kutumbakaramesh@gmail.com

Seasoned data professional with 25+ years of experience in building and managing practices and global delivery across Big Data Analytics, big data migration from on-premise to GCP and Azure, EDW & BI, business analytics, SAP HANA, Predictive Analytics, Data QA, solution automation, big data frameworks & methodologies, and data products development.

Roles and focus areas: Big Data Delivery Head, Practice Head / Practice Manager, Solution Architect, Delivery Manager, Data Manager; Big Data on GCP, Azure, and AWS cloud platforms; EDW & BI; product development for automation of data ingestion, Data QA, and insights; Agile Scrum methodology; Hadoop, Spark, Machine Learning (ML) & Artificial Intelligence, IoT, and Predictive Analytics.

Professional Summary

- A dynamic, seasoned big data analytics evangelist with 25+ years of experience in global delivery, practice leadership and management, solution architecture, and building automated solutions; adept across a diversified Big Data Analytics and EDW & BI tool and technology stack and multiple cloud platforms.
- Grew the practice from a 40-member DataStage team to a 200+ member team across a diversified tool and technology stack, including Big Data Analytics, Data Science, EDW & BI, Data QA, and SAP HANA teams.
- Good working knowledge of open-source and cloud-platform (GCP, Azure, and AWS) tool and technology stacks.
- Helped customers with strategy, new solution offerings, and refactoring/modernizing existing solutions to shorten idea-to-market and time-to-market and improve business growth.
- Developed knowledge-management and knowledge-sharing frameworks and IPs, and automation for big data ingestion, data processing, and Data QA.
- Designed and developed the Big Data ETL Automation Testing (BEAT) and CARBON data ingestion automation solutions.
- Developed the DevOps for Managing Data Integrations framework, the Hadoop Cluster Automation framework, and the Cluster Automation Framework extension.
- Designed the 4ilytics platform, an end-to-end automation of data to insights.
- Designing an automated, modernized self-serve health-check monitoring solution for the big data production engineering environment.
- Evaluated data integration tools together with the tool vendors for a customer in London.
- Worked as a solution architect to prepare a data warehouse cookbook for a customer in London.
- Provided a BI roadmap strategy for a customer in South Africa.
- Led the Enterprise Mobility CoE and built competencies in Mobile Enterprise Application Platforms (MEAP) / Mobile Application Development Platforms (MADP) such as PhoneGap, Titanium Appcelerator (Classic & Alloy), jQuery Mobile, Oracle ADF Mobile, Sencha, and Oracle Database Mobile Server (ODMS), as well as Web Services, Oracle JDeveloper, Java J2EE, JMS, and the TIBCO messaging system.
- Worked extensively on application development environments, using open-source systems, web servers, application servers, IDEs, design patterns, and open-source frameworks.
- Designed multi-tier architectures for applications and products.
- Qualified Six Sigma Black Belt at Satyam Computer Services Ltd and certified Satyam Project Management Practitioner (SPMP).

Specialties: Strategic Planning, Big Data Analytics, Machine Learning, Artificial Intelligence, Predictive Analytics, IP Development, Solution Architecting, Practice Management, Data Integration, Data Governance, Data Management, Knowledge Management, Pre-sales Support, and Agile Scrum Methodology.

Organizational Experience

Organization | Designation/Role | Duration
Whisk Software Private Limited | Senior Director | 09/2015 – 06/2019
Apps Associates Pvt Ltd | Practice Manager | 12/2010 – 06/2015
Mahindra Satyam | Band-Bi2 – Program Manager | 10/2002 – 12/2010
Meteor Management Services Private Limited, India | Team Leader | 09/2002 – 10/2002
Powerhare, India (P) Ltd, India | Team Leader | 07/2001 – 09/2002
Fortune 500 Systems Ltd, USA | Programmer/Analyst | 08/2000 – 06/2001
KDSS (P) Ltd, India | Member, Product Development | 02/2000 – 07/2000
Info India Pvt Ltd, India | Software Engineer | 06/1999 – 02/2000
Maxpro Consultancy Services, India | Programmer | 01/1997 – 05/1999

Qualifications

1993 – Bachelor of Engineering (B.E.) in Electronics & Communication, Karnatak University.

Experience

Whisk Software India Private Limited / GSPANN Technologies – 09/2015 to 06/2019
Senior Director – Information Analytics Practice Head (offshore) – Global Delivery & Products Development: Big Data Analytics, Data Lakes on GCP & Azure, EDW & BI, and Predictive Analytics

Global Delivery: Responsible for practice build-out, solution architecture, and delivery assurance for six key customers. Delivery assurance responsibilities included:

- Provided a refactoring/modernization solution architecture (on-premise to the Azure big data cloud platform) to address the challenges in the existing architecture, reduce TCO, improve performance, and accelerate time to market.
- Ingested data generated by various machines to create a data lake for machine learning.
- Migrated big data from on-premise EDW and Hadoop cluster applications to GCP BigQuery (a hedged sketch of this staging-and-load pattern follows this list).
- Delivered DataStage enhancement projects and data migration from on-premise to Azure, and later implemented the data migration from on-premise to GCP.
- Data warehouse / data integration design and implementations, MDM integration implementations, and data governance & data security automation projects.
- Defined the strategy for the production engineering BI platform, designed and implemented the automation process, and prepared an end-to-end data flow diagram for the platform.
- Guided the team, identified and documented frequent failure scenarios with appropriate corrective actions, and worked with the team and other stakeholders to ensure critical and recurring production issues were addressed on time.
- Strategized and implemented the use of incident management as a knowledge base, and created a knowledge base for the production engineering team covering the job inventory, contacts (engineering, business, DevOps, and other teams), frequent failure scenarios, and critical business issues and their resolutions.
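To illustrate the BigQuery migration pattern referenced above, here is a minimal, hedged sketch of one common shape of that move: extract an on-premise Hive table with PySpark, stage it as Parquet on Cloud Storage, and load it into BigQuery with the Google Cloud client library. The table, bucket, and project names are hypothetical placeholders, and the sketch assumes the GCS connector and the google-cloud-bigquery package are available.

from pyspark.sql import SparkSession
from google.cloud import bigquery

spark = (SparkSession.builder
         .appName("hive-to-bigquery-migration")
         .enableHiveSupport()
         .getOrCreate())

# 1. Extract the on-premise Hive table and stage it as Parquet on Cloud Storage
#    (assumes the GCS connector is configured on the cluster).
df = spark.table("sales_db.orders")                                 # hypothetical source table
df.write.mode("overwrite").parquet("gs://staging-bucket/orders/")   # hypothetical bucket

# 2. Load the staged Parquet files into a BigQuery table.
client = bigquery.Client(project="example-project")                 # hypothetical project
job = client.load_table_from_uri(
    "gs://staging-bucket/orders/*.parquet",
    "example-project.analytics.orders",                             # hypothetical target table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition="WRITE_TRUNCATE",
    ),
)
job.result()  # block until the load job finishes

The same two-step shape (stage to object storage, then bulk load) also applies to the Azure path, with Blob storage and the corresponding loader in place of Cloud Storage and BigQuery.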
Built Practice Capabilities:

- Big Data Batch Ingestion & Processing: Open source – Hadoop, Sqoop, NiFi, Flume, and Flink; GCP – Cloud Dataproc; Azure – HDInsight and Batch; AWS – Amazon Elastic MapReduce and AWS Batch; MapReduce, Hive, Pig, PySpark, Scala, and Java.
- Data Storage: Open source – Hive and HBase; GCP – Cloud Storage buckets; Azure – Azure Blob; AWS – S3 buckets.
- Streaming Ingestion & Processing: Open source – Spark Streaming, Flume, Kafka, Elastic Logstash, Samza, Gobblin, and Storm; GCP – Pub/Sub and Dataflow; Azure – Event Hubs, Service Bus, and Stream Analytics; AWS – Kinesis and Kinesis Firehose (a minimal Kafka-to-data-lake sketch follows this list).
- Analytics: Open source – Hive and HBase; GCP – BigQuery; Azure – Data Lake Analytics and Data Lake Store; AWS – Redshift and Athena.
- Cloud Platforms – GCP, Azure, and AWS.
- ETL & BI Tools – DataStage, Informatica, Talend, SSIS, Tableau, MSTR, Cognos, QS, QV, and Domo.
- Data Virtualization – Denodo.
- DBA – MS SQL Server, DB2, Oracle, and MongoDB.
- SAP Tools & Technology Stack – HANA modeling and BODS.
- Predictive Analytics – ML, R, and Python.
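The streaming capability listed above can be illustrated with a minimal Spark Structured Streaming sketch that reads a Kafka topic and lands the events as Parquet in a data-lake path. This is an assumption-laden example rather than code from any engagement: the broker address, topic, and paths are invented, and the spark-sql-kafka package is assumed to be on the cluster.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")   # hypothetical broker
          .option("subscribe", "machine-events")               # hypothetical topic
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; cast the payload to string for downstream parsing.
parsed = events.select(
    col("key").cast("string").alias("key"),
    col("value").cast("string").alias("payload"),
    col("timestamp"),
)

query = (parsed.writeStream
         .format("parquet")
         .option("path", "gs://data-lake/raw/machine_events/")           # hypothetical landing zone
         .option("checkpointLocation", "gs://data-lake/_chk/machine_events/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()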
Contribution to Improving Margins & Business Growth:

- Practice growth and resource utilization: grew the practice from 40 to 200+ while maintaining 100% billing throughout. Referred and brought strong resources (data scientists, data architects, data modelers, delivery managers, ETL & BI, Data QA, and Hadoop specialists) into the organization from my network. Improved practice margins by implementing a pyramid resource deployment structure and effective utilization of resource bandwidth.
- Supported the sales team in mining existing accounts and winning new business:
  - Proofs of value
  - Case studies
  - White papers
  - Frameworks
  - RFP responses

Products Developed

BEAT – Big Data ETL Automation Testing
BEAT addresses traditional ETL testing challenges with 100% end-to-end ETL test automation, delivering complete data validation and governance so business users and SMEs can trust the data behind their decisions. The solution is intuitive and customizable to the data testing needs, and has been implemented for a couple of key customers (a source-to-target reconciliation sketch in this spirit follows the objectives list). The product meets the following objectives:
  - 100% automation with zero programming
  - Broader test coverage
  - Complete data validation
  - Accelerated testing
  - Cost savings
  - Faster time to market
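The sketch below is not BEAT itself; it is a minimal PySpark illustration, under assumed table and column names, of the kind of source-to-target reconciliation such a tool automates: row counts, measure totals, and key-level differences between a staging source and the warehouse target.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("etl-reconciliation")
         .enableHiveSupport()
         .getOrCreate())

source = spark.table("staging.orders")      # hypothetical source table
target = spark.table("warehouse.orders")    # hypothetical target table

# Check 1: row counts must match between source and target.
src_count, tgt_count = source.count(), target.count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Check 2: a measure total must reconcile end to end.
src_sum = source.agg(F.sum("order_amount")).first()[0]
tgt_sum = target.agg(F.sum("order_amount")).first()[0]
assert src_sum == tgt_sum, f"order_amount mismatch: {src_sum} vs {tgt_sum}"

# Check 3: key-level diffs surface records missing from, or extra in, the target.
missing_in_target = source.join(target, on="order_id", how="left_anti")
extra_in_target = target.join(source, on="order_id", how="left_anti")
print(f"Missing in target: {missing_in_target.count()}, extra in target: {extra_in_target.count()}")

In a production harness these checks would be driven from metadata rather than hard-coded, and results would be written to a test-results store instead of asserted inline.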
CARBON – Automation of Data Ingestion
Designed the data ingestion product alongside delivery for key accounts. CARBON is an accelerator that automates the ingestion of big data from any data source to any cloud-platform DW database, creating a healthy data ecosystem that accelerates business insights. The objective is to reduce data ingestion time from months to weeks and from weeks to hours; it has been implemented for one of the key customers (a config-driven sketch of the underlying idea follows the objectives list). CARBON meets the following objectives:
  - 100% automation of data ingestion
  - Data auditing
  - Data validation
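As a rough, assumption-laden sketch of the config-driven ingestion idea behind CARBON (not the product's actual implementation), the loop below reads a small feed catalogue and pulls each JDBC source into a landing zone with an audit column. The YAML layout, connection strings, table names, and paths are invented for illustration, and PyYAML plus the matching JDBC drivers are assumed to be available.

import yaml
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("config-driven-ingestion").getOrCreate()

# A small metadata catalogue drives the engine, so onboarding a new feed needs no new code.
config = yaml.safe_load("""
feeds:
  - name: customers
    jdbc_url: jdbc:oracle:thin:@onprem-db:1521/ORCL
    table: crm.customers
    target: gs://data-lake/raw/customers/
  - name: invoices
    jdbc_url: jdbc:postgresql://onprem-pg:5432/erp
    table: billing.invoices
    target: gs://data-lake/raw/invoices/
""")

for feed in config["feeds"]:
    # Pull the source table over JDBC (the matching driver jar is assumed to be on the cluster).
    df = (spark.read.format("jdbc")
          .option("url", feed["jdbc_url"])
          .option("dbtable", feed["table"])
          .load())
    # Stamp an audit column so downstream validation and lineage can trace the load.
    audited = df.withColumn("_ingest_ts", F.current_timestamp())
    audited.write.mode("overwrite").parquet(feed["target"])
    print(f"Ingested feed '{feed['name']}' into {feed['target']}")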
4ilytics – An End-to-End Automation of Data Ingestion to Insights
4iLytics is an analytics platform that automates the path from data ingestion to insights. It consists of five automation components: 1) data lake layer creation, 2) data ingestion & data auditing, 3) data processing, 4) data validation, and 5) machine learning & artificial intelligence. It supports industry-leading open-source and cloud-platform tool and technology stacks to build data lakes from a variety of source systems and achieve a "single version of the truth" across the organization. 4iLytics earns the trust of business intelligence and business analytics users for their business decisions; it is an end-to-end data-to-decisions automation platform that reduces implementation time from months to weeks. All five components can be decoupled and used individually at various stages of the data journey or the data lifecycle. With a few clicks, governed data lakes, a data dictionary, data catalogs, a machine learning catalog, and diagnostic, predictive, and prescriptive capabilities can be automated. Business analysts can bring their own business use cases or scenarios and leverage 4iLytics to build machine learning models and diagnostic, predictive, and prescriptive analytics. Less technically savvy business users can search which data sets, KPIs, metrics, and dimensions are available in the data lake, EDW, and data marts for self-service BI and self-service ML modeling. The 4iLytics platform reduces TCO, delivers higher RoI, and shortens time to market.

Conferences

2018 Strata Data Conference – New York, September 11–13, 2018, Jacob K. Javits Convention Center, New York, United States.
Participated in the Strata Data Conference in September 2018 as an exhibitor, showcasing the BEAT, CARBON, and Nitrate products; the response to these products was tremendous.

Innovations – Frameworks & IPs Developed
Constant innovation as part of Analytics Labs, with ideation through research, framework and product build-outs, certification, learning, and knowledge sharing, aimed at incubating ideas that directly solve industry pain points.

- BEAT – Big Data ETL Automation Testing product
- CARBON – Data ingestion automation product
- 4ilytics – End-to-end automation of data to insights solution
- DeManD – DevOps for Managing Data Integrations framework
- Test Results Dashboard – Test results web portal framework
- CAFe – Cluster Automation Framework
- AHC – Automation of Hadoop Cluster framework
- Proactive Monitor – Production engineering big data health framework
- Map Utility – Monitoring application for PDFM

Apps Associates Pvt Ltd – 12/2010 to 06/2015
BI Practice Manager

- Big Data & advanced analytics initiatives
- Global projects delivery
- Built competencies in DW & BI, analytics, and enterprise mobility tools & technologies
- Built new solution & service offerings in BI and enterprise mobility
- Developed a Data Vault methodology for EDW & data mart implementations
- Developed an enterprise mobile apps implementation methodology
- Strategic initiatives and asset building
- New service offerings
- Built proofs of value and presented them to customers and prospects
- Defended proposals
- Sales support

Mahindra Satyam & Satyam Computer Services, India – 10/2002 to 12/2010
Architect / Project and Program Management

- Defined BI & DW solution architecture, conducted DW assessments and BI & DW tool evaluations, and owned building competencies in various tools & technologies
- Built IPs and frameworks such as the BI-Chassis framework, the Data Model Repository, and the iDecision framework
- Pre-sales support and proposal defence
- Conducted assessments and due diligence of BI/DS/DSS environments, recommended roadmaps and technical architecture, and defined BI strategy

Meteor Management Services Private Limited, India – 09/2002 to 10/2002
Architect

- Designed and developed the enterprise architecture for the Invenio Tech Server
- Guided the development and testing teams
- Built competencies in Oracle, Java, the Struts framework, XML schemas, DOM, Xerces 1.4/2.0.1, and design patterns, and developed methodologies

Powerhare, India (P) Ltd, India – 07/2001 to 09/2002
Architect

- Designed and developed the enterprise architecture for the Order Management and Portfolio Management products
- Guided the development and testing teams
- Built competencies in Oracle, Java, the Struts framework, XML schemas, DOM, Xerces 1.4/2.0.1, design patterns, and development methodologies

Fortune 500 Systems Ltd, USA – 08/2000 to 06/2001
Systems Analyst & Architect

- Programmer/analyst consultant for products and applications development

KDSS (P) Ltd, India – 02/2000 to 07/2000
Software Engineer

- Member of the data mining product development team; configured the product
- Tested the product

Info India Pvt Ltd, India – 06/1999 to 02/2000
Software Engineer

- Interacted with end users and prepared the business requirements
- Contributed to the design of the application
- Developed and tested the code

Maxpro Consultancy Services, India – 01/1997 to 05/1999
Programmer

- Developed and tested code

Suryovonics Limited – 03/1995 to 10/1996
Sr. Engineer

Administrative Staff College of India – 03/1994 to 03/1995
Research Associate

PERSONAL DETAILS

- Residential address: House # 2-4-821/1, Gate # 19/7, Road # 1, New Nagole Colony, Hyderabad – 500035
- Passport No.: G4092037
- Business Visa (USA): valid up to 10 Feb 2021
