Big Data Technical Solution Architect & Data Analyst
Senior Java/J2EE/SOA/ETL Solution Architect with Agile and SCRUM
Jayaram Parida
Mobile: +44-7478212120
Email: jairamparida@gmail.com
Jayaram Experience Summary
Total 19+ years of IT experience, including 3 years as a Big Data Technical Solution Architect and
10+ years of hands-on experience as a Senior Java/J2EE, SOA, ETL and DI Technical Architect.
Has worked across various domains (Banking, Telecom, Health Care and Logistics, for both product
development and services) in multiple MNCs (IBM and iGate) in India and abroad. Jayaram has
excellent technical and communication skills, covering requirement analysis through design and
development in day-to-day client and team interaction.
Big Data Primary Skills: Hadoop, HDFS, HBase, Hive, MapReduce, Kafka, Storm, YARN, Pig,
R, Python (Pandas), Talend, Cassandra, Sqoop, Hortonworks, Cloudera, GridGain, Pentaho,
CouchDB, Elasticsearch and Lumify.
Secondary Skills: Java/J2EE, EAI/SOA, Application Integrator/Solution Architect/Technology
Evangelist, SDLC system architecture, IBM SOA (WebSphere tools), Oracle SOA Suite 11g, BPEL,
Java, J2EE, ESB, Web Services, Struts, Spring and Hibernate using Eclipse, NetBeans and
JDeveloper with ADF, AIA and Oracle SOA Suite.
3+ Years as a Big Data Solution Architect/Data Scientist, focused on Data Analytics
Requirement, RealTime Analytics platform, Large Scale Data Store and analysis, with Business
Intelligence and Analytics Services.
10+ years as a Java and J2EE Technical Solution Architect for enterprise applications,
delivering solutions with SOA, Web Services and middleware EA integration for the Banking,
Finance and Energy/Utility service sectors. 5+ years' experience as an SOA/EAI Technical
Leader/EAI Lead with hands-on IBM SOA, Oracle SOA and Mule ESB design and development.
5+ years as an Onsite Technical Project Coordinator for onsite-offshore projects in Canada,
the UK and Europe.
3+ years of experience as an ETL and EAI Middleware Architect in fields like data
integration, data enrichment and processing with service-oriented architecture (SOA) and Business
Process Modeling (BPM), as well as implementation of SOA on SAP XI/BI, IBM DataStage and IBM
Cognos.
Role Responsibilities:
• Provide technical oversight and mentoring to peers and junior teammates to grow the team’s
overall capability using Agile and SCRUM
• Provide consultative subject matter scoping assistance to Technical Sales Specialists as
needed to produce accurate statements of work as a customer deliverable
• Deliver customer facing consultative white-boarding sessions to help cultivate Big Data
services opportunities in partnership with the Sales team as needed
• Attend and support the Sales team at the Executive Briefing Center by presenting and
explaining the value of proposed Big Data solutions
• Build effective relationships with the Sales community by establishing self and team as the
Subject Matter Experts (SMEs) and “trusted advisors” for the BI&A Services organization
• Present comprehensive analysis and documentation of the customer’s “As Is” state of
operations and recommendations for optimal “To Be” state. This requires proficiency in
creating and operating ROI & TCO models.
• Contribute to IP development, best practices, and knowledge sharing to drive DA
standardization.
As a Data Architect :
• 10+ years of business, software, or consulting experience with an emphasis on Business
Intelligence & Analytics and project management, including professional services experience.
- Exceptional interpersonal, communication and presentation skills at CxO, and CTO levels.
- 2+ years of experience with Apache Hadoop stack (e.g., YARN, MapReduce, Pig, Hive,
HBase, STORM)
- 10+ years of experience with related technologies (e.g. Java, Linux, Apache, Python)
- Knowledge of NoSQL platforms (HBase, MongoDB, Cassandra, Elasticsearch, CouchDB,
Accumulo)
- 6 years of experience with ETL (Extract-Transform-Load) tools such as Talend.
- 3 years of experience with BI tools and reporting software (e.g. Microstrategy, Tableau,
Cognos, Pentaho)
- Demonstrated experience in effectively supporting a Big Data practice in a professional
Services firm
- Proven ability to generate and build offers sets that drive multi-million dollar services
projects
- Demonstrated experience in generating and closing identified revenue targets
- Demonstrated ability to multitask and work multiple projects concurrently
- Demonstrated ability to provide technical oversight for large complex projects and achieve
desired customer satisfaction from inception to deployment in a consulting environment
- Advanced analytical, problem solving, negotiation and organizational skills with
demonstrated ability to multi-task, organize, prioritize and meet deadlines
- Ability to interface with the client in a pre-sales fashion, and on an on-going basis as
projects progress
- Demonstrated initiative and positive can-do attitude
- High level of professionalism, integrity and commitment to quality
- Ability to work independently and as part of a solution team using Agile and Scrum
- Demonstrated attentiveness to quality and productivity
- Technical and project management experience with SCRUM & PMP
Hands-on Project Experience (past projects):
• Experience setting up, optimizing and sustaining distributed big data environments at scale
(5+ nodes, 1+ TB)
• Strong knowledge of Hadoop and related tools including STORM, Pig, Hive, Sqoop, Flume and
Impala
• Knowledge of one or more NoSQL databases such as Cassandra, plus Hadoop query engines
(Hive and Impala)
• Background with relational OLAP and OLTP databases, including one or more of PostgreSQL,
MySQL and SQL Server
• Experience writing scalable Map-Reduce jobs using general purpose languages such as Java,
Python
• Knowledge of common ETL packages/libraries, common analytical/statistical libraries and
common graphical display packages (SAS, Pentaho BA, Informatica Big Data, and Talend)
• Hands-on design and development of visualization tools for dashboards and reporting
(Tableau, MicroStrategy, QlikView and SAS)
• Experience documenting design, system builds, configurations, and support procedures;
ability to clearly articulate strategic business value, outside of technology, and present
ideas to a client concisely.
• Experience with standard documents used in support of the sales cycle (RFI/RFQ responses,
solution proposals, current-state/future-state diagrams, platform descriptions, etc.).
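As a minimal, framework-free illustration of the map/shuffle/reduce pattern referenced above, the following Python sketch counts words over hypothetical sample records; a real scalable job would of course run under Hadoop Streaming or the Java MapReduce API rather than in-process:

```python
from collections import defaultdict
from itertools import chain

def mapper(record):
    # Emit (key, 1) pairs, as a word-count map task would.
    for word in record.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    # Group values by key, as the framework's shuffle phase would.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()

def reducer(key, values):
    # Aggregate all values emitted for one key.
    return (key, sum(values))

records = ["Hadoop stores data", "Hadoop processes data"]
pairs = chain.from_iterable(mapper(r) for r in records)
result = dict(reducer(k, v) for k, v in shuffle(pairs))
# result["hadoop"] == 2, result["data"] == 2
```

The same three functions map directly onto a Hadoop Streaming mapper/reducer pair, with the framework providing the shuffle.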
Big Data Infrastructure Architect:
• Real-life Hadoop deployment & administration engineer with 3+ years of experience and
strong technical architecture skills
• Hadoop ecosystem design, configuration and deployment; architecture definition and
documentation for Hadoop-based production environments that scale to petabytes of data
storage and processing
• Additional knowledge of data and infrastructure architecture: design of cluster node
configuration, connectivity, capacity, compute architecture, NameNode/DataNode/JobTracker
deployment layout, server requirements, SAN, RAID configurations, etc.
• Hands-on practical working experience with leading big data technologies like HBase, Hive,
Pig, Oozie, Flume, Sqoop, Mahout, Hue, Hadoop, ZooKeeper and more.
• Deployed, administered and managed Hadoop software on large cluster implementations
• Well versed in installing & managing Apache, Hortonworks and Cloudera Hadoop data platforms
(CDH3, CDH4, Cloudera Manager, Hue and Hortonworks Ambari)
• Strong experience with deployments and administration in Linux/Unix environments
• Past experience installing & administering web/application servers, plus RDBMS database
design and modeling in a production data center environment
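As a sketch of the cluster-level configuration work described above: a production HDFS deployment typically pins the replication factor and the NameNode/DataNode storage directories in hdfs-site.xml. The property names below are standard Hadoop 2.x settings; the local paths are purely illustrative:

```xml
<configuration>
  <!-- Block replication: 3 is the usual production default -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <!-- Illustrative local paths for NameNode metadata and DataNode blocks -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:///data/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///data/hdfs/datanode</value>
  </property>
</configuration>
```

In practice these directories are mapped onto the per-node disk layout (JBOD vs. RAID, SAN mounts) decided during the capacity and node-configuration design mentioned above.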
Professional Experience and Accomplishments
Present company: iGate Global Solutions, London, UK, working as a Big Data Technical
Architect since Dec 2014; previously with iGate, India since Dec 2012.
Academic and Professional Credentials
Completed a Masters in Computer Science (MCS) from DIT and DOEACC, with a BSc (Hons) in Zoology
from Utkal University.
Professional Certification
• TOGAF 9 Certification
• SOA Certification from IBM.
• PMI PMP certification course, in preparation for the PMP exam
• Java 2 from Brainbench; Business English from Cambridge University
TRAINING
• Oracle Fusion SOA Suite 11g and Oracle Fusion Middleware, from Oracle Fusion Middleware
School, UK
• IBM WebSphere Message Broker, from IBM
• SAP Netweaver XI and EP from IBM
• Business Process Reengineering/Automation (BPR/BPA).
• GoF Design Patterns, Component Based Development (CBD) from IBM
• Performance Engineering, Software Quality Assurance from IBM.
• Personality improvement and behavioral patterns, from IBM.
• Presentation and Communication Skills, from IBM.
Big Data Project Details:
Currently engaged with Betfair, UK as an Onsite Big Data Technical Architect developing a real-time
auto-trading and monitoring analytics application for online sports and gaming.
1. Mineral Exploration Data Analysis for Chemical Composition: Collect large volumes of ore-sample
lab data from different labs, store it in the Hadoop file system, extract the mineral data, and
analyse the percentage of minerals in specific mines based on chemical composition.
Data Analysis: 1. Create clusters of ore deposits.
2. Classify ores based on mineral content.
3. Find the maximum presence of copper.
Client: Rio Tinto Innovation Center, Emerging Technology, Australia.
Tools and Languages: Linux with 3 data nodes (16 GB / 500 GB each), Hadoop 2.1, YARN, MR2, HBase,
Python, Pandas, Matplotlib and SciPy.
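To illustrate the three analysis steps listed above (clustering deposits, classifying ores, finding the maximum copper presence), here is a minimal Pandas/SciPy sketch over invented sample data; the column names cu_pct and fe_pct and the values are hypothetical, not from the actual project:

```python
import pandas as pd
from scipy.cluster.vq import kmeans2, whiten

# Hypothetical lab-sample data: % chemical composition per assayed ore sample.
samples = pd.DataFrame({
    "cu_pct": [2.1, 0.3, 1.9, 0.2, 2.4, 0.4],
    "fe_pct": [30.0, 55.0, 28.0, 60.0, 31.0, 58.0],
})

# 1. Cluster samples into deposit groups by chemical composition.
features = whiten(samples.to_numpy())  # normalize each column to unit variance
_, labels = kmeans2(features, k=2, minit="++", seed=42)
samples["cluster"] = labels

# 2. Classify: the copper-rich cluster is the one with the higher mean Cu%.
cu_rich = samples.groupby("cluster")["cu_pct"].mean().idxmax()

# 3. Find the maximum copper presence across all samples.
max_cu = samples["cu_pct"].max()
```

In the real pipeline the DataFrame would be loaded from files extracted out of HDFS rather than constructed inline.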
2. RT Drilling Well Data Analytics Platform (iRTAnalytics):
Project Description: This big data analytics platform showcases how real-time analytics use cases
can be solved using popular open-source technologies (Storm + Hadoop) to process real-time data
in a fault-tolerant and distributed manner. The major challenges: the platform needs to be always
available to process real-time feeds; it needs to be scalable enough to process hundreds of
thousands of messages per second; and it needs to support a scale-out distributed architecture to
process the stream in parallel. Once the data is collected from the sources, it can be processed
and analyzed for anomalies, with predictive-analysis rules for multiple hazardous well conditions.
The multi-well real-time dashboard is capable of monitoring multiple well drilling operations in a
single window. The system is also capable of storing real-time data in HDFS, HBase and any DW
system for batch log analysis and reporting.
Data Analysis: Stuck-pipe analysis, dashboard, alerts.
Client: Schlumberger, for real-time drilling monitoring and stuck-pipe analysis
Technology: Cloudera Data Platform 4.1, Kafka, Storm, Redis, HBase and Hive, jQuery,
WS and Dojo
Role: complete technical ownership, as a Technical Scrum Master, to architect, design and develop
the Hadoop real-time analytics platform as a POC using Scrum.
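As an illustrative sketch (not the actual project code) of the kind of per-message stuck-pipe rule such a platform evaluates, the pure-Python class below applies hypothetical torque/ROP thresholds over a sliding window; in production this logic would sit inside a Storm bolt consuming from Kafka:

```python
from collections import deque

class StuckPipeMonitor:
    """Flags a possible stuck-pipe condition when drilling torque stays high
    while rate of penetration (ROP) stalls, over a sliding window of readings.
    Field names and thresholds are illustrative assumptions."""

    def __init__(self, window=5, torque_limit=9.0, rop_floor=1.0):
        self.torque = deque(maxlen=window)
        self.rop = deque(maxlen=window)
        self.torque_limit = torque_limit
        self.rop_floor = rop_floor

    def on_message(self, reading):
        # One call per real-time sensor message for a well.
        self.torque.append(reading["torque"])
        self.rop.append(reading["rop"])
        if len(self.torque) < self.torque.maxlen:
            return None  # not enough history yet
        if min(self.torque) > self.torque_limit and max(self.rop) < self.rop_floor:
            return {"alert": "possible stuck pipe", "well": reading["well"]}
        return None

monitor = StuckPipeMonitor(window=3)
feed = [
    {"well": "W1", "torque": 8.0, "rop": 4.0},
    {"well": "W1", "torque": 9.5, "rop": 0.8},
    {"well": "W1", "torque": 9.7, "rop": 0.6},
    {"well": "W1", "torque": 9.9, "rop": 0.5},
]
alerts = [a for a in (monitor.on_message(m) for m in feed) if a]
```

Running one monitor instance per well inside a bolt keeps the rule stateful per stream while the topology handles parallelism and fault tolerance.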
3. Smart Meter Data Analytics (SMDA Analytics) Platform:
Project Description: The objective of developing this Smart Grid and MDM Data Analytics platform
using Big Data (Hadoop) and analytics tools and technology is to showcase iGate's capability: how
we can deliver maximum benefit to different utility customers and service providers, yielding
better business insight with low investment in developing a high-capacity big data analytics center.
Data Analysis: customer sentiment, recommendation.
Client: COMCAST, USA
Technology: Cloudera Data Platform 4.4, Storm, Kafka, Mahout, jQuery and D3
4. iHeal Big Data Analytics: Log Event Analysis, Philips Hospital Information System (Healthcare):
An event log from the hospital information system is analyzed using process mining techniques. The
log contains events related to treatment and diagnosis steps for patients diagnosed with cancer.
Given the heterogeneous nature of these cases, we first demonstrate that it is possible to create
more homogeneous subsets of cases (e.g. patients having a particular type of cancer who need to
be treated urgently). Such preprocessing is crucial given the variation and variability found in
the event log. The discovered homogeneous subsets are analyzed using state-of-the-art process
mining approaches. In this project, we report on the findings discovered using enhanced fuzzy
mining and trace alignment.
Client : Philips, USA
Technology: Cloudera Data Platform 4.6, Splunk/Hunk, Pig, YARN, Mahout and QlikView
Awards / Major Accomplishments
• Received BRAVO Award in 2006 for executing large enterprise projects (APL Customer
Profile Rewrite) as Subject Matter Expert for the offshore team
• Active member of the core architect team of IBM WebSphere Front Office (a new product for
real-time feed management for the financial services industry)
• Successfully developed more than 5 CD-ROM titles on different projects as Project Leader &
Designer
• Designed and hosted many websites for private & government sectors