Jayaram Parida: Big Data Architect and Technical Scrum Master
Senior Solution & Technical Architect for Big Data analytics platforms with Spark, Spark ML, Scala and Machine Learning; Big Data Technical Architect at Cybage
Big Data Technical Solution Architect & Data Analyst
Senior Java/J2EE/SOA/ETL Solution Architect with Agile and SCRUM
Mobile: +44-7478212120
Email: jairamparida@gmail.com
Experience Summary
Total 19+ years of IT experience, including 3 years as a Big Data Technical Solution Architect and 10+ years of hands-on experience as a Senior Java/J2EE, SOA, ETL and DI Technical Architect. Has worked across various domains (Banking, Telecom, Health Care and Logistics, for both product development and services) in multiple MNCs (IBM and iGate) in India and abroad. Jayaram has excellent technical and communication skills, spanning requirement analysis through design and development in day-to-day client and team interaction.
Big Data Primary Skills: Hadoop, HDFS, HBase, Hive, MapReduce, Kafka, Storm, YARN, Pig, R, Python (Pandas), Talend, Cassandra, Sqoop, Hortonworks, Cloudera, GridGain, Pentaho, CouchDB, Elasticsearch and Lumify.
Secondary Skills: Java/J2EE, EAI/SOA, Application Integrator/Solution Architect/Technology Evangelist, SDLC system architecture, IBM SOA (WebSphere tools), SOA Suite 11g, BPEL, SOA, Java, J2EE, ESB, Web Services, Struts, Spring and Hibernate using Eclipse and NetBeans, and development with ADF, AIA and Oracle SOA Suite.
3+ years as a Big Data Solution Architect/Data Scientist, focused on data analytics requirements, real-time analytics platforms, and large-scale data storage and analysis, with Business Intelligence and Analytics services.
10+ years as a Java and J2EE Technical Solution Architect for enterprise applications, delivering solutions with SOA, Web Services and middleware EA integration for the Banking, Finance and Energy Utility service sectors. 5+ years' experience as an SOA/EAI Technical Leader/EAI Lead with hands-on IBM SOA, Oracle SOA and Mule ESB design and development.
5+ years as an Onsite Technical Project Coordinator for onsite-offshore projects in Canada, the UK and Europe.
3+ years of experience as an ETL and EAI Middleware Architect in fields such as data integration, data enrichment and processing with service-oriented architecture (SOA) and Business Process Modeling (BPM), as well as SOA implementations on SAP XI/BI, IBM DataStage and IBM Cognos.
Role Responsibilities:
• Provide technical oversight and mentoring to peers and junior teammates to grow the team’s
overall capability using Agile and SCRUM
• Provide consultative subject matter scoping assistance to Technical Sales Specialists as
needed to produce accurate statements of work as a customer deliverable
• Deliver customer facing consultative white-boarding sessions to help cultivate Big Data
services opportunities in partnership with the Sales team as needed
• Attend and support the Sales team at the Executive Briefing Center by presenting and
explaining the value of proposed Big Data solutions
• Build effective relationships with the Sales community by establishing self and team as the Subject Matter Experts (SMEs) and “trusted advisors” for the BI&A Services organization
• Present comprehensive analysis and documentation of the customer’s “As Is” state of
operations and recommendations for optimal “To Be” state. This requires proficiency in
creating and operating ROI & TCO models.
• Contribute to IP development, best practices, and knowledge sharing to drive DA
standardization.
As a Data Architect:
• 10+ years of business, software, or consulting experience with an emphasis on Business Intelligence & Analytics and project management, including professional services experience.
- Exceptional interpersonal, communication and presentation skills at CxO, and CTO levels.
- 2+ years of experience with the Apache Hadoop stack (e.g. YARN, MapReduce, Pig, Hive, HBase, Storm)
- 10+ years of experience with related languages and technologies (e.g. Java, Linux, Apache, Python)
- Knowledge of NoSQL platforms (HBase, MongoDB, Cassandra, Elasticsearch, CouchDB, Accumulo)
- 6 years of experience with ETL (Extract-Transform-Load) tools such as Talend.
- 3 years of experience with BI and reporting tools (e.g. MicroStrategy, Tableau, Cognos, Pentaho)
- Demonstrated experience in effectively supporting a Big Data practice in a professional services firm
- Proven ability to generate and build offer sets that drive multi-million-dollar services projects
- Demonstrated experience in generating and closing identified revenue targets
- Demonstrated ability to multitask and work multiple projects concurrently
- Demonstrated ability to provide technical oversight for large complex projects and achieve
desired customer satisfaction from inception to deployment in a consulting environment
- Advanced analytical, problem-solving, negotiation and organizational skills with demonstrated ability to multi-task, organize, prioritize and meet deadlines
- Ability to interface with the client in a pre-sales fashion, and on an on-going basis as
projects progress
- Demonstrated initiative and positive can-do attitude
- High level of professionalism, integrity and commitment to quality
- Ability to work independently and as part of a solution team using agile and scrum
- Demonstrated attentiveness to quality and productivity
- Technical and project management experience with SCRUM & PMP
Hands-on Project Experience from Past Projects:
• Experience setting up, optimizing and sustaining distributed big data environments at scale
(5+ nodes, 1+ TB)
• Strong knowledge of Hadoop and related tools, including Storm, Pig, Hive, Sqoop, Flume and Impala
• Knowledge of one or more NoSQL databases such as Cassandra, or Hadoop query engines (Hive and Impala)
• Background with relational OLAP and OLTP databases, including one or more of PostgreSQL,
MySQL and SQL Server
• Experience writing scalable Map-Reduce jobs using general purpose languages such as Java,
Python
• Knowledge of common ETL packages/libraries, common analytical/statistical libraries and common graphical display packages (SAS, Pentaho BA, Informatica Big Data, and Talend)
• Hands-on design and development of visualization tools for dashboards and reporting (Tableau, MicroStrategy, QlikView and SAS)
• Experience documenting designs, system builds, configurations, and support procedures (absolutely essential), with the ability to clearly articulate strategic business value, outside of technology, and present ideas to a client concisely.
• Experience with standard documents used in support of the sales cycle (RFI/RFQ responses, solution proposals, current-state/future-state diagrams, platform descriptions, etc.).
Big Data Infrastructure Architect:
• Real-life Hadoop deployment & administration engineer with 3+ years of experience and strong Technical Architect skills
• Hadoop ecosystem design, configuration and deployment; architecture definition and documentation for Hadoop-based production environments that can scale to petabytes of data storage and processing
• Additional knowledge of data and infrastructure architecture, covering the design of cluster node configuration, connectivity, capacity, compute architecture, name node/data node/job tracker deployment layout, server requirements, SAN, RAID configurations, etc.
• Hands-on practical working experience with leading big data technologies such as HBase, Hive, Pig, Oozie, Flume, Sqoop, Mahout, Hue, Hadoop, ZooKeeper and more.
• Deployed, administered and managed Hadoop software on large cluster implementations
• Well versed in installing and managing Apache, Hortonworks and Cloudera Hadoop data platforms
(CDH3, CDH4, Cloudera Manager, Hue and Hortonworks Ambari)
• Strong experience of deployments and administration in Linux/Unix environments
• Past experience installing & administering web and application servers, and RDBMS database design and modeling, in a production data center environment
Professional Experience and Accomplishments
Present company: iGate Global Solutions, London, UK, working as a Big Data Technical Architect since Dec 2014, having joined iGate, India in Dec 2012.
Academic and Professional Credentials
Completed a Masters in Computer Science (MCS) from DIT and DOEACC, with a BSc (Hons) in Zoology from Utkal University.
Professional Certification
• TOGAF 9 Certification
• SOA Certification from IBM.
• PMI PMP certification course, in preparation for the PMP exam
• Java 2 from Brainbench; Business English from Cambridge University
TRAINING
• Oracle Fusion SOA Suite 11g and Oracle Fusion Middleware, from Oracle Fusion Middleware School, UK
• IBM WebSphere Message Broker, from IBM
• SAP NetWeaver XI and EP, from IBM
• Business Process Reengineering/Automation (BPR/BPA).
• GoF Design Patterns, Component Based Development (CBD) from IBM
• Performance Engineering, Software Quality Assurance from IBM.
• Personality improvements and behavioral patterns from IBM.
• Presentation and Communication Skills from IBM.
Big Data Project Details:
At present engaged with Betfair, UK as an Onsite Big Data Technical Architect, developing a real-time auto-trading and monitoring analytics application for online sports and gaming.
1. Mineral Exploration Data Analysis for Chemical Composition: Collect large volumes of lab ore-sample data from different labs, store it in the Hadoop file system, extract the mineral data, and analyse the percentage of minerals in specific mines based on chemical composition.
Data Analysis: 1. Create clusters of ore deposits
2. Classify ores based on mineral content.
3. Find the maximum presence of copper.
Client: Rio Tinto Innovation Center, Emerging Technology, Australia.
Tools and Languages: Linux, with 3 data nodes (16 GB / 500 GB), Hadoop 2.1, YARN, MR2, HBase, Python, Pandas, Matplotlib and SciPy.
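The analysis steps above can be sketched in miniature with Pandas and SciPy (hypothetical sample data and column names; the real pipeline reads from the Hadoop file system): cluster samples by chemical composition, then locate the sample with the highest copper grade.

```python
import numpy as np
import pandas as pd
from scipy.cluster.vq import kmeans2, whiten

# Hypothetical lab assay data: % composition per ore sample.
samples = pd.DataFrame({
    "sample_id": ["S1", "S2", "S3", "S4", "S5", "S6"],
    "cu_pct": [2.1, 1.9, 0.3, 0.4, 5.2, 5.0],        # copper
    "fe_pct": [30.0, 28.5, 55.0, 54.2, 12.0, 13.1],  # iron
    "s_pct":  [1.2, 1.1, 0.8, 0.9, 3.5, 3.3],        # sulphur
})

# Step 1: cluster ore deposits by composition (k-means on whitened features).
np.random.seed(42)  # kmeans2 draws initial centroids from NumPy's global RNG
features = whiten(samples[["cu_pct", "fe_pct", "s_pct"]].to_numpy())
_, labels = kmeans2(features, 3, minit="++")
samples["cluster"] = labels

# Step 3: find the sample with the maximum copper presence.
richest = samples.loc[samples["cu_pct"].idxmax(), "sample_id"]
print(richest)  # -> S5
```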
2. RT Drilling Well Data Analytics Platform (iRTAnalytics):
Project Description: This big data analytics platform showcases how real-time analytics use cases can be solved using popular open-source technologies (Storm + Hadoop) to process real-time data in a fault-tolerant and distributed manner. The major challenges are that the platform needs to be always available to process real-time feeds; it needs to be scalable enough to process hundreds of thousands of messages per second; and it needs to support a scale-out distributed architecture to process the stream in parallel. Once the data is collected from the sources, it can be processed and analyzed for any anomalies, with predictive-analysis rules for multiple hazardous well conditions. The multi-well real-time dashboard is capable of monitoring multiple well drillings in a single window. The system is also capable of storing real-time data in HDFS, HBase and any DW system for batch log analysis and reporting.
Data Analysis: Stuck-pipe analysis, dashboard, alerts.
Client: Schlumberger, for real-time drilling monitoring and stuck-pipe analysis
Technology: Cloudera Data Platform 4.1, Kafka, Storm, Redis, HBase and Hive, jQuery, Web Services and Dojo
The role was to take complete technical ownership for architecting, designing and developing the Hadoop real-time analytics platform as a POC, using Scrum as a Technical Scrum Master.
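The always-on anomaly checks described above run inside the Storm topology; purely as an illustration, the shape of one stuck-pipe rule over a sliding window of sensor readings can be sketched in plain Python (field names and thresholds below are hypothetical, not the platform's actual rules):

```python
from collections import deque

def stuck_pipe_alerts(readings, window=3, torque_limit=900.0, rpm_floor=40.0):
    """Flag a possible stuck pipe when torque stays high while RPM stays low
    over a sliding window of recent readings. In the platform this logic
    would live inside a Storm bolt consuming the Kafka feed."""
    recent = deque(maxlen=window)
    alerts = []
    for r in readings:  # r = dict with "well", "torque", "rpm"
        recent.append(r)
        if len(recent) == window and \
           all(x["torque"] > torque_limit for x in recent) and \
           all(x["rpm"] < rpm_floor for x in recent):
            alerts.append((r["well"], "possible stuck pipe"))
    return alerts

# Simulated real-time feed for one well.
feed = [
    {"well": "W1", "torque": 850, "rpm": 60},
    {"well": "W1", "torque": 920, "rpm": 35},
    {"well": "W1", "torque": 940, "rpm": 30},
    {"well": "W1", "torque": 960, "rpm": 28},
]
print(stuck_pipe_alerts(feed))  # -> [('W1', 'possible stuck pipe')]
```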
3. Smart Meter Data Analytics (SMDA) Platform:
Project Description: The objective of this Smart Grid and MDM data analytics platform, developed with Big Data (Hadoop) and analytics tools and technologies, is to showcase iGate's capability: how different utility customers and service providers can gain better business insight, with low investment, through a high-capacity big data analytics center.
Data Analysis: Customer sentiment, recommendation.
Client: COMCAST, USA
Technology: Cloudera Data Platform 4.4, Storm, Kafka, Mahout and jQuery, D3
4. iHeal BigData Analytics: Log Event Analysis, Philips, Hospital Information System (HealthCare):
A hospital event log is analyzed using process-mining techniques. The log contains events related to treatment and diagnosis steps for patients diagnosed with cancer. Given the heterogeneous nature of these cases, we first demonstrate that it is possible to create more homogeneous subsets of cases (e.g. patients having a particular type of cancer who need to be treated urgently). Such preprocessing is crucial given the variation and variability found in the event log. The discovered homogeneous subsets are analyzed using state-of-the-art process-mining approaches. In this project, we report on the findings discovered using enhanced fuzzy mining and trace alignment.
Client: Philips, USA
Technology: Cloudera Data Platform 4.6, Splunk/Hunk, Pig, YARN, Mahout and QlikView
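As an illustration of the preprocessing step described above (with a hypothetical log schema; the project itself ran on the Cloudera/Splunk stack), a homogeneous subset of cases can be carved out of a raw event log with Pandas:

```python
import pandas as pd

# Hypothetical event log: one row per event, as exported from a
# hospital information system.
log = pd.DataFrame({
    "case_id":   ["C1", "C1", "C2", "C2", "C3", "C3"],
    "activity":  ["Registration", "Chemotherapy", "Registration", "Surgery",
                  "Registration", "Chemotherapy"],
    "diagnosis": ["cancer_A", "cancer_A", "cancer_B", "cancer_B",
                  "cancer_A", "cancer_A"],
    "urgent":    [True, True, False, False, True, True],
})

# Keep only urgent cancer_A cases -> a more homogeneous subset of the log.
subset = log[(log["diagnosis"] == "cancer_A") & log["urgent"]]

# Per-case activity traces: the input a process-mining algorithm consumes.
traces = subset.groupby("case_id")["activity"].apply(list).to_dict()
print(traces)  # -> {'C1': ['Registration', 'Chemotherapy'], 'C3': ['Registration', 'Chemotherapy']}
```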
Awards / Major Accomplishments
• Received the BRAVO Award in 2006 for executing large enterprise projects (APL Customer Profile Rewrite) as Subject Matter Expert for the offshore team
• Active member of the core architect team for IBM WebSphere Front Office (a new product for real-time feed management in the financial services industry)
• Successfully developed more than 5 CD-ROM titles on different projects as Project Leader & Designer
• Designed and hosted many websites for private & government sectors
