Aryaan_CV

CURRICULUM VITAE
Mobile: (+91) 9049992064 / 9541587545
Email: sharma.aryaan.170991@gmail.com
Summary
• 1 year 11 months of total industry experience.
• 1+ year of experience in Hadoop technologies.
• 9 months of experience in Microsoft .NET technology.
• In-depth knowledge of Hadoop architecture and its various components.
• Passionate about Hadoop and Big Data technology.
• Good knowledge of Java.
• Experience with Eclipse for Java project development.
• Excellent knowledge of OOP (Object-Oriented Programming).
• Familiar with components of the Hadoop ecosystem: MapReduce, HDFS, Hive, Sqoop, HBase, Pig.
• Proficient in developing Hadoop applications and jobs using MapReduce.
• Developed applications for distributed environments using Hadoop, MapReduce, and Java.
• Experience setting up a 15-node Hadoop cluster.
• Wrote Sqoop queries to import data into Hadoop from SQL Server and MySQL.
• Good communication and interpersonal skills; a strong team player with an aptitude for learning.
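The Sqoop imports mentioned above can be illustrated with a hedged sketch; the JDBC URL, credentials, table name, and target directory below are hypothetical placeholders, not actual project values.

```shell
# Illustrative only: import one SQL Server table into HDFS with Sqoop.
# Host, database, user, table, and paths are made-up placeholders.
sqoop import \
  --connect "jdbc:sqlserver://db-host:1433;databaseName=telecom" \
  --username etl_user -P \
  --table call_records \
  --target-dir /data/raw/call_records \
  --num-mappers 4 \
  --fields-terminated-by '\t'
```

A MySQL import would be identical apart from the JDBC URL (e.g. `jdbc:mysql://db-host:3306/telecom`).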
Educational/Professional Qualification
Qualification | Institution | Year of Passing | Board | Percentage
PG-DAC | Sunbeam Institute, Pune | 2014 | DAC | 66%
B.Tech (CSE) | University Institute of Technology | 2013 | Maharshi Dayanand University, Rohtak | 69%
Senior Secondary | SRS Public School | 2009 | CBSE | 55%
Matriculation | Delhi Public School | 2007 | CBSE | 64%
Technical Skills
Skill Type | Skill Name | Years of Experience | Last Used
Tools | Oracle SQL Developer, PL/SQL, Eclipse Mars 2.0, MS Visual Studio 2013, Salesforce Data Loader, SoapUI, HotDocs | 1 year 11 months | July 2016
Languages | J2SE, C, C++, C#, SQL | 1 year 11 months | July 2016
Frameworks/Technologies | Hadoop 2.0, Hive, Sqoop, Pig, HBase, Spark, MapReduce, ASP.NET, MVC 5.0, JavaScript, jQuery, AngularJS, Cassandra | 1 year 11 months | July 2016
Operating Systems | Ubuntu, Windows 10, Windows 8, Windows 7, Windows XP, Windows Server 2008 R2 | 1 year 11 months | July 2016

Name | Total IT Experience
Aryaan Sharma | 1 year 11 months
Employment History
Organization Name | Location & Address | Start Date | End Date | Designation
Digital Group InfoTech Pvt. Ltd. | Hinjewadi, Pune | September 10, 2014 | Till Date | Software Engineer
Project Details
Project 1:
Project Name * Verizon Network
Client name* Verizon, a major network provider in the US
Project Description* Verizon is one of the major network providers in the US, offering a wide variety of plans to
consumers. This involves storing millions of callers' records and providing customers real-time access
to call records and billing information. Traditional storage systems cannot scale to this load, and
because the data is so large, manual analysis is not possible. Hadoop is used to handle data at this
scale and to provide analytics.
Technologies Used * Hadoop, HDFS, Hive, HBase, ZooKeeper, Oozie, Cloudera's Hadoop distribution, Java (JDK 1.6),
Oracle, Spark, Sqoop.
Team Size* 5
Duration* On Going
• Responsibilities:
1. Responsible for gathering requirements from the business partners.
2. Application development using Hadoop tools such as MapReduce, Hive, Pig, HBase, Oozie, ZooKeeper, and Sqoop.
3. Cluster monitoring and troubleshooting; managing and reviewing data backups and log files.
4. Collected log data from web servers and integrated it into HDFS using Sqoop.
5. Developed a process for Sqooping data from multiple sources such as SQL Server and Oracle.
6. Developed Oozie workflows for executing Sqoop and Hive actions.
7. Worked on a Hadoop cluster that ranged from 8-10 nodes during the pre-production stage and was sometimes extended to 15 nodes during production.
8. Responsible for cluster maintenance, including adding and removing cluster nodes.
9. Managing and scheduling jobs on a Hadoop cluster.
10. Involved in defining job flows and in managing and reviewing log files.
11. Installed the Oozie workflow engine to run multiple MapReduce, Hive HQL, and Pig jobs.
12. Responsible for managing data coming from different sources.
13. Data manipulation using NoSQL (HBase).
14. Developed Hive scripts for performing transformation logic and loading data.
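The Oozie and Hive work listed above can be sketched with hedged command examples; the Oozie server URL, job.properties file, and table/column names are hypothetical placeholders, not actual project artifacts.

```shell
# Illustrative only: submit a workflow (containing Sqoop and Hive actions)
# to an Oozie server. The URL and job.properties contents are hypothetical.
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run

# Illustrative Hive transformation of the kind used for loading;
# billing_summary and call_records are made-up table names.
hive -e "
  INSERT OVERWRITE TABLE billing_summary
  SELECT caller_id, SUM(call_duration), COUNT(*)
  FROM call_records
  GROUP BY caller_id;
"
```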
Project 2:
Project Name * ADS- Arrow Data Services
Client name* CT Corporation System ("CT"), a Wolters Kluwer company, USA
Project Description* As the name suggests, ADS is a service-based application developed to provide service endpoints
to CT's ongoing applications. The project uses ServiceStack as its framework, as it provides more
features than Web API.
Technologies Used * Asp.Net, Web API, Service Stack, SQL
Team Size* 3
Duration* 4 Months
Responsibilities* • Development
• Unit Testing
• Database
Project 3:
Project Name * CT BLM
Client name* CT Corporation System ("CT"), a Wolters Kluwer company, USA
Project Description* BLMS is an application for business customers who want to manage their business licenses. The
application provides flexible reporting functionality that allows internal users to view CTA Business
License data.
The following requirements are built into this application:
• View License-Location
• License Fee History
• License History
• License Documents
• License Compliance Events information
• License Group
• Advanced search for licenses and quick views
Technologies Used * ASP.Net, MVC, Angular JS, Web API, SQL
Team Size* 12
Duration* 1 Month
Responsibilities* • UI Design
• Development
• Unit Testing (client side as well as server side)
• Database
Project 4:
Project Name * Knowledge Express Redesign
Client name* CT Corporation System ("CT"), a Wolters Kluwer company, USA
Project Description* Knowledge Express (KE) is a tool that assists employees in filing and retrieving documents. It
has existed in some form for as long as CT has been providing "on-demand" services. Today, KE
holds more information and offers more robust search options than it ever has.
Knowledge Express is primarily an internal information database designed to maintain the statutory
and verified administrative policy information needed to assist Service Teams with filing
requirements for the various entity types supported by CT.
It contains business entity filing requirements, including charts, news, filing-office schedule
information, registered agent information, fees pertaining to filings covered within KE,
completion and execution instructions, special agency information, and UCC and fulfilment
information.
Technologies Used * ASP.Net, MVC, Angular JS, Web API, Salesforce, SQL
Team Size* 9
Duration* 4 Months
Responsibilities* • UI Design
• Development
• Unit Testing
• Database
Personal Details:
Father's Name: Ashwani Sharma | Date of Birth: 17-09-1991
Sex: Male | Place of Birth: Rohtak
Nationality: Indian | Marital Status: Single
Passport Details (Mention if applicable)
Passport Number: G6953228 | Valid Through: 11/02/2018
Declaration
I confirm that the information provided by me in this application form is accurate and correct.
Signature Aryaan Sharma
Date July 04, 2016
Place Hinjewadi Pune
