Big Data Hadoop Developer
Mobile no: +91 7769 056 120, +91 8591 455 561
Seeking a challenging role where I can contribute my best to the success of the
company, while the company provides me with an opportunity to explore my potential to the fullest.
3.3 years of IT experience in TCS.
1.5 years of experience as a Hadoop Developer and MongoDB Administrator.
Knowledge and experience of Hadoop and its ecosystem (HDFS, YARN, MapReduce,
Hive, Impala, Sqoop, Flume, Oozie), Hue and MongoDB.
Working as a MongoDB Administrator.
Basics of MapReduce programming.
Unix commands and shell scripting.
MapR Hadoop Developer certified.
AS400 (SQL), Core Java, AWD (Automated Work Distributor) tool, Jira, PuTTY.
Projects and Trainings:
Customer: Barclays (Bank, UK) (Dec 2015 - till date)
Role: MongoDB Administration
Technical Skills: Data restoration, replicated cluster setup, shard cluster setup, cluster
upgrades, troubleshooting cluster issues.
Achievements: M102 MongoDB certified.
Appreciated by the client for resolving cluster issues.
Provided training sessions on MongoDB to associates.
Understood the concepts of NoSQL databases.
Experience in CRUD and aggregation operations in MongoDB.
Inserted large documents using the mongoimport tool.
Extracted data using the mongoexport tool.
Worked with three file formats: CSV, TSV and JSON.
Well versed with the components of the MongoDB package and their uses.
Customer: PNC (Bank, US) (Aug 2015 - Nov 2015)
Role: Hadoop Developer
Technical Skills: Hive, MapReduce, HDFS, YARN, Hue (Hadoop Web UI), Oozie, MySQL, Sqoop.
Achievements: Ingested large volumes of data from an RDBMS (MySQL) into HDFS using Sqoop.
Monitored Oozie jobs.
Understood various Hive optimization techniques.
Well versed with Hive functions.
Customer: Aviva (Insurance, UK) (Apr 2015 - Jul 2015)
Role: Hadoop Developer
Technical Skills: Hive, MapReduce, Hue (Hadoop Web UI), HDFS, YARN, Oozie.
Achievements: Processed large data volumes.
Provided a solution with near-real-time capability to generate a single view of the customer.
Provided a reusable and scalable Hive User Defined Function (UDF)
that generates a primary key for each row.
Reduced the batch window for the ETL process.
Data rows belonging to the same customer get the same Master ID.
Saved over 6 hours compared to the traditional Master Data Management process.
TCS Horizontal DESS, Pune (Jan 2015 - Mar 2015)
Role: Hadoop Developer
Technical Skills: MapReduce, Hive, Pig, Impala, Sqoop, Flume.
Achievements: Understood the differences between an RDBMS (MySQL) and
Hadoop, and solved various problems while migrating data
from MySQL to Hadoop (HDFS).
Implemented SCD type-1 and type-2 using Hive and Impala.
Set up a MongoDB shard cluster.
Removed special characters from the data set using a Hive UDF, and
used the RowSequenceNumber UDF to create surrogate keys in Hive.
Created a Flume agent configuration file and ingested data into HDFS using Flume.
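The two Hive UDFs mentioned above (special-character removal and RowSequenceNumber-style surrogate keys) can be sketched in plain Java. This is an illustrative sketch of the core logic only: the class and method names are hypothetical, and the actual Hive UDF plumbing (extending org.apache.hadoop.hive.ql.exec.UDF and registering the function) is omitted so the logic stands alone.

```java
// Illustrative sketch of the core logic behind the two Hive UDFs mentioned
// above; names are hypothetical and the Hive UDF base class is omitted.
public class UdfSketch {

    // Strip everything except letters, digits and whitespace, mirroring a
    // "remove special characters" cleanup step over ingested rows.
    public static String removeSpecialChars(String input) {
        if (input == null) {
            return null;
        }
        return input.replaceAll("[^A-Za-z0-9\\s]", "");
    }

    // Stateful counter handing out an increasing sequence number, the way a
    // RowSequence-style UDF produces a surrogate key per row.
    private long sequence = 0L;

    public long nextRowSequence() {
        return ++sequence;
    }
}
```

In a real Hive deployment the stateful counter is only unique per task, which is why RowSequence-style keys are typically combined with a task or partition identifier.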
Customer: Friends Life (FL) (Sep 2013 - Dec 2014)
Project Profile: AWD (Automated Work Distributor), a tool used to automate work distribution.
Responsibilities: Review the SRS (System Requirement Specification) received from
the business, walk through and clarify our understanding with
the business, and provide effort estimation for the same.
Work distribution/allocation among the team.
Enhancement and code review.
Analytics on the number of incidents and MWIs per month, which helped
reduce the number of repeat cases.
Automated the process using Java.
Handled issues for AWD users across the globe.
Client interaction over ongoing issues and major incidents.
Technologies & Tools: Core Java, Assyst tool, SQL (AS400)
1) Removed special characters from data present in HDFS using a Hive UDF.
2) Imported relational data from MySQL using Sqoop. The imported data was loaded into Hive
tables, and then SCD (Slowly Changing Dimensions, type-1 and type-2) operations were performed
on the data to calculate the delta. Further, the real-time query engine Impala was used for querying.
3) Set up a MongoDB cluster in the Project Lab of TCS DESS, Pune.
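The SCD step in point 2 can be illustrated in plain Java: a hypothetical in-memory sketch (class and field names are illustrative, not taken from the project) showing the difference between the two types — a type-1 merge overwrites the dimension row in place, while a type-2 merge expires the old version and appends a new current row.

```java
import java.util.*;

// Illustrative in-memory sketch of SCD type-1 vs type-2 merge logic, as it
// might be expressed in Hive over staged Sqoop imports; names are hypothetical.
public class ScdSketch {

    static class Row {
        final String key;
        String value;
        boolean current;   // type-2 flag: is this the active version?

        Row(String key, String value) {
            this.key = key;
            this.value = value;
            this.current = true;
        }
    }

    // Type-1: overwrite in place, no history kept.
    static void applyType1(Map<String, Row> dim, String key, String newValue) {
        Row existing = dim.get(key);
        if (existing == null) {
            dim.put(key, new Row(key, newValue));
        } else {
            existing.value = newValue;
        }
    }

    // Type-2: expire any current version of the key, then append a new row,
    // so the dimension keeps full history of changes.
    static void applyType2(List<Row> dim, String key, String newValue) {
        for (Row r : dim) {
            if (r.key.equals(key) && r.current) {
                r.current = false;   // close out the old version
            }
        }
        dim.add(new Row(key, newValue));
    }
}
```

The "delta" mentioned above corresponds to the rows whose incoming values differ from the current dimension rows; only those trigger an overwrite (type-1) or a new version (type-2).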
IBM-recognized certification from Big Data University in Hadoop Fundamentals and Pig.
MapR certified in Hadoop Essentials.
M102 (MongoDB DBA) certification from MongoDB University.
Class/Degree   Institution/College                        Board/University   Year   Percentage
Degree         Amritsar College of Engg. and Technology
12th           Manav Public School                        C.B.S.E            2008   65%
10th           St. Francis School                         I.C.S.E            2006   78%
INTERESTS AND HOBBIES:
Counter-Strike, Internet surfing, casual follower of fighter jets, staying up to date with the latest
technology trends in the market.
Extra-curricular activities:
Actively participated in events organized by TCS.
Passed my graduation degree with distinction.
Participated in table tennis tournaments.
Participated in an intra-college gaming competition and stood second.
Worked as a student coordinator in FUSION (science and cultural fair).
Active participant in intra-college aptitude tests.
Participated in the national-level aptitude test NITAT.
Coordinator in the national-level tech fest "PRAYAAS-2011".
Date of Birth : 14-12-1989
Father’s Name : Mr. Sushil Mahajan
Sex : Male
Linguistic skills : English, Hindi, Punjabi
Address : 143- A Tilak Nagar Near
Shivala Mandir Amritsar,
I assure you that the above information is true to the best of my knowledge.