RESUME
Anil Kumar Thyagarajan
#203, 3rd Main, 6th Cross, HAL 3rd Stage, Bangalore 560075
Phone: +91 9845403599
Email: anil.thyagarajan@gmail.com
URL: http://in.linkedin.com/pub/anil-thyagarajan/2/ab2/29a/
IT professional with 15+ years of experience, primarily in software development across domains such as Big Data analytics, structured data, AWS/Azure cloud computing, payment gateways, search advertising, systems/networking/ISP, capital-market financial products and supply-chain products. Articulate, professional and highly productive in both team and individual projects, with strong research, time-management and programming skills, and driven to maintain industry knowledge and technical skills through formal and independent training.
Objective:
Seeking a technical lead role in the areas of Big Data analytics, cloud computing and
infrastructure development.
Career Profile:
Senior SDE, Big Data Platform, Microsoft R&D India, Oct 2014 – till date
Technical Specialist, R&D Cloud Computing, Nokia, Aug 2011 – Oct 2014
Principal Engineer, Yahoo India, Oct 2009 – Aug 2011
Sr Systems Programmer, AOL Inc, Aug 2005 – Oct 2009
Consultant, Capco IT Services, India, Mar 2004 – Aug 2005
Software Engineer, HP India, Jun 2003 – Mar 2004
Product Engineer, i2 Technologies Inc, Bangalore, India, Jan 2000 – May 30, 2003
Professional Strengths:
• Leading teams in technical direction, delivery and deployment.
• Bridging the gap between the architecture and development teams by transforming designs into products.
• Involved in architectural design, contributing new ideas.
• Effective team player with clear communication.
• Cloud computing on Amazon Web Services for building Big Data analytics platforms.
• Distributed computing – Hadoop, NoSQL.
• Expertise in system/network tools development and large-scale infrastructure monitoring.
• Advanced Perl programming skills.
• Intermediate skills in Java & Python.
• Experience with Amazon Web Services such as EC2, S3 and EMR.
• Analytics tools experience – MapReduce, Hive.
• Worked across multiple horizontal domains.
• Diverse functions – development, service engineering, infrastructure tools and products.
• Practical thinker and quick learner.
Education:
Bachelor's in Electrical & Electronics Engineering (First Class)
(B M S College of Engineering, Bangalore, 1995–1999)
Technical Skills:
Operating Systems: Unix, Linux & Windows
Databases: PostgreSQL, MySQL, Oracle 10g, Sybase
Languages: Advanced Perl, Python, C#, Java
Cloud: Azure, AWS
Distributed Systems: Virtualization, Hadoop ecosystem, NoSQL
Training & Courses
Technical Details:
• Perl – POE, CGI, Catalyst
• C#, .NET
• Java Spring RESTEasy
• Products on Amazon Web Services
• BGP, OSPF & IS-IS configuration for routers
• Java, ExtJS, HTML, Servlets, JSP
• Web servers – Apache, JBoss, Tomcat
• Hadoop Developer Training (Cloudera)
• Shell & Awk programming (System Logic, Bangalore)
• PL/SQL with Oracle 9i (i2 Technologies)
• Data warehousing tools (Erwin, Informatica, BI) – ETL
• C, C++ (LearnSoft, Bangalore)
• Linux Kernel Internals (Linux Training Center, Bangalore)
• JavaScript, CSS (Yahoo internal training)
Management & Functional:
• License to Lead (Nokia internal)
• PMPC classes – Project Management Practitioners Conference, PMI Bangalore
• Fundamentals of Capital Markets (Capco, Sunnyvale, US)
• PRM (Partner Relationship Management) portal study and analysis (HP, Cupertino, USA)
• Quality System Awareness (Zandig TQM Solutions)
WORK ASSIGNMENTS
• Oct 2014 – till date (Microsoft)
Working on the Azure HDInsight Big Data platform – a managed cloud service for Apache Hadoop, Spark, R, HBase and Storm.
- Worked on the platform to enable Hadoop ResourceManager High Availability (RM-HA) in HDInsight Hadoop clusters (see the HA-state check sketched below).
- Enabled Azure Data Lake Store for HDInsight Linux clusters.
- Enhanced the Linux Hadoop cluster offerings by introducing new workflows.
- Working on tools to improve monitoring and deployment productivity.
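For context on the RM-HA item above: once ResourceManager HA is enabled, each RM reports its own HA state through the YARN REST API, so a quick check can ask both head nodes which one is active. A minimal sketch; the host names are hypothetical stand-ins, not HDInsight specifics:

```python
import requests

# Hypothetical ResourceManager endpoints; 8088 is the default YARN RM web port.
RESOURCE_MANAGERS = ["http://headnode0:8088", "http://headnode1:8088"]

for rm in RESOURCE_MANAGERS:
    try:
        info = requests.get(f"{rm}/ws/v1/cluster/info", timeout=5).json()["clusterInfo"]
        # haState is ACTIVE on the current leader and STANDBY on the other RM.
        print(rm, "->", info.get("haState", "HA not reported"))
    except requests.RequestException as exc:
        print(rm, "-> unreachable:", exc)
```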
• Aug 2011 – Oct 2014 (Nokia)
Worked as a tech lead in Big Data analytics. As part of a cloud platform team, I developed platform tools and integrated functionality around NoSQL and Hadoop systems. My most recent role was building a platform for Analytics as a Service on the Amazon cloud (AWS).
Work Details:
- Part of the architecture team that designed the complete Hadoop ecosystem on Amazon (AWS) as an analytics service; also acted as lead and mentor for the team, responsible for end-to-end execution of the project from inception to deployment in AWS.
o This included provisioning users/groups and datasets with access control using IAM and S3 bucket policies.
o Building many smaller services for data ingestion, extraction and user authentication.
o Job management for abstracting AWS EMR as part of the analytics service platform (the EMR step-submission and polling pattern is sketched after this list).
o A query service for real-time execution of Hive queries over EMR, in both synchronous and asynchronous modes, as part of the analytics service platform.
o Designing the orchestration of all the components that make up the Analytics Service platform for Nokia.
- Developed tools for, and executed, the migration of 2 petabytes of data from Hadoop to Amazon S3. This was challenging in terms of handling millions of files and validating data integrity after transfer (see the integrity-check sketch after this list).
- Was part of the team that integrated Qubole (http://www.qubole.com – Big Data as a Service) into our analytics platform in AWS.
- Designed and developed a fully functional visual performance tool for structured data (NoSQL): user interface in ExtJS, middle layer in Perl (POE) and load driver in Python. The tool generates distributed load at large request rates (RPS) over a timeline (see the pacing sketch after this list).
- Experience working with structured NoSQL key-value stores.
- Was part of the design and development of a distributed system deployment tool.
- End-to-end design and development of a REST API layer for Hadoop. This included the client authentication strategy, REST-layer authorization against HDFS using the KDC, and exposing different functionalities as resources. The Java Spring/RESTEasy framework was used to build the REST layer (a Kerberos-authenticated WebHDFS call is sketched after this list).
- Experience working with MapReduce and the tools around the Hadoop ecosystem.
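To illustrate the EMR job-management and query-service items above: a minimal boto3 sketch that submits a Hive script to an existing EMR cluster as a step and, for the synchronous mode, polls it to completion. The cluster ID, script location and region are hypothetical, and the actual Nokia platform wrapped this behind its own service APIs (and predates boto3), so this shows the pattern, not the implementation:

```python
import time

import boto3

emr = boto3.client("emr", region_name="us-east-1")  # hypothetical region

CLUSTER_ID = "j-XXXXXXXXXXXXX"  # hypothetical EMR cluster (job flow) id
HIVE_SCRIPT = "s3://analytics-bucket/queries/daily_report.q"  # hypothetical script

def submit_hive_step(cluster_id: str, script_s3_path: str) -> str:
    """Submit a Hive script as an EMR step and return the step id (asynchronous mode)."""
    resp = emr.add_job_flow_steps(
        JobFlowId=cluster_id,
        Steps=[{
            "Name": "hive-query",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["hive-script", "--run-hive-script",
                         "--args", "-f", script_s3_path],
            },
        }],
    )
    return resp["StepIds"][0]

def wait_for_step(cluster_id: str, step_id: str, poll_secs: int = 30) -> str:
    """Synchronous mode: block until the step reaches a terminal state."""
    while True:
        status = emr.describe_step(ClusterId=cluster_id, StepId=step_id)
        state = status["Step"]["Status"]["State"]
        if state in ("COMPLETED", "FAILED", "CANCELLED"):
            return state
        time.sleep(poll_secs)

if __name__ == "__main__":
    step_id = submit_hive_step(CLUSTER_ID, HIVE_SCRIPT)
    print("step finished:", wait_for_step(CLUSTER_ID, step_id))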
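The 2 PB migration above stood or fell on the integrity check. One way to validate per file, assuming each upload recorded its own MD5 in object metadata (multipart ETags are not plain MD5 digests, so a self-recorded checksum is the reliable comparison; the bucket name and metadata key are hypothetical):

```python
import hashlib

import boto3

s3 = boto3.client("s3")
BUCKET = "analytics-archive"  # hypothetical destination bucket

def md5_of_local_file(path: str, chunk: int = 8 * 1024 * 1024) -> str:
    """Stream the file in chunks so multi-gigabyte files don't exhaust memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            digest.update(block)
    return digest.hexdigest()

def verify_object(local_path: str, key: str) -> bool:
    """Compare the local digest against the checksum recorded at upload time."""
    head = s3.head_object(Bucket=BUCKET, Key=key)
    recorded = head["Metadata"].get("content-md5", "")  # hypothetical metadata key
    return recorded == md5_of_local_file(local_path)
```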
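The Python load driver above paced requests to hit a target RPS on a timeline. A toy single-node sketch of the pacing loop; request() is a hypothetical stand-in for one call against the store under test, and the real tool distributed this across machines:

```python
import threading
import time

def request() -> None:
    """Placeholder for one call against the NoSQL store under test (hypothetical)."""
    pass

def drive_load(target_rps: int, duration_secs: int) -> int:
    """Fire roughly target_rps requests per second for duration_secs; return count sent."""
    sent = 0
    interval = 1.0 / target_rps
    deadline = time.monotonic() + duration_secs
    next_shot = time.monotonic()
    while time.monotonic() < deadline:
        threading.Thread(target=request, daemon=True).start()  # slow calls must not stall pacing
        sent += 1
        next_shot += interval
        time.sleep(max(0.0, next_shot - time.monotonic()))
    return sent

if __name__ == "__main__":
    print(drive_load(target_rps=200, duration_secs=5), "requests sent")
```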
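The REST layer itself was Java Spring/RESTEasy, but the KDC-backed path to HDFS can be illustrated with a Kerberos-authenticated WebHDFS call. A sketch using the third-party requests-kerberos package, assuming a valid ticket from kinit; the host and path are hypothetical:

```python
import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL  # pip install requests-kerberos

auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)  # SPNEGO via the local ticket cache

resp = requests.get(
    "http://namenode.example.com:50070/webhdfs/v1/user/analytics?op=LISTSTATUS",
    auth=auth,
    timeout=30,
)
resp.raise_for_status()
for status in resp.json()["FileStatuses"]["FileStatus"]:
    print(status["pathSuffix"], status["type"])
```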
• Oct 2009 – Aug 2011 (Yahoo India)
Played the role of Principal Engineer in service engineering for Yahoo's search advertising product, Panama (http://searchmarketing.yahoo.com). I also worked as a lead engineer for the Yahoo payment gateway (wallet.yahoo.com).
Activities Include:
- Writing software in Perl to monitor large-scale, diverse applications; enhancing the framework for the metric collection system.
- Writing plugins for monitoring large infrastructure through Nagios using Yahoo's built-in wrappers (a minimal plugin skeleton is sketched after this list).
- Deployment operations related to the product development cycle and SCM.
- Managing the complete operations of the search advertising applications with respect to technical issues and escalations; contributing to system architecture and design.
- Complete ownership of hardware systems with respect to sustenance and capacity planning.
- Tool development related to system infrastructure, such as VIP viewers, network topology and resources.
- New launches and release planning.
- 12x7 support and escalation for critical on-call activities.
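Nagios plugins follow a simple contract: print a single status line (optionally with performance data after a "|") and exit 0/1/2/3 for OK/WARNING/CRITICAL/UNKNOWN. A minimal skeleton of the kind of check described above; the metric source and thresholds are hypothetical, and the real plugins sat behind Yahoo's internal wrappers:

```python
#!/usr/bin/env python
import sys

OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def read_queue_depth() -> int:
    """Hypothetical metric source; a real plugin would query the app or a local agent."""
    return 42

def main() -> int:
    warn, crit = 100, 500  # hypothetical thresholds
    try:
        depth = read_queue_depth()
    except Exception as exc:
        print(f"UNKNOWN - could not read metric: {exc}")
        return UNKNOWN
    if depth >= crit:
        print(f"CRITICAL - queue depth {depth} >= {crit} | depth={depth}")
        return CRITICAL
    if depth >= warn:
        print(f"WARNING - queue depth {depth} >= {warn} | depth={depth}")
        return WARNING
    print(f"OK - queue depth {depth} | depth={depth}")
    return OK

if __name__ == "__main__":
    sys.exit(main())
```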
• Aug 2005 – Oct 2009 (AOL India)
Details of Work Involved
I worked in the systems & network infrastructure team, building and designing solutions for a range of infrastructure products that directly impact the ISP backbone and Internet services. Perl is used extensively to build network products that monitor and collect metrics from thousands of hosts and network devices such as switches and routers.
• MEMOSYS – a metrics and monitoring system developed using the POE networking module from CPAN. This complex Perl framework uses an event-driven mechanism to gather statistics; sockets, pipes and Tcl are heavily used for inter-process communication (an event-driven poller in this style is sketched after this list).
• ATDN (AOL Transit Data Network) – writing frameworks and software for the network engineering team. This involves communicating with Cisco/Juniper routers and switches for configuration reads and writes. The software, written in Perl, is used by network engineers to manage the network devices for various topology-related and maintenance activities.
• IRR – Internet Routing Registry (whois.aoltw.net). This is a daemon fronting a database of worldwide prefixes, commonly known as CIDRs. We manage this critical data, which gets pushed to all the ATDN POP and backbone routers. Software written in Perl using POE handles network data collection, with GUI interfaces built for customers.
• KPI – Key Performance Indices, a system to statistically determine network usage for different parameters such as device uplink utilization and POP-to-backbone traffic.
• Billing application – the AOL ISP business bills its networking peers via this application. Bandwidth usage information, such as the megabytes transferred in/out through a router interface, is stored in a large Berkeley DB; a Perl framework pulls the data and builds the reports used by finance analysts.
• NMS – the Network Management System is the central point for all network-device metadata. It is built around a complex Perl regex engine that parses network device command output (a parser in this style is sketched after this list). It is the common point of network data access for all other dependent applications, and exposes a GUI via Ruby on Rails.
• IVY (Install, View & Yawn) – an automated Linux installation framework for all varieties of chip architecture, such as i386 and x86_64. This POE-based Perl framework helps with remote installation and with upgrading servers across versions.
• Standalone monitors – AOL provides thousands of services, and they all need to be monitored. Perl is used extensively to monitor these processes with the POE (Perl Object Environment) module, other CPAN modules, and home-grown modules for router communication, DNS queries, the IRR (routing registry), database connections, etc.
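Two of the systems above lend themselves to short sketches. First, MEMOSYS: it was built on Perl's POE event loop, and the same event-driven collection pattern can be shown with Python's asyncio (the hosts and the probe are hypothetical):

```python
import asyncio
import time

HOSTS = ["host1.example.com", "host2.example.com"]  # hypothetical targets

async def probe(host: str) -> tuple[str, float]:
    """Hypothetical probe: time a TCP connect to the host's SSH port."""
    start = time.monotonic()
    reader, writer = await asyncio.open_connection(host, 22)
    writer.close()
    await writer.wait_closed()
    return host, time.monotonic() - start

async def poll_forever(interval: float = 60.0) -> None:
    """Each cycle gathers all probes concurrently, like POE firing an event per host."""
    while True:
        results = await asyncio.gather(*(probe(h) for h in HOSTS), return_exceptions=True)
        for r in results:
            print(r if isinstance(r, tuple) else f"probe failed: {r}")
        await asyncio.sleep(interval)

if __name__ == "__main__":
    asyncio.run(poll_forever())
```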
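Second, the NMS regex engine: its core job was turning raw device command output into structured metadata. A small Python analogue of that parsing step, over made-up sample output in the common "show interface" style:

```python
import re

# Made-up sample in the usual "show interface" style; the real NMS handled many vendor formats.
SAMPLE = """\
GigabitEthernet0/1 is up, line protocol is up
  Description: uplink-to-core
GigabitEthernet0/2 is administratively down, line protocol is down
  Description: spare
"""

IFACE_RE = re.compile(
    r"^(?P<name>\S+) is (?P<admin>administratively down|up|down), "
    r"line protocol is (?P<proto>up|down)$"
)
DESC_RE = re.compile(r"^\s+Description: (?P<desc>.+)$")

def parse_interfaces(text: str) -> list[dict]:
    """Fold the line-oriented output into one dict per interface."""
    interfaces, current = [], None
    for line in text.splitlines():
        m = IFACE_RE.match(line)
        if m:
            current = m.groupdict()
            interfaces.append(current)
        elif current and (d := DESC_RE.match(line)):
            current["desc"] = d.group("desc")
    return interfaces

if __name__ == "__main__":
    for iface in parse_interfaces(SAMPLE):
        print(iface)
```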
• Mar 2004 – Aug 2005 (Capco)
Capco is the first services and technology solutions provider exclusively focused on forming the future of the financial services industry. It unites thought leadership and practical application to improve efficiency and profitability for its clients.
Details of Work Involved
• Detailed system study of different products in the finance capital-markets domain (San Jose, CA, USA).
• Products such as GIM (Global Index Monitor) and SW (Sector Watch), programmed using Perl and Sybase.
• Enhancement and new application development.
• Designed the enterprise framework using AppConfig, SPOPS and OpenInteract to build a complete Perl-based engine for Global Dividend Forecast.
• Jun 2003 – Mar 2004 (Digital – an HP company)
Digital, being a subsidiary of HP, executes many of HP's IT requirements. One of these is the portal for the IPG group, called Pweb: a two-way communication vehicle between HP and its distributors/retailers. Many applications hosted on this portal let partners transact most of their business requirements with HP.
Details of Work Involved
• Detailed system study and analysis for changing business requirements.
• Enhancement and new application development.
• 24/7 L3/L4 support of the system, with escalations based on SLA.
• System topology – HP-UX (requires Unix admin skills), Oracle 9i, Perl & mod_perl (for CGI programming), Apache (web server).
• System architecture – audiencing technique (to generate separate views for every user); the application interacts with multiple remote databases.
• Working with the above technologies on complex business requirements.
• Jan 2000 – May 30, 2003 (i2 Technologies)
i2 is a leading provider of value chain management solutions, which help companies plan and execute the activities involved in managing supply and demand. These solutions span the entire scope of value chain interactions, including supplier relationship management, supply chain management and demand chain management. As a Product Engineer in Content Operations, the lifeblood of other i2 solutions, I was given responsibility for vendor management. Most of the database development took place at vendor locations (outsourced work), and I was actively involved in vendor development, process building and development of the required tools. i2 is a pioneer in the MRO (Maintenance & Repair Operations) vertical, which demands considerable expertise and technical skill in the core electrical & electronics component segment. A complex parent-class hierarchy was developed to accommodate part information for technical products across thousands of manufacturers and suppliers. The parametric data created is integrated or migrated into multiple customer databases, which are then linked to many of the i2 products to reap the benefits of an overall supply chain management solution.
Details of Work Involved
• Support and maintenance of i2 enterprise solution products (SRM), and deployment of product enhancements for licensed clients.
• Shell programming on Unix (SunOS) and Linux.
• Sed & AWK programming for processing large amounts of data before release to customer databases: legal validation of content, file handling and report generation.
• Perl scripting for data insertion into Oracle databases with the required modules, plus report generation and process handling. Updating data with current and complete content, with cross-verification across databases (the load-and-verify pattern is sketched after this list). Pattern searching/mapping for critical technical/commercial data across MRO manufacturers' technical catalogs.
• Designing the database schema/model best suited to the activities involved in implementing the business logic.
• Writing Oracle procedures for data hopping between databases over a VPN. Report generation and updates with PL/SQL and J2EE. Implementing Oracle triggers to construct module-based actions.
• Vendor management and development, as most content development in the MRO vertical was outsourced. Building processes and guidelines to stay ahead of the customers' ever-changing requirements. Actively involved in recruiting personnel at vendor premises.
• Technical consultant for MRO electrical content between i2 and clients/vendors, which included detailed understanding and analysis of components across the electrical/electronics vertical for building a mission-critical parametric content database.
• Integration and migration of data using knowledge-based tools such as i2Explore and Discovery.
• Developed and deployed the complete online catalog tracking system for the i2 Input Management team. Business components included JSP, JavaBeans and an Oracle database on the Tomcat web server.
• Quality control, with documented corrective & preventive actions covering tool runs and data correctness.
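The Oracle loading above followed a load-then-cross-verify pattern. A self-contained sketch of that pattern, with Python's built-in sqlite3 standing in for Oracle and made-up parametric part data:

```python
import sqlite3

ROWS = [  # made-up parametric part data
    ("MFR-001", "relay", "24VDC"),
    ("MFR-002", "fuse", "5A"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (part_no TEXT PRIMARY KEY, category TEXT, rating TEXT)")

# Batch insert, as the Perl loaders did against Oracle's bulk interfaces.
conn.executemany("INSERT INTO parts VALUES (?, ?, ?)", ROWS)
conn.commit()

# Cross-verification: row count, then a content check of every row against the source.
(count,) = conn.execute("SELECT COUNT(*) FROM parts").fetchone()
assert count == len(ROWS), f"expected {len(ROWS)} rows, found {count}"
for part_no, category, rating in ROWS:
    row = conn.execute(
        "SELECT category, rating FROM parts WHERE part_no = ?", (part_no,)
    ).fetchone()
    assert row == (category, rating), f"mismatch for {part_no}"
print("load verified:", count, "rows")
```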
References
Available on Request.
Last Updated on 14 April, 2016