RESUME
Anil Kumar Thyagarajan
#203, 3rd Main, 6th Cross, HAL 3rd Stage, Bangalore 560075
Phone: +91 9845403599
Email: anil.thyagarajan@gmail.com
URL: http://in.linkedin.com/pub/anil-thyagarajan/2/ab2/29a/
15+ years of IT experience, primarily in software development across domains such as
Big Data analytics, structured data, AWS/Azure cloud computing, payment gateways,
search advertising, systems/networking/ISP, capital-market financial products, and
supply-chain products. Articulate and professional business skills; highly productive in
both team and individual projects; strong research, time-management, and programming
skills; driven to maintain industry knowledge and technical skills through formal and
independent training.
Objective:
Seeking a technical lead role in Big Data analytics, cloud computing, and
infrastructure development.
Career Profile:
Senior SDE, Big Data Platform, Microsoft R&D India, Oct 2014 – till date
Technical Specialist, R&D Cloud Computing, Nokia, Aug 2011 – Oct 2014
Principal Engineer, Yahoo India, Oct 2009 – Aug 2011
Sr Systems Programmer, AOL Inc, Aug 2005 – Oct 2009
Consultant, Capco IT Services, India, Mar 2004 – Aug 2005
Software Engineer, HP India, Jun 2003 – Mar 2004
Product Engineer, i2 Technologies Inc, Bangalore, India, Jan 2000 – May 2003
Professional Strength:
• Leading teams in technical direction, delivery, and deployment.
• Bridging the gap between architecture and development teams by transforming designs
into products.
• Involved in architectural design, contributing new ideas.
• Effective team player with clear communication.
• Cloud computing on Amazon Web Services for building Big Data analytics platforms.
• Distributed computing – Hadoop, NoSQL.
• Expertise in system/network tools development and large-infrastructure monitoring.
• Advanced Perl programming skills.
• Intermediate skills in Java & Python.
• Experience with Amazon Web Services such as EC2, S3, and EMR.
• Analytics tools experience – MapReduce, Hive.
• Worked across multiple horizontal domains.
• Diverse functions – development, service engineering, infrastructure tools, and
products.
• Practical thinker and quick learner.
Last Updated on 14 April, 2016 Page 1 of 7
Education:
Bachelor's in Electrical & Electronics Engineering (First Class)
(B M S College of Engineering, Bangalore, 1995-1999)
Technical Skills:
Operating Systems : Unix, Linux & Windows
Databases : Postgres, MySQL, Oracle 10g, Sybase
Languages : Perl (advanced), Python, C#, Java
Cloud : Azure, AWS
Distributed Systems : Virtualization, Hadoop ecosystem, NoSQL
Training & Courses
Technical Details:
• Perl – POE, CGI, Catalyst
• C#, .NET
• Java Spring, RESTEasy
• Products on Amazon Web Services
• BGP, OSPF & IS-IS configuration for routers
• Java, ExtJS, HTML, Servlets, JSP
• Web servers – Apache, JBoss, Tomcat
• Hadoop Developer Training (Cloudera)
• Shell & AWK programming (System Logic, Bangalore)
• PL/SQL with Oracle 9i (i2 Technologies)
• Data Warehousing tools (Erwin, Informatica, BI) – ETL
• C, C++ (LearnSoft, Bangalore)
• Linux Kernel Internals (Linux Training Center, Bangalore)
• JavaScript, CSS (Yahoo internal training)
Management & Functional:
• License to Lead (Nokia Internal)
• PMPC classes – Project Management Practitioners Conference, PMI Bangalore
• Fundamentals of Capital Markets (Capco, Sunnyvale, USA)
• PRM (Partner Relationship Management) portal study and analysis (HP, Cupertino,
USA)
• Quality System Awareness (Zandig TQM Solutions)
WORK ASSIGNMENTS
• Oct 2014 – till date (Microsoft)
Working on the Azure HDInsight Big Data platform – a managed cloud service for
Apache Hadoop, Spark, R, HBase, and Storm.
- Enabled Hadoop ResourceManager High Availability (RM-HA) in HDInsight
Hadoop clusters.
- Enabled Azure Data Lake Store for HDInsight Linux clusters.
- Enhanced the Linux Hadoop cluster offerings by introducing new
workflows.
- Working on tools to improve monitoring and deployment productivity.
• Aug 2011 – Oct 2014 (Nokia)
Worked as a tech lead in Big Data analytics. As part of a cloud platform team, I
developed platform tools and integrated functionality around NoSQL and Hadoop
systems. My most recent role was building a platform for Analytics as a Service on the
Amazon cloud (AWS).
Work Details:
- Part of the architecture team that designed the full Hadoop ecosystem on Amazon
(AWS) as an analytics service. Also served as lead and mentor for the team,
responsible for end-to-end execution of the project from inception
to deployment in AWS.
o This included provisioning users/groups and datasets with access control
using IAM and S3 bucket policies.
o Building several smaller services for data ingestion, extraction, and user
authentication.
o Job management abstracting AWS EMR as part of the analytics service
platform.
o A query service for real-time execution of Hive queries, in both synchronous
and asynchronous modes, over EMR as part of the analytics service
platform.
o Designing the orchestration of all the components that make up the
Analytics Service platform for Nokia.
- Developed tools for, and executed, the migration of 2 petabytes of data from Hadoop
to Amazon S3. This was very challenging in terms of handling millions of files and
validating data integrity after transfer.
- Part of the team that integrated Qubole (http://www.qubole.com – Big Data as
a Service) into our analytics platform on AWS.
- Designed and developed a fully functional visual performance tool for structured
data (NoSQL): user interface in ExtJS, middle layer in Perl POE, and load
driver in Python. The tool generates distributed load at high RPS (requests/sec) in a
timeline series.
- Experience working with structured NoSQL key-value stores.
- Part of the design and development of a distributed system deployment tool.
- End-to-end design and development of a REST API layer for Hadoop. This included
the client authentication strategy, REST-layer authorization with HDFS using KDC,
and exposing different functionalities as resources. The Java Spring RESTEasy
framework was used to build the REST system.
- Experience working with MapReduce and tools in the Hadoop ecosystem.
• Oct 2009 – Aug 2011 (Yahoo India)
Served as Principal Engineer in Service Engineering for Yahoo's search advertising
product, Panama (http://searchmarketing.yahoo.com). I also worked as lead engineer
for the Yahoo payment gateway (wallet.yahoo.com).
Activities Include:
- Writing software in Perl to monitor large-scale, diverse applications; enhancing
the framework for the metric collection system.
- Writing plugins for monitoring large infrastructure through Nagios, using Yahoo's
built-in wrappers.
- Deployment operations related to the product development cycle and SCM.
- Managing complete operations of the search advertising applications, covering
technical issues and escalations. Contributing to system architecture and design.
- Complete ownership of hardware systems for sustenance and capacity planning.
- Tool development for system infrastructure, such as VIP viewers, network
topology, and resources.
- New launches and release planning.
- 12x7 support and escalation for critical on-call activities.
• Aug 2005 – Oct 2009 (AOL India)
Details of Work Involved
I worked in the systems & network infrastructure team, building and designing
solutions for a range of infrastructure products that directly impact the ISP
backbone and Internet services. Perl was used extensively to build network products
that monitor and collect metrics from thousands of hosts and network devices such as
switches & routers.
• MEMOSYS – Metrics & Monitoring System, developed using the POE
networking CPAN module in Perl. This complex framework uses an event-driven
mechanism to gather statistics; sockets, pipes, and Tcl are heavily used for
inter-process communication.
• ATDN (AOL Transit Data Network) – Writing frameworks & software for the
network engineering team, involving communication with Cisco/Juniper routers and
switches for configuration reads and writes. The software, written in Perl, is used
by network engineers to manage the network devices for various topology-related
and maintenance activities.
• IRR – Internet Routing Registry (whois.aoltw.net). A daemon backed by a
database of worldwide prefixes, commonly known as CIDRs. We managed this
critical data, which gets pushed to all the ATDN POP and backbone routers.
Software written in Perl using POE for network data collection, plus GUI
interfaces for customers.
• KPI – Key Performance Indices: a system to statistically determine network
usage for different parameters, such as device uplink utilization and POP-to-backbone
traffic.
• Billing Application – AOL's ISP business bills its networking peers via this
application. Bandwidth usage information, such as the megabytes transferred in/out
through a router interface, is stored in a huge Berkeley DB; a Perl framework pulls
the data and builds reports used by finance analysts.
• NMS – Network Management System: the central point for all metadata about
network devices. It is built around a complex Perl regex engine that parses all
network device command output, and is the common point of network data access for
all dependent applications, with a GUI exposed via Ruby on Rails.
• IVY (Install, View & Yawn) – Automated Linux installation framework for all
varieties of chip architecture, such as i386 and x86_64. This Perl (POE-based)
framework helps with remote installation and upgrading servers between versions.
• Standalone monitors – AOL provides thousands of services, and they need to be
monitored. Perl is used extensively to monitor these processes with the POE (Perl
Object Environment) module, other CPAN modules, and home-grown modules for router
communication, DNS queries, IRR (routing registry), database connections, etc.
• Mar 2004 – Aug 2005 (Capco)
Capco is the first services and technology solutions provider focused exclusively on
shaping the future of the financial services industry, uniting thought leadership and
practical application to improve efficiency and profitability for its clients.
Details of Work Involved
• Detailed system study of different products in the finance capital-market
domain (San Jose, CA, USA).
• Products such as GIM (Global Index Monitor) and SW (Sector Watch),
programmed using Perl and Sybase.
• Enhancement and new application development.
• Designed the enterprise framework using AppConfig, SPOPS, and OpenInteract
to build a complete Perl-based engine for Global Dividend Forecast.
• Jun 2003 – Mar 2004 (Digital – an HP company)
Digital, being a subsidiary of HP, executes many of HP's IT requirements. One of
these is the portal for the IPG group, called Pweb: a two-way communication vehicle
between HP and its distributors/retailers. Many applications are hosted on this
portal, through which partners transact most of their business requirements with
HP.
Details of Work Involved
• Detailed system study and analysis for changing business requirements.
• Enhancement and new application development.
• 24/7 L3/L4 support of the system and escalations based on SLA.
• System topology – HP-UX (requires Unix admin skills)
Oracle 9i (latest techniques used)
Perl & mod_perl (for CGI programming)
Apache (web server)
• System architecture – audiencing technique (to generate separate views for
every user); the application interacts with multiple remote databases.
• Working with the above technologies on complex business requirements.
• Jan 2000 – May 2003 (i2 Technologies)
i2 is a leading provider of value chain management solutions, which help companies
plan and execute the activities involved in managing supply and demand. These
solutions span the entire scope of value chain interactions, including supplier
relationship management, supply chain management, and demand chain management. As a
Product Engineer in Content Operations, which is the lifeblood of other i2 solutions,
I was given responsibility for vendor management. Most of the database development
took place at vendor locations (outsourced work), and I was actively involved in
vendor development, process building, and development of the required tools. i2 is a
pioneer in the MRO (Maintenance & Repair Operations) vertical, which demands
considerable expertise and technical skill in the core electrical & electronics
component segment. A complex parent-class hierarchy was developed to accommodate part
information for technical products across thousands of manufacturers and suppliers.
The parametric data created is integrated or migrated into multiple customer
databases, which are then linked to many of the i2 products to reap the benefits of
an overall supply chain management solution.
Details of Work Involved
• Support and maintenance of i2 Enterprise Solution products (SRM), and deployment
of product enhancements for licensed clients.
• Shell programming on Unix (SunOS)/Linux.
• Sed & AWK programming for processing large amounts of data before release to the
customer database, legal validation of content, file handling, and report generation.
• Perl scripting for data insertion into the Oracle database with the required
modules, report generation, and process handling. Updating data to current and
complete content, with cross-verification across databases. Pattern searching/mapping
for critical technical/commerce data across MRO manufacturers' technical catalogs.
• Designing the database schema/model best suited to the activities involved in
implementing the business logic.
• Writing Oracle procedures for data hopping between databases over a VPN. Report
generation and updates with PL/SQL and J2EE. Implementation of Oracle triggers to
construct module-based actions.
• Vendor management and development, as most content development in the MRO
vertical was outsourced. Process and guideline building to take the necessary
proactive steps amid customers' ever-changing requirements. Actively involved in the
recruitment of personnel at vendor premises.
• Technical consultant for MRO electrical content between i2 and clients/vendors,
which included detailed understanding & analysis of components across the
electrical/electronics vertical for building a mission-critical parametric content
database.
• Integration & migration of data using knowledge-based tools such as i2Explore and
Discovery.
• Developed and deployed the complete online catalog tracking system for the i2
Input Management team. Business components included JSP, JavaBeans, and an Oracle
database on the Tomcat web server.
• Quality control, with implementation of documentation and corrective & preventive
actions for the running of tools and the correctness of data.