Srikanth Rangaraju
Mobile: (309)-281-0545
E-mail: SrikanthRangaraju@gmail.com
SUMMARY
An IT professional with 12 years of experience across the Software Development
Life Cycle (SDLC), specializing in Quality Assurance processes and methodologies, with Business
Analyst skills in the Agriculture, Warranty, and Marketing domains.
Proven ability in designing and creating automation frameworks for J2EE web-based applications using
Java, Selenium WebDriver, AppPerfect, Quality Center, TestNG, Maven, ANT, Jenkins, Hudson, and
SOAP UI.
Proficient in Functional Testing, System Integration Testing, Regression Testing, Performance Testing,
and User Acceptance Testing.
Experience in designing Test Cases, Test Scenarios, Test Scripts, and Test Reports for manual and
automated tests.
Experience in building Continuous Integration and Continuous Deployment pipelines from a testing perspective.
Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
Experienced in running load tests on AWS EC2 instances by varying instance sizes and auto
scaling capacity for analysis purposes.
Experience in Load Testing using JMeter and Gatling.
Working experience with Agile methodologies.
Team Player with good oral and written communication skills.
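The load-test analysis work listed above typically comes down to aggregating response-time samples into comparable summary numbers per EC2 instance size. A minimal illustrative sketch in Python (function names and sample data are hypothetical, not from any project on this resume):

```python
# Illustrative sketch: summarize load-test response times (in ms) the way
# JMeter/Gatling results are usually compared across runs and instance sizes.
# Function names and the sample latencies below are hypothetical.
import math

def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times."""
    ordered = sorted(samples)
    rank = max(1, math.ceil(pct / 100 * len(ordered)))
    return ordered[rank - 1]

def summarize(samples):
    """Return the headline numbers usually reported for a test run."""
    return {
        "count": len(samples),
        "avg_ms": sum(samples) / len(samples),
        "p95_ms": percentile(samples, 95),
        "max_ms": max(samples),
    }

latencies = [120, 95, 110, 240, 130, 105, 99, 310, 125, 101]
print(summarize(latencies))
```

Real runs would of course feed in thousands of samples exported from the tool, but the comparison logic is the same.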
EDUCATION
Master of Science in Computer Science Feb 2002 - Dec 2004
Jawaharlal Nehru Technological University, AP, India
SKILLS AND KNOWLEDGE
Platforms AWS, Linux, Unix, Windows
Testing Selenium (WebDriver, Grid), TestNG, AppPerfect, JMeter,
Gatling, SOAP UI
Cloud/AWS
Compute EC2, ELB, Auto Scaling
Storage S3
Database DynamoDB, ElastiCache
Networking VPC, ELB, Route 53
Analytics Elasticsearch
Dev Tooling & Management CodeDeploy, CloudWatch
Security Identity & Access Management (IAM)
Programming Languages Java
Server Side
Technologies JSP, JMS, JDBC, XML
Frameworks / Libraries Akka, Spring
Application Servers WebLogic, JBoss, WebSphere, Tomcat
IDE IntelliJ IDEA, RAD, Eclipse
Build / CI SBT, Jenkins, Maven
Code Coverage Sonar
Monitoring AppDynamics, JConsole, JVisualVM
Data Serialization Protocol Buffers
Relational Databases DB2, Oracle, SQL Server
NoSQL Databases DynamoDB, MongoDB, Elasticsearch
Version Control Git/GitHub, Mercurial, SVN, TFS
Project Management Rally, JIRA, Quickbase, Quality Center 9.0
PROFESSIONAL EXPERIENCE
Client: John Deere, Moline, Illinois June 2013 to Present
Project: Streaming Data Platform
Role: IT QA Analyst
Description: The Streaming Data Platform (SDP) is a horizontally scalable distributed system designed to ingest
sensor data collected on machines in real time. Data is ingested from agricultural equipment (such as planters
and harvesters) via WebSockets and stored in MongoDB and DynamoDB. The Operations Center web application
provides customers with a map view of their agricultural operations by building visualizations on the data ingested in
real time.
Responsibilities
Quality analysis for various projects under the Streaming Data Platform (SDP) umbrella.
Scripted necessary infrastructure in Python for Cloud Testing using AWS Cloud Automation
Framework
Used AWS CloudWatch to monitor performance metrics
Leveraged the Elasticsearch, Logstash, Kibana (ELK) stack to troubleshoot issues and perform root-cause
analysis by analyzing logs across the various components of SDP.
Identified major test scenarios and automated them to reduce overhead on manual testing using
Selenium Web Driver and TestNG framework.
Verified REST microservices exposed by SDP components using Postman, RestUI, and HTTPie.
Validated data using Robomongo as the client for MongoDB and the AWS Console for DynamoDB.
Hands-on experience with Amazon Web Services: spinning up EC2
instances, adjusting DynamoDB capacities, and monitoring CloudWatch metrics.
Actively involved in production support, which included analyzing logs in Kibana and taking steps to
avoid production-down situations.
Utilized Eclipse Memory Analyzer (MAT) for JVM Heap Dump analysis to identify memory leaks.
Used AppDynamics, JConsole, and JVisualVM to determine memory and CPU usage on nodes, verify
that all nodes are in the cluster, and add nodes back to the cluster when they leave.
Involved in discussions around clustering and sharding to make sure data is evenly distributed across
all application nodes and database nodes.
Safeguarded the application against failures in production-down scenarios; developed and tested
mechanisms to recover from failures.
Coordinated with teams and the business to continuously help analyze and deliver quality
code to our end client.
Defining test plans and strategies to make sure all requirements are well translated into test cases.
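The MongoDB/DynamoDB data validation mentioned above often reduces to asserting that each ingested document carries the expected fields and types. A minimal illustrative sketch in Python (the field names below are hypothetical examples, not SDP's actual schema):

```python
# Illustrative sketch of record-level validation for ingested sensor
# documents. REQUIRED_FIELDS is a hypothetical schema, not SDP's.
REQUIRED_FIELDS = {
    "machine_id": str,
    "timestamp": int,     # epoch milliseconds
    "latitude": float,
    "longitude": float,
}

def validate_record(record):
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(
                f"bad type for {field}: {type(record[field]).__name__}")
    return problems

good = {"machine_id": "JD-8R", "timestamp": 1690000000000,
        "latitude": 41.5, "longitude": -90.5}
bad = {"machine_id": "JD-8R", "timestamp": "not-a-number"}
print(validate_record(good))  # []
print(validate_record(bad))
```

In practice a check like this would run against documents pulled from the database client rather than hard-coded dicts, with failures reported per record.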
Environment: Java, Scala, Akka, Play, TestNG, JSON, XML, REST Web Services, MongoDB, DynamoDB.
Client: PTC, Moline, Illinois March 2007 to June 2013
Project: Service Life Cycle Management (SLM) Implementations
Role: Senior QA Consultant
Description: PTC provides Service Life Cycle Management (SLM) solutions to the automotive industry,
covering warranty, service parts, repair management, service contracts, and fleet services. PTC customizes
its existing product to customer requirements and assures the quality of its software products in the SLM
area, which includes i-Warranty, iSupport, iParts, and iService.
SLM products were designed specifically for the heavy-equipment industry. They use a service model
encompassing Cost Management and Quality Improvement, helping global manufacturers optimize their entire
warranty life cycle through modules such as Service Contracts/Extended Warranty, Warranty Center, Parts
Ordering, Returns Management, Supplier Recovery, and Analytics.
Responsibilities
Responsible for leading a team offshore
Designing the Test Plan, Test Strategy and Test Approach for the project
Prepared automation scripts using the AppPerfect Web Functional Testing tool
Updating the Developed Automation script for Functional and Regression Testing purposes
Working as a point of contact from the Moline Development Center with the team at the India
Development Center
Responsible for addressing any clarifications from the offshore team and providing answers
promptly.
Establishing the weekly status call with the Onsite team
Sharing defects with the onsite team and holding defect triage meetings to resolve issues.
Functional and Performance Testing using the AppPerfect tool
Validating the custom reports developed in Cognos
Environment: J2EE, XML, JavaScript, IBM WebSphere Application Server 5.1, CLM Framework, WSAD 5.1,
with SQL Server and Oracle as backend databases
Client: Virtusa, Boston, MA March 2004 to Feb 2007
Project: Aprimo Marketing
Role: Associate Software Engineer – Software Engineer
Description: Aprimo Marketing is an EMM (Enterprise Marketing Management) product with a suite of web-
based application modules, including MRM (Marketing Resource Management),
Campaign Management, Financial Management, Lead Management, Marketing Calendar, and Workflow
Management, that work seamlessly together to help marketing users
manage the business of marketing. Aprimo offers flexible integration with third-party systems and a
component to support single sign-on authentication.
Responsibilities
Gathering and analyzing requirements from various stakeholders and helping define
the product scope
Adopting and enforcing the best practices to achieve prescribed quality expectations as per
Aprimo’s standards, defining test methodologies and test approach
Creating Test plans and documenting Test cases to accomplish complete testing coverage.
Managed builds and deployments for development and test environments and performed
Build Verification Testing
Created and executed functional, integration, and regression test cases and reported bugs
using the Aprimo Defect Tracking System
Interfaced with the Users for the User Acceptance Testing (UAT)
Environment: ASP.NET, C#, SQL Server 2000, Oracle 9i, JavaScript, XML, IIS, QuickTest, LoadRunner,
ADTS (Aprimo Defect Tracking System)