NEELIMA KAMBAM
Hadoop Developer
Skill / Experience
Summary
10+ years of experience in IT, with 3+ years of experience in Hadoop technologies: HDFS,
Pig, Hive, Sqoop, Flume, and MapReduce.
Expertise in the Hadoop ecosystem (HDFS, Pig, Hive, Sqoop, Flume, and MapReduce) for
scalable, distributed, high-performance computing.
2 years of experience in design/development of Solution Architecture for B2B/EAI
implementations in Healthcare, Manufacturing and Logistics verticals using Edifecs products.
Experienced in implementing EAI, B2B, and business solutions. Excellent
communication, documentation, and presentation skills.
6+ years of experience as a Mainframe Programmer/Analyst. Involved in the design,
development, maintenance, and testing of mainframe applications using COBOL, DB2,
CICS, VSAM, JCL, and various project-methodology tools/utilities. Worked on applications
in the Banking, Insurance, and Manufacturing domains. Played a key role in estimation,
analysis, coding, testing, production support, and user interaction.
Design/Implementation experience
EAI, and B2B Solution Design, Development
Experienced in 837, 270/271, 276, 277, and 835 healthcare transactions; IT tools evaluation; EAI
competency center setup; and performance tuning of EAI and B2B solutions
Domain Experience Marketing, Healthcare, Manufacturing, Logistics
Technology
Experience
Hadoop MapR Hadoop Distribution, Cloudera Hadoop Distribution (CDH3), Hive, Pig, Cloudera
Manager, HDFS, Sqoop, Flume.
Edifecs
XEngine, SpecBuilder, MapBuilder, EDI-HIPAA, EDI-X12, RosettaNet Transactions, Transaction
Manager
Oracle Fusion
Middleware
Oracle SOA Suite 10.1.3.x/11.1.3; EAI/SOA B2B (EDI-X12, EDI-HIPAA); Oracle B2B;
Business Activity Monitoring. IDE: JDeveloper, Eclipse
Application Servers &
Web Servers
Oracle Application Server, Apache
Database and Related
API
Oracle 9.2, PL/SQL, MySQL, NoSQL, DB2
Programming
Languages
Java, COBOL, JCL, SQL, VSAM & Easytrieve
Other Languages HTML, XML, XSD, XSL, JavaScript
Tools Tableau, JDeveloper, Eclipse, CVS, CICS, TSO/ISPF, QMF, SPUFI, ChangeMan, Expeditor,
SmartTest, File-AID, MQ Series, FTP, IBM Debugger, & Control-M.
Operating Systems Linux, HP-UX, Windows, MVS/ESA & OS/2, UNIX, MS-DOS
Experience Profile – Key Projects
Project # Multi-Channel Engagement Behavioral Scoring
Cisco Systems, CA Aug 2014 – Present
The Cisco Marketing and Campaigning team requires insight into Cisco customers' propensity to buy Cisco
Services/Equipment (data center products, switches, routers, etc.) based on end-user activity on various Cisco
subsidiary websites, such as CDC, Communities, Social, and Support, enriched with internal data from
various channels. This requires processing discrete web logs from multiple instances of various web servers
for different websites. As part of this project, we accumulate, crunch, compute, and analyze the data to
produce distinct topic-level scores that indicate user interest in buying the appropriate Services/Equipment.
Role Description: Hadoop Developer
Tools and Software: MapR Hadoop Distribution 5.1, MapR Hadoop Distribution 3.1, PIG, HIVE, Oracle 11g/R2, Eclipse,
Linux, Sqoop and Flume
Roles and Responsibilities
• Responsible for collecting the data required for the behavioral interest detection system from the tag servers.
• Developed configuration and UDFs to parse the staged data, inferring the interests and purchase-level
intent of all visitors based on their activity.
• Performed data analytics in Pig and Hive, then exported these metrics to external systems using Sqoop
scripts.
• Provided ad-hoc queries using Hive and data metrics using Tableau Dashboard Reporting to the Business
Users.
• Implemented Partitioning, Buckets in Hive.
• Debugged and troubleshot issues in the development, stage, and production environments.
• Involved in minor and major release work activities.
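The Hive partitioning and bucketing mentioned above can be sketched in HiveQL; the table, column, and partition names here are hypothetical, not taken from the project:

```sql
-- Hypothetical interest-score table, partitioned by load date and
-- bucketed by visitor for faster sampling and joins.
CREATE TABLE topic_scores (
  visitor_id STRING,
  topic      STRING,
  score      DOUBLE
)
PARTITIONED BY (log_date STRING)
CLUSTERED BY (visitor_id) INTO 32 BUCKETS;

-- Enforce bucketing on insert, then load one day's results
-- from a hypothetical staging table.
SET hive.enforce.bucketing = true;
INSERT OVERWRITE TABLE topic_scores PARTITION (log_date = '2015-01-15')
SELECT visitor_id, topic, score
FROM staged_scores
WHERE dt = '2015-01-15';
```

Partition pruning then lets ad-hoc queries that filter on log_date scan only that day's data instead of the whole table.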
Oregon Medicaid (MMIS), HP Mar 2012 – Jun 2014
MMIS, like many healthcare organizations' systems, is a legacy system whose data clinicians and researchers
needed access to. The data types include EMR-generated data, genomic data, financial data, patient and caregiver
data, incremental physiological monitoring, ventilator data, and temperature and humidity data.
Any electronically generated data in a healthcare environment can be ingested and stored in Hadoop. We used
Hadoop to develop an analytic ecosystem that aids in the delivery of quality care at the lowest possible cost and an
environment that enables clinical researchers to examine healthcare data.
Role Description: Hadoop Developer
Tools and Software: HDFS, HIVE, Sqoop, PIG, Flume, MapReduce
Roles and Responsibilities
• Responsible for collecting the data required for testing various MapReduce applications from different sources.
• Developed Hive UDFs to parse the staged data to get the hit times of the claims from a specific hospital for a
particular diagnosis code.
• Designed workflow by scheduling Hive processes for log-file data, which is streamed into HDFS using Flume.
• Developed SQL scripts to compare all the records for every field and table at each phase of the data.
• Designed and developed ETL workflow using Oozie for business requirements, which includes automating the
extraction of data from a MySQL database into HDFS using Sqoop scripts.
• Performed data analytics in Hive, then exported these metrics back to an Oracle database using Sqoop.
• Provided ad-hoc queries and data metrics to the Business Users using Hive, Pig.
• Implemented Partitioning, Dynamic Partitions, Buckets in Hive.
• Debugged and troubleshot issues in the development and test environments.
• Conducted root-cause analysis and resolved production problems and data issues.
• Involved in minor and major release work activities.
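The Sqoop import and export steps described above can be sketched as command invocations; the connection strings, credentials, table names, and HDFS paths are hypothetical placeholders:

```
# Hypothetical import: pull source records from MySQL into HDFS
# (the extraction step automated under the Oozie workflow).
sqoop import \
  --connect jdbc:mysql://dbhost/mmis \
  --username etl_user -P \
  --table claim_events \
  --target-dir /data/staging/claim_events \
  -m 4

# Hypothetical export: push aggregated Hive metrics back to Oracle.
sqoop export \
  --connect jdbc:oracle:thin:@dbhost:1521:ORCL \
  --username etl_user -P \
  --table CLAIM_METRICS \
  --export-dir /user/hive/warehouse/claim_metrics \
  --input-fields-terminated-by '\001'
```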
Project # Cigna Intake Healthcare solution & Nevada MMIS
Cigna, HP & Nevada Medicaid, HP Oct 2010 – Feb 2012
The project involves implementation of B2B for 270/271, 276/277 & 837 transactions.
Role Description: EDI Developer
Tools and Software: Oracle Document Editor/Edifecs SpecBuilder 7.0.8 & 8.0, Edifecs XEngine, JavaScript, Oracle B2B,
Oracle SOA Suite 10g/11g, BPEL, DB Adapter
Business Requirements and Design
• Participated in partner discussions, gathered functional requirements, and prepared functional specification
documents and companion guides for 837 FFS claims and 270/271 and 276/277 transactions.
• Prepared technical design documents for B2B implementation and integration with legacy systems
• Prepared technical design document for Error Handling and auditing strategy for all the B2B interfaces
• Gathered requirements for custom message validation rules implemented through SpecBuilder JavaScript rules
and standard SpecBuilder rules
• Performed testing using XEngine/SpecBuilder and generated test cases and test data
Oracle B2B Design and Development
• Implemented all custom message validation rules using SpecBuilder JavaScript rules or standard SpecBuilder
rules
• Generated .ecs and .xsd files for documents using EDIFECS Spec Builder
270/271 – Eligibility and Benefits Inquiry
276/277 – Claim status request
837I - HIPAA Institutional Claims
837P - HIPAA Professional Claims
• Developed EDI guideline files using Edifecs SpecBuilder/Oracle Document Editor
• Implemented different SNIP-level validations and custom validations using JavaScript
• Prepared technical design documents and mapping documents for 270/271, and 276/277.
• Performed unit testing, integration testing and functional testing
Production Support
• Worked on several CRs and fixed issues encountered in production