NEELIMA KAMBAM
Hadoop Developer
Skill / Experience
Summary
10+ years of experience in IT, with 3+ years in Hadoop technologies: HDFS, Pig, Hive,
Sqoop, Flume and MapReduce.
Expertise in the Hadoop ecosystem (HDFS, Pig, Hive, Sqoop, Flume and MapReduce) for
scalability, distributed computing and high-performance computing.
2 years of experience in design/development of Solution Architecture for B2B/EAI
implementations in Healthcare, Manufacturing and Logistics verticals using Edifecs products.
Experienced in implementing EAI, B2B and business solutions. Excellent
communication, documentation and presentation skills.
6+ years of experience as a Mainframe Programmer/Analyst. Involved in the design,
development, maintenance and testing of mainframe applications using COBOL, DB2,
CICS, VSAM, JCL and various project methodology tools/utilities. Worked on software
applications in banking, insurance and manufacturing. Played a key role in
estimation, analysis, coding, testing, production support and interaction with
the users.
Design/Implementation experience
EAI, and B2B Solution Design, Development
Experienced in 837, 270/271, 276, 277 and 835 healthcare transactions; IT tools evaluation; EAI
competency center setup; performance tuning of EAI and B2B solutions
Domain Experience Marketing, Healthcare, Manufacturing, Logistics
Technology
Experience
Hadoop MapR Hadoop Distribution, Cloudera Hadoop Distribution (CDH3), Hive, Pig, Cloudera
Manager, HDFS, Sqoop, Flume.
Edifecs
XEngine, SpecBuilder, MapBuilder, EDI-HIPAA, EDI-X12, RosettaNet transactions, Transaction
Manager
Oracle Fusion
Middleware
Oracle SOA Suite 10.1.3.x/11.1.3 EAI/SOA B2B - EDI-X12, EDI-HIPAA Oracle B2B
Business Activity Monitoring IDE Jdeveloper, Eclipse
Application Servers &
Web Servers
Oracle Application Server, Apache
Database and Related
API
Oracle 9.2, PL/SQL, MySQL, NoSQL, DB2
Programming
Languages
Java, Cobol, JCL, SQL, VSAM & Easytrieve
Other Languages HTML, XML, XSD, XSL, JavaScript
Tools Tableau, JDeveloper, Eclipse, CVS, CICS, TSO/ISPF, QMF, SPUFI, Changeman, Expeditor,
SmartTest, File-AID, MQ Series, FTP, IBM Debugger and Control-M.
Operating Systems Linux, HP-UX, Windows, MVS/ESA, OS/2, UNIX, MS-DOS
Experience Profile – Key Projects
Project # Multi-Channel Engagement Behavioral Scoring
Cisco Systems, CA Aug 2014 – Present
The Cisco marketing and campaigning team requires each Cisco customer's propensity to buy Cisco
services and equipment (data center products, switches, routers, etc.) based on end-user activity on various Cisco
subsidiary websites (CDC, Communities, Social and Support), enriched with internal data from
various channels. This requires processing discrete web logs from multiple instances of various web servers
serving different websites. As part of this project, we accumulate, crunch, compute and analyze the data to
produce distinct topic-level scores that indicate a user's interest in buying the appropriate services or equipment.
Role Description: Hadoop Developer
Tools and Software: MapR Hadoop Distribution 5.1, MapR Hadoop Distribution 3.1, PIG, HIVE, Oracle 11g/R2, Eclipse,
Linux, Sqoop and Flume
Roles and Responsibilities
• Responsible for collecting the data required for the behavioral interest detection system from the tag servers.
• Developed configuration and UDFs to parse the staged data and infer the interests and purchase-level
intent of all visitors based on their activity.
• Performed data analytics in Pig and Hive and exported these metrics to external systems using Sqoop
scripts.
• Provided business users with ad-hoc Hive queries and data metrics via Tableau dashboard reporting.
• Implemented Partitioning, Buckets in Hive.
• Debugged and troubleshot issues in the development, stage and production environments.
• Involved in minor and major release work activities.
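The Hive partitioning and bucketing mentioned above can be sketched in HiveQL. The table, column and partition names below are hypothetical illustrations, not the actual project schema:

```sql
-- Hypothetical topic-score table, partitioned by load date and
-- bucketed by visitor id to speed up joins and sampling.
CREATE TABLE topic_scores (
  visitor_id STRING,
  topic      STRING,
  score      DOUBLE
)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (visitor_id) INTO 32 BUCKETS
STORED AS ORC;

-- Load one day's scores into its own partition.
INSERT OVERWRITE TABLE topic_scores PARTITION (load_date = '2016-01-15')
SELECT visitor_id, topic, score
FROM staged_activity;
```

Partitioning by date keeps daily score refreshes cheap (only one partition is rewritten), while bucketing by visitor id supports efficient sampling and bucketed map joins.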
Oregon Medicaid (MMIS), HP Mar 2012 – Jun 2014
Like many healthcare organizations, MMIS runs on a legacy system whose data clinicians and researchers needed
access to. The data types include EMR-generated data, genomic data, financial data, patient and caregiver data,
incremental physiological monitoring, ventilator data, and temperature and humidity data.
Any electronically generated data in a healthcare environment can be ingested and stored in Hadoop. We used
Hadoop to develop an analytic ecosystem that aids the delivery of quality care at the lowest possible cost and an
environment that enables clinical researchers to examine healthcare data.
Role Description: Hadoop Developer
Tools and Software: HDFS, HIVE, Sqoop, PIG, Flume, MapReduce
Roles and Responsibilities
• Responsible for collecting the data required for testing various MapReduce applications from different sources.
• Developed Hive UDFs to parse the staged data and obtain the hit times of claims from a specific hospital for a
particular diagnosis code.
• Designed a workflow by scheduling Hive processes for log file data streamed into HDFS using Flume.
• Developed SQL scripts to compare all the records for every field and table at each phase of the data.
• Designed and developed an ETL workflow using Oozie for business requirements, including automating the
extraction of data from a MySQL database into HDFS using Sqoop scripts.
• Performed data analytics in Hive and exported these metrics back to the Oracle database using Sqoop.
• Provided ad-hoc queries and data metrics to the Business Users using Hive, Pig.
• Implemented Partitioning, Dynamic Partitions, Buckets in Hive.
• Debugged and troubleshot issues in the development and test environments.
• Conducted root cause analysis and resolved production problems and data issues.
• Involved in minor and major release work activities.
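A minimal HiveQL sketch of the UDF registration and dynamic partitioning described above; the table, column, JAR and UDF names (claims, parse_hit_time) are hypothetical:

```sql
-- Allow dynamic partition inserts when loading claim data by service date.
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- Register a custom UDF from a JAR (names are hypothetical).
ADD JAR /tmp/claim-udfs.jar;
CREATE TEMPORARY FUNCTION parse_hit_time AS 'com.example.hive.ParseHitTime';

-- Count claim hits per hospital for one diagnosis code.
SELECT hospital_id, COUNT(*) AS hit_count
FROM claims
WHERE diagnosis_code = '250.00'
GROUP BY hospital_id;
```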
Project # Cigna Intake Healthcare solution & Nevada MMIS
Cigna, HP & Nevada Medicaid, HP Oct 2010 – Feb 2012
The project involves implementation of B2B for 270/271, 276/277 & 837 transactions.
Role Description: EDI Developer
Tools and Software: Oracle Document Editor/Edifecs SpecBuilder 7.0.8 & 8.0, Edifecs XEngine, JavaScript, Oracle B2B,
Oracle SOA Suite 10g/11g, BPEL, DB Adapter
Business Requirements and Design
• Participated in partner discussions, gathered functional requirements and prepared Functional specifications
document and companion guides for 837 FFS claims, 270/271 transactions, 276/277 transactions.
• Prepared technical design documents for B2B implementation and integration with legacy systems
• Prepared technical design document for Error Handling and auditing strategy for all the B2B interfaces
• Gathered requirements for custom message validation rules implemented via SpecBuilder JavaScript rules and
built-in SpecBuilder rules
• Performed testing using XEngine/SpecBuilder and generated test cases and test data
Oracle B2B Design and Development
• Implemented all custom message validation rules using SpecBuilder JavaScript rules or built-in SpecBuilder
rules
• Generated .ecs and .xsd files for documents using EDIFECS Spec Builder
270/271 – Eligibility and Benefits Inquiry
276/277 – Claim status request
837I - HIPAA Institutional Claims
837P - HIPAA Professional Claims
• Developed EDI guideline files using Edifecs SpecBuilder/Oracle Document Editor
• Implemented different SNIP-level validations and custom validations using JavaScript
• Prepared technical design documents and mapping documents for 270/271, and 276/277.
• Performed unit testing, integration testing and functional testing
Production Support
• Worked on several CRs and fixed issues encountered in production
EDUCATION: B.Tech. in Mechanical Engineering from S.V University, India.