This document is the resume of Ranjith Kumar. It summarizes his professional experience, technical skills, projects and education. Ranjith has worked since 2010 with technologies such as Spark, Hadoop and Scala alongside mainframe skills, with extensive experience in migrating data and building reports. He currently works on migrating panels from mainframes to Spark platforms at IMS Health.
Ranjith Kumar Resume
Ranjith Kumar
Mobile: +91 9742596739
E-mail: vemularanjithkumar.cse@gmail.com
PROFESSIONAL SUMMARY
• Certified Spark professional.
• Good hands-on experience with Spark (batch, Spark SQL, Spark Streaming), Hadoop, Hive and other big-data tools and technologies.
• Good hands-on experience in Scala programming.
• Knowledge of and hands-on experience with the Microsoft Azure cloud platform.
• Ability to learn new technologies and processes rapidly and apply them in projects.
• Extensive experience with mainframe skills such as COBOL, JCL and VSAM.
• Good knowledge of the complete system development life cycle and implementation.
• Has been part of coding, design, compilation, batch support and tool-testing activities in various projects.
• Good knowledge of REXX tool design.
• Highly motivated, with strong problem-solving and analytical skills.
• Good verbal and written communication skills, with excellent team bonding and coordination capabilities.
IT EXPERIENCE
• Working at IMS Health India Pvt Ltd from Feb 2015 to date
• Worked at Dell Services from Oct 2014 to Feb 2015
• Worked at Attra Infotech from July 2010 to Oct 2014
TECHNICAL SKILLS
Software Platform : Hadoop, HDFS, Hive, Spark SQL
Languages : Scala, Java, COBOL
Scripting Languages : UNIX/Linux shell
File Management : VSAM, JSON
Database : DB2, NoSQL databases
Tools : Sqoop, Scala IDE, Jupyter Notebook
EDUCATIONAL QUALIFICATION
Qualification   University/Institution   Year   Percentage
B.Tech (CSC)    Nagarjuna University     2009   70%
(Ranjith kumar)
Page | 1
PROJECTS EXPERIENCE
Project #1
Company : IMS Health Analytics Pvt Ltd
Project Name : Migration of MIDAS/MMIDAS panels to Spark
Client : IMS Health
Role : Senior Developer
Duration : Feb 2015 to date
Technology : Scala, Spark SQL, Spark batch, Spark Streaming, Microsoft Azure cloud platform
Tools used : Sqoop
Database : DB2
Distributed File System : HDFS
Project Description :
The purpose of the project is to migrate MIDAS/MMIDAS panels from mainframes to Spark. HDFS is used as the distributed file storage system for faster data access.
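As an illustration of the kind of record handling such a migration involves, the sketch below parses a fixed-width mainframe panel extract line into a typed Scala record before it would be loaded into Spark. The field layout (panel id, country, sales units) is a hypothetical assumption, not the actual MIDAS format.

```scala
// Hypothetical fixed-width layout for a panel extract line:
// panelId (chars 0-5), country (6-7), salesUnits (8-15).
case class PanelRecord(panelId: String, country: String, salesUnits: Long)

object PanelParser {
  // Parse one fixed-width mainframe extract line into a typed record.
  // A real MIDAS layout would differ; this layout is an assumption.
  def parse(line: String): PanelRecord =
    PanelRecord(
      panelId    = line.substring(0, 6).trim,
      country    = line.substring(6, 8).trim,
      salesUnits = line.substring(8, 16).trim.toLong
    )
}
```

Records parsed this way can then be turned into a DataFrame for Spark SQL queries.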
Responsibilities :
• Domain and project requirement understanding
• Writing and testing application code on the Microsoft Azure cloud platform
• Creating DataFrames and running queries in Jupyter Notebook on the Microsoft Azure cloud platform
• Writing high-level design specifications
• Writing technical specifications
• Interacting on weekly calls
• Coding as per the functional specification document
• Unit testing of code
Project #2
Company : Dell
Project Name : Experian MI Reports
Client : Experian
Role : Senior Developer
Duration : Feb 2013 to Feb 2014
Technology : COBOL, VSAM, JCL, DB2
Tools used : CHANGEMAN, SPUFI, FILEAID, FILE MANAGER
Database : DB2
Project Description :
Experian UK Ltd is one of the world's largest suppliers of finance-related information. It manages consumer, insurance, automotive and business data, mainly in the UK. The group helps businesses manage credit risk, prevent fraud, target marketing offers and automate decision making. Experian also helps individuals check their credit report and credit score, and protect against identity theft.
The purpose of the project is to create a daily report containing details of the previous day's applications. The processed applications are stored in VSAM files, with the application date held in one of the record blocks. A COBOL program reads the date and writes the report into different files based on the criteria specified in the input. The MI report is sent to the client's file using Connect:Direct processing.
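The report-splitting step described above can be sketched in Scala (the resume's primary language) rather than the original COBOL; the record fields and the product-based split criterion are assumed for illustration only.

```scala
import java.time.LocalDate

// Hypothetical application record as it might look after reading the VSAM extract.
case class Application(id: String, date: LocalDate, product: String)

object MiReport {
  // Keep only the previous day's applications and split them by product,
  // mirroring the COBOL step that writes the report into separate output files.
  def dailyReport(apps: Seq[Application],
                  today: LocalDate): Map[String, Seq[Application]] = {
    val yesterday = today.minusDays(1)
    apps.filter(_.date == yesterday).groupBy(_.product)
  }
}
```

Each map entry would then be written to its own output file for transfer to the client.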
Responsibilities :
• Impact analysis for every new requirement
• Domain and project requirement understanding
• Writing high-level design specifications
• Writing technical specifications
• Interacting with the client on weekly calls
• Coding as per the functional specification document
• Unit testing of code
Project #3
Company : Attra Infotech India Pvt Ltd
Project Name : Changeman and Falcon support
Client : GE Money
Role : Developer
Duration : July 2010 to Feb 2013
Technology : COBOL, REXX, SKEL, panel design
Tools used : CHANGEMAN, SPUFI, FILEAID, FILE MANAGER
Operating System : OS/390
Database : DB2
Project Description :
GE is a leading consumer finance company worldwide, offering a range of services including personal loans, credit cards, personal insurance, and interest-free and promotional retail finance.
Changeman support and Falcon support are two separate support streams covering 13 different GE businesses. As part of this work we customize COBOL code per business requirements for the Falcon applications, and customize SKELs and REXX for the Changeman applications.
Responsibilities :
• Impact analysis for every new requirement
• Domain and project requirement understanding
• Writing high-level design specifications
• Writing technical specifications
• Interacting with the client on weekly calls
• Coding as per the functional specification document
• Unit testing of code
I hereby confirm that the information in this document is accurate and true to the best of my knowledge.
Place: Bangalore
Date:
Signature:
(Ranjith Kumar)