Ramesh
Mobile: +91-9987713828
SAP BOBJ EIM- IS/DS Consultant
Mail ID: rameshbods@gmail.com
SUMMARY
 Overall 5 years of IT experience covering SAP EIM Data Services (BODS)/Data Integrator (BODI), SAP
BusinessObjects Web-I, BODS administration, SAP Information Steward, and SAP HANA; involved in
end-to-end DWH implementation projects.
BODI/DS Experience:
 POC on SAP Information Steward using Metapedia, MDM, Data Insight, Cleansing Package Builder, and Match
and Consolidation.
 POC on SAP HANA: created all types of information views, SLT and BODS replication, and analysis on the
HANA database using various reporting tools.
 Created many validation rules to validate sample data using the SAP Information Steward scripting language.
 Experience with BusinessObjects MDM to integrate legacy systems with the BO system.
 Implemented a data assessment solution by configuring data profiling and regularly collecting the profiling
task results in the form of dashboards.
 Experienced in configuring Information steward for Data Profiling.
 Experienced in building data quality monitoring dashboards and financial impact dashboards.
 Implemented data governance and the failed data repository.
 Designed cleansing packages in IS and used them in SAP BODS data quality transforms.
 Complete knowledge in implementing Information Steward Administration.
 Strong scripting knowledge of SAP BODS/IS.
 Experience working with BODI/DS with different data sources (flat files, Oracle) and SAP
ECC & BI/BW.
 Strong skills in writing SQL Queries on Oracle, MS-SQL Server.
 An energetic, self-motivated designer with hands-on experience in creating batch jobs using the ETL tools BO
Data Integrator and SAP Data Services.
 Optimized the batch jobs developed in SAP BODS/DI and tuned their performance using various techniques
in BODS.
 Experience in SAP BOBJ BODS administration tasks such as user creation and adding repositories to job servers.
 Used different performance tuning techniques such as parallel processing, multi-threading, partitioning, and bulk
loading to improve extraction and loading performance.
 Implemented Multi-User access management and Version Management by installing the Secured Central
Repository.
 Developed Crystal Xcelsius components (grids, pie charts, bar charts) and the animation actions
for those components.
 Expertise in Business Objects administrator settings for various users and assigning rights and permissions for
various features, objects and users in Central Management Console.
 Knowledge of complete SDLC process.
 Worked extensively with the Data Services Management Console for administration of users, configuring
real-time clients, Access Servers, and real-time jobs, and monitoring the statistics reports.
Technical Skills:
ETL Tools : SAP BODI/DS, DataStage
EIM Tools : SAP Information Steward
Reporting Tools : SAP BO, Xcelsius, Web-I, Tableau
Legacy Systems : SAP ERP, SAP BW, Oracle Apps
Operating Systems : MS Windows
Databases : Oracle, MS SQL Server, MS Access 2000, SAP HANA
Languages : C, C++, SQL/PL-SQL
Packages : BOBJ Rapid Marts
Professional Experience:
 Working with NDS Infotech from Oct 2011 till date.
 Worked with iGrid Technology Solutions, Hyderabad, from Sep 2010 to Oct 2011.
Project#1:
Project Name : SAP BOBJ Rapid Marts, DWH Implementation
Client : Tempur-Pedic (Product), U.S.
Team size : 5
Role : BODS, BO Developer and Administrator
Environment : SAP Data Services 4.2, SAP IS 4.2, Oracle, Windows Server 2008, BO
Duration : Nov 2011 till date
Roles and responsibilities:
 Involved in gathering requirements, understanding the current LDWH, and preparing the design data flow for
adopting the EDW.
 Extracted data from multiple sources (flat files/tables) of the Oracle Apps ERP system.
 Interacted with Business Analyst and business users to understand the business requirements and Gathered
requirements for enterprise data warehouse schemas.
 Performed source system analysis; analyzed the source and target data elements, and created the mapping
document and the unit and system test plans.
 Implemented Rapid Mart Recovery mechanism for Delta Management.
 Implemented error handling using the TRY/CATCH mechanism and event-based triggering written in the BODS scripting language.
 Generated multiple event files per job to trigger report processing and reduce the burden on the reporting server.
 Implemented dynamic path selection by creating variables and parameters.
 Performed out-of-the-box implementation of Rapid Marts packages (AP, AR, Inventory, Sales, Purchasing).
 Scheduled delta jobs to process the delta (changed) data daily.
 Designed and debugged ETL processes to extract data from source systems, transform it per business
requirements, and load it into dimensional data models to support BI reports, dashboards, and
scorecards.
 Prepared data flow and technical specification documents and script files for tables.
Project#2:
Client : RTI International (Metals), U.S.
Team size : 3
Role : BODI, BO Developer and BO Administrator
Environment : Data Services 3.2, Oracle, Windows Server 2003, BO
Duration : Mar 2011 to Oct 2011
Roles and responsibilities:
 Involved in gathering requirements, understanding the current LDWH, and preparing the design data flow for
adopting the EDW by implementing the SAP BOBJ CC Rapid Mart.
 Prepared the system and executed all pre-install activities before installing the standard CC Rapid Mart, i.e.
repository creation and the creation and management of the Job Server.
 Enhanced the standard Rapid Mart workflows to enable ETL of custom tables in SAP by developing new
data flows, pulling data using the LOOKUP() and SUBSTR functions.
 Implemented Rapid Mart Recovery mechanism for Delta Management.
 Schedule Delta Jobs to Execute the Delta Data or Changed Data Daily.
 Maintained the directories with data files & ABAP files to enable time dimension data loads using custom
logic.
 Split the standard ETL jobs into individual jobs for master data, hierarchies, Cost Center detail and
summary, Profit Center detail and summary, and SAP metadata.
 Prepared data flow and technical documents.
Project#3:
Client : BASF Chemicals, U.S.A.
Team size : 6
Role : BODI and BO Developer.
Duration : SEP 2010 – Mar 2011
Environment : BODI/DS, Oracle 10g, Windows XP, BO
Roles and responsibilities:
 Enhanced the standard SAP dataflow to integrate non-SAP source system data, viz. flat files and XML files.
 Designed an ETL job that extracts Excel workbook sheets dynamically and supports multiple
workbook extraction.
 Designed an ETL job that integrates flat file and XML data and loads it into a single target using the XML_PIPELINE
and MERGE transforms.
 Designed an ETL job that distributes data from multiple tables into a nested structure in XML format for external
systems using the TEMPLATE_XML object.
 Used the BODS Data Integrator transforms PIVOT and REVERSE_PIVOT to apply row-level and column-level
transformations according to reporting requirements.
 Developed a standard BODS ETL Job to load the Time Dimension table using Data Integrator Transform
DATE_GENERATION
 Implemented Slowly Changing Dimension Type 2 (SCD Type 2) for the required dimensions using the
TABLE_COMPARISON, HISTORY_PRESERVING, and KEY_GENERATION
transforms.
 Implemented ETL logic to identify the holidays of the year by adding a new indicator column in the
time dimension.
 Implemented the DWH dimension concept, i.e. a dimension with a NULL record, using the platform
transform ROW_GENERATION.
 Used the LOOKUP_EXT and LOOKUP_SEQ functions to derive columns in BODS ETL by looking
up values in validity-type lookup tables.
 Implemented ETL data flows with the DATA_TRANSFER transform to improve data loading
performance by forcibly generating the INSERT INTO…SELECT statement.
Education:
 Master of Computer Applications (MCA) from Osmania University in 2010.