Jeff has over 33 years of experience in IT consulting, product development, and system operations. He has expertise in big data technologies including Hadoop, Spark, and Hive. Most recently as a Big Data Architect, he helped customers optimize data warehouse workloads on Hadoop. He also led teams to design and build innovative tools for automating data warehouse migrations to Hadoop. Jeff has extensive experience developing, operating, and administering large-scale production environments and big data initiatives.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Mani Sagar is an ETL Sr Developer and Lead with over 8 years of experience in designing, developing, and maintaining large enterprise applications. He has expert knowledge of ETL technologies like Informatica and data management processes including data migration, profiling, quality, security, and warehousing. He has led teams of up to 8 developers and delivered projects on time for clients across various industries.
This document provides a summary of Rajesh Dheeti's professional experience and qualifications. It summarizes his 4+ years of experience developing ETL processes using Informatica PowerCenter to extract, transform, and load data from sources like Oracle and Teradata. It also lists 5 projects he has worked on involving building ETL mappings and workflows to load data into data warehouses.
The document provides a summary of Prasenjit Chowdhury's experience and qualifications. He has over 12 years of experience in information technology, including as an Exadata administrator. He has skills in Oracle database administration, Exadata administration, and backup and recovery. His objective is to work as a customer-focused solution architect utilizing skills in technologies like Teradata and Hadoop.
This document contains a professional profile for Jeevananthan R, including his contact details, educational background, work experience, skills, and strengths. He has over 4 years of experience in data warehousing using tools like Informatica, Oracle, and Unix shell scripting. Currently working as an ETL Developer/Module Lead at Cognizant Technologies, he has previously worked on projects involving data extraction, transformation, and loading for clients like Smith & Nephew and CVS Caremark.
Munir Muhammad has over 15 years of experience as an Oracle PL/SQL and ETL developer. He currently works as a senior database developer at JP Morgan Chase, where he supports various projects involving Oracle E-Business Suite. Previously, he held roles as a senior data warehouse developer at Barclays Bank and as a senior Oracle developer for the Department of Correction in Delaware. He has extensive experience with Oracle databases, PL/SQL, ETL processes, data modeling, and report development.
The document provides a summary of an ETL developer's skills and experience. It includes 3+ years of experience developing ETL processes in IBM InfoSphere Datastage 9.1. Specific experience includes developing Datastage jobs using various stages, debugging, performance tuning, implementing slowly changing dimensions, and working with databases like Oracle, SQL Server and Netezza. Project experience is provided for three projects involving reverse mortgage data warehousing, risk data warehousing, and an order tracking application. Responsibilities included developing ETL processes, testing, and supporting production environments.
Sreekanth has over 5 years of experience in data warehousing and ETL development using IBM DataStage. He has worked on projects in the telecom and automotive industries, extracting data from various sources and loading it into data warehouses. His responsibilities included designing and developing ETL jobs, testing, troubleshooting, performance tuning, and providing production support. He is proficient in DataStage, Oracle, Teradata, UNIX, and scheduling tools like Autosys.
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has designed, developed, deployed and supported large scale ETL processes involving sources over 100 terabytes. He is specialized in IBM InfoSphere Datastage and Teradata utilities. He has worked on multiple industries including healthcare, banking and insurance.
This resume summarizes Arbind Kumar Jha's experience working with big data technologies like Hadoop, Hive, Pig, and HBase. He has over 12 years of IT experience, including 1.5 years working with Hadoop. His current role is a Technical Architect Lead at HCL Technologies, where he works on architectures, designs, and develops solutions involving big data, NoSQL, Hadoop, and BIRT. His technical skills include programming languages like Java, databases like Oracle and SQL Server, and big data tools like Hadoop, Hive, Pig, Cassandra, and Flume.
Sudhir Gajjela has over 3 years of experience as a Big Data Hadoop Administrator and Informatica Administrator. He has expertise in architecting, building, supporting and troubleshooting both Cloudera and Hortonworks Big Data clusters as well as various Informatica tools. He has also worked as an Informatica Developer. His skills include Hadoop services like HDFS, Hive, HBase, Zookeeper, Flume, Sqoop, Oozie and Storm as well as Informatica tools such as PowerCenter, PowerExchange, Data Quality, MDM, Cloud, BDE and BDM. He has delivered training sessions to colleagues on topics such as Big Data and Hadoop.
Pedro P. Gatica Jr. provides his contact information and an extensive professional profile summarizing his experience and skills as an IT Lead Software Developer/Integrator with over 25 years of experience in data warehousing, ETL development, and project management. He has expertise in technologies like DataStage, Informatica, DB2, Oracle, and SQL Server. His experience includes roles at USAA from 1997 to 2014 where he led many strategic data integration projects and established batch and real-time ETL environments.
This document contains the resume of Vipin KP, who has over 5 years of experience as a Big Data Hadoop Developer. He has extensive experience developing Hadoop applications for clients such as EMC, Apple, Dun & Bradstreet, Nielsen, Commonwealth Bank of Australia, and Nokia Siemens Network. He has expertise in technologies such as Hadoop, Hive, Pig, Sqoop, Oozie, and Spark and has developed ETL processes, data pipelines, and analytics solutions on Hadoop clusters. He holds a Master's degree in Computer Science and is Cloudera certified in Hadoop development.
This document contains the resume of Hassan Qureshi. He has over 9 years of experience as a Hadoop Lead Developer with expertise in technologies like Hadoop, HDFS, Hive, Pig and HBase. Currently he works as the technical lead of a data engineering team developing insights from data. He has extensive hands-on experience installing, configuring and maintaining Hadoop clusters in different environments.
This document contains a resume summary for Richa Sharma highlighting her 8.5 years of experience in data warehousing, ETL development, and business intelligence. She has expertise in IBM Datastage and Cognos reporting tools. Her experience includes data modeling, ETL development, database administration, performance tuning, and project experience with clients in the retail and telecom industries.
Saroj Mahanta has nearly 9 years of experience as an Oracle Apps Database Administrator. He has experience administering Oracle R12 databases for clients such as Microsoft, Virgin Australia, JP Morgan Chase, Sonae, and Comverse Inc. Currently, he works for Microsoft in Espoo, Finland where he supports developers and helps migrate the database from Oracle to Microsoft SQL Server.
Theodore W. Dennis has over 24 years of experience as a software engineer with expertise in enterprise database application solutions. He has worked on projects across various industries and technologies, including agile methodologies, cloud technologies, databases, programming languages, and tools. His experience spans roles from staff augmentation consultant to technical lead. Recent projects include developing Java and database components for a global web application, creating and supporting EDI interfaces in Oracle PL/SQL, and architecting a data services web portal using ColdFusion.
Abinash Bindhani is a senior systems engineer at Infosys with 2.4 years of experience in Big Data and Hadoop.
Abinash Bindhani is seeking a position as a Hadoop developer where he can utilize over 2 years of experience with Hadoop and Java technologies. He currently works as a senior systems engineer at Infosys where he has gained experience migrating data from Oracle to Hadoop platforms and collecting/analyzing log data using tools like Flume, Pig, and Hive. His technical skills include MapReduce, HBase, HDFS, Java, Spring, MySQL, and Apache Tomcat. He has expertise in Hadoop architecture, cluster concepts, and each phase of the software development life cycle.
This document is a curriculum vitae for Majid Izadi that provides information about his professional experience and qualifications. It summarizes his experience as an Oracle Applications Developer with over 20 years of experience developing and supporting large-scale Oracle software applications. It also lists his positions held at various companies in the IT sector along with descriptions of projects he has worked on involving Oracle databases, Forms, Reports, PL/SQL, and establishing communication between different systems.
This resume summarizes Sanjaykumar Mane's qualifications and experience. He has over 15 years of experience in database engineering. His skills include Oracle 11g, PL/SQL, SQL*Loader, UNIX, data modeling, ETL tools like Pentaho and Oracle Data Integration. He has worked as a technical lead and project leader on various projects involving data migration, report generation, and database design. His most recent experience is as a technical lead for CITI, where he worked on MemSQL and ODI proof of concepts.
Yuvaraj Shanmugam has over 18 years of experience in application architecture, development, and maintenance using technologies like Java, J2EE, web services, mainframe applications, and databases. He has worked as an application architect at Syntel and a technical architect at HTC Global Services, leading teams and designing and developing various applications. He has extensive experience with technologies such as Java, databases like DB2 and PostgreSQL, mainframe tools, SOA, and big data technologies.
Kallesha has over 4 years of experience as an Informatica/PLSQL developer. She has extensive experience developing mappings in Informatica to extract, transform and load data from various sources into data warehouses. She has worked on projects in various domains including storage, sales, banking, and finance. Kallesha is proficient in technologies like Informatica, Pentaho, Hive, HBase, Pig, Oracle, Teradata, and Shell scripting.
This document contains a summary of Amit Kumar's professional experience and qualifications. He has over 9 years of IT experience, including 8 years of data warehouse and business intelligence experience. Currently he works as a data architect at Capgemini, leading a team of 12 on an oil and gas equipment install base project. He has extensive experience designing and developing data integration solutions using tools like Informatica, Hadoop, and SQL.
This document contains a summary of skills and experience for Chandrashekhar seeking challenging assignments as an Oracle SQL and PL/SQL developer. He has over 5 years of experience with Oracle databases, SQL, PL/SQL, Oracle Forms/Reports, and XML. His experience includes data migration, database upgrades, performance tuning, software development, and testing. He is proficient with tools like SQL Developer, TOAD, and PL/SQL Developer.
This document contains a summary of Azhar Mohammed's experience and qualifications. He has over 9 years of experience in IT with a focus on Informatica administration. Specifically, he has 4+ years of experience administering Informatica PowerCenter versions 9.5, 9.1, 8.6, 8.5, and 7.1. He also has experience with Informatica Data Quality and other tools. His background includes administration, installation, configuration, troubleshooting, and upgrading various versions of Informatica software.
Michael De Mar has extensive experience as an HL7 interface analyst with skills in many programming languages and software. He has implemented solutions for both large healthcare systems and smaller practices. Most recently, he has worked as a senior interface analyst for Allscripts where he helped create over 2000 interfaces across 17 facilities. He also has a master's degree in computer science from Hofstra University.
The document provides a summary of Praveena Chandrasekaran's professional experience and skills. She has over 7 years of experience in data warehousing, business intelligence, and ETL development using Informatica. Some of her roles included developing ETL mappings and workflows to load data from various sources into targets, performing testing, and leading a project as an onsite coordinator. She has strong skills in SQL, databases, data modeling, Informatica, and UNIX scripting.
Anantha Krishnan T M S has over 7 years of experience as an Oracle/PL-SQL developer. He has extensive experience designing and developing data warehouses, ETL processes, batch jobs, and front-end applications for large financial clients like CVS Health and JPMorgan Chase. Some of his key skills include Oracle database administration, performance tuning, data modeling, Informatica ETL, UNIX scripting, and Agile methodologies. He is looking for a role where he can utilize his expertise in Oracle, databases, and data warehousing.
Chandan Das is a developer/designer with over 7 years of experience in IT implementation projects using technologies like Teradata, Oracle, Hadoop, Pig, Hive, and Sqoop. He has extensive experience in data warehousing, ETL, and database administration. His career objective is to obtain a productive role in an IT organization where he can implement his expertise in developing complex projects efficiently and meeting expectations. He provides details of his professional experience, technical skills, key achievements and completed projects.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes for data migration, analytics projects for clients in various industries. His roles have included requirement analysis, mapping design, testing, performance tuning and managing project timelines.
The document provides a summary of Subramanyam M's professional experience including 4+ years working with ETL tools like IBM Datastage and Ascential Datastage. He has experience designing, developing and testing data warehouse projects involving data extraction, transformation and loading. He also has skills in SQL, Oracle, DB2, Unix and data warehousing techniques. The profile outlines various ETL projects he has worked on involving large datasets for clients like HSBC Bank, Scope International and Bharti Airtel.
Pallavi Gokhale Mishra has over 16 years of experience in data migration, data warehousing, project management, and software development. She currently works as a Solution Architect for IBM India on a project involving data migration from Oracle CRM to Siebel CRM for Vodafone India. Her experience includes managing teams and leading complex data migration projects involving Siebel, SAP, and other applications for clients in telecom, automotive, banking, and other industries. She has strong skills in ETL tools like IBM Datastage, databases like Oracle, and programming languages like SQL.
The document provides a technical summary and experience profile of Nootan Sharma. It summarizes his 8 years of experience in data warehousing and business intelligence projects. It details his expertise in tools like Informatica PowerCenter, Oracle, SQL Server and data quality management. It also lists his past work experience with companies like Capgemini, Birlasoft and Infogain on various BI and data warehousing projects for clients in different sectors.
Sean Lynch has over 15 years of experience leading teams that deliver software projects on time and under budget. He currently works as a Senior System Analyst and Solutions Architect at Blue Cross Blue Shield, where he has helped design and implement their large national data warehouse system. Previously he has worked as a consultant for several insurance and financial companies, managing projects and teams. He has a focus on quality results and extensive experience across the software development lifecycle.
Vishwanath Mallanagouda is a data warehouse application developer with over 4 years of experience working with technologies like Informatica, Oracle, SQL, and Hadoop. He has expertise in ETL tool development, data modeling, and database administration. Currently working as an application developer at Deloitte India Consulting, his past experience includes projects for banking, insurance, and public sector clients at IBM.
Pradeep Kumar Pandey has over 10 years of experience as a data/systems integration specialist and ETL expert. He has extensive experience designing and implementing data warehouses using tools like IBM DataStage, Informatica, Oracle OBIEE, and Oracle OBIA. He has led teams and taken on roles such as developer, technical lead, and team lead. Pradeep has worked on projects across various industries including telecom, financial services, HR, and retail.
- The document contains the resume of Abdul Mohammed, an ETL developer with 8 years of experience using Informatica for data warehousing projects.
- He has expertise in requirements gathering, data extraction from various sources, transforming the data using Informatica tools, and loading the data into target databases.
- His most recent role was as an ETL/SR Informatica Lead from 2015-present where he worked on building a data warehouse for a pharmaceutical company using Informatica to extract data from Oracle and flat files.
Suneel Manne has over 4.9 years of experience in IT with a focus on data warehousing using tools like IBM Data Stage. He has expertise in designing, developing, and implementing ETL processes to extract data from various sources and load it into data warehouses. Some of his key skills include SQL, DB2, Data Stage, Linux, bug tracking tools, and data modeling. He has worked on several projects for clients like GTECH Corporation and T-Mobile BI, taking on roles such as ETL Developer and Big Data Developer.
Sanjay Lakhanpal has over 9 years of experience as a project lead and developer working on data warehousing, application development, and distributed applications projects. He has expertise in big data technologies like Hadoop, MapReduce, Hive, Pig, and Sqoop. As a project lead and developer, he is responsible for designing solutions, developing ETL jobs, writing scripts, testing, and leading teams. He has worked on projects in various domains for clients such as IBM, Orange, XL Capital, Chola Mandalam, and HMEL.
Ratna Rao Yamani has over 9 years of experience in IT and 7 years of experience with data warehousing technologies like Informatica Power Center and Informatica MDM. They have extensive experience developing ETL code, working with databases like Oracle and DB2, and performing tasks like requirements gathering, design documentation, testing, and performance tuning for various projects involving data integration and data warehousing.
Balwant Singh is a senior package solution consultant with over 5 years of experience in ETL, data warehousing, and reporting solutions using technologies such as ODI 11g, OBIEE 11g, and Oracle databases. He has extensive experience designing and implementing data integration projects for customers in various industries. Currently, he works for IBM India on an MSI project involving OBIEE, ODI, and BI applications.
• 11+ years of IT industry experience in the analysis, design, development, maintenance, and support of various software applications, mainly in Data Warehousing (Informatica PowerCenter, OWB, SSIS, and Business Objects), Oracle (SQL, PL/SQL), and Teradata, across industry verticals such as Finance, Telecom, Retail, and Healthcare.
• Work experience in client facing roles in UK and Ireland.
• Performed numerous roles in Business Intelligence projects, including Data Warehouse System Analyst, ETL Designer, Onshore Coordinator, Technical Lead and Senior Data Warehouse Developer, with multinational, results-driven IT organizations.
• Extensive experience on Data Integration projects accessing sources like Teradata, Oracle and SQL Server.
• Created robust EDW solutions from various types of sources like flat files, XML files, EBCDIC COBOL copybooks from mainframe systems, and DB2 unload files.
• Extensive experience on Data discovery, cleansing using Informatica IDQ.
• Resolved Inconsistent and Duplicate Data issues during Data Analysis to Support Strategic EDW Goals.
• Extensive experience of Data Integration using Informatica Power center Tool stack.
• Strong knowledge on Data Warehousing concepts, ETL concepts, Data Modeling, Dimensional Modeling.
• Conducted training on Informatica and received awards for proficient training capabilities.
• Excellent understanding of OLTP and OLAP concepts and expert in writing SQL, Stored procedure on Teradata, Oracle and SQL Server.
• Extensive experience in implementing Data Warehousing methodologies including Star Schema, Snowflake Schema and 3NF for huge data warehouses.
• Extensive knowledge on Change Data Capture (CDC) and SCD Type 1, Type 2, Type 3 Implementations.
• Excellent understanding of Kimball and Inmon Methodologies.
• Provided leadership when addressing high level technical issues and questions with the functionality of the reporting and business intelligence applications.
• Managed current engineering needs and strategized to foresee and plan for future needs in the Data Integration space.
• Performed roles as an interface and coordinator between Database Administration, ETL Development, Testing and Reporting teams to eliminate roadblocks for the smooth flow of information.
• Hands-on experience in tuning ETL mappings, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
• Expert in designing and developing complicated ETL mappings using Informatica PowerCenter.
• Proficient in optimizing performance issues using Informatica PowerCenter and Teradata.
• Experienced in using Teradata utilities (TPT, BTEQ, FastLoad, MultiLoad, FastExport, TPump).
• Exposure to writing shell scripts as per given requirements.
• Worked extensively with the Teradata GCFR tool.
• Experience in SAP ECC integration with Informatica.
• Received training in Tableau, QlikView and SAP BW 3.5, and completed POCs for the same.
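The Change Data Capture and SCD Type 2 implementations listed above can be illustrated with a minimal sketch. This is a hypothetical, simplified Python routine, not the original implementation (which would have used Informatica PowerCenter and Teradata); all names and fields are invented for the example:

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Minimal SCD Type 2: expire the current row and append a new version
    when a tracked attribute changes; append brand-new keys directly."""
    current = {row["key"]: row for row in dimension if row["end_date"] is None}
    for rec in incoming:
        active = current.get(rec["key"])
        if active is None:
            # New business key: insert the first version.
            dimension.append({"key": rec["key"], "attr": rec["attr"],
                              "start_date": load_date, "end_date": None})
        elif active["attr"] != rec["attr"]:
            # Changed attribute: close the old version, open a new one.
            active["end_date"] = load_date
            dimension.append({"key": rec["key"], "attr": rec["attr"],
                              "start_date": load_date, "end_date": None})
    return dimension

dim = [{"key": 1, "attr": "Austin", "start_date": date(2020, 1, 1), "end_date": None}]
inc = [{"key": 1, "attr": "McLean"}, {"key": 2, "attr": "Columbus"}]
apply_scd2(dim, inc, date(2021, 6, 1))
# dim now holds three rows: the expired Austin version plus two active rows
```

A Type 1 variant would simply overwrite `attr` in place, and Type 3 would keep the prior value in an extra column; the version-row pattern above is what distinguishes Type 2.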
Ipsita Mohanty has over 9 years of experience in the IT industry as a software engineer. She has strong skills in Informatica, SQL, PL/SQL, and data warehousing. She has experience leading teams and mentoring others. Her past roles include projects in various domains involving data integration, ETL development, and business intelligence reporting.
John Plunkett III is an experienced database professional seeking a position as a Database Administrator or Data Manager. He has skills in Oracle, Sybase, SQL Server, and other databases, as well as experience tuning databases for performance and ensuring data integrity. His background includes roles supporting SNP, Trizetto Group, Caesars Entertainment, and other organizations.
Jeffrey W. Richardson
mmasigung@gmail.com
(512) 699-8556
Page 1 of 4
SUMMARY
Jeff has over 33 years of experience in IT consulting, product development and system operations. From
his early degree in Electronic Data Processing to his current work in Big Data / Hadoop technologies, Jeff
has continuously honed his skills, expertise, and passion in the design, development and implementation
of large scale solutions. His most recent experiences have been with MetiStream, Inc., as a Big Data
Architect supporting customer projects around data warehousing offload and optimization efforts to
Hadoop. Leveraging Big Data open source technologies, Jeff led MetiStream’s efforts to design and build
innovative tools and best practices to automate and expedite data warehouse migrations to Hadoop.
While at IBM, Jeff achieved the highest level of IT Specialist Certification and his last assignment there
was with the Big Data Analytics Center of Competency. For many large customer engagements, Jeff has
been responsible for the development, operation and administration of large scale production, test and
development environments including a number of significant Big Data initiatives at communication
providers and banking institutions.
TECHNICAL SKILLS
Big Data Technologies
Big Data: Hadoop, Sqoop, Spark, Hive, Impala, YARN, Oozie, Parquet, Avro, Cloudera Manager
Programming Languages
Proficient: korn shell, bash, awk, sed
Comfortable: HTML, JavaScript, SQL
Working knowledge: Scala, C, Java (JDBC), Python, Perl, COBOL, RPG II, FORTRAN, Assembler
Development Platforms & Tools
Platforms Linux, Windows, z/VM, AIX, Solaris, Amazon Web Services (AWS), SkyTap
IDEs Eclipse, Aginity Workbench, IBM Data Studio, Toad for DB2, IntelliJ
Databases Proficient: IBM DB2 LUW, Netezza Performance Server, Hive, Impala
Moderate: IBM Informix
Academic understanding: Oracle, MySQL, PostgreSQL, Cloudscape/Derby,
Sybase, Teradata
Assorted Tools VMware Workstation, IBM InfoSphere Data Architect, Docker, JIRA, GitHub
Application Server IBM WebSphere Application Server
Security Kerberos, Center for Internet Security benchmarks
EDUCATION
1994 B.A. Computer Science St. Edward’s University
1982 A.A.S. Electronic Data Processing Temple Junior College.
TRAINING / CERTIFICATIONS
2015 Cloudera Certified Administrator for Apache Hadoop Cloudera
2012 IBM Certified Specialist – Netezza Platform Software v6 IBM
2012 IBM Level 3 IT Specialist Certification IBM
2012 IBM Certified Database Associate DB2 10.1 IBM
2009 IBM Certified Database Administrator DB2 9 LUW IBM
2004 IBM Certified Database Administrator DB2 UDB V8.1 IBM
WORK EXPERIENCE
March 2015 – April 2016 MetiStream, McLean, VA – Big Data Architect
• Big Data Engineer, as part of a MetiStream team assisting a health care services and products company
o Developed a test automation framework based on Hadoop components
o Technologies included Oozie workflows controlling Hive, shell and email actions.
o Framework included a test case template, business rules table (pass/fail criteria), results
table, email and report generation
o Ported Oracle and Netezza SQL query test cases to Hive Query Language
o Provided documentation and training to the company
• Big Data Engineer, as part of a Cloudera professional services technical team at a large financial institution
o Supported architecture, design and implementation phases of a new analytic process,
offloading work from Netezza to Hadoop for 3 years of data with an estimated size of
approximately 5TB.
o Supported Hadoop application architecture review and Data Pipeline review including the
evaluation of data sources, data processing jobs, analytic processes, and SLAs.
o Worked with Cloudera to evaluate application data access patterns, how data schemas are
managed and evolved, and the effectiveness of the partitioning system
o In collaboration with Cloudera and customer team members, modified and extended a
prototype which loads merchant / location data from the current Netezza data warehouse to
Hadoop. Using Apache Sqoop and other Big Data solutions, helped build and manage jobs
and the overall data pipeline solution to pull data from the Netezza system and create
transformations to publish the dataset within the storage schema.
o Technologies involved were Sqoop, Pig, Hive, Impala, YARN, Oozie, Crunch, Kerberos,
Parquet, Avro, Python, YAML, HQL, Korn shell, bash, Cloudera Manager.
• Using Apache Spark and other open source Big Data technologies, led a team of four engineers in the development of MetiStream’s Translation Engine and provided overall support and best practices around MetiStream’s Netezza / DW Offload and Optimization Solution
• Supported the review, development, and quality control of MetiStream’s Advanced Apache Spark Training
• Developed an introductory level Spark education class, deployed to Skytap and AWS
• Built and deployed a cloud-based PostgreSQL team database server.
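The business-rules test framework described in the health care engagement above (pass/fail criteria in a rules table, a results table, and report generation) can be sketched in miniature. This is a hypothetical, simplified Python illustration only; the real framework ran as Oozie workflows controlling Hive, shell and email actions, and the function and rule names here are invented for the example:

```python
def evaluate_cases(cases, rules):
    """Run each test case's measured value through its pass/fail rule and
    build a results table, mirroring a business-rules-driven test framework."""
    results = []
    for case in cases:
        passed = rules[case["rule_id"]](case["value"])
        results.append({"case": case["name"],
                        "status": "PASS" if passed else "FAIL"})
    return results

# Hypothetical rules table: e.g. row-count deltas between the source and the
# migrated Hive table must be zero, and null counts must stay under a threshold.
rules = {
    "zero_delta": lambda v: v == 0,
    "max_nulls": lambda v: v <= 10,
}
cases = [
    {"name": "row_count_match", "rule_id": "zero_delta", "value": 0},
    {"name": "null_check", "rule_id": "max_nulls", "value": 42},
]
results = evaluate_cases(cases, rules)
# results: row_count_match PASS, null_check FAIL
```

In the described framework, the measured values would come from Hive queries (the ported Oracle/Netezza test cases), and the results table would feed the email and report-generation actions.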
May 1982 – Oct 2014 IBM Austin, TX
Feb 2013 – Oct 2014 – Overland Park, KS
• Member of the IBM Big Data Analytics Center of Competency.
o Mentored and guided team members during formation of this new organization within IBM.
o Focused on Netezza and Netezza Replication technologies.
o Member of Data Architect team for two Netezza implementation projects at a major US
telecom provider.
▪ Reverse engineered prototype Netezza databases with IBM InfoSphere Data Architect, creating the physical data model and DDL. From this the team created the initial logical data model and imported it into CA’s ERwin Data Modeler. ERwin did not support Netezza, so Jeff developed a procedure that would create DB2 DDL suitable for Netezza.
▪ Designed three variations of a security model based on Netezza best practices and the customer’s enterprise database security requirements. Implemented the option chosen by the customer.
▪ Led the team that designed and implemented Fraud data ingest, ETL and analytics. Designed and developed code that could be deployed in test or production environments without modification. Met all architectural and customer requirements for coding standards, delivering ready-to-use code.
▪ Led the team that implemented the network quality project. Modified the existing code design so that it would work in all environments without modification and met all standards.
▪ Worked with IBM and customer stakeholders and project managers to ensure all project dates and milestones his teams were responsible for were met.
Sep 2011 – Feb 2013
• IBM Software Group Information Management ISV Support
o Participated in development of educational materials for the IBM PureData family of products:
▪ IBM PureData for Analytics (Netezza)
▪ IBM PureData for Operational Analytics (DB2)
Sep 2010 – Aug 2011 – Columbus, OH
Member of the IBM Software Group Information Management ISV Support organization. Architect and
specialist for three database migration projects at a large financial institution.
• Ensured architectural and design guidelines were adhered to and non-functional requirements met
during deployment of the physical infrastructure and migration of two mission critical databases from
Oracle to DB2.
o Executed gap analysis: found and resolved security incompatibilities and replication design errors.
o Arranged and delivered technical education to database administrators.
o Enhanced customer’s performance test suite.
o Assisted customer’s data architects in developing new methods of database and table design
for performance and data retention.
Sep 2008 – Aug 2011 IBM Software Group Information Management ISV Support
• Supported IBM, IBM business partners and customers worldwide with technical assistance integrating
IBM Information Management products such as DB2 and Informix into their products.
• Developed and delivered education to IBM customers and business partners in the United States,
Japan, Korea, Viet Nam, Malaysia and Argentina.
June 2002 – Sep 2008 Linux Integration Center
• Lab administrator
o System: workstation, server, blade server, mainframe
o Network: switches, cabling, DNS, NTP, DHCP, routers, gateways, bridges
o Security: controlled access, userid authentication/authorization
o Database administrator: installation, configuration, data ingest, maintenance
• Designed and led the implementation team for an unattended software installation and configuration
methodology for the lab (pre-cloud).
• Developed and executed Proof of Concept and Proof of Technology solutions for different IBM
products such as:
o DB2 UDB or LUW products, including high availability (HA) environments, DB2 Connect and
DB2 clusters (DPF)
o WebSphere Portal Server, including DB2 LUW or Oracle and IBM Tivoli Director Server
o WebSphere Application Server
o Tivoli Directory Server (using DB2 UDB V8 as the data store)
o GPFS (General Parallel File System)
• Reproduced customer environments in the lab to help debug or tune them
• Security: Created a procedure for hardening a compute-intensive cluster, including instructions for softening the cluster when software component updates or upgrades were required.
Jan 1991 – June 2002
• Administrator for a large System Test lab.
• Test lead on many different IBM products.
• Developed and delivered education to IBM business partners in Germany.
• Developed and executed Y2K tests for IBM software. Consulted with IBM customers in Japan.
Jan 1990 – Dec 1991 Programmer retraining
• Dedicated one year toward completion of a Bachelor of Arts degree.
May 1982 – Dec 1989 Mainframe operations
• System Operator and System Analyst for the Austin site mainframe complex, responsible for production
job runs, software development environments and manufacturing line control processes.
Prior experience:
Prior to working at IBM, Jeff worked at two banks in the Central Texas area and before that served for
four years in the US Air Force.