• Received large volumes of clickstream data from various third parties, such as Adobe and Conviva, into HDFS.
• Wrote MapReduce code to filter out invalid and incomplete records.
• Created Hive external tables on top of the validated data sets.
• Developed complex business rules using Hive, Impala, and Pig to transform and store the data efficiently for trend analysis, billing, and business intelligence.
• Wrote Hive user-defined functions (UDFs) to implement critical business logic.
• Integrated Hadoop with Teradata and Oracle RDBMS systems by importing and exporting customer data with Sqoop.
• Ingested tweets related to DirecTV into HDFS using Flume.
• Automated the end-to-end process with the...
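The data-cleansing step above (dropping invalid and incomplete clickstream records before loading Hive tables) can be sketched in plain Python. This is only an illustration of the filtering logic, not the original job: the production work was a Hadoop MapReduce task, and the field names `user_id`, `timestamp`, and `url` are hypothetical stand-ins for whatever the real clickstream schema contained.

```python
# Sketch of the record-validation logic described above.
# Field names (user_id, timestamp, url) are illustrative assumptions;
# the production implementation ran as a map-side filter in MapReduce.

REQUIRED_FIELDS = ("user_id", "timestamp", "url")

def is_valid(record: dict) -> bool:
    """A record is kept only if every required field is present and non-empty."""
    return all(record.get(field) for field in REQUIRED_FIELDS)

def clean(records):
    """Drop invalid and incomplete records, mirroring the mapper's filter."""
    return [r for r in records if is_valid(r)]

if __name__ == "__main__":
    raw = [
        {"user_id": "u1", "timestamp": "2015-01-01T00:00:00", "url": "/home"},
        {"user_id": "",   "timestamp": "2015-01-01T00:00:01", "url": "/home"},  # empty field
        {"user_id": "u2", "timestamp": "2015-01-01T00:00:02"},                  # missing url
    ]
    print(len(clean(raw)))  # only the first record survives
```

The valid records that pass this filter would then be the data sets exposed through the Hive external tables mentioned above.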