際際滷Share feed for 際際滷shows by User: tzolov1 (Fri, 05 Oct 2018 17:34:20 GMT)

Machines Can Learn - a Practical Take on Machine Intelligence Using Spring Cloud Data Flow and TensorFlow
/slideshow/machines-can-learn-a-practical-take-on-machine-intelligence-using-spring-cloud-data-flow-and-tensorflow/118337093
https://springoneplatform.io/2018/sessions/machines-can-learn-a-practical-take-on-machine-intelligence-using-spring-cloud-data-flow-and-tensorflow Machine learning (ML) has brought unprecedented abilities to the software engineering field. ML allows you to reason about and solve otherwise "un-programmable" tasks such as computer vision and language processing. If you're a Java developer interested in leveraging ML to deliver richer business insights to your customers, in this talk you'll learn what it takes to build cloud-native applications that perform data-driven machine intelligence operations. This coding-centric talk walks through the different facets of iterative development and testing using Spring Cloud Stream, and the orchestration of such applications into coherent data pipelines using Spring Cloud Data Flow. We will also review TensorFlow, a popular machine learning toolkit, and how it is integrated into the overall design. The talk will showcase how building complex use cases such as real-time image recognition or object detection can be simplified with the help of the Spring ecosystem and TensorFlow. More importantly, I will share findings from the ML space: tips and tricks on what goes into developing such complex solutions.
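The kind of pipeline the talk describes can be sketched as a Spring Cloud Data Flow stream definition. This is a minimal sketch, not taken from the talk itself: the `image-recognition` processor name and the `model` property are assumptions based on the pre-built stream application starters, and the port and model path are placeholders.

```
stream create image-pipeline --definition "http --port=9999 | image-recognition --model=file:/tmp/mobilenet.pb | log" --deploy
```

Here an `http` source receives image payloads, a TensorFlow-backed processor scores them against a pre-trained model, and a `log` sink emits the labeled results; Spring Cloud Data Flow deploys each app and wires the messaging middleware between them.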

Machines Can Learn - a Practical Take on Machine Intelligence Using Spring Cloud Data Flow and TensorFlow from Christian Tzolov
Visualize and Analyze Apache Geode Real-time and Historical Metrics with Grafana
/tzolov1/visualize-and-analyze-apache-geode-realtime-and-historical-metrics-with-grafana
Interested in a single dashboard providing a combined picture of both real-time metrics and analysis of historical statistics for Apache Geode (Pivotal GemFire)? During this webinar we will show you how to create a dashboard that provides the proper context for interpreting real-time metrics using Grafana, an open platform for analytics and monitoring. Accomplishing this requires consolidating GemFire's two monitoring and metrics feeds: the real-time metrics accessed via a JMX API, and the post-mortem historical statistics accessed via archive files. Join us as we describe and demonstrate how these two feeds can be combined to provide a unified monitoring and metrics dashboard for GemFire. We will also share common use cases and explore how the Geode Grafana Dashboard Repository, a pre-built collection of Geode-Grafana dashboards, helps create customized monitoring dashboards. Video: https://youtu.be/lVeYdA6GYZ4 BrightTalk webinar link: https://goo.gl/YkLBvT

Wed, 18 Apr 2018 10:35:21 GMT
Visualize and Analyze Apache Geode Real-time and Historical Metrics with Grafana from Christian Tzolov
Enable SQL/JDBC Access to Apache Geode/GemFire Using Apache Calcite
/slideshow/enable-sqljdbc-access-to-apache-geodegemfire-using-apache-calcite/83569752
https://springoneplatform.io/sessions/enable-sql-jdbc-access-to-apache-geode-gemfire-using-apache-calcite When working with BigData and IoT systems we often feel the need for an established, common query language. To fill this gap, some NoSQL vendors are building SQL access to their systems. Building a SQL engine from scratch is a daunting job, and frameworks like Apache Calcite can help with the heavy lifting. Calcite allows you to integrate a SQL parser, a cost-based optimizer, and JDBC with your NoSQL system. Calcite has been used to empower many BigData platforms such as Hive, Spark, Flink, Drill, and HBase/Phoenix, to name a few. In this session I will walk you through the process of building a SQL access layer for Apache Geode (GemFire). I will share my experience, pitfalls, and technical considerations, such as balancing between SQL/RDBMS semantics and the design choices and limitations of in-memory data grid systems like Geode. Hopefully this will enable you to add SQL capabilities to your preferred NoSQL data system.
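For a concrete sense of how such an adapter is wired in, Calcite adapters are typically configured through a JSON model file referenced from the JDBC URL (`jdbc:calcite:model=...`). The sketch below follows that convention; the factory class name, locator settings, and region name are illustrative assumptions, so check the Calcite Geode adapter documentation for the exact values.

```json
{
  "version": "1.0",
  "defaultSchema": "geode",
  "schemas": [{
    "name": "geode",
    "type": "custom",
    "factory": "org.apache.calcite.adapter.geode.rel.GeodeSchemaFactory",
    "operand": {
      "locatorHost": "localhost",
      "locatorPort": "10334",
      "regions": "BookOrder"
    }
  }]
}
```

With such a model file, a plain JDBC client (or Calcite's sqlline shell) can connect and issue standard SQL that Calcite plans and pushes down to Geode OQL where possible.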

Thu, 07 Dec 2017 14:45:28 GMT
Enable SQL/JDBC Access to Apache Geode/GemFire Using Apache Calcite from Christian Tzolov
SQL for NoSQL and how Apache Calcite can help
/slideshow/sql-for-nosql-and-how-apache-calcite-can-help/71830866
https://fosdem.org/2017/schedule/event/hpc_bigdata_calcite/ When working with BigData and IoT systems we often feel the need for a common query language. Platform-specific languages are often harder to integrate with and require longer adoption time. To fill this gap, many NoSQL (Not-only-SQL) vendors are building SQL layers for their platforms. It is worth exploring the driving forces behind this trend, how it fits into your BigData stack, and how we can adopt it in our favorite tools. Building a SQL engine from scratch, however, is a daunting job, and frameworks like Apache Calcite can help with the heavy lifting. Calcite allows you to integrate a SQL parser, a cost-based optimizer, and JDBC with your big data system. Calcite has been used to empower many BigData platforms such as Hive, Spark, Drill, and Phoenix, to name a few. I will walk you through the process of building a SQL access layer for Apache Geode (an in-memory data grid). I will share my experience, pitfalls, and technical considerations, such as balancing between SQL/RDBMS semantics and the design choices and limitations of the data system. Hopefully this will enable you to add SQL capabilities to your preferred NoSQL data system.

Mon, 06 Feb 2017 20:29:27 GMT
SQL for NoSQL and how Apache Calcite can help from Christian Tzolov
Using Apache Calcite for Enabling SQL and JDBC Access to Apache Geode and Other NoSQL Data Systems
/slideshow/using-apache-calcite-for-enabling-sql-and-jdbc-access-to-apache-geode-and-other-nosql-data-systems/68931084
When working with BigData and IoT systems we often feel the need for a common query language. System-specific languages usually require longer adoption time and are harder to integrate within existing stacks. To fill this gap, some NoSQL vendors are building SQL access to their systems. Building a SQL engine from scratch is a daunting job, and frameworks like Apache Calcite can help with the heavy lifting. Calcite allows you to integrate a SQL parser, a cost-based optimizer, and JDBC with your NoSQL system. We will walk through the process of building a SQL access layer for Apache Geode (an in-memory data grid). I will share my experience, pitfalls, and technical considerations, such as balancing between SQL/RDBMS semantics and the design choices and limitations of the data system. Hopefully this will enable you to add SQL capabilities to your preferred NoSQL data system.

Tue, 15 Nov 2016 00:08:36 GMT
Using Apache Calcite for Enabling SQL and JDBC Access to Apache Geode and Other NoSQL Data Systems from Christian Tzolov
Federated SQL on Hadoop and Beyond: Leveraging Apache Geode to Build a Poor Man's SAP HANA (ApacheCon BigData 2015)
/tzolov1/apache-conbigdata2015-christiantzolovfederated-sql-on-hadoop-and-beyond-leveraging-apache-geode-to-build-a-poor-mans-sap-hana
際際滷s from the ApacheCon BigData 2015 HAWQ/Geode talk: http://sched.co/3zut In the Big Data space, two powerful data processing tools complement each other: HAWQ and Geode. HAWQ is a scalable OLAP SQL-on-Hadoop system, while Geode is an OLTP-like, in-memory data grid and event processing system. This presentation shows different approaches that allow integration and data exchange between HAWQ and Geode. It walks you through the implementation of the different integration strategies, demonstrating the power of combining various OSS technologies for processing big and fast data. The presentation touches upon OSS technologies such as HAWQ, Geode, Spring XD, Hadoop, and Spring Boot.

Wed, 30 Sep 2015 19:53:59 GMT
Federated SQL on Hadoop and Beyond: Leveraging Apache Geode to Build a Poor Man's SAP HANA from Christian Tzolov
Federated Queries with HAWQ - SQL on Hadoop and Beyond
/slideshow/federated-queries-with-hawq-sql-on-hadoop-and-beyond/52824765
In the Big Data space, Pivotal offers two powerful data processing tools, HAWQ and GemFire. HAWQ is a scalable OLAP SQL-on-Hadoop system, while GemFire is an OLTP-like, in-memory data grid and event processing system. This presentation shows different approaches that allow integration and data exchange between HAWQ and GemFire. Practical experience in applying Spring Boot and Spring XD to some of the use cases will be shared while walking you through the implementation of the different integration strategies. Among other things, we will show an integration path that leverages Spring XD to ingest GemFire data and store it in HDFS, as well as the benefits of using Spring Boot to implement a RESTful proxy for the HAWQ web table integration scenario.
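The GemFire-to-HDFS ingestion path mentioned above can be sketched as a Spring XD stream definition. The `gemfire-cq` source and `hdfs` sink were shipped as Spring XD modules, but the region name, query, and options below are illustrative assumptions rather than the exact definition used in the talk.

```
stream create gemfireToHdfs --definition "gemfire-cq --query='SELECT * FROM /Orders' | hdfs" --deploy
```

The `gemfire-cq` source registers a continuous query against the GemFire region and emits each matching event into the stream, which the `hdfs` sink then appends to files in Hadoop, making the data visible to HAWQ for OLAP queries.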

]]>
Tue, 15 Sep 2015 23:06:09 GMT /slideshow/federated-queries-with-hawq-sql-on-hadoop-and-beyond/52824765 tzolov1@slideshare.net(tzolov1) Federated Queries with HAWQ - SQL on Hadoop and Beyond tzolov1 In the Big Data space, Pivotal offers two powerful data processing tools: HAWQ and GemFire. HAWQ is a scalable OLAP SQL-on-Hadoop system, while GemFire is an OLTP-like, in-memory data grid and event processing system. This presentation shows different approaches that allow integration and data exchange between HAWQ and GemFire. Practical experience in applying Spring Boot and Spring XD to some of the use cases is shared while walking you through the implementation of the different integration strategies. Among others, we show an integration path that leverages Spring XD to ingest GemFire data and store it in HDFS, as well as the benefits of using Spring Boot to implement a RESTful proxy for the HAWQ web table integration scenario.
Federated Queries with HAWQ - SQL on Hadoop and Beyond from Christian Tzolov
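The HAWQ web table scenario mentioned in the abstract can be sketched roughly as follows; the table name, columns, host, and endpoint are illustrative assumptions, not details from the talk. A Spring Boot proxy exposes GemFire region data as tab-delimited text over HTTP, and HAWQ reads it through an external web table:

```sql
-- Hypothetical external web table: HAWQ pulls rows over HTTP from the
-- Spring Boot RESTful proxy each time the table is queried.
CREATE EXTERNAL WEB TABLE gemfire_orders (
    order_id int,
    amount   numeric
)
LOCATION ('http://boot-proxy:8080/orders.tsv')
FORMAT 'TEXT' (DELIMITER E'\t');
```

Because a web table is re-fetched on every query, this gives HAWQ a federated, always-current view of the in-memory GemFire data without a separate ETL step.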
]]>
808 5 https://cdn.slidesharecdn.com/ss_thumbnails/springone2gx2015christiantzolovhawqfinal-150915230609-lva1-app6892-thumbnail.jpg?width=120&height=120&fit=bounds presentation 000000 http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
https://cdn.slidesharecdn.com/profile-photo-tzolov1-48x48.jpg?cb=1634120961 Integration and Interoperability, Distributed and Data-Intensive Architectures http://blog.tzolov.net/?view=sidebar