SlideShare feed for Slideshows by User: czq825 (Zhiqian Chen)

Early Forecasting of the Impact of Traffic Accidents Using a Single Shot Observation
/slideshow/early-forecasting-of-the-impact-of-traffic-accidents-using-a-single-shot-observation/251685986
Thu, 28 Apr 2022 15:33:02 GMT
SIAM International Conference on Data Mining (SDM) 2022

Early Forecasting of the Impact of Traffic Accidents Using a Single Shot Observation from Zhiqian Chen
Rational Neural Networks for Approximating Jump Discontinuities of Graph Convolution Operator
/slideshow/rational-neural-networks-for-approximating-jump-discontinuities-of-graph-convolution-operator/229971995
Tue, 10 Mar 2020 02:51:31 GMT
For node-level graph encoding, an important recent state-of-the-art method is the graph convolutional network (GCN), which nicely integrates local vertex features and graph topology in the spectral domain. However, current studies suffer from several drawbacks: (1) graph CNNs rely on Chebyshev polynomial approximation, which results in oscillatory approximation at jump discontinuities; (2) increasing the order of the Chebyshev polynomial can reduce the oscillations, but also incurs unaffordable computational cost; (3) Chebyshev polynomials require degree Omega(poly(1/e)) to approximate a jump signal, while a rational function only needs O(poly log(1/e)). However, it is non-trivial to apply rational approximation without increasing computational complexity because of the denominator. In this paper, the superiority of rational approximation is exploited for graph signal recovery. RationalNet is proposed to integrate rational functions and neural networks. We show that a rational function of the eigenvalues can be rewritten as a function of the graph Laplacian, which avoids multiplication by the eigenvector matrix. Focusing on the analysis of approximation of the graph convolution operation, a graph signal regression task is formulated, under which the time complexity can be significantly reduced by the graph Fourier transform. To overcome the local-minimum problem of neural network models, a relaxed Remez algorithm is utilized to initialize the weight parameters. The convergence rates of RationalNet and polynomial-based methods on jump signals are analyzed for a theoretical guarantee. Extensive experimental results demonstrate that our approach effectively characterizes jump discontinuities, outperforming competing methods by a substantial margin on both synthetic and real-world graphs.
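To make the spectral-filtering idea above concrete, here is a minimal, hypothetical sketch (not the authors' released code): it contrasts a polynomial graph filter with a rational one, r(L) x = p(L) q(L)^{-1} x, where the denominator is handled by a linear solve on the graph Laplacian instead of an eigendecomposition. The toy graph, coefficient values, and function names are invented for illustration only.

import numpy as np

def normalized_laplacian(adj):
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    return np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt

def polynomial_filter(L, x, coeffs):
    # y = sum_k c_k L^k x  (monomial basis; a Chebyshev filter recurses similarly)
    y, Lx = np.zeros_like(x), x.copy()
    for c in coeffs:
        y = y + c * Lx
        Lx = L @ Lx
    return y

def rational_filter(L, x, num_coeffs, den_coeffs):
    # y = p(L) q(L)^{-1} x, with the denominator applied via a linear solve,
    # so the eigenvector matrix is never formed.
    pL = sum(c * np.linalg.matrix_power(L, k) for k, c in enumerate(num_coeffs))
    qL = sum(c * np.linalg.matrix_power(L, k) for k, c in enumerate(den_coeffs))
    z = np.linalg.solve(qL, x)
    return pL @ z

# Toy 4-node cycle graph and a sharply varying node signal (hypothetical values).
A = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
L = normalized_laplacian(A)
x = np.array([1.0, -1.0, 1.0, -1.0])

y_poly = polynomial_filter(L, x, coeffs=[0.5, -0.3, 0.1])
y_rat = rational_filter(L, x, num_coeffs=[0.5, -0.3], den_coeffs=[1.0, 0.2])
print("polynomial filter:", y_poly)
print("rational filter:  ", y_rat)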

Rational Neural Networks for Approximating Jump Discontinuities of Graph Convolution Operator from Zhiqian Chen
Learning to fuse music genres with generative adversarial dual learning
/slideshow/learning-to-fuse-music-genres-with-generative-adversarial-dual-learning/83005048
Wed, 29 Nov 2017 21:01:37 GMT
Paper presentation at ICDM 2017; see more information at http://people.cs.vt.edu/czq/publication/fusiongan/

Learning to fuse music genres with generative adversarial dual learning from Zhiqian Chen