際際滷shows by Alexander Novikov (AlexanderNovikov8)

Tensor Train decomposition in machine learning
Wed, 12 Oct 2016 | /slideshow/tensor-train-decomposition-in-machine-learning/67084738
Tensor Train (TT) decomposition [3] is a generalization of the singular value decomposition (SVD) from matrices to tensors (i.e. multidimensional arrays). It represents a tensor compactly in terms of factors and makes it possible to work with the tensor through its factors without ever materializing the tensor itself. For example, we can compute the elementwise product of two TT-tensors of size 2^100 and obtain the result in the TT-format as well. In the talk, we will show how the Tensor Train decomposition can be used to represent the parameters of neural networks [1] and polynomial models [2]. This parametrization yields exponentially many 'virtual' parameters while operating only on the small factors of the TT-format. To train the model, i.e. to optimize the objective subject to the constraint that the parameters stay in the TT-format, [2] uses stochastic Riemannian optimization.

[1] Novikov, A., Podoprikhin, D., Osokin, A., & Vetrov, D. P. (2015). Tensorizing neural networks. In Advances in Neural Information Processing Systems.
[2] Novikov, A., Trofimov, M., & Oseledets, I. (2016). Tensor Train polynomial models via Riemannian optimization. arXiv:1605.03795.
[3] Oseledets, I. (2011). Tensor-train decomposition. SIAM Journal on Scientific Computing.
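The slides themselves are not in this feed, so as a rough illustration only: below is a minimal NumPy sketch (mine, not from the talk) of the TT-format described in the abstract. A d-dimensional tensor A is stored as d cores G_k of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1, and an entry A[i_1, ..., i_d] is the product of the corresponding core slices. The function names and the rank value are illustrative, not an established API.

import numpy as np

def random_tt_cores(shape, rank):
    # Cores G_k of shape (r_{k-1}, n_k, r_k); boundary ranks are fixed to 1.
    d = len(shape)
    ranks = [1] + [rank] * (d - 1) + [1]
    return [np.random.randn(ranks[k], shape[k], ranks[k + 1]) for k in range(d)]

def tt_element(cores, index):
    # A[i_1, ..., i_d] = G_1[:, i_1, :] @ G_2[:, i_2, :] @ ... @ G_d[:, i_d, :]
    result = np.ones((1, 1))
    for core, i in zip(cores, index):
        result = result @ core[:, i, :]
    return result[0, 0]

cores = random_tt_cores(shape=(4, 5, 6, 7), rank=3)
print(tt_element(cores, (1, 2, 3, 4)))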

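The claim about elementwise products of 2^100-sized tensors follows from a standard TT identity: the cores of the Hadamard product are slice-wise Kronecker products of the input cores, so the result stays in TT-format (with multiplied ranks) and the full tensor is never formed. A hedged sketch continuing the code above; tt_hadamard is my name for it, not a library function:

def tt_hadamard(cores_a, cores_b):
    # Each core slice of the product is the Kronecker product of the
    # corresponding input slices; TT-ranks multiply, but no tensor of
    # full size is ever materialized.
    out = []
    for A, B in zip(cores_a, cores_b):
        ra1, n, ra2 = A.shape
        rb1, _, rb2 = B.shape
        C = np.empty((ra1 * rb1, n, ra2 * rb2))
        for i in range(n):
            C[:, i, :] = np.kron(A[:, i, :], B[:, i, :])
        out.append(C)
    return out

# Sanity check on a small example: entries of the product tensor
# match the products of the entries of the inputs.
a = random_tt_cores(shape=(4, 5, 6), rank=2)
b = random_tt_cores(shape=(4, 5, 6), rank=3)
c = tt_hadamard(a, b)
idx = (1, 2, 3)
assert np.isclose(tt_element(c, idx), tt_element(a, idx) * tt_element(b, idx))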
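The "exponentially many 'virtual' parameters" claim is a storage count: a tensor with 2^100 entries is represented by 100 small cores whose total size grows linearly in the dimension and quadratically in the TT-rank. A back-of-the-envelope check (the rank value 8 is an arbitrary example, not one from the talk):

d, n, r = 100, 2, 8                        # 100 binary modes, example TT-rank 8
virtual = n ** d                           # 2**100 entries if materialized
actual = 2 * n * r + (d - 2) * n * r * r   # boundary cores are (1, n, r) and (r, n, 1)
print(virtual)                             # 1267650600228229401496703205376
print(actual)                              # 12576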
AWS EC2 tutorial
Tue, 14 Jul 2015 | /slideshow/aws-tutorial-presentation/50502433
際際滷s for an introductory Amazon Web Services Elastic Compute Cloud (AWS EC2) tutorial. The video (in Russian) is available here: https://youtu.be/w1hQFbHa9PQ
