Slideshows by User: taguchinaoya
SlideShare feed for slideshows by user taguchinaoya (last updated Tue, 15 Feb 2022 04:45:07 GMT)

信号処理実応用入門 (An Introduction to Practical Signal Processing)
/slideshow/ss-251174206/251174206
Tue, 15 Feb 2022 04:45:07 GMT
Explains the foundational knowledge needed to put signal processing to practical use, starting from the Fourier transform.
Views: 702
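The deck's stated starting point, the Fourier transform, can be illustrated with a minimal NumPy sketch (the signal parameters below are my own toy example, not taken from the slides):

```python
import numpy as np

# Sample a 50 Hz sinusoid for 1 second at a 1 kHz sampling rate.
fs = 1000                       # sampling rate [Hz]
t = np.arange(fs) / fs          # 1 s of time stamps
x = np.sin(2 * np.pi * 50 * t)

# Real-input DFT; bin k corresponds to the frequency k * fs / N Hz.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)

# The magnitude spectrum peaks at the sinusoid's frequency.
peak = freqs[np.argmax(np.abs(X))]
print(peak)  # → 50.0
```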
Trader-Company Method: A Metaheuristic for Interpretable Stock Price Prediction を読んだ
/slideshow/20210126-taguchi-dsrinkopaperintro/241905814
Wed, 27 Jan 2021 12:02:24 GMT
Notes from reading "Trader-Company Method: A Metaheuristic for Interpretable Stock Price Prediction".
Views: 135
20201010 kaggle tweet コンペの話
/slideshow/20201010-ds-ltguchio/238815620
Fri, 09 Oct 2020 13:48:57 GMT
Slides for the traveling edition of the DeNA Data Science Seminar ( https://dena-ai.connpass.com/event/188588/ ). Mostly the same content as previously uploaded slides ( /taguchinaoya/kaggle-tweet ), with some modifications.
Views: 2547
Acl2020 taguchi
/slideshow/acl2020-taguchi/237859672
Fri, 14 Aug 2020 05:47:04 GMT
A report on attending ACL 2020, focusing on BPE-family tokenizers. Papers covered:
・Rico Sennrich, Barry Haddow, and Alexandra Birch. "Neural Machine Translation of Rare Words with Subword Units." In ACL, 2016.
・Taku Kudo. "Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates." In ACL, 2018.
・Ivan Provilkov, Dmitrii Emelianenko, and Elena Voita. "BPE-Dropout: Simple and Effective Subword Regularization." In ACL, 2020.
・Xuanli He, Gholamreza Haffari, and Mohammad Norouzi. "Dynamic Programming Encoding for Subword Segmentation in Neural Machine Translation." In ACL, 2020.
Views: 1270
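The papers above all build on byte-pair encoding (BPE). As a rough sketch of the core merge loop only (toy corpus and function name are my own, not taken from any of the cited implementations):

```python
from collections import Counter

def bpe_merges(words, num_merges):
    """Learn BPE merge rules from a dict of {word: frequency}.

    Each word starts as a tuple of characters; every step merges the
    most frequent adjacent symbol pair into one new symbol.
    """
    vocab = {tuple(w): f for w, f in words.items()}
    merges = []
    for _ in range(num_merges):
        # Count adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Apply the merge to every word in the vocabulary.
        merged = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            merged[tuple(out)] = freq
        vocab = merged
    return merges

merges = bpe_merges({"low": 5, "lower": 2, "lowest": 2}, num_merges=2)
print(merges)  # → [('l', 'o'), ('lo', 'w')]
```

Subword regularization and BPE-dropout (the 2018 and 2020 papers) then vary how these learned merges are applied at tokenization time, rather than how they are learned.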
kaggle tweet コンペの話
/taguchinaoya/kaggle-tweet
Sun, 12 Jul 2020 23:52:46 GMT
Sharing our story and the techniques we developed in the kaggle Tweet Sentiment Extraction competition.
Views: 1151
20190725 taguchi decision_tree_for_pubshare /slideshow/20190725-taguchi-decisiontreeforpubshare-163784192/163784192 20190725taguchidecisiontreeforpubshare-190814085129
决定木の简単な歴史と学习アルゴリズムについて闭闭>

决定木の简単な歴史と学习アルゴリズムについて闭闭>
Wed, 14 Aug 2019 08:51:29 GMT /slideshow/20190725-taguchi-decisiontreeforpubshare-163784192/163784192 taguchinaoya@slideshare.net(taguchinaoya) 20190725 taguchi decision_tree_for_pubshare taguchinaoya 决定木の简単な歴史と学习アルゴリズムについて <img style="border:1px solid #C3E6D8;float:right;" alt="" src="https://cdn.slidesharecdn.com/ss_thumbnails/20190725taguchidecisiontreeforpubshare-190814085129-thumbnail.jpg?width=120&amp;height=120&amp;fit=bounds" /><br> 决定木の简単な歴史と学习アルゴリズムについて
20190725 taguchi decision_tree_for_pubshare from taguchi naoya
]]>
504 2 https://cdn.slidesharecdn.com/ss_thumbnails/20190725taguchidecisiontreeforpubshare-190814085129-thumbnail.jpg?width=120&height=120&fit=bounds presentation Black http://activitystrea.ms/schema/1.0/post http://activitystrea.ms/schema/1.0/posted 0
PLAsTiCC Astronomical Classification に参加した話
/slideshow/plasticc-astronomical-classification/126898670
Fri, 28 Dec 2018 14:03:54 GMT
A write-up of my experience participating in the PLAsTiCC Astronomical Classification competition on kaggle.
Views: 1236
memory augmented neural networks with wormhole connection
/slideshow/memory-augmented-neural-networks-with-wormhole-connection/84066751
Thu, 14 Dec 2017 11:19:54 GMT
Part of a slide deck I used for a meeting in my lab. The paper introduces a novel memory-augmented neural network called TARDIS, which can choose whether to use recurrent information from its memory or from the controller's hidden state. It uses one-hot vectors for memory addressing, so reinforcement learning is adopted as its training strategy. I removed information about our research from the slides, so some explanations may be incomplete.
Views: 181
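The one-hot ("hard") memory addressing mentioned above can be contrasted with soft attention in a small NumPy sketch. This is my own toy illustration of the discrete read operation, not code from the paper; the sampling step below is non-differentiable, which is exactly why a technique such as reinforcement learning is needed to train it.

```python
import numpy as np

rng = np.random.default_rng(0)

memory = rng.normal(size=(5, 8))   # 5 memory slots, 8-dim cells
scores = rng.normal(size=5)        # controller's addressing scores

# Soft addressing: a differentiable weighted average over all slots.
weights = np.exp(scores) / np.exp(scores).sum()
soft_read = weights @ memory

# Hard addressing: sample ONE slot; the read is a single memory row,
# selected by a one-hot vector. Sampling breaks differentiability.
slot = rng.choice(5, p=weights)
one_hot = np.eye(5)[slot]
hard_read = one_hot @ memory

print(np.allclose(hard_read, memory[slot]))  # → True
```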