Tensor Completion based on Low-rank and Smooth Structures
1. Tensor Completion based on
Low-rank and Smooth Structures
Tatsuya Yokota
Nagoya Institute of Technology
September 16, 2016, invited lecture at the MI seminar (MI研)
2. An overview of research on matrix and tensor completion
Main research results
Matrix completion:
T. Yokota, A. Cichocki. A fast automatic rank determination algorithm for noisy low-rank matrix completion. In Proceedings of the Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC 2015), pp. 43-46, 2015.
Tensor completion:
T. Yokota, A. Cichocki. Tensor completion via functional smooth component deflation. In Proceedings of the 41st International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2016), pp. 2514-2518, 2016.
T. Yokota, Q. Zhao, C. Li, and A. Cichocki. Smooth PARAFAC decomposition for tensor completion. IEEE Transactions on Signal Processing, vol. 64, no. 20, pp. 5423-5436, 2016.
Contents of today's talk
13. Recommender systems (the Netflix problem)
Netflix, Amazon, Rakuten, ...
Ratings (1-10)
An example of completion based on low-rank structure
[Figure: a partially observed user-item rating matrix for User IDs 001-004, with scattered known ratings (1-10) and many missing entries]
14. Low-rank approximation of an incomplete matrix
[Figure: the rating matrix for User IDs 001-004 decomposed into two rank-1 component matrices, a "blue" component for users who prefer blue-type products and a "red" component for users who prefer red-type products; each component is the outer product of a user-distribution vector and a representative score vector]
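The structure on this slide can be sketched numerically. Assuming hypothetical "blue" and "red" user-distribution and score vectors (the numbers below are made up for illustration), the full rating matrix is the sum of two outer products, hence rank 2:

```python
import numpy as np

# Hypothetical user-distribution vectors (which users belong to which
# preference group) and representative score vectors for each group.
blue_users  = np.array([1.0, 0.0, 1.0, 0.0])        # User IDs 001-004
blue_scores = np.array([9.0, 8.0, 1.0, 0.0, 2.0])   # blue-type products
red_users   = np.array([0.0, 1.0, 0.0, 1.0])
red_scores  = np.array([1.0, 0.0, 9.0, 8.0, 2.0])

# Full rating matrix = blue component + red component,
# where each component is a rank-1 outer product.
X = np.outer(blue_users, blue_scores) + np.outer(red_users, red_scores)
print(np.linalg.matrix_rank(X))  # 2
```

Because only two rank-1 patterns generate all the ratings, observing a few entries per user is enough in principle to recover the missing ones.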
15. Low-rank approximation of an incomplete matrix (2)
[Figure: the rating matrix for User IDs 001-004 written as (blue component matrix) + (red component matrix)]
Generalization: an (I×J) matrix is approximated by a sum of rank-1 terms,
X ≈ u_1 v_1^T + u_2 v_2^T + ... + u_R v_R^T (a rank-R matrix approximation).
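The rank-R approximation of an (I×J) matrix as a sum of R rank-1 terms can be computed with a truncated SVD (a standard construction, shown here as a generic sketch rather than anything specific to these slides):

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, R = 8, 6, 2
# Build an (I x J) matrix of exact rank R as a product of two factors.
X = rng.standard_normal((I, R)) @ rng.standard_normal((R, J))

# Truncated SVD gives the best rank-R approximation in Frobenius norm:
# X ≈ sum over r of s_r * u_r v_r^T
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_R = (U[:, :R] * s[:R]) @ Vt[:R, :]

print(np.allclose(X, X_R))  # True, since X already has rank R
```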
19. Matrix completion based on low-rank structure (an example)
[Figure: original image, incomplete image, and restored image]
[Li+ 2014] W. Li, L. Zhao, Z. Lin, D. Xu, and D. Lu. "Non-Local Image Inpainting Using Low-Rank Matrix Completion." Computer Graphics Forum, vol. 34, no. 6, pp. 111-122, 2015.
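As a rough illustration of how low-rank structure can fill missing entries (a generic SVD-imputation sketch on synthetic data, not the non-local inpainting algorithm of [Li+ 2014]), one can alternate between a truncated SVD and re-inserting the observed values:

```python
import numpy as np

def svd_impute(X_obs, mask, rank, n_iter=200):
    """Fill missing entries by alternating a truncated SVD with
    projection onto the observed data (a simple, generic scheme)."""
    X = np.where(mask, X_obs, 0.0)
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
        X = np.where(mask, X_obs, low)  # keep observed, update missing
    return X

rng = np.random.default_rng(1)
X_true = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
mask = rng.random(X_true.shape) < 0.6   # roughly 60% of entries observed

X_hat = svd_impute(np.where(mask, X_true, 0.0), mask, rank=2)
err = np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true)
print(f"relative error: {err:.4f}")
```

With enough observed entries and a correct rank guess, the missing values are recovered almost exactly; real inpainting methods add structure (e.g. non-local patch grouping) on top of this basic principle.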
28. Examples of tensor data
1st-order tensor (vector): time series signals such as EMG, ECG, and monaural audio
2nd-order tensor (matrix): multichannel time series such as EEG and MEG; X-ray and gray-scale images
3rd-order tensor: MRI, CT, and PET volumes; color images
4th-order tensor: functional MRI; color video
[Figure: gray-scale image, color image, and MRI/functional-MRI examples, where adding the time axis raises the order by one]
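The tensor orders listed on this slide map directly to array dimensions; a small sketch with hypothetical shapes:

```python
import numpy as np

# Hypothetical shapes; the order of a tensor is its number of axes (ndim).
mono_audio  = np.zeros(16000)               # 1st-order tensor (vector)
gray_image  = np.zeros((128, 128))          # 2nd-order tensor (matrix)
color_image = np.zeros((128, 128, 3))       # 3rd-order tensor
color_video = np.zeros((30, 128, 128, 3))   # 4th-order: time axis added

for name, a in [("mono audio", mono_audio), ("gray image", gray_image),
                ("color image", color_image), ("color video", color_video)]:
    print(f"{name}: order {a.ndim}")
```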