The document discusses distances between data points and similarity measures in data analysis. It introduces the distance between data points as a quantitative measure of how different two points are, with smaller distances indicating greater similarity. Distances are useful for tasks such as clustering, anomaly detection, recognition, and measuring approximation error. The most common measure, Euclidean distance, is explained for vectors of any dimension via the notion of a norm from geometry. Caution is advised when computing distances between data whose features have very different scales.
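As a rough illustration of these points, the sketch below computes the Euclidean distance between two vectors and shows how standardizing features first changes the result when their scales differ. It assumes NumPy and uses made-up feature values; it is not taken from the slides themselves.

```python
import numpy as np

# Two hypothetical data points: (height in cm, annual income in JPY).
# The second feature dominates the raw Euclidean distance because of its scale.
x = np.array([170.0, 5_200_000.0])
y = np.array([165.0, 5_300_000.0])

# Euclidean distance: the L2 norm of the difference vector.
raw_dist = np.linalg.norm(x - y)

# Standardize each feature (zero mean, unit variance over a small sample)
# before measuring distance, so that no single scale dominates.
data = np.array([x, y, [180.0, 4_800_000.0]])   # made-up sample
mean, std = data.mean(axis=0), data.std(axis=0)
x_std, y_std = (x - mean) / std, (y - mean) / std
scaled_dist = np.linalg.norm(x_std - y_std)

print(f"raw distance:    {raw_dist:.2f}")
print(f"scaled distance: {scaled_dist:.2f}")
```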
These slides were used by Umemoto of our company at an in-house technical study session. They explain the Transformer, an architecture that has attracted much attention in recent years.
"Arithmer Seminar" is weekly held, where professionals from within and outside our company give lectures on their respective expertise.
The slides are made by the lecturer from outside our company, and shared here with his/her permission.
Arithmer Inc. is a mathematics company that grew out of the University of Tokyo Graduate School of Mathematical Sciences. We apply modern mathematics and advanced AI systems to build solutions across a wide range of fields. We believe it is our job to work out how to use AI well: to improve work efficiency and to produce results that are genuinely useful to people and to society.
An introduction to the AAAI 2023 paper "Are Transformers Effective for Time Series Forecasting?" and the HuggingFace post "Yes, Transformers are Effective for Time Series Forecasting (+ Autoformer)".
1. The document discusses energy-based models (EBMs) and how they can be applied to classifiers. It introduces noise contrastive estimation and flow contrastive estimation as methods for training EBMs.
2. One of the papers presented trains energy-based models with flow contrastive estimation, using a flow-based generator as the contrast (noise) distribution. This allows implicit modeling with EBMs.
3. Another paper argues that classifiers can be viewed as joint energy-based models over inputs and labels, and should be trained as such. It introduces a method for training classifiers as EBMs using contrastive divergence (a minimal sketch of this view follows the list).
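To make the "classifier as joint EBM" view concrete, here is a minimal PyTorch-style sketch of the idea that the logits f(x)[y] of an ordinary classifier can be read as negative joint energies, so that E(x, y) = -f(x)[y] and E(x) = -logsumexp_y f(x)[y]. The network `net` and the input shapes are assumptions for illustration only; this is not the training procedure from the paper.

```python
import torch
import torch.nn as nn

# A hypothetical classifier; any network that outputs one logit per class works.
net = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))

def joint_energy(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """E(x, y) = -f(x)[y]: the logit of the given class, negated."""
    logits = net(x)                                  # shape (batch, num_classes)
    return -logits.gather(1, y.unsqueeze(1)).squeeze(1)

def marginal_energy(x: torch.Tensor) -> torch.Tensor:
    """E(x) = -log sum_y exp(f(x)[y]): an unnormalized log-density over inputs."""
    return -torch.logsumexp(net(x), dim=1)

x = torch.randn(4, 1, 28, 28)           # made-up batch of inputs
y = torch.randint(0, 10, (4,))          # made-up labels
print(joint_energy(x, y).shape, marginal_energy(x).shape)   # torch.Size([4]) twice
```

Training such a model as an EBM would then add a generative term that lowers E(x) on real data and raises it on sampled negatives (e.g. via contrastive divergence), alongside the usual cross-entropy loss.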
Its main function is network packet capture. By analyzing network traffic data it can detect large-scale and complex attacks, and even after information such as personal data and critical records has already leaked, momentum can be used to investigate the scope of the damage and keep losses to a minimum.
Terilogy (Japan) is a leading global provider of IT network systems and security services; its products help enterprises improve network performance and get a better return on complex IT infrastructure. Its customers include telecom operators, network access service providers, and enterprises with strict IT requirements. The product distributed here, Terilogy's "momentum" network-forensics solution, helps customers obtain solid evidence to support investigations and overcomes the usual difficulties of network forensics. When an enterprise needs to perform network forensics, the relevant network data is often hard to preserve; momentum can link to the alert functions of other security products and automatically save packets only when specific events occur, greatly reducing the storage users need. Another common problem is that, on high-speed networks, packet capture and continuous data retention are usually at odds; momentum promises "zero packet loss" even at 10 Gbps. And when a major security incident occurs, it is often impossible to lock onto and extract an arbitrary flow at an arbitrary time; momentum, while guaranteeing zero packet loss, can capture packets and generate a flow-based index at the same time.
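The event-triggered capture idea described above (persist packets only when an alert fires, to save storage) can be illustrated in general terms with a short sketch. This uses Scapy, a ring buffer, and a made-up `alert_fired()` hook; it is a toy illustration of the concept and says nothing about momentum's actual implementation.

```python
from collections import deque
import time
from scapy.all import sniff, wrpcap

BUFFER_SECONDS = 60                     # keep roughly the last minute of traffic
buffer = deque()                        # ring buffer of (timestamp, packet)

def alert_fired() -> bool:
    """Hypothetical hook: ask an external security product whether an alert is active."""
    return False                        # placeholder for illustration

def on_packet(pkt):
    now = time.time()
    buffer.append((now, pkt))
    # Drop packets that have fallen out of the buffer window.
    while buffer and now - buffer[0][0] > BUFFER_SECONDS:
        buffer.popleft()
    # Persist the buffered packets only when an alert fires, saving storage.
    if alert_fired():
        wrpcap("incident.pcap", [p for _, p in buffer], append=True)
        buffer.clear()

sniff(prn=on_packet, store=False)       # capture indefinitely, handling each packet
```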