SlideShare feed for slideshows by User: arogozhnikov

Machine learning in science and industry — day 3
/slideshow/machine-learning-in-science-and-industry-day-3-75239567/75239567
Thu, 20 Apr 2017 17:01:35 GMT

- generalized linear models
- linear models with non-linear features
- SVM and the kernel trick
- regularizations
- factorization models and recommender systems
- unsupervised dimensionality reduction: PCA, LLE, Isomap
- artificial neural networks
- training neural networks, stochastic optimization
- dropout

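An illustrative sketch (not from the slides) of two of the items above, a linear model given explicit non-linear features versus a kernel SVM; the toy dataset, polynomial degree and SVM settings are arbitrary choices:

    from sklearn.datasets import make_moons
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=2000, noise=0.25, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Linear model made non-linear by an explicit polynomial feature map ...
    poly_logreg = make_pipeline(PolynomialFeatures(degree=3),
                                LogisticRegression(max_iter=1000))
    # ... versus an SVM that obtains non-linearity implicitly through the RBF kernel.
    rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale")

    for name, clf in [("poly features + logistic regression", poly_logreg),
                      ("RBF-kernel SVM", rbf_svm)]:
        print(name, "accuracy = %.3f" % clf.fit(X_train, y_train).score(X_test, y_test))

The kernel version never materialises the expanded feature space, which is exactly what the kernel trick buys.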
Machine learning in science and industry — day 4
/slideshow/machine-learning-in-science-and-industry-day-4/75239350
Thu, 20 Apr 2017 16:54:55 GMT

- tabular data approach to machine learning and when it didn't work
- convolutional neural networks and their application
- deep learning: history and today
- generative adversarial networks
- finding optimal hyperparameters
- joint embeddings

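As a quick companion to the convolutional-network item, a minimal Keras sketch (assuming TensorFlow/Keras is installed; the architecture and training settings are arbitrary and far smaller than anything the lecture would discuss):

    from tensorflow import keras
    from tensorflow.keras import layers

    (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
    x_train = x_train[..., None] / 255.0   # add a channel axis, scale to [0, 1]
    x_test = x_test[..., None] / 255.0

    model = keras.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, batch_size=128, validation_split=0.1)
    print(model.evaluate(x_test, y_test))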
Machine learning in science and industry — day 2
/arogozhnikov/machine-learning-in-science-and-industry-day-2
Thu, 20 Apr 2017 16:36:19 GMT

- decision trees
- random forest
- boosting: AdaBoost
- reweighting with boosting
- gradient boosting
- learning to rank with gradient boosting
- multiclass classification
- trigger in LHCb
- boosting to uniformity and flatness loss
- particle identification

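To illustrate the two ensembling strategies above (a bagging-style forest versus boosting), a small sketch on synthetic data; the dataset and hyperparameters are placeholder choices, not values from the slides:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=10000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Bagging-style ensemble: deep, decorrelated trees averaged together.
    forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    # Boosting: shallow trees fitted sequentially to correct the previous errors.
    gb = GradientBoostingClassifier(n_estimators=200, learning_rate=0.1,
                                    max_depth=3, random_state=0).fit(X_train, y_train)

    for name, clf in [("random forest", forest), ("gradient boosting", gb)]:
        auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
        print(name, "test ROC AUC = %.3f" % auc)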
Machine learning in science and industry — day 1
/slideshow/machine-learning-in-science-and-industry-day-1/75235204
Thu, 20 Apr 2017 14:52:16 GMT

A course on machine learning in science and industry.
- notions and applications
- nearest neighbours: search and machine learning algorithms
- ROC curve
- optimal classification and regression
- density estimation
- Gaussian mixtures and the EM algorithm
- clustering, with an example from the OPERA experiment

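For the Gaussian mixtures item, a minimal sketch using scikit-learn's GaussianMixture, which runs EM internally; the two-blob toy data is assumed for demonstration only:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Two overlapping 2-d Gaussian blobs; EM should recover their parameters.
    X = np.vstack([rng.normal([0.0, 0.0], 1.0, size=(500, 2)),
                   rng.normal([3.0, 3.0], 0.7, size=(500, 2))])

    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0).fit(X)
    print("weights:", gmm.weights_.round(2))
    print("means:\n", gmm.means_.round(2))
    responsibilities = gmm.predict_proba(X)   # soft assignments from the E-step
    labels = gmm.predict(X)                   # hard clustering

The per-component responsibilities returned by predict_proba are the same quantities the E-step computes, which is why a fitted mixture doubles as a soft clustering.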
Reweighting and Boosting to uniformity in HEP
/slideshow/reweighting-and-boosting-to-uniforimty-in-hep/64258481
Thu, 21 Jul 2016 18:36:30 GMT

Specific versions of boosting for High Energy Physics: event reweighting and boosting to uniformity.

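As background for the reweighting part, the simplest classifier-based density-ratio idea can be sketched as below (a simplified illustration, not the boosted algorithm from the slides; the toy "original" and "target" samples are assumptions):

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)
    original = rng.normal(0.0, 1.0, size=(5000, 2))   # e.g. simulated events
    target = rng.normal(0.3, 1.2, size=(5000, 2))     # e.g. real data

    # Train a classifier to separate the two samples; its odds approximate
    # the density ratio, which can serve as per-event weights for "original".
    X = np.vstack([original, target])
    y = np.concatenate([np.zeros(len(original)), np.ones(len(target))])
    clf = GradientBoostingClassifier(n_estimators=100, max_depth=3).fit(X, y)

    p = clf.predict_proba(original)[:, 1]
    weights = p / (1.0 - p)    # valid here because both samples have equal size
    print("mean weight = %.2f, std = %.2f" % (weights.mean(), weights.std()))

A dedicated boosted reweighter (GBReweighter) is available in the hep_ml package; the sketch above only shows the underlying density-ratio idea.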
MLHEP Lectures - day 3, basic track
/slideshow/mlhep-lectures-day-3-basic-track/63948144
Tue, 12 Jul 2016 12:46:23 GMT

Ensembles
* AdaBoost
* Gradient Boosting for regression
* Gradient Boosting for classification
* Second-order information
* Losses: regression, classification, ranking
Multiclass classification:
* ensembling
* softmax modifications
Feature engineering
Output engineering
Feature selection
Dimensionality reduction:
* PCA
* LDA, CSP
* LLE
* Isomap
Hyperparameter optimization
* ML-based approach
* Gaussian processes

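A minimal sketch of the dimensionality-reduction block, projecting the scikit-learn digits data to two dimensions with the three methods listed above; the dataset choice and n_components=2 are arbitrary illustration choices:

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.manifold import Isomap, LocallyLinearEmbedding

    X, y = load_digits(return_X_y=True)   # 1797 digit images, 64 features each

    for name, reducer in [("PCA", PCA(n_components=2)),
                          ("Isomap", Isomap(n_components=2)),
                          ("LLE", LocallyLinearEmbedding(n_components=2))]:
        Z = reducer.fit_transform(X)
        print(name, "embedding shape:", Z.shape)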
MLHEP Lectures - day 2, basic track
/slideshow/mlhep-lectures-day-2-basic-track/63947521
Tue, 12 Jul 2016 12:31:47 GMT

* linear models: logistic regression
* polynomial decision rule and polynomial regression
* SVM (Support Vector Machine), kernel trick
* Overfitting: two definitions
* Model selection
* Regularization: L1, L2, elastic net
* Decision trees
* splitting criteria for classification and regression
* overfitting in trees: pre-stopping and post-pruning
* instability of trees
* feature importance
* Ensembling
* RSM, subsampling, bagging
* Random Forest

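To illustrate the regularization item, a quick comparison of L2 (Ridge), L1 (Lasso) and elastic-net penalties on a sparse synthetic regression problem; the alphas and data shape are placeholder values, not from the lecture:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNet, Lasso, Ridge
    from sklearn.model_selection import cross_val_score

    # Only 5 of the 50 features are informative, so sparsity-inducing penalties help.
    X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                           noise=10.0, random_state=0)

    for model in [Ridge(alpha=1.0), Lasso(alpha=1.0), ElasticNet(alpha=1.0, l1_ratio=0.5)]:
        score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
        n_zero = int(np.sum(model.fit(X, y).coef_ == 0.0))
        print(type(model).__name__, "CV R2 = %.3f," % score,
              "coefficients set exactly to zero:", n_zero)

The L1 and elastic-net fits drive many coefficients exactly to zero, while the L2 fit only shrinks them; that is the practical difference the lecture contrasts.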
MLHEP Lectures - day 1, basic track
/slideshow/mlhep-lectures-day-1-basic-track/63947145
Tue, 12 Jul 2016 12:21:03 GMT

Introduction to machine learning terminology. Applications within High Energy Physics and outside HEP.
* Basic problems: classification and regression
* Nearest neighbours approach and spatial indices
* Overfitting (intro)
* Curse of dimensionality
* ROC curve, ROC AUC
* Bayes optimal classifier
* Density estimation: KDE and histograms
* Parametric density estimation
* Mixtures for density estimation and the EM algorithm
* Generative approach vs discriminative approach
* Linear decision rule, intro to logistic regression
* Linear regression

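For the density-estimation item (KDE versus histograms), a small sketch on a bimodal toy sample; the bandwidth and binning are arbitrary assumptions:

    import numpy as np
    from sklearn.neighbors import KernelDensity

    rng = np.random.default_rng(0)
    # A bimodal 1-d sample: both a histogram and a KDE estimate its density.
    sample = np.concatenate([rng.normal(-2.0, 0.5, 500),
                             rng.normal(1.0, 1.0, 500)])[:, None]

    hist, edges = np.histogram(sample[:, 0], bins=30, density=True)

    kde = KernelDensity(kernel="gaussian", bandwidth=0.3).fit(sample)
    grid = np.linspace(-4, 4, 200)[:, None]
    density = np.exp(kde.score_samples(grid))   # score_samples returns the log-density

    print("histogram peak = %.3f, KDE peak = %.3f" % (hist.max(), density.max()))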
MLHEP 2015: Introductory Lecture #4
/slideshow/mlhep-2015-introductory-lecture-4/52495841
Mon, 07 Sep 2015 12:47:59 GMT

* tuning gradient boosting over decision trees (GBDT)
* speeding up predictions for online triggers: lookup tables
* PCA, autoencoder, manifold learning
* structural learning: Markov chain, LDA
* remarks on collaborative research

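A rough sketch of the lookup-table idea above: precompute a model's output on a grid of binned feature values so that online prediction becomes a single array lookup instead of a tree traversal. The two-feature setup, binning scheme and model are simplifying assumptions for illustration:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_classification(n_samples=20000, n_features=2, n_informative=2,
                               n_redundant=0, random_state=0)
    clf = GradientBoostingClassifier(random_state=0).fit(X, y)

    # Precompute the model output at the centre of every (bin_i, bin_j) cell.
    n_bins = 64
    edges = [np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)) for j in range(2)]
    centres = [0.5 * (e[1:] + e[:-1]) for e in edges]
    grid = np.stack(np.meshgrid(*centres, indexing="ij"), axis=-1).reshape(-1, 2)
    table = clf.predict_proba(grid)[:, 1].reshape(n_bins, n_bins)

    def fast_predict(x):
        # Map each event to its cell and read the cached prediction.
        i = np.clip(np.searchsorted(edges[0], x[:, 0]) - 1, 0, n_bins - 1)
        j = np.clip(np.searchsorted(edges[1], x[:, 1]) - 1, 0, n_bins - 1)
        return table[i, j]

    print(np.abs(fast_predict(X[:5]) - clf.predict_proba(X[:5])[:, 1]))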
MLHEP 2015: Introductory Lecture #3
/slideshow/mlhep-2015-introductory-lecture-3/52495646
Mon, 07 Sep 2015 12:42:10 GMT

* Ensemble strategies
* Bagging
* Random Forest
* comparing distributions
* AdaBoost, AdaLoss (= ExpLoss)
* Gradient Boosting, its parameters
* GB for classification, regression, ranking
* Boosting to uniformity (uBoost, FlatnessLoss)

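A from-scratch sketch of the AdaBoost / ExpLoss reweighting loop mentioned above, using decision stumps (this is textbook discrete AdaBoost, not code from the lecture; the toy data and the number of rounds are arbitrary):

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_moons(n_samples=2000, noise=0.2, random_state=0)
    y_pm = 2 * y - 1                       # labels in {-1, +1}
    w = np.full(len(X), 1.0 / len(X))      # event weights, start uniform

    F = np.zeros(len(X))                   # ensemble decision function
    for _ in range(50):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = 2 * stump.predict(X) - 1
        err = np.sum(w * (pred != y_pm)) / np.sum(w)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * pred
        # ExpLoss reweighting: misclassified events get exponentially larger weights.
        w *= np.exp(-alpha * y_pm * pred)
        w /= w.sum()

    print("training accuracy: %.3f" % np.mean(np.sign(F) == y_pm))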
MLHEP 2015: Introductory Lecture #2
/arogozhnikov/mlhep-2015-introductory-lecture-2
Mon, 07 Sep 2015 12:33:45 GMT

* Logistic regression, logistic loss (log loss)
* stochastic optimization
* adding new features, generalized linear model
* Kernel trick, intro to SVM
* Overfitting
* Decision trees for classification and regression
* Building trees greedily: Gini index, entropy
* Fighting overfitting in trees: pre-stopping and post-pruning
* Feature importances

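The "building trees greedily: Gini index" item, as a tiny from-scratch sketch of scoring candidate thresholds on a single feature; the toy data and the 0/1-label restriction are simplifying assumptions:

    import numpy as np

    def gini(labels):
        # Gini impurity of a set of 0/1 labels.
        if len(labels) == 0:
            return 0.0
        p = np.mean(labels)
        return 2.0 * p * (1.0 - p)

    def best_split(x, y):
        # Greedy search for the threshold on one feature minimising weighted Gini.
        order = np.argsort(x)
        x, y = x[order], y[order]
        best_score, best_threshold = np.inf, None
        for i in range(1, len(x)):
            if x[i] == x[i - 1]:
                continue
            score = (i * gini(y[:i]) + (len(y) - i) * gini(y[i:])) / len(y)
            if score < best_score:
                best_score, best_threshold = score, 0.5 * (x[i] + x[i - 1])
        return best_score, best_threshold

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = (x + 0.3 * rng.normal(size=200) > 0).astype(int)
    print(best_split(x, y))   # the threshold found should be close to 0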
MLHEP 2015: Introductory Lecture #1
/slideshow/mlhep-2015-introductory-lecture-1/52495214
Mon, 07 Sep 2015 12:30:24 GMT

* ML in HEP
* classification and regression
* kNN classification and regression
* ROC curve
* optimal Bayesian classifier
* Fisher's QDA
* intro to logistic regression

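To illustrate the kNN, QDA and ROC-curve items together, a small sketch comparing the two classifiers by ROC AUC; the synthetic data and n_neighbors are placeholder choices, not from the lecture:

    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score, roc_curve
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for clf in [KNeighborsClassifier(n_neighbors=25), QuadraticDiscriminantAnalysis()]:
        proba = clf.fit(X_train, y_train).predict_proba(X_test)[:, 1]
        fpr, tpr, _ = roc_curve(y_test, proba)
        print(type(clf).__name__, "ROC AUC = %.3f" % roc_auc_score(y_test, proba),
              "(ROC curve sampled at %d thresholds)" % len(fpr))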