2. TensorFlow Agenda
Introduction To TensorFlow
Introduction To Deep Learning
Fundamentals Of Neural Networks
Fundamentals Of Deep Networks
Convolutional Neural Networks (CNN)
Recurrent Neural Networks (RNN)
Restricted Boltzmann Machine (RBM) And Autoencoders
3. What is TensorFlow?
TensorFlow is a multipurpose open source software
library for numerical computation using data flow
graphs. It has been designed with deep learning in
mind but it is applicable to a much wider range of
problems.
But what does it actually do? TensorFlow provides
primitives for defining functions on tensors and
automatically computing their derivatives.
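To make that concrete, here is a minimal sketch (assuming the TensorFlow 2.x Python API) of defining a function on a tensor and letting TensorFlow compute its derivative automatically:

import tensorflow as tf

x = tf.Variable(3.0)

with tf.GradientTape() as tape:
    y = x ** 2 + 2.0 * x              # a function defined on the tensor x

dy_dx = tape.gradient(y, x)           # derivative computed automatically
print(dy_dx.numpy())                  # 2*x + 2 = 8.0 at x = 3.0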
4. But what's a Tensor?
Formally, tensors are multilinear maps from vector
spaces to the real numbers (V a vector space, V* its dual space):
f : V* × ... × V* × V × ... × V → R
A scalar is a tensor (f : R → R)
A vector is a tensor (f : R^n → R)
A matrix is a tensor (f : R^n × R^m → R)
Common to have fixed basis, so a tensor can be
represented as a multidimensional array of numbers.
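As a small illustration of that last point (TensorFlow assumed; plain NumPy arrays would look the same), tensors of increasing rank are just multidimensional arrays of numbers:

import tensorflow as tf

scalar = tf.constant(3.0)                        # rank 0
vector = tf.constant([1.0, 2.0, 3.0])            # rank 1
matrix = tf.constant([[1.0, 2.0],
                      [3.0, 4.0]])               # rank 2
cube   = tf.constant([[[1.0], [2.0]],
                      [[3.0], [4.0]]])           # rank 3

for t in (scalar, vector, matrix, cube):
    print(t.shape, tf.rank(t).numpy())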
5. Introduction To Deep Learning
Deep Learning is a machine learning technique that
learns features and tasks directly from data.
Data can be images, text, or sound.
6. What is Artificial Intelligence?
Every aspect of learning or any other feature of
intelligence can in principle be so precisely
described that a machine can be made to simulate
it. An attempt will be made to find how to make
machines use language, form abstractions and
concepts, solve kinds of problems now reserved for
humans, and improve themselves.
8. Limitations of Machine Learning
There are a few key limitations of machine learning
approaches that affect their usefulness for certain
tasks, as well as their ability to function in
real-world environments. Machine learning
algorithms perform very well on tasks involving
familiar data from a training set; limitations tend
to surface when an algorithm has to handle new,
unseen data. As these systems advance, they are
quickly becoming better at categorizing familiar
data and performing tasks such as image or speech
recognition.
9. The Math behind Machine Learning: Linear Algebra
Scalars
Vectors
Matrices
Tensors
Hyperplanes
The Math Behind Machine Learning: Statistics
Probability
Conditional Probabilities
Posterior Probability
Distributions
Samples vs Population
Resampling Methods
Selection Bias
Likelihood
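As a small worked example of the probability items above (the numbers are illustrative assumptions, not from the slides), a posterior probability follows from the prior and the likelihood via Bayes' rule:

p_disease      = 0.01      # prior
p_pos_given_d  = 0.95      # likelihood (test sensitivity)
p_pos_given_nd = 0.05      # false-positive rate

p_positive = p_pos_given_d * p_disease + p_pos_given_nd * (1 - p_disease)
posterior  = p_pos_given_d * p_disease / p_positive
print(round(posterior, 3))  # ~0.161: the evidence updates the 1% prior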
10. Defining Neural Networks
Neural networks are a set of algorithms, modeled loosely after the
human brain, that are designed to recognize patterns. They interpret
sensory data through a kind of machine perception, labeling or
clustering raw input. The patterns they recognize are numerical,
contained in vectors, into which all real-world data, be it images,
sound, text or time series, must be translated.
(OR)
A neural network is a series of algorithms that attempts to identify
underlying relationships in a set of data by using a process that mimics
the way the human brain operates. Neural networks have the ability to
adapt to changing input so the network produces the best possible
result without the need to redesign the output criteria. The concept of
neural networks is rapidly increasing in popularity in the area of
developing trading systems.
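One concrete reading of "all real-world data must be translated into vectors" is sketched below; the inputs are made up for illustration and are not part of the original slides:

import numpy as np

image  = np.random.randint(0, 256, size=(28, 28))        # grayscale image -> 2-D array
pixels = image.reshape(-1).astype(np.float32) / 255.0    # flattened, scaled vector of 784 numbers

text  = "deep learning"
codes = np.array([ord(c) for c in text], dtype=np.float32)  # characters -> numbers

print(pixels.shape, codes.shape)   # (784,), (13,)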
11. Deep Learning
The important innovation in deep learning is a system
that learns categories incrementally through its hidden
layer architecture, defining low-level categories like
letters before moving on to higher level categories such
as words. In the example of image recognition this means
identifying light/dark areas before categorising lines and
then shapes to allow face recognition. Each neuron or
node in the network represents one aspect of the whole
and together they provide a full representation of the
image. Each node or hidden layer is given a weight that
represents the strength of its relationship with the output
and as the model develops the weights are adjusted.
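A minimal sketch of that weight adjustment, using a single weight and a toy squared-error loss (both assumed for illustration; TensorFlow 2.x API):

import tensorflow as tf

w = tf.Variable(0.1)                       # weight linking a node to the output
x, target = tf.constant(2.0), tf.constant(1.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

for step in range(20):
    with tf.GradientTape() as tape:
        loss = (w * x - target) ** 2       # how far the node's output is from the target
    grad = tape.gradient(loss, w)
    opt.apply_gradients([(grad, w)])       # adjust the weight as the model develops

print(w.numpy())                           # approaches 0.5, since 0.5 * 2.0 = 1.0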
12. LAYER 1: Algorithm first learns to recognise pixels
and then edges and shapes
LAYER 2: Learns to identify more complex shapes
and features like eyes and mouths
LAYER 3: Learns which shapes and objects can be
used to identify a human face
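The layer hierarchy above maps naturally onto stacked convolutional layers. The sketch below is illustrative only (the layer counts and sizes are assumptions), using the Keras API:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           input_shape=(64, 64, 1)),    # layer 1: pixels, edges, simple shapes
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),   # layer 2: more complex shapes and features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),   # layer 3: face-level combinations
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),     # face / not-face score
])
model.summary()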