Machine Learning: finding patterns
Outline Machine learning and Classification Examples *Learning as Search  Bias Weka
Finding patterns Goal: programs that detect patterns and regularities in the data Strong patterns → good predictions Problem 1: most patterns are not interesting Problem 2: patterns may be inexact (or spurious) Problem 3: data may be garbled or missing
Machine learning techniques Algorithms for acquiring structural descriptions from examples Structural descriptions represent patterns explicitly Can be used to predict outcome in new situation Can be used to understand and explain how prediction is derived ( may be even more important ) Methods originate from artificial intelligence, statistics, and research on databases witten&eibe
Can machines really learn? Dictionary definitions of "learning": to get knowledge of by study, experience, or being taught; to become aware by information or from observation (difficult to measure); to commit to memory; to be informed of, ascertain; to receive instruction (trivial for computers). Operational definition: things learn when they change their behavior in a way that makes them perform better in the future. Does a slipper learn? Does learning imply intention? witten&eibe
Classification Learn a method for predicting the instance class from pre-labeled (classified) instances Many approaches: Regression, Decision Trees, Bayesian, Neural Networks, ... Given a set of points from known classes, what is the class of a new point?
Classification: Linear Regression w0 + w1 x + w2 y >= 0 Regression computes the weights wi from the data to minimize squared error, to 'fit' the data Not flexible enough
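A minimal sketch of how such a linear decision rule could be used once the weights have been fitted; the weight values and class labels below are made up for illustration:

# Hypothetical weights w0, w1, w2; in practice regression fits them to the data
# by minimizing squared error.
w0, w1, w2 = -8.0, 1.0, 0.5

def classify(x, y):
    # The line w0 + w1*x + w2*y = 0 splits the plane into the two classes.
    return "blue" if w0 + w1 * x + w2 * y >= 0 else "green"

print(classify(6.0, 5.0))  # falls on the 'blue' side of the line
print(classify(1.0, 1.0))  # falls on the 'green' side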
Classification: Decision Trees if X > 5 then blue else if Y > 3 then blue else if X > 2 then green else blue (figure: the X-Y plane partitioned by axis-parallel splits at X = 2, X = 5, and Y = 3)
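The tree on this slide reads directly as nested conditionals; a small sketch using the class labels from the slide:

def classify(x, y):
    # Each test is an axis-parallel split of the X-Y plane,
    # mirroring the decision tree on the slide.
    if x > 5:
        return "blue"
    if y > 3:
        return "blue"
    if x > 2:
        return "green"
    return "blue"

print(classify(6, 1))  # blue: the first test fires
print(classify(3, 1))  # green: only the x > 2 test fires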
Classification: Neural Nets Can select more complex regions Can be more accurate Also can overfit the data: find patterns in random noise
Outline Machine learning and Classification Examples *Learning as Search  Bias Weka
The weather problem Given past data, can you come up with the rules for Play/Not Play? What is the game?

Outlook    Temperature  Humidity  Windy  Play
sunny      hot          high      false  no
sunny      hot          high      true   no
overcast   hot          high      false  yes
rainy      mild         high      false  yes
rainy      mild         normal    false  yes
rainy      mild         normal    true   no
overcast   mild         normal    true   yes
sunny      mild         high      false  no
sunny      mild         normal    false  yes
rainy      mild         normal    false  yes
sunny      mild         normal    true   yes
overcast   mild         high      true   yes
overcast   hot          normal    false  yes
rainy      mild         high      true   no
The weather problem Given this data, what are the rules for play/not play?

Outlook    Temperature  Humidity  Windy  Play
sunny      hot          high      false  no
sunny      hot          high      true   no
overcast   hot          high      false  yes
rainy      mild         normal    false  yes
...        ...          ...       ...    ...
The weather problem Conditions for playing (table excerpt as above) witten&eibe
If outlook = sunny and humidity = high then play = no
If outlook = rainy and windy = true then play = no
If outlook = overcast then play = yes
If humidity = normal then play = yes
If none of the above then play = yes
Weather data with mixed attributes

Outlook    Temperature  Humidity  Windy  Play
sunny      85           85        false  no
sunny      80           90        true   no
overcast   83           86        false  yes
rainy      70           96        false  yes
rainy      68           80        false  yes
rainy      65           70        true   no
overcast   64           65        true   yes
sunny      72           95        false  no
sunny      69           70        false  yes
rainy      75           80        false  yes
sunny      75           70        true   yes
overcast   72           90        true   yes
overcast   81           75        false  yes
rainy      71           91        true   no
Weather data with mixed attributes How will the rules change when some attributes have numeric values?

Outlook    Temperature  Humidity  Windy  Play
sunny      85           85        false  no
sunny      80           90        true   no
overcast   83           86        false  yes
rainy      75           80        false  yes
...        ...          ...       ...    ...
Weather data with mixed attributes Rules with mixed attributes (table excerpt as above) witten&eibe
If outlook = sunny and humidity > 83 then play = no
If outlook = rainy and windy = true then play = no
If outlook = overcast then play = yes
If humidity < 85 then play = yes
If none of the above then play = yes
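A sketch of how this rule list could be applied to a single instance; the function name and argument order are chosen here for illustration, and the rules are tried top to bottom with the first match deciding:

def play(outlook, temperature, humidity, windy):
    # Rules from the slide, evaluated in order; the last rule is the default.
    if outlook == "sunny" and humidity > 83:
        return "no"
    if outlook == "rainy" and windy:
        return "no"
    if outlook == "overcast":
        return "yes"
    if humidity < 85:
        return "yes"
    return "yes"

print(play("sunny", 85, 85, False))     # no  (matches the first row of the table)
print(play("overcast", 83, 86, False))  # yes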
The contact lenses data witten&eibe

Age             Spectacle prescription  Astigmatism  Tear production rate  Recommended lenses
young           myope                   no           reduced               none
young           myope                   no           normal                soft
young           myope                   yes          reduced               none
young           myope                   yes          normal                hard
young           hypermetrope            no           reduced               none
young           hypermetrope            no           normal                soft
young           hypermetrope            yes          reduced               none
young           hypermetrope            yes          normal                hard
pre-presbyopic  myope                   no           reduced               none
pre-presbyopic  myope                   no           normal                soft
pre-presbyopic  myope                   yes          reduced               none
pre-presbyopic  myope                   yes          normal                hard
pre-presbyopic  hypermetrope            no           reduced               none
pre-presbyopic  hypermetrope            no           normal                soft
pre-presbyopic  hypermetrope            yes          reduced               none
pre-presbyopic  hypermetrope            yes          normal                none
presbyopic      myope                   no           reduced               none
presbyopic      myope                   no           normal                none
presbyopic      myope                   yes          reduced               none
presbyopic      myope                   yes          normal                hard
presbyopic      hypermetrope            no           reduced               none
presbyopic      hypermetrope            no           normal                soft
presbyopic      hypermetrope            yes          reduced               none
presbyopic      hypermetrope            yes          normal                none
A complete and correct rule set witten&eibe
If tear production rate = reduced then recommendation = none
If age = young and astigmatic = no and tear production rate = normal then recommendation = soft
If age = pre-presbyopic and astigmatic = no and tear production rate = normal then recommendation = soft
If age = presbyopic and spectacle prescription = myope and astigmatic = no then recommendation = none
If spectacle prescription = hypermetrope and astigmatic = no and tear production rate = normal then recommendation = soft
If spectacle prescription = myope and astigmatic = yes and tear production rate = normal then recommendation = hard
If age = young and astigmatic = yes and tear production rate = normal then recommendation = hard
If age = pre-presbyopic and spectacle prescription = hypermetrope and astigmatic = yes then recommendation = none
If age = presbyopic and spectacle prescription = hypermetrope and astigmatic = yes then recommendation = none
A decision tree for this problem witten&eibe
Classifying iris flowers witten&eibe

       Sepal length  Sepal width  Petal length  Petal width  Type
1      5.1           3.5          1.4           0.2          Iris setosa
2      4.9           3.0          1.4           0.2          Iris setosa
...
51     7.0           3.2          4.7           1.4          Iris versicolor
52     6.4           3.2          4.5           1.5          Iris versicolor
...
101    6.3           3.3          6.0           2.5          Iris virginica
102    5.8           2.7          5.1           1.9          Iris virginica
...

If petal length < 2.45 then Iris setosa
If sepal width < 2.10 then Iris versicolor
...
Predicting CPU performance Example: 209 different computer configurations witten&eibe

       Cycle time (ns)  Main memory (Kb)      Cache (Kb)  Channels          Performance
       MYCT             MMIN       MMAX       CACH        CHMIN    CHMAX    PRP
1      125              256        6000       256         16       128      198
2      29               8000       32000      32          8        32       269
...
208    480              512        8000       32          0        0        67
209    480              1000       4000       0           0        0        45

Linear regression function:
PRP = -55.9 + 0.0489 MYCT + 0.0153 MMIN + 0.0056 MMAX + 0.6410 CACH - 0.2700 CHMIN + 1.480 CHMAX
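A small sketch that evaluates this regression function on one configuration from the table; the function name is chosen here for illustration:

def predict_prp(myct, mmin, mmax, cach, chmin, chmax):
    # Linear regression function from the slide.
    return (-55.9 + 0.0489 * myct + 0.0153 * mmin + 0.0056 * mmax
            + 0.6410 * cach - 0.2700 * chmin + 1.480 * chmax)

# Configuration 1: MYCT=125, MMIN=256, MMAX=6000, CACH=256, CHMIN=16, CHMAX=128
print(predict_prp(125, 256, 6000, 256, 16, 128))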
Soybean classification witten&eibe

              Attribute                Number of values  Sample value
Environment   Time of occurrence       7                 July
              Precipitation            3                 Above normal
              ...
Seed          Condition                2                 Normal
              Mold growth              2                 Absent
              ...
Fruit         Condition of fruit pods  4                 Normal
              Fruit spots              5                 ?
Leaves        Condition                2                 Abnormal
              Leaf spot size           3                 ?
              ...
Stem          Condition                2                 Abnormal
              Stem lodging             2                 Yes
              ...
Roots         Condition                3                 Normal
Diagnosis                              19                Diaporthe stem canker
The role of domain knowledge witten&eibe
If leaf condition is normal and stem condition is abnormal and stem cankers is below soil line and canker lesion color is brown then diagnosis is rhizoctonia root rot
If leaf malformation is absent and stem condition is abnormal and stem cankers is below soil line and canker lesion color is brown then diagnosis is rhizoctonia root rot
But in this domain, "leaf condition is normal" implies "leaf malformation is absent"!
Outline Machine learning and Classification Examples *Learning as Search   Bias Weka
Learning as search Inductive learning: find a concept description that fits the data Example: rule sets as description language  Enormous, but finite, search space Simple solution: enumerate the concept space eliminate descriptions that do not fit examples surviving descriptions contain target concept witten&eibe
Enumerating the concept space Search space for weather problem: 4 x 4 x 3 x 3 x 2 = 288 possible combinations With 14 rules → 2.7 × 10^34 possible rule sets Solution: candidate-elimination algorithm Other practical problems: More than one description may survive No description may survive Language is unable to describe target concept or data contains noise witten&eibe
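The counts on this slide can be reproduced directly; a sketch, assuming the factors are the number of values per attribute plus a "don't care" option, and reading the 2.7 × 10^34 figure as roughly 288^14 (one rule choice per training example) — both are assumptions about how the slide arrives at its numbers:

n_rules = 4 * 4 * 3 * 3 * 2   # outlook x temperature x humidity x windy x class
print(n_rules)                # 288 possible rules
print(float(n_rules) ** 14)   # ~2.7e+34 possible rule sets built from 14 rules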
The version space Space of consistent concept descriptions Completely determined by two sets L : most specific descriptions that cover all positive examples and no negative ones G : most general descriptions that do not cover any negative examples and all positive ones Only  L  and  G  need be maintained and updated But: still computationally very expensive And: does not solve other practical problems witten&eibe
*Version space example, 1 Given: red or green cows or chicken   witten&eibe Start with: L ={} G ={<*, *>} First example: <green,cow>: positive How does this change L and G?
*Version space example, 2 Given: red or green cows or chicken   witten&eibe Result: L ={<green, cow>} G ={<*, *>} Second example: <red,chicken>: negative
*Version space example, 3 Given: red or green cows or chicken   witten&eibe Result: L ={<green, cow>} G ={<green,*>,<*,cow>} Final example: <green, chicken>: positive
*Version space example, 4 Given: red or green cows or chicken   witten&eibe Resultant version space: L ={<green, *>} G ={<green, *>}
*Version space example, 5 Given: red or green cows or chicken   witten&eibe L ={} G ={<*, *>} <green,cow>: positive L ={<green, cow>} G ={<*, *>} <red,chicken>: negative L ={<green, cow>} G ={<green,*>,<*,cow>} <green, chicken>: positive L ={<green, *>} G ={<green, *>}
*Candidate-elimination algorithm witten&eibe
Initialize L and G
For each example e:
  If e is positive:
    Delete all elements from G that do not cover e
    For each element r in L that does not cover e:
      Replace r by all of its most specific generalizations that 1. cover e and 2. are more specific than some element in G
    Remove elements from L that are more general than some other element in L
  If e is negative:
    Delete all elements from L that cover e
    For each element r in G that covers e:
      Replace r by all of its most general specializations that 1. do not cover e and 2. are more general than some element in L
    Remove elements from G that are more specific than some other element in G
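A toy sketch of this algorithm for the two-attribute colour/animal example above, assuming conjunctive descriptions with "*" as "don't care"; it omits some of the boundary bookkeeping (e.g. pruning redundant elements from L and G), so it is an illustration rather than a full implementation:

ATTRS = ["colour", "animal"]
DOMAIN = {"colour": ["red", "green"], "animal": ["cow", "chicken"]}

def covers(h, x):
    # A hypothesis covers an instance if every attribute matches or is "don't care".
    return all(hv == "*" or hv == xv for hv, xv in zip(h, x))

def more_general(h1, h2):
    # h1 is at least as general as h2 if it covers everything h2 covers.
    return all(a == "*" or a == b for a, b in zip(h1, h2))

def candidate_elimination(examples):
    L = []                 # most specific boundary (empty until a positive example arrives)
    G = [("*", "*")]       # most general boundary
    for x, positive in examples:
        if positive:
            G = [g for g in G if covers(g, x)]     # drop general hypotheses that miss x
            if not L:
                L = [x]                            # first positive example
            else:                                  # minimally generalize L to cover x
                L = [tuple(lv if lv == xv else "*" for lv, xv in zip(l, x)) for l in L]
        else:
            L = [l for l in L if not covers(l, x)] # drop specific hypotheses that cover x
            new_G = []
            for g in G:
                if not covers(g, x):
                    new_G.append(g)
                    continue
                for i, attr in enumerate(ATTRS):   # minimally specialize g to exclude x
                    if g[i] == "*":
                        for v in DOMAIN[attr]:
                            if v != x[i]:
                                s = g[:i] + (v,) + g[i + 1:]
                                if any(more_general(s, l) for l in L):
                                    new_G.append(s)
            G = new_G
        print(f"after {x} ({'+' if positive else '-'}): L={L} G={G}")
    return L, G

candidate_elimination([
    (("green", "cow"), True),       # positive
    (("red", "chicken"), False),    # negative
    (("green", "chicken"), True),   # positive
])

On these three examples the sketch passes through the same L and G sets as the worked version-space example on the earlier slides.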
Outline Machine learning and Classification Examples *Learning as Search  Bias Weka
Bias Important decisions in learning systems: Concept description language Order in which the space is searched Way that overfitting to the particular training data is avoided These form the "bias" of the search: Language bias Search bias Overfitting-avoidance bias witten&eibe
Language bias Important question: is language universal or does it restrict what can be learned? Universal language can express arbitrary subsets of examples If language includes logical or ("disjunction"), it is universal Example: rule sets Domain knowledge can be used to exclude some concept descriptions a priori from the search witten&eibe
Search bias Search heuristic: "Greedy" search: performing the best single step "Beam search": keeping several alternatives ... Direction of search: General-to-specific E.g. specializing a rule by adding conditions Specific-to-general E.g. generalizing an individual instance into a rule witten&eibe
Overfitting-avoidance bias Can be seen as a form of search bias Modified evaluation criterion E.g. balancing simplicity and number of errors Modified search strategy E.g. pruning (simplifying a description) Pre-pruning: stops at a simple description before search proceeds to an overly complex one Post-pruning: generates a complex description first and simplifies it afterwards witten&eibe
Weka
