
Lecture Slides for
Introduction to Machine Learning, 2nd Edition

ETHEM ALPAYDIN
© The MIT Press, 2010
alpaydin@boun.edu.tr
http://www.cmpe.boun.edu.tr/~ethem/i2ml2e
Why “Learn”?
 Machine learning is programming computers to optimize a
performance criterion using example data or past experience.
 Machine learning builds systems that learn from examples through
self-improvement, without being explicitly coded by a programmer.
The breakthrough is the idea that a machine can learn from the data
(i.e., examples) on its own to produce accurate results.
 There is no need to “learn” to calculate payroll
 Learning is used when:
 Human expertise does not exist (navigating on Mars),
 Humans are unable to explain their expertise (speech
recognition)
 Solution changes in time (routing on a computer network)
 Solution needs to be adapted to particular cases (user
biometrics)
Lecture Notes for E Alpaydın 2010 Introduction to Machine Learning 2e © The MIT Press (V1.0) 3
Comparison of Traditional Programming and
Machine Learning

 In traditional programming, a programmer codes all the rules in
consultation with an expert in the industry for which the software is
being developed. Each rule is based on a logical foundation, and the
machine executes an output following the logical statements. When
the system grows complex, more rules need to be written, and the
rule base can quickly become unsustainable to maintain.
Comparison of Traditional Programming and
Machine Learning

 Machine learning is meant to overcome this issue. The machine
learns how the input and output data are correlated and writes a rule
itself. Programmers do not need to write new rules each time there is
new data; the algorithms adapt in response to new data and
experience, improving efficacy over time.
 Machine learning is the “brain” where all the learning takes place.
The way the machine learns is similar to how human beings learn:
from experience.
How does Machine learning work?

 The core of machine learning is learning and inference.

Learning:
 First, the machine learns by discovering patterns in the data.
 One crucial task of the data scientist is to choose carefully which data to
provide to the machine. The list of attributes used to solve a problem is
called a feature vector.
 The machine uses algorithms to simplify reality and transform this
discovery into a model. The learning stage thus describes the data and
summarizes it into a model.
Inference:
 Once the model is built, it can be tested on never-seen-before data. The
new data are transformed into a feature vector, passed through the model,
and yield a prediction.
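The learning/inference split above can be sketched in a few lines of Python. A minimal nearest-centroid classifier stands in for the model; the data, labels, and the choice of model are all illustrative, not the book's method:

```python
import numpy as np

# Learning summarizes data into a model (here, one centroid per class);
# inference maps a new feature vector through the model to a prediction.

def learn(X, y):
    """Learning stage: summarize the training data into a model."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def infer(model, x_new):
    """Inference stage: predict the class whose centroid is nearest to x_new."""
    return min(model, key=lambda c: np.linalg.norm(x_new - model[c]))

X = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.2], [3.8, 4.0]])  # feature vectors
y = np.array([0, 0, 1, 1])                                      # labels
model = learn(X, y)
prediction = infer(model, np.array([1.1, 0.9]))  # a never-seen-before point
```

Note the two distinct stages: `learn` compresses the training data into a small model, and `infer` consults only that model, never the raw data.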
Machine Learning Algorithms and Where They Are Used

What We Talk About When We
Talk About “Learning”
 Learning general models from data of particular
examples
 Data is cheap and abundant (data warehouses, data
marts); knowledge is expensive and scarce.
 Example in retail: Customer transactions to consumer
behavior:
People who bought “Blink” also bought “Outliers”
(www.amazon.com)
 Build a model that is a good and useful approximation to
the data.
Data Mining
 Retail: Market basket analysis, Customer relationship
management (CRM)
 Finance: Credit scoring, fraud detection
 Manufacturing: Control, robotics, troubleshooting
 Medicine: Medical diagnosis
 Telecommunications: Spam filters, intrusion detection
 Bioinformatics: Motifs, alignment
 Web mining: Search engines
 ...

What is Machine Learning?
 Optimize a performance criterion using example data or
past experience.
 Role of Statistics: Inference from a sample
 Role of Computer science: Efficient algorithms to
 Solve the optimization problem
 Represent and evaluate the model for inference

Applications
 Association
 Supervised Learning
 Classification
 Regression
 Unsupervised Learning
 Reinforcement Learning

Learning Associations
 Basket analysis:
P(Y | X): the probability that somebody who buys X also buys Y,
where X and Y are products/services.

Example: P(chips | beer) = 0.7
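This conditional probability can be estimated directly from transaction data. A tiny sketch with made-up baskets (the resulting number is from the toy data, not the slide's 0.7):

```python
# Estimate P(Y | X) = #(baskets with X and Y) / #(baskets with X)
# from a toy transaction list (illustrative data).
transactions = [
    {"beer", "chips"},
    {"beer", "chips", "salsa"},
    {"beer", "soda"},
    {"chips"},
    {"beer", "chips"},
]

def conditional_prob(transactions, x, y):
    """P(y | x): fraction of baskets containing x that also contain y."""
    with_x = [t for t in transactions if x in t]
    if not with_x:
        return 0.0
    return sum(1 for t in with_x if y in t) / len(with_x)

p = conditional_prob(transactions, "beer", "chips")  # 3 of 4 beer baskets
```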

Supervised learning
 An algorithm uses training data and feedback from humans to
learn the relationship between given inputs and a given output.
There are two categories of supervised learning:
 Classification: the output is a discrete value (a class label).
 Regression: the output is a continuous value. For instance, a
financial analyst may need to forecast the value of a stock based
on features such as equity, previous stock performance, and
macroeconomic indices. The system is trained to estimate the
price of the stock with the lowest possible error.

Classification
 Example: Credit
scoring
 Differentiating
between low-risk
and high-risk
customers from their
income and savings

Discriminant: IF income > θ1 AND savings > θ2
THEN low-risk ELSE high-risk
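The discriminant is a pair of thresholds and translates directly into code. The threshold values below are made up for illustration, since the slide leaves θ1 and θ2 unspecified:

```python
# Credit-scoring discriminant: IF income > theta1 AND savings > theta2
# THEN low-risk ELSE high-risk. The threshold values are hypothetical.
THETA1 = 30_000  # income threshold (illustrative)
THETA2 = 10_000  # savings threshold (illustrative)

def credit_risk(income, savings, theta1=THETA1, theta2=THETA2):
    """Classify a customer as low- or high-risk from income and savings."""
    return "low-risk" if income > theta1 and savings > theta2 else "high-risk"
```

In practice the thresholds are not hand-picked; they are learned from labeled customer data.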
Classification: Applications
 Aka Pattern recognition
 Face recognition: Pose, lighting, occlusion (glasses,
beard), make-up, hair style
 Character recognition: Different handwriting styles.
 Speech recognition: Temporal dependency.
 Medical diagnosis: From symptoms to illnesses
 Biometrics: Recognition/authentication using physical
and/or behavioral characteristics: Face, iris, signature, etc
 ...

Face Recognition
[Figure: training examples of a person vs. test images; ORL face
dataset, AT&T Laboratories, Cambridge, UK]

Regression
 Example: Price of a used car
 x: car attributes
y: price
y = g(x | θ)
g(·): model, θ: parameters
 Linear model: y = wx + w0
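Fitting the linear model y = wx + w0 is an ordinary least-squares problem. A sketch with made-up used-car data (car age vs. price; all numbers are illustrative):

```python
import numpy as np

# Fit y = w*x + w0 by least squares; the data points are hypothetical.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # car age in years (illustrative)
y = np.array([9.0, 8.1, 6.9, 6.2, 5.0])   # price in $1000s (illustrative)

A = np.column_stack([x, np.ones_like(x)])  # design matrix [x, 1]
(w, w0), *_ = np.linalg.lstsq(A, y, rcond=None)  # minimizes ||A @ [w, w0] - y||^2

price_6yr = w * 6.0 + w0  # predict the price of a 6-year-old car
```

The learned slope w is negative here, matching the intuition that price falls with age.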
Regression Applications
 Navigating a car: angle of the steering wheel
 Kinematics of a robot arm: from a target position (x, y),
predict the joint angles α1 = g1(x, y) and α2 = g2(x, y)
 Response surface design

Supervised Learning: Uses
 Prediction of future cases: Use the rule to predict the
output for future inputs
 Knowledge extraction: The rule is easy to understand
 Compression: The rule is simpler than the data it explains
 Outlier detection: Exceptions that are not covered by the
rule, e.g., fraud

Unsupervised Learning
 Learning “what normally happens”
 No output
 Clustering: Grouping similar instances
 Example applications
 Customer segmentation in CRM
 Image compression: Color quantization
 Bioinformatics: Learning motifs
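Clustering can be sketched with a minimal k-means loop, the same idea that underlies color quantization. The data, the seeded generator, and the simple initialization below are all illustrative choices:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal k-means: alternate nearest-center assignment and center update."""
    centers = X[:: len(X) // k][:k].copy()  # simple deterministic init (illustrative)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance).
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),   # tight cluster near (0, 0)
               rng.normal(3.0, 0.1, (20, 2))])  # tight cluster near (3, 3)
centers, labels = kmeans(X, k=2)
```

For color quantization, X would hold pixel RGB values and the k centers would become the reduced palette.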

Reinforcement Learning
 Learning a policy: A sequence of outputs
 No supervised output but delayed reward
 Credit assignment problem
 Game playing
 Robot in a maze
 Multiple agents, partial observability, ...

Resources: Datasets
 UCI Repository: http://www.ics.uci.edu/~mlearn/MLRepository.html
 UCI KDD Archive:
http://kdd.ics.uci.edu/summary.data.application.html
 Statlib: http://lib.stat.cmu.edu/
 Delve: http://www.cs.utoronto.ca/~delve/

Resources: Journals
 Journal of Machine Learning Research www.jmlr.org
 Machine Learning
 Neural Computation
 Neural Networks
 IEEE Transactions on Neural Networks
 IEEE Transactions on Pattern Analysis and Machine
Intelligence
 Annals of Statistics
 Journal of the American Statistical Association
 ...
Resources: Conferences
 International Conference on Machine Learning (ICML)
 European Conference on Machine Learning (ECML)
 Neural Information Processing Systems (NIPS)
 Uncertainty in Artificial Intelligence (UAI)
 Computational Learning Theory (COLT)
 International Conference on Artificial Neural Networks
(ICANN)
 International Conference on AI & Statistics (AISTATS)
 International Conference on Pattern Recognition (ICPR)
 ...
