One of the major AI applications is the development of intelligent autonomous robots. Since flexibility and adaptivity are important features of truly intelligent agents, research into learning mechanisms and the development of machine learning algorithms is one of the most important branches of AI. After motivating and introducing basic concepts of machine learning such as classification and approximation, this chapter presents basic supervised learning algorithms such as the perceptron, nearest neighbor methods, and decision tree induction. Unsupervised clustering methods and data mining software tools complete the picture of this fascinating field.
Python is a modern scripting language with very readable syntax, powerful data types, and extensive standard libraries, which can be used to this end.
Caution! This is not a proof of convergence for the perceptron learning rule. It only shows that the perceptron converges when the training dataset consists of a single example.
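The learning rule itself can be sketched in a few lines of plain Python. This is an illustrative sketch, not the book's code; the toy data, labels in {-1, +1}, and the bias encoded as a constant third feature are all assumptions made here.

```python
# Perceptron learning rule (sketch): whenever a training example (x, y) is
# misclassified, i.e. y * <w, x> <= 0, move the weight vector toward y * x.

def perceptron_train(data, n_features, max_epochs=100):
    """data: list of (x, y) pairs; x is a tuple of floats, y is -1 or +1."""
    w = [0.0] * n_features
    for _ in range(max_epochs):
        errors = 0
        for x, y in data:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:  # converged: every example is correctly classified
            break
    return w

# Linearly separable toy data; the last component 1.0 acts as a bias feature.
data = [((1.0, 2.0, 1.0), 1), ((2.0, 1.0, 1.0), 1),
        ((-1.0, -2.0, 1.0), -1), ((-2.0, -1.0, 1.0), -1)]
w = perceptron_train(data, 3)
```

Because this toy dataset is linearly separable, the loop terminates with a weight vector that classifies all four examples correctly.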
The functionals argmin and argmax determine, similarly to min and max, the minimum or maximum of a set or function. However, rather than returning the value of the maximum or minimum, they give the position, that is, the argument at which the extremum occurs.
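The distinction can be made concrete with plain Python (the function f and the set xs below are illustrative examples, not from the text):

```python
# argmin/argmax over a function f on a finite set xs: they return the
# *argument* at which the extremum occurs, not the extremal value itself.

def argmax(f, xs):
    return max(xs, key=f)

def argmin(f, xs):
    return min(xs, key=f)

xs = [-2, -1, 0, 1, 2]
f = lambda x: (x - 1) ** 2

argmin(f, xs)           # -> 1  (the position of the minimum)
min(f(x) for x in xs)   # -> 0  (the minimum value itself)
```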
The Hamming distance between two bit vectors is the number of positions in which the two vectors differ.
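In Python this is a one-line comparison over paired positions:

```python
def hamming_distance(u, v):
    """Number of positions in which the two bit vectors differ."""
    assert len(u) == len(v), "Hamming distance needs vectors of equal length"
    return sum(a != b for a, b in zip(u, v))

hamming_distance([1, 0, 1, 1], [0, 0, 1, 0])  # -> 2
```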
To keep the example simple and readable, the feature vector x was deliberately kept one-dimensional.
The three day total of snowfall is in fact an important feature for determining the hazard level. In practice, however, additional attributes are used [Bra01]. The example used here is simplified.
It would be better to use the error on the test data directly, at least when the amount of training data is sufficient to justify a separate test set.
Feature scaling is necessary or advantageous for many machine learning algorithms.
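Two common scalings can be sketched in plain Python (an illustrative sketch, not a prescribed method): min-max scaling to [0, 1] and z-score standardization. Without some such scaling, a distance-based learner like nearest neighbor is dominated by the feature with the largest numeric range.

```python
def min_max_scale(values):
    """Linearly map values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def standardize(values):
    """Shift to mean 0 and scale to (population) standard deviation 1."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

min_max_scale([10, 20, 30])  # -> [0.0, 0.5, 1.0]
```

Each feature (column) is scaled independently, so all features end up on comparable ranges.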
The nearest neighbor algorithm is not to be confused with the nearest neighbor method for classification from Sect. 8.3.
A minimum spanning tree of a weighted, undirected graph is an acyclic, connected subgraph that contains all vertices and has the minimum possible sum of edge lengths.
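A minimum spanning tree can be built with Kruskal's algorithm, sketched below (the graph, vertex names, and edge weights are illustrative assumptions, not from the text):

```python
# Kruskal's algorithm: sort edges by weight and greedily add each edge that
# does not close a cycle, tracked with a union-find structure.

def minimum_spanning_tree(vertices, edges):
    """vertices: iterable of vertex names; edges: list of (u, v, weight)."""
    parent = {v: v for v in vertices}

    def find(v):  # union-find root lookup with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst = []
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:          # edge connects two components: no cycle
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

edges = [("a", "b", 1), ("b", "c", 2), ("a", "c", 3), ("c", "d", 1)]
mst = minimum_spanning_tree("abcd", edges)
# keeps a-b, c-d, and b-c (total weight 4); a-c would close a cycle
```

For n vertices the resulting tree always has exactly n - 1 edges.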
- Machine Learning and Data Mining
- Springer International Publishing
- Chapter 8