Artificial Intelligence #3: kNN & Bayes Classification Methods

*The k-NN algorithm is among the simplest of all machine learning algorithms. The neighbors are taken from a set of objects for which the class is known; this set can be thought of as the training set for the algorithm, though no explicit training step is required. by Sobhan N.*

What you'll learn

- Use the k-Nearest Neighbor classification method to classify datasets.
- Learn the main concepts behind the k-Nearest Neighbor classification method.
- Write your own implementation of the k-Nearest Neighbor classification method.
- Use the k-Nearest Neighbor classification method to classify the Iris dataset.
- Use the Naive Bayes classification method to classify datasets.
- Learn the main concepts behind the Naive Bayes classification method.
- Write your own implementation of the Naive Bayes classification method.
- Use the Naive Bayes classification method to classify the Pima Indians Diabetes dataset.
- Use the Naive Bayes classification method to obtain the probability of being male or female based on height, weight, and foot size.
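The last objective is the classic worked example for Gaussian Naive Bayes: for each class, estimate each feature's mean and variance, then pick the class with the highest prior times the product of the per-feature Gaussian likelihoods. A minimal from-scratch sketch (with made-up illustrative measurements, not the course's dataset) could look like:

```python
import math
from collections import defaultdict

def gaussian_pdf(x, mean, var):
    """Likelihood of x under a normal distribution with the given mean/variance."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_naive_bayes(samples, labels):
    """Estimate a class prior and per-feature (mean, variance) for each class."""
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)
    model = {}
    for y, rows in by_class.items():
        stats = []
        for col in zip(*rows):  # one column per feature
            mean = sum(col) / len(col)
            var = sum((v - mean) ** 2 for v in col) / (len(col) - 1)
            stats.append((mean, var))
        model[y] = (len(rows) / len(samples), stats)
    return model

def nb_predict(model, x):
    """Pick the class maximizing prior * product of feature likelihoods."""
    best, best_score = None, -1.0
    for y, (prior, stats) in model.items():
        score = prior
        for xi, (mean, var) in zip(x, stats):
            score *= gaussian_pdf(xi, mean, var)
        if score > best_score:
            best, best_score = y, score
    return best

# Toy (height cm, weight kg, foot size cm) measurements, purely illustrative.
samples = [(182, 82, 30), (178, 78, 29), (185, 90, 31),
           (162, 55, 23), (158, 50, 22), (168, 60, 24)]
labels = ["male", "male", "male", "female", "female", "female"]
model = fit_naive_bayes(samples, labels)
```

With this toy data, `nb_predict(model, (160, 52, 23))` returns `"female"`. A production implementation would work with log-probabilities to avoid numeric underflow when there are many features.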

Description

In this course you will learn the k-Nearest Neighbors and Naive Bayes classification methods.

In pattern recognition, the k-nearest neighbors algorithm (k-NN) is a non-parametric method used for classification and regression.

**k-NN is a type of instance-based learning, or lazy learning, where the function is only approximated locally and all computation is deferred until classification. The k-NN algorithm is among the simplest of all machine learning algorithms.**
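The whole algorithm fits in a few lines of Python. The following is a minimal sketch (assuming numeric feature vectors and Euclidean distance, not the course's own code): measure the distance from the query to every training point, keep the k closest, and take a majority vote over their labels.

```python
import math
from collections import Counter

def knn_classify(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Euclidean distance from the query to every training point.
    distances = [
        (math.dist(query, p), label)
        for p, label in zip(train_points, train_labels)
    ]
    # Sort by distance and keep the k closest neighbors.
    neighbors = sorted(distances)[:k]
    # Majority vote among the neighbors' labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]
```

For example, `knn_classify([(1, 1), (1, 2), (4, 4), (5, 4)], ["a", "a", "b", "b"], (1.5, 1.5))` returns `"a"`, since two of the three nearest points carry that label.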

For classification, a useful technique is to assign weights to the contributions of the neighbors, so that nearer neighbors contribute more to the vote than more distant ones.
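One common weighting scheme (an illustrative choice, not the only one) gives each of the k neighbors a vote proportional to 1/distance, so a very close neighbor can outvote a larger group of far-away ones:

```python
import math
from collections import defaultdict

def weighted_knn_classify(train_points, train_labels, query, k=3):
    """Distance-weighted k-NN: each neighbor votes with weight 1/distance."""
    distances = sorted(
        (math.dist(query, p), label)
        for p, label in zip(train_points, train_labels)
    )
    scores = defaultdict(float)
    for dist, label in distances[:k]:
        # A training point lying exactly on the query decides the vote outright.
        if dist == 0:
            return label
        scores[label] += 1.0 / dist  # nearer neighbors contribute more
    return max(scores, key=scores.get)
```

With `train = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)]` and `labels = ["a", "b", "b"]`, the query `(0.4, 0.0)` with `k=3` is classified `"a"`: the single nearby `"a"` point outweighs the two more distant `"b"` points, whereas an unweighted majority vote would have returned `"b"`.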

The neighbors are taken from a set of objects for which the class is known (for k-NN classification). This set can be thought of as the training set for the algorithm, though no explicit training step is required.
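The absence of an explicit training step is easy to see in a sketch of a scikit-learn-style classifier class (a hypothetical minimal implementation, not the course's code): `fit` merely stores the training set, and all computation is deferred to `predict`.

```python
import math
from collections import Counter

class KNNClassifier:
    """A lazy learner: `fit` only stores the data; the work happens in `predict`."""

    def __init__(self, k=3):
        self.k = k

    def fit(self, points, labels):
        # No model is built here: the stored training set *is* the "model".
        self.points = list(points)
        self.labels = list(labels)
        return self

    def predict(self, query):
        # All computation happens at prediction time.
        neighbors = sorted(
            (math.dist(query, p), label)
            for p, label in zip(self.points, self.labels)
        )[:self.k]
        return Counter(label for _, label in neighbors).most_common(1)[0][0]
```

The trade-off of this laziness is that every prediction scans the whole training set, which is why practical k-NN libraries add index structures such as k-d trees or ball trees.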
