Machine Learning by Stanford University, taught by Andrew Ng.
The code is written in Octave.
You may use either MATLAB or Octave (>= 3.8.0).
Solutions to the exercises of the course. Short illustrative Octave sketches for several of these routines follow the list.
- Linear regression with one and with multiple variables
- Computing the cost function
- Gradient descent
- Feature normalization
- Visualisation
- Compute the cost and gradient for logistic regression
- Apply regularization to the cost and gradient for logistic regression
- Neural networks
- Cost and gradient for a neural network
- Multi-class classification using the one-vs-all method
- Apply backpropagation
- Add a regularization term
- Regularized linear regression cost function
- Generate a learning curve
- Generate a cross-validation curve
- Work with SVMs
- Gaussian kernel for SVM
- Select the SVM parameters to use for dataset 3
- Work with the K-means algorithm
- Perform principal component analysis
- Project a data set into a lower-dimensional space
- Recover an approximation of the original data from the projection
- Find closest centroids (used in K-means)
- Compute centroid means (used in K-means)
- Initialization for K-means centroids
- Work with anomaly detection and recommender systems
- Estimate the parameters of a Gaussian distribution with a diagonal covariance matrix
- Find a threshold for anomaly detection
- Implement the cost function for collaborative filtering
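
A minimal sketch of the linear regression cost function and batch gradient descent, assuming the design matrix `X` already contains the bias column of ones. The function names follow the exercise files, but the bodies here are only illustrative; in the exercises each function lives in its own `.m` file.

```matlab
% Linear regression cost: J(theta) = 1/(2m) * sum((X*theta - y).^2)
function J = computeCost(X, y, theta)
  m = length(y);
  J = sum((X * theta - y) .^ 2) / (2 * m);
end

% Batch gradient descent: repeatedly step against the gradient of J.
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
  m = length(y);
  J_history = zeros(num_iters, 1);
  for iter = 1:num_iters
    theta = theta - (alpha / m) * (X' * (X * theta - y));
    J_history(iter) = computeCost(X, y, theta);  % track convergence
  end
end
```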
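A sketch of feature normalization (zero mean, unit standard deviation per column); it relies on automatic broadcasting, which needs Octave >= 3.6 or MATLAB R2016b+.

```matlab
% Scale each feature to zero mean and unit standard deviation.
% mu and sigma are returned so the same scaling can be applied to new data.
function [X_norm, mu, sigma] = featureNormalize(X)
  mu = mean(X);
  sigma = std(X);
  X_norm = (X - mu) ./ sigma;
end
```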
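A sketch of the regularized logistic regression cost and gradient; as in the exercise, the bias parameter `theta(1)` is not regularized.

```matlab
% Logistic (sigmoid) function.
function g = sigmoid(z)
  g = 1 ./ (1 + exp(-z));
end

% Regularized cost and gradient for logistic regression.
function [J, grad] = costFunctionReg(theta, X, y, lambda)
  m = length(y);
  h = sigmoid(X * theta);
  J = (-y' * log(h) - (1 - y)' * log(1 - h)) / m ...
      + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);   % skip the bias term
  grad = (X' * (h - y)) / m;
  grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```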
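For the neural network items, a sketch of the sigmoid gradient used by backpropagation and of a one-vs-all prediction step; the full forward/backward pass is omitted, and `predictOneVsAll` assumes `X` already contains the bias column.

```matlab
% Derivative of the sigmoid, needed when backpropagating errors.
function g = sigmoidGradient(z)
  s = 1 ./ (1 + exp(-z));
  g = s .* (1 - s);
end

% One-vs-all prediction: choose the class whose classifier scores highest.
% all_theta holds one row of logistic regression parameters per class.
function p = predictOneVsAll(all_theta, X)
  scores = X * all_theta';        % m x num_labels matrix of scores
  [~, p] = max(scores, [], 2);    % best class index per example
end
```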
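A sketch of the Gaussian (RBF) kernel used as a similarity function in the SVM exercises.

```matlab
% Gaussian (RBF) kernel: similarity between two feature vectors.
function sim = gaussianKernel(x1, x2, sigma)
  x1 = x1(:); x2 = x2(:);                              % force column vectors
  sim = exp(-sum((x1 - x2) .^ 2) / (2 * sigma ^ 2));
end
```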
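Sketches of the two K-means steps (cluster assignment and centroid update); empty clusters are not handled here.

```matlab
% Assignment step: index of the nearest centroid for every example.
function idx = findClosestCentroids(X, centroids)
  m = size(X, 1);
  idx = zeros(m, 1);
  for i = 1:m
    dists = sum((centroids - X(i, :)) .^ 2, 2);  % squared distance to each centroid
    [~, idx(i)] = min(dists);
  end
end

% Update step: move each centroid to the mean of its assigned points.
function centroids = computeCentroids(X, idx, K)
  centroids = zeros(K, size(X, 2));
  for k = 1:K
    centroids(k, :) = mean(X(idx == k, :), 1);
  end
end
```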
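Sketches of PCA via the SVD of the covariance matrix, plus projection and approximate reconstruction; the input is assumed to be feature-normalized first.

```matlab
% Principal component analysis via the SVD of the covariance matrix.
function [U, S] = pca(X)
  m = size(X, 1);
  Sigma = (X' * X) / m;     % covariance matrix of the (normalized) data
  [U, S, ~] = svd(Sigma);   % columns of U are the principal components
end

% Project onto the top K principal components.
function Z = projectData(X, U, K)
  Z = X * U(:, 1:K);
end

% Map projected data back to an approximation in the original space.
function X_rec = recoverData(Z, U, K)
  X_rec = Z * U(:, 1:K)';
end
```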
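Sketches of the anomaly detection pieces: fitting a Gaussian with a diagonal covariance matrix and picking a threshold by F1 score on a labelled validation set (`pval` holds the model's probabilities, `yval` the 0/1 anomaly labels).

```matlab
% Per-feature mean and variance (diagonal covariance Gaussian).
function [mu, sigma2] = estimateGaussian(X)
  mu = mean(X);
  sigma2 = var(X, 1);   % normalize by m rather than m - 1
end

% Choose the threshold epsilon that maximizes F1 on the validation set.
function [bestEpsilon, bestF1] = selectThreshold(yval, pval)
  bestEpsilon = 0; bestF1 = 0;
  stepsize = (max(pval) - min(pval)) / 1000;
  for epsilon = min(pval):stepsize:max(pval)
    pred = (pval < epsilon);               % flag low-probability examples
    tp = sum((pred == 1) & (yval == 1));
    fp = sum((pred == 1) & (yval == 0));
    fn = sum((pred == 0) & (yval == 1));
    prec = tp / (tp + fp);
    rec  = tp / (tp + fn);
    F1 = 2 * prec * rec / (prec + rec);
    if F1 > bestF1
      bestF1 = F1; bestEpsilon = epsilon;
    end
  end
end
```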
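A sketch of the regularized collaborative filtering cost and its gradients, where `Y(i, j)` is the rating of movie i by user j and `R(i, j) = 1` when that rating exists.

```matlab
% Regularized collaborative filtering cost and gradients.
function [J, X_grad, Theta_grad] = cofiCostFunc(X, Theta, Y, R, lambda)
  err = (X * Theta' - Y) .* R;   % only rated entries contribute
  J = sum(err(:) .^ 2) / 2 ...
      + (lambda / 2) * (sum(Theta(:) .^ 2) + sum(X(:) .^ 2));
  X_grad = err * Theta + lambda * X;        % gradient w.r.t. movie features
  Theta_grad = err' * X + lambda * Theta;   % gradient w.r.t. user parameters
end
```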