# Machine Learning with Python

Calling all curious minds! Start your Machine Learning with Python coding journey today and become an expert ML engineer.

(ML-PYTHON.AP1) / ISBN: 978-1-64459-274-8

## About This Course

Machine Learning with Python is a comprehensive course that teaches the fundamentals of machine learning in Python. Whether you want to improve your coding skills or earn an upgrade at your workplace, this is the ideal start. In this course, you'll master the processes, patterns, and strategies of this user-friendly programming language. The course covers supervised learning paradigms, including classical classification algorithms and regression techniques, and shows you how to evaluate models with performance metrics. You'll also learn feature engineering for converting raw data into meaningful features, and you'll leverage the Python scikit-learn library along with other powerful tools. Practice in our Labs to solidify your understanding as you explore classification, regression, model evaluation, pipelines, and ensembles. By the end of this course, you'll be confidently writing Python ML scripts and resolving coding issues.

## Skills You’ll Get

- Understand the fundamentals of supervised machine learning algorithms and their classification
- Evaluate performance metrics to assess the efficacy of your models
- Engineer features to convert raw data into meaningful inputs for ML algorithms
- Manage system performance by creating robust pipelines
- Apply ML to various data types
- Leverage the Python scikit-learn library and other tools
- Use advanced techniques like neural networks and graphical models
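To give a flavor of the scikit-learn workflow these skills build toward, here is a minimal sketch: split a toy dataset, fit a k-nearest-neighbors classifier, and score it. This assumes scikit-learn is installed; the dataset and hyperparameters are illustrative, not part of the course materials.

```python
# A minimal supervised-learning workflow with scikit-learn:
# load a toy dataset, hold out a test set, fit a classifier,
# and score it on the held-out data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)
accuracy = knn.score(X_test, y_test)   # fraction of correct predictions
print(f"test accuracy: {accuracy:.2f}")
```

The same fit/predict/score pattern recurs across nearly every estimator covered in the course.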

Get the support you need. Enroll in our Instructor-Led Course.

### Interactive Lessons

16+ Interactive Lessons | 44+ Exercises | 95+ Quizzes | 100+ Flashcards | 100+ Glossary of terms

### Gamified TestPrep

55+ Pre-Assessment Questions | 55+ Post-Assessment Questions

### Let’s Discuss Learning

- Welcome
- Scope, Terminology, Prediction, and Data
- Putting the Machine in Machine Learning
- Examples of Learning Systems
- Evaluating Learning Systems
- A Process for Building Learning Systems
- Assumptions and Reality of Learning
- End-of-Lesson Material

### Some Technical Background

- About Our Setup
- The Need for Mathematical Language
- Our Software for Tackling Machine Learning
- Probability
- Linear Combinations, Weighted Sums, and Dot Products
- A Geometric View: Points in Space
- Notation and the Plus-One Trick
- Getting Groovy, Breaking the Straight-Jacket, and Nonlinearity
- NumPy versus “All the Maths”
- Floating-Point Issues
- EOC
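As a taste of the "Floating-Point Issues" topic above: binary floating point cannot represent 0.1 exactly, so naive equality checks on arithmetic results can mislead. A two-line illustrative demo:

```python
# Binary floating point cannot store 0.1 exactly, so small
# arithmetic errors accumulate and exact equality can fail.
import math

print(0.1 + 0.2)            # 0.30000000000000004, not 0.3
print(0.1 + 0.2 == 0.3)     # False

# The standard remedy is tolerance-based comparison:
print(math.isclose(0.1 + 0.2, 0.3))  # True
```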

### Predicting Categories: Getting Started with Classification

- Classification Tasks
- A Simple Classification Dataset
- Training and Testing: Don’t Teach to the Test
- Evaluation: Grading the Exam
- Simple Classifier #1: Nearest Neighbors, Long Distance Relationships, and Assumptions
- Simple Classifier #2: Naive Bayes, Probability, and Broken Promises
- Simplistic Evaluation of Classifiers
- EOC
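The idea behind "Simple Classifier #1" can be sketched from scratch: predict the label of the closest training point under Euclidean distance. This tiny 1-nearest-neighbor example is illustrative, not the course's own code; the data points are made up.

```python
# A from-scratch 1-nearest-neighbor classifier: each test point
# gets the label of its closest training point.
import numpy as np

def nn_predict(X_train, y_train, X_test):
    preds = []
    for x in X_test:
        # Euclidean distance from x to every training point
        dists = np.sqrt(((X_train - x) ** 2).sum(axis=1))
        preds.append(y_train[np.argmin(dists)])
    return np.array(preds)

X_train = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
y_train = np.array([0, 0, 1])
print(nn_predict(X_train, y_train,
                 np.array([[0.2, 0.1], [4.5, 5.2]])))  # -> [0 1]
```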

### Predicting Numerical Values: Getting Started with Regression

- A Simple Regression Dataset
- Nearest-Neighbors Regression and Summary Statistics
- Linear Regression and Errors
- Optimization: Picking the Best Answer
- Simple Evaluation and Comparison of Regressors
- EOC
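"Linear Regression and Errors" boils down to picking the line that minimizes squared error. A NumPy sketch of ordinary least squares, using the plus-one trick from the technical-background lesson to fold the intercept into the weight vector (the synthetic data here is purely illustrative):

```python
# Ordinary least squares with NumPy: recover slope and intercept
# from noisy synthetic data (true slope 3, true intercept 2).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(0, 0.5, size=50)

X = np.column_stack([x, np.ones_like(x)])   # plus-one trick
w, *_ = np.linalg.lstsq(X, y, rcond=None)
slope, intercept = w
print(f"slope ~ {slope:.2f}, intercept ~ {intercept:.2f}")
```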

### Evaluating and Comparing Learners

- Evaluation and Why Less Is More
- Terminology for Learning Phases
- Major Tom, There’s Something Wrong: Overfitting and Underfitting
- From Errors to Costs
- (Re)Sampling: Making More from Less
- Break-It-Down: Deconstructing Error into Bias and Variance
- Graphical Evaluation and Comparison
- Comparing Learners with Cross-Validation
- EOC

### Evaluating Classifiers

- Baseline Classifiers
- Beyond Accuracy: Metrics for Classification
- ROC Curves
- Another Take on Multiclass: One-versus-One
- Precision-Recall Curves
- Cumulative Response and Lift Curves
- More Sophisticated Evaluation of Classifiers: Take Two
- EOC
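The "Beyond Accuracy" idea can be shown in a few lines: tally a binary confusion matrix by hand, then derive accuracy, precision, and recall from it. The labels below are made up for illustration.

```python
# Hand-built binary confusion matrix and the metrics it yields.
import numpy as np

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])

tp = int(((y_pred == 1) & (y_true == 1)).sum())  # true positives
tn = int(((y_pred == 0) & (y_true == 0)).sum())  # true negatives
fp = int(((y_pred == 1) & (y_true == 0)).sum())  # false positives
fn = int(((y_pred == 0) & (y_true == 1)).sum())  # false negatives

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(tp, tn, fp, fn)                  # 3 3 1 1
print(accuracy, precision, recall)     # 0.75 0.75 0.75
```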

### Evaluating Regressors

- Baseline Regressors
- Additional Measures for Regression
- Residual Plots
- A First Look at Standardization
- Evaluating Regressors in a More Sophisticated Way: Take Two
- EOC
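A "baseline regressor" is often just a model that predicts the training mean; anything useful must beat it. A sketch with two of the standard error measures (illustrative numbers):

```python
# Mean-predicting baseline, scored with MAE and RMSE.
import numpy as np

y_true = np.array([10.0, 12.0, 9.0, 15.0, 14.0])
baseline = np.full_like(y_true, y_true.mean())   # always predict 12.0

mae = np.abs(y_true - baseline).mean()           # mean absolute error
rmse = np.sqrt(((y_true - baseline) ** 2).mean())  # root mean squared error
print(f"MAE={mae:.2f}, RMSE={rmse:.2f}")
```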

### More Classification Methods

- Revisiting Classification
- Decision Trees
- Support Vector Classifiers
- Logistic Regression
- Discriminant Analysis
- Assumptions, Biases, and Classifiers
- Comparison of Classifiers: Take Three
- EOC

### More Regression Methods

- Linear Regression in the Penalty Box: Regularization
- Support Vector Regression
- Piecewise Constant Regression
- Regression Trees
- Comparison of Regressors: Take Three
- EOC

### Manual Feature Engineering: Manipulating Data for Fun and Profit

- Feature Engineering Terminology and Motivation
- Feature Selection and Data Reduction: Taking out the Trash
- Feature Scaling
- Discretization
- Categorical Coding
- Relationships and Interactions
- Target Manipulations
- EOC
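"Categorical Coding" in miniature: one-hot encoding turns each category into its own 0/1 feature so that learners that expect numbers can use it. A hand-rolled sketch with made-up data:

```python
# One-hot encode a categorical column by hand: one 0/1 column
# per distinct category, in sorted order.
colors = ["red", "green", "blue", "green", "red"]
categories = sorted(set(colors))              # ['blue', 'green', 'red']

one_hot = [[1 if c == cat else 0 for cat in categories] for c in colors]
for c, row in zip(colors, one_hot):
    print(c, row)   # e.g. red [0, 0, 1]
```

In practice scikit-learn's preprocessing tools do this (and handle unseen categories), but the transformation itself is this simple.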

### Tuning Hyperparameters and Pipelines

- Models, Parameters, Hyperparameters
- Tuning Hyperparameters
- Down the Recursive Rabbit Hole: Nested Cross-Validation
- Pipelines
- Pipelines and Tuning Together
- EOC

### Combining Learners

- Ensembles
- Voting Ensembles
- Bagging and Random Forests
- Boosting
- Comparing the Tree-Ensemble Methods
- EOC
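The simplest ensemble above, a voting ensemble, can be sketched in a few lines: combine several models' predictions by majority vote, one decision per example. The three "models" here are just hard-coded prediction lists for illustration.

```python
# Majority-vote ensemble over per-model prediction lists.
from collections import Counter

def majority_vote(predictions):
    """predictions: list of per-model prediction lists, equal length."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*predictions)]

model_a = [1, 0, 1, 1]
model_b = [1, 1, 0, 1]
model_c = [0, 0, 1, 1]
print(majority_vote([model_a, model_b, model_c]))  # -> [1, 0, 1, 1]
```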

### Models That Engineer Features for Us

- Feature Selection
- Feature Construction with Kernels
- Principal Components Analysis: An Unsupervised Technique
- EOC
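Principal Components Analysis can be built from raw materials: center the data, take the SVD, and project onto the leading directions. A NumPy sketch on synthetic data with one deliberately redundant column:

```python
# PCA via SVD: center, decompose, and measure variance explained.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 0] * 2.0            # third column is redundant

Xc = X - X.mean(axis=0)            # PCA requires centered data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

explained = S**2 / (S**2).sum()    # variance explained per component
X_1d = Xc @ Vt[0]                  # projection onto the first component
print(explained.round(3))          # third value is ~0: data is rank 2
```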

### Feature Engineering for Domains: Domain-Specific Learning

- Working with Text
- Clustering
- Working with Images
- EOC

### Connections, Extensions, and Further Directions

- Optimization
- Linear Regression from Raw Materials
- Building Logistic Regression from Raw Materials
- SVM from Raw Materials
- Neural Networks
- Probabilistic Graphical Models
- EOC

### Appendix A: mlwpy.py Listing

## Hands-On Labs

### Some Technical Background

- Plotting a Probability Distribution Graph
- Using the zip Function
- Calculating the Sum of Squares
- Plotting a Line Graph
- Plotting a 3D Graph
- Plotting a Polynomial Graph
- Using the numpy.dot() Method
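The `numpy.dot()` lab listed above ties directly to the "Linear Combinations, Weighted Sums, and Dot Products" lesson: a dot product is just a weighted sum, the core computation of every linear model. Three equivalent spellings (example values are illustrative):

```python
# A weighted sum of features: the heart of a linear model,
# written three equivalent ways.
import numpy as np

features = np.array([2.0, 3.0, 1.0])
weights = np.array([0.5, -1.0, 4.0])

print(np.dot(features, weights))      # 2.0
print(features @ weights)             # 2.0
print((features * weights).sum())     # 2.0
```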

### Predicting Categories: Getting Started with Classification

- Displaying Histograms

### Predicting Numerical Values: Getting Started with Regression

- Defining an Outlier
- Calculating the Median Value
- Estimating the Multiple Regression Equation

### Evaluating and Comparing Learners

- Constructing a Swarm Plot
- Using the describe() Method
- Viewing Variance

### Evaluating Classifiers

- Creating a Confusion Matrix
- Creating an ROC Curve
- Recreating an ROC Curve
- Creating a Trendline Graph

### Evaluating Regressors

- Viewing the Standard Deviation
- Constructing a Scatterplot
- Evaluating the Prediction Error Rates

### More Classification Methods

- Evaluating a Logistic Model
- Creating a Covariance Matrix
- Using the load_digits() Function

### More Regression Methods

- Illustrating a Less Consistent Relationship
- Illustrating a Piecewise Constant Regression

### Manual Feature Engineering: Manipulating Data for Fun and Profit

- Manipulating the Target
- Manipulating the Input Space

### Combining Learners

- Calculating the Mean Value

### Models That Engineer Features for Us

- Displaying a Correlation Matrix
- Creating a Nonlinear Model
- Performing a Principal Component Analysis
- Using the Manifold Method

### Feature Engineering for Domains: Domain-Specific Learning

- Encoding Text

### Connections, Extensions, and Further Directions

- Building an Estimated Simple Linear Regression Equation

## Any questions?

Check out the FAQs

Still have unanswered questions and need to get in touch?

Contact Us Now

This is a beginner-friendly course: you can start with very basic or no prior knowledge of this coding language and gradually build up your logic-building skills as you progress. However, it will be much easier if you have some basic knowledge of programming concepts and a bit of prior coding experience with Python.

This ML training course will transform you from a curious onlooker to a machine learning expert. There’s a lot you’ll be learning:

- Supervised ML algorithms: classification (spam filters) and regression (predicting prices)
- Building models, assessing their performance, and delivering results
- Mastering feature engineering
- Exploring data diversity
- Leveraging scikit-learn and other Python tools

You’ll learn algorithms in these two categories:

- Classification: Support Vector Machines (SVM), Random Forests, K-Nearest Neighbors (KNN), and Logistic Regression
- Regression: Linear Regression and Decision Tree Regression

No, this Python course focuses primarily on the core fundamentals of ML concepts and algorithms.

There isn’t any one particular IDE recommended for this course. Some of the most popular IDE options for Python include Jupyter Notebook, PyCharm, and Visual Studio Code (VS Code).

This course is perfect for anyone who wants to become an expert ML engineer, or to solidify their understanding of Python fundamentals and machine learning algorithms.