Grokking Machine Learning 1st Edition by Luis Serrano – Ebook PDF Instant Download/Delivery: 1638350200, 9781638350200
Product details:
ISBN 10: 1638350200
ISBN 13: 9781638350200
Author: Luis Serrano
Discover valuable machine learning techniques you can understand and apply using just high-school math. In Grokking Machine Learning you will learn:
- Supervised algorithms for classifying and splitting data
- Methods for cleaning and simplifying data
- Machine learning packages and tools
- Neural networks and ensemble methods for complex datasets

Grokking Machine Learning teaches you how to apply ML to your projects using only standard Python code and high-school-level math. No specialist knowledge is required to tackle the hands-on exercises, which use Python and readily available machine learning tools. Packed with easy-to-follow Python-based exercises and mini-projects, this book sets you on the path to becoming a machine learning expert. Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.

About the technology
Put simply, machine learning is a set of data-analysis techniques based on algorithms that deliver better results as you give them more data. ML powers many cutting-edge technologies, such as recommendation systems, facial recognition software, smart speakers, and even self-driving cars. This unique book introduces the core concepts of machine learning using relatable examples, engaging exercises, and crisp illustrations.

About the book
Grokking Machine Learning presents machine learning algorithms and techniques in a way that anyone can understand. It skips confusing academic jargon and offers clear explanations that require only basic algebra. As you go, you'll build interesting projects with Python, including models for spam detection and image recognition.
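To give a flavor of the "standard Python code and high-school math" approach described above, here is a minimal sketch (our own illustration, not taken from the book) of the kind of model covered in the linear regression chapter: fitting a line y = m*x + b to data points with gradient descent, using only plain Python.

```python
def fit_line(points, learning_rate=0.05, epochs=1000):
    """Fit slope m and intercept b to (x, y) pairs by gradient descent."""
    m, b = 0.0, 0.0
    n = len(points)
    for _ in range(epochs):
        # Gradients of the mean squared error with respect to m and b
        grad_m = sum(2 * (m * x + b - y) * x for x, y in points) / n
        grad_b = sum(2 * (m * x + b - y) for x, y in points) / n
        m -= learning_rate * grad_m
        b -= learning_rate * grad_b
    return m, b

# Points that lie exactly on the line y = 2x + 1
points = [(1, 3), (2, 5), (3, 7)]
m, b = fit_line(points)
print(round(m, 2), round(b, 2))  # → 2.0 1.0
```

The function names and parameters here are our own; the book develops the same idea step by step with its "remember-formulate-predict" framing and then moves on to library tools such as Turi Create and Scikit-Learn.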
Grokking Machine Learning 1st Edition Table of Contents:
1 What is machine learning? It is common sense, except done by a computer
Do I need a heavy math and coding background to understand machine learning?
OK, so what exactly is machine learning?
How do we get machines to make decisions with data? The remember-formulate-predict framework
Summary
2 Types of machine learning
What is the difference between labeled and unlabeled data?
Supervised learning: The branch of machine learning that works with labeled data
Unsupervised learning: The branch of machine learning that works with unlabeled data
What is reinforcement learning?
Summary
Exercises
3 Drawing a line close to our points: Linear regression
The problem: We need to predict the price of a house
The solution: Building a regression model for housing prices
How to get the computer to draw this line: The linear regression algorithm
How do we measure our results? The error function
Real-life application: Using Turi Create to predict housing prices in India
What if the data is not in a line? Polynomial regression
Parameters and hyperparameters
Applications of regression
Summary
Exercises
4 Optimizing the training process: Underfitting, overfitting, testing, and regularization
An example of underfitting and overfitting using polynomial regression
How do we get the computer to pick the right model? By testing
Where did we break the golden rule, and how do we fix it? The validation set
A numerical way to decide how complex our model should be: The model complexity graph
Another alternative to avoiding overfitting: Regularization
Polynomial regression, testing, and regularization with Turi Create
Summary
Exercises
5 Using lines to split our points: The perceptron algorithm
The problem: We are on an alien planet, and we don’t know their language!
How do we determine whether a classifier is good or bad? The error function
How to find a good classifier? The perceptron algorithm
Coding the perceptron algorithm
Applications of the perceptron algorithm
Summary
Exercises
6 A continuous approach to splitting points: Logistic classifiers
Logistic classifiers: A continuous version of perceptron classifiers
How to find a good logistic classifier? The logistic regression algorithm
Coding the logistic regression algorithm
Real-life application: Classifying IMDB reviews with Turi Create
Classifying into multiple classes: The softmax function
Summary
Exercises
7 How do you measure classification models? Accuracy and its friends
Accuracy: How often is my model correct?
How to fix the accuracy problem? Defining different types of errors and how to measure them
A useful tool to evaluate our model: The receiver operating characteristic (ROC) curve
Summary
Exercises
8 Using probability to its maximum: The naive Bayes model
Sick or healthy? A story with Bayes’ theorem as the hero
Use case: Spam-detection model
Building a spam-detection model with real data
Summary
Exercises
9 Splitting data by asking questions: Decision trees
The problem: We need to recommend apps to users according to what they are likely to download
The solution: Building an app-recommendation system
Beyond questions like yes/no
The graphical boundary of decision trees
Real-life application: Modeling student admissions with Scikit-Learn
Decision trees for regression
Applications
Summary
Exercises
10 Combining building blocks to gain more power: Neural networks
Neural networks with an example: A more complicated alien planet
Training neural networks
Coding neural networks in Keras
Neural networks for regression
Other architectures for more complex datasets
Summary
Exercises
11 Finding boundaries with style: Support vector machines and the kernel method
Using a new error function to build better classifiers
Coding support vector machines in Scikit-Learn
Training SVMs with nonlinear boundaries: The kernel method
Summary
Exercises
12 Combining models to maximize results: Ensemble learning
With a little help from our friends
Bagging: Joining some weak learners randomly to build a strong learner
AdaBoost: Joining weak learners in a clever way to build a strong learner
Gradient boosting: Using decision trees to build strong learners
XGBoost: An extreme way to do gradient boosting
Applications of ensemble methods
Summary
Exercises
13 Putting it all in practice: A real-life example of data engineering and machine learning
The Titanic dataset
Cleaning up our dataset: Missing values and how to deal with them
Feature engineering: Transforming the features in our dataset before training the models
Training our models
Tuning the hyperparameters to find the best model: Grid search
Using K-fold cross-validation to reuse our data as training and validation
Summary
People also search for Grokking Machine Learning 1st Edition:
machine learning for beginners
grokking artificial intelligence
ml visual learning books
introduction to machine learning textbooks
neural network beginner guides
Tags: Grokking, Machine Learning, Luis Serrano