Commit b08b1a2

Update machine-learning/03-Training and AutoML.ipynb
Co-authored-by: Luis Quintanilla <46974588+luisquintanilla@users.noreply.github.com>
1 parent 48e4596 commit b08b1a2


machine-learning/03-Training and AutoML.ipynb

Lines changed: 3 additions & 1 deletion
@@ -47,7 +47,9 @@
 "The process of finding the best configuration for your trainer is known as hyper-parameter optimization (HPO). Like the process of choosing your trainer, it involves a lot of trial and error. The built-in Automated ML (AutoML) capabilities in ML.NET simplify the HPO process.\n",
 "\n",
 "### Overfitting and Underfitting\n",
-"Overfitting and underfitting are the two most common problems we would see when training a model. UnderFitting means the selected trainer is not capable enough to fit training dataset and usually result in a high loss during training and low score/metric on test dataset. To resolve this we need either select a more powerful model, or do more feature engineering. And overfitting is just the opposite, which happens when model get overtrained and usually result in a decent low loss during training but low score on test dataset.\n",
+"Overfitting and underfitting are the two most common problems you encounter when training a model. Underfitting means the selected trainer is not capable enough to fit the training dataset, which usually results in a high loss during training and a low score/metric on the test dataset. To resolve this, you need to either select a more powerful model or perform more feature engineering. Overfitting is the opposite: it happens when the model learns the training data too well, which usually results in a low loss during training but a high loss on the test dataset.\n",
+"\n",
+"A good analogy for these concepts is studying for an exam. Let's say you knew the questions and answers ahead of time. After studying, you take the test and get a perfect score. Great news! However, when you're given the exam again with the questions rearranged and with slightly different wording, you get a lower score. That suggests you memorized the answers and didn't actually learn the concepts you were being tested on. This is an example of overfitting. Underfitting is the opposite, where the study materials you were given don't accurately represent what you're evaluated on in the exam. As a result, you resort to guessing the answers since you don't have enough knowledge to answer correctly.\n",
 "\n",
 "In the next section, we will go through two examples. The first performs regression training on a linear dataset using both a simple, linear trainer and a more advanced, non-linear trainer, illustrating the importance of selecting the __right__ trainer rather than the most __advanced__ one. The second performs regression training on a non-linear dataset using `LightGbm` with different hyper-parameters, showing the importance of hyper-parameter optimization when training a model."
 ]
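
For context, the AutoML capability this cell refers to is exposed through `mlContext.Auto()`. Below is a minimal sketch of running an AutoML regression experiment; the `HouseData` schema, the `house-data.csv` path, and the `Price` label column are hypothetical placeholders, not part of the notebook.

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.AutoML;
using Microsoft.ML.Data;

var mlContext = new MLContext(seed: 0);

// Hypothetical dataset: a CSV with one numeric feature and a label column.
IDataView trainData = mlContext.Data.LoadFromTextFile<HouseData>(
    "house-data.csv", hasHeader: true, separatorChar: ',');

// Let AutoML search trainers and hyper-parameter configurations
// for up to 60 seconds, then report the best run it found.
var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
ExperimentResult<RegressionMetrics> result =
    experiment.Execute(trainData, labelColumnName: "Price");

Console.WriteLine($"Best trainer: {result.BestRun.TrainerName}");
Console.WriteLine($"Validation R^2: {result.BestRun.ValidationMetrics.RSquared:F4}");

// Hypothetical input schema for the CSV above.
public class HouseData
{
    [LoadColumn(0)] public float Size { get; set; }
    [LoadColumn(1)] public float Price { get; set; }
}
```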
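The cell also previews a second example that trains `LightGbm` with different hyper-parameters. A minimal sketch of how such hyper-parameters are supplied through the trainer's options object, reusing the hypothetical `Price` label and assuming a prepared `Features` vector column; the specific values are illustrative, not the notebook's:

```csharp
using Microsoft.ML;
using Microsoft.ML.Trainers.LightGbm;

var mlContext = new MLContext(seed: 0);

// A deliberately small tree budget: few leaves and iterations,
// which tends to underfit a non-linear dataset.
var underfitTrainer = mlContext.Regression.Trainers.LightGbm(
    new LightGbmRegressionTrainer.Options
    {
        LabelColumnName = "Price",
        FeatureColumnName = "Features",
        NumberOfLeaves = 4,
        NumberOfIterations = 10
    });

// A much larger budget: enough capacity to fit the training data
// closely, and to overfit it if left unchecked.
var overfitTrainer = mlContext.Regression.Trainers.LightGbm(
    new LightGbmRegressionTrainer.Options
    {
        LabelColumnName = "Price",
        FeatureColumnName = "Features",
        NumberOfLeaves = 255,
        NumberOfIterations = 500
    });
```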
