
Commit 860bd41

Update machine-learning/03-Training and AutoML.ipynb
Co-authored-by: Luis Quintanilla <46974588+luisquintanilla@users.noreply.github.com>
1 parent 7317d00 commit 860bd41

File tree

1 file changed: +1 -1 lines changed

machine-learning/03-Training and AutoML.ipynb

Lines changed: 1 addition & 1 deletion
@@ -51,7 +51,7 @@
 
 A good analogy for these concepts is studying for an exam. Let's say you knew the questions and answers ahead of time. After studying, you take the test and get a perfect score. Great news! However, when you're given the exam again with the questions rearranged and with slightly different wording you get a lower score. That suggests you memorized the answers and didn't actually learn the concepts you were being tested on. This is an example of overfitting. Underfitting is the opposite where the study materials you were given don't accurately represent what you're evaluated on for the exam. As a result, you resort to guessing the answers since you don't have enough knowledge to answer correctly.
 "\n",
-"In the next section, we will go through two examples. The first example performs regression training on a linear dataset using both simple, linear and more advanced, non-linear trainers. And is to illustrate the importance of selecting the __Right__ trainer instead of __Advanced__ trainer. The second example performs regression training, while on a non-linear dataset, using both `LightGbm` with difference hyper-parameters. This is to show the importance of hyper-parameter optimization during training a model."
+"In the next section, we will go through two examples. The first example trains a regression model on a linear dataset using both linear and more advanced non-linear trainers to highlight the importance of selecting the right trainer. The second example trains a regression model on a non-linear dataset using `LightGbm` with different hyper-parameters to show the importance of hyper-parameter optimization"
 ]
 },
 {
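The notebook text in this diff contrasts a simple linear trainer with more flexible non-linear ones on a linear dataset. The snippet below is a hypothetical, analogous sketch in Python/NumPy (the notebook itself is an ML.NET C# notebook, so this is not its actual code): on data with a genuinely linear relationship, a degree-1 fit generalizes well, while a much higher-degree polynomial has more capacity than the data needs and tends to chase the noise.

```python
# Hedged illustration, not the notebook's code: compare a simple model with
# an overly flexible one on data whose true relationship is linear.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y = 3.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)  # linear signal plus noise

# Interleave points into train and test splits.
x_train, y_train = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

def fit_and_score(degree):
    # Fit a polynomial of the given degree and return test-set MSE.
    p = Polynomial.fit(x_train, y_train, degree)
    return float(np.mean((p(x_test) - y_test) ** 2))

mse_linear = fit_and_score(1)    # matches the true relationship
mse_complex = fit_and_score(15)  # far more capacity than the data warrants

print(f"linear test MSE: {mse_linear:.4f}, degree-15 test MSE: {mse_complex:.4f}")
```

The same intuition carries over to the notebook's second example: a flexible trainer such as `LightGbm` only helps if its hyper-parameters are tuned to the dataset, which is what the notebook's hyper-parameter optimization section demonstrates.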
