However, if I try a simple model like y = mx + c, it gives some weights but never gets the y-intercept right; it is always close to zero. I tried playing around with the parameters and increasing the amount of data, but it still gives the same fit with no y-intercept. It seems there is a bug in the gradient descent method.
Here is the code I tried:
<script src="https://rawgit.com/chen0040/js-regression/master/build/jsregression.min.js" type="application/javascript"></script>
<script>
// === training data generated from y = 2.0 + 5.0 * x + 2.0 * x^2 === //
/*
var data = [];
for(var x = 1.0; x < 100.0; x += 1.0) {
    var y = 2.0 + 5.0 * x + 2.0 * x * x + Math.random() * 1.0;
    data.push([x, x * x, y]); // Note that the last column should be y the output
}
*/

var data = [];
for(var x = 1.0; x < 10000.0; x += 1.0) {
    var y = 6.0 + 5.0 * x * 0.1 + Math.random() * 1.0;
    data.push([x * 0.1, y]); // Note that the last column should be y the output
}

// === Create the linear regression === //
var regression = new jsregression.LinearRegression({
    alpha: 0.00001,
    // iterations: 30000,
    lambda: 0.0
});
// can also use default configuration: var regression = new jsregression.LinearRegression();

// === Train the linear regression === //
var model = regression.fit(data);

// === Print the trained model === //
console.log(model);

// === Testing the trained linear regression === //
/*
var testingData = [];
for(var x = 1.0; x < 100.0; x += 1.0) {
    var actual_y = 2.0 + 5.0 * x + 2.0 * x * x + Math.random() * 1.0;
    var predicted_y = regression.transform([x, x * x]);
    console.log("actual: " + actual_y + " predicted: " + predicted_y);
}
*/
</script>
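For comparison, below is a minimal sketch (not part of the original report and independent of js-regression) that computes the closed-form least-squares fit for the same single-feature synthetic data. Since Math.random() adds noise with mean 0.5, the expected fit is roughly slope m ≈ 5.0 and intercept c ≈ 6.5, which gives a reference value the gradient descent result can be checked against.

```javascript
// Closed-form simple linear regression (ordinary least squares) on the
// same synthetic data as above. Reference sketch only; it does not use
// js-regression.
var data = [];
for (var x = 1.0; x < 10000.0; x += 1.0) {
    var y = 6.0 + 5.0 * x * 0.1 + Math.random() * 1.0;
    data.push([x * 0.1, y]);
}

var n = data.length;
var sumX = 0.0, sumY = 0.0, sumXY = 0.0, sumXX = 0.0;
for (var i = 0; i < n; i++) {
    var xi = data[i][0], yi = data[i][1];
    sumX += xi;
    sumY += yi;
    sumXY += xi * yi;
    sumXX += xi * xi;
}

// slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)
var m = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
var c = (sumY - m * sumX) / n;

console.log("slope m = " + m + ", intercept c = " + c); // expect m ≈ 5.0, c ≈ 6.5
```

One general observation: in squared-error gradient descent the gradient with respect to the intercept is not scaled by x, so with a very small learning rate the intercept can need far more iterations to move away from zero than the slope does.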