Optimize (minimize) the objective function returned as the first value of `fg`, where the
second value contains the gradient, starting from a point `x` and using the given algorithm.
Returns the final point `x`, the corresponding function value `f` and gradient `g`, the
total number of calls to `fg`, and the history of the gradient norm across the different
iterations.
The algorithm is run until either `hasconverged(x, f, g, norm(g))` returns `true` or
`shouldstop(x, f, g, numfg, numiter, time)` returns `true`. If the latter happens before
the former, this is considered a failure to converge, and a warning is issued.
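For concreteness, user-supplied predicates with the documented signatures might look like the following sketch (the tolerance, iteration budget, and time limit are illustrative values, not the package defaults):

```julia
# Illustrative convergence predicate: stop once the gradient norm is small enough.
hasconverged(x, f, g, normg) = normg < 1e-8

# Illustrative stopping predicate: give up after a fixed iteration budget
# or after a wall-clock limit (in seconds).
shouldstop(x, f, g, numfg, numiter, time) = numiter >= 1000 || time > 60.0
```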
The keyword arguments are:
- `precondition::Function`: A function that takes the current point `x` and the gradient `g`
  and returns a preconditioned gradient. By default, the identity is used.
- `finalize!::Function`: A function that takes the final point `x`, the function value `f`,
  the gradient `g`, and the iteration number, and returns possibly modified values for
  `x`, `f` and `g`. By default, the identity is used.
  It is the user's responsibility to ensure that the modified values do not lead to
  inconsistencies within the optimization algorithm.
- `hasconverged::Function`: A function that takes the current point `x`, the function value `f`,
  the gradient `g`, and the norm of the gradient, and returns a boolean indicating whether
  the optimization has converged. By default, the norm of the gradient is compared to the
  tolerance `gradtol` as encoded in the algorithm instance.
- `shouldstop::Function`: A function that takes the current point `x`, the function value `f`,
  the gradient `g`, the number of calls to `fg`, the iteration number, and the time spent
  so far, and returns a boolean indicating whether the optimization should stop. By default,
  the number of iterations is compared to the maximum number of iterations as encoded in the
  algorithm instance.
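Putting the pieces together, the control flow described above can be sketched as a plain gradient-descent loop. This is an illustrative reimplementation, not the package's internal code; the step size `eta`, the default predicates, and the toy quadratic are assumptions made for the example:

```julia
using LinearAlgebra: norm

function sketch_optimize(fg, x;
                         eta = 0.1,
                         precondition = (x, g) -> g,                  # identity by default
                         finalize! = (x, f, g, numiter) -> (x, f, g), # identity by default
                         hasconverged = (x, f, g, normg) -> normg < 1e-8,
                         shouldstop = (x, f, g, numfg, numiter, t) -> numiter >= 1000)
    t0 = time()
    f, g = fg(x)
    numfg, numiter = 1, 0
    history = [norm(g)]
    while !hasconverged(x, f, g, norm(g))
        if shouldstop(x, f, g, numfg, numiter, time() - t0)
            @warn "not converged after $numiter iterations"
            break
        end
        x = x - eta * precondition(x, g)   # step along the (preconditioned) gradient
        f, g = fg(x)
        numfg += 1
        numiter += 1
        x, f, g = finalize!(x, f, g, numiter)
        push!(history, norm(g))
    end
    return x, f, g, numfg, history
end

# Toy quadratic: f(x) = |x|^2 / 2, whose gradient is x itself.
fg(x) = (sum(abs2, x) / 2, x)
x, f, g, numfg, history = sketch_optimize(fg, [1.0, -2.0])
```

Here the loop contracts `x` by a factor `1 - eta` per iteration, so `hasconverged` fires well before the default iteration budget of `shouldstop`.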
Check the README of this package for further details on creating an algorithm instance,
as well as for the meaning of the remaining keyword arguments and their default values.
!!! warning

    The default values of `hasconverged` and `shouldstop` are provided to ensure continuity
    with the previous versions of this package. However, this behaviour might change in the
    future.
See also [`GradientDescent`](@ref), [`ConjugateGradient`](@ref), [`LBFGS`](@ref).
"""
function optimize end
include("linesearches.jl")
include("terminate.jl")
include("gd.jl")
include("cg.jl")
include("lbfgs.jl")
Test the compatibility between the computation of the gradient, the retraction a
It is up to the user to check that the values in `dfs1` and `dfs2` match up to expected precision, by inspecting the numerical values or plotting them. If these values don't match, the linesearch in `optimize` cannot be expected to work.
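The kind of check described here can be sketched in plain Julia: compare the directional derivatives predicted by the gradient (`dfs1`-style values) against finite differences of the function along the retraction (`dfs2`-style values). Everything below (the toy quadratic `fg`, the vector-space retraction, the step sizes) is an illustrative assumption, not the package's actual test routine:

```julia
using LinearAlgebra: dot

fg(x) = (sum(abs2, x) / 2, x)              # toy function and its gradient
retract(x, d, alpha) = (x + alpha * d, d)  # vector-space retraction: new point and tangent

x = [1.0, -2.0]
d = [0.5, 0.5]                             # search direction
alphas = [1e-2, 1e-3, 1e-4]

f0, g0 = fg(x)
# Gradient-based directional derivative (constant here, one entry per step size):
dfs1 = [dot(g0, d) for _ in alphas]
# Finite-difference approximation along the retraction:
dfs2 = [(fg(retract(x, d, a)[1])[1] - f0) / a for a in alphas]
```

If `dfs1` and `dfs2` agree to the expected order in the step size, the gradient, retraction and inner product are mutually consistent.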