Commit c963a8f

Update image links with absolute path
1 parent b1cc085 commit c963a8f

File tree

1 file changed: 27 additions, 27 deletions


README.rst

Lines changed: 27 additions & 27 deletions
@@ -68,9 +68,9 @@ Supported Optimizers
 AccSGD
 ------
 
-+----------------------------------------+-----------------------------------------+
-| .. image:: docs/rastrigin_AccSGD.png | .. image:: docs/rosenbrock_AccSGD.png |
-+----------------------------------------+-----------------------------------------+
++-----------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AccSGD.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_AccSGD.png |
++-----------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -95,9 +95,9 @@ AccSGD
 AdaBound
 --------
 
-+-----------------------------------------+------------------------------------------+
-| .. image:: docs/rastrigin_AdaBound.png | .. image:: docs/rosenbrock_AdaBound.png |
-+-----------------------------------------+------------------------------------------+
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AdaBound.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_AdaBound.png |
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -128,9 +128,9 @@ upper bounds. The dynamic learning rate bounds are based on the exponential
 moving averages of the adaptive learning rates themselves, which smooth out
 unexpected large learning rates and stabilize the training of deep neural networks.
 
-+-----------------------------------------+------------------------------------------+
-| .. image:: docs/rastrigin_AdaMod.png | .. image:: docs/rosenbrock_AdaMod.png |
-+-----------------------------------------+------------------------------------------+
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AdaMod.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_AdaMod.png |
++------------------------------------------------------------------------------------------------------------+-------------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -158,9 +158,9 @@ gradient, the step size is adjusted for each parameter in such
 a way that it should have a larger step size for faster gradient changing
 parameters and a lower step size for lower gradient changing parameters.
 
-+-----------------------------------------+-------------------------------------------+
-| .. image:: docs/rastrigin_DiffGrad.png | .. image:: docs/rosenbrock_DiffGrad.png |
-+-----------------------------------------+-------------------------------------------+
++------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_DiffGrad.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_DiffGrad.png |
++------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -184,9 +184,9 @@ parameters and a lower step size for lower gradient changing parameters.
 Lamb
 ----
 
-+-------------------------------------+---------------------------------------+
-| .. image:: docs/rastrigin_Lamb.png | .. image:: docs/rosenbrock_Lamb.png |
-+-------------------------------------+---------------------------------------+
++--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_Lamb.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_Lamb.png |
++--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -211,9 +211,9 @@ Lamb
 NovoGrad
 --------
 
-+-----------------------------------------+-------------------------------------------+
-| .. image:: docs/rastrigin_NovoGrad.png | .. image:: docs/rosenbrock_NovoGrad.png |
-+-----------------------------------------+-------------------------------------------+
++------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_NovoGrad.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_NovoGrad.png |
++------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -240,9 +240,9 @@ NovoGrad
 RAdam
 -----
 
-+--------------------------------------+----------------------------------------+
-| .. image:: docs/rastrigin_RAdam.png | .. image:: docs/rosenbrock_RAdam.png |
-+--------------------------------------+----------------------------------------+
++---------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_RAdam.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_RAdam.png |
++---------------------------------------------------------------------------------------------------------+-----------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -266,9 +266,9 @@ RAdam
 SGDW
 ----
 
-+-------------------------------------+---------------------------------------+
-| .. image:: docs/rastrigin_SGDW.png | .. image:: docs/rosenbrock_SGDW.png |
-+-------------------------------------+---------------------------------------+
++--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_SGDW.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_SGDW.png |
++--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 
@@ -296,9 +296,9 @@ Yogi
 Yogi is optimization algorithm based on ADAM with more fine grained effective
 learning rate control, and has similar theoretical guarantees on convergence as ADAM.
 
-+-------------------------------------+---------------------------------------+
-| .. image:: docs/rastrigin_Yogi.png | .. image:: docs/rosenbrock_Yogi.png |
-+-------------------------------------+---------------------------------------+
++--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
+| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_Yogi.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_Yogi.png |
+--------------------------------------------------------------------------------------------------------+----------------------------------------------------------------------------------------------------------+
 
 .. code:: python
 

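The change in every hunk above is mechanical: each relative ``docs/*.png`` image target becomes an absolute ``raw.githubusercontent.com`` URL, so the images still render when the README is displayed outside the repository (for example on PyPI, which does not resolve repository-relative links). A minimal sketch of that rewrite using only the Python standard library — the function name and regex are illustrative, not part of the commit:

```python
import re

# Base URL for raw files on the master branch of the repository.
RAW_BASE = "https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/"

def absolutize_image_paths(rst_text: str) -> str:
    """Rewrite relative ``.. image:: docs/...`` targets to absolute raw URLs."""
    return re.sub(
        r"(\.\. image:: )(docs/\S+)",
        lambda m: m.group(1) + RAW_BASE + m.group(2),
        rst_text,
    )

row = "| .. image:: docs/rastrigin_AccSGD.png |"
print(absolutize_image_paths(row))
# → | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AccSGD.png |
```

Note that rewriting the text inside the grid tables also forces the column borders to widen, which is why each hunk replaces the border rows as well as the image rows.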