
Commit aa4594f
Merge branch 'main' into timmens
2 parents f5c7512 + a180dc3

9 files changed: +288 −6,592 lines

src/scipy_dev/notebooks/06_scaling-Copy1.ipynb (−6,228)

This file was deleted.

src/scipy_dev/notebooks/06_scaling.ipynb (+1 −1)

@@ -31,7 +31,7 @@
    "id": "6ef94deb",
    "metadata": {},
    "source": [
-    "## Get a badly scaled problem from a benchmar set"
+    "## Get a badly scaled problem from a benchmark set"
    ]
   },
   {

src/scipy_dev/notebooks/07_automatic_differentiation.ipynb (+18 −3)

@@ -9,6 +9,8 @@
    "\n",
    "In this exercise you will use automatic differentiation in JAX and estimagic to solve the previous problem.\n",
    "\n",
+    "> Note. Because JAX cannot (yet) be installed on Windows there will be extra exercises for Windows users.\n",
+    "\n",
    "## Resources\n",
    "\n",
    "- https://jax.readthedocs.io/en/latest/jax.numpy.html\n",
@@ -36,7 +38,13 @@
    "source": [
    "## Task 1: Switch to JAX\n",
    "\n",
-    "- Use the code from exercise 2, task 2, and convert the criterion function and the start parameters to JAX. Look at the [`jax.numpy` documentation](https://jax.readthedocs.io/en/latest/jax.numpy.html) and slides if you have any questions."
+    "- Use the code from exercise 2, task 2, and convert the criterion function and the start parameters to JAX. Look at the [`jax.numpy` documentation](https://jax.readthedocs.io/en/latest/jax.numpy.html) and slides if you have any questions.\n",
+    "\n",
+    "---\n",
+    "\n",
+    "## Task 1 (Windows): Copy functions\n",
+    "\n",
+    "- Copy the criterion function and start parameters from exercise 2, task 2, here."
    ]
   },
   {
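Task 1 asks for a numpy-to-JAX conversion of the exercise-2 criterion. As a minimal sketch (the real criterion is defined in exercise 2, so a simple sum-of-squares stand-in is assumed here), the conversion is mostly a matter of swapping `numpy` for `jax.numpy` and making the start parameters a JAX array:

```python
# Hypothetical stand-in for the exercise-2 criterion: a sum of squares,
# written with jax.numpy instead of numpy.
import jax
import jax.numpy as jnp

# match numpy's default float64 precision (JAX defaults to float32)
jax.config.update("jax_enable_x64", True)

def criterion(x):
    # jnp mirrors the numpy API, so the body is unchanged
    return jnp.sum(x ** 2)

# start parameters as a JAX array instead of a numpy array
x0 = jnp.arange(5, dtype=jnp.float64)
print(criterion(x0))  # 30.0
```

Because `jax.numpy` mirrors the `numpy` API, most exercise code converts by changing the import and the array constructors.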
@@ -55,7 +63,14 @@
    "## Task 2: Gradient\n",
    "\n",
    "- Compute the gradient of the criterion (the whole function). Look at the [`autodiff_cookbook` documentation](https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html) and slides if you have any questions.\n",
-    "- Measure the runtime of a jitted and unjitted version of the gradient (using `%timeit`)."
+    "- Measure the runtime of a jitted and unjitted version of the gradient (using `%timeit`).\n",
+    "\n",
+    "---\n",
+    "\n",
+    "## Task 2 (Windows): Gradient\n",
+    "\n",
+    "- Compute the gradient of the criterion (the whole function) analytically.\n",
+    "- Implement the analytical gradient."
    ]
   },
   {
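Task 2 combines `jax.grad` with `jax.jit`. A minimal sketch, again assuming a hypothetical sum-of-squares criterion (in a notebook you would use `%timeit`; `timeit.timeit` gives the same comparison in a script):

```python
# Gradient of a hypothetical sum-of-squares criterion, jitted vs. unjitted.
import timeit
import jax
import jax.numpy as jnp

def criterion(x):
    return jnp.sum(x ** 2)

gradient = jax.grad(criterion)    # reverse-mode autodiff
gradient_jit = jax.jit(gradient)  # compiled version of the same function

x0 = jnp.arange(5.0)
print(gradient(x0))  # [0. 2. 4. 6. 8.]  (gradient of sum(x**2) is 2x)

# trigger compilation once so it is not counted in the timing
gradient_jit(x0).block_until_ready()

# block_until_ready() makes the timing honest: JAX dispatches asynchronously
unjitted = timeit.timeit(lambda: gradient(x0).block_until_ready(), number=100)
jitted = timeit.timeit(lambda: gradient_jit(x0).block_until_ready(), number=100)
print(f"unjitted: {unjitted:.4f}s, jitted: {jitted:.4f}s")
```

The jitted version is typically much faster on repeated calls because tracing and dispatch overhead is paid once at compile time.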
@@ -71,7 +86,7 @@
    "id": "85959d9b",
    "metadata": {},
    "source": [
-    "## Task 3: Minimize\n",
+    "## Task 3 (all systems): Minimize\n",
    "\n",
    "- Use estimagic to minimize the criterion\n",
    "  - pass the gradient function you computed above to the minimize call.\n",

src/scipy_dev/notebooks/08_jaxopt.ipynb (+2)

@@ -9,6 +9,8 @@
    "\n",
    "In this exercise you will use JAXopt to solve a *batched* version of the first and second exercise.\n",
    "\n",
+    "> Note. You cannot tackle these exercises on Windows.\n",
+    "\n",
    "## Resources\n",
    "\n",
    "- [JAX documentation](https://jax.readthedocs.io/en/latest/notebooks/quickstart.html)\n",
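The "batched" idea behind this exercise can be sketched without JAXopt itself: `jax.vmap` turns a single-problem solver into one that solves a whole batch at once. This uses a hand-rolled gradient-descent solve on a hypothetical per-problem criterion (JAXopt solvers expose a `run` method that can be vmapped the same way):

```python
# Batched solving via jax.vmap: one solve per row of a batch of problems.
import jax
import jax.numpy as jnp

def criterion(x, shift):
    # hypothetical per-problem criterion with its minimum at x = shift
    return jnp.sum((x - shift) ** 2)

def solve(shift, n_steps=100, lr=0.1):
    # plain gradient descent as a stand-in for a JAXopt solver
    grad = jax.grad(criterion)
    x = jnp.zeros(3)
    for _ in range(n_steps):
        x = x - lr * grad(x, shift)
    return x

# a batch of three problems, one shift vector per row
shifts = jnp.stack([jnp.full(3, s) for s in [1.0, 2.0, 3.0]])

# vmap maps solve over the leading axis; jit compiles the whole batch
batched_solve = jax.jit(jax.vmap(solve))
print(batched_solve(shifts))  # each row is close to its shift vector
```

Because the solver is pure JAX, vectorizing and compiling it needs no loop over problems in Python; the batch dimension is handled inside XLA.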

src/scipy_dev/notebooks/solutions/01_first_optimization_with_scipy_optimize.ipynb (−8)

@@ -177,14 +177,6 @@
    "assert_array_almost_equal(b, np.arange(3))\n",
    "assert_array_almost_equal(c, np.eye(2))"
    ]
-   },
-   {
-    "cell_type": "code",
-    "execution_count": null,
-    "id": "27f71064",
-    "metadata": {},
-    "outputs": [],
-    "source": []
   }
  ],
  "metadata": {

src/scipy_dev/notebooks/solutions/07_automatic_differentiation.ipynb (+2)

@@ -9,6 +9,8 @@
    "\n",
    "In this exercise you will use automatic differentiation in JAX and estimagic to solve the previous problem.\n",
    "\n",
+    "> Note. Here you will only find the solution for Unix and Linux.\n",
+    "\n",
    "## Resources\n",
    "\n",
    "- https://jax.readthedocs.io/en/latest/jax.numpy.html\n",

src/scipy_dev/notebooks/solutions/08_jaxopt.ipynb (+2)

@@ -9,6 +9,8 @@
    "\n",
    "In this exercise you will use JAXopt to solve a *batched* version of the first and second exercise.\n",
    "\n",
+    "> Note. You cannot tackle these exercises on Windows.\n",
+    "\n",
    "## Resources\n",
    "\n",
    "- [JAX documentation](https://jax.readthedocs.io/en/latest/notebooks/quickstart.html)\n",
