|
9 | 9 | "\n",
|
10 | 10 | "In this exercise you will use automatic differentiation in JAX and estimagic to solve the previous problem.\n",
|
11 | 11 | "\n",
|
| 12 | + "> Note. Because JAX cannot (yet) be installed on Windows, there will be extra exercises for Windows users.\n", |
| 13 | + "\n", |
12 | 14 | "## Resources\n",
|
13 | 15 | "\n",
|
14 | 16 | "- https://jax.readthedocs.io/en/latest/jax.numpy.html\n",
|
|
36 | 38 | "source": [
|
37 | 39 | "## Task 1: Switch to JAX\n",
|
38 | 40 | "\n",
|
39 |
| - "- Use the code from exercise 2, task 2, and convert the criterion function and the start parameters to JAX. Look at the [`jax.numpy` documentation](https://jax.readthedocs.io/en/latest/jax.numpy.html) and slides if you have any questions." |
| 41 | + "- Use the code from exercise 2, task 2, and convert the criterion function and the start parameters to JAX. Look at the [`jax.numpy` documentation](https://jax.readthedocs.io/en/latest/jax.numpy.html) and slides if you have any questions.\n", |
| 42 | + "\n", |
| 43 | + "---\n", |
| 44 | + "\n", |
| 45 | + "## Task 1 (Windows): Copy functions\n", |
| 46 | + "\n", |
| 47 | + "- Copy the criterion function and start parameters from exercise 2, task 2, here." |
40 | 48 | ]
|
41 | 49 | },
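A minimal sketch of the conversion asked for in Task 1. The actual criterion from exercise 2, task 2 is not shown in this diff, so the sphere-style `criterion` and the `start_params` values below are hypothetical placeholders; the point is only that `jax.numpy` replaces NumPy and that 64-bit precision is enabled explicitly.

```python
import jax
import jax.numpy as jnp

# JAX uses 32-bit floats by default; enable 64-bit precision for optimization.
jax.config.update("jax_enable_x64", True)

# Hypothetical stand-in for the criterion from exercise 2, task 2
# (the real function is not shown in this diff). Only jax.numpy is used.
def criterion(params):
    return jnp.sum(params ** 2)

# Start parameters as a JAX array instead of a NumPy array.
start_params = jnp.arange(5, dtype=jnp.float64)
```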
|
42 | 50 | {
|
|
55 | 63 | "## Task 2: Gradient\n",
|
56 | 64 | "\n",
|
57 | 65 | "- Compute the gradient of the criterion (the whole function). Look at the [`autodiff_cookbook` documentation](https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html) and slides if you have any questions.\n",
|
58 |
| - "- Measure the runtime of a jitted and unjitted version of the gradient (using `%timeit`.)" |
| 66 | + "- Measure the runtime of a jitted and unjitted version of the gradient (using `%timeit`).\n", |
| 67 | + "\n", |
| 68 | + "---\n", |
| 69 | + "\n", |
| 70 | + "## Task 2 (Windows): Gradient\n", |
| 71 | + "\n", |
| 72 | + "- Compute the gradient of the criterion (the whole function) analytically.\n", |
| 73 | + "- Implement the analytical gradient." |
59 | 74 | ]
|
60 | 75 | },
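A sketch for Task 2, assuming the hypothetical `criterion` and `start_params` from the previous sketch. `jax.grad` produces the automatic gradient; the `%timeit` lines are IPython magics and are therefore shown as comments. The analytical gradient at the end belongs to the placeholder sphere criterion only, not to the real function from exercise 2.

```python
import jax
import numpy as np

# Automatic (reverse-mode) gradient of the criterion and a jitted version.
gradient = jax.grad(criterion)
gradient_jit = jax.jit(gradient)

# Call the jitted version once so it is compiled before timing it.
gradient_jit(start_params)

# In a notebook cell, compare the runtimes with the %timeit magic:
# %timeit gradient(start_params).block_until_ready()
# %timeit gradient_jit(start_params).block_until_ready()

# Windows variant: analytical gradient of the placeholder sphere criterion,
# implemented with NumPy since JAX is not available there.
def analytical_gradient(params):
    return 2 * np.asarray(params)
```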
|
61 | 76 | {
|
|
71 | 86 | "id": "85959d9b",
|
72 | 87 | "metadata": {},
|
73 | 88 | "source": [
|
74 |
| - "## Task 3: Minimize\n", |
| 89 | + "## Task 3 (all systems): Minimize\n", |
75 | 90 | "\n",
|
76 | 91 | "- Use estimagic to minimize the criterion\n",
|
77 | 92 | " - pass the gradient function you computed above to the minimize call.\n",
|
|