.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "tutorials/plot_gradient.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_tutorials_plot_gradient.py>`
        to download the full example code, or to run this example in your
        browser via Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_tutorials_plot_gradient.py:


Computing gradients
===================

In this tutorial we will see how to automatically compute gradients of
output quantities with respect to input values.

.. GENERATED FROM PYTHON SOURCE LINES 15-28

.. code-block:: Python

    import matplotlib.pyplot as plt

    import nannos as nn

    nn.set_backend("torch")
    # nn.set_backend("autograd")
    from nannos import grad

    bk = nn.backend

.. GENERATED FROM PYTHON SOURCE LINES 30-32

Let's define a function that returns the reflection coefficient of a
metasurface:

.. GENERATED FROM PYTHON SOURCE LINES 32-52

.. code-block:: Python

    def f(thickness):
        lattice = nn.Lattice(([1, 0], [0, 1]))
        sup = lattice.Layer("Superstrate")
        sub = lattice.Layer("Substrate", epsilon=2)
        ms = lattice.Layer("ms", thickness=thickness, epsilon=6)
        sim = nn.Simulation(
            [sup, ms, sub],
            nn.PlaneWave(1.5),
            nh=1,
        )
        R, T = sim.diffraction_efficiencies()
        return R


    x = bk.array([0.3], dtype=bk.float64)
    print(f(x))

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    tensor(0.0317, dtype=torch.float64)

.. GENERATED FROM PYTHON SOURCE LINES 53-55

First, we compute a finite-difference approximation of the gradient:

.. GENERATED FROM PYTHON SOURCE LINES 55-67

.. code-block:: Python

    def first_finite_differences(f, x):
        eps = 1e-4
        return nn.backend.array(
            [
                (f(x + eps * v) - f(x - eps * v)) / (2 * eps)
                for v in nn.backend.eye(len(x))
            ],
        )


    df_fd = first_finite_differences(f, x)
    print(df_fd)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    tensor([-0.7177], dtype=torch.float64)
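The helper above builds the gradient one coordinate at a time from the central
difference ``(f(x + eps * v) - f(x - eps * v)) / (2 * eps)``, which costs two
simulations per input variable. As a sanity check of the scheme itself, here is
a standalone NumPy sketch (independent of the nannos backend; the quadratic
test function is purely illustrative) applied to a function whose gradient is
known exactly:

```python
import numpy as np


def first_finite_differences(f, x, eps=1e-4):
    # Central differences: perturb each coordinate by +/- eps and take
    # the symmetric quotient, accurate to O(eps**2).
    return np.array(
        [(f(x + eps * v) - f(x - eps * v)) / (2 * eps) for v in np.eye(len(x))]
    )


# The exact gradient of f(x) = sum(x**2) is 2*x.
x = np.array([0.3, -1.2, 2.0])
df = first_finite_differences(lambda x: np.sum(x**2), x)
print(df)  # close to [0.6, -2.4, 4.0]
```

Central differences are exact for quadratics up to rounding error, which is why
this recovers ``2 * x`` to near machine precision; for a general function the
error scales as ``eps**2``.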
.. GENERATED FROM PYTHON SOURCE LINES 68-69

Automatic differentiation gives the same result:

.. GENERATED FROM PYTHON SOURCE LINES 69-77

.. code-block:: Python

    df = grad(f)
    df_auto = df(x)
    print(df_auto)

    assert nn.backend.allclose(df_fd, df_auto, atol=1e-7)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    tensor([-0.7177], dtype=torch.float64)

.. GENERATED FROM PYTHON SOURCE LINES 78-79

Now consider a metasurface with a random permittivity pattern:

.. GENERATED FROM PYTHON SOURCE LINES 79-111

.. code-block:: Python

    import random

    random.seed(2022)
    discretization = 2**4, 2**4


    def f(var):
        lattice = nn.Lattice(([1, 0], [0, 1]), discretization=discretization)
        xa = var.reshape(lattice.discretization)
        sup = lattice.Layer("Superstrate")
        sub = lattice.Layer("Substrate")
        ms = lattice.Layer("ms", 1)
        ms.epsilon = 9 + 1 * xa + 0j
        sim = nn.Simulation(
            [sup, ms, sub],
            nn.PlaneWave(1.5),
            nh=51,
        )
        R, T = sim.diffraction_efficiencies()
        return R


    nvar = discretization[0] * discretization[1]
    print(nvar)
    xlist = [random.random() for _ in range(nvar)]
    x = bk.array(xlist, dtype=bk.float64)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    256

.. GENERATED FROM PYTHON SOURCE LINES 112-113

Finite differences:

.. GENERATED FROM PYTHON SOURCE LINES 113-118

.. code-block:: Python

    t0 = nn.tic()
    df_fd = first_finite_differences(f, x)
    tfd = nn.toc(t0)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    elapsed time 17.841123819351196s

.. GENERATED FROM PYTHON SOURCE LINES 119-120

Automatic differentiation:

.. GENERATED FROM PYTHON SOURCE LINES 120-131

.. code-block:: Python

    df = grad(f)
    t0 = nn.tic()
    df_auto = df(x)
    tauto = nn.toc(t0)

    assert nn.backend.allclose(df_fd, df_auto, atol=1e-7)
    print("speedup: ", tfd / tauto)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    elapsed time 0.14763927459716797s
    speedup:  120.84266783368106

.. GENERATED FROM PYTHON SOURCE LINES 132-133

Plot the gradients:

.. GENERATED FROM PYTHON SOURCE LINES 133-145
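The speedup grows with the number of design variables: finite differences need
two simulations per variable, while reverse-mode automatic differentiation
returns the whole gradient for roughly the cost of one extra solve. Gradients
like these are what make gradient-based inverse design practical. As a
standalone sketch of a descent loop driven by autodiff (plain PyTorch on a
hypothetical toy objective, not the nannos ``Simulation`` API):

```python
import torch


def objective(x):
    # Hypothetical smooth stand-in for a reflection coefficient R(x);
    # its minimum is at x = 1.
    return torch.sum((x - 1.0) ** 2)


x = torch.full((4,), 0.2, dtype=torch.float64, requires_grad=True)
lr = 0.1
for _ in range(100):
    loss = objective(x)
    loss.backward()  # reverse-mode autodiff: one pass for all variables
    with torch.no_grad():
        x -= lr * x.grad  # plain gradient-descent update
    x.grad.zero_()

print(float(objective(x)))  # essentially zero: the loop converged
```

In a real design loop the toy ``objective`` would be replaced by a simulation
such as ``f`` above, with ``grad(f)`` supplying the gradient.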
.. code-block:: Python

    fig, ax = plt.subplots(1, 2, figsize=(8, 3))
    _ = ax[0].imshow(df_auto.reshape(*discretization).real)
    plt.colorbar(_, ax=ax[0])
    ax[0].set_title("autodiff")
    _ = ax[1].imshow(df_fd.reshape(*discretization).real)
    plt.colorbar(_, ax=ax[1])
    ax[1].set_title("finite differences")
    plt.tight_layout()

    nn.set_backend("numpy")

.. image-sg:: /tutorials/images/sphx_glr_plot_gradient_001.png
   :alt: autodiff, finite differences
   :srcset: /tutorials/images/sphx_glr_plot_gradient_001.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** (0 minutes 24.106 seconds)

**Estimated memory usage:** 544 MB

.. _sphx_glr_download_tutorials_plot_gradient.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
         :target: https://mybinder.org/v2/gh/nannos/nannos.gitlab.io/doc?filepath=notebooks/tutorials/plot_gradient.ipynb
         :alt: Launch binder
         :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gradient.ipynb <plot_gradient.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gradient.py <plot_gradient.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gradient.zip <plot_gradient.zip>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_