diff --git a/CITATION.cff b/CITATION.cff index a25bafd9..2c422c26 100644 --- a/CITATION.cff +++ b/CITATION.cff @@ -1,34 +1,34 @@ cff-version: 1.2.0 message: "If you use this software, please cite both the software and the paper from preferred-citation." title: "PVDeg: Photovoltaic Degradation Tools" -abstract: "PVDeg is an open-source Python package for modeling photovoltaic (PV) degradation, developed at the National Renewable Energy Laboratory (NREL) and supported by the Durable Module Materials (DuraMAT) consortium. It provides modular functions, materials databases, and calculation workflows for simulating degradation mechanisms (e.g., LeTID, hydrolysis, UV exposure) using weather data from the National Solar Radiation Database (NSRDB) and the Photovoltaic Geographical Information System (PVGIS). By integrating Monte Carlo uncertainty propagation and geospatial processing, PVDeg enables field-relevant predictions and uncertainty quantification of module reliability and lifetime." +abstract: "PVDeg is an open-source Python package for modeling photovoltaic (PV) degradation, developed at the National Laboratory of the Rockies (NLR) and supported by the Durable Module Materials (DuraMAT) consortium. It provides modular functions, materials databases, and calculation workflows for simulating degradation mechanisms (e.g., LeTID, hydrolysis, UV exposure) using weather data from the National Solar Radiation Database (NSRDB) and the Photovoltaic Geographical Information System (PVGIS). By integrating Monte Carlo uncertainty propagation and geospatial processing, PVDeg enables field-relevant predictions and uncertainty quantification of module reliability and lifetime." 
authors: - family-names: Springer given-names: Martin - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0000-0001-6803-108X" - family-names: Brown given-names: Matthew - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0000-0002-0988-3431" - family-names: Ovaitt given-names: Silvana - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0000-0003-0180-728X" - family-names: Ford given-names: Tobin - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0009-0000-7428-5625" - family-names: Daxini given-names: Rajiv - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0000-0003-1993-9408" - family-names: Holsapple given-names: Derek - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies - family-names: Kempe given-names: Michael - affiliation: National Renewable Energy Laboratory + affiliation: National Laboratory of the Rockies orcid: "https://orcid.org/0000-0003-3312-0482" keywords: - photovoltaic @@ -43,7 +43,7 @@ keywords: - PVGIS - DuraMAT license: BSD-3-Clause -repository-code: "https://github.com/NREL/PVDegradationTools" +repository-code: "https://github.com/NatLabRockies/PVDegradationTools" url: "https://pvdegradationtools.readthedocs.io" type: software version: 0.7.1 diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index d65ceaca..0d141c20 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -112,8 +112,8 @@ git push origin feature/my-feature - 📖 [Full Contributing Guide](https://pvdegradationtools.readthedocs.io/en/latest/user_guide/contributing.html) - Comprehensive documentation - 📦 [Installation 
Guide](https://pvdegradationtools.readthedocs.io/en/latest/user_guide/installation.html) - Detailed setup instructions -- 🐛 [GitHub Issues](https://github.com/NREL/PVDegradationTools/issues) - Report bugs or request features -- 💬 [GitHub Discussions](https://github.com/NREL/PVDegradationTools/discussions) - Ask questions +- 🐛 [GitHub Issues](https://github.com/NatLabRockies/PVDegradationTools/issues) - Report bugs or request features +- 💬 [GitHub Discussions](https://github.com/NatLabRockies/PVDegradationTools/discussions) - Ask questions ## Contributor License Agreement diff --git a/docs/source/index.rst b/docs/source/index.rst index 25d3c385..5e5affcb 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -12,9 +12,9 @@ Welcome to PVDeg! ============================================================== -PVDeg is an open-source Python package for modeling photovoltaic (PV) degradation, developed at the National Renewable Energy Laboratory (NREL) and supported by the Durable Module Materials (DuraMAT) consortium. It provides modular functions, materials databases, and calculation workflows for simulating degradation mechanisms (e.g., LeTID, hydrolysis, UV exposure) using weather data from the National Solar Radiation Database (NSRDB) and the Photovoltaic Geographical Information System (PVGIS). By integrating Monte Carlo uncertainty propagation and geospatial processing, PVDeg enables field-relevant predictions and uncertainty quantification of module reliability and lifetime. +PVDeg is an open-source Python package for modeling photovoltaic (PV) degradation, developed at the National Laboratory of the Rockies (NLR) and supported by the Durable Module Materials (DuraMAT) consortium. 
It provides modular functions, materials databases, and calculation workflows for simulating degradation mechanisms (e.g., LeTID, hydrolysis, UV exposure) using weather data from the National Solar Radiation Database (NSRDB) and the Photovoltaic Geographical Information System (PVGIS). By integrating Monte Carlo uncertainty propagation and geospatial processing, PVDeg enables field-relevant predictions and uncertainty quantification of module reliability and lifetime. -The source code for PVDeg is hosted on `github `_. Please see the :ref:`installation` page for installation help. +The source code for PVDeg is hosted on `github `_. Please see the :ref:`installation` page for installation help. See :ref:`tutorials` to learn how to use and experiment with various functionalities @@ -52,7 +52,7 @@ If you use PVDeg in a published work, please cite both the software and the pape **Software Citation:** -Click the "Cite this repository" button on the `GitHub repository `_, or visit `Zenodo `_ for the DOI corresponding to your specific version. On the Zenodo page, use the "Cite as" section in the right sidebar to copy the citation in your preferred format (BibTeX, APA, etc.). +Click the "Cite this repository" button on the `GitHub repository `_, or visit `Zenodo `_ for the DOI corresponding to your specific version. On the Zenodo page, use the "Cite as" section in the right sidebar to copy the citation in your preferred format (BibTeX, APA, etc.). **JOSS Paper (In Review):** diff --git a/docs/source/tutorials/index.rst b/docs/source/tutorials/index.rst index 01fa4763..81bc2456 100644 --- a/docs/source/tutorials/index.rst +++ b/docs/source/tutorials/index.rst @@ -9,7 +9,7 @@ PVDeg provides comprehensive tutorials organized by topic. 
Choose your preferred Jupyter Book (Recommended) --------------------------- -Interactive tutorials with live execution: `PVDeg Jupyter Book `_ +Interactive tutorials with live execution: `PVDeg Jupyter Book `_ - Click the 🚀 rocket icon to launch notebooks in `Google Colab `_ - **Development Preview:** See latest changes at `dev-preview `_ @@ -20,7 +20,7 @@ Binder Run tutorials in your browser without installation: .. image:: https://mybinder.org/badge_logo.svg - :target: https://mybinder.org/v2/gh/NREL/PVDegradationTools/main + :target: https://mybinder.org/v2/gh/NatLabRockies/PVDegradationTools/main :alt: Binder Local Installation @@ -32,7 +32,7 @@ Local Installation .. code-block:: bash - git clone https://github.com/NREL/PVDegradationTools.git + git clone https://github.com/NatLabRockies/PVDegradationTools.git cd PVDegradationTools 3. **Start Jupyter:** @@ -51,10 +51,10 @@ Local Installation - ``10_workshop_demos/`` - Workshop demonstrations - ``tools/`` - Standalone analysis tools -NREL HPC (Kestrel) +NLR HPC (Kestrel) ------------------ -Running notebooks on Kestrel is documented on the `NREL HPC Documentation `_. +Running notebooks on Kestrel is documented on the `NLR HPC Documentation `_. **Important:** Register a custom iPykernel before running notebooks on Kestrel: diff --git a/docs/source/user_guide/NSRDB_API_Key.rst b/docs/source/user_guide/NSRDB_API_Key.rst index 54e30c16..42976750 100644 --- a/docs/source/user_guide/NSRDB_API_Key.rst +++ b/docs/source/user_guide/NSRDB_API_Key.rst @@ -5,4 +5,4 @@ NSRDB API Key The National Solar Radiation Database (NSRDB) is a serially complete collection of satellite-derived measurements of solar radiation—global horizontal, direct normal, and diffuse horizontal irradiance—and meteorological data. These data have been collected at a sufficient number of locations and temporal and spatial scales to accurately represent regional solar radiation climates. The data are publicly available at no cost to the user. 
These API provide access to downloading the data. -Obtain your API key at https://developer.nrel.gov/signup/. \ No newline at end of file +Obtain your API key at https://developer.nlr.gov/signup/. \ No newline at end of file diff --git a/docs/source/user_guide/contributing.rst b/docs/source/user_guide/contributing.rst index ec692cc2..b7b7d079 100644 --- a/docs/source/user_guide/contributing.rst +++ b/docs/source/user_guide/contributing.rst @@ -7,7 +7,7 @@ We welcome contributions to PVDeg! Whether you're fixing bugs, adding features, improving documentation, or contributing to our material property databases, your help is valuable to the PV community. -For a quick overview, see `CONTRIBUTING.md `_ on GitHub. +For a quick overview, see `CONTRIBUTING.md `_ on GitHub. This guide provides comprehensive details for contributors. @@ -17,13 +17,13 @@ Easy Ways to Contribute Here are ways to contribute, even if you're new to PVDeg, git, or Python: -* **Report bugs or request features** via `GitHub issues `_ +* **Report bugs or request features** via `GitHub issues `_ * **Join discussions** on existing issues and pull requests * **Improve documentation** - fix typos, clarify explanations, add examples * **Enhance unit tests** - increase coverage or improve test quality * **Create or improve tutorials** - demonstrate PVDeg in your area of expertise * **Contribute to material databases** - add validated degradation parameters and properties -* **Share your work** - add your project to our `wiki `_ +* **Share your work** - add your project to our `wiki `_ * **Spread the word** - tell colleagues about PVDeg Getting Started @@ -75,7 +75,7 @@ Development Environment Setup python -m ipykernel install --user --name=pvdeg-dev - This is especially important when working on HPC systems like NREL's Kestrel. + This is especially important when working on HPC systems like NLR's Kestrel. 
Pre-commit Hooks ~~~~~~~~~~~~~~~~ @@ -490,7 +490,7 @@ Pull Request Process * Provide a clear description of changes * Reference related issues (e.g., "Closes #123") * Ensure all CI checks pass (tests, pre-commit hooks) - * Request review from maintainers or tag ``@NREL/pvdeg-maintainers`` + * Request review from maintainers or tag ``@NLR/pvdeg-maintainers`` **Best practices**: @@ -537,7 +537,7 @@ Contributor License Agreement ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ First-time contributors must sign the `Contributor License Agreement (CLA) -`_. +`_. This protects both you and the project. When you submit your first pull request, a bot will comment with instructions @@ -573,7 +573,7 @@ Getting Help If you have questions or need help: -* **Ask on GitHub Discussions**: ``_ +* **Ask on GitHub Discussions**: ``_ * **Open an issue**: For bugs or feature requests * **Check the documentation**: ``_ * **Review existing PRs**: See how others approached similar problems diff --git a/docs/source/user_guide/installation.rst b/docs/source/user_guide/installation.rst index 4373ebce..c1da3ad0 100644 --- a/docs/source/user_guide/installation.rst +++ b/docs/source/user_guide/installation.rst @@ -5,7 +5,7 @@ Installation PVDeg releases may be installed using pip. Compatible with Python 3.10 and above. -For a quick start, see the **Installation** section in our `README.md `_. +For a quick start, see the **Installation** section in our `README.md `_. 
Base Install ------------ @@ -20,7 +20,7 @@ This installs PVDeg with all required dependencies for basic degradation modelin * Core scientific computing libraries (numpy, pandas, scipy) * PV modeling with pvlib -* Weather data access (NREL-rex) +* Weather data access (NLR-rex) * Geospatial tools (cartopy, geopy) * Jupyter notebook support (jupyterlab, notebook) * Pre-commit hooks for development @@ -103,7 +103,7 @@ While PVDeg is installed via pip, you can use conda to manage your Python enviro python -m ipykernel install --user --name=pvdeg This allows you to select the ``pvdeg`` kernel when running Jupyter notebooks, - especially important on HPC systems like NREL's Kestrel. + especially important on HPC systems like NLR's Kestrel. Developer Installation ---------------------- diff --git a/docs/source/user_guide/meteorological-data.rst b/docs/source/user_guide/meteorological-data.rst index 05376706..59a442e8 100644 --- a/docs/source/user_guide/meteorological-data.rst +++ b/docs/source/user_guide/meteorological-data.rst @@ -12,8 +12,8 @@ The methodology for these datasets varies but both are gridded geospatial datase .. _NSRDB: NSRDB ------ -The NSRDB is produced by NREL and combines multiple datasets but we are most concerned with `Physical Solar Model 3 (PSM3) `_. This data was generated using satellite data from multiple channels to derive cloud -and aerosol properties, then fed into a radiative transfer model. Learn more about the NSRDB `here `_. +The NSRDB is produced by NLR and combines multiple datasets but we are most concerned with `Physical Solar Model 3 (PSM3) `_. This data was generated using satellite data from multiple channels to derive cloud +and aerosol properties, then fed into a radiative transfer model. Learn more about the NSRDB `here `_. The NSRDB is free to use but requires an api-key and email. See :ref:`NSRDB_API_Key` for more information. 
For our purposes, the api is limited to 1000 requests per day, although you can request a batch download via email with a singificantly higher rate limit (not recommended for PVDeg). @@ -23,14 +23,14 @@ Flowchart showing the dataflow from satellite to solar radiation measurement. .. image:: meteorological-data-details/data_flow_chart.png :alt: dataflow from satellite to solar radiation measurement, image missing -``_ +``_ NSRDB data are seperated by satellite/model source. Each dataset is shown below, much of the PVDeg project uses the *Americas* data. .. image:: meteorological-data-details/nsrdb_global_coverage.jpg :alt: NSRDB data sources, image missing -``_ +``_ .. _PVGIS: diff --git a/docs/source/user_guide/package_overview.rst b/docs/source/user_guide/package_overview.rst index 79662c01..152ed819 100644 --- a/docs/source/user_guide/package_overview.rst +++ b/docs/source/user_guide/package_overview.rst @@ -8,12 +8,12 @@ photovoltaics and accelerated testing. It currently offers functions to calculate test-chamber irradiance settings, the humidity of PV materials, the spectral degradation in backsheets, and more. Functionality has been simplified so you can use .psm3 weather files retrieved -from NREL's `National Solar Radiation Database (NSRDB) `_. +from NREL's `National Solar Radiation Database (NSRDB) `_. In some cases, such as calculating the relative backsheet spectral degradation, you will need spectraly resolved irradiance. This can be field data or data produced via simulation (for example: results from `bifacial_radiance -`_) +`_) **Package Functions:** diff --git a/docs/source/user_guide/pysam.rst b/docs/source/user_guide/pysam.rst index cfbe2677..7592a6aa 100644 --- a/docs/source/user_guide/pysam.rst +++ b/docs/source/user_guide/pysam.rst @@ -3,7 +3,7 @@ PySAM Implementation ==================== -PVDeg provides a convient wrapper for ``NREL-PySAM`` which is a Python wrapper for `NREL's System Advisor Model (SAM) `_. 
PySAM has a steep learning curve so we seek to provide a simple implementation that allows users to run geospatial analyses with SAM. +PVDeg provides a convenient wrapper for ``NREL-PySAM`` which is a Python wrapper for `NLR's System Advisor Model (SAM) `_. PySAM has a steep learning curve so we seek to provide a simple implementation that allows users to run geospatial analyses with SAM. This work was produced to support `Innovative Solar Practices Integrated with Rural Economies and Ecosystems (InSPIRE) `_. diff --git a/docs/source/whatsnew/releases/v0.7.0.rst b/docs/source/whatsnew/releases/v0.7.0.rst index 4bfb1812..55dea81f 100644 --- a/docs/source/whatsnew/releases/v0.7.0.rst +++ b/docs/source/whatsnew/releases/v0.7.0.rst @@ -43,7 +43,7 @@ Enhancements ``03_monte_carlo/02_standoff.ipynb`` (NYC). - Add jupytext configuration and Python script versions for all notebooks with pre-commit hook synchronization for version control. - - Update Eagle HPC references to Kestrel (NREL's current HPC system). + - Update Eagle HPC references to Kestrel (NLR's current HPC system). - Update all configuration files (``_toc.yml``, ``_config.yml``, ``myst.yml``), GitHub Actions workflows (``nbval.yaml``, ``testbook.yaml``, ``deploy-books.yml``), and CI/CD pipelines for new structure. diff --git a/docs/source/whatsnew/releases/v0.7.2.rst b/docs/source/whatsnew/releases/v0.7.2.rst new file mode 100644 index 00000000..5c1fc58d --- /dev/null +++ b/docs/source/whatsnew/releases/v0.7.2.rst @@ -0,0 +1,21 @@ +v0.7.2 (2026-03-10) +=================== + + +Documentation & Bug Fixes +------------------------- +- Comprehensive replacement of "NREL" and "National Renewable Energy Laboratory" + with the new name National Laboratory of the Rockies (NatLabRockies or NLR) in + all files, including repositories, dependencies (e.g., NLR-rex), HPC settings, + tutorials, emails, and function names. +- Please note NREL-PySAM has not migrated its name and will not do so until ~April 2026; it + will be updated as required. 
This includes an update to the CLA and its checksum (:pull:`XXX`). + + + +Contributors +------------ +- Silvana Ovaitt (:ghuser:`shirubana`) +- Rajiv Daxini (:ghuser:`rdaxini`) +- Martin Springer (:ghuser:`martin-springer`) + diff --git a/pvdeg/geospatialscenario.py b/pvdeg/geospatialscenario.py index 61b97872..d41d2621 100644 --- a/pvdeg/geospatialscenario.py +++ b/pvdeg/geospatialscenario.py @@ -181,7 +181,7 @@ def addLocation( weather_arg = { "satellite": satellite, "names": year, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": nsrdb_attributes, } @@ -794,7 +794,7 @@ def _get_geospatial_data(year: int): weather_arg = { "satellite": "Americas", "names": year, - "NREL_HPC": True, + "NLR_HPC": True, # 'attributes': ['air_temperature', 'wind_speed', 'dhi', 'ghi', 'dni', # 'relative_humidity']} "attributes": [], # does having do atributes break anything, should we just diff --git a/pvdeg/montecarlo.py b/pvdeg/montecarlo.py index 04bbaa33..0d20c9f7 100644 --- a/pvdeg/montecarlo.py +++ b/pvdeg/montecarlo.py @@ -271,7 +271,7 @@ def generateCorrelatedSamples( # monte carlo function # model after - -# https://github.com/NREL/PVDegradationTools/blob/main/pvdeg_tutorials/tutorials/LETID%20-%20Outdoor%20Geospatial%20Demo.ipynb # noqa +# https://github.com/NatLabRockies/PVDegradationTools/blob/main/pvdeg_tutorials/tutorials/LETID%20-%20Outdoor%20Geospatial%20Demo.ipynb # noqa def simulate( diff --git a/pvdeg/utilities.py b/pvdeg/utilities.py index bfe8662b..f51187f0 100644 --- a/pvdeg/utilities.py +++ b/pvdeg/utilities.py @@ -829,7 +829,7 @@ def geospatial_from_csv( Creates an xarray dataset contaning aeospatial weather data and a pandas dataframe containing geospatial metadata from a list of local csv files. 
- Useful for importing data from NSRDB api viewer https://nsrdb.nrel.gov/data-viewer + Useful for importing data from NSRDB api viewer https://nsrdb.nlr.gov/data-viewer when downloaded locally as csv Parameters @@ -1110,7 +1110,7 @@ def fix_metadata(meta): # we want this to only exist for things that can be run on kestrel -def nrel_kestrel_check(): +def nlr_kestrel_check(): """Check if the user is on Kestrel HPC environment. Passes silently or raises a @@ -1122,11 +1122,11 @@ def nrel_kestrel_check(): See Also -------- - NREL HPC : https://www.nrel.gov/hpc/ - Kestrel Documentation : https://nrel.github.io/HPC/Documentation/ + NLR HPC : https://www.nlr.gov/hpc/ + Kestrel Documentation : https://nlr.github.io/HPC/Documentation/ """ - KESTREL_HOSTNAME = "kestrel.hpc.nrel.gov" + KESTREL_HOSTNAME = "kestrel.hpc.nlr.gov" host = run(args=["hostname", "-f"], shell=False, capture_output=True, text=True) device_domain = ".".join(host.stdout.split(".")[-4:])[:-1] diff --git a/pvdeg/weather.py b/pvdeg/weather.py index 3b2bebd3..0390397c 100644 --- a/pvdeg/weather.py +++ b/pvdeg/weather.py @@ -1,7 +1,7 @@ """Collection of classes and functions to obtain spectral parameters.""" from pvdeg import humidity -from pvdeg.utilities import nrel_kestrel_check +from pvdeg.utilities import nlr_kestrel_check from typing import Union from pvlib import iotools @@ -128,7 +128,7 @@ def get( ------- Collecting a single site of PSM4 NSRDB data. *Api key and email must be replaced with your personal api key and email*. - [Request a key!](https://developer.nrel.gov/signup/) + [Request a key!](https://developer.nlr.gov/signup/) .. 
code-block:: python @@ -149,7 +149,7 @@ def get( weather_df, meta_dict = pvdeg.weather.get(database="PVGIS", id=(49.95, 1.5)) - Collecting geospatial data from NSRDB on Kestrel (NREL INTERNAL USERS ONLY) + Collecting geospatial data from NSRDB on Kestrel (NLR INTERNAL USERS ONLY) satellite options: ``"GOES", "METEOSAT", "Himawari", "SUNY", "CONUS", "Americas"`` @@ -161,7 +161,7 @@ def get( weather_arg = { "satellite": "Americas", "names": "TMY", - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -257,7 +257,7 @@ def get( elif geospatial: if database == "NSRDB": - nrel_kestrel_check() + nlr_kestrel_check() weather_ds, meta_df = get_NSRDB(geospatial=geospatial, **kwargs) meta_df["wind_height"] = 2 @@ -611,7 +611,7 @@ def ini_h5_geospatial(fps): return weather_ds, meta_df -def get_NSRDB_fnames(satellite, names, NREL_HPC=False, **_): +def get_NSRDB_fnames(satellite, names, NLR_HPC=False, **_): """Get a sorted list of NSRDB files for a given satellite and year. 
Parameters @@ -622,8 +622,8 @@ def get_NSRDB_fnames(satellite, names, NREL_HPC=False, **_): PVLIB naming convention year or 'TMY': If int, year of desired data If str, 'TMY' or 'TMY3' - NREL_HPC : (bool) - If True, use NREL HPC path + NLR_HPC : (bool) + If True, use NLR HPC path If False, use AWS path Returns @@ -642,11 +642,11 @@ def get_NSRDB_fnames(satellite, names, NREL_HPC=False, **_): "CONUS": "conus", "Americas": "current", } - if NREL_HPC: + if NLR_HPC: hpc_fp = "/datasets/NSRDB/" hsds = False else: - hpc_fp = "/nrel/nsrdb/v3/" + hpc_fp = "/nlr/nsrdb/v3/" hsds = True if type(names) in [int, float]: @@ -672,7 +672,7 @@ def get_NSRDB_fnames(satellite, names, NREL_HPC=False, **_): def get_NSRDB( satellite=None, names="TMY", - NREL_HPC=False, + NLR_HPC=False, gid=None, location=None, geospatial=False, @@ -690,8 +690,8 @@ def get_NSRDB( names : (int or str) If int, year of desired data If str, 'TMY' or 'TMY3' - NREL_HPC : (bool) - If True, use NREL HPC path + NLR_HPC : (bool) + If True, use NLR HPC path If False, use AWS path gid : (int) gid for the desired location @@ -714,7 +714,7 @@ def get_NSRDB( satellite, gid = get_satellite(location) if not geospatial: nsrdb_fnames, hsds = get_NSRDB_fnames( - satellite=satellite, names=names, NREL_HPC=NREL_HPC + satellite=satellite, names=names, NLR_HPC=NLR_HPC ) dattr = {} @@ -768,7 +768,7 @@ def get_NSRDB( # the year it was created. this creates problems, we only want to combine the # files if they are NOT TMY - nsrdb_fnames, hsds = get_NSRDB_fnames(satellite, names, NREL_HPC) + nsrdb_fnames, hsds = get_NSRDB_fnames(satellite, names, NLR_HPC) if isinstance(names, str) and names.lower() in ["tmy", "tmy3"]: # maintain as list with last element of sorted list @@ -1380,7 +1380,7 @@ def weather_distributed( NSRDB (including `database="PSM4"`) is rate limited and your key will face restrictions after making too many requests. - See rates [here](https://developer.nrel.gov/docs/solar/nsrdb/guide/). 
+ See rates [here](https://developer.nlr.gov/docs/solar/nsrdb/guide/). Parameters ---------- @@ -1400,12 +1400,12 @@ def weather_distributed( api_key: str Only required when making NSRDB requests using "PSM4". - [NSRDB developer API key](https://developer.nrel.gov/signup/) + [NSRDB developer API key](https://developer.nlr.gov/signup/) email: str Only required when making NSRDB requests using "PSM4". [NSRDB developer account email associated with - `api_key`](https://developer.nrel.gov/signup/) + `api_key`](https://developer.nlr.gov/signup/) Returns ------- diff --git a/pyproject.toml b/pyproject.toml index fe5930b1..1f991ab7 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -6,8 +6,8 @@ build-backend = "setuptools.build_meta" [project] name = "pvdeg" description = "Pvdeg is a python library that supports the calculation of degradation related parameters for photovoltaic (PV) modules." -authors = [{name = "Pvdeg Python Developers", email = "Michael.Kempe@nrel.gov"}] -maintainers = [{email = "Silvana.Ovaitt@nrel.gov"}] +authors = [{name = "Pvdeg Python Developers", email = "Michael.Kempe@nlr.gov"}] +maintainers = [{email = "Silvana.Ovaitt@nlr.gov"}] license = {text = "BSD-3"} readme = "README.md" requires-python = ">=3.10.0" @@ -32,7 +32,7 @@ dependencies = [ "numpy>=1.19.3", "pvlib>=0.12.0", "scipy>1.6.0", - "NREL-rex", + "NLR-rex", "cartopy", "dask[dataframe]", "dask-jobqueue", @@ -93,10 +93,10 @@ all = [ console_scripts = "pvdeg.cli:cli" [project.urls] -Homepage = "https://github.com/NREL/PVDegradationTools" -"Bug Tracker" = "https://github.com/NREL/PVDegradationTools/issues" +Homepage = "https://github.com/NatLabRockies/PVDegradationTools" +"Bug Tracker" = "https://github.com/NatLabRockies/PVDegradationTools/issues" Documentation = "https://pvdegradationtools.readthedocs.io/" -"Source Code" = "https://github.com/NREL/PVDegradationTools" +"Source Code" = "https://github.com/NatLabRockies/PVDegradationTools" [tool.setuptools.packages.find] include = ["pvdeg"] 
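The testbook runner patched below rebinds `nlr_kestrel_check` in *two* modules because `pvdeg/weather.py` uses `from pvdeg.utilities import nlr_kestrel_check`, which copies the binding at import time. A self-contained sketch of that from-import pitfall, using throwaway stand-in modules (the `fake_*` names are illustrative, not the real package):

```python
import types

# Stand-in for pvdeg.utilities: defines the real check.
utilities = types.ModuleType("fake_utilities")

def nlr_kestrel_check():
    raise ConnectionError("not running on Kestrel")

utilities.nlr_kestrel_check = nlr_kestrel_check

# Stand-in for pvdeg.weather: `from ... import name` snapshots the binding.
weather = types.ModuleType("fake_weather")
weather.nlr_kestrel_check = utilities.nlr_kestrel_check

def mock_nlr_kestrel_check(*args, **kwargs):
    pass  # no-op replacement used while executing notebooks off-cluster

# Patching only the defining module leaves weather holding the original.
utilities.nlr_kestrel_check = mock_nlr_kestrel_check
assert weather.nlr_kestrel_check is not mock_nlr_kestrel_check

# So the importing module's copy must be patched as well.
weather.nlr_kestrel_check = mock_nlr_kestrel_check
assert weather.nlr_kestrel_check is mock_nlr_kestrel_check
```

This is the standard "patch where it is looked up, not where it is defined" rule, and it is why the injected cell assigns to both `pvdeg.utilities.nlr_kestrel_check` and `pvdeg.weather.nlr_kestrel_check`.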
diff --git a/scripts/run_all_testbook.py b/scripts/run_all_testbook.py index 5fae1d01..6fa434f9 100644 --- a/scripts/run_all_testbook.py +++ b/scripts/run_all_testbook.py @@ -32,22 +32,22 @@ def monkeypatch_addLocation(self, *args, **kwargs): """ -def monkeypatch_nrel_kestrel_check(): - """String to monkeypatch pvdeg.utilities.nrel_kestrel_check to a no-op function""" +def monkeypatch_nlr_kestrel_check(): + """String to monkeypatch pvdeg.utilities.nlr_kestrel_check to a no-op function""" return """ import pvdeg.utilities import pvdeg.weather -def mock_nrel_kestrel_check(*args, **kwargs): +def mock_nlr_kestrel_check(*args, **kwargs): pass -pvdeg.utilities.nrel_kestrel_check = mock_nrel_kestrel_check +pvdeg.utilities.nlr_kestrel_check = mock_nlr_kestrel_check # Also patch it in the weather module since it imports the function directly -pvdeg.weather.nrel_kestrel_check = mock_nrel_kestrel_check +pvdeg.weather.nlr_kestrel_check = mock_nlr_kestrel_check """ def monkeypatch_cells(tb): # Inject both monkeypatches as the very first cell - tb.inject(monkeypatch_addLocation() + monkeypatch_nrel_kestrel_check(), 0) + tb.inject(monkeypatch_addLocation() + monkeypatch_nlr_kestrel_check(), 0) def main(notebook_path): diff --git a/sign-cla.md b/sign-cla.md index d4f30caa..abec5b16 100644 --- a/sign-cla.md +++ b/sign-cla.md @@ -64,7 +64,7 @@ You can confirm the MD5 checksum of the CLA by running the md5 program over ``` md5 cla-1.0.md -MD5 (cla-1.0.md) = 2aa6e2788a0ff4d45cabfc839290e1ca +MD5 (cla-1.0.md) = 93888c3304015c6b99de6dac7252a812 ``` or on Windows @@ -84,8 +84,8 @@ Sending the Email ----------------- Send an email to pvdeg's official Open Sourceror -at [silvana.ovaitt@nrel.gov](mailto:silvana.ovaitt@nrel.gov), -cc-ing [michael.kempe@nrel.gov](mailto:michael.kempe@nrel.gov), +at [silvana.ovaitt@nlr.gov](mailto:silvana.ovaitt@nlr.gov), +cc-ing [michael.kempe@nlr.gov](mailto:michael.kempe@nlr.gov), with the subject "CLA pvdeg" and the following body: diff --git 
a/tests/test_pysam.py b/tests/test_pysam.py index 99bd4116..7bbe6f38 100644 --- a/tests/test_pysam.py +++ b/tests/test_pysam.py @@ -28,7 +28,7 @@ META_SINGLE_LOC = GEO_META.iloc[0].to_dict() -def test_pysam_missing_nrel_pysam_deps(monkeypatch, caplog): +def test_pysam_missing_nlr_pysam_deps(monkeypatch, caplog): real_import = builtins.__import__ def fake_import(name, *args, **kwargs): diff --git a/tests/test_utilities.py b/tests/test_utilities.py index 0d514dda..ea6e0b9b 100644 --- a/tests/test_utilities.py +++ b/tests/test_utilities.py @@ -230,9 +230,9 @@ def test_add_material(tmp_path): # this only works because we are not running on kestrel -def test_nrel_kestrel_check_bad(): +def test_nlr_kestrel_check_bad(): with pytest.raises(ConnectionError): - pvdeg.utilities.nrel_kestrel_check() + pvdeg.utilities.nlr_kestrel_check() # NEW MATERIAL UTIL FUNCTIONS diff --git a/tests/test_weather.py b/tests/test_weather.py index 34b3401c..4d62efd0 100644 --- a/tests/test_weather.py +++ b/tests/test_weather.py @@ -65,12 +65,12 @@ def test_get(): """Test with (lat,lon) and gid options.""" # TODO: Test with AWS - # #Test with lat, lon on NREL HPC + # #Test with lat, lon on NLR HPC # weather_db = 'NSRDB' - # weather_id = (39.741931, -105.169891) #NREL + # weather_id = (39.741931, -105.169891) #NLR # weather_arg = {'satellite' : 'GOES', # 'names' : 2021, - # 'NREL_HPC' : True, + # 'NLR_HPC' : True, # 'attributes' : ['air_temperature', 'wind_speed', 'dhi', # 'ghi', 'dni','relative_humidity']} @@ -80,7 +80,7 @@ def test_get(): # assert isinstance(weather_df, pd.DataFrame) # assert len(weather_df) != 0 - # #Test with gid on NREL HPC + # #Test with gid on NLR HPC # weather_id = 1933572 # weather_df, meta = pvdeg.weather.load(weather_db, weather_id, **weather_arg) pass @@ -316,7 +316,7 @@ def fake_glob(pattern): files, hsds = pvdeg.weather.get_NSRDB_fnames( satellite="Americas", names="TMY", - NREL_HPC=True, + NLR_HPC=True, ) # HPC path -> h5py (not HSDS) @@ -332,10 +332,10 @@ def 
test_get_NSRDB_ds_has_kestrel_nsrdb_fnames_tmy(monkeypatch): """For TMY, get_NSRDB should store only the last element of the sorted list.""" # Fake get_NSRDB_fnames to return UNSORTED list + hsds flag - def fake_get_NSRDB_fnames(satellite, names, NREL_HPC): + def fake_get_NSRDB_fnames(satellite, names, NLR_HPC): assert satellite == "Americas" assert names == "TMY" - assert NREL_HPC is True + assert NLR_HPC is True return SORTED_TMY_DIR, False # Fake ini_h5_geospatial to return an empty dataset/meta (no attrs set here) @@ -353,7 +353,7 @@ def fake_ini_h5_geospatial(nsrdb_fnames): ds, meta = pvdeg.weather.get_NSRDB( satellite="Americas", names="TMY", - NREL_HPC=True, + NLR_HPC=True, geospatial=True, ) @@ -366,10 +366,10 @@ def fake_ini_h5_geospatial(nsrdb_fnames): def test_get_NSRDB_ds_has_kestrel_nsrdb_fnames_year(monkeypatch): """For a specific year, get_NSRDB should store the full sorted list.""" - def fake_get_NSRDB_fnames(satellite, names, NREL_HPC): + def fake_get_NSRDB_fnames(satellite, names, NLR_HPC): assert satellite == "Americas" assert names == 2024 - assert NREL_HPC is False + assert NLR_HPC is False return SORTED_TMY_DIR, True def fake_ini_h5_geospatial(nsrdb_fnames): @@ -385,7 +385,7 @@ def fake_ini_h5_geospatial(nsrdb_fnames): ds, meta = pvdeg.weather.get_NSRDB( satellite="Americas", names=2024, - NREL_HPC=False, + NLR_HPC=False, geospatial=True, ) diff --git a/tutorials/01_basics/04_weather_database_access.ipynb b/tutorials/01_basics/04_weather_database_access.ipynb index 6251ae29..02baeb72 100644 --- a/tutorials/01_basics/04_weather_database_access.ipynb +++ b/tutorials/01_basics/04_weather_database_access.ipynb @@ -8,13 +8,13 @@ "\n", "**Requirements:**\n", "- Internet access\n", - "- NSRDB API key. API keys are free. You can request and obtain an API key in about 5 minutes. To get your own key, visit https://developer.nrel.gov/signup/\n", - "- Step **1.** is for Kestrel HPC users. 
You will need an account with NREL's Kestrel computer for this method.\n", + "- NSRDB API key. API keys are free. You can request and obtain an API key in about 5 minutes. To get your own key, visit https://developer.nlr.gov/signup/\n", + "- Step **1.** is for Kestrel HPC users. You will need an account with NLR's Kestrel computer for this method.\n", "\n", "**Objectives:**\n", "\n", "Using direct access to large scale weather databases, we're going to estimate the minimum standoff distance for a roof mounted PV system. We'll do this in 3 ways using both the NSRDB and PVGIS database.\n", - "1. Single Location, NSRDB via NREL's high performance computer, Kestrel.\n", + "1. Single Location, NSRDB via NLR's high performance computer, Kestrel.\n", "2. Single Location via NSRDB public API key.\n", "3. Single Location via the PVGIS public database\n", "\n", @@ -118,7 +118,7 @@ "source": [ "# 1. NSRDB - HSDS on Kestrel\n", "\n", - "This method requires a direct connection to NREL's high performance computer \"Kestrel\". If you are not running this journal from Kestrel, skip this section and proceed to section **2.**\n", + "This method requires a direct connection to NLR's high performance computer \"Kestrel\". If you are not running this journal from Kestrel, skip this section and proceed to section **2.**\n", "\n", "In this step:\n", "\n", @@ -149,7 +149,7 @@ "weather_arg = {\n", " \"satellite\": \"GOES\",\n", " \"names\": 2021,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -160,7 +160,7 @@ " ],\n", "}\n", "\n", - "# Uncomment the following when working on NREL Kestrel\n", + "# Uncomment the following when working on NLR Kestrel\n", "\n", "# weather_df, meta = pvdeg.weather.get(weather_db, weather_id, **weather_arg)\n", "\n", @@ -185,7 +185,7 @@ "source": [ "# 2. NSRDB - API\n", "\n", - "To access the NREL NSRDB, you will need an API key. 
Key's are free, but require you to set up an account. Without an API key, you can use a demonstration API which is severely limited. To set up an account and get your API key, visit https://developer.nrel.gov/signup/\n", + "To access the NLR NSRDB, you will need an API key. Keys are free, but require you to set up an account. Without an API key, you can use a demonstration API which is severely limited. To set up an account and get your API key, visit https://developer.nlr.gov/signup/\n", "\n", "Key Notes:\n", "- set `attributes = []` to return all possible attributes (weather fields)\n", @@ -232,7 +232,7 @@ "# Uncomment below to fetch fresh data with your own API key:\n", "# API_KEY = \"your_api_key_here\"\n", "# # The example API key here is for demonstation and is rate-limited per IP.\n", - "# # To get your own API key, visit https://developer.nrel.gov/signup/\n", + "# # To get your own API key, visit https://developer.nlr.gov/signup/\n", "# weather_db = \"PSM4\"\n", "# weather_id = (39.741931, -105.169891)\n", "# weather_arg = {\n", @@ -331,7 +331,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg_313", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -349,5 +349,5 @@ } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/01_basics/scripts/04_weather_database_access.py b/tutorials/01_basics/scripts/04_weather_database_access.py index a44296fa..791860be 100644 --- a/tutorials/01_basics/scripts/04_weather_database_access.py +++ b/tutorials/01_basics/scripts/04_weather_database_access.py @@ -1,45 +1,53 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # Weather Database Access -# +# # **Requirements:** # - Internet access -# - NSRDB API key. API keys are free. You can request and obtain an API key in about 5 minutes. To get your own key, visit https://developer.nrel.gov/signup/ -# - Step **1.** is for Kestrel HPC users. 
You will need an account with NREL's Kestrel computer for this method. -# +# - NSRDB API key. API keys are free. You can request and obtain an API key in about 5 minutes. To get your own key, visit https://developer.nlr.gov/signup/ +# - Step **1.** is for Kestrel HPC users. You will need an account with NLR's Kestrel computer for this method. +# # **Objectives:** -# +# # Using direct access to large scale weather databases, we're going to estimate the minimum standoff distance for a roof mounted PV system. We'll do this in 3 ways using both the NSRDB and PVGIS database. -# 1. Single Location, NSRDB via NREL's high performance computer, Kestrel. +# 1. Single Location, NSRDB via NLR's high performance computer, Kestrel. # 2. Single Location via NSRDB public API key. # 3. Single Location via the PVGIS public database -# +# # **Background:** -# +# # This journal will demonstrate all existing built-in methods for directly accessing public weather databases. Some methods are restriced to certain user groups. For general users, see methods **2** and **3**. For users with an active Kestrel HPC account, you may use method **1** as well as **2** and **3**. -# +# # For all users and all steps: This journal will run significantly longer than other tutorials and have significant internet traffic as you fetch large datasets. -# %% [markdown] # This example demonstrates the calculation of a minimum standoff distance necessary for roof-mounted PV modules to ensure that the $T_{98}$ operational temperature remains under 70°C, in which case the more rigorous thermal stability testing requirements of IEC TS 63126 would not needed to be considered. 
We use data from [Fuentes, 1987] to model the approximate exponential decay in temperature, $T(X)$, with increasing standoff distance, $X$, as, -# +# # $$ X = -X_0 \ln\left(1-\frac{T_0-T}{\Delta T}\right)$$ -# +# # where $T_0$ is the temperature for $X=0$ (insulated back) and $\Delta T$ is the temperature difference between an insulated back ($X=0$) and open rack mounting configuration ($X=\infty)$. -# +# # The following figure showcases this calulation for the entire United States. We used pvlib and data from the National Solar Radiation Database (NSRDB) to calculate the module temperatures for different mounting configuration and applied our model to obtain the standoff distance for roof-mounted PV systems. -# %% [markdown] # # Single location example -# %% +# In[1]: + + # if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg +# !pip install pvdeg + + +# In[2]: + -# %% import pvdeg import pandas as pd -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -48,18 +56,20 @@ print("Python version ", sys.version) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # 1. NSRDB - HSDS on Kestrel -# -# This method requires a direct connection to NREL's high performance computer "Kestrel". If you are not running this journal from Kestrel, skip this section and proceed to section **2.** -# +# +# This method requires a direct connection to NLR's high performance computer "Kestrel". If you are not running this journal from Kestrel, skip this section and proceed to section **2.** +# # In this step: -# +# # First we select a database. Here, we will use the NSRDB. Since we are modeling a single location, we can pass the `weather_id` as tuple (lat, long). A location gid can be used as well. 'gid' is a unique identifier to a geographic location within the NSRDB. 
We'll look at how to find gids later on. -# +# # Next, we want to select a satellite, named dataset (year of data), and what weather attributes we want to fetch. For further options, see the documentation for `pvdeg.weather.get` -# %% +# In[4]: + + # Get weather data weather_db = "NSRDB" @@ -69,7 +79,7 @@ weather_arg = { "satellite": "GOES", "names": 2021, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -80,7 +90,7 @@ ], } -# Uncomment the following when working on NREL Kestrel +# Uncomment the following when working on NLR Kestrel # weather_df, meta = pvdeg.weather.get(weather_db, weather_id, **weather_arg) @@ -89,23 +99,24 @@ # print(pvdeg.standards.interpret_standoff(res)) # print(meta) -# %% [markdown] + # `pvdeg.weather.get` returns the same variables as `weather.read` which we have used in each journal before this. We get a weather DataFrame and a meta-data dicitonary. Each contains a minimum of consistent fields, but may have additional fields based on the database accessed or the attributes requested. -# +# # Lets verify the weather data we fetched by running a familiar calculation; standoff distance. -# %% [markdown] # # 2. NSRDB - API -# -# To access the NREL NSRDB, you will need an API key. Key's are free, but require you to set up an account. Without an API key, you can use a demonstration API which is severely limited. To set up an account and get your API key, visit https://developer.nrel.gov/signup/ -# +# +# To access the NLR NSRDB, you will need an API key. Keys are free, but require you to set up an account. Without an API key, you can use a demonstration API which is severely limited. To set up an account and get your API key, visit https://developer.nlr.gov/signup/ +# # Key Notes: # - set `attributes = []` to return all possible attributes (weather fields) # - There are 2 major methods with the API # - names = 'tmy' : generate a TMY-like weather dataframe aggregate. 
This will calculate the relative humidity from temperature and dew point. # - names = '2019' : collect a weather dataframe including measured relative humidity. -# %% +# In[5]: + + # Load pre-saved weather data for this tutorial # This avoids API rate limits during testing and builds import json @@ -117,7 +128,7 @@ # Uncomment below to fetch fresh data with your own API key: # API_KEY = "your_api_key_here" # # The example API key here is for demonstation and is rate-limited per IP. -# # To get your own API key, visit https://developer.nrel.gov/signup/ +# # To get your own API key, visit https://developer.nlr.gov/signup/ # weather_db = "PSM4" # weather_id = (39.741931, -105.169891) # weather_arg = { @@ -147,12 +158,14 @@ meta_clean = {k: v for k, v in meta.items() if k not in ["irradiance_time_offset"]} print(meta_clean) -# %% [markdown] + # # 3. PVGIS -# +# # This method uses the PVGIS database, a public resource. It requires no API key or user account. -# %% +# In[6]: + + weather_db = "PVGIS" # weather_id = (39.741931, -105.169891) weather_id = (24.7136, 46.6753) # Riyadh, Saudi Arabia @@ -180,3 +193,4 @@ # Clean metadata for consistent output (remove variable fields) meta_clean = {k: v for k, v in meta.items() if k not in ["irradiance_time_offset"]} print(meta_clean) + diff --git a/tutorials/02_degradation/03_letid_outdoor.ipynb b/tutorials/02_degradation/03_letid_outdoor.ipynb index 171b380e..c93d1cd3 100644 --- a/tutorials/02_degradation/03_letid_outdoor.ipynb +++ b/tutorials/02_degradation/03_letid_outdoor.ipynb @@ -105,7 +105,7 @@ "metadata": {}, "source": [ "First, we'll use pvlib to create and run a model system, and use the irradiance, temperature, and operating point of that model to set up our LETID model\n", - "For this example, we'll model a fixed latitude tilt system at NREL, in Golden, CO, USA, using [NSRDB](https://nsrdb.nrel.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database." 
+ "For this example, we'll model a fixed latitude tilt system at NLR, in Golden, CO, USA, using [NSRDB](https://nsrdb.nlr.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database." ] }, { @@ -1963,7 +1963,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -1977,9 +1977,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.13.11" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/02_degradation/04_letid_outdoor_scenario.ipynb b/tutorials/02_degradation/04_letid_outdoor_scenario.ipynb index eac32461..a6f439ce 100644 --- a/tutorials/02_degradation/04_letid_outdoor_scenario.ipynb +++ b/tutorials/02_degradation/04_letid_outdoor_scenario.ipynb @@ -489,7 +489,7 @@ "\n", "Great!\n", "\n", - "The example proceeds below in similar fashion to the outdoor example, using a fixed latitude tilt system at NREL, in Golden, CO, USA, using [NSRDB](https://nsrdb.nrel.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database." + "The example proceeds below in similar fashion to the outdoor example, using a fixed latitude tilt system at NLR, in Golden, CO, USA, using [NSRDB](https://nsrdb.nlr.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database." 
] }, { @@ -3681,7 +3681,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -3695,9 +3695,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.13.11" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/02_degradation/scripts/03_letid_outdoor.py b/tutorials/02_degradation/scripts/03_letid_outdoor.py index 66ebd270..8cd8fef6 100644 --- a/tutorials/02_degradation/scripts/03_letid_outdoor.py +++ b/tutorials/02_degradation/scripts/03_letid_outdoor.py @@ -1,16 +1,18 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # LETID Outdoor -# +# # This is an example on how to model LETID progression in outdoor environments -# +# # We can use the equations in this library to model LETID progression in a simulated outdoor environment, given that we have weather and system data. This example makes use of tools from the fabulous [pvlib](https://pvlib-python.readthedocs.io/en/stable/) library to calculate system irradiance and temperature, which we use to calculate progression in LETID states. -# +# # This will illustrate the potential of "Temporary Recovery", i.e., the backwards transition of the LETID defect B->A that can take place with carrier injection at lower temperatures. -# -# +# +# # **Requirements:** # - `pvlib`, `pandas`, `numpy`, `matplotlib` -# +# # **Objectives:** # 1. Use `pvlib` and provided weather files to set up a temperature and injection timeseries # 2. Define necessary solar cell device parameters @@ -18,11 +20,16 @@ # 4. Run through timeseries, calculating defect states # 5. 
Calculate device degradation and plot -# %% +# In[1]: + + # if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg +# !pip install pvdeg + + +# In[2]: + -# %% from pvdeg import letid, collection, utilities, DATA_DIR import pvlib @@ -32,7 +39,10 @@ import matplotlib.pyplot as plt import pvdeg -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -43,11 +53,13 @@ print("pvlib version ", pvlib.__version__) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # First, we'll use pvlib to create and run a model system, and use the irradiance, temperature, and operating point of that model to set up our LETID model -# For this example, we'll model a fixed latitude tilt system at NREL, in Golden, CO, USA, using [NSRDB](https://nsrdb.nrel.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database. +# For this example, we'll model a fixed latitude tilt system at NLR, in Golden, CO, USA, using [NSRDB](https://nsrdb.nlr.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database. 
+ +# In[4]: + -# %% # load weather and location data, use pvlib read_psm3 function with map_variables = True sam_file = "psm3.csv" @@ -55,17 +67,26 @@ os.path.join(DATA_DIR, sam_file), file_type="PSM3", map_variables=True ) -# %% + +# In[5]: + + weather -# %% + +# In[6]: + + # if our weather file doesn't have precipitable water, calculate it with pvlib if "precipitable_water" not in weather.columns: weather["precipitable_water"] = pvlib.atmosphere.gueymard94_pw( weather["temp_air"], weather["relative_humidity"] ) -# %% + +# In[7]: + + # rename some columns for pvlib if they haven't been already weather.rename( columns={ @@ -91,10 +112,16 @@ ] ] -# %% + +# In[8]: + + weather -# %% + +# In[9]: + + # import pvlib stuff and pick a module and inverter. Choice of these things will slightly affect the pvlib results which we later use to calculate injection. # we'll use the SAPM temperature model open-rack glass/polymer coeffecients. @@ -113,7 +140,10 @@ "open_rack_glass_polymer" ] -# %% + +# In[10]: + + # set up system in pvlib lat = meta["latitude"] lon = meta["longitude"] @@ -132,18 +162,23 @@ temperature_model_parameters=temperature_model_parameters, ) -# %% + +# In[11]: + + # create and run pvlib modelchain mc = ModelChain(system, location, aoi_model="physical") mc.run_model(weather) -# %% [markdown] + # # Set up timeseries # In this example, injection is a function of both the operating point of the module (which we will assume is maximum power point) and irradiance. Maximum power point injection is equivalent to $(I_{sc}-I_{mp})/I_{sc}\times Ee$, where $Ee$ is effective irradiance, the irradiance absorbed by the module's cells. We normalize it to 1-sun irradiance, 1000 $W/m^2$. -# +# # We will use the irradiance, DC operating point, and cell temperature from the pvlib modelchain results. 
-# %% +# In[12]: + + ee = mc.results.effective_irradiance # injection = (mc.results.dc['i_sc']-mc.results.dc['i_mp'])/(mc.results.dc['i_sc'])*(ee/1000) injection = letid.calc_injection_outdoors(mc.results) @@ -157,28 +192,38 @@ ) # reset the index so datetime is a column. I prefer integer indexing. timesteps.rename(columns={"index": "Datetime"}, inplace=True) -# %% + +# In[13]: + + # filter out times when injection is NaN, these won't progress LETID, and it'll make the calculations below run faster timesteps = timesteps[timesteps["Injection"].notnull()] timesteps.reset_index(inplace=True, drop=True) -# %% + +# In[14]: + + timesteps -# %% [markdown] + # # Device parameters # To define a device, we need to define several important quantities about the device: wafer thickness (in $\mu m$), rear surface recombination velocity (in cm/s), and cell area (in cm2). -# %% +# In[15]: + + wafer_thickness = 180 # um s_rear = 46 # cm/s cell_area = 243 # cm^2 -# %% [markdown] + # Other device parameters # Other required device parameters: base diffusivity (in cm2/s), and optical generation profile, which allow us to estimate current collection in the device. -# %% +# In[16]: + + generation_df = pd.read_excel( os.path.join(DATA_DIR, "PVL_GenProfile.xlsx"), header=0 ) # this is an optical generation profile generated by PVLighthouse's OPAL2 default model for 1-sun, normal incident AM1.5 sunlight on a 180-um thick SiNx-coated, pyramid-textured wafer. @@ -187,22 +232,26 @@ d_base = 27 # cm^2/s electron diffusivity. See https://www2.pvlighthouse.com.au/calculators/mobility%20calculator/mobility%20calculator.aspx for details -# %% [markdown] + # # Degradation parameters # To model the device's degradation, we need to define several more important quantities about the degradation the device will experience. These include undegraded and degraded lifetime (in $\mu s$). -# %% +# In[17]: + + tau_0 = 115 # us, carrier lifetime in non-degraded states, e.g. 
LETID/LID states A or C tau_deg = 55 # us, carrier lifetime in fully-degraded state, e.g. LETID/LID state B -# %% [markdown] + # Remaining degradation parameters: -# +# # The rest of the quantities to define are: the initial percentage of defects in each state (A, B, and C), and the dictionary of mechanism parameters. -# +# # In this example, we'll assume the device starts in the fully-undegraded state (100% state A), and we'll use the kinetic parameters for LETID degradation from Repins. -# %% +# In[18]: + + # starting defect state percentages nA_0 = 100 nB_0 = 0 @@ -223,11 +272,13 @@ tau_0, tau_deg, nB_0 ) # calculate tau for the first timestep -# %% [markdown] + # # Run through timesteps # Since each timestep depends on the preceding timestep, we need to calculate in a loop. This will take a few minutes depending on the length of the timeseries. -# %% +# In[19]: + + for index, timestep in timesteps.iterrows(): # first row tau has already been assigned if index == 0: @@ -322,11 +373,13 @@ tau, wafer_thickness, s_rear, jsc, temperature=25 ) -# %% [markdown] + # # Finish calculating degraded device parameters. # Now that we have calculated defect states, we can calculate all the quantities that depend on defect states. -# %% +# In[20]: + + timesteps["tau"] = letid.tau_now(tau_0, tau_deg, timesteps["NB"]) # calculate device Jsc for every timestep. Unfortunately this requires an integration so I think we have to run through a loop. Device Jsc allows calculation of device Voc. @@ -339,20 +392,24 @@ timesteps.at[index, "tau"], wafer_thickness, s_rear, jsc_now, temperature=25 ) -# %% + +# In[21]: + + timesteps = letid.calc_device_params( timesteps, cell_area=243 ) # this function quickly calculates the rest of the device parameters: Isc, FF, max power, and normalized max power timesteps -# %% [markdown] + # Note of course that all these calculated device parameters are modeled STC device parameters, not the instantaneous, weather-dependent values. 
This isn't a robust performance model of a degraded module. -# %% [markdown] # # Plot the results -# %% +# In[22]: + + from cycler import cycler plt.style.use("default") @@ -383,13 +440,14 @@ plt.show() -# %% [markdown] + # The example data provided for Golden, CO, shows how $N_A$ increases in cold weather, and power temporarily recovers, due to temporary recovery of LETID (B->A). -# %% [markdown] # # The function `calc_letid_outdoors` wraps all of the steps above into a single function: -# %% +# In[23]: + + nA_0 = 100 nB_0 = 0 nC_0 = 0 @@ -410,4 +468,9 @@ module_parameters=cec_module, ) -# %% + +# In[ ]: + + + + diff --git a/tutorials/02_degradation/scripts/04_letid_outdoor_scenario.py b/tutorials/02_degradation/scripts/04_letid_outdoor_scenario.py index 1279bb96..0623320d 100644 --- a/tutorials/02_degradation/scripts/04_letid_outdoor_scenario.py +++ b/tutorials/02_degradation/scripts/04_letid_outdoor_scenario.py @@ -1,19 +1,21 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # LETID Outdoor Scenario -# +# # This is an example for using a test result to model LETID progression in outdoor environments -# +# # One important use case for this library is to use data from a LETID test (e.g. [IEC TS 63342](https://webstore.iec.ch/publication/67332)) to model how a module may degrade and regenerate in the field. -# +# # We will take some data from module testing presented in [Karas *et al.* 2022](https://onlinelibrary.wiley.com/doi/10.1002/pip.3573), and use it to estimate device parameters. -# +# # We can use the equations in this library to model LETID progression in a simulated outdoor environment, given that we have weather and system data. This example makes use of tools from the fabulous [pvlib](https://pvlib-python.readthedocs.io/en/stable/) library to calculate system irradiance and temperature, which we use to calculate progression in LETID states. 
-# -# -# +# +# +# # **Requirements:** # - `pvlib`, `pandas`, `numpy`, `matplotlib`, `scipy` -# +# # **Objectives:** # 1. Load data from example test results # 2. Use `pvlib` and provided weather files to set up a temperature and injection timeseries @@ -22,11 +24,16 @@ # 5. Run through timeseries, calculating defect states # 6. Calculate device degradation and plot -# %% +# In[1]: + + # if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg +# !pip install pvdeg + + +# In[2]: + -# %% from pvdeg import letid, collection, utilities, DATA_DIR import pvdeg @@ -37,7 +44,10 @@ import matplotlib.pyplot as plt -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -47,23 +57,30 @@ print("Pandas version ", pd.__version__) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # First, we'll load some data taken from an accelerated test. See [Karas *et al.* 2022](https://onlinelibrary.wiley.com/doi/10.1002/pip.3573) for full details. This data is the average of "Type S" modules from Lab 3. Type S modules were prototype modules made with 48 monocrystalline cells, and degraded about 4-5% in LETID testing. Data throughout testing is shown below: -# %% +# In[4]: + + cell_area = 243 # cm^2 -# %% + +# In[5]: + + df = pd.read_csv(os.path.join(DATA_DIR, "module test data.csv")) df["cell Voc"] = df["Voc"] / 48 df["Jsc"] = df["Isc"] / cell_area * 1000 df["% degradation"] = (df["Pmp"] - df["Pmp"].iloc[0]) / df["Pmp"] * 100 df -# %% [markdown] + # The module parameter that is most sensitve to our device input parameters is cell open-circuit voltage, which in this case started at about 0.664 V/cell. We will select reasonable values for solar cell input parameters, and use ```letid.calc_voc_from_tau()``` to check if those parameters match the cell Voc of the device we're trying to model. 
The important quantities here are bulk lifetime in the initial state (```tau_0```), wafer thickness, and rear surface recombination velocity. -# %% +# In[6]: + + tau_0 = 120 # [us] reasonable bulk lifetime for commercial-quality p-type silicon, maybe a little on the low side for typical wafer lifetimes, which could range from 100-500 us. wafer_thickness = 180 # [um] a reasonable wafer thickness for typical commercial silicon solar cells. Wafer thicknesses for Si solar cells are typically 160-180 um. @@ -84,7 +101,10 @@ voc_0 = letid.calc_voc_from_tau(tau_0, wafer_thickness, srv_rear, jsc_0, temperature) voc_0 -# %% + +# In[7]: + + generation_df = pd.read_excel( os.path.join(DATA_DIR, "PVL_GenProfile.xlsx"), header=0 ) # this is an optical generation profile generated by PVLighthouse's OPAL2 default model for 1-sun, normal incident AM1.5 sunlight on a 180-um thick SiNx-coated, pyramid-textured wafer. @@ -93,10 +113,12 @@ d_base = 27 # cm^2/s electron diffusivity. See https://www2.pvlighthouse.com.au/calculators/mobility%20calculator/mobility%20calculator.aspx for details -# %% [markdown] + # Pretty close! -# %% +# In[8]: + + # check to make sure power is close to the measured Week 0 power ff_0 = df.query("Week == 0")["FF"].item() # [%] fill factor @@ -104,10 +126,12 @@ pmp_0 = (voc_0 * 48) * (jsc_0 * cell_area / 1000) * ff_0 / 100 # [W] maximum power pmp_0, df.query("Week == 0")["Pmp"].item() -# %% [markdown] + # Now we do the same thing for the degraded state to determine ```tau_deg```, the bulk lifetime when the module is in its most degraded state. So here the cell Voc target is the roughly 0.656 V measured after 4 weeks of testing. 
-# %% +# In[9]: + + tau_deg = 80 # [us] degraded bulk lifetime isc_deg = df.query("Week == 4")[ @@ -120,7 +144,10 @@ ) voc_deg -# %% + +# In[10]: + + # check to make sure power is close to the measured Week 4 power ff_deg = df.query("Week == 4")["FF"].item() # [%] fill factor @@ -130,17 +157,22 @@ ) # [W] maximum power pmp_deg, df.query("Week == 4")["Pmp"].item() -# %% + +# In[11]: + + (pmp_0 - pmp_deg) / pmp_0 -# %% [markdown] + # So for modeling this module, we will use ```tau_0``` = 120 $\mu s$, ```tau_deg``` = 80 $\mu s$, with ```wafer_thickness``` = 180 $\mu m$ and ```srv_rear``` = 100 cm/s. -# +# # Great! -# -# The example proceeds below in similar fashion to the outdoor example, using a fixed latitude tilt system at NREL, in Golden, CO, USA, using [NSRDB](https://nsrdb.nrel.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database. +# +# The example proceeds below in similar fashion to the outdoor example, using a fixed latitude tilt system at NLR, in Golden, CO, USA, using [NSRDB](https://nsrdb.nlr.gov/) hourly PSM weather data, SAPM temperature models, and module and inverter models from the CEC database. + +# In[12]: + -# %% # load weather and location data, use pvlib read_psm3 function sam_file = "psm3.csv" @@ -148,17 +180,26 @@ os.path.join(DATA_DIR, sam_file), file_type="PSM3", map_variables=True ) -# %% + +# In[13]: + + weather -# %% + +# In[14]: + + # if our weather file doesn't have precipitable water, calculate it with pvlib if "precipitable_water" not in weather.columns: weather["precipitable_water"] = pvlib.atmosphere.gueymard94_pw( weather["temp_air"], weather["relative_humidity"] ) -# %% + +# In[15]: + + # drop unneeded columns weather = weather[ [ @@ -172,25 +213,35 @@ ] ] -# %% + +# In[16]: + + weather -# %% [markdown] + # # Set up PVlib model # Note that the module we select here is NOT the same "Type S" module that was tested for LETID. 
I'm simply trying to find a module in the CEC database with I-V characteristics that are reasonably close to the tested module, so the pvlib calculated DC results are close to how our Type S module might behave in the field. -# %% +# In[17]: + + cec_modules = pvlib.pvsystem.retrieve_sam("CECMod").T cec_modules[cec_modules["STC"].between(220, 250) & (cec_modules["N_s"] == 48)] -# %% [markdown] + # The LG ones look close to the module we're trying to model. Pmp around 235W, Isc around 9.9A. Let's go with 'LG_Electronics_Inc__LG235N8K_G4' -# %% +# In[18]: + + cec_modules = cec_modules.T cec_module = cec_modules["LG_Electronics_Inc__LG235N8K_G4"] -# %% + +# In[19]: + + # import the rest of the pvlib stuff # we'll use the SAPM temperature model open-rack glass/polymer coeffecients. @@ -206,7 +257,10 @@ "open_rack_glass_polymer" ] -# %% + +# In[20]: + + # set up system in pvlib lat = meta["latitude"] lon = meta["longitude"] @@ -225,18 +279,23 @@ temperature_model_parameters=temperature_model_parameters, ) -# %% + +# In[21]: + + # create and run pvlib modelchain mc = ModelChain(system, location, aoi_model="physical") mc.run_model(weather) -# %% [markdown] + # # Set up timeseries # In this example, injection is a function of both the operating point of the module (which we will assume is maximum power point) and irradiance. Maximum power point injection is equivalent to $(I_{sc}-I_{mp})/I_{sc}\times Ee$, where $Ee$ is effective irradiance, the irradiance absorbed by the module's cells. We normalize it to 1-sun irradiance, 1000 $W/m^2$. -# +# # We will use the irradiance, DC operating point, and cell temperature from the pvlib modelchain results. -# %% +# In[22]: + + ee = mc.results.effective_irradiance # injection = (mc.results.dc['i_sc']-mc.results.dc['i_mp'])/(mc.results.dc['i_sc'])*(ee/1000) injection = letid.calc_injection_outdoors(mc.results) @@ -250,16 +309,18 @@ ) # reset the index so datetime is a column. I prefer integer indexing. 
timesteps.rename(columns={"index": "Datetime"}, inplace=True) -# %% [markdown] -# + +# # # Remaining degradation parameters: # We've already set our important device parameters: ```tau_0```, ```tau_deg```, ```wafer_thickness```, ```srv_rear```, ```cell_area```, etc, but we need a few more: generation profile and carrier diffusivity. These are necessary for calculating current collection, and the "default" values provided here should be sufficient for most use cases. -# +# # The rest of the quantities to define are: the initial percentage of defects in each state (A, B, and C), and the dictionary of mechanism parameters. -# +# # In this example, we'll assume the device starts in the fully-undegraded state (100% state A), and we'll use the parameters for LETID degradation from Repins. -# %% +# In[23]: + + generation_df = pd.read_excel( os.path.join(DATA_DIR, "PVL_GenProfile.xlsx"), header=0 ) # this is an optical generation profile generated by PVLighthouse's OPAL2 default model for 1-sun, normal incident AM1.5 sunlight on a 180-um thick SiNx-coated, pyramid-textured wafer. @@ -268,7 +329,10 @@ d_base = 27 # cm^2/s electron diffusivity. See https://www2.pvlighthouse.com.au/calculators/mobility%20calculator/mobility%20calculator.aspx for details -# %% + +# In[24]: + + # starting defect state percentages nA_0 = 100 nB_0 = 0 @@ -289,14 +353,19 @@ tau_0, tau_deg, nB_0 ) # calculate tau for the first timestep -# %% + +# In[25]: + + timesteps -# %% [markdown] + # # Run through timesteps # Since each timestep depends on the preceding timestep, we need to calculate in a loop. This will take a few minutes depending on the length of the timeseries. 
-# %% +# In[26]: + + for index, timestep in timesteps.iterrows(): # first row tau has already been assigned if index == 0: @@ -377,7 +446,10 @@ timesteps.at[index, "NB"] = n_B + dN_Bdt * t_step timesteps.at[index, "NC"] = n_C + dN_Cdt * t_step -# %% + +# In[27]: + + timesteps["tau"] = letid.tau_now(tau_0, tau_deg, timesteps["NB"]) # calculate device Jsc for every timestep. Unfortunately this requires an integration so I think we have to run through a loop. Device Jsc allows calculation of device Voc. @@ -390,17 +462,22 @@ timesteps.at[index, "tau"], wafer_thickness, srv_rear, jsc_now, temperature=25 ) -# %% + +# In[28]: + + timesteps = letid.calc_device_params( timesteps, cell_area ) # this function quickly calculates the rest of the device parameters: Isc, FF, max power, and normalized max power timesteps -# %% [markdown] + # Note of course that all these calculated device parameters are modeled STC device parameters, not the instantaneous, weather-dependent values. We'll merge back in the pvlib results for convenience, but these don't reflect the device degradation. 
We'll calculate energy loss next -# %% +# In[29]: + + timesteps = timesteps.merge(mc.results.dc, left_on="Datetime", right_index=True) timesteps = timesteps.merge( pd.DataFrame(mc.results.effective_irradiance, columns=["Effective irradiance"]), @@ -411,10 +488,12 @@ timesteps -# %% [markdown] + # # Plot the results -# %% +# In[30]: + + from cycler import cycler plt.style.use("default") @@ -448,7 +527,10 @@ plt.show() -# %% + +# In[31]: + + import matplotlib.dates as mdates plt.style.use("default") @@ -481,7 +563,10 @@ plt.show() -# %% + +# In[32]: + + import matplotlib.dates as mdates from scipy.integrate import cumulative_trapezoid, simpson @@ -603,10 +688,12 @@ plt.show() -# %% [markdown] + # # The function `calc_letid_outdoors` wraps all of the steps above into a single function: -# %% +# In[33]: + + mechanism_params = "D037" letid.calc_letid_outdoors( @@ -624,7 +711,15 @@ module_parameters=cec_module, ) -# %% + +# In[34]: + + timesteps -# %% + +# In[ ]: + + + + diff --git a/tutorials/03_monte_carlo/01_arrhenius.ipynb b/tutorials/03_monte_carlo/01_arrhenius.ipynb index c8fcd05a..bfe752b6 100644 --- a/tutorials/03_monte_carlo/01_arrhenius.ipynb +++ b/tutorials/03_monte_carlo/01_arrhenius.ipynb @@ -306,7 +306,7 @@ "Based on the function chosen to run in the monte carlo simulation, various other data will be required. In this case we will need cell temperature and total plane of array irradiance.\n", "\n", "
\n",
-    "Please use your own API key: The block below makes an NSRDB API to get weather and meta data and then calculate cell temperature and global poa irradiance. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.)\n",
+    "Please use your own API key: The block below makes an NSRDB API call to get weather and metadata, and then calculates cell temperature and global POA irradiance. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/, so register now.\n",
"
" ] }, @@ -335,7 +335,7 @@ "# weather_db = \"PSM4\"\n", "# weather_id = (25.783388, -80.189029)\n", "# weather_arg = {\n", - "# \"api_key\": \"YOUR_API_KEY\", # Get your key at https://developer.nrel.gov/signup/\n", + "# \"api_key\": \"YOUR_API_KEY\", # Get your key at https://developer.nlr.gov/signup/\n", "# \"email\": \"your.email@example.com\",\n", "# \"map_variables\": True,\n", "# }\n", @@ -554,7 +554,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg_313", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -568,9 +568,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.13.11" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/03_monte_carlo/02_standoff.ipynb b/tutorials/03_monte_carlo/02_standoff.ipynb index 64460b1c..d77c4a76 100644 --- a/tutorials/03_monte_carlo/02_standoff.ipynb +++ b/tutorials/03_monte_carlo/02_standoff.ipynb @@ -93,7 +93,7 @@ "This is copied from another tutorial called `4 - Standards.ipynb`, please visit this page for a more in depth explanation of the process for a single standoff calculation.\n", "\n", "
\n",
-    "Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.)\n",
+    "Please use your own API key: The block below makes an NSRDB API call to get weather and metadata. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/, so register now.\n",
"
" ] }, @@ -116,7 +116,7 @@ " META = json.load(f)\n", "\n", "# To use the NSRDB API instead, uncomment the lines below and add your API key\n", - "# Get your API key at: https://developer.nrel.gov/signup/\n", + "# Get your API key at: https://developer.nlr.gov/signup/\n", "# weather_db = \"PSM4\"\n", "# weather_id = (40.633365593159226, -73.9945801019899) # Manhattan, NYC\n", "# weather_arg = {\n", @@ -419,7 +419,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg_313", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -433,9 +433,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.13.11" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/03_monte_carlo/scripts/01_arrhenius.py b/tutorials/03_monte_carlo/scripts/01_arrhenius.py index 6e36ec32..aa1af65d 100644 --- a/tutorials/03_monte_carlo/scripts/01_arrhenius.py +++ b/tutorials/03_monte_carlo/scripts/01_arrhenius.py @@ -1,26 +1,36 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # Arrhenius Monte Carlo -# -# +# +# # A monte carlo simulation can be used to predict results of an event with a certain amount of uncertainty. This will be introduced to our use case via mean and standard deviation for each modeling constant. Correlated multivariate monte carlo simulations expand on this by linking the behavior of multiple input variables together with correlation data, in our case we will use correlation coefficients but -# +# # **Objectives** # 1. Define necessary monte carlo simulation parameters : correlation coefficients, mean and standard standard deviation, number of trials, function to apply, requried function input # 2. Define process for creating and utilizing modeling constant correlation data # 3. 
Perform a simple Monte Carlo simulation using the Arrhenius equation to calculate degradation, and plot the results

-# %%
+# In[1]:
+
+
# if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells:
-# # !pip install pvdeg
+# !pip install pvdeg
+
+
+# In[2]:
+

-# %%
import pvlib
import numpy as np
import pandas as pd
import pvdeg
import matplotlib.pyplot as plt

-# %%
+
+# In[3]:
+
+
# This information helps with debugging and getting support :)
import sys
import platform

@@ -31,13 +41,13 @@
print("Pvlib version ", pvlib.__version__)
print("pvdeg version ", pvdeg.__version__)

-# %% [markdown]
+
# # Correlated Monte Carlo Simulation (parameters)
-#
+#
# For this simulation we will use the Arrhenius equation for the degradation rate, $R_D = R_0 \cdot I^{X} \cdot e^{-E_a / kT}$, where $R_0$ is the degradation prefactor, $I$ is irradiance, $X$ is the irradiance exponent, $E_a$ is the activation energy, $k$ is Boltzmann's constant, and $T$ is the temperature in kelvin.
-#
+#
# We will use $R_0$, $X$, and $E_a$ to perform a three-variable Monte Carlo simulation of the degradation rate.
-#
+#
# ## Required inputs
# To run a Monte Carlo simulation with pvdeg.montecarlo, the following inputs are required:
# - function (currently only works with pvdeg.montecarlo.vecArrhenius() but will eventually work with most pvdeg calculation functions)
@@ -46,31 +56,37 @@
# - correlation constants (if not entered, default = 0)
# - number of trials to run

-# %% [markdown]
# # Defining Correlation Coefficients
# pvdeg.montecarlo stores correlation coefficients in a ``Corr`` object. 
To represent a given correlation coefficient, follow the syntax below, replacing the values in brackets with your own:
-#
+#
# {my_correlation_object} = Corr('{variable1}', '{variable2}', {correlation coefficient})
-#
+#
# note: the ordering of `variable1` and `variable2` does not matter
-#
+#
# After defining all known correlations, add them to a list which we will feed into our simulation later

-# %%
+# In[4]:
+
+
corr_Ea_X = pvdeg.montecarlo.Corr("Ea", "X", 0.0269)
corr_Ea_LnR0 = pvdeg.montecarlo.Corr("Ea", "LnR0", -0.9995)
corr_X_LnR0 = pvdeg.montecarlo.Corr("X", "LnR0", -0.0400)

corr_coeff = [corr_Ea_X, corr_Ea_LnR0, corr_X_LnR0]

-# %%
+
+# In[5]:
+
+
type(corr_Ea_X)

-# %% [markdown]
+
# # Defining Mean and Standard Deviation
# We will store the mean and standard deviation for each variable named when we defined the correlation coefficients. If a variable is left out at this stage, the Monte Carlo simulation will throw errors.

-# %%
+# In[6]:
+
+
stats_dict = {
    "Ea": {"mean": 62.08, "stdev": 7.3858},
    "LnR0": {"mean": 13.7223084, "stdev": 2.47334772},
@@ -80,24 +96,28 @@
# and number of monte carlo trials to run
n = 20000

-# %% [markdown]
+
# # Generating Monte Carlo Input Data
# Next we will use the information collected above to generate correlated samples from our modeling constants' correlations, means, and standard deviations.

-# %%
+# In[7]:
+
+
np.random.seed(42)  # for reproducibility
mc_inputs = pvdeg.montecarlo.generateCorrelatedSamples(
    corr=corr_coeff, stats=stats_dict, n=n
)
print(mc_inputs)

-# %% [markdown]
+
# # Sanity Check
# We can observe the mean and standard deviation of our newly correlated samples before using them for calculations to ensure that we have not incorrectly altered the data. 
The mean and standard deviation should be similar (within sampling error) to your original inputs; the small discrepancy comes from the finite number of generated random samples.
-#
+#
# This also applies to the correlation coefficients: they should be within the same range as those originally supplied.

-# %%
+# In[8]:
+
+
# mean and standard deviation match inputs
for col in mc_inputs.columns:
    print(
@@ -110,15 +130,17 @@
print("Ea_lnR0", round(np.corrcoef(mc_inputs["Ea"], mc_inputs["LnR0"])[0][1], 3))
print("X_lnR0", round(np.corrcoef(mc_inputs["X"], mc_inputs["LnR0"])[0][1], 3))

-# %% [markdown]
+
# # Other Function Requirements
# Based on the function chosen to run in the monte carlo simulation, various other data will be required. In this case we will need cell temperature and total plane of array irradiance.
-#
+#
#
-# Please use your own API key: The block below makes an NSRDB API to get weather and meta data and then calculate cell temperature and global poa irradiance. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.)
+# Please use your own API key: The block below makes an NSRDB API call to get weather and metadata, and then calculates cell temperature and global POA irradiance. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/, so register now.
#
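For orientation before the weather data is loaded: `pvdeg.degradation.vecArrhenius` evaluates the Arrhenius rate model introduced above, $R_D = R_0 \cdot I^{X} \cdot e^{-E_a/kT}$, over the temperature and irradiance arrays prepared in the next cells. The sketch below is a hand-rolled version of that formula, not pvdeg's implementation; the unit conventions (Ea in kJ/mol paired with the gas constant, irradiance normalized to 1000 W/m²) and the default parameter values are illustrative assumptions only:

```python
import numpy as np

R_GAS = 8.31446e-3  # gas constant in kJ/(mol*K), assuming Ea is given in kJ/mol

def arrhenius_rate(poa_global, module_temp, ln_r0=13.72, x=0.5, ea=62.08):
    """Illustrative R_D = R0 * I^X * exp(-Ea / (R*T)); not pvdeg's implementation."""
    temp_k = np.asarray(module_temp, dtype=float) + 273.15  # degrees C -> K
    suns = np.asarray(poa_global, dtype=float) / 1000.0  # normalize to 1 sun
    return np.exp(ln_r0) * suns**x * np.exp(-ea / (R_GAS * temp_k))

# a hotter module degrades faster, as the exponential term implies
rate_cool = arrhenius_rate(1000.0, 25.0)
rate_hot = arrhenius_rate(1000.0, 60.0)
print(bool(rate_hot > rate_cool))  # True
```

In the Monte Carlo run, each trial effectively replaces `ln_r0`, `x`, and `ea` with one row of the correlated samples generated above, while the irradiance and temperature arrays stay fixed.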
-# %% +# In[9]: + + # Load pre-saved weather data for this tutorial # This avoids API rate limits during testing and builds import json @@ -131,16 +153,18 @@ # weather_db = "PSM4" # weather_id = (25.783388, -80.189029) # weather_arg = { -# "api_key": "YOUR_API_KEY", # Get your key at https://developer.nrel.gov/signup/ +# "api_key": "YOUR_API_KEY", # Get your key at https://developer.nlr.gov/signup/ # "email": "your.email@example.com", # "map_variables": True, # } # weather_df, meta = pvdeg.weather.get(weather_db, weather_id, **weather_arg) -# %% [markdown] + # Calculate the sun position, poa irradiance, and module temperature. -# %% +# In[10]: + + sol_pos = pvdeg.spectral.solar_position(weather_df, meta) poa_irradiance = pvdeg.spectral.poa_irradiance(weather_df, meta) temp_mod = pvdeg.temperature.module( @@ -151,37 +175,46 @@ poa_global = poa_irradiance["poa_global"].to_numpy() cell_temperature = temp_mod.to_numpy() -# %% + +# In[11]: + + # must already be numpy arrays function_kwargs = {"poa_global": poa_global, "module_temp": cell_temperature} -# %% [markdown] + # Runs monte carlo simulation for the example `pvdeg.montecarlo.vecArrhenius` function, using the correlated data dataframe created above and the required function arguments. -# +# # We can see the necessary inputs by using the help command: -# %% +# In[12]: + + # NBVAL_SKIP help(pvdeg.montecarlo.simulate) -# %% [markdown] + # # Running the Monte Carlo Simulation # We will pass the target function, `pvdeg.degredation.vecArrhenius()`, its required arguments via the correlated_samples and func_kwargs. Our fixed arguments will be passed in the form of a dictionary while the randomized monte carlo input data will be contained in a DataFrame. 
-# +# # All required target function arguments should be contained between the column names of the randomized input data and fixed argument dictionary, -# +# # (You can use any data you want here as long as the DataFrame's column names match the required target function's parameter names NOT included in the kwargs) -# %% +# In[13]: + + results = pvdeg.montecarlo.simulate( func=pvdeg.degradation.vecArrhenius, correlated_samples=mc_inputs, **function_kwargs ) -# %% [markdown] + # # Viewing Our Data # Let's plot the results using a histogram -# %% +# In[14]: + + lnDeg = np.log10(results) percentile_2p5 = np.percentile(lnDeg, 2.5) percentile_97p5 = np.percentile(lnDeg, 97.5) @@ -207,3 +240,4 @@ plt.legend() plt.grid(True) plt.show() + diff --git a/tutorials/03_monte_carlo/scripts/02_standoff.py b/tutorials/03_monte_carlo/scripts/02_standoff.py index b2c24e04..f6696d49 100644 --- a/tutorials/03_monte_carlo/scripts/02_standoff.py +++ b/tutorials/03_monte_carlo/scripts/02_standoff.py @@ -1,14 +1,21 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # Standoff Monte Carlo -# +# # See Monte Carlo - Arrhenius Degredation for a more in depth guide. Steps will be shortened for brevity. 
# This journal applies a Monte Carlo to the Standoff Calculation -# %% +# In[1]: + + # if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg +# !pip install pvdeg + + +# In[2]: + -# %% import pvlib import numpy as np import pandas as pd @@ -16,7 +23,10 @@ import pvdeg import matplotlib.pyplot as plt -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -27,23 +37,25 @@ print("Pvlib version ", pvlib.__version__) print("Pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # Simple Standoff Calculation -# +# # This is copied from another tutorial called `4 - Standards.ipynb`, please visit this page for a more in depth explanation of the process for a single standoff calculation. -# +# #
-# Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.)
+# Please use your own API key: The block below makes an NSRDB API call to get weather and metadata. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/, so register now.
#
-# %% +# In[4]: + + # Load weather data from locally saved files to avoid API rate limits WEATHER = pd.read_csv("../data/psm4_nyc.csv", index_col=0, parse_dates=True) with open("../data/meta_nyc.json", "r") as f: META = json.load(f) # To use the NSRDB API instead, uncomment the lines below and add your API key -# Get your API key at: https://developer.nrel.gov/signup/ +# Get your API key at: https://developer.nlr.gov/signup/ # weather_db = "PSM4" # weather_id = (40.633365593159226, -73.9945801019899) # Manhattan, NYC # weather_arg = { @@ -53,7 +65,10 @@ # } # WEATHER, META = pvdeg.weather.get(weather_db, weather_id, **weather_arg) -# %% + +# In[5]: + + # simple standoff calculation height1 = pvdeg.standards.standoff(weather_df=WEATHER, meta=META) @@ -72,14 +87,16 @@ print(height1) print(height2) -# %% [markdown] + # # Defining Correlation Coefficients, Mean and Standard Deviation For Monte Carlo Simulation -# +# # We will leave the list of correlations blank because our variables are not correlated. For a correlated use case visit the `Monte Carlo - Arrhenius.ipynb` tutorial. -# +# # Mean and standard deviation must always be populated if being used to create a dataset. However, you can feed your own correlated or uncorrelated data into the simulate function but column names must be consistent. -# %% +# In[6]: + + # These numbers may not make sense in the context of the problem but work for demonstraiting the process stats = {"X_0": {"mean": 5, "stdev": 3}, "wind_factor": {"mean": 0.33, "stdev": 0.5}} @@ -87,15 +104,20 @@ samples = pvdeg.montecarlo.generateCorrelatedSamples(corr_coeff, stats, 500) -# %% + +# In[7]: + + print(samples) -# %% [markdown] + # # Standoff Monte Carlo Inputs -# +# # When using the pvdeg.montecarlo.simulate() function on a target function all of the target function's required arguments must still be given. Our non-changing arguments will be stored in a dictionary. 
The randomized monte carlo input data will also be passed to the target function via the simulate function. All required target function arguments should be contained between the column names of the randomized input data and fixed argument dictionary, -# %% +# In[8]: + + # defining arguments to pass to the target function, standoff() in this case function_kwargs = { "weather_df": WEATHER, @@ -121,26 +143,33 @@ **function_kwargs, ) -# %% [markdown] + # # Dealing With Series # Notice how our results are contained in a pandas series instead of a dataframe. -# +# # This means we have to do an extra step to view our results. Run the block below to confirm that our results are indeed contained in a series. And convert them into a simpler dataframe. -# %% +# In[9]: + + print(type(results)) # Convert from pandas Series to pandas DataFrame results_df = pd.concat(results.tolist()).reset_index(drop=True) -# %% + +# In[10]: + + print(results_df) -# %% [markdown] + # # Viewing Our Data # Let's plot the results using a histogram -# %% +# In[11]: + + bin_edges = np.arange(results_df["x"].min(), results_df["x"].max() + 0.1, 0.05) plt.figure(figsize=(8, 6)) plt.hist( @@ -159,3 +188,4 @@ plt.legend() plt.grid(True) plt.show() + diff --git a/tutorials/04_geospatial/02_geospatial_templates.ipynb b/tutorials/04_geospatial/02_geospatial_templates.ipynb deleted file mode 100644 index daed2967..00000000 --- a/tutorials/04_geospatial/02_geospatial_templates.ipynb +++ /dev/null @@ -1,5862 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Templates (HPC)\n" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:27.413207Z", - "iopub.status.busy": "2026-01-26T23:41:27.413207Z", - "iopub.status.idle": "2026-01-26T23:41:31.366129Z", - "shell.execute_reply": "2026-01-26T23:41:31.366129Z" - }, - "lines_to_next_cell": 2 - }, - "outputs": [], - "source": [ - "import 
pvdeg\n", - "from pvdeg import TEST_DATA_DIR\n", - "import pandas as pd\n", - "import os\n", - "import xarray as xr" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Templates\n", - "\n", - "When running a geospatial analysis using `pvdeg.geospatial.analysis` on arbitary `pvdeg` functions you will need to specify a template for the shape of the output data. This is because the input data comes with dimensions of gid and time while the output will have data in a different shape usually corresonding to coordinates.\n", - "- gid, identification number corresponding to an NSRDB datapoint's location\n", - "- time, timeseries corresponding to the hourly time indicies of NSRDB datapoint's yearly meteorological data.\n", - "\n", - "Follow the steps below to see how we generate templates before running the analysis.\n", - "\n", - "The only functions where this is not required are currently `pvdeg.standards.standoff`, `pvdeg.humidity.moduke` and, `letid.calc_letid_outdoors` as they are predefined within the package." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Loading Geospatial Data\n", - "\n", - "This step skips over making the `pvdeg.weather.get` call with `geospatial == True`. See the [Duramat Demo](../10_workshop_demos/02_duramat_live_demo.ipynb) for information on how to do this traditionally.\n", - "\n", - "We can also use a `GeospatialScenario` object. See the [Geospatial Scenario Tutorial](./04_scenario_geospatial.ipynb) for more information on how to use this approach.\n", - "\n", - "*The cell below loads a pickled xarray object, this is not the best way to do this. 
xarray datasets should be stored as `.nc` - netcdf files*" - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:31.367881Z", - "iopub.status.busy": "2026-01-26T23:41:31.367881Z", - "iopub.status.idle": "2026-01-26T23:41:31.504300Z", - "shell.execute_reply": "2026-01-26T23:41:31.504300Z" - }, - "lines_to_next_cell": 2 - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 9MB\n",
-       "Dimensions:            (time: 17520, gid: 11)\n",
-       "Coordinates:\n",
-       "  * gid                (gid) int64 88B 449211 452064 453020 ... 460613 462498\n",
-       "  * time               (time) datetime64[ns] 140kB 2022-01-01 ... 2022-12-31T...\n",
-       "Data variables:\n",
-       "    temp_air           (time, gid) float64 2MB ...\n",
-       "    wind_speed         (time, gid) float64 2MB ...\n",
-       "    dhi                (time, gid) float64 2MB ...\n",
-       "    ghi                (time, gid) float64 2MB ...\n",
-       "    dni                (time, gid) float64 2MB ...\n",
-       "    relative_humidity  (time, gid) float64 2MB ...
" - ], - "text/plain": [ - " Size: 9MB\n", - "Dimensions: (time: 17520, gid: 11)\n", - "Coordinates:\n", - " * gid (gid) int64 88B 449211 452064 453020 ... 460613 462498\n", - " * time (time) datetime64[ns] 140kB 2022-01-01 ... 2022-12-31T...\n", - "Data variables:\n", - " temp_air (time, gid) float64 2MB ...\n", - " wind_speed (time, gid) float64 2MB ...\n", - " dhi (time, gid) float64 2MB ...\n", - " ghi (time, gid) float64 2MB ...\n", - " dni (time, gid) float64 2MB ...\n", - " relative_humidity (time, gid) float64 2MB ..." - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "geo_meta = pd.read_csv(os.path.join(TEST_DATA_DIR, \"summit-meta.csv\"), index_col=0)\n", - "\n", - "# Use xarray to open NetCDF file instead of pickle\n", - "geo_weather = xr.open_dataset(os.path.join(TEST_DATA_DIR, \"summit-weather.nc\"))\n", - "\n", - "geo_weather" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Creating Templates Manually\n", - "\n", - "`pvdeg.geospatial.ouput_template` we can produce a template for our result data.\n", - "\n", - "We need to do this because different functions return different types of values, some return multiple values as tuples, some return only single numerics, others return timeseries results. We need to specify the shape of our data to create an output xarray dataset." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Examples\n", - "\n", - "## 98ᵗʰ module percential temperature at Standoff Height\n", - "\n", - "Say we want to estimate the 98ᵗʰ percential temperature for the module at the given tilt, azimuth, and x_eff. `PVDeg` has a function to do this, `pvdeg.standards.T98_estimate` BUT it doesn't have a preset geospatial template. We will need to make one.\n", - "\n", - "- look at the function return values.\n", - "From the docstring we can see that `T98_estimate` only has one return value. 
IMPORTANT, this value is a single float, NOT a timeseries. This means our output shape will only be dependent on the input identifier and NOT time.\n", - "\n", - "Therefore we will map the output variable `T98` to the location identifier `gid` using a dictionary with `str: tuple` mappings.\n", - "\n", - " *IMPORTANT: you must use the syntax below where the variable maps to a tuple of the coordinates. in this case there needs to be a trailing comma in the tuple or python will iterate over the characters in the tuple instead of the elements. See further examples to alleviate confusion.*" - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:31.506307Z", - "iopub.status.busy": "2026-01-26T23:41:31.506307Z", - "iopub.status.idle": "2026-01-26T23:41:31.513666Z", - "shell.execute_reply": "2026-01-26T23:41:31.513666Z" - } - }, - "outputs": [], - "source": [ - "# define output shape\n", - "shapes = {\n", - " \"T98\": (\n", - " \"gid\",\n", - " ) # one return value at each datapoint, only dependent on datapoint, not time\n", - "}\n", - "\n", - "# create xarray template for output to be populated when analysis is run\n", - "template = pvdeg.geospatial.output_template(\n", - " ds_gids=geo_weather,\n", - " shapes=shapes,\n", - ")" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:31.513666Z", - "iopub.status.busy": "2026-01-26T23:41:31.513666Z", - "iopub.status.idle": "2026-01-26T23:41:32.174765Z", - "shell.execute_reply": "2026-01-26T23:41:32.174765Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The 
array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - } - ], - "source": [ - "geo_estimate_temp = pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " 
func=pvdeg.standards.T98_estimate,\n", - " template=template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Glass Glass Estimated Module Temperature\n", - "\n", - "Now we want to calculate geospatial timeseries temperature values for a module using `pvdeg.temperature.module`. This is not super practical because all `pvdeg` functions that need to use tempeature for their calculations preform the temperature calculation internally, this is just for show.\n", - "\n", - "This calculation differs from the above because the temperature functions return the module temperature in a timeseries format. So we care about 2 dimensions, location identifier and TIME." - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:32.174765Z", - "iopub.status.busy": "2026-01-26T23:41:32.174765Z", - "iopub.status.idle": "2026-01-26T23:41:32.182094Z", - "shell.execute_reply": "2026-01-26T23:41:32.182094Z" - } - }, - "outputs": [], - "source": [ - "# define output shape\n", - "shapes = {\n", - " \"module_temperature\": (\n", - " \"gid\",\n", - " \"time\",\n", - " ) # one return value at each datapoint, only dependent on datapoint, not time\n", - "}\n", - "\n", - "# create xarray template for output to be populated when analysis is run\n", - "temperature_template = pvdeg.geospatial.output_template(\n", - " ds_gids=geo_weather,\n", - " shapes=shapes,\n", - ")" - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:32.182094Z", - "iopub.status.busy": "2026-01-26T23:41:32.182094Z", - "iopub.status.idle": "2026-01-26T23:41:32.794610Z", - "shell.execute_reply": "2026-01-26T23:41:32.794610Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not 
provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 
was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - } - ], - "source": [ - "geo_temperature_res = pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.temperature.module,\n", - " template=temperature_template, # use the template we created\n", - " conf=\"open_rack_glass_glass\", # provide kwargs for function here\n", - ")" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:32.794610Z", - "iopub.status.busy": "2026-01-26T23:41:32.794610Z", - "iopub.status.idle": "2026-01-26T23:41:32.902114Z", - "shell.execute_reply": "2026-01-26T23:41:32.902114Z" - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "[]" - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAjgAAAGdCAYAAAAfTAk2AAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjMsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvZiW1igAAAAlwSFlzAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3Q+QVdV9B/DfAnELEtiQgkhYQKQN6GSE4Ehl2gbBIZo2xdZBS42Cf0ApDGPYsVlGAqFTSluTMWCsbWdMFmvbEDG1meZPYwNxTKCitKCAS4NC2GBWSSiLFGRxuZ17M7sFWdZdsw/Ys5/PzMl9795z77vvuLz3zbnn3FeWZVkWAAAJ6XGuTwAAoLMJOABAcgQcACA5Ag4AkBwBBwBIjoADACRHwAEAkiPgAADJ6RXd0IkTJ+K1116L97///VFWVnauTwcAaIf83sRvvvlmDBkyJHr0aLuPplsGnDzcVFZWnuvTAADeg7q6uhg6dGibdbplwMl7bpobqF+/fuf6dACAdjh06FDRQdH8Pd6Wbhlwmi9L5eFGwAGArqU9w0sMMgYAkiPgAADJEXAAgOQIOABAcgQcACA5Ag4AkBwBBwBIjoADACRHwAEAklPSgLN8+fKYOHFi9OnTJyoqKlqts2DBghg/fnyUl5fH2LFjO/yjW9dff31xR8Onnnqqk84aAOjqShpwGhsbY/r06TF37tw2691xxx1x8803d/j4X/ziF/0aOABwdn+LatmyZcWypqbmjHVWrVpVLPfv3x8vvvhiu4+9ZcuW+MIXvhAvvPBCXHzxxZ1wtgBAKrrkj20eOXIk/uiP/igefvjhGDx48LvWP3bsWFFO/jVSACBdXXKQ8ac//elibM+0adPaVX/FihXRv3//lpL/1DoAkK4OB5zq6upi3Etbpba2tjRnGxHf+MY3Yt26dcX4m/ZatGhRNDQ0tJS6urqSnR8A0AUvUVVVVcWsWbParDNy5MgolTzcvPLKK6fNyrrxxhvjt37rt+L73//+afvkM7TyAgB0Dx0OOAMHDizKuZL3IN11112nrPvIRz4SDz74YHzyk588Z+cFAHSTQcZ79+6NAwcOFMumpqZi5lNu1KhR0bdv3+Lxrl274vDhw1FfXx9Hjx5tqXPZZZfFBRdcEPv27YspU6bEY489FldddVUxqLi1gcXDhg2LSy65pJRvBwDoIkoacJYsWRKrV69ueT5u3LhiuX79+pg0aVLxOO+NeeaZZ06rs3v37hgxYkQcP348du7cWcycAgBoj7Isvx1wN5NPE89nU+UDjvv163euTwcA6OTv7y45TRwAoC0CDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSU7KAs3z58pg4cWL06dMnKioqWq2zYMGCGD9+fJSXl8fYsWPbfeyNGzfG5MmT48ILL4x+/frFb//2b8fRo0c78ewBgK6sZAGnsbExpk+fHnPnzm2z3h133BE333xzh8LNddddF1OnTo1NmzbF888/H/Pnz48ePXRGAQC/0CtKZNmyZcWypqbmjHVWrVpVLPfv3x8vvvhiu4776U9/uuj5qa6ubln34Q9/+Jc+XwAgHV2q2+ONN96I5557LgYNGlRc/rroooviYx/7WPzgBz9oc79jx47FoUOHTikAQLq6VMB59dVXi+XnPve5mD17dnznO9+Jj370ozFlypT40Y9+dMb9VqxYEf37928plZWVZ/GsAYDzOuDkl4XKysraLLW1tSU72RMnThT
Lu+++O26//fYYN25cPPjgg8Ulqi9/+ctn3G/RokXR0NDQUurq6kp2jgBAFxuDU1VVFbNmzWqzzsiRI6NULr744mJ52WWXnbJ+zJgxsXfv3jPul8/SygsA0D10KOAMHDiwKOfKiBEjYsiQIbFz585T1v/3f/93XH/99efsvACAbjKLKu9ROXDgQLFsamqKLVu2FOtHjRoVffv2LR7v2rUrDh8+HPX19cV9bJrr5D00F1xwQezbt68YX/PYY4/FVVddVVwCu++++2Lp0qVxxRVXFPfOWb16dXFZbO3ataV6KwBAF1OygLNkyZIifDTLx8vk1q9fH5MmTSoe33XXXfHMM8+cVmf37t1Fb83x48eL3pojR4601Ln33nvjrbfeKqaL5wEqDzpPP/10XHrppaV6KwBAF1OWZVkW3Uw+TTyfTZUPOM7vhAwApPX93aWmiQMAtIeAAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDklCzgLF++PCZOnBh9+vSJioqKVussWLAgxo8fH+Xl5TF27Nh2Hbe+vj5uvfXWGDx4cFx44YXx0Y9+NJ588slOPnsAoCsrWcBpbGyM6dOnx9y5c9usd8cdd8TNN9/c7uPedtttsXPnzvjGN74RL730UvzBH/xB3HTTTfFf//VfnXDWAEAKepXqwMuWLSuWNTU1Z6yzatWqYrl///548cUX23XcDRs2xCOPPBJXXXVV8Xzx4sXx4IMPxubNm2PcuHGdcu4AQNfW5cbg5Je91qxZEwcOHIgTJ07EV7/61Xjrrbdi0qRJZ9zn2LFjcejQoVMKAJCuLhdwvva1r8Xx48fjgx/8YDF25+67745//ud/jlGjRp1xnxUrVkT//v1bSmVl5Vk9ZwDgPA441dXVUVZW1mapra0t3dlGxGc/+9k4ePBg/Pu//3u88MILsXDhwmIMTj4e50wWLVoUDQ0NLaWurq6k5wgAdKExOFVVVTFr1qw264wcOTJK5ZVXXokvfelLsW3btrj88suLdVdccUU8++yz8fDDD8ff/M3ftLpf3tOTFwCge+hQwBk4cGBRzpUjR44Uyx49Tu146tmzZzEeBwCgpGNw9u7dG1u2bCmWTU1NxeO8HD58uKXOrl27inX5vW2OHj3aUiefYp7bt29fjB49OjZt2lQ8zx/nY23ycTf5urxH5wtf+EI8/fTTccMNN/gvCgCUdpr4kiVLYvXq1S3Pm6dwr1+/vmXG01133RXPPPPMaXV2794dI0aMKAYT5/e8ae65ed/73hff+ta3irFAn/zkJ4uwlAee/HU+8YlPlOqtAABdTFmWZdm5PomzLZ8mns+mygcc9+vX71yfDgDQyd/fXW6aOADAuxFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJCckgac5cuXx8SJE6NPnz5RUVFx2vatW7fGjBkzorKyMnr37h1jxoyJlStXvutxDxw4ELfcckv069evOO6dd94Zhw8fLtG7AAC6ml6lPHhjY2N
Mnz49rr766nj00UdP27558+YYNGhQPP7440XI2bBhQ8yZMyd69uwZ8+fPP+Nx83Dz05/+NJ5++uk4fvx43H777cV+//iP/1jKtwMAdBFlWZZlpX6RmpqauPfee+PgwYPvWnfevHnx8ssvx7p161rdnm+77LLL4vnnn48rr7yyWPed73wnPvGJT8RPfvKTGDJkyLu+xqFDh6J///7R0NBQ9AIBAOe/jnx/n3djcPKTHjBgwBm3b9y4sbgs1Rxuctdee2306NEjnnvuuVb3OXbsWNEoJxcAIF3nVcDJL1GtWbOmuNx0JvX19cVlrZP16tWrCEX5ttasWLGiSHzNJb8cBgCkq8MBp7q6OsrKytostbW1HT6Rbdu2xbRp02Lp0qUxderU6EyLFi0qeoaaS11dXaceHwDo4oOMq6qqYtasWW3WGTlyZIeOuWPHjpgyZUrRc7N48eI26w4ePDjeeOONU9a9/fbbxcyqfFtrysvLiwIAdA8dDjgDBw4sSmfZvn17TJ48OWbOnFlMK383+YysfLByPgNr/Pjxxbp8QPKJEydiwoQJnXZeAEDXVdIxOHv37o0tW7YUy6ampuJxXprvWZNflrrmmmuKS1ILFy4sxtDkZf/+/S3H2LRpU4wePTr27dtXPM/vlXPdddfF7Nmzi20//OEPiynlf/iHf9iuGVQAQPpKeh+cJUuWxOrVq1uejxs3rliuX78+Jk2aFGvXri3CTH4fnLw0Gz58eOzZs6d4fOTIkdi5c2dxv5tm//AP/1CEmvyyVj576sYbb4xVq1aV8q0AAF3IWbkPzvnGfXAAoOvp0vfBAQD4ZQk4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQHAEHAEhOSQPO8uXLY+LEidGnT5+oqKg4bfvWrVtjxowZUVlZGb17944xY8bEypUr2zzmnj174s4774xLLrmk2OfSSy+NpUuXRmNjYwnfCQDQlfQq5cHz0DF9+vS4+uqr49FHHz1t++bNm2PQoEHx+OOPFyFnw4YNMWfOnOjZs2fMnz+/1WPW1tbGiRMn4m//9m9j1KhRsW3btpg9e3b87//+b3z+858v5dsBALqIsizLslK/SE1NTdx7771x8ODBd607b968ePnll2PdunXtPv4DDzwQjzzySLz66qvtqn/o0KHo379/NDQ0RL9+/dr9OgDAudOR7++S9uC8F/lJDxgwoFP3OXbsWFFObiAAIF3n1SDj/BLVmjVristU7bVr16546KGH4u677z5jnRUrVhSJr7nkl8MAgHR1OOBUV1dHWVlZmyUfJ9NR+ViaadOmFQOGp06d2q599u3bF9ddd10xzicfh3MmixYtKnp5mktdXV2Hzw8A6Do6fImqqqoqZs2a1WadkSNHduiYO3bsiClTphQ9N4sXL27XPq+99lpcc801xSytv/u7v2uzbnl5eVEAgO6hwwFn4MCBReks27dvj8mTJ8fMmTOLaeXt7bnJw8348ePjK1/5SvTocV5daQMAzrGSJoO9e/fGli1bimVTU1PxOC+HDx9uuSyVB5X8ktTChQujvr6+KPv37285xqZNm2L06NFFqMnly0mTJsWwYcOKaeF53eb9AABKPotqyZIlsXr16pbn48aNK5br168vQsratWuLgJLfBycvzYYPH17c0C935MiR2LlzZxw/frx4/vTTTxcDi/MydOjQU17vLMx4BwC6gLNyH5zzjfvgAEDa398GrwAAyRFwAIDkCDgAQHIEHAAgOQIOAJA
cAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJCckgac5cuXx8SJE6NPnz5RUVFx2vatW7fGjBkzorKyMnr37h1jxoyJlStXtvv4x44di7Fjx0ZZWVls2bKlk88eAOiqShpwGhsbY/r06TF37txWt2/evDkGDRoUjz/+eGzfvj3uv//+WLRoUXzpS19q1/H/5E/+JIYMGdLJZw0AdHW9SnnwZcuWFcuamppWt99xxx2nPB85cmRs3Lgxvv71r8f8+fPbPPa3v/3t+O53vxtPPvlk8RgA4KwEnPeioaEhBgwY0Gad119/PWbPnh1PPfVUcfmrPZey8tLs0KFDnXKuAMD56bwaZLxhw4ZYs2ZNzJkz54x1siyLWbNmxT333BNXXnllu467YsWK6N+/f0vJx/wAAOnqcMCprq4uBvW2VWprazt8Itu2bYtp06bF0qVLY+rUqWes99BDD8Wbb75ZjNVpr7xu3jPUXOrq6jp8fgBAwpeoqqqqih6UtuRjaTpix44dMWXKlKLnZvHixW3WXbduXTFOp7y8/JT1eW/OLbfcEqtXrz5tn7zuO+sDAOnqcMAZOHBgUTpLPntq8uTJMXPmzGJa+btZtWpV/Nmf/VnL89deey0+/vGPF5e2JkyY0GnnBQB0XSUdZLx37944cOBAsWxqamq5V82oUaOib9++xWWpPNzkAWXhwoVRX19fbO/Zs2dLiNq0aVPcdttt8b3vfS8+9KEPxbBhw055jfw4uUsvvTSGDh1ayrcDAHQRJQ04S5YsOeWS0bhx44rl+vXrY9KkSbF27drYv39/cR+cvDQbPnx47Nmzp3h85MiR2LlzZxw/fryUpwoAJKQsy6cldTP5NPF8NlU+4Lhfv37n+nQAgE7+/j6vpokDAHQGAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAyRFwAIDkCDgAQHIEHAAgOQIOAJAcAQcASI6AAwAkR8ABAJIj4AAAySlpwFm+fHlMnDgx+vTpExUVFadt37p1a8yYMSMqKyujd+/eMWbMmFi5cmW7jv3Nb34zJkyYUOz3gQ98IG644YYSvAMAoCvqVcqDNzY2xvTp0+Pqq6+ORx999LTtmzdvjkGDBsXjjz9ehJwNGzbEnDlzomfPnjF//vwzHvfJJ5+M2bNnx5//+Z/H5MmT4+23345t27aV8q0AAF1IWZZlWalfpKamJu699944ePDgu9adN29evPzyy7Fu3bpWt+dhZsSIEbFs2bK4884739P5HDp0KPr37x8NDQ3Rr1+/93QMAODs6sj393k3Bic/6QEDBpxx+3/+53/Gvn37okePHjFu3Li4+OKL4/rrr2+zB+fYsWNFo5xcAIB0nVcBJ79EtWbNmuIy1Zm8+uqrxfJzn/tcLF68OP71X/+1GIMzadKkOHDgQKv7rFixokh8zSW/HAYApKvDAae6ujrKysraLLW1tR0+kbwHZtq0abF06dKYOnXqGeudOHGiWN5///1x4403xvjx4+MrX/lK8bpPPPFEq/ssWrSo6BlqLnV1dR0+PwAg4UHGVVVVMWvWrDbrjBw5skPH3LFjR0yZMqXoucl
7ZdqSX5LKXXbZZS3rysvLi9fcu3dvq/vk2/MCAHQPHQ44AwcOLEpn2b59ezETaubMmcW08neT99jkYWXnzp3xm7/5m8W648ePx549e2L48OGddl4AQNdV0jE4eY/Kli1bimVTU1PxOC+HDx9uuSx1zTXXFJekFi5cGPX19UXZv39/yzE2bdoUo0ePLgYW5/JR0/fcc09xKeu73/1uEXTmzp1bbMunpAMAlPQ+OEuWLInVq1e3PM9nPeXWr19fDApeu3ZtEWby++DkpVneE5P3yOSOHDlShJi8l6bZAw88EL169Ypbb701jh49WtzwL59Wng82BgA4K/fBOd+4Dw4AdD1d+j44AAC/LAEHAEiOgAMAJEfAAQCSI+AAAMkRcACA5Ag4AEByBBwAIDkCDgCQnJL+VMP5qvnmzfkdEQGArqH5e7s9P8LQLQPOm2++WSwrKyvP9akAAO/hezz/yYa2dMvfojpx4kS89tpr8f73vz/Kysqiu8sTcR726urq/DZXCWnns0M7nz3a+uzQzv8vjyx5uBkyZEj06NH2KJtu2YOTN8rQoUPP9Wmcd/J/ON39H8/ZoJ3PDu189mjrs0M7/8K79dw0M8gYAEiOgAMAJEfAIcrLy2Pp0qXFktLRzmeHdj57tPXZoZ3fm245yBgASJseHAAgOQIOAJAcAQcASI6AAwAkR8DpBg4cOBC33HJLcYOoioqKuPPOO+Pw4cNt7vPWW2/FvHnz4oMf/GD07ds3brzxxnj99ddbrfvzn/+8uHFiflfogwcPRndWirbeunVrzJgxo7iTae/evWPMmDGxcuXK6E4efvjhGDFiRPzKr/xKTJgwITZt2tRm/SeeeCJGjx5d1P/IRz4S3/rWt07Zns+tWLJkSVx88cVFm1577bXxox/9KLq7zmzn48ePx2c+85li/YUXXljcefa2224r7iLf3XX23/PJ7rnnnuKz+Itf/GIJzryLyWdRkbbrrrsuu+KKK7L/+I//yJ599tls1KhR2YwZM9rc55577skqKyuz733ve9kLL7yQ/cZv/EY2ceLEVutOmzYtu/766/PZeNn//M//ZN1ZKdr60UcfzRYsWJB9//vfz1555ZXs7//+77PevXtnDz30UNYdfPWrX80uuOCC7Mtf/nK2ffv2bPbs2VlFRUX2+uuvt1r/hz/8YdazZ8/sr/7qr7IdO3Zkixcvzt73vvdlL730Ukudv/iLv8j69++fPfXUU9nWrVuz3/u938suueSS7OjRo1l31dntfPDgwezaa6/N1qxZk9XW1mYbN27Mrrrqqmz8+PFZd1aKv+dmX//614vPnyFDhmQPPvhg1t0JOInL/0HkweP5559vWfftb387Kysry/bt29fqPvkHU/4P6IknnmhZ9/LLLxfHyT+kTvbXf/3X2cc+9rHiy7m7B5xSt/XJ/viP/zi75pprsu4g/1KcN29ey/OmpqbiA3zFihWt1r/pppuy3/md3zll3YQJE7K77767eHzixIls8ODB2QMPPHDKf4fy8vLsn/7pn7LuqrPbuTWbNm0q/rZ//OMfZ91Vqdr5Jz/5SfahD30o27ZtWzZ8+HABJ8syl6gSt3HjxuJSyZVXXtmyLu+Oz3+P67nnnmt1n82bNxfdy3m9Znn36LBhw4rjNduxY0f86Z/+aTz22GPv+qNn3UEp2/qdGhoaYsCAAZG6xsbGoo1Obp+8PfPnZ2qffP3J9XMf//jHW+rv3r076uvrT6mT/7ZNfqmgrTZPWSna+Ux/t/nlk/zfSXdUqnbOf0D61ltvjfvuuy8uv/zyEr6DrsW3UuLyD/JBgwadsq5Xr17Fl2O+7Uz7XHDBBad9CF100UUt+xw7dqwYF/LAAw8UX8aUrq3facOGDbFmzZqYM2dOpO5nP/tZNDU1Fe3R3vbJ17dVv3nZkWOmrhTt3NpYs3xMTv650V1/MLJU7fyXf/mXxWfNggULSnTmXZOA00VVV1cX/0+orVJbW1uy11+0aFEx2PV
Tn/pUpO5ct/XJtm3bFtOmTStu2z516tSz8prwy8p7KW+66aZicPcjjzxyrk8nKXmPUD7poKampvgs4v/1OukxXUhVVVXMmjWrzTojR46MwYMHxxtvvHHK+rfffruY7ZNva02+Pu9KzWdEndyzkM/sad5n3bp18dJLL8XatWuL582/+PGrv/qrcf/998eyZcsiFee6rU++JDhlypSi52bx4sXRHeR/Tz179jxtBl9r7dMsX99W/eZlvi6fRXVynbFjx0Z3VIp2fme4+fGPf1x8bnTX3ptStfOzzz5bfO6c3JOe9xJVVVUVM6n27NkT3da5HgTE2Rn4ms/OafZv//Zv7Rr4unbt2pZ1+SyIkwe+7tq1qxjF31zyGQH59g0bNpxxNkDqStXWuXzg4KBBg7L77rsv646DMufPn3/KoMx8MGVbgzJ/93d/95R1V1999WmDjD//+c+3bG9oaDDIuJPbOdfY2JjdcMMN2eWXX5698cYbJTz77tvOP/vZz075LM5LPmj5M5/5TPFZ0p0JON1k6vK4ceOy5557LvvBD36Q/dqv/dopU5fz0fcf/vCHi+0nT10eNmxYtm7duuILO/8HlZczWb9+fbefRVWqts4/sAYOHJh96lOfyn7605+2lO7yhZFPq83DR01NTREi58yZU0yrra+vL7bfeuutWXV19SnTanv16lUEmHxG2tKlS1udJp4f41/+5V+yF198sbjVgWnindvOebjJp98PHTo027Jlyyl/u8eOHcu6q1L8Pb+TWVS/IOB0Az//+c+LL9m+fftm/fr1y26//fbszTffbNm+e/fuIpzkIaVZ/kGfT0X+wAc+kPXp0yf7/d///eKD6UwEnNK1df6Blu/zzpJ/iHUX+T1/8hCY3z8k/3/A+X2GmuW3KZg5c+Yp9b/2ta9lv/7rv17Uz3sPvvnNb56yPe/F+exnP5tddNFFxZfNlClTsp07d2bdXWe2c/Pfemvl5L//7qiz/57fScD5hbL8f871ZTIAgM5kFhUAkBwBBwBIjoADACRHwAEAkiPgAADJEXAAgOQIOABAcgQcACA5Ag4AkBwBBwBIjoADACRHwAEAIjX/B5a1SMPwe3bYAAAAAElFTkSuQmCC", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "# plot the temperature timeseries at ONE location; the geospatial analysis computed all of them\n", - "import matplotlib.pyplot as plt\n", - "\n", - "module_temps = (\n", - " geo_temperature_res[\"module_temperature\"].sel(latitude=39.89, longitude=-106.42, method=\"nearest\").values\n", - ")\n", - "\n", - "plt.plot(module_temps)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Self-Explaining Code\n", - "\n", - "If we want to add templates for other functions, we can also look at the three pre-saved templates for existing `pvdeg` functions. Visit [pvdeg.geospatial.template_parameters](../../pvdeg/geospatial.py) and inspect this function to see how the different target functions use templates and shapes." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Creating Templates Programmatically\n", - "\n", - "We can use `pvdeg.geospatial.auto_template` to generate a template for a given function. This can produce an incorrect template that fails or behaves improperly when `pvdeg.geospatial.analysis` is run with it, so scrutinize the result to make sure it has the correct format." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Examples Below\n", - "Steps:\n", - "- Create a template using the autotemplating function. It inspects the target function to determine the shape of the output. It is not usable on functions with ambiguous return types.\n", - "- Call the geospatial analysis function using the template" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Cell Temperature Calculation\n", - "As shown below, we have two options: we can provide a template generated by a function that supports autotemplating. 
Or we can provide the function to `geospatial.analysis` and let it generate a template internally.\n", - "\n", - "## Providing a Template with `geospatial.auto_template`" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:32.902114Z", - "iopub.status.busy": "2026-01-26T23:41:32.902114Z", - "iopub.status.idle": "2026-01-26T23:41:33.570787Z", - "shell.execute_reply": "2026-01-26T23:41:33.570787Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array 
azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:    (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude   (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n",
-       "  * longitude  (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    cell       (latitude, longitude) float64 640B nan nan nan ... nan nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " cell (latitude, longitude) float64 640B nan nan nan ... nan nan nan" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "# create a template using auto_template for the desired function\n", - "cell_temp_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.temperature.cell, ds_gids=geo_weather\n", - ")\n", - "\n", - "# run the geospatial analysis with the template\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.temperature.cell,\n", - " template=cell_temp_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Analysis Without Providing a Template\n", - "\n", - "If a function is supported by `geospatial.auto_template` we do not need to create a template outside of the function as shown in the cell above. We can simply pass the function to `geospatial.analysis` and it will create a template for us." 
- ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:33.570787Z", - "iopub.status.busy": "2026-01-26T23:41:33.570787Z", - "iopub.status.idle": "2026-01-26T23:41:34.225816Z", - "shell.execute_reply": "2026-01-26T23:41:34.225816Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, 
therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:    (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude   (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n",
-       "  * longitude  (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    cell       (latitude, longitude) float64 640B nan nan nan ... nan nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " cell (latitude, longitude) float64 640B nan nan nan ... nan nan nan" - ] - }, - "execution_count": 9, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.temperature.cell,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Module Temperature Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:34.225816Z", - "iopub.status.busy": "2026-01-26T23:41:34.225816Z", - "iopub.status.idle": "2026-01-26T23:41:34.955817Z", - "shell.execute_reply": "2026-01-26T23:41:34.955817Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The 
array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:    (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude   (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n",
-       "  * longitude  (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    module     (latitude, longitude) float64 640B nan nan nan ... nan nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " module (latitude, longitude) float64 640B nan nan nan ... nan nan nan" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "module_temp_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.temperature.module, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.temperature.module,\n", - " template=module_temp_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Solar Position Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:34.956831Z", - "iopub.status.busy": "2026-01-26T23:41:34.956831Z", - "iopub.status.idle": "2026-01-26T23:41:35.622520Z", - "shell.execute_reply": "2026-01-26T23:41:35.622520Z" - } - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 4kB\n",
-       "Dimensions:             (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude            (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n",
-       "  * longitude           (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    apparent_zenith     (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    zenith              (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    apparent_elevation  (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    elevation           (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    azimuth             (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    equation_of_time    (latitude, longitude) float64 640B nan nan ... nan nan
" - ], - "text/plain": [ - " Size: 4kB\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " apparent_zenith (latitude, longitude) float64 640B nan nan ... nan nan\n", - " zenith (latitude, longitude) float64 640B nan nan ... nan nan\n", - " apparent_elevation (latitude, longitude) float64 640B nan nan ... nan nan\n", - " elevation (latitude, longitude) float64 640B nan nan ... nan nan\n", - " azimuth (latitude, longitude) float64 640B nan nan ... nan nan\n", - " equation_of_time (latitude, longitude) float64 640B nan nan ... nan nan" - ] - }, - "execution_count": 11, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "solar_position_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.spectral.solar_position, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.spectral.solar_position,\n", - " template=solar_position_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial POA Irradiance Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 12, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:35.622520Z", - "iopub.status.busy": "2026-01-26T23:41:35.622520Z", - "iopub.status.idle": "2026-01-26T23:41:36.277759Z", - "shell.execute_reply": "2026-01-26T23:41:36.277759Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not 
provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 3kB\n",
-       "Dimensions:             (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude            (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n",
-       "  * longitude           (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    poa_global          (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    poa_direct          (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    poa_diffuse         (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    poa_sky_diffuse     (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    poa_ground_diffuse  (latitude, longitude) float64 640B nan nan ... nan nan
" - ], - "text/plain": [ - " Size: 3kB\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " poa_global (latitude, longitude) float64 640B nan nan ... nan nan\n", - " poa_direct (latitude, longitude) float64 640B nan nan ... nan nan\n", - " poa_diffuse (latitude, longitude) float64 640B nan nan ... nan nan\n", - " poa_sky_diffuse (latitude, longitude) float64 640B nan nan ... nan nan\n", - " poa_ground_diffuse (latitude, longitude) float64 640B nan nan ... nan nan" - ] - }, - "execution_count": 12, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "poa_irradiance_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.spectral.poa_irradiance, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.spectral.poa_irradiance,\n", - " template=poa_irradiance_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial 98th Percentile Operating Temperature Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 13, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:36.277759Z", - "iopub.status.busy": "2026-01-26T23:41:36.277759Z", - "iopub.status.idle": "2026-01-26T23:41:36.910685Z", - "shell.execute_reply": "2026-01-26T23:41:36.909362Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was 
used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:       (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude      (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n",
-       "  * longitude     (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    T98_estimate  (latitude, longitude) float64 640B nan nan nan ... nan nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " T98_estimate (latitude, longitude) float64 640B nan nan nan ... nan nan nan" - ] - }, - "execution_count": 13, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "standoff_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.standards.T98_estimate, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.standards.T98_estimate,\n", - " template=standoff_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Module Humidity Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 14, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:36.910685Z", - "iopub.status.busy": "2026-01-26T23:41:36.910685Z", - "iopub.status.idle": "2026-01-26T23:41:38.327066Z", - "shell.execute_reply": "2026-01-26T23:41:38.327066Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array surface_tilt angle was not provided, therefore 
the latitude of 39.4 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 3kB\n",
-       "Dimensions:             (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude            (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n",
-       "  * longitude           (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    RH_surface_outside  (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    RH_front_encap      (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    RH_back_encap       (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    Ce_back_encap       (latitude, longitude) float64 640B nan nan ... nan nan\n",
-       "    RH_backsheet        (latitude, longitude) float64 640B nan nan ... nan nan
" - ], - "text/plain": [ - " Size: 3kB\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " RH_surface_outside (latitude, longitude) float64 640B nan nan ... nan nan\n", - " RH_front_encap (latitude, longitude) float64 640B nan nan ... nan nan\n", - " RH_back_encap (latitude, longitude) float64 640B nan nan ... nan nan\n", - " Ce_back_encap (latitude, longitude) float64 640B nan nan ... nan nan\n", - " RH_backsheet (latitude, longitude) float64 640B nan nan ... nan nan" - ] - }, - "execution_count": 14, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "humidity_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.humidity.module, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.humidity.module,\n", - " template=humidity_template,\n", - " backsheet_thickness=0.3,\n", - " back_encap_thickness=0.5,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial IwaVantHoff Environment Characterization Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 15, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:38.327066Z", - "iopub.status.busy": "2026-01-26T23:41:38.327066Z", - "iopub.status.idle": "2026-01-26T23:41:38.927072Z", - "shell.execute_reply": "2026-01-26T23:41:38.927072Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.9 was used.\n", - "The array azimuth was not 
provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.7 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.8 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.4 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n", - "The array surface_tilt angle was not provided, therefore the latitude of 39.5 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 was used.\n" - ] - }, - { - "name": "stdout", - "output_type": "stream", - "text": [ - "The array surface_tilt angle was not provided, therefore the latitude of 39.6 was used.\n", - "The array azimuth was not provided, therefore an azimuth of 180.0 
was used.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:      (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude     (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n",
-       "  * longitude    (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    IwaVantHoff  (latitude, longitude) float64 640B nan nan nan ... nan nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.69 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " IwaVantHoff (latitude, longitude) float64 640B nan nan nan ... nan nan nan" - ] - }, - "execution_count": 15, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "iwa_vant_hoff_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.degradation.IwaVantHoff, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.degradation.IwaVantHoff,\n", - " template=iwa_vant_hoff_template,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Geospatial Edge Seal Width Calculation" - ] - }, - { - "cell_type": "code", - "execution_count": 16, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:38.927072Z", - "iopub.status.busy": "2026-01-26T23:41:38.927072Z", - "iopub.status.idle": "2026-01-26T23:41:39.005944Z", - "shell.execute_reply": "2026-01-26T23:41:39.005944Z" - } - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 784B\n",
-       "Dimensions:          (latitude: 8, longitude: 10)\n",
-       "Coordinates:\n",
-       "  * latitude         (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n",
-       "  * longitude        (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n",
-       "Data variables:\n",
-       "    edge_seal_width  (latitude, longitude) float64 640B nan nan nan ... nan nan
" - ], - "text/plain": [ - " Size: 784B\n", - "Dimensions: (latitude: 8, longitude: 10)\n", - "Coordinates:\n", - " * latitude (latitude) float64 64B 39.41 39.45 39.53 ... 39.81 39.89\n", - " * longitude (longitude) float64 80B -106.4 -106.3 ... -105.9 -105.9\n", - "Data variables:\n", - " edge_seal_width (latitude, longitude) float64 640B nan nan nan ... nan nan" - ] - }, - "execution_count": 16, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "edge_seal_template = pvdeg.geospatial.auto_template(\n", - " func=pvdeg.design.edge_seal_width, ds_gids=geo_weather\n", - ")\n", - "\n", - "pvdeg.geospatial.analysis(\n", - " weather_ds=geo_weather,\n", - " meta_df=geo_meta,\n", - " func=pvdeg.design.edge_seal_width,\n", - " template=edge_seal_template,\n", - ")" - ] - }, - { - "cell_type": "code", - "execution_count": 17, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:41:39.008906Z", - "iopub.status.busy": "2026-01-26T23:41:39.008906Z", - "iopub.status.idle": "2026-01-26T23:41:39.018146Z", - "shell.execute_reply": "2026-01-26T23:41:39.018146Z" - } - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 176B\n",
-       "Dimensions:  (gid: 11)\n",
-       "Coordinates:\n",
-       "  * gid      (gid) int64 88B 449211 452064 453020 ... 459670 460613 462498\n",
-       "Data variables:\n",
-       "    width    (gid) float64 88B dask.array<chunksize=(11,), meta=np.ndarray>
" - ], - "text/plain": [ - " Size: 176B\n", - "Dimensions: (gid: 11)\n", - "Coordinates:\n", - " * gid (gid) int64 88B 449211 452064 453020 ... 459670 460613 462498\n", - "Data variables:\n", - " width (gid) float64 88B dask.array" - ] - }, - "execution_count": 17, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "edge_seal_template" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "pvdeg", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.12.9" - } - }, - "nbformat": 4, - "nbformat_minor": 4 -} diff --git a/tutorials/04_geospatial/08_module_standoff_iec63126.ipynb b/tutorials/04_geospatial/08_module_standoff_iec63126.ipynb deleted file mode 100644 index 99add4da..00000000 --- a/tutorials/04_geospatial/08_module_standoff_iec63126.ipynb +++ /dev/null @@ -1,723 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Tools - Module Standoff for IEC TS 63126\n", - "\n", - "## Calculation of module standoff distance according to IEC TS 63126\n", - "\n", - "**Requirements:**\n", - "- Local weather data file or site longitude and latittude\n", - "\n", - "**Objectives:**\n", - "1. Import weather data.\n", - "2. Calculate installation standoff - Level 1 and Level 2.\n", - "3. Calculate $X_{eff}$ from provided module temperature data.\n", - "4. Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$.\n", - "5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$.\n", - "6. 
Plot $X_{min}$ for Level 1, Level 2, or a $T_{98}$ for a given region.\n", - "\n", - "**Background:**\n", - "\n", - "This notebook calculates the minimum effective standoff distance ($X_{eff}$) necessary for roof-mounted PV modules to ensure that the $98^{th}$ percentile operating temperature, $T_{98}$, remains under 70°C for compliance with IEC 61730 and IEC 61215. For $T_{98}$ values above 70°C or 80°C, testing must be done to the specifications for Level 1 and Level 2 of IEC TS 63126. This method is outlined in the appendix of IEC TS 63126 and is based on the model from *[King 2004] and data from **[Fuentes, 1987] to model the approximate exponential decay in temperature, $T(X)$, with increasing standoff distance, $X$, as,\n", - "\n", - "$$ X = -X_0 \\ln\\left(1-\\frac{T_0-T}{\\Delta T}\\right), Equation 1 $$\n", - "\n", - "where $T_0$ is the temperature for $X=0$ (insulated-back) and $\\Delta T$ is the temperature difference between an insulated-back ($X=0$) and open-rack mounting configuration ($X=\\infty)$.\n", - "\n", - " We used pvlib and data from the National Solar Radiation Database (NSRDB) to calculate the module temperatures for the insulated-back and open-rack mounting configurations and apply our model to obtain the minimum standoff distance for roof-mounted PV systems to achieve a temperature lower than a specified $T_{98}$. The following figure showcases this calculation for the entire world for an $X_{eff}$ that results in $T_{98}$=70°C. Values of $X_{eff}$ higher than this will require Level 1 or Level 2 certification.\n", - "\n", - "$*$ D. L. King, W. E. Boyson, and J. A. Kratochvil, \"Photovoltaic array performance model,\" SAND2004-3535, Sandia National Laboratories, Albuquerque, NM, 2004.\\\n", - "$**$ M. K. Fuentes, \"A simplified thermal model for Flat-Plate photovoltaic arrays,\" United States, 1987. 
https://www.osti.gov/biblio/6802914\n" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "![alt text](images/T98_70C_standoff_Map.png)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# if running on google colab, uncomment the next line and execute this cell to install the dependencies\n", - "# and prevent \"ModuleNotFoundError\" in later cells:\n", - "#!pip install pvdeg" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "import os\n", - "import pvdeg\n", - "import pandas as pd\n", - "from pvdeg import DATA_DIR\n", - "import dask\n", - "import matplotlib.pyplot as plt\n", - "import seaborn as sns\n", - "import math\n", - "import numpy as np" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# This information helps with debugging and getting support :)\n", - "import sys\n", - "import platform\n", - "\n", - "print(\"Working on a \", platform.system(), platform.release())\n", - "print(\"Python version \", sys.version)\n", - "print(\"Pandas version \", pd.__version__)\n", - "print(\"pvdeg version \", pvdeg.__version__)\n", - "print(\"dask version\", dask.__version__)\n", - "print(DATA_DIR)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 1. Import Weather Data\n", - "\n", - "The function has these minimum requirements when using a weather data file:\n", - "- Weather data containing (at least) DNI, DHI, GHI, Temperature, RH, and Wind-Speed data at module level.\n", - "- Site meta-data containing (at least) latitude, longitude, and time zone.\n", - "\n", - "Alternatively, one may get meteorological data from the NSRDB or PVGIS with just the longitude and latitude. This function for the NSRDB (via NSRDB 'PSM3') works primarily for most of North America and South America. 
PVGIS works for most of the rest of the world (via SARAH 'PVGIS'). See the \"Weather Database Access.ipynb\" tutorial in PVDeg or Jensen et al. https://doi.org/10.1016/j.solener.2023.112092 for satellite coverage information.\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Get data from a supplied data file (Do not use the next box of code if using your own file)\n", - "weather_file = os.path.join(DATA_DIR, \"psm3_demo.csv\")\n", - "WEATHER_df, META = pvdeg.weather.read(weather_file, \"csv\")\n", - "print(META)" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# This routine will get a meteorological dataset from anywhere in the world where it is available\n", - "# weather_id = (24.7136, 46.6753) #Riyadh, Saudi Arabia\n", - "# weather_id = (35.6754, 139.65) #Tokyo, Japan\n", - "# weather_id = (-43.52646, 172.62165) #Christchurch, New Zealand\n", - "# weather_id = (64.84031, -147.73836) #Fairbanks, Alaska\n", - "# weather_id = (65.14037, -21.91633) #Reykjavik, Iceland\n", - "# weather_id = (33.4152, -111.8315) #Mesa, Arizona\n", - "# WEATHER_df, META = pvdeg.weather.get_anywhere(id=weather_id)\n", - "# print(META)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 2. Calculate Installation Standoff Minimum - Level 1 and Level 2\n", - "\n", - "According to IEC TS 63126, Level 0, Level 1 and Level 2 certification is limited to T₉₈<70°C, <80°C and <90°C, respectively. Level 0 certification is essentially compliance with IEC 61730 and IEC 61215. The default value of T₉₈<70°C represents the minimum gap to avoid higher temperature certification according to IEC TS 63126. 
This minimum standoff ($x_{min}$) is the distance between the bottom of the module frame and the roof and can be estimated for a given environment as,\n", - "\n", - "$$ X_{min} = -X_0 \\ln\\left(1-\\frac{T_{98,0}-T}{T_{98,0}-T_{98,inf}}\\right), Equation 2 $$\n", - "\n", - "where $T_{98,0}$ is the $98^{th}$ percentile temperature for an insulated back module and $T_{98,inf}$ is the $98^{th}$ percentile temperature for an open rack mounted module.\n", - "\n", - "Once the meteorological data has been obtained, the input parameter possibilities are:\n", - "\n", - "- T₉₈ : Does not necessarily need to be set at 70°C or 80°C for IEC TS 63126; you might want to use a different number to compensate for a thermal aspect of the particular system you are considering. The default is 70°C.\n", - "- tilt : tilt from horizontal of PV module. The default is 0°.\n", - "- azimuth : azimuth in degrees from North. The default is 180° for south facing.\n", - "- sky_model : pvlib compatible model for generating sky characteristics (Options: 'isotropic', 'klucher', 'haydavies', 'reindl', 'king', 'perez'). The default is 'isotropic'.\n", - "- temp_model : pvlib compatible module temperature model. (Options: 'sapm', 'pvsyst', 'faiman', 'sandia'). The default is 'sapm'.\n", - "- conf_0 : Temperature model for hottest mounting configuration. Default is \"insulated_back_glass_polymer\".\n", - "- conf_inf : Temperature model for open rack mounting. Default is \"open_rack_glass_polymer\".\n", - "- x_0 : thermal decay constant [cm] (see documentation). The default is 6.5 cm.\n", - "- wind_factor : Wind speed power law correction factor to account for different wind speed measurement heights between weather database (e.g. NSRDB) and the temperature model (e.g. SAPM). The default is 0.33."
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "standoff = pvdeg.standards.standoff(weather_df=WEATHER_df, meta=META)\n", - "print(pvdeg.standards.interpret_standoff(standoff))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The following is a full function call for both T₉₈=70°C and 80°C separately even though the second standoff distance can be calculated using only T98_0 and T98_inf. With this function, one may also want to change the tilt, azimuth, or T98." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "standoff_1 = pvdeg.standards.standoff(\n", - " weather_df=WEATHER_df,\n", - " meta=META,\n", - " T98=70,\n", - " tilt=META[\"latitude\"],\n", - " azimuth=None,\n", - " sky_model=\"isotropic\",\n", - " temp_model=\"sapm\",\n", - " conf_0=\"insulated_back_glass_polymer\",\n", - " conf_inf=\"open_rack_glass_polymer\",\n", - " x_0=6.5,\n", - " wind_factor=0.33,\n", - ")\n", - "print(\"First calculation standoff = \", \"%.1f\" % standoff_1[\"x\"].iloc[0], \" cm.\")\n", - "standoff_2 = pvdeg.standards.standoff(\n", - " weather_df=WEATHER_df,\n", - " meta=META,\n", - " T98=80,\n", - " tilt=META[\"latitude\"],\n", - " azimuth=None,\n", - " sky_model=\"isotropic\",\n", - " temp_model=\"sapm\",\n", - " conf_0=\"insulated_back_glass_polymer\",\n", - " conf_inf=\"open_rack_glass_polymer\",\n", - " x_0=6.5,\n", - " wind_factor=0.33,\n", - ")\n", - "print(\"Second calculation standoff = \", \"%.1f\" % standoff_2[\"x\"].iloc[0], \" cm.\")\n", - "print(pvdeg.standards.interpret_standoff(standoff_1=standoff_1, standoff_2=standoff_2))" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 3. 
Calculate $X_{eff}$ from provided module temperature data.\n", - "\n", - "To do this calculation, one must use a set of data with:\n", - " - meteorological irradiance data sufficient to calculate the POA irradiance (DHI, GHI, and DNI),\n", - " - ambient temperature data,\n", - " - wind speed at module height (wind_factor=0.33 will be used unless otherwise specified),\n", - " - temperature measurements of the module in the test system. Ideally this would be measured under a worst case scenario that maximizes the module temperature for a given site,\n", - " - geographic meta data including longitude and latitude.\n", - "\n", - "To create a weather file of your own, copy the format of the example file 'xeff_demo.csv'. This is formatted with the first row containing meta data variable names, the second row containing the corresponding values, the third row containing meteorological data headers, and all the remaining rows containing the meteorological data.\n", - "\n", - "To do this calculation, one should also filter the data to remove times when the sun is not shining or when snow is likely to be on the module. The recommended and programmed defaults are poa_min=100 W/m² and a minimum ambient temperature of t_amb_min=0°C."
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Read the weather file\n", - "weather_file = os.path.join(DATA_DIR, \"xeff_demo.csv\")\n", - "xeff_weather, xeff_meta = pvdeg.weather.read(weather_file, \"csv\")\n", - "# Pull measured temperature and calculate theoretical insulated back module temperature and open rack module temperature\n", - "T_0, T_inf, xeff_poa = pvdeg.standards.eff_gap_parameters(\n", - " weather_df=xeff_weather,\n", - " meta=xeff_meta,\n", - " sky_model=\"isotropic\",\n", - " temp_model=\"sapm\",\n", - " conf_0=\"insulated_back_glass_polymer\",\n", - " conf_inf=\"open_rack_glass_polymer\",\n", - " wind_factor=0.33,\n", - ")\n", - "# Now calculate X_eff.\n", - "x_eff = pvdeg.standards.eff_gap(\n", - " T_0,\n", - " T_inf,\n", - " xeff_weather[\"module_temperature\"],\n", - " xeff_weather[\"temp_air\"],\n", - " xeff_poa[\"poa_global\"],\n", - " x_0=6.5,\n", - " poa_min=100,\n", - " t_amb_min=0,\n", - ")\n", - "print(\"The effective standoff for this system is\", \"%.1f\" % x_eff, \"cm.\")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 4. Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$.\n", - "\n", - "Equation 2 can be reorganized as,\n", - "\n", - "$$ T_{98} = T_{98,0} -( T_{98,0}- T_{98,inf}) \\left(1-e^{-\\frac{x_{eff}}{x_{0}}}\\right), Equation 3 $$\n", - "\n", - "and used to calculate the $98^{th}$ percential temperature, $T_{98}$, for a PV system having a given effective standoff height, $X_{eff}$, for an arbitrarily oriented module. Here, $T_{98,0}$ is the $98^{th}$ percentile for an insulated-back module and $T_{98,inf}$ is the $98^{th}$ percentile for a rack-mounted module. The input parameter possibilities are the same as shown in Objective #2 above, but the example below uses the default parameters. The actual tilt [degrees], azimuth [degrees] and $X_{eff}$ [cm] can be modifed as desired." 
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# This is the minimal function call using the common default settings to estimate T₉₈.\n", - "T_98 = pvdeg.standards.T98_estimate(\n", - " weather_df=WEATHER_df, meta=META, tilt=-META[\"latitude\"], azimuth=None, x_eff=10\n", - ")\n", - "print(\"The 98ᵗʰ percential temperature is estimated to be\", \"%.1f\" % T_98, \"°C.\")" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# This code will calculate the temperature for an arbitrary x_eff distance. Either set of kwargs can be modified and use.\n", - "# irradiance_kwarg ={\n", - "# \"axis_tilt\": None,\n", - "# \"axis_azimuth\": None,\n", - "# \"x_eff\": 10,\n", - "# \"module_mount\": '1_axis'}\n", - "irradiance_kwarg = {\n", - " \"tilt\": META[\"latitude\"],\n", - " \"azimuth\": None,\n", - " \"x_eff\": 10,\n", - " \"module_mount\": \"fixed\",\n", - "}\n", - "\n", - "T_xeff = pvdeg.standards.x_eff_temperature_estimate(\n", - " weather_df=WEATHER_df, meta=META, **irradiance_kwarg\n", - ")\n", - "\n", - "print(\n", - " \"The 98ᵗʰ percential temperature is estimated to be\",\n", - " \"%.1f\" % np.percentile(T_xeff, 98),\n", - " \"°C.\",\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$.\n", - "\n", - "The temperature of a system is affected by the orientation. This section will scan all possible tilts and azimuths calculating the minimum standoff distance for a given $T_{98}$. Similar additional factors as above can also be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step for the 90° and 180° tilt and azimuth spans, respectively. The default for this calculation is for $T_{98}$=70°C, the boundary between Level 0 and Level 1 requirements. 
The temperature model information given below is unnecessary as these are default values that would get populated automatically. However, they were included here for clarity into a standard practice as per IEC TS 63126.\n", - "\n", - "$$ X_{min} = -X_0 \\ln\\left(1-\\frac{T_{98,0}-T}{ T_{98,0}- T_{98,inf}}\\right), Equation 2 $$" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Scans through all the azimuth and tilt running the minimum standoff calculation\n", - "# Set up keyword parameters for the calculation\n", - "\n", - "kwarg_x = dict(\n", - " sky_model=\"isotropic\",\n", - " temp_model=\"sapm\",\n", - " conf_0=\"insulated_back_glass_polymer\",\n", - " conf_inf=\"open_rack_glass_polymer\",\n", - " T98=70,\n", - " x_0=6.5,\n", - " wind_factor=0.33,\n", - ")\n", - "# Run the calculation\n", - "x_azimuth_step = 10\n", - "x_tilt_step = 10\n", - "standoff_series = pvdeg.utilities.tilt_azimuth_scan(\n", - " weather_df=WEATHER_df,\n", - " meta=META,\n", - " tilt_step=x_tilt_step,\n", - " azimuth_step=x_azimuth_step,\n", - " func=pvdeg.standards.standoff_x,\n", - " **kwarg_x,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The next cell creates a plot of the calculated data. 
Some of the things you may want to change are:\n", - "- cmap=\"Spectral_r\": Change to have different colors\n", - "- plt.title : This will change the plot title.\n", - "- figsize=(16,4) : Change the plot dimensions and/or aspect ratio.\n", - "- vmax=None : This can be set to a numeric value to control the depth scale maximum\n", - "- vmin=0 : This controls the minimum of the depth scale.\n", - "- h_ticks=37 : This changes the number of horizontal tick marks\n", - "- v_ticks=10 : This changes the number of vertical tick marks\n", - "- The last two lines output the plot as a *.png image file" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "standoff_series_df = pd.DataFrame(\n", - " {\n", - " \"Tilt\": standoff_series[:, 0],\n", - " \"Azimuth\": standoff_series[:, 1],\n", - " \"Xₘᵢₙ\": standoff_series[:, 2],\n", - " }\n", - ")\n", - "x_fig = plt.figure(figsize=(16, 4))\n", - "plt.title(\n", - " r\"Plot of $\it{Xₘᵢₙ}$ for all orientations for $\it{T₉₈}$=\"\n", - " + \"%.0f\" % kwarg_x[\"T98\"]\n", - " + \"°C.\",\n", - " fontsize=15,\n", - " y=1.08,\n", - ")\n", - "x_fig = sns.heatmap(\n", - " standoff_series_df.pivot(index=\"Tilt\", columns=\"Azimuth\", values=\"Xₘᵢₙ\"),\n", - " cbar_kws={\"label\": \"Xₘᵢₙ\", \"format\": \"%.0f\", \"pad\": 0.02},\n", - " cmap=\"Spectral_r\",\n", - " vmin=0,\n", - " vmax=None,\n", - ")\n", - "\n", - "h_ticks = 37\n", - "x_number = math.ceil(360 / x_azimuth_step) + 1\n", - "x_ticks = [\n", - " (x * (360 / (h_ticks - 1)) / x_azimuth_step + 0.5) for x in range(h_ticks - 1)\n", - "]\n", - "x_labels = [(\"%.0f\" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]\n", - "x_ticks.append(x_number - 0.5)\n", - "x_fig.set_xticks(x_ticks)\n", - "x_fig.set_xticklabels(x_labels, rotation=90)\n", - "\n", - "v_ticks = 10\n", - "y_number = math.ceil(90 / x_tilt_step) + 1\n", - "y_ticks = [(x * (90 / (v_ticks - 1)) / x_tilt_step + 0.5) for x in range(v_ticks - 1)]\n", -
"y_labels = [(\"%.0f\" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]\n", - "y_ticks.append(y_number - 0.5)\n", - "x_fig.set_yticks(y_ticks)\n", - "x_fig.set_yticklabels(y_labels, rotation=0)\n", - "\n", - "x_fig.set_xlabel(\"Azimuth [°]\", fontsize=15, labelpad=10)\n", - "x_fig.set_ylabel(\"Tilt [°]\", fontsize=15)\n", - "x_fig.figure.axes[-1].set_ylabel(r\"$\it{Xₘᵢₙ}$ [cm]\", size=15)\n", - "x_fig.invert_yaxis()\n", - "\n", - "output_folder = os.path.join(\n", - " os.path.dirname(os.path.dirname(os.getcwd())), \"TEMP\", \"results\"\n", - ")\n", - "try:\n", - " os.makedirs(output_folder)\n", - "except OSError as error:\n", - " print(error)\n", - "\n", - "plt.savefig(\n", - " os.path.join(output_folder, \"Standoff_Scan.png\"), bbox_inches=\"tight\"\n", - ") # Creates an image file of the standoff plot\n", - "plt.show()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 6. Plot $T_{98}$ for all azimuth and tilt for a given $X_{eff}$.\n", - "\n", - "The temperature of a system is affected by the orientation and the effective standoff, $X_{eff}$, of the system. This section will scan all possible tilts and azimuths, calculating the $T_{98}$ for a given $X_{eff}$. As above, additional factors can be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step over the 90° tilt and 360° azimuth spans, respectively. The default for this calculation is $X_{eff}$=10 cm, a common effective standoff distance on a rooftop system (the example below uses 5 cm). A value of $X_{eff}$=None will run the calculations for an open rack system and $X_{eff}$=0 for an insulated-back system.
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# Scans through all the azimuth and tilt running the 98ᵗʰ percentile temperature calculation.\n", - "# Set up keyword parameters for the calculation\n", - "kwarg_T = dict(\n", - " sky_model=\"isotropic\",\n", - " temp_model=\"sapm\",\n", - " conf_0=\"insulated_back_glass_polymer\",\n", - " conf_inf=\"open_rack_glass_polymer\",\n", - " x_eff=5,\n", - " x_0=6.5,\n", - " wind_factor=0.33,\n", - ")\n", - "# Run the calculation\n", - "T_azimuth_step = 10\n", - "T_tilt_step = 10\n", - "T98_series = pvdeg.utilities.tilt_azimuth_scan(\n", - " weather_df=WEATHER_df,\n", - " meta=META,\n", - " tilt_step=T_tilt_step,\n", - " azimuth_step=T_azimuth_step,\n", - " func=pvdeg.standards.T98_estimate,\n", - " **kwarg_T,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The next cell creates a plot of the calculated data. Some of the things you may want to change are:\n", - "- cmap=\"Spectral_r\": Change to have different colors\n", - "- plt.title : This will change the plot title.\n", - "- figsize=(16,4) : Change the plot dimensions and/or aspect ratio.\n", - "- vmax=None : This can be set to a numeric value to control the depth scale maximum\n", - "- vmin=None : This controls the minimum of the depth scale.\n", - "- h_ticks=37 : This changes the number of horizontal tick marks\n", - "- v_ticks=10 : This changes the number of vertical tick marks\n", - "- The last two lines output the plot as a *.png image file" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "# This produces the plot of the data\n", - "T98_series_df = pd.DataFrame(\n", - " {\"Tilt\": T98_series[:, 0], \"Azimuth\": T98_series[:, 1], \"T₉₈\": T98_series[:, 2]}\n", - ")\n", - "T98_fig = plt.figure(figsize=(16, 4))\n", - "if kwarg_T[\"x_eff\"] is None:\n", - " plt.title(\n", - " r\"Plot of 
$\it{T₉₈}$ for all orientations for an open-rack mounting.\",\n", - " fontsize=15,\n", - " y=1.08,\n", - " )\n", - "else:\n", - " plt.title(\n", - " r\"Plot of $\it{T₉₈}$ for all orientations for $X_{eff}$=\"\n", - " + \"%.0f\" % kwarg_T[\"x_eff\"]\n", - " + \" cm.\",\n", - " fontsize=15,\n", - " y=1.08,\n", - " )\n", - "T98_fig = sns.heatmap(\n", - " T98_series_df.pivot(index=\"Tilt\", columns=\"Azimuth\", values=\"T₉₈\"),\n", - " cbar_kws={\"label\": \"T₉₈\", \"format\": \"%.0f\", \"pad\": 0.02},\n", - " cmap=\"Spectral_r\",\n", - " vmin=None,\n", - " vmax=None,\n", - ")\n", - "\n", - "h_ticks = 37\n", - "x_number = math.ceil(360 / T_azimuth_step) + 1\n", - "x_ticks = [\n", - " (x * (360 / (h_ticks - 1)) / T_azimuth_step + 0.5) for x in range(h_ticks - 1)\n", - "]\n", - "x_labels = [(\"%.0f\" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]\n", - "x_ticks.append(x_number - 0.5)\n", - "T98_fig.set_xticks(x_ticks)\n", - "T98_fig.set_xticklabels(x_labels, rotation=90)\n", - "\n", - "v_ticks = 10\n", - "y_number = math.ceil(90 / T_tilt_step) + 1\n", - "y_ticks = [(x * (90 / (v_ticks - 1)) / T_tilt_step + 0.5) for x in range(v_ticks - 1)]\n", - "y_labels = [(\"%.0f\" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]\n", - "y_ticks.append(y_number - 0.5)\n", - "T98_fig.set_yticks(y_ticks)\n", - "T98_fig.set_yticklabels(y_labels, rotation=0)\n", - "\n", - "T98_fig.set_xlabel(\"Azimuth [°]\", fontsize=15, labelpad=10)\n", - "T98_fig.set_ylabel(\"Tilt [°]\", fontsize=15)\n", - "T98_fig.figure.axes[-1].set_ylabel(r\"$\it{T₉₈}$ [°C]\", size=15)\n", - "T98_fig.invert_yaxis()\n", - "\n", - "plt.savefig(\n", - " os.path.join(output_folder, \"T98_Scan.png\"), bbox_inches=\"tight\"\n", - ") # Creates an image file of the T98 plot\n", - "plt.show()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 7. 
Plot $X_{min}$ for a $T_{98}$, and plot $T_{98}$ for a given region.\n", - "\n", - "This last Objective is much more complicated and is set up to utilize access to substantial computational power, running many sites simultaneously to create a regional map of standoff distance.\n", - "For more in-depth instructions on doing this, look at the tutorial \"04_scenario_geospatial.ipynb\" here in PVDeg.\n", - "\n", - "Step #1: Create an object, \"geospatial_standoff_scenario\", to be used to run the computations." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario = pvdeg.GeospatialScenario(\n", - " name=\"standoff geospatial\",\n", - " geospatial=True,\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Step #2: Identify a subset of locations from the database to run the computations.\n", - "Specifically, all are from the NSRDB." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario.addLocation(\n", - " state=\"Colorado\", county=\"Summit\"\n", - ") # Identifies a subset of locations from the database to run the computations. Specifically all are from the NSRDB." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Step #3: Indicate which function will be run. Here the default is the standoff calculation, but it could be any other function with a keyword argument dictionary.\n", - "Here the 98th percentile temperature is defined as 70°C, but any arbitrary value can be specified."
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario.addJob(\n", - " func=pvdeg.standards.standoff, func_params={\"T98\": 70}\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Step #4: Run the scenario" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario.run()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Step #5: Create a plot of the standoff calculation." - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario.plot_world(\"x\")" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "geospatial_standoff_scenario.plot_world(\"T98_inf\")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# 8. Save data outputs.\n", - "\n", - "This cell contains a number of pre-scripted commands for exporting and saving data. The code to save plots is located after the plot creation and is blocked by default. First check that the output folder exists, then unblock the code for data you would like to save." 
- ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": { - "scrolled": true - }, - "outputs": [], - "source": [ - "print(\"Your results will be stored in %s\" % output_folder)\n", - "print(\"The folder must already exist or the file will not be created\")\n", - "\n", - "pvdeg.weather.write(\n", - " data_df=WEATHER_df,\n", - " metadata=META,\n", - " savefile=os.path.join(output_folder, \"WeatherFile.csv\"),\n", - ") # Writes the meteorological data to a *.csv file.\n", - "\n", - "pd.DataFrame(standoff_series_df).to_csv(\n", - " os.path.join(output_folder, \"Standoff_Scan.csv\")\n", - ") # Writes a file with the Tilt and Azimuth scan calculations of standoff.\n", - "\n", - "pd.DataFrame(T98_series_df).to_csv(\n", - " os.path.join(output_folder, \"T98_Scan.csv\")\n", - ") # Writes a file with the Tilt and Azimuth scan calculations of T98." - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "pvdeg", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.12.9" - } - }, - "nbformat": 4, - "nbformat_minor": 4 -} diff --git a/tutorials/04_geospatial/scripts/02_geospatial_templates.py b/tutorials/04_geospatial/scripts/02_geospatial_templates.py deleted file mode 100644 index 730ec803..00000000 --- a/tutorials/04_geospatial/scripts/02_geospatial_templates.py +++ /dev/null @@ -1,282 +0,0 @@ -# %% [markdown] -# # Geospatial Templates (HPC) -# - -# %% -import pvdeg -from pvdeg import TEST_DATA_DIR -import pandas as pd -import os -import xarray as xr - - -# %% [markdown] -# # Geospatial Templates -# -# When running a geospatial analysis using `pvdeg.geospatial.analysis` on arbitrary `pvdeg` functions, you will need to specify a template for the shape of the output data. 
This is because the input data comes with dimensions of gid and time, while the output will have data in a different shape, usually corresponding to coordinates. -# - gid, identification number corresponding to an NSRDB datapoint's location -# - time, timeseries corresponding to the hourly time indices of an NSRDB datapoint's yearly meteorological data. -# -# Follow the steps below to see how we generate templates before running the analysis. -# -# The only functions where this is not required are currently `pvdeg.standards.standoff`, `pvdeg.humidity.module`, and `letid.calc_letid_outdoors`, as they are predefined within the package. - -# %% [markdown] -# # Loading Geospatial Data -# -# This step skips over making the `pvdeg.weather.get` call with `geospatial == True`. See the [Duramat Demo](../10_workshop_demos/02_duramat_live_demo.ipynb) for information on how to do this traditionally. -# -# We can also use a `GeospatialScenario` object. See the [Geospatial Scenario Tutorial](./04_scenario_geospatial.ipynb) for more information on how to use this approach. -# -# *The cell below loads the dataset from a `.nc` (netCDF) file, the recommended storage format for xarray datasets.* - -# %% -geo_meta = pd.read_csv(os.path.join(TEST_DATA_DIR, "summit-meta.csv"), index_col=0) - -# Use xarray to open NetCDF file instead of pickle -geo_weather = xr.open_dataset(os.path.join(TEST_DATA_DIR, "summit-weather.nc")) - -geo_weather - - -# %% [markdown] -# # Creating Templates Manually -# -# With `pvdeg.geospatial.output_template` we can produce a template for our result data. -# -# We need to do this because different functions return different types of values: some return multiple values as tuples, some return only single numerics, and others return timeseries results. We need to specify the shape of our data to create an output xarray dataset. 
- -# %% [markdown] -# # Examples -# -# ## 98ᵗʰ Percentile Module Temperature at Standoff Height -# -# Say we want to estimate the 98ᵗʰ percentile temperature for the module at the given tilt, azimuth, and x_eff. `PVDeg` has a function to do this, `pvdeg.standards.T98_estimate`, BUT it doesn't have a preset geospatial template. We will need to make one. -# -# - look at the function return values. -# From the docstring we can see that `T98_estimate` only has one return value. IMPORTANT, this value is a single float, NOT a timeseries. This means our output shape will only be dependent on the input identifier and NOT time. -# -# Therefore we will map the output variable `T98` to the location identifier `gid` using a dictionary with `str: tuple` mappings. -# -# *IMPORTANT: you must use the syntax below where the variable maps to a tuple of the coordinates. In this case there needs to be a trailing comma in the tuple, or Python will treat the parentheses as simple grouping rather than a one-element tuple. See further examples to alleviate confusion.* - -# %% -# define output shape -shapes = { - "T98": ( - "gid", - ) # one return value at each datapoint, only dependent on datapoint, not time -} - -# create xarray template for output to be populated when analysis is run -template = pvdeg.geospatial.output_template( - ds_gids=geo_weather, - shapes=shapes, -) - -# %% -geo_estimate_temp = pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.standards.T98_estimate, - template=template, -) - -# %% [markdown] -# # Glass Glass Estimated Module Temperature -# -# Now we want to calculate geospatial timeseries temperature values for a module using `pvdeg.temperature.module`. This is not especially practical, because all `pvdeg` functions that need temperature for their calculations perform the temperature calculation internally; this is just for show. 
-# -# This calculation differs from the above because the temperature functions return the module temperature in a timeseries format. So we care about 2 dimensions, location identifier and TIME. - -# %% -# define output shape -shapes = { - "module_temperature": ( - "gid", - "time", - ) # timeseries return value at each datapoint, dependent on both datapoint and time -} - -# create xarray template for output to be populated when analysis is run -temperature_template = pvdeg.geospatial.output_template( - ds_gids=geo_weather, - shapes=shapes, -) - -# %% -geo_temperature_res = pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.module, - template=temperature_template, # use the template we created - conf="open_rack_glass_glass", # provide kwargs for function here -) - -# %% -# plot the temperature at ONE of the geospatial analysis result locations (the analysis calculated all of them) -import matplotlib.pyplot as plt - -module_temps = ( - geo_temperature_res["module"].sel(latitude=39.89, longitude=-106.42).values -) - -plt.plot(module_temps) - -# %% [markdown] -# # Self Explaining Code -# -# If we are looking at adding templates for other functions, we can also look at the 3 presaved templates for existing `pvdeg` functions. Visit [pvdeg.geospatial.template_parameters](../../pvdeg/geospatial.py) and inspect this function to see how these different target functions utilize templates and shapes. - -# %% [markdown] -# # Creating Templates Programmatically -# -# We can use `pvdeg.geospatial.auto_template` to generate a template for a given function. This can return a bad result which will fail or work improperly when running `pvdeg.geospatial.analysis` with the generated template. Results should be scrutinized to make sure they are the right format. - -# %% [markdown] -# # Examples Below -# Steps -# - Create template using autotemplating function. Pulls in information about the function to determine the shape of the output. 
Not usable on functions with ambiguous return types. -# - Call geospatial analysis function using template - -# %% [markdown] -# # Geospatial Cell Temperature Calculation -# As shown below, we have two options: we can provide a template that is generated by a function which supports autotemplating, or we can provide the function to `geospatial.analysis` and let it generate a template internally. -# -# ## Providing a Template with `Geospatial.auto_template` - -# %% -# create a template using auto_template for the desired function -cell_temp_template = pvdeg.geospatial.auto_template( - func=pvdeg.temperature.cell, ds_gids=geo_weather -) - -# run the geospatial analysis with the template -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.cell, - template=cell_temp_template, -) - -# %% [markdown] -# # Analysis Without Providing a Template -# -# If a function is supported by `geospatial.auto_template`, we do not need to create a template outside of the function as shown in the cell above. We can simply pass the function to `geospatial.analysis` and it will create a template for us. 
- -# %% -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.cell, -) - -# %% [markdown] -# # Geospatial Module Temperature Calculation - -# %% -module_temp_template = pvdeg.geospatial.auto_template( - func=pvdeg.temperature.module, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.module, - template=module_temp_template, -) - -# %% [markdown] -# # Geospatial Solar Position Calculation - -# %% -solar_position_template = pvdeg.geospatial.auto_template( - func=pvdeg.spectral.solar_position, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.spectral.solar_position, - template=solar_position_template, -) - -# %% [markdown] -# # Geospatial POA Irradiance Calculation - -# %% -poa_irradiance_template = pvdeg.geospatial.auto_template( - func=pvdeg.spectral.poa_irradiance, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.spectral.poa_irradiance, - template=poa_irradiance_template, -) - -# %% [markdown] -# # Geospatial 98th Percentile Operating Temperature Calculation - -# %% -standoff_template = pvdeg.geospatial.auto_template( - func=pvdeg.standards.T98_estimate, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.standards.T98_estimate, - template=standoff_template, -) - -# %% [markdown] -# # Geospatial Module Humidity Calculation - -# %% -humidity_template = pvdeg.geospatial.auto_template( - func=pvdeg.humidity.module, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.humidity.module, - template=humidity_template, - backsheet_thickness=0.3, - back_encap_thickness=0.5, -) - -# %% [markdown] -# # Geospatial IwaVantHoff Environment Characterization Calculation - -# %% -iwa_vant_hoff_template = 
pvdeg.geospatial.auto_template( - func=pvdeg.degradation.IwaVantHoff, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.degradation.IwaVantHoff, - template=iwa_vant_hoff_template, -) - -# %% [markdown] -# # Geospatial Edge Seal Width Calculation - -# %% -edge_seal_template = pvdeg.geospatial.auto_template( - func=pvdeg.design.edge_seal_width, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.design.edge_seal_width, - template=edge_seal_template, -) - -# %% -edge_seal_template diff --git a/tutorials/04_geospatial/scripts/08_module_standoff_iec63126.py b/tutorials/04_geospatial/scripts/08_module_standoff_iec63126.py deleted file mode 100644 index f87f716b..00000000 --- a/tutorials/04_geospatial/scripts/08_module_standoff_iec63126.py +++ /dev/null @@ -1,499 +0,0 @@ -# %% [markdown] -# # Tools - Module Standoff for IEC TS 63126 -# -# ## Calculation of module standoff distance according to IEC TS 63126 -# -# **Requirements:** -# - Local weather data file or site longitude and latitude -# -# **Objectives:** -# 1. Import weather data. -# 2. Calculate installation standoff - Level 1 and Level 2. -# 3. Calculate $X_{eff}$ from provided module temperature data. -# 4. Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$. -# 5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$. -# 6. Plot $X_{min}$ for Level 1, Level 2, or a $T_{98}$ for a given region. -# -# **Background:** -# -# This notebook calculates the minimum effective standoff distance ($X_{eff}$) necessary for roof-mounted PV modules to ensure that the $98^{th}$ percentile operating temperature, $T_{98}$, remains under 70°C for compliance with IEC 61730 and IEC 61215. For higher $T_{98}$ values, above 70°C or 80°C, testing must be done to the specifications for Level 1 or Level 2 of IEC TS 63126, respectively. 
This method is outlined in the appendix of IEC TS 63126 and is based on the model from *[King 2004] and data from **[Fuentes, 1987] to model the approximate exponential decay in temperature, $T(X)$, with increasing standoff distance, $X$, as, -# -# $$ X = -X_0 \ln\left(1-\frac{T_0-T}{\Delta T}\right), Equation 1 $$ -# -# where $T_0$ is the temperature for $X=0$ (insulated-back) and $\Delta T$ is the temperature difference between an insulated-back ($X=0$) and open-rack mounting configuration ($X=\infty)$. -# -# We used pvlib and data from the National Solar Radiation Database (NSRDB) to calculate the module temperatures for the insulated-back and open-rack mounting configurations and apply our model to obtain the minimum standoff distance for roof-mounted PV systems to achieve a temperature lower than a specified $T_{98}$. The following figure showcases this calculation for the entire world for an $X_{eff}$ that results in $T_{98}$=70°C. Values of $X_{eff}$ higher than this will require Level 1 or Level 2 certification. -# -# $*$ D. L. King, W. E. Boyson, and J. A. Kratochvil, "Photovoltaic array performance model," SAND2004-3535, Sandia National Laboratories, Albuquerque, NM, 2004.\ -# $**$ M. K. Fuentes, "A simplified thermal model for Flat-Plate photovoltaic arrays," United States, 1987. 
https://www.osti.gov/biblio/6802914 -# - -# %% [markdown] -# ![alt text](images/T98_70C_standoff_Map.png) - -# %% -# if running on google colab, uncomment the next line and execute this cell to install the dependencies -# and prevent "ModuleNotFoundError" in later cells: -# #!pip install pvdeg - -# %% -import os -import pvdeg -import pandas as pd -from pvdeg import DATA_DIR -import dask -import matplotlib.pyplot as plt -import seaborn as sns -import math -import numpy as np - -# %% -# This information helps with debugging and getting support :) -import sys -import platform - -print("Working on a ", platform.system(), platform.release()) -print("Python version ", sys.version) -print("Pandas version ", pd.__version__) -print("pvdeg version ", pvdeg.__version__) -print("dask version", dask.__version__) -print(DATA_DIR) - -# %% [markdown] -# # 1. Import Weather Data -# -# The function has these minimum requirements when using a weather data file: -# - Weather data containing (at least) DNI, DHI, GHI, Temperature, RH, and Wind-Speed data at module level. -# - Site meta-data containing (at least) latitude, longitude, and time zone -# -# Alternatively, one can get meteorological data from the NSRDB or PVGIS with just the longitude and latitude. The NSRDB (via 'PSM3') works primarily for most of North America and South America; PVGIS (via SARAH 'PVGIS') works for most of the rest of the world. See the "Weather Database Access.ipynb" tutorial in PVDeg or Jensen et al. https://doi.org/10.1016/j.solener.2023.112092 for satellite coverage information. 
-# - -# %% -# Get data from a supplied data file (Do not use the next box of code if using your own file) -weather_file = os.path.join(DATA_DIR, "psm3_demo.csv") -WEATHER_df, META = pvdeg.weather.read(weather_file, "csv") -print(META) - -# %% -# This routine will get a meteorological dataset from anywhere in the world where it is available -# weather_id = (24.7136, 46.6753) #Riyadh, Saudi Arabia -# weather_id = (35.6754, 139.65) #Tokyo, Japan -# weather_id = (-43.52646, 172.62165) #Christchurch, New Zealand -# weather_id = (64.84031, -147.73836) #Fairbanks, Alaska -# weather_id = (65.14037, -21.91633) #Reykjavik, Iceland -# weather_id = (33.4152, -111.8315) #Mesa, Arizona -# WEATHER_df, META = pvdeg.weather.get_anywhere(id=weather_id) -# print(META) - -# %% [markdown] -# # 2. Calculate Installation Standoff Minimum - Level 1 and Level 2 -# -# According to IEC TS 63126, Level 0, Level 1 and Level 2 certification is limited to T₉₈<70°C, <80°C and <90°C, respectively. Level 0 certification is essentially compliance with IEC 61730 and IEC 61215. The default value of T₉₈<70°C represents the minimum gap to avoid higher temperature certification according to IEC TS 63126. This minimum standoff ($x_{min}$) is the distance between the bottom of the module frame and the roof and can be estimated for a given environment as, -# -# $$ X_{min} = -X_0 \ln\left(1-\frac{T_{98,0}-T}{ T_{98,0}- T_{98,inf}}\right), Equation 2 $$ -# -# where $T_{98,0}$ is the $98^{th}$ percentile temperature for an insulated back module and $T_{98,inf}$ is the $98^{th}$ percentile temperature for an open rack mounted module. -# -# Once the meteorological data has been obtained, the input parameter possibilities are: -# -# - T₉₈ : Does not necessarily need to be set at 70°C or 80°C for IEC TS 63126; you might want to use a different number to compensate for a thermal aspect of the particular system you are considering. The default is 70°C. -# - tilt : tilt from horizontal of PV module. The default is 0°. 
-# - azimuth : azimuth in degrees from North. The default is 180° for south facing. -# - sky_model : pvlib compatible model for generating sky characteristics (Options: 'isotropic', 'klucher', 'haydavies', 'reindl', 'king', 'perez'). The default is 'isotropic'. -# - temp_model : pvlib compatible module temperature model. (Options: 'sapm', 'pvsyst', 'faiman', 'sandia'). The default is 'sapm'. -# - conf_0 : Temperature model for hottest mounting configuration. Default is "insulated_back_glass_polymer". -# - conf_inf : Temperature model for open rack mounting. Default is "open_rack_glass_polymer". -# - x_0 : thermal decay constant [cm] (see documentation). The default is 6.5 cm. -# - wind_factor : Wind speed power law correction factor to account for different wind speed measurement heights between the weather database (e.g. NSRDB) and the temperature model (e.g. SAPM). The default is 0.33. - -# %% [markdown] -# The following is the minimum function call. It defaults to horizontal tilt and T₉₈=70°C. - -# %% -standoff = pvdeg.standards.standoff(weather_df=WEATHER_df, meta=META) -print(pvdeg.standards.interpret_standoff(standoff)) - -# %% [markdown] -# The following is a full function call for both T₉₈=70°C and 80°C separately, even though the second standoff distance can be calculated using only T98_0 and T98_inf. With this function, one may also want to change the tilt, azimuth, or T98. 
- -# %% -standoff_1 = pvdeg.standards.standoff( - weather_df=WEATHER_df, - meta=META, - T98=70, - tilt=META["latitude"], - azimuth=None, - sky_model="isotropic", - temp_model="sapm", - conf_0="insulated_back_glass_polymer", - conf_inf="open_rack_glass_polymer", - x_0=6.5, - wind_factor=0.33, -) -print("First calculation standoff = ", "%.1f" % standoff_1["x"].iloc[0], " cm.") -standoff_2 = pvdeg.standards.standoff( - weather_df=WEATHER_df, - meta=META, - T98=80, - tilt=META["latitude"], - azimuth=None, - sky_model="isotropic", - temp_model="sapm", - conf_0="insulated_back_glass_polymer", - conf_inf="open_rack_glass_polymer", - x_0=6.5, - wind_factor=0.33, -) -print("Second calculation standoff = ", "%.1f" % standoff_2["x"].iloc[0], " cm.") -print(pvdeg.standards.interpret_standoff(standoff_1=standoff_1, standoff_2=standoff_2)) - -# %% [markdown] -# # 3. Calculate $X_{eff}$ from provided module temperature data. -# -# To do this calculation, one must use a set of data with: -# - meteorological irradiance data sufficient to calculate the POA irradiance (DHI, GHI, and DNI), -# - ambient temperature data, -# - wind speed at module height, (wind_factor=0.33 will be used unless otherwise specified) -# - temperature measurements of the module in the test system. Ideally this would be measured under a worst case scenario that maximizes the module temperature for a given site, -# - geographic meta data including longitude and latitude, -# -# To create a weather file of your own, copy the format of the example file 'xeff_demo.csv'. This is formatted with the first row containing meta data variable names, the second row containing the corresponding values, the third row containing meteorological data headers, and all the remaining rows containing the meteorological data. -# -# To do this calculation, one should also filter the data to remove times when the sun is not shining or when snow is likely to be on the module. 
The recommended and programmed defaults are poa_min=100 W/m² and a minimum ambient temperature t_amb_min=0°C; only data above both thresholds are used.
-
-# %%
-# Read the weather file
-weather_file = os.path.join(DATA_DIR, "xeff_demo.csv")
-xeff_weather, xeff_meta = pvdeg.weather.read(weather_file, "csv")
-# Pull measured temperature and calculate the theoretical insulated-back and open-rack module temperatures
-T_0, T_inf, xeff_poa = pvdeg.standards.eff_gap_parameters(
-    weather_df=xeff_weather,
-    meta=xeff_meta,
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    wind_factor=0.33,
-)
-# Now calculate X_eff.
-x_eff = pvdeg.standards.eff_gap(
-    T_0,
-    T_inf,
-    xeff_weather["module_temperature"],
-    xeff_weather["temp_air"],
-    xeff_poa["poa_global"],
-    x_0=6.5,
-    poa_min=100,
-    t_amb_min=0,
-)
-print("The effective standoff for this system is", "%.1f" % x_eff, "cm.")
-
-# %% [markdown]
-# # 4. Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$.
-#
-# Equation 2 can be reorganized as,
-#
-# $$ T_{98} = T_{98,0} - (T_{98,0} - T_{98,inf}) \left(1-e^{-\frac{x_{eff}}{x_{0}}}\right), Equation 3 $$
-#
-# and used to calculate the $98^{th}$ percentile temperature, $T_{98}$, for a PV system having a given effective standoff height, $X_{eff}$, for an arbitrarily oriented module. Here, $T_{98,0}$ is the $98^{th}$ percentile temperature for an insulated-back module and $T_{98,inf}$ is the $98^{th}$ percentile temperature for a rack-mounted module. The input parameter possibilities are the same as shown in Objective #2 above, but the example below uses the default parameters. The actual tilt [degrees], azimuth [degrees], and $X_{eff}$ [cm] can be modified as desired.
-
-# %%
-# This is the minimal function call using the common default settings to estimate T₉₈.
-T_98 = pvdeg.standards.T98_estimate(
-    weather_df=WEATHER_df, meta=META, tilt=META["latitude"], azimuth=None, x_eff=10
-)
-print("The 98ᵗʰ percentile temperature is estimated to be", "%.1f" % T_98, "°C.")
-
-# %%
-# This code will calculate the temperature for an arbitrary x_eff distance. Either set of kwargs can be modified and used.
-# irradiance_kwarg = {
-#     "axis_tilt": None,
-#     "axis_azimuth": None,
-#     "x_eff": 10,
-#     "module_mount": '1_axis'}
-irradiance_kwarg = {
-    "tilt": META["latitude"],
-    "azimuth": None,
-    "x_eff": 10,
-    "module_mount": "fixed",
-}
-
-T_xeff = pvdeg.standards.x_eff_temperature_estimate(
-    weather_df=WEATHER_df, meta=META, **irradiance_kwarg
-)
-
-print(
-    "The 98ᵗʰ percentile temperature is estimated to be",
-    "%.1f" % np.percentile(T_xeff, 98),
-    "°C.",
-)
-
-# %% [markdown]
-# # 5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$.
-#
-# The temperature of a system is affected by its orientation. This section scans all possible tilts and azimuths, calculating the minimum standoff distance for a given $T_{98}$. As above, additional factors can also be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step over the 90° tilt and 360° azimuth spans, respectively. The default for this calculation is $T_{98}$=70°C, the boundary between Level 0 and Level 1 requirements. The temperature model information given below is unnecessary, as these are default values that would get populated automatically; however, they are included here for clarity into a standard practice as per IEC TS 63126.
-#
-# $$ X_{min} = -X_0 \ln\left(1-\frac{T_{98,0}-T}{T_{98,0}-T_{98,inf}}\right), Equation 2 $$
-
-# %%
-# Scan through all azimuths and tilts, running the minimum standoff calculation
-# Set up keyword parameters for the calculation
-
-kwarg_x = dict(
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    T98=70,
-    x_0=6.5,
-    wind_factor=0.33,
-)
-# Run the calculation
-x_azimuth_step = 10
-x_tilt_step = 10
-standoff_series = pvdeg.utilities.tilt_azimuth_scan(
-    weather_df=WEATHER_df,
-    meta=META,
-    tilt_step=x_tilt_step,
-    azimuth_step=x_azimuth_step,
-    func=pvdeg.standards.standoff_x,
-    **kwarg_x,
-)
-
-# %% [markdown]
-# The next cell creates a plot of the calculated data. Some of the things you may want to change are:
-# - cmap="Spectral_r" : Change to have different colors.
-# - plt.title : This will change the plot title.
-# - figsize=(16,4) : Change the plot dimensions and/or aspect ratio.
-# - vmax=None : This can be set to a numeric value to control the color scale maximum.
-# - vmin=0 : This controls the minimum of the color scale.
-# - h_ticks=37 : This changes the number of horizontal tick marks
-# - v_ticks=10 : This changes the number of vertical tick marks
-# - Unblock the last two lines to output the plot as a *.png image file
-
-# %%
-standoff_series_df = pd.DataFrame(
-    {
-        "Tilt": standoff_series[:, 0],
-        "Azimuth": standoff_series[:, 1],
-        "Xₘᵢₙ": standoff_series[:, 2],
-    }
-)
-x_fig = plt.figure(figsize=(16, 4))
-plt.title(
-    r"Plot of $\it{Xₘᵢₙ}$ for all orientations for $\it{T₉₈}$="
-    + "%.0f" % kwarg_x["T98"]
-    + "°C.",
-    fontsize=15,
-    y=1.08,
-)
-x_fig = sns.heatmap(
-    standoff_series_df.pivot(index="Tilt", columns="Azimuth", values="Xₘᵢₙ"),
-    cbar_kws={"label": "Xₘᵢₙ", "format": "%.0f", "pad": 0.02},
-    cmap="Spectral_r",
-    vmin=0,
-    vmax=None,
-)
-
-h_ticks = 37
-x_number = math.ceil(360 / x_azimuth_step) + 1
-x_ticks = [
-    (x * (360 / (h_ticks - 1)) / x_azimuth_step + 0.5) for x in range(h_ticks - 1)
-]
-x_labels = [("%.0f" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]
-x_ticks.append(x_number - 0.5)
-x_fig.set_xticks(x_ticks)
-x_fig.set_xticklabels(x_labels, rotation=90)
-
-v_ticks = 10
-y_number = math.ceil(90 / x_tilt_step) + 1
-y_ticks = [(x * (90 / (v_ticks - 1)) / x_tilt_step + 0.5) for x in range(v_ticks - 1)]
-y_labels = [("%.0f" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]
-y_ticks.append(y_number - 0.5)
-x_fig.set_yticks(y_ticks)
-x_fig.set_yticklabels(y_labels, rotation=0)
-
-x_fig.set_xlabel("Azimuth [°]", fontsize=15, labelpad=10)
-x_fig.set_ylabel("Tilt [°]", fontsize=15)
-x_fig.figure.axes[-1].set_ylabel(r"$\it{Xₘᵢₙ}$ [cm]", size=15)
-x_fig.invert_yaxis()
-
-output_folder = os.path.join(
-    os.path.dirname(os.path.dirname(os.getcwd())), "TEMP", "results"
-)
-try:
-    os.makedirs(output_folder)
-except OSError as error:
-    print(error)
-
-plt.savefig(
-    os.path.join(output_folder, "Standoff_Scan.png"), bbox_inches="tight"
-)  # Creates an image file of the standoff plot
-plt.show()
-
-# %% [markdown]
-# # 6.
Plot $T_{98}$ for all azimuth and tilt for a given $X_{eff}$.
-#
-# The temperature of a system is affected by the orientation and the effective standoff, $X_{eff}$, of the system. This section scans all possible tilts and azimuths, calculating $T_{98}$ for a given $X_{eff}$. As above, additional factors can be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step over the 90° tilt and 360° azimuth spans, respectively. The example below uses $X_{eff}$=5 cm; $X_{eff}$=10 cm is a common effective standoff distance on a rooftop system. A value of $X_{eff}$=None will run the calculations for an open-rack system, and $X_{eff}$=0 for an insulated-back system.
-
-# %%
-# Scan through all azimuths and tilts, running the 98ᵗʰ percentile temperature calculation.
-# Set up keyword parameters for the calculation
-kwarg_T = dict(
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    x_eff=5,
-    x_0=6.5,
-    wind_factor=0.33,
-)
-# Run the calculation
-T_azimuth_step = 10
-T_tilt_step = 10
-T98_series = pvdeg.utilities.tilt_azimuth_scan(
-    weather_df=WEATHER_df,
-    meta=META,
-    tilt_step=T_tilt_step,
-    azimuth_step=T_azimuth_step,
-    func=pvdeg.standards.T98_estimate,
-    **kwarg_T,
-)
-
-# %% [markdown]
-# The next cell creates a plot of the calculated data. Some of the things you may want to change are:
-# - cmap="Spectral_r" : Change to have different colors.
-# - plt.title : This will change the plot title.
-# - figsize=(16,4) : Change the plot dimensions and/or aspect ratio.
-# - vmax=None : This can be set to a numeric value to control the color scale maximum.
-# - vmin=None : This controls the minimum of the color scale.
-# - h_ticks=37 : This changes the number of horizontal tick marks
-# - v_ticks=10 : This changes the number of vertical tick marks
-# - Unblock the last two lines to output the plot as a *.png image file
-
-# %%
-# This produces the plot of the data
-T98_series_df = pd.DataFrame(
-    {"Tilt": T98_series[:, 0], "Azimuth": T98_series[:, 1], "T₉₈": T98_series[:, 2]}
-)
-T98_fig = plt.figure(figsize=(16, 4))
-if kwarg_T["x_eff"] is None:
-    plt.title(
-        r"Plot of $\it{T₉₈}$ for all orientations for an open-rack mounting.",
-        fontsize=15,
-        y=1.08,
-    )
-else:
-    plt.title(
-        r"Plot of $\it{T₉₈}$ for all orientations for $X_{eff}$="
-        + "%.0f" % kwarg_T["x_eff"]
-        + " cm.",
-        fontsize=15,
-        y=1.08,
-    )
-T98_fig = sns.heatmap(
-    T98_series_df.pivot(index="Tilt", columns="Azimuth", values="T₉₈"),
-    cbar_kws={"label": "T₉₈", "format": "%.0f", "pad": 0.02},
-    cmap="Spectral_r",
-    vmin=None,
-    vmax=None,
-)
-
-h_ticks = 37
-x_number = math.ceil(360 / T_azimuth_step) + 1
-x_ticks = [
-    (x * (360 / (h_ticks - 1)) / T_azimuth_step + 0.5) for x in range(h_ticks - 1)
-]
-x_labels = [("%.0f" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]
-x_ticks.append(x_number - 0.5)
-T98_fig.set_xticks(x_ticks)
-T98_fig.set_xticklabels(x_labels, rotation=90)
-
-v_ticks = 10
-y_number = math.ceil(90 / T_tilt_step) + 1
-y_ticks = [(x * (90 / (v_ticks - 1)) / T_tilt_step + 0.5) for x in range(v_ticks - 1)]
-y_labels = [("%.0f" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]
-y_ticks.append(y_number - 0.5)
-T98_fig.set_yticks(y_ticks)
-T98_fig.set_yticklabels(y_labels, rotation=0)
-
-T98_fig.set_xlabel("Azimuth [°]", fontsize=15, labelpad=10)
-T98_fig.set_ylabel("Tilt [°]", fontsize=15)
-T98_fig.figure.axes[-1].set_ylabel(r"$\it{T₉₈}$ [°C]", size=15)
-T98_fig.invert_yaxis()
-
-plt.savefig(
-    os.path.join(output_folder, "T98_Scan.png"), bbox_inches="tight"
-)  # Creates an image file of the T₉₈ plot
-plt.show()
-
-# %% [markdown]
-# # 7.
Plot $X_{min}$ for a $T_{98}$, and plot $T_{98}$ for a given region.
-#
-# This last Objective is much more complicated and is set up to utilize access to substantial computational power, running many sites simultaneously to create a regional map of standoff distance.
-# For more in-depth instructions on doing this, look at the tutorial "04_scenario_geospatial.ipynb" here in PVDeg.
-#
-# Step #1: Create an object, "geospatial_standoff_scenario", to be used to run the computations.
-
-# %%
-geospatial_standoff_scenario = pvdeg.GeospatialScenario(
-    name="standoff geospatial",
-    geospatial=True,
-)
-
-# %% [markdown]
-# Step #2: Identify a subset of locations from the database to run the computations.
-# Specifically, all are from the NSRDB.
-
-# %%
-geospatial_standoff_scenario.addLocation(
-    state="Colorado", county="Summit"
-)  # Identifies a subset of locations from the database to run the computations. Specifically all are from the NSRDB.
-
-# %% [markdown]
-# Step #3: Indicate which function will be run. Here the default is the standoff calculation, but it could be any other function with a keyword argument dictionary.
-# Here the 98th percentile temperature is defined as 70°C, but any arbitrary value can be specified.
-
-# %%
-geospatial_standoff_scenario.addJob(
-    func=pvdeg.standards.standoff, func_params={"T98": 70}
-)
-
-# %% [markdown]
-# Step #4: Run the scenario.
-
-# %%
-geospatial_standoff_scenario.run()
-
-# %% [markdown]
-# Step #5: Create a plot of the standoff calculation.
-
-# %%
-geospatial_standoff_scenario.plot_world("x")
-
-# %%
-geospatial_standoff_scenario.plot_world("T98_inf")
-
-# %% [markdown]
-# # 8. Save data outputs.
-#
-# This cell contains a number of pre-scripted commands for exporting and saving data. The code to save plots is located after the plot creation and is blocked by default. First check that the output folder exists, then unblock the code for the data you would like to save.
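Checking that the output folder exists, as the text advises, can be done in one idempotent standard-library call. The path below is a placeholder for illustration; substitute the `output_folder` defined earlier in the notebook:

```python
import os
import tempfile

# Placeholder path -- in the notebook this would be the previously defined output_folder.
output_folder = os.path.join(tempfile.gettempdir(), "pvdeg_results")

# exist_ok=True makes the call safe to repeat: no error if the folder already exists.
os.makedirs(output_folder, exist_ok=True)
os.makedirs(output_folder, exist_ok=True)  # second call is a no-op

print("Your results will be stored in", output_folder)
```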
- -# %% -print("Your results will be stored in %s" % output_folder) -print("The folder must already exist or the file will not be created") - -pvdeg.weather.write( - data_df=WEATHER_df, - metadata=META, - savefile=os.path.join(output_folder, "WeatherFile.csv"), -) # Writes the meterological data to an *.csv file. - -pd.DataFrame(standoff_series_df).to_csv( - os.path.join(output_folder, "Standoff_Scan.csv") -) # Writes a file with the Tilt and Azimuth scan calculations of standoff. - -pd.DataFrame(T98_series_df).to_csv( - os.path.join(output_folder, "T98_Scan.csv") -) # Writes a file with the Tilt and Azimuth scan calculations of T98. diff --git a/tutorials/04_scenario/02_scenario_single_location.ipynb b/tutorials/04_scenario/02_scenario_single_location.ipynb index c0cedf9d..e80953a8 100644 --- a/tutorials/04_scenario/02_scenario_single_location.ipynb +++ b/tutorials/04_scenario/02_scenario_single_location.ipynb @@ -6,7 +6,7 @@ "source": [ "# Single Location\n", "\n", - "Author: Tobin Ford | tobin.ford@nrel.gov\n", + "Author: Tobin Ford | tobin.ford@nlr.gov\n", "\n", "2024\n", "****\n", @@ -90,7 +90,7 @@ "A way around this is to provide the weather and metadata in the pipeline job arguments or you can load data from somewhere else and provide it in the same fashion.\n", "\n", "
\n", - "Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.)\n", + "Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/ so register now.)\n", "
" ] }, @@ -1286,7 +1286,7 @@ ], "metadata": { "kernelspec": { - "display_name": "Python 3", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -1300,7 +1300,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.12.9" + "version": "3.13.5" } }, "nbformat": 4, diff --git a/tutorials/04_scenario/scripts/01_local_scenario.py b/tutorials/04_scenario/scripts/01_local_scenario.py deleted file mode 100644 index 8b0ed0ba..00000000 --- a/tutorials/04_scenario/scripts/01_local_scenario.py +++ /dev/null @@ -1,18 +0,0 @@ -# %% [markdown] -# # Local Scenario -# -# Author: Tobin Ford -# Email : tobin.ford@nrel.gov - -# %% [markdown] -# # Use Case -# - If you are unable to access the NSRDB via NREL HPC or AWS - -# %% [markdown] -# # Sourcing Data ## -# -# On the [NSRDB Dataviewer](https://nsrdb.nrel.gov/data-viewer), you can use the area querys to collect information from the area you want. Only tested with USA & Americas - Typical Meteorological Year so far. Use the dropdowns to select the attributes you will need to run calculations. Select a year and follow the instructions to get the zip file containing the weather data csv's for each location in the selected region. -# -# - -# %% diff --git a/tutorials/04_scenario/scripts/02_geospatial_templates.py b/tutorials/04_scenario/scripts/02_geospatial_templates.py deleted file mode 100644 index 56a33c6f..00000000 --- a/tutorials/04_scenario/scripts/02_geospatial_templates.py +++ /dev/null @@ -1,282 +0,0 @@ -# %% [markdown] -# # Geospatial Templates (HPC) -# - -# %% -import pvdeg -from pvdeg import TEST_DATA_DIR -import pandas as pd -import os -import xarray as xr - - -# %% [markdown] -# # Geospatial Templates -# -# When running a geospatial analysis using `pvdeg.geospatial.analysis` on arbitary `pvdeg` functions you will need to specify a template for the shape of the output data. 
This is because the input data comes with dimensions of gid and time, while the output will have data in a different shape, usually corresponding to coordinates.
-# - gid, identification number corresponding to an NSRDB datapoint's location
-# - time, timeseries corresponding to the hourly time indices of an NSRDB datapoint's yearly meteorological data.
-#
-# Follow the steps below to see how we generate templates before running the analysis.
-#
-# The only functions where this is not required are currently `pvdeg.standards.standoff`, `pvdeg.humidity.module`, and `letid.calc_letid_outdoors`, as they are predefined within the package.
-
-# %% [markdown]
-# # Loading Geospatial Data
-#
-# This step skips over making the `pvdeg.weather.get` call with `geospatial == True`. See the [Duramat Demo](./DuraMAT%20Live%20Demo.ipynb) for information on how to do this traditionally.
-#
-# We can also use a `GeospatialScenario` object. See the [Geospatial Scenario Tutorial](./Scenario%20-%20Geospatial.ipynb) for more information on how to use this approach.
-#
-# *The cell below opens a `.nc` NetCDF file with xarray, the preferred storage format for xarray datasets. (Earlier versions of this tutorial loaded a pickled xarray object, which is not recommended.)*
-
-# %%
-geo_meta = pd.read_csv(os.path.join(TEST_DATA_DIR, "summit-meta.csv"), index_col=0)
-
-# Use xarray to open NetCDF file instead of pickle
-geo_weather = xr.open_dataset(os.path.join(TEST_DATA_DIR, "summit-weather.nc"))
-
-geo_weather
-
-
-# %% [markdown]
-# # Creating Templates Manually
-#
-# With `pvdeg.geospatial.output_template` we can produce a template for our result data.
-#
-# We need to do this because different functions return different types of values: some return multiple values as tuples, some return only single numerics, and others return timeseries results. We need to specify the shape of our data to create an output xarray dataset.
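The role of the `shapes` mapping can be illustrated without xarray. The sketch below is a hypothetical stand-in for `pvdeg.geospatial.output_template`, using plain dictionaries and flat lists where the real function builds an `xarray.Dataset` with proper dims and coords:

```python
# Hypothetical stand-in for output_template: expand a {variable: dims} mapping
# into zero-filled buffers, one element per combination of dimension sizes.
def make_template(shapes, sizes):
    template = {}
    for var, dims in shapes.items():
        n = 1
        for d in dims:
            n *= sizes[d]
        template[var] = [0.0] * n  # flat buffer; xarray would carry dims/coords
    return template

sizes = {"gid": 4, "time": 8760}            # 4 locations, hourly for one year
shapes = {
    "T98": ("gid",),                        # scalar per location -> depends on gid only
    "module_temperature": ("gid", "time"),  # timeseries per location
}
tmpl = make_template(shapes, sizes)
```

The template is then populated as the analysis runs, one `gid` at a time.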
-
-# %% [markdown]
-# # Examples
-#
-# ## 98ᵗʰ Percentile Module Temperature at Standoff Height
-#
-# Say we want to estimate the 98ᵗʰ percentile temperature for the module at the given tilt, azimuth, and x_eff. `PVDeg` has a function to do this, `pvdeg.standards.T98_estimate`, but it doesn't have a preset geospatial template. We will need to make one.
-#
-# - Look at the function's return values.
-# From the docstring we can see that `T98_estimate` only has one return value. IMPORTANT: this value is a single float, NOT a timeseries. This means our output shape will only be dependent on the input identifier and NOT time.
-#
-# Therefore we will map the output variable `T98` to the location identifier `gid` using a dictionary with `str: tuple` mappings.
-#
-# *IMPORTANT: you must use the syntax below where the variable maps to a tuple of the coordinates. Note the trailing comma: without it, `("gid")` is just the string `"gid"`, and Python will iterate over its characters instead of treating it as a one-element tuple. See further examples to alleviate confusion.*
-
-# %%
-# define output shape
-shapes = {
-    "T98": (
-        "gid",
-    )  # one return value at each datapoint, only dependent on datapoint, not time
-}
-
-# create xarray template for output to be populated when analysis is run
-template = pvdeg.geospatial.output_template(
-    ds_gids=geo_weather,
-    shapes=shapes,
-)
-
-# %%
-geo_estimate_temp = pvdeg.geospatial.analysis(
-    weather_ds=geo_weather,
-    meta_df=geo_meta,
-    func=pvdeg.standards.T98_estimate,
-    template=template,
-)
-
-# %% [markdown]
-# # Glass-Glass Estimated Module Temperature
-#
-# Now we want to calculate geospatial timeseries temperature values for a module using `pvdeg.temperature.module`. This is not very practical, because all `pvdeg` functions that need temperature for their calculations perform the temperature calculation internally; this is just for show.
-#
-# This calculation differs from the above because the temperature functions return the module temperature in a timeseries format. So we care about two dimensions: location identifier and TIME.
-
-# %%
-# define output shape
-shapes = {
-    "module_temperature": (
-        "gid",
-        "time",
-    )  # a timeseries at each datapoint, dependent on both datapoint and time
-}
-
-# create xarray template for output to be populated when analysis is run
-temperature_template = pvdeg.geospatial.output_template(
-    ds_gids=geo_weather,
-    shapes=shapes,
-)
-
-# %%
-geo_temperature_res = pvdeg.geospatial.analysis(
-    weather_ds=geo_weather,
-    meta_df=geo_meta,
-    func=pvdeg.temperature.module,
-    template=temperature_template,  # use the template we created
-    conf="open_rack_glass_glass",  # provide kwargs for function here
-)
-
-# %%
-# Plot the temperature at ONE of the geospatial analysis result locations (we have calculated all of them).
-import matplotlib.pyplot as plt
-
-module_temps = (
-    geo_temperature_res["module"].sel(latitude=39.89, longitude=-106.42).values
-)
-
-plt.plot(module_temps)
-
-# %% [markdown]
-# # Self-Explaining Code
-#
-# If we are looking at adding templates for other functions, we can also look at the 3 presaved templates for existing `pvdeg` functions. Visit [pvdeg.geospatial.template_parameters](../../pvdeg/geospatial.py) and inspect this function to see how these different target functions utilize templates and shapes.
-
-# %% [markdown]
-# # Creating Templates Programmatically
-#
-# We can use `pvdeg.geospatial.auto_template` to generate a template for a given function. This can return a bad result which will fail or work improperly when running `pvdeg.geospatial.analysis` with the generated template. Results should be scrutinized to make sure they are the right format.
-
-# %% [markdown]
-# # Examples Below
-# Steps
-# - Create a template using the auto-templating function. It pulls in information about the function to determine the shape of the output.
Not usable on functions with ambiguous return types.
-# - Call the geospatial analysis function using the template.
-
-# %% [markdown]
-# # Geospatial Cell Temperature Calculation
-# As shown below, we have two options: we can provide a template generated by a function which supports auto-templating, or we can provide the function to `geospatial.analysis` and let it generate a template internally.
-#
-# ## Providing a Template with `geospatial.auto_template`
-
-# %%
-# create a template using auto_template for the desired function
-cell_temp_template = pvdeg.geospatial.auto_template(
-    func=pvdeg.temperature.cell, ds_gids=geo_weather
-)
-
-# run the geospatial analysis with the template
-pvdeg.geospatial.analysis(
-    weather_ds=geo_weather,
-    meta_df=geo_meta,
-    func=pvdeg.temperature.cell,
-    template=cell_temp_template,
-)
-
-# %% [markdown]
-# # Analysis Without Providing a Template
-#
-# If a function is supported by `geospatial.auto_template`, we do not need to create a template outside of the function as shown in the cell above. We can simply pass the function to `geospatial.analysis` and it will create a template for us.
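Why auto-templating needs an unambiguous return type can be sketched with a toy inference rule. This is an illustrative assumption only; `pvdeg`'s `auto_template` does not necessarily inspect type hints this way:

```python
import typing

# Toy shape inference from a function's return annotation (hypothetical;
# shown only to illustrate why ambiguous returns defeat auto-templating).
def infer_shape(func):
    hint = typing.get_type_hints(func).get("return")
    if hint is float:   # single numeric per location
        return ("gid",)
    if hint is list:    # stand-in for a timeseries per location
        return ("gid", "time")
    raise TypeError(f"ambiguous return type for {func.__name__}")

def t98_estimate() -> float:
    return 70.0

def module_temperature() -> list:
    return [25.0] * 8760
```

A function whose return shape cannot be determined this way would force the user back to writing the `shapes` dictionary by hand.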
- -# %% -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.cell, -) - -# %% [markdown] -# # Geospatial Module Temperature Calculation - -# %% -module_temp_template = pvdeg.geospatial.auto_template( - func=pvdeg.temperature.module, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.temperature.module, - template=module_temp_template, -) - -# %% [markdown] -# # Geospatial Solar Position Calculation - -# %% -solar_position_template = pvdeg.geospatial.auto_template( - func=pvdeg.spectral.solar_position, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.spectral.solar_position, - template=solar_position_template, -) - -# %% [markdown] -# # Geospatial POA Irradiance Calculation - -# %% -poa_irradiance_template = pvdeg.geospatial.auto_template( - func=pvdeg.spectral.poa_irradiance, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.spectral.poa_irradiance, - template=poa_irradiance_template, -) - -# %% [markdown] -# # Geospatial 98th Percentile Operating Temperature Calculation - -# %% -standoff_template = pvdeg.geospatial.auto_template( - func=pvdeg.standards.T98_estimate, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.standards.T98_estimate, - template=standoff_template, -) - -# %% [markdown] -# # Geospatial Module Humidity Calculation - -# %% -humidity_template = pvdeg.geospatial.auto_template( - func=pvdeg.humidity.module, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.humidity.module, - template=humidity_template, - backsheet_thickness=0.3, - back_encap_thickness=0.5, -) - -# %% [markdown] -# # Geospatial IwaVantHoff Environment Characterization Calculation - -# %% -iwa_vant_hoff_template = 
pvdeg.geospatial.auto_template( - func=pvdeg.degradation.IwaVantHoff, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.degradation.IwaVantHoff, - template=iwa_vant_hoff_template, -) - -# %% [markdown] -# # Geospatial Edge Seal Width Calculation - -# %% -edge_seal_template = pvdeg.geospatial.auto_template( - func=pvdeg.design.edge_seal_width, ds_gids=geo_weather -) - -pvdeg.geospatial.analysis( - weather_ds=geo_weather, - meta_df=geo_meta, - func=pvdeg.design.edge_seal_width, - template=edge_seal_template, -) - -# %% -edge_seal_template diff --git a/tutorials/04_scenario/scripts/02_scenario_single_location.py b/tutorials/04_scenario/scripts/02_scenario_single_location.py index 1200adb8..42f261e8 100644 --- a/tutorials/04_scenario/scripts/02_scenario_single_location.py +++ b/tutorials/04_scenario/scripts/02_scenario_single_location.py @@ -1,22 +1,32 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # Single Location -# -# Author: Tobin Ford | tobin.ford@nrel.gov -# +# +# Author: Tobin Ford | tobin.ford@nlr.gov +# # 2024 # **** -# +# # A simple object orented workflow walkthrough using pvdeg. -# %% +# In[1]: + + # if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg +# !pip install pvdeg + + +# In[2]: + -# %% import pvdeg import os -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -25,69 +35,81 @@ print("Python version ", sys.version) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # Define Single Point Scenario Object # Scenario is a general class that can be used to replace the legacy functional pvdeg analysis approach with an object orented one. ``Scenario`` can preform single location or geospatial analysis. 
The scenario constructor takes many arguments but the only required one for the following use cases is the ``name`` attribute. It is visible in when we display the entire scenario and is present in the file of saved information about the scenario. We also need to provide the class constructor with our API key and email. -# +# # A way around this is to provide the weather and metadata in the pipeline job arguments or you can load data from somewhere else and provide it in the same fashion. -# +# #
-# Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.) +# Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nlr.gov/signup/ so register now.) #
-# %% +# In[4]: + + simple_scenario = pvdeg.Scenario( name="Point Minimum Standoff", email="user@mail.com", api_key="DEMO_KEY" ) -# %% [markdown] + # # Adding A Location # To add a single point using data from the Physical Solar Model (PSM3), simply feed the scenario a single coordinate in tuple form via the ``addLocation`` method. Currently this is the only way to add a location to a non-geospatial scenario, all of the other arguments are unusable when ``Scenario.geospatial == False``. -# +# # Attempting to add a second location by calling the method again with a different coordinate pair will overwrite the old location data stored in the class instance. -# %% +# In[5]: + + simple_scenario.addLocation( lat_long=(25.783388, -80.189029), ) -# %% + +# In[6]: + + simple_scenario.weather_data -# %% [markdown] + # # Scenario Pipelines -# +# # The pipeline is a list of tasks called jobs for the scenario to run. We will populate the pipeline with a list of jobs before executing them all at once. -# +# # To add a job to the pipeline use the ``updatePipeline`` method. Two examples of adding functions to the pipeline will be shown below. -# %% [markdown] # # Adding a job without function arguments -# +# # The simplest case of adding a job to the pipeline is when it only requires us to provide simple weather and metadata. In the function definition and docstring these appear as ``weather_df`` and ``meta``. Since these attributes are contained in our scenario class instance we do not have to worry about them. We can simply add the function as shown below. -# %% +# In[7]: + + simple_scenario.addJob(func=pvdeg.standards.standoff) -# %% [markdown] + # # Adding a job with function arguments -# +# # When adding a job that contains a function requiring other arguments such as ``solder_fatigue`` which requires a value for ``wind_factor``, we will need to provide it. The most straightforeward way to do this is using a kwargs dictionary and passing it to the function. 
We do not unpack the dictionary before passing it. This is done inside of the scenario at pipeline runtime (when ``runPipeline`` is called). -# %% +# In[8]: + + kwargs = {"wind_factor": 0.33} simple_scenario.addJob(func=pvdeg.fatigue.solder_fatigue, func_kwarg=kwargs) -# %% [markdown] + # # Adding a job with weather and metadata from outside of the class # ## Not functional -# +# # could just directly set weather data with scenario.weather_data = weather and scenario.meta_data = meta but that would only work for all of the jobs in the pipeline -# +# # Say local weather data is available or other, if we want to use this rather than the PSM3 data at a latitude and longitude we can also provide the weather and metadata in the function arguments. This is probably the best if avoided but follows the same syntax as providing other function arguments. See the example below. -# %% +# In[9]: + + PSM_FILE = os.path.join(pvdeg.DATA_DIR, "psm3_demo.csv") weather, meta = pvdeg.weather.read(PSM_FILE, "psm") @@ -98,40 +120,48 @@ # FIX THIS CASE IN SCENARIO CLASS # (simple_scenario.pipeline[1]['job'])(**simple_scenario.pipeline[1]['params']) -# %% [markdown] + # # View Scenario -# +# # The ``viewScenario`` method provides an overview of the information contained within your scenario object. Here you can see if it contains the location weather and metadata. As well as the jobs in the pipeline and their arguments. -# %% +# In[10]: + + simple_scenario.viewScenario() -# %% [markdown] + # # Display -# +# # The fancier cousin of viewScenario. Only works in a jupyter environemnt as it uses a special ipython backend to render the html and javascript. 
-# +# # It can be called with just the Scenario instance as follows # `simple_scenario` -# +# # or using the display function # `display(simple_scenario)` -# %% +# In[11]: + + simple_scenario -# %% [markdown] + # # Executing Pipeline Jobs # To run the pipeline after we have populated it with the desired jobs call the ``runPipeline`` method on our scenario instance. This will run all of the jobs we have previously added. The functions that need weather and metadata will grab it from the scenario instance using the correct location added above. The pipeline jobs results will be saved to the scenario instance. -# %% +# In[12]: + + simple_scenario.run() -# %% [markdown] + # # Results Series ## # We will use a series to store the various return values of functions run in our pipeline. These can partially obfuscate the dataframes within them so to access the dataframes, use the function name to access it. To get one of the results we can index it using dictionary syntax. If the job was called `'KSDJQ'` do `'simple_scenario.results['KSDJQ']` to directly access the result for that job -# %% +# In[13]: + + print(simple_scenario.results) print("We can't see out data in here so we need to do another step", end="\n\n") @@ -145,10 +175,13 @@ print(keys) display(results) -# %% [markdown] + # # Cleaning Up the Scenario -# +# # Each scenario object creates a directory named ``pvd_job_...`` that contains information about the scenario instance. To remove the directory and all of its information call ``clean`` on the scenario. This will permanently delete the directory created by the scenario. 
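The effect of ``clean`` can be sketched with the standard library. This is a generic sketch of "remove a working directory and everything in it", not the actual `Scenario` internals:

```python
import os
import shutil
import tempfile

# Create a throwaway working directory like the scenario's ``pvd_job_...`` one.
work_dir = tempfile.mkdtemp(prefix="pvd_job_")
with open(os.path.join(work_dir, "scenario.txt"), "w") as f:
    f.write("scenario details")

# clean(): recursively and permanently remove the directory and its contents.
shutil.rmtree(work_dir)
```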
-# %%
+# In[14]:
+
+
 simple_scenario.clean()
+
diff --git a/tutorials/04_scenario/scripts/03_scenario_geographical_features.py b/tutorials/04_scenario/scripts/03_scenario_geographical_features.py
deleted file mode 100644
index 17459eae..00000000
--- a/tutorials/04_scenario/scripts/03_scenario_geographical_features.py
+++ /dev/null
@@ -1,116 +0,0 @@
-# %% [markdown]
-# # Geographical Features (HPC)
-#
-
-# %%
-import pvdeg
-import numpy as np
-
-# %% [markdown]
-# # Create Scenario and Populate Location
-# The default geospatial datapoints for addLocation are from the Americas satellite. This is the only data we are interested in for this case, so we don't specify any location arguments.
-# To speed up calculations we will downsample to get coarser data. The downsampling function is not linear: a downsample factor of 10 takes us from 2018267 entries to only 5109.
-
-# %%
-features = pvdeg.GeospatialScenario(name="finding-features")
-
-features.addLocation(downsample_factor=10)
-
-# %% [markdown]
-# # Intro to KDTrees
-#
-# A k-dimensional tree is a data structure for organizing n points in a k-dimensional space. They are often used for nearest-neighbor searches and many ML algorithms. Letting k = 2, our dimensions will be latitude and longitude. This is much faster than iterating over tabular data structures to find neighbors. The function to create a kdtree below requires scikit-learn (also known as sklearn), a Python machine-learning library which is not included in the pvdeg dependency list, so you will have to install it independently. Scipy also has a kdtree class but it is much slower. Depending on the number of datapoints the tree build time could be quite long, but for our purposes the cell below will be quick.
-
-# %%
-tree = pvdeg.geospatial.meta_KDtree(meta_df=features.meta_data)
-
-# %% [markdown]
-# # Remaining Points After Downsampling
-#
-# Use `plot_coords` to see all latitude-longitude coordinate pairs included in a scenario's metadata.
You can provide corners of a bounding box to change the extent of the matplotlib plot.
-
-# %%
-features.plot_coords()
-
-# %% [markdown]
-# # Identifying Mountains from Geospatial Data I
-#
-# Identify mountains using the methods below. There are currently 2 methods to identify mountains from our data. Both will add a boolean column named `mountain` to our `GeospatialScenario.meta_data` attribute indicating whether a data point is in/near mountains or not.
-#
-# - `classify_mountains_weights` uses a k-nearest-neighbors search and calculates local changes in height with different methods (the default is the `mean` absolute change compared to a point's neighbors), then assigns a weight to each point. The percentile argument determines the sensitivity for classifying points as mountains (higher weight, more mountainous). This is generally better than the following option.
-# - `classify_mountains_radii` uses 2 nearest-neighbor radius searches at each point and compares the average elevation within the inner radius to the average elevation within the outer radius.
-
-# %%
-features.classify_mountains_weights(kdtree=tree)
-
-# %% [markdown]
-# # Identifying Mountains from Geospatial Data II
-#
-# We do not need to provide a `kdtree` when we call this method. We can provide a `kdtree` if we create one first with `pvdeg.geospatial.meta_KDtree`, but it will work without supplying it:
-# GeospatialScenario will create a kdtree internally for us. So the method can simply be called as shown below.
-
-# %%
-# this will work without running the above cells to create a kdtree or the cell directly above where we utilize the created kdtree.
-features.classify_mountains_weights()
-
-# %% [markdown]
-# # Convenience Plotting for Binary Classifications
-#
-# The `plot_meta_classification` method allows you to plot your data with respect to a boolean classifier present in the `scenario.meta_data` dataframe columns.
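The k-d tree idea from the "Intro to KDTrees" section can be illustrated in pure Python, with no scikit-learn dependency. This is a toy sketch of the data structure itself, not pvdeg's `meta_KDtree`; it uses plain Euclidean distance on (lat, lon) pairs, which is fine for an illustration:

```python
import math


def build_kdtree(points, depth=0):
    """Recursively build a 2-D k-d tree over (lat, lon) pairs.

    Each node is (point, left_subtree, right_subtree); empty subtrees are None.
    """
    if not points:
        return None
    axis = depth % 2  # alternate splitting on lat (0) and lon (1)
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (
        points[mid],
        build_kdtree(points[:mid], depth + 1),
        build_kdtree(points[mid + 1:], depth + 1),
    )


def nearest(node, target, depth=0, best=None):
    """Return the stored point closest to `target` (Euclidean distance)."""
    if node is None:
        return best
    point, left, right = node
    if best is None or math.dist(point, target) < math.dist(best, target):
        best = point
    axis = depth % 2
    near, far = (left, right) if target[axis] < point[axis] else (right, left)
    best = nearest(near, target, depth + 1, best)
    # only descend the far side if the splitting plane is closer than best-so-far
    if abs(target[axis] - point[axis]) < math.dist(best, target):
        best = nearest(far, target, depth + 1, best)
    return best


coords = [(39.7, -105.0), (40.0, -105.3), (35.1, -106.6), (47.6, -122.3)]
tree = build_kdtree(coords)
print(nearest(tree, (39.8, -105.05)))  # → (39.7, -105.0)
```

The pruning step is what makes this faster than scanning a table: whole subtrees are skipped whenever the splitting plane is farther away than the best match found so far.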
-
-# %%
-features.plot_meta_classification(col_name="mountain")
-
-# %% [markdown]
-# # Identifying Mountains Within a Bounding Box
-#
-# Sometimes we only want information from one area. For more control over the selection process, we can use a bounding box to pick only one area from an existing superset of data.
-# Either provide the top-left and bottom-right (latitude-longitude) pairs, or a 2D numpy array of pairs whose most extreme entries will be used to form the bounding box.
-
-# %%
-features.classify_mountains_radii(
-    kdtree=tree,
-    elevation_floor=0,
-    bbox_kwarg={
-        "coords": np.array(
-            [
-                [47.960502, -115.048828],
-                [47.842658, -101.118164],
-                [36.738884, -113.686523],
-                [36.633162, -100.283203],
-            ]
-        )
-    },
-)
-
-# %%
-features.plot_meta_classification(col_name="mountain")
-
-# %% [markdown]
-# # Identifying Coastlines from Geospatial Data
-#
-# Use `classify_feature` to identify points within a search radius of any [natural earth features](https://www.naturalearthdata.com/features/). The following examples show the process of identifying coastline datapoints.
-
-# %%
-features.classify_feature(
-    kdtree=tree,
-    feature_name="coastline",
-    resolution="10m",
-    radius=0.5,  # change this to get a wider band of selected data near the coastline
-)
-
-# %%
-features.plot_meta_classification(col_name="coastline")
-
-# %% [markdown]
-# # Finding Rivers
-#
-# Above we saw how to find points near coastlines; now let's do points near rivers.
It is very similar.
-
-# %%
-features.classify_feature(
-    kdtree=tree, feature_name="rivers_lake_centerlines", resolution="10m", radius=0.2
-)
-
-# %%
-features.plot_meta_classification(col_name="rivers_lake_centerlines")
diff --git a/tutorials/04_scenario/scripts/04_scenario_geospatial.py b/tutorials/04_scenario/scripts/04_scenario_geospatial.py
deleted file mode 100644
index 2438baf0..00000000
--- a/tutorials/04_scenario/scripts/04_scenario_geospatial.py
+++ /dev/null
@@ -1,96 +0,0 @@
-# %% [markdown]
-# # Scenario Geospatial (HPC)
-#
-
-# %%
-import pvdeg
-
-# %% [markdown]
-# # Define Geospatial Scenario Object
-#
-# To perform geospatial analysis we can create a `GeospatialScenario` object. Alternatively, to perform single-location analysis use `Scenario`. Scenario and GeospatialScenario are generalized classes that can be used to replace the legacy functional pvdeg analysis approach with an object-oriented one.
-
-# %%
-geospatial_standoff_scenario = pvdeg.GeospatialScenario(
-    name="standoff geospatial",
-)
-
-# %% [markdown]
-# # Add Location
-# To add locations for geospatial analysis we will use the ``.addLocation`` method. We can choose to downselect from the NSRDB to a country, state, and county, in that order. *Support for multiple of each category in list form soon.* The ``see_added`` flag allows us to see the gids we have added to the scenario.
-
-# %%
-geospatial_standoff_scenario.addLocation(
-    state="Colorado", county="Summit", see_added=True, downsample_factor=3
-)
-
-# %% [markdown]
-# # Add Functions to the Pipeline
-# The scenario has a queue of jobs to perform. These are stored in an attribute called ``pipeline``; you can directly update the pipeline, but this will bypass the assistance given in creating the job function and parameters. The easiest way to add a job to the pipeline is the ``.updatePipeline`` method.
For geospatial analysis, weather and metadata are collected and stored in the scenario at the time of the ``.addLocation`` method call, so we do not need to include them below; but if we have other function kwargs to include, they should go in the ``func_params`` argument.
-#
-# Only a few pvdeg functions are currently supported for geospatial analysis. See the docstring for ``.updatePipeline`` to view currently supported functions. ``updatePipeline`` will not let you add unsupported geospatial functions. The ``see_added`` flag allows us to see the new job added to the pipeline.
-
-# %%
-geospatial_standoff_scenario.addJob(func=pvdeg.standards.standoff, see_added=True)
-
-# %%
-geospatial_standoff_scenario
-
-# %% [markdown]
-# # Run the job in the pipeline
-#
-# Currently ``scenario`` only supports one geospatial analysis at a time. We cannot have two geospatial jobs at the same time.
-
-# %%
-geospatial_standoff_scenario.run()
-
-# %% [markdown]
-# # Directly Access Results Attribute
-#
-# We can view the results of the scenario pipeline using ``.viewScenario`` as shown above. The results will be displayed only if the pipeline has been run. Alternatively, we can directly view the ``results`` attribute of the scenario.
-
-# %%
-geospatial_standoff_scenario.results
-
-# %% [markdown]
-# # Cleanup
-#
-# The scenario object will store its attributes in a file in the Python script's current working directory. If we want to delete this file when we are done with the scenario instance, we can use the ``.clean()`` method as shown below.
-
-# %%
-geospatial_standoff_scenario.clean()
-
-# %% [markdown]
-# # Example Geospatial Functionality
-# Many functions are supported for geospatial analysis; here are a few.
-# - ``pvdeg.standards.standoff``
-# - ``pvdeg.humidity.module``
-# - ``pvdeg.letid.calc_letid_outdoors``
-#
-# See the Geospatial Templates tutorial for an example of this.
-
-# %%
-geospatial_humidity_scenario = pvdeg.GeospatialScenario(
-    name="humidity scenario", geospatial=True
-)
-
-geospatial_humidity_scenario.addLocation(
-    state="Colorado", county="Jefferson", see_added=True
-)
-
-geospatial_humidity_scenario.addJob(
-    func=pvdeg.humidity.module,
-    func_params={
-        "backsheet_thickness": 0.3,  # mm, thickness of PET backsheet
-        "back_encap_thickness": 0.46,  # mm, thickness of EVA backside encapsulant
-        "encapsulant": "W001",  # EVA encapsulant
-        "backsheet": "W017",  # PET backsheet
-    },
-    see_added=True,
-)
-
-# %%
-geospatial_humidity_scenario.run()
-
-# %%
-geospatial_humidity_scenario.results
diff --git a/tutorials/04_scenario/scripts/05_scenario_mountain_downselect.py b/tutorials/04_scenario/scripts/05_scenario_mountain_downselect.py
deleted file mode 100644
index 4068b532..00000000
--- a/tutorials/04_scenario/scripts/05_scenario_mountain_downselect.py
+++ /dev/null
@@ -1,124 +0,0 @@
-# %% [markdown]
-# # Mountain Downselect (HPC)
-#
-
-# %%
-import pvdeg
-import numpy as np
-
-# %% [markdown]
-# # Adding Points
-#
-# We are going to add all of the points in the American West to the scenario and downsample by a factor of 1. This will include only half of the points in the latitude axis and half in the longitude axis.
-
-# %%
-dynamic_points = pvdeg.GeospatialScenario(name="dynamic-selection")
-
-dynamic_points.addLocation(
-    state=["CO", "UT"],  # , 'NM', 'NV', 'ID', 'WY', 'AZ', 'CA', 'OR', 'WA'],
-    downsample_factor=1,
-)
-
-# %% [markdown]
-# # Preview The Scenario's Points
-#
-# Use `plot_coords` to get a quick snapshot of all coordinates included in the scenario's metadata.
-
-# %%
-dynamic_points.plot_coords(
-    coord_1=[48.574790, -130.253906],  # larger-scale view
-    coord_2=[25.482951, -68.027344],
-    size=0.005,
-)
-
-# %%
-dynamic_points.meta_data
-
-# %% [markdown]
-# # Downselecting
-#
-# Using weighted random choices based on elevation, we will create a sparse grid from the full metadata for fast calculations. This requires sklearn to be installed, but it is not in the `pvdeg` dependency list so you will have to install it separately.
-#
-# ## Normalization
-#
-# At each metadata point in our dataset we will calculate a weight based on its changes in elevation compared to its neighbors. The higher the weight, the greater the change in elevation from a point's immediate neighbors. The downselection methods and functions use these weights to randomly select a subset of the datapoints, preferentially selecting those with higher weights.
-#
-# We have some control over which points get selected because all points' weights must be normalized (mapped from 0 to 1) before downselecting. We can apply a function such as $e^x$ or $\log x$ to the weights during normalization. This changes the distribution of weights that are chosen, removing points from the mountains and adding them to areas with fewer changes in elevation, or vice versa.
-#
-# *Note: `pvdeg`'s downselection functions use `numpy.random`; the random seed is not fixed so the result will change between runs.*
-
-# %% [markdown]
-# # Providing a KdTree
-#
-# As shown below, the lines to create a kdtree are commented out.
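The normalization and weighted downselection described under "Downselecting" can be sketched with numpy alone. This is illustrative only — the function and argument names here are invented for the sketch, and pvdeg's internals may differ:

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed so the sketch is reproducible

# stand-in elevation-change weights, one per metadata point
weights = rng.uniform(0.0, 500.0, size=1000)


def normalize(w, method="linear"):
    """Map raw weights to [0, 1], optionally reshaping their distribution."""
    if method == "exp":
        w = np.exp(w / w.max())  # emphasize high-weight (mountainous) points
    elif method == "log":
        w = np.log1p(w)          # compress the high end of the distribution
    return (w - w.min()) / (w.max() - w.min())


norm = normalize(weights, method="log")
probs = norm / norm.sum()

# preferentially keep high-weight points while downselecting to 50%
keep = rng.choice(len(weights), size=len(weights) // 2, replace=False, p=probs)
```

The choice of `exp` vs `log` is exactly the knob the tutorial text describes: it shifts selection probability toward or away from the most mountainous points before the weighted draw.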
-#
-
-# %%
-# Set random seed for reproducible results
-np.random.seed(42)
-
-# west_tree = pvdeg.geospatial.meta_KDtree(meta_df=dynamic_points.meta_data)
-
-dynamic_points.downselect_elevation_stochastic(
-    # kdtree=west_tree,
-    downselect_prop=0.5,
-    normalization="linear",
-)
-
-# %%
-dynamic_points.plot_coords()
-
-# %% [markdown]
-# # Extracting from Scenario
-#
-# Scenarios provide an easy way to select and downsample geospatial data, but we can easily pull out the data to use other `pvdeg` functions on it. In the cell below, we extract the weather data and meta data from the scenario and take only the matching entries from the weather. Then we load the xarray dataset into memory. Previously it was stored lazily out of memory, but we want to do operations on it. (Chunking causes issues during calculation, so this eliminates any chunks.)
-
-# %%
-weather = dynamic_points.weather_data
-
-sub_weather = weather.sel(
-    gid=dynamic_points.meta_data.index
-)  # downselect weather using restricted metadata set
-
-sub_weather = sub_weather.compute()  # load into memory
-
-# %% [markdown]
-# # Geospatial Calculation
-#
-# Run a standoff calculation on the extracted scenario weather data and scenario meta data.
-
-# %%
-# geospatial analysis now
-
-geo = {
-    "func": pvdeg.standards.standoff,
-    "weather_ds": sub_weather,
-    "meta_df": dynamic_points.meta_data,
-}
-
-analysis_result = pvdeg.geospatial.analysis(**geo)
-
-# %% [markdown]
-# # Viewing Results
-#
-# Inspecting the xarray dataset below shows us that we have many Not a Number (NaN) entries. These occur because we did not provide weather data at every point in the grid of possible latitude-longitude pairs. Expanding the `x` data variable shows that there are some valid results, but these are uncommon.
-
-# %%
-analysis_result
-
-# %% [markdown]
-# # Plotting Sparse Data I
-#
-# If we try to plot existing data with the current plotting methods exposed by `pvdeg` we will encounter issues.
This will produce poor plotting results.
-
-# %%
-# This cell demonstrates the issue with plotting sparse data directly
-# It will raise a TypeError because there's no numeric data to plot
-try:
-    pvdeg.geospatial.plot_USA(analysis_result["x"])
-except TypeError as e:
-    print(f"Expected error when plotting sparse data: {e}")
-    print("This is why we need to use plot_sparse_analysis() instead (see next cell)")
-
-# %%
-pvdeg.geospatial.plot_sparse_analysis(analysis_result, data_var="x", method="linear")
diff --git a/tutorials/04_scenario/scripts/06_scenario_single_location.py b/tutorials/04_scenario/scripts/06_scenario_single_location.py
deleted file mode 100644
index dd9daa36..00000000
--- a/tutorials/04_scenario/scripts/06_scenario_single_location.py
+++ /dev/null
@@ -1,154 +0,0 @@
-# %% [markdown]
-# # Single Location (HPC)
-#
-# Author: Tobin Ford | tobin.ford@nrel.gov
-#
-# 2024
-# ****
-#
-# A simple object-oriented workflow walkthrough using pvdeg.
-
-# %%
-# if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells:
-# # !pip install pvdeg
-
-# %%
-import pvdeg
-import os
-
-# %%
-# This information helps with debugging and getting support :)
-import sys
-import platform
-
-print("Working on a ", platform.system(), platform.release())
-print("Python version ", sys.version)
-print("pvdeg version ", pvdeg.__version__)
-
-# %% [markdown]
-# # Define Single Point Scenario Object
-# Scenario is a general class that can be used to replace the legacy functional pvdeg analysis approach with an object-oriented one. ``Scenario`` can perform single-location or geospatial analysis. The scenario constructor takes many arguments, but the only required one for the following use cases is the ``name`` attribute. It is visible when we display the entire scenario and is present in the file of saved information about the scenario.
We also need to provide the class constructor with our API key and email. -# -# A way around this is to provide the weather and metadata in the pipeline job arguments or you can load data from somewhere else and provide it in the same fashion. -# -#
-# Please use your own API key: The block below makes an NSRDB API to get weather and meta data. This tutorial will work with the DEMO Key provided, but it will take you less than 3 minutes to obtain your own at https://developer.nrel.gov/signup/ so register now.) -#
- -# %% -simple_scenario = pvdeg.Scenario( - name="Point Minimum Standoff", email="user@mail.com", api_key="DEMO_KEY" -) - -# %% [markdown] -# # Adding A Location -# To add a single point using data from the Physical Solar Model (PSM3), simply feed the scenario a single coordinate in tuple form via the ``addLocation`` method. Currently this is the only way to add a location to a non-geospatial scenario, all of the other arguments are unusable when ``Scenario.geospatial == False``. -# -# Attempting to add a second location by calling the method again with a different coordinate pair will overwrite the old location data stored in the class instance. - -# %% -simple_scenario.addLocation( - lat_long=(25.783388, -80.189029), -) - -# %% -simple_scenario.weather_data - -# %% [markdown] -# # Scenario Pipelines -# -# The pipeline is a list of tasks called jobs for the scenario to run. We will populate the pipeline with a list of jobs before executing them all at once. -# -# To add a job to the pipeline use the ``updatePipeline`` method. Two examples of adding functions to the pipeline will be shown below. - -# %% [markdown] -# # Adding a job without function arguments -# -# The simplest case of adding a job to the pipeline is when it only requires us to provide simple weather and metadata. In the function definition and docstring these appear as ``weather_df`` and ``meta``. Since these attributes are contained in our scenario class instance we do not have to worry about them. We can simply add the function as shown below. - -# %% -simple_scenario.addJob(func=pvdeg.standards.standoff) - -# %% [markdown] -# # Adding a job with function arguments -# -# When adding a job that contains a function requiring other arguments such as ``solder_fatigue`` which requires a value for ``wind_factor``, we will need to provide it. The most straightforeward way to do this is using a kwargs dictionary and passing it to the function. We do not unpack the dictionary before passing it. 
This is done inside of the scenario at pipeline runtime (when ``runPipeline`` is called).
-
-# %%
-kwargs = {"wind_factor": 0.33}
-
-simple_scenario.addJob(func=pvdeg.fatigue.solder_fatigue, func_kwarg=kwargs)
-
-# %% [markdown]
-# # Adding a job with weather and metadata from outside of the class
-# ## Not functional
-#
-# We could directly set the weather data with scenario.weather_data = weather and scenario.meta_data = meta, but that would apply to all of the jobs in the pipeline.
-#
-# If local (or other) weather data is available and we want to use it rather than the PSM3 data at a latitude and longitude, we can also provide the weather and metadata in the function arguments. This is best avoided where possible, but it follows the same syntax as providing other function arguments. See the example below.
-
-# %%
-PSM_FILE = os.path.join(pvdeg.DATA_DIR, "psm3_demo.csv")
-weather, meta = pvdeg.weather.read(PSM_FILE, "psm")
-
-kwargs = {"weather_df": weather, "meta": meta}
-
-simple_scenario.addJob(func=pvdeg.standards.standoff, func_kwarg=kwargs)
-
-# FIX THIS CASE IN SCENARIO CLASS
-# (simple_scenario.pipeline[1]['job'])(**simple_scenario.pipeline[1]['params'])
-
-# %% [markdown]
-# # View Scenario
-#
-# The ``viewScenario`` method provides an overview of the information contained within your scenario object. Here you can see whether it contains the location's weather and metadata, as well as the jobs in the pipeline and their arguments.
-
-# %%
-simple_scenario.viewScenario()
-
-# %% [markdown]
-# # Display
-#
-# The fancier cousin of viewScenario. It only works in a Jupyter environment, as it uses a special IPython backend to render the HTML and JavaScript.
-#
-# It can be called with just the Scenario instance as follows
-# `simple_scenario`
-#
-# or using the display function
-# `display(simple_scenario)`
-
-# %%
-simple_scenario
-
-# %% [markdown]
-# # Executing Pipeline Jobs
-# To run the pipeline after we have populated it with the desired jobs, call the ``runPipeline`` method on our scenario instance. This will run all of the jobs we have previously added. The functions that need weather and metadata will grab it from the scenario instance using the correct location added above. The pipeline job results will be saved to the scenario instance.
-
-# %%
-simple_scenario.run()
-
-# %% [markdown]
-# # Results Series
-# We will use a series to store the various return values of functions run in our pipeline. The series can partially obscure the dataframes within it, so use a job's function name to access its dataframe. To get one of the results we can index it using dictionary syntax. If the job was called `'KSDJQ'`, do `simple_scenario.results['KSDJQ']` to directly access the result for that job.
-
-# %%
-print(simple_scenario.results)
-print("We can't see our data in here so we need to do another step", end="\n\n")
-
-# to see all available outputs of results do
-print(
-    f"this is the list of all available frames in results : {simple_scenario.results.index}\n"
-)
-
-# loop over all results and display
-for keys, results in simple_scenario.results.items():
-    print(keys)
-    display(results)
-
-# %% [markdown]
-# # Cleaning Up the Scenario
-#
-# Each scenario object creates a directory named ``pvd_job_...`` that contains information about the scenario instance. To remove the directory and all of its information call ``clean`` on the scenario. This will permanently delete the directory created by the scenario.
- -# %% -simple_scenario.clean() diff --git a/tutorials/04_scenario/scripts/07_scenario_temperature.py b/tutorials/04_scenario/scripts/07_scenario_temperature.py deleted file mode 100644 index 607d2a78..00000000 --- a/tutorials/04_scenario/scripts/07_scenario_temperature.py +++ /dev/null @@ -1,163 +0,0 @@ -# %% [markdown] -# # Temperature Scenario (HPC) -# - -# %% -# if running on google colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# # !pip install pvdeg - -# %% -import pvdeg -import os - -# %% -# This information helps with debugging and getting support :) -import sys -import platform - -print("Working on a ", platform.system(), platform.release()) -print("Python version ", sys.version) -print("pvdeg version ", pvdeg.__version__) - -# %% [markdown] -# # Adding Modules and Pipeline Jobs (Run Functions on Scenario Object) -# -# Material: `OX003` corresponds to a set of EVA material parameters from the default file `O2Permeation.json` in the `pvdeg/data` directory. Look in these files to see available options. 
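The `sapm` and `pvsyst` temperature models used in this script both map irradiance, air temperature, and wind speed to a module temperature. As a rough illustration, the SAPM module-temperature relation has the form T_mod = E * exp(a + b * v) + T_air; the coefficients below are the commonly quoted open-rack glass/polymer values (as published in pvlib) and should be treated as assumptions for this sketch:

```python
import math


def sapm_module_temp(poa_irradiance, temp_air, wind_speed, a=-3.56, b=-0.075):
    """SAPM module temperature [degC] — illustrative sketch, not pvdeg's code.

    poa_irradiance : plane-of-array irradiance [W/m^2]
    temp_air       : ambient air temperature [degC]
    wind_speed     : wind speed [m/s]
    a, b           : empirical coefficients (open_rack_glass_polymer defaults)
    """
    return poa_irradiance * math.exp(a + b * wind_speed) + temp_air


# a sunny, calm afternoon: the module runs well above ambient
print(round(sapm_module_temp(1000, 25, 1), 1))
```

Swapping the racking configuration in `addModule` effectively swaps coefficient sets like `a` and `b`, which is why the same weather produces different module temperatures per module definition.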
- -# %% -scene_temp = pvdeg.Scenario( - name="temperature and degradation", - api_key="DEMO_KEY", - email="user@mail.com", -) - -scene_temp.addLocation( - lat_long=(25.783388, -80.189029), -) - -# this module will be overwritten because another with the same name is added afterwards -scene_temp.addModule(module_name="sapm_1", temperature_model="sapm") - -scene_temp.addModule( - module_name="sapm_1", - racking="open_rack_glass_polymer", - materials="OX003", - temperature_model="sapm", - irradiance_kwarg={"azimuth": 120, "tilt": 30}, - model_kwarg={"irrad_ref": 1100}, -) - -scene_temp.addModule( - module_name="pvsyst_1", - racking="freestanding", - materials="OX003", - temperature_model="pvsyst", - irradiance_kwarg={"azimuth": 180, "tilt": 0}, - model_kwarg={"module_efficiency": 0.15}, -) -scene_temp.addModule( - module_name="sapm_2", - racking="open_rack_glass_polymer", - materials="OX003", - temperature_model="sapm", - irradiance_kwarg={"azimuth": 120, "tilt": 30}, - model_kwarg={"irrad_ref": 1000}, -) -scene_temp.addModule( - module_name="sapm_3", - racking="open_rack_glass_polymer", - materials="OX003", - temperature_model="sapm", - irradiance_kwarg={"azimuth": 180, "tilt": 0}, - model_kwarg={"irrad_ref": 1000}, -) - -scene_temp.addModule( - module_name="pvsyst_2", - racking="freestanding", - materials="OX003", - temperature_model="pvsyst", - irradiance_kwarg={"azimuth": 180, "tilt": 0}, - model_kwarg={"module_efficiency": 0.2}, -) - -scene_temp.addJob( - func=pvdeg.temperature.temperature, - func_kwarg={"cell_or_mod": "cell"}, -) - -scene_temp.addJob( - func=pvdeg.degradation.vantHoff_deg, - func_kwarg={"I_chamber": 1000, "temp_chamber": 25}, -) - -scene_temp.addJob( - func=pvdeg.degradation.vantHoff_deg, - func_kwarg={"I_chamber": 1000, "temp_chamber": 30}, -) - -scene_temp.addJob( - func=pvdeg.degradation.IwaVantHoff, -) - -# %% [markdown] -# # Run and View Scenario Results - -# %% -scene_temp.run() - -scene_temp - -# %% -scene_temp.dump() - -# %% [markdown] 
-# # Plotting and Extracting Results
-# These methods are independent of one another (i.e., you do not need to extract before plotting, but both are shown below).
-
-# %%
-import datetime
-
-t0 = datetime.datetime(1970, 1, 1, 0, 0)
-tf = datetime.datetime(1970, 1, 1, 23, 59)
-
-# Get the first function result dynamically
-function_ids = [key[1] for key in scene_temp.results.keys() if key[0] == "function"]
-if function_ids:
-    temp_df = scene_temp.extract(
-        ("function", function_ids[0]), tmy=True, start_time=t0, end_time=tf
-    )
-    display(temp_df)
-else:
-    print("No function results found")
-
-# %%
-# Get the first function result dynamically for plotting
-function_ids = [key[1] for key in scene_temp.results.keys() if key[0] == "function"]
-if function_ids:
-    scene_temp.plot(
-        ("function", function_ids[0]),
-        tmy=True,
-        start_time=t0,
-        end_time=tf,
-        title="single day cell temperature",
-    )
-else:
-    print("No function results found")
-
-# %% [markdown]
-# # Create a Copy of a Scenario
-
-# %%
-from pathlib import Path
-
-parent_dir = Path(pvdeg.PVDEG_DIR).parent
-new_path = parent_dir / "tutorials" / "data" / "temperature_and_degradation.json"
-
-copy = pvdeg.scenario.Scenario.load_json(
-    file_path=str(new_path),
-    email="user@mail.com",
-    api_key="DEMO_KEY",
-)
-
-# copy
diff --git a/tutorials/04_scenario/scripts/08_module_standoff_iec63126.py b/tutorials/04_scenario/scripts/08_module_standoff_iec63126.py
deleted file mode 100644
index cdd201ef..00000000
--- a/tutorials/04_scenario/scripts/08_module_standoff_iec63126.py
+++ /dev/null
@@ -1,499 +0,0 @@
-# %% [markdown]
-# # Tools - Module Standoff for IEC TS 63126
-#
-# ## Calculation of module standoff distance according to IEC TS 63126
-#
-# **Requirements:**
-# - Local weather data file or site longitude and latitude
-#
-# **Objectives:**
-# 1. Import weather data.
-# 2. Calculate installation standoff - Level 1 and Level 2.
-# 3. Calculate $X_{eff}$ from provided module temperature data.
-# 4.
Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$.
-# 5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$.
-# 6. Plot $X_{min}$ for Level 1, Level 2, or a $T_{98}$ for a given region.
-#
-# **Background:**
-#
-# This notebook calculates the minimum effective standoff distance ($X_{eff}$) necessary for roof-mounted PV modules to ensure that the $98^{th}$ percentile operating temperature, $T_{98}$, remains under 70°C for compliance with IEC 61730 and IEC 61215. For $T_{98}$ values above 70°C or 80°C, testing must be done to the specifications for Level 1 and Level 2 of IEC TS 63126. This method is outlined in the appendix of IEC TS 63126 and is based on the model from *[King 2004] and data from **[Fuentes, 1987] to model the approximate exponential decay in temperature, $T(X)$, with increasing standoff distance, $X$, as
-#
-# $$ X = -X_0 \ln\left(1-\frac{T_0-T}{\Delta T}\right), \quad \text{(Equation 1)} $$
-#
-# where $T_0$ is the temperature for $X=0$ (insulated-back) and $\Delta T$ is the temperature difference between an insulated-back ($X=0$) and open-rack mounting configuration ($X=\infty$).
-#
-# We used pvlib and data from the National Solar Radiation Database (NSRDB) to calculate the module temperatures for the insulated-back and open-rack mounting configurations and apply our model to obtain the minimum standoff distance for roof-mounted PV systems to achieve a temperature lower than a specified $T_{98}$. The following figure showcases this calculation for the entire world for an $X_{eff}$ that results in $T_{98}$=70°C. Values of $X_{eff}$ higher than this will require Level 1 or Level 2 certification.
-#
-# $*$ D. L. King, W. E. Boyson, and J. A. Kratochvil, "Photovoltaic array performance model," SAND2004-3535, Sandia National Laboratories, Albuquerque, NM, 2004. \
-# $**$ M. K. Fuentes, "A simplified thermal model for Flat-Plate photovoltaic arrays," United States, 1987-05-01 1987.
https://www.osti.gov/biblio/6802914
-#
-
-# %% [markdown]
-# ![alt text](images/T98_70C_standoff_Map.png)
-
-# %%
-# if running on google colab, uncomment the next line and execute this cell to install the dependencies
-# and prevent "ModuleNotFoundError" in later cells:
-# #!pip install pvdeg
-
-# %%
-import os
-import pvdeg
-import pandas as pd
-from pvdeg import DATA_DIR
-import dask
-import matplotlib.pyplot as plt
-import seaborn as sns
-import math
-import numpy as np
-
-# %%
-# This information helps with debugging and getting support :)
-import sys
-import platform
-
-print("Working on a ", platform.system(), platform.release())
-print("Python version ", sys.version)
-print("Pandas version ", pd.__version__)
-print("pvdeg version ", pvdeg.__version__)
-print("dask version", dask.__version__)
-print(DATA_DIR)
-
-# %% [markdown]
-# # 1. Import Weather Data
-#
-# The function has these minimum requirements when using a weather data file:
-# - Weather data containing (at least) DNI, DHI, GHI, Temperature, RH, and Wind-Speed data at module level.
-# - Site meta-data containing (at least) latitude, longitude, and time zone
-#
-# Alternatively, one can get meteorological data from the NSRDB or PVGIS with just the longitude and latitude. This function for the NSRDB (via NSRDB 'PSM3') works primarily for most of North America and South America. PVGIS works for most of the rest of the world (via SARAH 'PVGIS'). See the "Weather Database Access.ipynb" tutorial on PVdeg or Jensen et al. https://doi.org/10.1016/j.solener.2023.112092 for satellite coverage information.
-#
-
-# %%
-# Get data from a supplied data file (Do not use the next box of code if using your own file)
-weather_file = os.path.join(DATA_DIR, "psm3_demo.csv")
-WEATHER_df, META = pvdeg.weather.read(weather_file, "csv")
-print(META)
-
-# %%
-# This routine will get a meteorological dataset from anywhere in the world where it is available
-# weather_id = (24.7136, 46.6753) #Riyadh, Saudi Arabia
-# weather_id = (35.6754, 139.65) #Tokyo, Japan
-# weather_id = (-43.52646, 172.62165) #Christchurch, New Zealand
-# weather_id = (64.84031, -147.73836) #Fairbanks, Alaska
-# weather_id = (65.14037, -21.91633) #Reykjavik, Iceland
-# weather_id = (33.4152, -111.8315) #Mesa, Arizona
-# WEATHER_df, META = pvdeg.weather.get_anywhere(id=weather_id)
-# print(META)
-
-# %% [markdown]
-# # 2. Calculate Installation Standoff Minimum - Level 1 and Level 2
-#
-# According to IEC TS 63126, Level 0, Level 1 and Level 2 certification is limited to T₉₈<70°C, <80°C and <90°C, respectively. Level 0 certification is essentially compliance with IEC 61730 and IEC 61215. The default value of T₉₈<70°C represents the minimum gap to avoid higher-temperature certification according to IEC TS 63126. This minimum standoff ($x_{min}$) is the distance between the bottom of the module frame and the roof and can be estimated for a given environment as
-#
-# $$ X_{min} = -X_0 \ln\left(1-\frac{T_{98,0}-T_{98}}{T_{98,0}-T_{98,inf}}\right), \quad \text{(Equation 2)} $$
-#
-# where $T_{98,0}$ is the $98^{th}$ percentile temperature for an insulated-back module and $T_{98,inf}$ is the $98^{th}$ percentile temperature for an open-rack mounted module.
-#
-# Once the meteorological data has been obtained, the input parameter possibilities are:
-#
-# - T₉₈ : Does not necessarily need to be set at 70°C or 80°C for IEC TS 63126; you might want to use a different number to compensate for a thermal aspect of the particular system you are considering. The default is 70°C.
-# - tilt : tilt from horizontal of PV module. The default is 0°.
-# - azimuth : Azimuth in degrees from North. The default is 180° (south-facing).
-# - sky_model : pvlib-compatible model for generating sky characteristics (Options: 'isotropic', 'klucher', 'haydavies', 'reindl', 'king', 'perez'). The default is 'isotropic'.
-# - temp_model : pvlib-compatible module temperature model (Options: 'sapm', 'pvsyst', 'faiman', 'sandia'). The default is 'sapm'.
-# - conf_0 : Temperature model for the hottest mounting configuration. The default is "insulated_back_glass_polymer".
-# - conf_inf : Temperature model for open-rack mounting. The default is "open_rack_glass_polymer".
-# - x_0 : Thermal decay constant [cm] (see documentation). The default is 6.5 cm.
-# - wind_factor : Wind speed power-law correction factor to account for different wind speed measurement heights between the weather database (e.g. NSRDB) and the temperature model (e.g. SAPM). The default is 0.33.
-
-# %% [markdown]
-# The following is the minimal function call. It defaults to horizontal tilt and T₉₈=70°C.
-
-# %%
-standoff = pvdeg.standards.standoff(weather_df=WEATHER_df, meta=META)
-print(pvdeg.standards.interpret_standoff(standoff))
-
-# %% [markdown]
-# The following is a full function call for both T₉₈=70°C and 80°C separately, even though the second standoff distance can be calculated using only T98_0 and T98_inf. With this function, one may also want to change the tilt, azimuth, or T98.
-
-# %%
-standoff_1 = pvdeg.standards.standoff(
-    weather_df=WEATHER_df,
-    meta=META,
-    T98=70,
-    tilt=META["latitude"],
-    azimuth=None,
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    x_0=6.5,
-    wind_factor=0.33,
-)
-print("First calculation standoff = ", "%.1f" % standoff_1["x"].iloc[0], " cm.")
-standoff_2 = pvdeg.standards.standoff(
-    weather_df=WEATHER_df,
-    meta=META,
-    T98=80,
-    tilt=META["latitude"],
-    azimuth=None,
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    x_0=6.5,
-    wind_factor=0.33,
-)
-print("Second calculation standoff = ", "%.1f" % standoff_2["x"].iloc[0], " cm.")
-print(pvdeg.standards.interpret_standoff(standoff_1=standoff_1, standoff_2=standoff_2))
-
-# %% [markdown]
-# # 3. Calculate $X_{eff}$ from provided module temperature data.
-#
-# To do this calculation, one must use a set of data with:
-# - meteorological irradiance data sufficient to calculate the POA irradiance (DHI, GHI, and DNI),
-# - ambient temperature data,
-# - wind speed at module height (wind_factor=0.33 will be used unless otherwise specified),
-# - temperature measurements of the module in the test system, ideally measured under a worst-case scenario that maximizes the module temperature for the given site,
-# - geographic metadata, including longitude and latitude.
-#
-# To create a weather file of your own, copy the format of the example file 'xeff_demo.csv'. It is formatted with the first row containing metadata variable names, the second row containing the corresponding values, the third row containing meteorological data headers, and all the remaining rows containing the meteorological data.
-#
-# Before this calculation, the data should also be filtered to remove times when the sun is not shining or when snow is likely to be on the module.
The recommendations and programmed defaults are to use poa_min=100 W/m² and to keep only data where the ambient temperature is above t_amb_min=0°C.
-
-# %%
-# Read the weather file
-weather_file = os.path.join(DATA_DIR, "xeff_demo.csv")
-xeff_weather, xeff_meta = pvdeg.weather.read(weather_file, "csv")
-# Pull the measured temperature and calculate the theoretical insulated-back and open-rack module temperatures
-T_0, T_inf, xeff_poa = pvdeg.standards.eff_gap_parameters(
-    weather_df=xeff_weather,
-    meta=xeff_meta,
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    wind_factor=0.33,
-)
-# Now calculate X_eff.
-x_eff = pvdeg.standards.eff_gap(
-    T_0,
-    T_inf,
-    xeff_weather["module_temperature"],
-    xeff_weather["temp_air"],
-    xeff_poa["poa_global"],
-    x_0=6.5,
-    poa_min=100,
-    t_amb_min=0,
-)
-print("The effective standoff for this system is", "%.1f" % x_eff, "cm.")
-
-# %% [markdown]
-# # 4. Calculate $T_{98}$ for a given azimuth, tilt, and $X_{eff}$.
-#
-# Equation 2 can be rearranged as,
-#
-# $$ T_{98} = T_{98,0} - (T_{98,0} - T_{98,inf})\left(1 - e^{-\frac{x_{eff}}{x_{0}}}\right) \quad \text{(Equation 3)} $$
-#
-# and used to calculate the $98^{th}$ percentile temperature, $T_{98}$, for a PV system having a given effective standoff height, $X_{eff}$, for an arbitrarily oriented module. Here, $T_{98,0}$ is the $98^{th}$ percentile for an insulated-back module and $T_{98,inf}$ is the $98^{th}$ percentile for a rack-mounted module. The input parameter possibilities are the same as shown in Objective #2 above, but the example below uses the default parameters. The actual tilt [degrees], azimuth [degrees], and $X_{eff}$ [cm] can be modified as desired.
-
-# %%
-# This is the minimal function call using the common default settings to estimate T₉₈.
-T_98 = pvdeg.standards.T98_estimate(
-    weather_df=WEATHER_df, meta=META, tilt=META["latitude"], azimuth=None, x_eff=10
-)
-print("The 98ᵗʰ percentile temperature is estimated to be", "%.1f" % T_98, "°C.")
-
-# %%
-# This code will calculate the temperature for an arbitrary x_eff distance. Either set of kwargs can be modified and used.
-# irradiance_kwarg = {
-#     "axis_tilt": None,
-#     "axis_azimuth": None,
-#     "x_eff": 10,
-#     "module_mount": "1_axis"}
-irradiance_kwarg = {
-    "tilt": META["latitude"],
-    "azimuth": None,
-    "x_eff": 10,
-    "module_mount": "fixed",
-}
-
-T_xeff = pvdeg.standards.x_eff_temperature_estimate(
-    weather_df=WEATHER_df, meta=META, **irradiance_kwarg
-)
-
-print(
-    "The 98ᵗʰ percentile temperature is estimated to be",
-    "%.1f" % np.percentile(T_xeff, 98),
-    "°C.",
-)
-
-# %% [markdown]
-# # 5. Plot $X_{min}$ for all azimuth and tilt for a given $T_{98}$.
-#
-# The temperature of a system is affected by its orientation. This section scans all possible tilts and azimuths, calculating the minimum standoff distance for a given $T_{98}$. As above, additional factors can also be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step across the 90° tilt and 360° azimuth spans, respectively. The default for this calculation is $T_{98}$=70°C, the boundary between Level 0 and Level 1 requirements. The temperature model information given below is unnecessary, as these default values would be populated automatically; it is included here for clarity, since it reflects standard practice per IEC TS 63126.
-#
-# $$ X_{min} = -X_0 \ln\left(1 - \frac{T_{98,0} - T}{T_{98,0} - T_{98,inf}}\right) \quad \text{(Equation 2)} $$
-
-# %%
-# Scan through all azimuths and tilts, running the minimum standoff calculation
-# Set up keyword parameters for the calculation
-
-kwarg_x = dict(
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    T98=70,
-    x_0=6.5,
-    wind_factor=0.33,
-)
-# Run the calculation
-x_azimuth_step = 10
-x_tilt_step = 10
-standoff_series = pvdeg.utilities.tilt_azimuth_scan(
-    weather_df=WEATHER_df,
-    meta=META,
-    tilt_step=x_tilt_step,
-    azimuth_step=x_azimuth_step,
-    func=pvdeg.standards.standoff_x,
-    **kwarg_x,
-)
-
-# %% [markdown]
-# The next cell creates a plot of the calculated data. Some of the things you may want to change are:
-# - cmap="Spectral_r" : Change for a different color map.
-# - plt.title : This will change the plot title.
-# - figsize=(16,4) : Change the plot dimensions and/or aspect ratio.
-# - vmax=None : This can be set to a numeric value to control the color scale maximum.
-# - vmin=0 : This controls the minimum of the color scale.
-# - h_ticks=37 : This changes the number of horizontal tick marks.
-# - v_ticks=10 : This changes the number of vertical tick marks.
-# - plt.savefig : The call at the end outputs the plot as a *.png image file; comment it out if no file is wanted.
-
-# %%
-standoff_series_df = pd.DataFrame(
-    {
-        "Tilt": standoff_series[:, 0],
-        "Azimuth": standoff_series[:, 1],
-        "Xₘᵢₙ": standoff_series[:, 2],
-    }
-)
-x_fig = plt.figure(figsize=(16, 4))
-plt.title(
-    r"Plot of $\it{Xₘᵢₙ}$ for all orientations for $\it{T₉₈}$="
-    + "%.0f" % kwarg_x["T98"]
-    + "°C.",
-    fontsize=15,
-    y=1.08,
-)
-x_fig = sns.heatmap(
-    standoff_series_df.pivot(index="Tilt", columns="Azimuth", values="Xₘᵢₙ"),
-    cbar_kws={"label": "Xₘᵢₙ", "format": "%.0f", "pad": 0.02},
-    cmap="Spectral_r",
-    vmin=0,
-    vmax=None,
-)
-
-h_ticks = 37
-x_number = math.ceil(360 / x_azimuth_step) + 1
-x_ticks = [
-    (x * (360 / (h_ticks - 1)) / x_azimuth_step + 0.5) for x in range(h_ticks - 1)
-]
-x_labels = [("%.0f" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]
-x_ticks.append(x_number - 0.5)
-x_fig.set_xticks(x_ticks)
-x_fig.set_xticklabels(x_labels, rotation=90)
-
-v_ticks = 10
-y_number = math.ceil(90 / x_tilt_step) + 1
-y_ticks = [(x * (90 / (v_ticks - 1)) / x_tilt_step + 0.5) for x in range(v_ticks - 1)]
-y_labels = [("%.0f" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]
-y_ticks.append(y_number - 0.5)
-x_fig.set_yticks(y_ticks)
-x_fig.set_yticklabels(y_labels, rotation=0)
-
-x_fig.set_xlabel("Azimuth [°]", fontsize=15, labelpad=10)
-x_fig.set_ylabel("Tilt [°]", fontsize=15)
-x_fig.figure.axes[-1].set_ylabel(r"$\it{Xₘᵢₙ}$ [cm]", size=15)
-x_fig.invert_yaxis()
-
-output_folder = os.path.join(
-    os.path.dirname(os.path.dirname(os.getcwd())), "TEMP", "results"
-)
-try:
-    os.makedirs(output_folder)
-except OSError as error:
-    print(error)
-
-plt.savefig(
-    os.path.join(output_folder, "Standoff_Scan.png"), bbox_inches="tight"
-)  # Creates an image file of the standoff plot
-plt.show()
-
-# %% [markdown]
-# # 6.
Plot $T_{98}$ for all azimuth and tilt for a given $X_{eff}$.
-#
-# The temperature of a system is affected by its orientation and by the effective standoff, $X_{eff}$, of the system. This section scans all possible tilts and azimuths, calculating $T_{98}$ for a given $X_{eff}$. As above, additional factors can be modified but are not included here for simplicity. The tilt_step and azimuth_step are the number of degrees for each step across the 90° tilt and 360° azimuth spans, respectively. This example uses $X_{eff}$=5 cm; roughly 10 cm is a common effective standoff distance on a rooftop system. A value of $X_{eff}$=None will run the calculations for an open-rack system, and $X_{eff}$=0 for an insulated-back system.
-
-# %%
-# Scan through all azimuths and tilts, running the 98ᵗʰ percentile temperature calculation.
-# Set up keyword parameters for the calculation
-kwarg_T = dict(
-    sky_model="isotropic",
-    temp_model="sapm",
-    conf_0="insulated_back_glass_polymer",
-    conf_inf="open_rack_glass_polymer",
-    x_eff=5,
-    x_0=6.5,
-    wind_factor=0.33,
-)
-# Run the calculation
-T_azimuth_step = 10
-T_tilt_step = 10
-T98_series = pvdeg.utilities.tilt_azimuth_scan(
-    weather_df=WEATHER_df,
-    meta=META,
-    tilt_step=T_tilt_step,
-    azimuth_step=T_azimuth_step,
-    func=pvdeg.standards.T98_estimate,
-    **kwarg_T,
-)
-
-# %% [markdown]
-# The next cell creates a plot of the calculated data. Some of the things you may want to change are:
-# - cmap="Spectral_r" : Change for a different color map.
-# - plt.title : This will change the plot title.
-# - figsize=(16,4) : Change the plot dimensions and/or aspect ratio.
-# - vmax=None : This can be set to a numeric value to control the color scale maximum.
-# - vmin=None : This controls the minimum of the color scale.
-# - h_ticks=37 : This changes the number of horizontal tick marks.
-# - v_ticks=10 : This changes the number of vertical tick marks.
-# - plt.savefig : The call at the end outputs the plot as a *.png image file; comment it out if no file is wanted.
-
-# %%
-# This produces the plot of the data
-T98_series_df = pd.DataFrame(
-    {"Tilt": T98_series[:, 0], "Azimuth": T98_series[:, 1], "T₉₈": T98_series[:, 2]}
-)
-T98_fig = plt.figure(figsize=(16, 4))
-if kwarg_T["x_eff"] is None:
-    plt.title(
-        r"Plot of $\it{T₉₈}$ for all orientations for an open-rack mounting.",
-        fontsize=15,
-        y=1.08,
-    )
-else:
-    plt.title(
-        r"Plot of $\it{T₉₈}$ for all orientations for $X_{eff}$="
-        + "%.0f" % kwarg_T["x_eff"]
-        + " cm.",
-        fontsize=15,
-        y=1.08,
-    )
-T98_fig = sns.heatmap(
-    T98_series_df.pivot(index="Tilt", columns="Azimuth", values="T₉₈"),
-    cbar_kws={"label": "T₉₈", "format": "%.0f", "pad": 0.02},
-    cmap="Spectral_r",
-    vmin=None,
-    vmax=None,
-)
-
-h_ticks = 37
-x_number = math.ceil(360 / T_azimuth_step) + 1
-x_ticks = [
-    (x * (360 / (h_ticks - 1)) / T_azimuth_step + 0.5) for x in range(h_ticks - 1)
-]
-x_labels = [("%.0f" % (360 / (h_ticks - 1) * x)) for x in range(h_ticks)]
-x_ticks.append(x_number - 0.5)
-T98_fig.set_xticks(x_ticks)
-T98_fig.set_xticklabels(x_labels, rotation=90)
-
-v_ticks = 10
-y_number = math.ceil(90 / T_tilt_step) + 1
-y_ticks = [(x * (90 / (v_ticks - 1)) / T_tilt_step + 0.5) for x in range(v_ticks - 1)]
-y_labels = [("%.0f" % (90 / (v_ticks - 1) * x)) for x in range(v_ticks)]
-y_ticks.append(y_number - 0.5)
-T98_fig.set_yticks(y_ticks)
-T98_fig.set_yticklabels(y_labels, rotation=0)
-
-T98_fig.set_xlabel("Azimuth [°]", fontsize=15, labelpad=10)
-T98_fig.set_ylabel("Tilt [°]", fontsize=15)
-T98_fig.figure.axes[-1].set_ylabel(r"$\it{T₉₈}$ [°C]", size=15)
-T98_fig.invert_yaxis()
-
-plt.savefig(
-    os.path.join(output_folder, "T98_Scan.png"), bbox_inches="tight"
-)  # Creates an image file of the T₉₈ plot
-plt.show()
-
-# %% [markdown]
-# # 7.
Plot $X_{min}$ for a $T_{98}$, and plot $T_{98}$ for a given region.
-#
-# This last Objective is more involved: it is set up to use substantial computational power to run many sites simultaneously and create a regional map of standoff distance.
-# For more in-depth instructions, see the tutorial "Scenario - Geospatial.ipynb" here in PVDeg.
-#
-# Step #1: Create an object, "geospatial_standoff_scenario", to be used to run the computations.
-
-# %%
-geospatial_standoff_scenario = pvdeg.GeospatialScenario(
-    name="standoff geospatial",
-    geospatial=True,
-)
-
-# %% [markdown]
-# Step #2: Identify a subset of locations from the database to run the computations.
-# All locations here are from the NSRDB.
-
-# %%
-geospatial_standoff_scenario.addLocation(
-    state="Colorado", county="Summit"
-)  # Selects a subset of NSRDB locations for the computations.
-
-# %% [markdown]
-# Step #3: Indicate which function will be run. Here the default is the standoff calculation, but it could be any other function with a keyword-argument dictionary.
-# Here the 98th percentile temperature is defined as 70°C, but any arbitrary value can be specified.
-
-# %%
-geospatial_standoff_scenario.addJob(
-    func=pvdeg.standards.standoff, func_params={"T98": 70}
-)
-
-# %% [markdown]
-# Step #4: Run the scenario.
-
-# %%
-geospatial_standoff_scenario.run()
-
-# %% [markdown]
-# Step #5: Create a plot of the standoff calculation.
-
-# %%
-geospatial_standoff_scenario.plot_world("x")
-
-# %%
-geospatial_standoff_scenario.plot_world("T98_inf")
-
-# %% [markdown]
-# # 8. Save data outputs.
-#
-# This cell contains a number of pre-scripted commands for exporting and saving data. The code to save plots is located immediately after each plot's creation. First check that the output folder exists, then run the code for the data you would like to save.
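Before writing the outputs, it can be useful to sanity-check the relationship underlying all the results above: Equation 3 is the analytic inverse of Equation 2, so a standoff computed for a target T₉₈ maps back to that same temperature. The sketch below uses illustrative temperatures (placeholders, not values from any weather dataset); only x_0=6.5 cm matches the default used in this tutorial.

```python
import math

# Illustrative 98th-percentile module temperatures (placeholders, not
# derived from any weather file) and the default thermal decay constant.
T98_0 = 88.0    # insulated-back module [°C]
T98_inf = 65.0  # open-rack module [°C]
x_0 = 6.5       # thermal decay constant [cm]


def x_min(T):
    """Equation 2: minimum standoff [cm] for a target T98."""
    return -x_0 * math.log(1 - (T98_0 - T) / (T98_0 - T98_inf))


def t98(x_eff):
    """Equation 3: T98 [°C] for a given effective standoff."""
    return T98_0 - (T98_0 - T98_inf) * (1 - math.exp(-x_eff / x_0))


x = x_min(70.0)  # ≈ 9.9 cm for these illustrative temperatures
print(round(t98(x), 6))  # round trip recovers the 70.0 °C target
```

Note that x_min(T) diverges as T approaches T₉₈,inf: no finite standoff can bring the module below the open-rack temperature.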
-
-# %%
-print("Your results will be stored in %s" % output_folder)
-print("The folder must already exist or the file will not be created")
-
-pvdeg.weather.write(
-    data_df=WEATHER_df,
-    metadata=META,
-    savefile=os.path.join(output_folder, "WeatherFile.csv"),
-)  # Writes the meteorological data to a *.csv file.
-
-pd.DataFrame(standoff_series_df).to_csv(
-    os.path.join(output_folder, "Standoff_Scan.csv")
-)  # Writes a file with the Tilt and Azimuth scan calculations of standoff.
-
-pd.DataFrame(T98_series_df).to_csv(
-    os.path.join(output_folder, "T98_Scan.csv")
-)  # Writes a file with the Tilt and Azimuth scan calculations of T98.
diff --git a/tutorials/04_scenario/scripts/09_geospatial_world_map.py b/tutorials/04_scenario/scripts/09_geospatial_world_map.py
deleted file mode 100644
index c9a93287..00000000
--- a/tutorials/04_scenario/scripts/09_geospatial_world_map.py
+++ /dev/null
@@ -1,670 +0,0 @@
-# %% [markdown]
-# # Geospatial World Map (HPC)
-# M. Springer 2024-06-05
-
-# %%
-print("Importing libraries...")
-import matplotlib.pyplot as plt
-import numpy as np
-import pandas as pd
-import pvdeg
-import xarray as xr
-import os
-
-print("Done!")
-
-# %% [markdown]
-# # Calculate Standoff
-
-# %%
-work_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/standoff_fine"
-data_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/data"
-
-# %%
-local = {
-    "manager": "local",
-    "n_workers": 100,
-}
-
-kestrel = {
-    "manager": "slurm",
-    "n_jobs": 8,  # Max number of nodes used for parallel processing
-    "cores": 100,
-    "processes": 50,
-    "memory": "245GB",
-    "account": "pvfem",
-    "queue": "standard",
-    "walltime": "8:00:00",
-    # "scheduler_options": {"host": socket.gethostname()},
-}
-
-print("Starting Dask client...")
-client = pvdeg.geospatial.start_dask(hpc=kestrel)
-print("Cluster ready!")
-
-# %%
-# Get weather data
-weather_db = "NSRDB"
-
-# %%
-weather_arg = {
-    "satellite": "Himawari",
-    "names": "tmy-2020",
-    "NREL_HPC": True,
-    "attributes": [
-
"air_temperature", - "wind_speed", - "dhi", - "ghi", - "dni", - "relative_humidity", - ], -} - -weather_ds_himawari, meta_df_himawari = pvdeg.weather.get( - weather_db, geospatial=True, **weather_arg -) - -# %% -meta_df_himawari_sub, gids_meta_df_himawari = pvdeg.utilities.gid_downsampling( - meta_df_himawari, 3 -) -weather_ds_himawari_sub = weather_ds_himawari.sel(gid=meta_df_himawari_sub.index) - -# %% -geo_himawari = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_ds_himawari_sub, - "meta_df": meta_df_himawari_sub, -} - -standoff_res_himawari = pvdeg.geospatial.analysis(**geo_himawari) -standoff_res_himawari.to_netcdf(os.path.join(work_dir, "standoff_himawari.nc")) -standoff_res_himawari.to_dataframe().to_csv( - os.path.join(work_dir, "standoff_himawari.csv") -) - -# %% -weather_arg = { - "satellite": "GOES", - "names": 2021, - "NREL_HPC": True, - "attributes": [ - "air_temperature", - "wind_speed", - "dhi", - "ghi", - "dni", - "relative_humidity", - ], -} - -weather_ds_goes, meta_df_goes = pvdeg.weather.get( - weather_db, geospatial=True, **weather_arg -) - -# %% -meta_df_goes_sub, gids_meta_df_goes = pvdeg.utilities.gid_downsampling(meta_df_goes, 8) -weather_ds_goes_sub = weather_ds_goes.sel(gid=meta_df_goes_sub.index) - -# %% -geo_goes = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_ds_goes_sub, - "meta_df": meta_df_goes_sub, -} - -standoff_res_goes = pvdeg.geospatial.analysis(**geo_goes) -standoff_res_goes.to_netcdf(os.path.join(work_dir, "standoff_goes.nc")) -standoff_res_goes.to_dataframe().to_csv(os.path.join(work_dir, "standoff_goes.csv")) - -# %% -weather_arg = { - "satellite": "METEOSAT", - "names": 2019, - "NREL_HPC": True, - "attributes": [ - "air_temperature", - "wind_speed", - "dhi", - "ghi", - "dni", - "relative_humidity", - ], -} - -weather_ds_meteosat, meta_df_meteosat = pvdeg.weather.get( - weather_db, geospatial=True, **weather_arg -) - -# %% -meta_df_meteosat_sub, gids_meta_df_meteosat = 
pvdeg.utilities.gid_downsampling( - meta_df_meteosat, 4 -) -weather_ds_meteosat_sub = weather_ds_meteosat.sel(gid=meta_df_meteosat_sub.index) - -# %% -geo_meteosat = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_ds_meteosat_sub, - "meta_df": meta_df_meteosat_sub, -} - -standoff_res_meteosat = pvdeg.geospatial.analysis(**geo_meteosat) -standoff_res_meteosat.to_netcdf(os.path.join(work_dir, "standoff_meteosat.nc")) -standoff_res_meteosat.to_dataframe().to_csv( - os.path.join(work_dir, "standoff_meteosat.csv") -) - -# %% -# Auxillary data -import h5py - -fp_weather_aux = "/projects/pvsoiling/pvdeg/data/world_map_aux.h5" -fp_meta_aux = "/projects/pvsoiling/pvdeg/data/meta_world_map_aux.csv" - -# weather_aux = pvdeg.weather.read(fp_weather_aux, 'h5') -meta_aux = pd.read_csv(fp_meta_aux, index_col=0) - -# xarray work around for aux data -dss = [] -drop_variables = ["meta", "time_index", "tmy_year", "tmy_year_short", "coordinates"] - -hf = h5py.File(fp_weather_aux, "r") -attr = list(hf) - -attr_to_read = [elem for elem in attr if elem not in drop_variables] - -chunks = [] -shapes = [] -for var in attr_to_read: - chunks.append(hf[var].chunks if hf[var].chunks is not None else (np.nan, np.nan)) - shapes.append(hf[var].shape if hf[var].shape is not None else (np.nan, np.nan)) -chunks = min(set(chunks)) -shapes = min(set(shapes)) - - -time_index = pd.to_datetime(hf["time_index"][...].astype(str)).values -meta_df = meta_aux -coords = {"gid": meta_df.index.values, "time": time_index} -coords_len = {"time": time_index.shape[0], "gid": meta_df.shape[0]} - -ds = xr.open_dataset( - fp_weather_aux, - engine="h5netcdf", - phony_dims="sort", - chunks={"phony_dim_0": -1, "phony_dim_1": -1}, - drop_variables=drop_variables, - mask_and_scale=False, - decode_cf=True, -) - -for var in ds.data_vars: - if hasattr(getattr(ds, var), "psm_scale_factor"): - scale_factor = 1 / ds[var].psm_scale_factor - print(scale_factor) - getattr(ds, var).attrs["scale_factor"] = scale_factor - 
-rename = {} -for ( - phony, - length, -) in ds.sizes.items(): - if length == coords_len["time"]: - rename[phony] = "time" - elif length == coords_len["gid"]: - rename[phony] = "gid" -ds = ds.rename(rename) -ds = ds.assign_coords(coords) - -# TODO: In case re-chunking becomes necessary -# ax0 = list(ds.sizes.keys())[list(ds.sizes.values()).index(shapes[0])] -# ax1 = list(ds.sizes.keys())[list(ds.sizes.values()).index(shapes[1])] -# ds = ds.chunk(chunks={ax0:chunks[0], ax1:chunks[1]}) -dss.append(ds) - -ds = xr.merge(dss) -ds = xr.decode_cf(ds) - -# Rechunk time axis -ds = ds.unify_chunks() -ds = ds.chunk(chunks={"time": -1, "gid": ds.chunks["gid"]}) - -weather_ds = ds - -DSET_MAP = {"air_temperature": "temp_air", "Relative Humidity": "relative_humidity"} -META_MAP = {"elevation": "altitude", "Local Time Zone": "tz", "timezone": "tz"} - -for dset in weather_ds.data_vars: - if dset in DSET_MAP.keys(): - weather_ds = weather_ds.rename({dset: DSET_MAP[dset]}) - -for mset in meta_df.columns: - if mset in META_MAP.keys(): - meta_df.rename(columns={mset: META_MAP[mset]}, inplace=True) - -weather_ds_aux = weather_ds - -geo_aux = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_ds_aux, - "meta_df": meta_aux, -} - -standoff_res_aux = pvdeg.geospatial.analysis(**geo_aux) -standoff_res_aux.to_netcdf(os.path.join(work_dir, "standoff_aux.nc")) -standoff_res_aux.to_dataframe().to_csv(os.path.join(work_dir, "standoff_aux.csv")) - -# %% -weather_db = "NSRDB" -weather_arg = { - "satellite": "METEOSAT", - "names": 2019, - "NREL_HPC": True, - "attributes": [ - "air_temperature", - "wind_speed", - "dhi", - "ghi", - "dni", - "relative_humidity", - ], -} -weather_ds_meteosat, meta_df_meteosat = pvdeg.weather.get( - weather_db, geospatial=True, **weather_arg -) - -time_hourly = pd.date_range("2019-01-01", freq="h", periods=365 * 24) -weather_ds_meteosat_hourly = weather_ds_meteosat.sel(time=time_hourly) - -europe = [ - "Spain", - "Portugal", #'Ireland', 'United Kingdom', - 
"France", - "Belgium", - "Netherlands", - "Norway", - "Luxembourg", - "Germany", - "Switzerland", - "Italy", - "Monaco", - "Denmark", - "Liechtenstein", - "Austria", - "Sweden", - "Czech Republic", - "San Marino", - "Slovenia", - "Croatia", - "Poland", - "Malta", - "Bosnia and Herzegovina", - "Hungary", - "Slovakia", - "Montenegro", - "Serbia", - "Albania", - "Greece", - "Romania", - "Macedonia", - "Latvia", - "Lithuania", - "Finland", - "Estonia", - "Ukraine", - "Bulgaria", - "Belarus", - "Moldova", - "Turkey", - "Cyprus", - "Northern Cyprus", - "Georgia", -] - -meta_df_europe = meta_df_meteosat[meta_df_meteosat["country"].isin(europe)] - -meta_df_europe_sub, gids_sub = pvdeg.utilities.gid_downsampling(meta_df_europe, 4) -weather_europe_sub = weather_ds_meteosat_hourly.sel(gid=meta_df_europe_sub.index) - -meta_uk = pd.read_csv("../../world_map/data/meta_pvgis_uk_4300.csv") -meta_uk_sub, gids_sub = pvdeg.utilities.gid_downsampling(meta_uk, 1) - -meta_scan1 = pd.read_csv(f"{data_dir}/meta_pvgis_scan_coarse1500.csv", index_col=0) -meta_scan12 = pd.read_csv( - f"{data_dir}/meta_pvgis_scan_coarse_1500_1599.csv", index_col=0 -) -meta_scan2 = pd.read_csv(f"{data_dir}/meta_pvgis_scan_coarse2100.csv", index_col=0) -meta_scan = pd.concat([meta_scan1, meta_scan12, meta_scan2]) - -meta_pvgis = pd.concat([meta_scan, meta_uk_sub]) - -lat_NSRDB = meta_df_europe_sub["latitude"].to_numpy() -meta_pvgis["latitude_pvgis"] = meta_pvgis["latitude"] -meta_pvgis["latitude"] = meta_pvgis["latitude"].apply( - lambda x: ( - lat_NSRDB[np.argmin(np.abs(x - lat_NSRDB))] - if x < meta_df_europe_sub["latitude"].max() - else x - ) -) - -lon_NSRDB = meta_df_europe_sub["longitude"].to_numpy() -meta_pvgis["longitude_pvgis"] = meta_pvgis["longitude"] -meta_pvgis["longitude"] = meta_pvgis["longitude"].apply( - lambda x: ( - lon_NSRDB[np.argmin(np.abs(x - lon_NSRDB))] - if x > meta_df_europe_sub["longitude"].min() - else x - ) -) -meta_pvgis["tz"] = 0 -meta_pvgis = meta_pvgis.reset_index() - 
-meta_eu_coarse = pd.concat([meta_df_europe_sub, meta_pvgis]) -meta_eu_coarse["tz"] = 0 -meta_eu_coarse = meta_eu_coarse.reset_index() -meta_eu_coarse.to_csv("../../world_map/data/meta_eu_coarse.csv") - -weather_uk = xr.open_dataset("../../world_map/data/weather_ds_uk_4300.nc") -weather_uk = weather_uk.sel(gid=meta_uk.index) -weather_uk_sub = weather_uk.sel(gid=meta_uk_sub.index) - -weather_scan1 = xr.open_dataset("../../world_map/data/weather_ds_scan_coarse1500.nc") -weather_scan12 = xr.open_dataset( - "../../world_map/data/weather_ds_scan_coarse_1500_1599.nc" -) -weather_scan2 = xr.open_dataset("../../world_map/data/weather_ds_scan_coarse2100.nc") -weather_scan = xr.concat( - [ - weather_scan1.sel(gid=slice(0, 1500)), - weather_scan12.sel(gid=slice(1501, 1599)), - weather_scan2.sel(gid=slice(1600, None)), - ], - dim="gid", -) -weather_scan = weather_scan.sel(gid=meta_scan.index) -# weather_scan = xr.concat([weather_scan1, weather_scan2], dim='gid') - -weather_pvgis = xr.concat([weather_scan, weather_uk_sub], dim="gid") -weather_pvgis = weather_pvgis.drop_vars(["IR(h)", "wind_direction", "pressure"]) - -weather_pvgis = weather_pvgis.assign_coords({"gid": meta_pvgis.index}) -weather_pvgis = weather_pvgis.chunk(chunks={"time": -1, "gid": 100}) -weather_pvgis = weather_pvgis.unify_chunks() - -meta_df_pvgis_sub, gids_meta_df_pvgis = pvdeg.utilities.gid_downsampling(meta_pvgis, 4) -weather_ds_pvgis_sub = weather_pvgis.sel(gid=meta_df_pvgis_sub.index) - -# weather_europe_sub = weather_europe_sub.assign_coords({'time': pd.date_range("2022-01-01", freq="h", periods=365 * 24),}) -# weather_eu_coarse = xr.concat([weather_europe_sub, weather_pvgis], dim='gid') -# weather_eu_coarse = weather_eu_coarse.assign_coords({'gid': meta_eu_coarse.index}) -# weather_eu_coarse = weather_eu_coarse.chunk(chunks={"time": -1, "gid": 100}) -# weather_eu_coarse = weather_eu_coarse.unify_chunks() - -# with open('weather_eu_coarse.pickle', 'wb') as handle: -# pickle.dump(weather_eu_coarse, 
handle, protocol=pickle.HIGHEST_PROTOCOL) - - -geo_pvgis = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_pvgis, - "meta_df": meta_pvgis, -} - -standoff_res_pvgis = pvdeg.geospatial.analysis(**geo_pvgis) -standoff_res_pvgis.to_netcdf(os.path.join(work_dir, "standoff_pvgis.nc")) -standoff_res_pvgis.to_dataframe().to_csv(os.path.join(work_dir, "standoff_pvgis.csv")) - -# %% -meta_north = pd.read_csv(f"{data_dir}/meta_pvgis_north_3300.csv", index_col=0) -weather_north = xr.open_dataset(f"{data_dir}/weather_ds_north_3300.nc") -weather_north = weather_north.sel(gid=meta_north.index) - -weather_north = weather_north.assign_coords({"gid": meta_north.index}) -weather_north = weather_north.chunk(chunks={"time": -1, "gid": 100}) -weather_north = weather_north.unify_chunks() - - -geo_pvgis = { - "func": pvdeg.standards.standoff, - "weather_ds": weather_north, - "meta_df": meta_north, -} - -standoff_res_pvgis = pvdeg.geospatial.analysis(**geo_pvgis) -standoff_res_pvgis.to_netcdf(os.path.join(work_dir, "standoff_pvgis_north.nc")) -standoff_res_pvgis.to_dataframe().to_csv( - os.path.join(work_dir, "standoff_pvgis_north.csv") -) - -# %% [markdown] -# # Post process - -# %% -import pvdeg -import os -import pandas as pd -import numpy as np -import xarray as xr -from global_land_mask import globe -import cartopy.crs as ccrs -import cartopy.io.shapereader as shpreader - -work_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/standoff_fine" -data_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/data" - -# %% -# Create 0cm standoff locations - -lon_north = np.arange(-179, 180, 0.25) -lat_north = np.arange(60, 90, 0.25) -lon_grid_north, lat_grid_north = np.meshgrid(lon_north, lat_north) -land_north = globe.is_land(lat_grid_north, lon_grid_north) -lon_land_north = lon_grid_north[land_north] -lat_land_north = lat_grid_north[land_north] - -lon_south = np.arange(-179, 180, 0.25) -lat_south = np.arange(-90, -60, 0.25) -lon_grid_south, lat_grid_south = 
np.meshgrid(lon_south, lat_south) -land_south = globe.is_land(lat_grid_south, lon_grid_south) -lon_land_south = lon_grid_south[land_south] -lat_land_south = lat_grid_south[land_south] - - -lon_asia = np.arange(80, 105, 0.25) -lat_asia = np.arange(50, 61, 0.25) - -lon = lon_asia -lat = np.full(lon_asia.size, 61) - -for i, lat_coord in enumerate(reversed(lat_asia)): - lon_sub = lon_asia[i:-i] - lat_sub = np.full(lon_sub.size, lat_coord) - lon = np.append(lon, lon_sub) - lat = np.append(lat, lat_sub) - -lon_asia = lon -lat_asia = lat - -# %% -fig, ax = plt.subplots() - -plt.scatter(lon_land_north, lat_land_north, c="r", s=1) -plt.scatter(lon_land_south, lat_land_south, c="b", s=1) -plt.scatter(lon, lat, c="g", s=1) - -ax.set_ylim(-90, 90) -ax.set_xlim(-180, 180) - -# %% -template_params = pvdeg.geospatial.template_parameters(pvdeg.standards.standoff) -standoff_zero_north = pvdeg.geospatial.zero_template( - lat_land_north, lon_land_north, **template_params -) -standoff_zero_south = pvdeg.geospatial.zero_template( - lat_land_south, lon_land_south, **template_params -) -standoff_zero_asia = pvdeg.geospatial.zero_template( - lat_asia, lon_asia, **template_params -) - -# %% -standoff_aux = xr.open_dataset(os.path.join(work_dir, "standoff_aux.nc")) -standoff_himawari = xr.open_dataset(os.path.join(work_dir, "standoff_himawari.nc")) -standoff_meteosat = xr.open_dataset(os.path.join(work_dir, "standoff_meteosat.nc")) -standoff_north = xr.open_dataset(os.path.join(work_dir, "standoff_pvgis_north.nc")) -standoff_pvgis = xr.open_dataset(os.path.join(work_dir, "standoff_pvgis.nc")) -standoff_goes = xr.open_dataset(os.path.join(work_dir, "standoff_goes.nc")) - -# %% -fig = plt.figure(figsize=(10, 5)) -ax = fig.add_axes([0, 0, 1, 1], projection=ccrs.PlateCarree(), frameon=True) -ax.patch.set_visible(True) -ax.set_extent([-180, 180, -85, 85], ccrs.PlateCarree()) - -shapename = "admin_0_countries" -states_shp = shpreader.natural_earth( - resolution="110m", category="cultural", 
name=shapename -) - -cmap = "Spectral_r" -size = 0.75 - - -standoff_zero_north.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_zero_south.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_zero_asia.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) - -cm = standoff_himawari.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_north.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_aux.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_pvgis.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=size, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_meteosat.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=1, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) -standoff_goes.plot.scatter( - x="longitude", - y="latitude", - hue="x", - cmap=cmap, - s=1, - linewidths=0, - vmin=0, - vmax=14, - add_colorbar=False, - ax=ax, -) - - -ax.add_geometries( - shpreader.Reader(states_shp).geometries(), - ccrs.PlateCarree(), - facecolor="none", - edgecolor="gray", - linewidth=0.5, -) - -cb_title = "Standoff distance [cm]" -cb = plt.colorbar(cm, shrink=0.78, aspect=30, pad=0.02) -cb.set_label(cb_title) -# ax.set_title('title') - -ax.set_xticks(np.arange(-180, 181, 20), crs=ccrs.PlateCarree()) -ax.set_yticks(np.arange(-90, 91, 10), 
crs=ccrs.PlateCarree()) - -ax.set_xlim(-180, 180) -ax.set_ylim(-85, 85) - -ax.set_xlabel("Longitude") -ax.set_ylabel("Latitude") - -plt.savefig(os.path.join(work_dir, "standoff_map.png"), dpi=1200, bbox_inches="tight") - -# %% diff --git a/tutorials/04_scenario/scripts/10_letid_outdoor_geospatial_demo.py b/tutorials/04_scenario/scripts/10_letid_outdoor_geospatial_demo.py deleted file mode 100644 index ca9f1742..00000000 --- a/tutorials/04_scenario/scripts/10_letid_outdoor_geospatial_demo.py +++ /dev/null @@ -1,226 +0,0 @@ -# %% [markdown] -# # LETID Outdoor Geospatial Demo (HPC) -# -# ![PVDeg Logo](../images/pvdeg_logo.svg) - -# %% -import matplotlib.pyplot as plt -import pandas as pd -import pvdeg -from pvdeg import DATA_DIR -import os - -# %% -# This information helps with debugging and getting support :) -import sys -import platform - -print("Working on a ", platform.system(), platform.release()) -print("Python version ", sys.version) -print("Pandas version ", pd.__version__) -print("pvdeg version ", pvdeg.__version__) - -# %% [markdown] -# # Single location example - -# %% -weather_file = os.path.join(DATA_DIR, "psm3_demo.csv") -WEATHER, META = pvdeg.weather.read(weather_file, "psm") - -# %% -kwargs = { - "tau_0": 115, # us, carrier lifetime in non-degraded states, e.g. LETID/LID states A or C - "tau_deg": 55, # us, carrier lifetime in fully-degraded state, e.g. 
LETID/LID state B - "wafer_thickness": 180, # um - "s_rear": 46, # cm/s - "cell_area": 243, # cm^2 - "na_0": 100, - "nb_0": 0, - "nc_0": 0, - "mechanism_params": "repins", -} - -# %% -pvdeg.letid.calc_letid_outdoors(weather_df=WEATHER, meta=META, **kwargs) - -# %% [markdown] -# # Start distributed compute cluster - DASK - -# %% -local = { - "manager": "local", - "n_workers": 1, - "threads_per_worker": 8, # Number of CPUs -} - -kestrel = { - "manager": "slurm", - "n_jobs": 1, # Number of nodes used for parallel processing - "cores": 104, - "memory": "256GB", - "account": "pvsoiling", - "queue": "debug", - "walltime": "1:00:00", - "processes": 104, - "job_extra_directives": ["-o ./logs/slurm-%j.out"], -} - -pvdeg.geospatial.start_dask(hpc=kestrel) - -# %% -# Get weather data -weather_db = "NSRDB" - -weather_arg = { - "satellite": "Americas", - "names": 2022, - "NREL_HPC": True, - "attributes": [ - "air_temperature", - "wind_speed", - "dhi", - "ghi", - "dni", - "relative_humidity", - ], -} - -weather_ds, meta_df = pvdeg.weather.get(weather_db, geospatial=True, **weather_arg) - -# Define geographical region -meta_SW = meta_df[meta_df["state"].isin(["Colorado", "New Mexico", "Utah", "Arizona"])] -meta_SW_sub, gids_SW_sub = pvdeg.utilities.gid_downsampling(meta_SW, 6) - -weather_SW_sub = weather_ds.sel(gid=meta_SW_sub.index) - -# %% -weather_SW_sub - -# %% -meta_df - -# %% -# Define desired analysis -geo = { - "func": pvdeg.letid.calc_letid_outdoors, - "weather_ds": weather_SW_sub, - "meta_df": meta_SW_sub, - "tau_0": 115, # us, carrier lifetime in non-degraded states, e.g. LETID/LID states A or C - "tau_deg": 55, # us, carrier lifetime in fully-degraded state, e.g. 
LETID/LID state B - "wafer_thickness": 180, # um - "s_rear": 46, # cm/s - "cell_area": 243, # cm^2 - "na_0": 100, - "nb_0": 0, - "nc_0": 0, - "mechanism_params": "repins", -} - -letid_res = pvdeg.geospatial.analysis(**geo) - -# %% -letid_res - -# %% -import datetime - -ims = [] -for n in range(1, 13): - for i, np_t in enumerate(letid_res.time): - t = pd.Timestamp(np_t.values).time() - d = pd.Timestamp(np_t.values).day - m = pd.Timestamp(np_t.values).month - if m == n: - if d == 15: - if t == datetime.time(12): - fig, ax = pvdeg.geospatial.plot_USA( - letid_res["Pmp_norm"].sel(time=np_t), - cmap="viridis", - vmin=0.95, - vmax=1, - title=f"Normalized Power - 2022-{m}-{d} 12:00", - cb_title="Normalized Power", - ) - # plt.savefig(f'./images/RH_animation_{n}.png', dpi=600) - -# import imageio -# ims = [imageio.imread(f'./images/RH_animation_{n}.png') for n in range(1, 13)] -# imageio.mimwrite(f'./images/RH_animation.gif', ims, format='GIF', duration=1000, loop=10) - -# %% -import datetime - -ims = [] -dates = [] -subarctics = [] -coldsemiarids = [] -hotdeserts = [] - -for n in range(1, 13): - for i, np_t in enumerate(letid_res.time): - t = pd.Timestamp(np_t.values).time() - d = pd.Timestamp(np_t.values).day - m = pd.Timestamp(np_t.values).month - if m == n: - if d == 15: - if t == datetime.time(12): - dates.append(np_t.values) - - # subartic: near Crested Butte CO - # cold semi-arid: near Springfield CO - # hot desert: near Yuma AZ - - subarctic = letid_res.sel( - time=np_t, latitude=39.01, longitude=-107.1 - ) - subarctics.append(subarctic["Pmp_norm"]) - - coldsemiarid = letid_res.sel( - time=np_t, latitude=37.57, longitude=-102.3 - ) - coldsemiarids.append(coldsemiarid["Pmp_norm"]) - - hotdesert = letid_res.sel( - time=np_t, latitude=32.77, longitude=-114.3 - ) - hotdeserts.append(hotdesert["Pmp_norm"]) - - fig, ax = plt.subplots() - ax.plot( - dates, - subarctics, - marker="o", - c="C0", - label="Central CO - Subarctic Dfc", - ) - ax.plot( - dates, - coldsemiarids, 
- marker="o", - c="C1", - label="Southeast CO - Cold Semi-Arid BSk", - ) - ax.plot( - dates, - hotdeserts, - marker="o", - c="C2", - label="Southwest AZ - Hot Desert BWh", - ) - - ax.legend(loc="upper right") - - ax.set_xlim([datetime.date(2022, 1, 1), datetime.date(2023, 1, 1)]) - - ax.set_ylim([0.945, 1.005]) - ax.set_ylabel("Normalized Power") - - plt.savefig(f"./images/LETID_plot_animation_{n}.png", dpi=600) - -# %% -import imageio - -ims = [imageio.imread(f"./images/LETID_plot_animation_{n}.png") for n in range(1, 13)] -imageio.mimwrite( - "./images/LETID_plot_animation.gif", ims, format="GIF", duration=1000, loop=10 -) diff --git a/tutorials/05_advanced/04_nsrdb_distributed_api.ipynb b/tutorials/05_advanced/04_nsrdb_distributed_api.ipynb deleted file mode 100644 index febbd102..00000000 --- a/tutorials/05_advanced/04_nsrdb_distributed_api.ipynb +++ /dev/null @@ -1,1974 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# NSRDB Distributed (API Key Required)\n" - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:10.279080Z", - "iopub.status.busy": "2026-01-26T23:43:10.279080Z", - "iopub.status.idle": "2026-01-26T23:43:13.786694Z", - "shell.execute_reply": "2026-01-26T23:43:13.785679Z" - } - }, - "outputs": [], - "source": [ - "from dask.distributed import LocalCluster, Client\n", - "from dotenv import load_dotenv\n", - "import pvdeg\n", - "import os" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Setting Up\n", - "\n", - "Before making our parallelized API calls, you need to provide your API key and email. This cell will not work unless you replace `api_key` and `email` with your personal NSRDB credentials. [REQUEST A KEY](https://developer.nrel.gov/signup/).\n", - "\n", - "We also need to initialize a Dask client. `pvdeg.weather.weather_distributed` will not work without it. 
It will fail silently and not populate any of the results in the resulting `weather_ds` (called `geo_weather` in the example below). It is hard to recognize when this has occurred, so be careful: make sure to initialize a Dask client first. Visiting the link printed below takes you to a dashboard that shows what Dask is doing." - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:13.790134Z", - "iopub.status.busy": "2026-01-26T23:43:13.790134Z", - "iopub.status.idle": "2026-01-26T23:43:16.744202Z", - "shell.execute_reply": "2026-01-26T23:43:16.744202Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Daskboard link\n", - "http://127.0.0.1:8787/status\n" - ] - } - ], - "source": [ - "load_dotenv()\n", - "\n", - "### REPLACE WITH YOUR API KEY AND EMAIL ###\n", - "api_key = \"DEMO_KEY\"\n", - "email = \"user@mail.com\"\n", - "###########################################\n", - "\n", - "workers = 4\n", - "\n", - "cluster = LocalCluster(\n", - " n_workers=workers,\n", - " processes=True,\n", - ")\n", - "\n", - "client = Client(cluster)\n", - "\n", - "print(\"Daskboard link\")\n", - "print(client.dashboard_link)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "**Note on .env file:** Create a file named `.env` in your project root directory with the following content:\n", - "```\n", - "api_key=YOUR_NREL_API_KEY\n", - "email=YOUR_EMAIL_ADDRESS\n", - "```\n", - "Replace `YOUR_NREL_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NREL developer credentials." - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Requesting Weather\n", - "\n", - "We will create a list of (latitude, longitude) tuples and call the function on all of them at once. 
`failed` will be a list of failed gids: unique location IDs that correspond to points in space on the NSRDB. These are different from PVGIS gids, which are arbitrary indices that do NOT correspond to a spatial location on Earth.\n", - "\n", - "We will request \"PSM4\" data, a typical meteorological year (TMY) from the NSRDB's Physical Solar Model, supplying the API key and email from above. The only difference from other weather sources is that NSRDB/PSM4 data requires an API key." - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:16.747764Z", - "iopub.status.busy": "2026-01-26T23:43:16.747764Z", - "iopub.status.idle": "2026-01-26T23:43:23.651151Z", - "shell.execute_reply": "2026-01-26T23:43:23.650635Z" - } - }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Connected to a Dask scheduler | Dashboard: http://127.0.0.1:8787/status\n" - ] - } - ], - "source": [ - "coords = [\n", - " (25.783388, -80.189029),\n", - " (24.783388, -80.189029),\n", - "]\n", - "\n", - "geo_weather, geo_meta, failed = pvdeg.weather.weather_distributed(\n", - " database=\"PSM4\", coords=coords, api_key=api_key, email=email\n", - ")" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Viewing Results\n", - "\n", - "Same as in the other tutorial, our results are stored in an xarray dataset with a dask backend, so you will have to call `.compute()` on the dataset to inspect the individual values of the dask arrays.\n", - "\n", - "Click on the `Data variables` dropdown to expand the dataset viewer."
- ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:23.651151Z", - "iopub.status.busy": "2026-01-26T23:43:23.651151Z", - "iopub.status.idle": "2026-01-26T23:43:23.682479Z", - "shell.execute_reply": "2026-01-26T23:43:23.682479Z" - } - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "
<xarray.Dataset> Size: 2MB\n",
-       "Dimensions:            (gid: 2, time: 8760)\n",
-       "Coordinates:\n",
-       "  * time               (time) datetime64[ns] 70kB 2022-01-01 ... 2022-12-31T2...\n",
-       "  * gid                (gid) int64 16B 0 1\n",
-       "Data variables: (12/15)\n",
-       "    Year               (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    Month              (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    Day                (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    Hour               (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    Minute             (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    temp_air           (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    ...                 ...\n",
-       "    ghi                (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    albedo             (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    pressure           (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    wind_direction     (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    wind_speed         (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>\n",
-       "    relative_humidity  (gid, time) float64 140kB dask.array<chunksize=(2, 8760), meta=np.ndarray>
" - ], - "text/plain": [ - " Size: 2MB\n", - "Dimensions: (gid: 2, time: 8760)\n", - "Coordinates:\n", - " * time (time) datetime64[ns] 70kB 2022-01-01 ... 2022-12-31T2...\n", - " * gid (gid) int64 16B 0 1\n", - "Data variables: (12/15)\n", - " Year (gid, time) float64 140kB dask.array\n", - " Month (gid, time) float64 140kB dask.array\n", - " Day (gid, time) float64 140kB dask.array\n", - " Hour (gid, time) float64 140kB dask.array\n", - " Minute (gid, time) float64 140kB dask.array\n", - " temp_air (gid, time) float64 140kB dask.array\n", - " ... ...\n", - " ghi (gid, time) float64 140kB dask.array\n", - " albedo (gid, time) float64 140kB dask.array\n", - " pressure (gid, time) float64 140kB dask.array\n", - " wind_direction (gid, time) float64 140kB dask.array\n", - " wind_speed (gid, time) float64 140kB dask.array\n", - " relative_humidity (gid, time) float64 140kB dask.array" - ] - }, - "execution_count": 4, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "geo_weather" - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:23.684521Z", - "iopub.status.busy": "2026-01-26T23:43:23.683505Z", - "iopub.status.idle": "2026-01-26T23:43:25.061620Z", - "shell.execute_reply": "2026-01-26T23:43:25.059091Z" - } - }, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "
<xarray.Dataset> Size: 2MB\n",
-       "Dimensions:            (gid: 2, time: 8760)\n",
-       "Coordinates:\n",
-       "  * time               (time) datetime64[ns] 70kB 2022-01-01 ... 2022-12-31T2...\n",
-       "  * gid                (gid) int64 16B 0 1\n",
-       "Data variables: (12/15)\n",
-       "    Year               (gid, time) float64 140kB 2.019e+03 ... 2.02e+03\n",
-       "    Month              (gid, time) float64 140kB 1.0 1.0 1.0 ... 12.0 12.0 12.0\n",
-       "    Day                (gid, time) float64 140kB 1.0 1.0 1.0 ... 31.0 31.0 31.0\n",
-       "    Hour               (gid, time) float64 140kB 0.0 1.0 2.0 ... 21.0 22.0 23.0\n",
-       "    Minute             (gid, time) float64 140kB 30.0 30.0 30.0 ... 30.0 30.0\n",
-       "    temp_air           (gid, time) float64 140kB 24.1 24.1 24.0 ... 24.7 24.7\n",
-       "    ...                 ...\n",
-       "    ghi                (gid, time) float64 140kB 0.0 0.0 0.0 0.0 ... 0.0 0.0 0.0\n",
-       "    albedo             (gid, time) float64 140kB 0.08 0.08 0.08 ... 0.0 0.0 0.0\n",
-       "    pressure           (gid, time) float64 140kB 1.019e+03 ... 1.02e+03\n",
-       "    wind_direction     (gid, time) float64 140kB 128.0 126.0 ... 121.0 123.0\n",
-       "    wind_speed         (gid, time) float64 140kB 3.6 3.5 3.4 3.4 ... 9.0 9.0 8.8\n",
-       "    relative_humidity  (gid, time) float64 140kB 87.0 86.47 85.93 ... 84.96 86.0
" - ], - "text/plain": [ - " Size: 2MB\n", - "Dimensions: (gid: 2, time: 8760)\n", - "Coordinates:\n", - " * time (time) datetime64[ns] 70kB 2022-01-01 ... 2022-12-31T2...\n", - " * gid (gid) int64 16B 0 1\n", - "Data variables: (12/15)\n", - " Year (gid, time) float64 140kB 2.019e+03 ... 2.02e+03\n", - " Month (gid, time) float64 140kB 1.0 1.0 1.0 ... 12.0 12.0 12.0\n", - " Day (gid, time) float64 140kB 1.0 1.0 1.0 ... 31.0 31.0 31.0\n", - " Hour (gid, time) float64 140kB 0.0 1.0 2.0 ... 21.0 22.0 23.0\n", - " Minute (gid, time) float64 140kB 30.0 30.0 30.0 ... 30.0 30.0\n", - " temp_air (gid, time) float64 140kB 24.1 24.1 24.0 ... 24.7 24.7\n", - " ... ...\n", - " ghi (gid, time) float64 140kB 0.0 0.0 0.0 0.0 ... 0.0 0.0 0.0\n", - " albedo (gid, time) float64 140kB 0.08 0.08 0.08 ... 0.0 0.0 0.0\n", - " pressure (gid, time) float64 140kB 1.019e+03 ... 1.02e+03\n", - " wind_direction (gid, time) float64 140kB 128.0 126.0 ... 121.0 123.0\n", - " wind_speed (gid, time) float64 140kB 3.6 3.5 3.4 3.4 ... 9.0 9.0 8.8\n", - " relative_humidity (gid, time) float64 140kB 87.0 86.47 85.93 ... 84.96 86.0" - ] - }, - "execution_count": 5, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "geo_weather.compute()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Spot Check\n", - "\n", - "We can plot the entire TMY air_temperature to check that our data has loaded correctly.\n", - "\n", - "Explanation of steps\n", - "\n", - "geo_weather is our weather xarray dataset. We can index into the first entry at the 0th index by using isel (index-select). This will grab the data from the first gid. Then we pick the air temperature attribute. This can be replaced with bracket notation so `.temp_air` becomes `[\"temp_air\"].\n", - "\n", - "This selects a single array from the dataset that is labeled as \"temp_air\". 
This array will be a dask array so the values will be stored out of memory, we would have to load it using `.compute()` to directly inspect it but when plotting with matplotlib it will load the array for us." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": { - "execution": { - "iopub.execute_input": "2026-01-26T23:43:25.065596Z", - "iopub.status.busy": "2026-01-26T23:43:25.062132Z", - "iopub.status.idle": "2026-01-26T23:43:25.377604Z", - "shell.execute_reply": "2026-01-26T23:43:25.377604Z" - } - }, - "outputs": [ - { - "data": { - "text/plain": [ - "[]" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAiwAAAGdCAYAAAAxCSikAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjMsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvZiW1igAAAAlwSFlzAAAPYQAAD2EBqD+naQAAe0BJREFUeJztnQd8FGX6x59NSKEl9BJ670WqVFEQRE/AjocCinJ6oIdYORW7KN7fjqCeihwiigUVFaQjSleaIh3pIC2BQEJI9v9532Q2M7PTy+7M7u/rZw07O213Zt75zVMDwWAwSAAAAAAAHiYh2jsAAAAAAKAHBAsAAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAAAAAPE8JigEKCgro4MGDVLZsWQoEAtHeHQAAAAAYgNWuPX36NGVkZFBCQkLsCxYmVmrVqhXt3QAAAACABfbt20c1a9aMfcHCLCvCF05LS4v27gAAAADAAFlZWdzgINzHY16wCG4gJlYgWAAAAAB/YSScA0G3AAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAAAAAPA8ECwAAAAA8DwQLAAAAADwPBAsAABgkjO5F2jykp204+iZaO8KAHEDBAsAAJjknWW76MW5f9DYT9dHe1cAiBsgWAAAwCQLtxzhfzfuz6R44bN1++mu/62j7NwL0d4VEKdAsAAAYp5gMOjo+gIB8t0+2+WBWRto7m+H6ePVe6O9KyBOgWABAMQ0246cpnbPzKdJi3c4ts4AuatYHpy1gS79zxI6dz6fvMbhzJxo7wKIUyBYAAAxS35BkP41cz2dPJtHL83bSn5h1rr9tOf4Wfp+8yHyGt6y+4B4AoIFABCzvLZgG205lBXt3YgpPOapAnEEBAsAwJewGI8vf92vKkh2/nWGXl/knBvITAzLyl3HQ4G5kRYHS7f9Rat2HacjWTn0yZq9lJPnrFspCBsLiBIlorVhAACww4/bj9F9n2ygSmVSaO1jfcI+7/1/S6Pmhhr8zkr+75XjelO19NSIbftk9nka9v5q/u+M9FQ6mJnDa8U8elVzx7YBCwuIFrCwAAB8yaYDhSnFx87khn322oLtujd2lvVy9RvL6av1B0xvW8vAkpdfEPr3j9v/ovs/3UD7T54lK5jVBqfO5YX+zcQKY8GWo5a2DYDXgIUFAOBL1NJ+j2bl0CsLtmku+8y3
v9MXvxQKFRaUO7BtDcd8QgWi/Xrws438757j2fT53V3JbQI+SI8GwCqwsAAAYorcC8UWDiVm/3ogJFasoiZXsnLy6PKXl4VNX/fnSfrvj7tMb8es2FCa22m5AgEEogUECwDAl6jdN/Xup2M+ca+c/lfrD9KBU+cUP3v22y2G1sFEjxgll5cR646AE/pCvE+QKyBaQLAAAGKKSGSxqHmEcmwWess8m0etn/wh9P79n/ZQh2cX0JSlOw0tfyE/aEjEmIEVr+s5cXHoPQwsIFpAsAAAYoqCKN1QL+QX0Hybqcyr95yQvBdStl/4/g9Dyy/6IzzAdv/JczzI2CosPfrUWbGFBYoFRAcIFgBATMHSiqPB1J/30OrdUsFhltwL1i00y7b9xTtIKzG0KNXZCWBhAdECggUA4EvU7ptWXCCbD2TS899t4ZaI/8zbSj/tOKY5v5JH6NtN9sroHzx1jv79xSbLy6/afVw3BdwK8t/zo1V76dRZ6xYbAKyCtGYAQEyhFMehx9/eWM7/vrOsMJPnzcU7aM8LV6nOH1AIYrFr2Hnuuy2UlXPBcw0Zlb7Xo7M306S/t3NlewCoAQsLACCm0LKwTFRxmZhFSRoUGFAsdR/5lm5+ZyWdPV8sTN5YuJ1Pm7v5sOayby7SLobnFkppzBv2nYrKvoD4BoIFAOBL1HSJVgzLW0t2mlh/0FQfHqOxMyt2HadvNhwMvf+/+dv4NL3l//ODdjE8p3sGCZxVyHwyIs4AiKpgmTx5MrVu3ZrS0tL4q0uXLvT999+HPs/JyaFRo0ZRxYoVqUyZMnTdddfRkSNHdAeF8ePHU/Xq1alkyZLUp08f2r49Ok8SAAD/c8Ghm+n9szZQu2fm074TZw2lNZuJnTltw/WjBCtK99/lu8lpWE2ZgZN+CpsOvQI8L1hq1qxJL7zwAq1bt47Wrl1Ll112GQ0cOJB+++03/vl9991H33zzDc2aNYuWLl1KBw8epGuvvVZznRMnTqTXX3+dpkyZQqtWraLSpUtTv379uPgBAACz2K07IsCq4TLrwqdr94U9ZLHKtfLCbmY2a3Uf/zyeLXHLHC/attGidGZ578fdrv7GALgmWK6++mq68sorqVGjRtS4cWN67rnnuCVl5cqVlJmZSe+99x69/PLLXMi0b9+ePvjgA/r555/550qwC//VV1+lxx57jAsfZr2ZNm0aFzqzZ8829UUAAPGFuB6IOM7C6bRm+c2ZVbMVb6LrhEWF2zVxE7d6v7/kpSW83svaPSe45aPrC4XbdotMUTNFMRAswFcxLPn5+TRz5kzKzs7mriFmdcnLy+MuHYGmTZtS7dq1acWKFYrr2L17Nx0+fFiyTHp6OnXu3Fl1GUZubi5lZWVJXgCA+EJ8zxT/22nBkiDz/8zZKE1fPl/UndlMXAeLpTktK8FvFLa9pdv+CvVN+loUD6PFzzu1U7XN9A1S+6r/W/knrdhZmF791+lcmrR4B29GCUBUBMumTZu4VSUlJYXuuusu+vLLL6l58+ZceCQnJ1O5cuUk81etWpV/poQwnc1jdBnGhAkTuLARXrVq1TL7NQAAMYT4id9pwSJPYU4poTxsmrGwMMvFM3N+t7Q/7PuJN3Xvx78aWu7v764yvS2l9G01CwsTRI/P3kw3v1toUf/nR+vopXlb6c5pa01vFwBHBEuTJk1o/fr1PN7k7rvvpmHDhtHvv1u78Kwybtw47oISXvv2SX3MAMQK01f+yW9IzA0AimGBsK8t3B4mFljfm3/8b52j20qQ3bMT5RMsCqWfiywRZtl25DSvExMJVL6q4nfdcfSM5P2aPYVxPhv2Wy9aB4CtwnHMitKwYUP+bxansmbNGnrttdfopptuovPnz9OpU6ckVhaWJVStWjXFdQnT2TwsS0i8TNu2bVX3gVl32AuAWOex2Zv5315NKtO17WpGdNuCO0DtKTuS+yHfh3tnSq0KwgM/K49/zuH0XnlBthIqd3GzYR1WU4PHf1WY5BAJ
5O4wre/qVlo1AI7VYSkoKOAxJUy8JCUl0cKFC0Ofbd26lfbu3ctjXJSoV68eFy3iZVg8CrPeqC0DQDySpRL86KZIuOntlXTDlBWqcQyRYNqKPXTRM/N56Xx5Qz8lF8WJ7MKsmWhYWMzCrEJWftvfDkYuZi8hwbiFJScPVkDgIcHCXDHLli2jPXv28FgW9n7JkiU0ZMgQHksyYsQIGjt2LC1evJgH4d52221ceFx88cWSQFwW98JgT01jxoyhZ599lr7++mu+zqFDh1JGRgYNGjTI+W8LgE/Zcuh0RIUD687LOgev/fMk/VWUOhsNmDWB7csDszZIpst/CjfrgiTIBEqJRGfqbbKbfrQaNRqB7dv2I1I3j1bHZqfq3wDgiEvo6NGjXFAcOnSICxSWhjxv3jy6/PLL+eevvPIKJSQk8IJxzOrC6qm89dZbknUwqwuLOxF46KGHeKbRyJEjuTupe/fuNHfuXEpNTTWzawDENJ+s3Ueta6XTkM51IrK9YAR61JghLyyGJ6hoYXFD08m9ImouIbOeMyYIvHyTZ+5IJliVUNrt6J8lINYxJVhYnRUtmMiYNGkSf6khf0pkVpann36avwAA6rAuwk4Jlr3Hz9KKXcd4XEySQxaDSCIXJswSsOuvM6odnJ2M41ByCX0rS3U2KlgiXc+Exc3ILUZqfLx6r+pn0XQTgvgF3ZoB8Al5FroQq9HzpcWhEvF39Kjv+RuSPOhWvnfXTf6Z/61YOtnxbScYsLCMmvGL6fUyK0WkXULMopPsQAyOl11ZIHbx36MVAMAxVu46oThdfDt6bPamiKZVM7H0/HdbJE/48lusmmXiePZ51y0sam4SSxaWCMepOiU0FF1C8AkBl4GFBQCf4M79QKWSqeiONO+3I7RgyxG6omVx6QE3YYLgnWW7yKvWnfX7Tjmy3sIYlsgqlryCAipJiRHdJgBOAQsLAHGsWMSGig9/3kOX/WcJ79Arr9p61/RfeLM9PX47mEmXvLSY5mzULhe/bNtf1O2FRbR469GwzzLPhqdwizXDy/O38cyhSMEq0lopa68H+43NVMd1AqO1X95eutP2tpQsLj/tOEY9Jy7mfwEwCwQLADEMc+UIHX2VEN++nvj6N9p1LJte/P4POpoVvsyLc//Q3R6ryvvn8bM0eoZ2ufixn27gwuihzzbybsfiomOHMqU1VuSZSq+LKtxGiqe+dr6adzRcQkazkiZ8r3+stWCiM6Dgihry31W098RZ/lcJ1l/p7PkLtrYNYhcIFgBimMHvrKT2zy6gPw4rFxtTCq5lpd9ZJ2A5Rww0sWPN+IzARIrQIK/LhIW8QB1j5uq99LhCJddox0fsO3nWlfV61cJiBbGo7D5xkSTO5db39PsYMdHa6skfqMUT8zwX9A28AQQLADGMECD6+br9ip8v3voXnTorDVT94/BpxXmN3EKMCAtmWZFnP20qqmRr98neLYSvtfMv5UJqVsl3MPPLCKwYoB6/7LUfVMyyz8z2TRLOC6ZVkIUElIBgAcAn2DEyaD2wPv2NMXeHUw+9LHZFDa/3o/l7USdiv1pYmKvupEYmFfv9r32rMEU8mkT6dwH+AIIFAI90Zf5o1Z+urV9r+P/i1wOG1rH7WLYpC8Pq3eFP81/8omzpYTw4a4Nhl1K0MoWOKMT22CEalgTBHadEdq65+JHP1u2nV+Zv4yLolQXbTC3Latd8ska5OB0sLEAJCBYAosyJ7PO8DPqjX252zcLg1AProDd/MrydOz5coxhsq8YsFbeVFzpGu0WkK90WblP9MzOtAlicCevx9NrC7TToLe3zQglWHfjhzzfR+SKRqhSkC4AYCBYAoow4i0fr/uXmTfuRzzcamu+06AmcdVAe+OZySYqqeP+zRHEMTIjZcafEplwhGqQQ3Ow2vx/KpKvfWE7zfz/CU8RZQCzr1cRimQa8udzwesTHmmWGWeXmd1eGRIuA0eyp95fvphunrKAzJi1DwJ+gcBwAUUbsBtF6
4rZjfdHTOjPX7DO9zjunraVDmTk8RXXPC1ep7nNqUiLN/vWAocBLvf2P1pM327wbmStnz0c+ZueZOVu4Ve+u6etCvyeri7Pn+FlTLi+n4kzW/XmSlmw9Sg2rlDG07nPn86lkciIXOU/P+T1UQ2jUpQ0d2R/gXWBhAcBDBHWEjZFaKJGyULCUZD1aPTmPsnLy6JxDrq4rX/uRokWsuCmYWJF/H9bQ8axJK4WT7ix2foh/XrUKwHM3H6Jm4+fSlKU7qfuLiyzH3gB/AsECgA7MVL5mzwnTT9isMuzR0/q1S8SsFaWdMpeLnMlLdtq2sJjdJ7V6LEY8VCxl+ecdx8L68ZhFWHzrEeWUa9cJKDef/Hp0N91FH7uqGUWLkT3DG1sqwdoNGO3iLPDrXmdaFCgJIDWX0H2fFMZAvcCKG4oEc2xISaAHBAsAOvR7dRkvbPbD70cML8PjOyb9RJ2eW2hqW8M/WEO/7j1J+06cpb+9YTyeQA9x/IvZfZLT+fmFqjE1SqKOPTnbDb9hRcmiXUzsvEIDyNY1y1H7OuU1l2PdsKunp1re7qYn+1KlMimWlv33lcbE0qsLtnO3jNmihE4itviouYTUNBWyoOMDCBYAdBD8+ou2GB/QV+6yHq/x4/ZjtFWleJtXOJyZExYoyQiqPDnbdUkxwWMmg8Vp2P6zwFSzvDWkHf/70R2dLW+buWtKmLR+iPlk5MWG5luzx5ku1I4IlnzlCszZUYj5Ad4BggUAg7BAv0hRItHZqBOnY1iMlFoXP/06keF0IcJVYeVYESxXtirscF2/chmaeH1rS9tl7jQmWqzSuX5FSUCrV9GzsPR9ZZnqskE4heICZAkBEEVYwS2WbSPHbsyH22w/Ki0gd/nLS+l49vlQQGeYhcWBr6MWiBkpxsxcrzjd6Fe7vl1NXjhv5S798vjyc8GugPX22VRUjl8kUkwHOEOvxAWwsAAQRV5dsI2nBruBJOYj4L6AURIrhfshbYxnhUCULSyspswqWeXe1jXTNZe5uVMtyXsW1Hpzp9qmt82MK3df0kDxs/SSSRQriBszmhUs0SjAByIPBAsABjET9GnUBXLybJ5iXYqh768mu/t67eTo94QRbiY2PBqFBAKUF2ULi5x3h3ZQnP72re1p/n096dlBrcI+S05UH3LXj79c1cJyU8daNLijVAA9NaAFdW9YydC+etxgx106+TYEy7s/7qZ5vx12Yc+Al4BgAUADlq0j4MYznFJswtJtfzlSI0WcdmrXwmGHwhgWe+sIuFwHZfzfmpteRk18NKpShhpVLat4bJM0BEu5Usk04dpwkcMsM0wAs3WKaVC5jK3YFi8JfrlLaO+JbNPr/8f/1lneN+APIFgAUOFoVg71mLg49N4Nq3OkbjfRfMJmP5ttl1DAPZdQ76ZV6Lr2NU0vp1a3pESC+rCqF4vCXEalVYK75ZlCbDN2socijZbgZJ+JDWh3Tf/Fdr0gEHtAsACgwm8Hs1zPRLCyRlZG3WzNjJmr91rKcnECHl/gwH3Vzv5f2aqaqhBg6dIsFuSO7vXo9m71DK9TybrRskYa1a5YSnUZLZeQQFKJBEMCKVEne0jNZeWVQG35OSLPDJJffwBAsABg8GnYK3F9LL6FFZg7nRMe/yIQVIiV+e+Pu13fN8V9caIOi02X0GuDL6IbOkhjQATa1CrH/z72t+Y0/mrjriElrTDnnh6ay6gJDCZ09Cw08kV5fRYVi80N7WvS5c2resIlKNBfo60C06LioFuG/D0AECwAqCBPLfZa4OJpUTdkAVZw7o4P19C1b4UH3P68s7irciQ5cCqHHvzMWDdoNVgMh1JpfDPHckyfRmE3feYO+mevBpbXaRYlwcKmvXJj29D7JBURwiwqRgK761QsRY+bEF5m6aBT2dcKe0+cpdEzfpFMg14BciBYAFDBjkCJhLZRsjhcP/lnWrDlKB04dS7ss2ilBb++cLvtdfC0ZhtZQkwnsKDWey5rJJl+V68GvJu0
pX2ycJCV4l5uvbiOJKBWzWrSoW6FMKHTr0W1sPlG9WpIaalJrontR/o3JadhzQzlVWyj3YoBeA8IFgAsPEGP/WQ99X1laVS7xCrVnjitsT9+7zZspzS/YI2QH1M7BfqEZc2sQm4lYVxUu9AlJZCk4hJi1WprlCspWdcljSvTl//sGmoBwPfL5UDclBKRqfgs/L7svL3x7RUR2SbwNhAsIKazfOzcpMNcQiK7yRe/HqBtR87wvj/Rwux3Y0GNfo4LMGMhem9YccBp26IYFYb8Xq50b3/1praGuixbETtKy5RJkRYc1xIczTOKY12YrmFC7KLa5SUNGA3E9doiNSkytw1Bt7FsodWyon2xKsqBNijND2IS1nyQdZO9rGkVen94R0vrMDLwqw2QkRg2zVb35IGrPjWzF6Y1G3MJLX2wF9WpWJr+eOaKwrL2IgFgxPow6KIadEXLatxV1KtJZerzsnIPGyuGjBSFm31JmUuqYulk2qGyfPlSxa4e8aEUCyG32zpEysIiWMXMiBBWi+W/IrEKYgtYWEBM8sFPhRkxi/4wl/7LYNk3czYepHPnpTdI4T7gpG/dzrrMPkyy/fdrCXNm3TLqEhIsFkxwJJdIkIgU+b1cbZVCXEuZlCTdG2rJZOPPfayoHEux1mqqyZoktikq+3//5Y0ln90nei/+PcTrUOqi7UTzSZaS/cK1rSJmYck5n0/fbDhIpxSqQauxYMsRV/cJRBdYWEBMolVRVI97Pv6Vlmz9SxIvIEb8xBfNLrFWxIfHqtsbh1lYDO68Wh0TpRgSvd/QSCXZi2qV47VxShno5s2Ew1tD2lPdR75VFSzMOvTV6O6Ky1ctmxr6d77o9xDXmDkrC161QtNqZemPw6dD70dd2oAe7FcYbJt5zriAYOImJ8/aSffst1t48HilMimWlgexBywsICYRF+hiN5MXvv/DsEuBiRWGUqYNs4hMnLdV9N6e5cTOk69Z8cGsFL51CfHCccb2XS1oVcldohfTo5ZiLOaOHvVoRPd6vIeQFeQuIS3E1iLx7yE+j84oBF6bOcvKppbgQbzPX1PcJkB82qRoCEI5Vt2x4uvv2Jlcy+sAsQUsLCAmET8ZC40Ea1coRX/vbL5brgBbIwv+e2fZLvICliwsPhUsZmIZtMrfG3UJFa9L/+ZcNjWJHrfQi0jNwmKUWhWkFXWZRZDd5FvV0O4irce4/s2ofuUy/PXvLzfxaRfXr2hasLD96dqgEhdASjWDADALBAuISZTuQ/tPFjcytIrcHB40WulVxZJiL4ZFuuyna/ZpL8BiWDyYRcECSeVdq1lPHdZ4cvmOwiysoInS/Fr9deQuHr3fPxK9esxYWBgLxvbkv5fcZTnjzs60/cgZ6tEovIOzGUOeuOruT49cRtuPnKaejSuL1mVsZdPv6BwK0j1NkRMsry7YxoPuP7y9U8QChIEHXUITJkygjh07UtmyZalKlSo0aNAg2rq12Dy+Z88efjIrvWbNmqW63uHDh4fNf8UVV9j7ZiBiRLMWidNo3cDYeWmlOy7TCG78RnKLw0Ofb/Rl2qdSIbLLm1ehay6qITkuRtOatW6ocpdQy6LgVrOCpUpZ5+IqzBaua1ilLHWUFZETYl/6NK9qO8BWvD9MFPVqUsXSeupVKm3aheQEry7YTit3naC5mw9HdLvAfUydSUuXLqVRo0bRypUraf78+ZSXl0d9+/al7OzCVuC1atWiQ4cOSV5PPfUUlSlThvr376+5biZQxMt9/PHH9r4ZiFgV0xZPzKPFJpvxuY1Vw8W4LwpN4GrI02IDBrZ/z8e/UPtn53OLgZPESlrzgDY16KXrW0umnb8QlATPMp3lhNgSH76NT/YNqwgrR0mgdqxbnpY9dCl5IUDcKGY0jNNWJZapFQ387P4Eypg6k+bOncutIS1atKA2bdrQ1KlTae/evbRu3Tr+eWJiIlWrVk3y+vLLL+nGG2/kokWLlJQUyXLlyzvfrwI4z8vzt/G/j325
mbyOkeFrpo5bJbyfi/46v9t0mGdKfLZuf9hndp6G2VOkWZeSF7OEWGArs6Z0rlch1IyQ1T8RB06zm0+eys7L04S1EP9aemJFOD7Xtiu29DAql02xXM7fDzgtoJTOUdbXyW0SNYKvgT+xdUQzMzP53woVws2TDCZk1q9fTyNGjNBd15IlS7ibqUmTJnT33XfT8ePHVefNzc2lrKwsyQsAPZx44AqPgVDZloI8UnrisxPD8tK8raYr7XrRwsICW9nrk390oT0vXEVfjerGBUHlssmheVjsjZpLqEVGuqSyrRaVLaTIvixqTMhoWFn74csMd11irfGinwTLnuPhlkXWOZsd6ydcbNKIXkSxh+Uzs6CggMaMGUPdunWjli1bKs7z3nvvUbNmzahr16667qBp06bRwoUL6cUXX+SuJ+ZCys/PV42lSU9PD72YKwoAMZGqj2LG7PzGoh106ux5R7f/4/bCFGwjsD2dvESthqr3aFe7PDWrXhgAyrxBSoXjWNbX7d3q8YrGd/aoR/93QxvNdV7evCoN7VInVH7fCnf3akhOEakO4AbjlQ2ncttFsFS6GRQLl1DsYVmwsFiWzZs308yZMxU/P3fuHM2YMcOQdWXw4ME0YMAAatWqFQ/knTNnDq1Zs4ZbXZQYN24ct+4Ir337dLIjQPzh0lglHwPNjol6MTJuPg2zlOzpK/eSX2DuGCEgl918lOrosFohLC2YzfvoVc3puvY1NdfJLDlPD2zJy+9bgQkoq2nISkRIr9BlTYuzfMTF4aymcttFMFS6Gd9iRqQBf2DpbBk9ejQXFYsXL6aaNZUHiM8++4zOnj1LQ4cONb3++vXrU6VKlWjHjh2q8S5paWmSF4gukXpSjKbVRam0vdJT3O5j2fT8d38oruPnnccd1VWRCGhkvXnmjelJ0UC4sf12MIsmfC/9Tb9RqQbrJkaLD3qNey4Ljxl5ZlBLevG6VjwdWoyVTDiBCqWL3XhaCLFbbmYQPTBrA53MtmbR/HnnMRr8zgratL8w7AF4gwSzPkEmVlgg7aJFi6hevXqq8zJ3ELOaVK4cruz12L9/P49hqV69uullAVCN1XDA6hIuWMLnGT3jF9XljdYSMYo8CNhp2L2Lpcs2UXkat0taqnYpKK1Gfq10UpLdwGg/I68JfRYT1KeZND2ZBR3f1LE2talZ3M2aYSdJ6NErCztc39G9+N4wXqGoniCK7IgjI/znh+KyG2YYM3M9D2p/dLazFlEQQcHC3EDTp0/nrh5Wi+Xw4cP8xdw/YphlZNmyZXTHHXcorqdp06Zc9DDOnDlDDz74IE+VZnVcWBzLwIEDqWHDhtSvXz873w3EKV9vOEhfrT/o+HpZaXu5DlKysOw/GV7SX9znRcisKlynPdz20tu5P7MbpLgImRIrxvX2leVOqbGg3XMqUshPVaEicOmUEjS8a11Huj2zjKr59/WU1Na5rVvxuou3Ia3V4hZHT+faWm4jLCz+FSyTJ0/mMSO9evXi1g/h9cknn0jme//997mriNVoUYIVmxMyjFgq9MaNG7k1pnHjxjzmpX379vTjjz9y1w8AZrn3418Nz8tqe6zfd8qw5UMuUJQyEfSC/VjtmngILGRFx/SCKtnN0ooFSfwEH0mMNmA0SiQFmfxcEfdcqp6eakqwTLxOWjeHwUQKc/U0qlpWEgejlLov1DNy26UpXJ+sQjVz1YI4Ks1vNE3s+eef5y8j6ylZsiTNmzfPzG4AD+K1J2EllM7eKUt38vTgf/SsT+OKzNlmLA5KFggzGsKu3HC7cK2diq7ppZIltVSsIC/UJwTa2ukJZYWKpZPpePZ5alVD6j6xS/lSxmI+nCCo0XNJLFKMeGlu7FhLUll594QrNWsKDetShz5c8WfY9uyeH0avj3bPzOcPJ4sf6OW6VQe4ByrrgLiGiRXG28t20c9FfWu0gnblT6lK1VcjWf7e7VoT00Z0srRczfIleZqxuFqtAGuGZxSlm2cEkljCmDSk
HfVuWoX+fWV4GwErvHxjG/pb6+oRFV5qLiGGWGsYdQmJ40/0CiDKi7i5HXslIFyvwjW5apd6fS81nK5QDawDwQJAEX//7yrNz9nYF3TAJeQkbm6Lmf2bVkszJE7kTLmlPe9iLH+CZrESrAuwUZRuhPUqOVe4zSisW/F7wzua2nctrm1Xk978e7uIVszVcgmJf2ejWsJM00Z5bRdBLJkRr1Zw4tnhytd/dGJXgANAsADLzNl4MCrBg9GCDfhGXEKRRLgHvbtsl+PrViuFL4dlgYhjIMRP6UrWkHwTcSDyp332tlNRCX9gjpy8fA2XkPl2EWbq0YitMW/f2j5UP6hcqWR6a0g7eneosUrFZpF3J7dyuZ7OiZ3mrn4HggVYZvQM48GtXsCu+4QtbqQOSyTjeYJF3+u577Y4vm55KfybO0krSgtPxy1rpNO02zsp3qBKyFwBbF9v61oYMCvuxqyG3HUw9OI6pr4DKObgqRxVC4mV9GIh8PnqNhmmGir2ayHt/XRlq+q8AvEAA+uRoxcDY+R6Bf7BXXscAA7w8Gcb6dd9J2n2qG5UKtnZU5bdD42OYUYtLJFOVT3vUjEzeebU+L+1oI9XF1eV/v5fPfjTZ0a5QpfQGzdfRPcUZWhp3f+YUKlTsRQ1z9B3N8nFXySqsMYqYosISz3Wy+TRY0T3erwWjryOi9VGhC/d0JqGda3D658IsWV68NUqd3DhREqfZOXk0TWTfqI+zaoaCt4H1sDVDxzBTavCJ2v30bYjZ2jRH0cdq0TKBpgT2edNZ/QYiWGJNE4G+V4vKm0vL5LGbnhTbmnHn8bv69OYapYvFer1w2hVIz0su0e+b8GizzrUrWBIfMpdQmI3BjDH/Zc35sfusaua8dRjMVZ+VSZ4ujaopJuaLljnWFXbgW3VrSgsBb59nQqmqt/qBe/uOZ4tuUbllysbL5wIqv141V7a+Vc2D94H7gELC/ANrOiaHre+t1r1M+Epkg1SrZ/8wfT22cDnPZdQuNXHKuVKJdFL17emz9bt5+/zFIqkXdGyOm1+sopi/ILYrSAIDbvdoeUP5vEQK+UW/VtVp9+aVlEM9LVTLM4IVdJSadOT/Qw1VjTTH0sp7V3MocwcekckIuRnI0vN/uKXA/T+8A50WdOqZJVsA2MTsA8sLMA3zFq7j774Zb9mf5AVGmmLTHCw1xe/HrC0fRYrKo8XjXbQLRuBnfLLsxuF2DUQNBlsmVYyKcydFBb0aHJX5TfSs+cRAGkHtawklyvkh4rEGXE9mbGiGRE3kh5UshOQiRXGe8t3kx3Ee7xSNgat3XMibBqwBgQL8A1r9pyksZ9uoOFT11heB2s++NBnxQWvzKBUhyXaQXzcTeVSP77aFUpZ7gskBFnatrDIbnB5skBg4AxuW1jMIE631oO5dc0QNBhgbhax5WjwOyvp2JnC0v65F/Lp+ikr+DRWbRfYAy4hk/x+MIs+WvUn735aTZbK6WXmbj5Emw5k0gN9m1gKsPMSG/adsrzsL3+etLwsMxbI779K92Mzv67tXkLMauRwR6EZd3Sm1XtOmM7aYOcVS1E9nJkTqlcSHsNibl/lT/5ON48EhXhpSNBz89hBTT8Lkz/4aTf9tMO8NUQ+ph7NyqVKZVIo53zx+Zp5No/SRVZIYB4IFpPcP2sDbTmUxbMjXr/5IvILd00v7CDctlZ5nkIYr9gZDAuzhGQWFps+Ibvikd2/nXJLCXvStWEl/rICS1HV7r1k78nf6eaDwHsWFjOXaNmUEnQ61xk3IbPWPPXN785W2BWd8HatjQAuIdMwscJYbCBjxYscylTvJBzrsPHC1rjM40UMpDUb2IhT2UVKIspL2M1gCncJQbC4gQkvjOsYrQlTqUwyff7PrqbWrXXdnZMV1hNz/6cbNO8J8jRsthkWb3XNWz+FpnlHEvoXD52m/kKe8ukXrPhqmR+WvcSck0XFO30xsrRjszd2I0/fdnqYZOVccCyGRTh9Ag4IAqf0ihtn
[base64-encoded PNG data elided — inline matplotlib plot of temp_air]", - "text/plain": [ - "
" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "import matplotlib.pyplot as plt\n", - "\n", - "plt.plot(geo_weather.isel(gid=0).temp_air)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Next Steps\n", - "\n", - "Now we have data ready to use for geospatial calculations. You can see how to do this in [Geospatial Templates.ipynb](../04_geospatial/02_geospatial_templates.ipynb)" - ] - } - ], - "metadata": { - "kernelspec": { - "display_name": "pvdeg", - "language": "python", - "name": "python3" - }, - "language_info": { - "codemirror_mode": { - "name": "ipython", - "version": 3 - }, - "file_extension": ".py", - "mimetype": "text/x-python", - "name": "python", - "nbconvert_exporter": "python", - "pygments_lexer": "ipython3", - "version": "3.12.9" - } - }, - "nbformat": 4, - "nbformat_minor": 2 -} diff --git a/tutorials/05_advanced/scripts/04_nsrdb_distributed_api.py b/tutorials/05_advanced/scripts/04_nsrdb_distributed_api.py deleted file mode 100644 index 2d3adad6..00000000 --- a/tutorials/05_advanced/scripts/04_nsrdb_distributed_api.py +++ /dev/null @@ -1,95 +0,0 @@ -# %% [markdown] -# # NSRDB Distributed (API Key Required) -# - -# %% -from dask.distributed import LocalCluster, Client -from dotenv import load_dotenv -import pvdeg -import os - -# %% [markdown] -# # Setting Up -# -# We need to get ready to make our parallelized API calls. You need to import your API key and email. This cell will not work for you unless you replace `api_key` and `email` with your personal NSRDB API key and email. [REQUEST A KEY](https://developer.nrel.gov/signup/). -# -# We also need to initialize a Dask client. `pvdeg.weather.weather_distributed` will not work without one: it will fail silently and not populate any of the results in the resulting `weather_ds`, called `geo_weather` in the example below. This failure is hard to recognize, so be careful to initialize a Dask client first. 
Visiting the link takes you to a dashboard that shows what Dask is doing. - -# %% -load_dotenv() - -### REPLACE WITH YOUR API KEY AND EMAIL ### -api_key = "DEMO_KEY" -email = "user@mail.com" -########################################### - -workers = 4 - -cluster = LocalCluster( -    n_workers=workers, -    processes=True, -) - -client = Client(cluster) - -print("Dashboard link") -print(client.dashboard_link) - -# %% [markdown] -# **Note on .env file:** Create a file named `.env` in your project root directory with the following content: -# ``` -# api_key=YOUR_NREL_API_KEY -# email=YOUR_EMAIL_ADDRESS -# ``` -# Replace `YOUR_NREL_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NREL developer credentials. - -# %% [markdown] -# # Requesting Weather -# -# We will create a list of (latitude, longitude) tuples and call the function on all of them at once. `failed` will be a list of failed gids: unique location IDs that correspond to points in space in the NSRDB. These are different from PVGIS gids, which are arbitrary indexes that do NOT correspond to a spatial location on Earth. -# -# We will request "PSM4" data, a typical meteorological year (TMY) from the NSRDB's Physical Solar Model. We have to supply the API key and email from above here; the only difference from the other weather sources is that the NSRDB/PSM4 data requires API keys. - -# %% -coords = [ -    (25.783388, -80.189029), -    (24.783388, -80.189029), -] - -geo_weather, geo_meta, failed = pvdeg.weather.weather_distributed( -    database="PSM4", coords=coords, api_key=api_key, email=email -) - -# %% [markdown] -# # Viewing Results -# -# As in the other tutorial, our results are stored in an xarray dataset with a Dask backend, so you will have to call `.compute()` on the dataset to inspect the individual values of the Dask arrays. -# -# Click on the `Data variables` dropdown to expand the dataset viewer. 
- -# %% -geo_weather - -# %% -geo_weather.compute() - -# %% [markdown] -# # Spot Check -# -# We can plot the entire TMY air temperature (`temp_air`) to check that our data has loaded correctly. -# -# Explanation of steps: -# -# `geo_weather` is our weather xarray dataset. We can index into the first entry, at index 0, using `isel` (index-select). This grabs the data from the first gid. Then we pick the air temperature attribute; this can be replaced with bracket notation, so `.temp_air` becomes `["temp_air"]`. -# -# This selects the single array from the dataset that is labeled "temp_air". It will be a Dask array, so the values are stored out of memory. We would have to load it using `.compute()` to inspect it directly, but when plotting with matplotlib the array is loaded for us. - -# %% -import matplotlib.pyplot as plt - -plt.plot(geo_weather.isel(gid=0).temp_air) - -# %% [markdown] -# # Next Steps -# -# Now we have data ready to use for geospatial calculations. You can see how to do this in [Geospatial Templates.ipynb](../04_geospatial/02_geospatial_templates.ipynb) diff --git a/tutorials/05_geospatial/02_geospatial_world_map.ipynb b/tutorials/05_geospatial/02_geospatial_world_map.ipynb index 63277905..ecee8ed6 100644 --- a/tutorials/05_geospatial/02_geospatial_world_map.ipynb +++ b/tutorials/05_geospatial/02_geospatial_world_map.ipynb @@ -10,9 +10,18 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 12, "metadata": {}, - "outputs": [], + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Importing libraries...\n", + "Done!\n" + ] + } + ], "source": [ "print(\"Importing libraries...\")\n", "import matplotlib.pyplot as plt\n", @@ -34,7 +43,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 13, "metadata": {}, "outputs": [], "source": [ @@ -44,9 +53,29 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 18, "metadata": {}, - "outputs": 
[ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Starting Dask client...\n", + "Dashboard: http://192.168.1.189:55010/status\n", + "Cluster ready!\n" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "C:\\ProgramData\\anaconda3\\Lib\\site-packages\\distributed\\node.py:187: UserWarning: Port 8787 is already in use.\n", + "Perhaps you already have a cluster running?\n", + "Hosting the HTTP server on port 55010 instead\n", + " warnings.warn(\n" + ] + } + ], "source": [ "local = {\n", " \"manager\": \"local\",\n", @@ -59,7 +88,7 @@ " \"cores\": 100,\n", " \"processes\": 50,\n", " \"memory\": \"245GB\",\n", - " \"account\": \"pvfem\",\n", + " \"account\": \"pvsoiling\",\n", " \"queue\": \"standard\",\n", " \"walltime\": \"8:00:00\",\n", " # \"scheduler_options\": {\"host\": socket.gethostname()},\n", @@ -72,7 +101,7 @@ }, { "cell_type": "code", - "execution_count": null, + "execution_count": 19, "metadata": {}, "outputs": [], "source": [ @@ -82,15 +111,28 @@ }, { "cell_type": "code", - "execution_count": null, - "id": "a4643119", + "execution_count": 20, "metadata": {}, - "outputs": [], + "outputs": [ + { + "ename": "ConnectionError", + "evalue": "connected to not a node of kestrel.hpc.nrel.gov", + "output_type": "error", + "traceback": [ + "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m", + "\u001b[1;31mConnectionError\u001b[0m Traceback (most recent call last)", + "Cell \u001b[1;32mIn[20], line 15\u001b[0m\n\u001b[0;32m 1\u001b[0m weather_arg \u001b[38;5;241m=\u001b[39m {\n\u001b[0;32m 2\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124msatellite\u001b[39m\u001b[38;5;124m\"\u001b[39m: \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mHimawari\u001b[39m\u001b[38;5;124m\"\u001b[39m,\n\u001b[0;32m 3\u001b[0m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnames\u001b[39m\u001b[38;5;124m\"\u001b[39m: 
\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mtmy-2020\u001b[39m\u001b[38;5;124m\"\u001b[39m,\n\u001b[1;32m (...)\u001b[0m\n\u001b[0;32m 12\u001b[0m ],\n\u001b[0;32m 13\u001b[0m }\n\u001b[1;32m---> 15\u001b[0m weather_ds_himawari, meta_df_himawari \u001b[38;5;241m=\u001b[39m pvdeg\u001b[38;5;241m.\u001b[39mweather\u001b[38;5;241m.\u001b[39mget(\n\u001b[0;32m 16\u001b[0m weather_db, geospatial\u001b[38;5;241m=\u001b[39m\u001b[38;5;28;01mTrue\u001b[39;00m, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mweather_arg\n\u001b[0;32m 17\u001b[0m )\n", + "File \u001b[1;32m~\\Documents\\GitHub\\public\\PVDeg\\pvdeg\\weather.py:260\u001b[0m, in \u001b[0;36mget\u001b[1;34m(database, id, geospatial, find_meta, **kwargs)\u001b[0m\n\u001b[0;32m 258\u001b[0m \u001b[38;5;28;01melif\u001b[39;00m geospatial:\n\u001b[0;32m 259\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m database \u001b[38;5;241m==\u001b[39m \u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mNSRDB\u001b[39m\u001b[38;5;124m\"\u001b[39m:\n\u001b[1;32m--> 260\u001b[0m nrel_kestrel_check()\n\u001b[0;32m 262\u001b[0m weather_ds, meta_df \u001b[38;5;241m=\u001b[39m get_NSRDB(geospatial\u001b[38;5;241m=\u001b[39mgeospatial, \u001b[38;5;241m*\u001b[39m\u001b[38;5;241m*\u001b[39mkwargs)\n\u001b[0;32m 263\u001b[0m meta_df[\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mwind_height\u001b[39m\u001b[38;5;124m\"\u001b[39m] \u001b[38;5;241m=\u001b[39m \u001b[38;5;241m2\u001b[39m\n", + "File \u001b[1;32m~\\Documents\\GitHub\\public\\PVDeg\\pvdeg\\utilities.py:1137\u001b[0m, in \u001b[0;36mnrel_kestrel_check\u001b[1;34m()\u001b[0m\n\u001b[0;32m 1134\u001b[0m msg \u001b[38;5;241m=\u001b[39m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mconnected to \u001b[39m\u001b[38;5;132;01m{\u001b[39;00mdevice_domain\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m \u001b[38;5;124mf\u001b[39m\u001b[38;5;124m\"\u001b[39m\u001b[38;5;124mnot a node of 
\u001b[39m\u001b[38;5;132;01m{\u001b[39;00mKESTREL_HOSTNAME\u001b[38;5;132;01m}\u001b[39;00m\u001b[38;5;124m\"\u001b[39m\n\u001b[0;32m 1136\u001b[0m \u001b[38;5;28;01mif\u001b[39;00m KESTREL_HOSTNAME \u001b[38;5;241m!=\u001b[39m device_domain:\n\u001b[1;32m-> 1137\u001b[0m \u001b[38;5;28;01mraise\u001b[39;00m \u001b[38;5;167;01mConnectionError\u001b[39;00m(msg)\n", + "\u001b[1;31mConnectionError\u001b[0m: connected to not a node of kestrel.hpc.nrel.gov" + ] + } + ], "source": [ "weather_arg = {\n", " \"satellite\": \"Himawari\",\n", " \"names\": \"tmy-2020\",\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -109,7 +151,6 @@ { "cell_type": "code", "execution_count": null, - "id": "4cf9f951", "metadata": {}, "outputs": [], "source": [ @@ -141,14 +182,13 @@ { "cell_type": "code", "execution_count": null, - "id": "acf47855", "metadata": {}, "outputs": [], "source": [ "weather_arg = {\n", " \"satellite\": \"GOES\",\n", " \"names\": 2021,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -167,7 +207,6 @@ { "cell_type": "code", "execution_count": null, - "id": "ba3b18b8", "metadata": {}, "outputs": [], "source": [ @@ -195,14 +234,13 @@ { "cell_type": "code", "execution_count": null, - "id": "00fa08de", "metadata": {}, "outputs": [], "source": [ "weather_arg = {\n", " \"satellite\": \"METEOSAT\",\n", " \"names\": 2019,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -221,7 +259,6 @@ { "cell_type": "code", "execution_count": null, - "id": "dcf9d773", "metadata": {}, "outputs": [], "source": [ @@ -365,7 +402,7 @@ "weather_arg = {\n", " \"satellite\": \"METEOSAT\",\n", " \"names\": 2019,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ 
-843,7 +880,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.7" + "version": "3.13.5" } }, "nbformat": 4, diff --git a/tutorials/05_geospatial/03_letid_outdoor_geospatial_demo.ipynb b/tutorials/05_geospatial/03_letid_outdoor_geospatial_demo.ipynb index 9fdfd9f0..3dd907f5 100644 --- a/tutorials/05_geospatial/03_letid_outdoor_geospatial_demo.ipynb +++ b/tutorials/05_geospatial/03_letid_outdoor_geospatial_demo.ipynb @@ -134,7 +134,7 @@ "weather_arg = {\n", " \"satellite\": \"Americas\",\n", " \"names\": 2022,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -345,7 +345,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.4" + "version": "3.13.5" } }, "nbformat": 4, diff --git a/tutorials/05_geospatial/scripts/02_geospatial_world_map.py b/tutorials/05_geospatial/scripts/02_geospatial_world_map.py index c9a93287..39765bbe 100644 --- a/tutorials/05_geospatial/scripts/02_geospatial_world_map.py +++ b/tutorials/05_geospatial/scripts/02_geospatial_world_map.py @@ -1,8 +1,12 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # Geospatial World Map (HPC) # M. 
Springer 2024-06-05 -# %% +# In[12]: + + print("Importing libraries...") import matplotlib.pyplot as plt import numpy as np @@ -13,14 +17,19 @@ print("Done!") -# %% [markdown] + # # Calculate Standoff -# %% +# In[13]: + + work_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/standoff_fine" data_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/data" -# %% + +# In[18]: + + local = { "manager": "local", "n_workers": 100, @@ -32,7 +41,7 @@ "cores": 100, "processes": 50, "memory": "245GB", - "account": "pvfem", + "account": "pvsoiling", "queue": "standard", "walltime": "8:00:00", # "scheduler_options": {"host": socket.gethostname()}, @@ -42,15 +51,21 @@ client = pvdeg.geospatial.start_dask(hpc=kestrel) print("Cluster ready!") -# %% + +# In[19]: + + # Get weather data weather_db = "NSRDB" -# %% + +# In[20]: + + weather_arg = { "satellite": "Himawari", "names": "tmy-2020", - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -65,13 +80,19 @@ weather_db, geospatial=True, **weather_arg ) -# %% + +# In[ ]: + + meta_df_himawari_sub, gids_meta_df_himawari = pvdeg.utilities.gid_downsampling( meta_df_himawari, 3 ) weather_ds_himawari_sub = weather_ds_himawari.sel(gid=meta_df_himawari_sub.index) -# %% + +# In[ ]: + + geo_himawari = { "func": pvdeg.standards.standoff, "weather_ds": weather_ds_himawari_sub, @@ -84,11 +105,14 @@ os.path.join(work_dir, "standoff_himawari.csv") ) -# %% + +# In[ ]: + + weather_arg = { "satellite": "GOES", "names": 2021, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -103,11 +127,17 @@ weather_db, geospatial=True, **weather_arg ) -# %% + +# In[ ]: + + meta_df_goes_sub, gids_meta_df_goes = pvdeg.utilities.gid_downsampling(meta_df_goes, 8) weather_ds_goes_sub = weather_ds_goes.sel(gid=meta_df_goes_sub.index) -# %% + +# In[ ]: + + geo_goes = { "func": pvdeg.standards.standoff, "weather_ds": weather_ds_goes_sub, @@ -118,11 +148,14 @@ 
standoff_res_goes.to_netcdf(os.path.join(work_dir, "standoff_goes.nc")) standoff_res_goes.to_dataframe().to_csv(os.path.join(work_dir, "standoff_goes.csv")) -# %% + +# In[ ]: + + weather_arg = { "satellite": "METEOSAT", "names": 2019, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -137,13 +170,19 @@ weather_db, geospatial=True, **weather_arg ) -# %% + +# In[ ]: + + meta_df_meteosat_sub, gids_meta_df_meteosat = pvdeg.utilities.gid_downsampling( meta_df_meteosat, 4 ) weather_ds_meteosat_sub = weather_ds_meteosat.sel(gid=meta_df_meteosat_sub.index) -# %% + +# In[ ]: + + geo_meteosat = { "func": pvdeg.standards.standoff, "weather_ds": weather_ds_meteosat_sub, @@ -156,7 +195,10 @@ os.path.join(work_dir, "standoff_meteosat.csv") ) -# %% + +# In[ ]: + + # Auxiliary data import h5py @@ -255,12 +297,15 @@ standoff_res_aux.to_netcdf(os.path.join(work_dir, "standoff_aux.nc")) standoff_res_aux.to_dataframe().to_csv(os.path.join(work_dir, "standoff_aux.csv")) -# %% + +# In[ ]: + + weather_db = "NSRDB" weather_arg = { "satellite": "METEOSAT", "names": 2019, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -416,7 +461,10 @@ standoff_res_pvgis.to_netcdf(os.path.join(work_dir, "standoff_pvgis.nc")) standoff_res_pvgis.to_dataframe().to_csv(os.path.join(work_dir, "standoff_pvgis.csv")) -# %% + +# In[ ]: + + meta_north = pd.read_csv(f"{data_dir}/meta_pvgis_north_3300.csv", index_col=0) weather_north = xr.open_dataset(f"{data_dir}/weather_ds_north_3300.nc") weather_north = weather_north.sel(gid=meta_north.index) @@ -438,10 +486,12 @@ os.path.join(work_dir, "standoff_pvgis_north.csv") ) -# %% [markdown] + # # Post process -# %% +# In[ ]: + + import pvdeg import os import pandas as pd @@ -454,7 +504,10 @@ work_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/standoff_fine" data_dir = "/projects/pvsoiling/pvdeg/analysis/world_map/data" -# %% + +# In[ ]: + + # Create 0cm standoff locations lon_north = 
np.arange(-179, 180, 0.25) @@ -487,7 +540,10 @@ lon_asia = lon lat_asia = lat -# %% + +# In[ ]: + + fig, ax = plt.subplots() plt.scatter(lon_land_north, lat_land_north, c="r", s=1) @@ -497,7 +553,10 @@ ax.set_ylim(-90, 90) ax.set_xlim(-180, 180) -# %% + +# In[ ]: + + template_params = pvdeg.geospatial.template_parameters(pvdeg.standards.standoff) standoff_zero_north = pvdeg.geospatial.zero_template( lat_land_north, lon_land_north, **template_params @@ -509,7 +568,10 @@ lat_asia, lon_asia, **template_params ) -# %% + +# In[ ]: + + standoff_aux = xr.open_dataset(os.path.join(work_dir, "standoff_aux.nc")) standoff_himawari = xr.open_dataset(os.path.join(work_dir, "standoff_himawari.nc")) standoff_meteosat = xr.open_dataset(os.path.join(work_dir, "standoff_meteosat.nc")) @@ -517,7 +579,10 @@ standoff_pvgis = xr.open_dataset(os.path.join(work_dir, "standoff_pvgis.nc")) standoff_goes = xr.open_dataset(os.path.join(work_dir, "standoff_goes.nc")) -# %% + +# In[ ]: + + fig = plt.figure(figsize=(10, 5)) ax = fig.add_axes([0, 0, 1, 1], projection=ccrs.PlateCarree(), frameon=True) ax.patch.set_visible(True) @@ -667,4 +732,9 @@ plt.savefig(os.path.join(work_dir, "standoff_map.png"), dpi=1200, bbox_inches="tight") -# %% + +# In[ ]: + + + + diff --git a/tutorials/05_geospatial/scripts/03_letid_outdoor_geospatial_demo.py b/tutorials/05_geospatial/scripts/03_letid_outdoor_geospatial_demo.py index ca9f1742..75c3201c 100644 --- a/tutorials/05_geospatial/scripts/03_letid_outdoor_geospatial_demo.py +++ b/tutorials/05_geospatial/scripts/03_letid_outdoor_geospatial_demo.py @@ -1,16 +1,23 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # LETID Outdoor Geospatial Demo (HPC) -# +# # ![PVDeg Logo](../images/pvdeg_logo.svg) -# %% +# In[ ]: + + import matplotlib.pyplot as plt import pandas as pd import pvdeg from pvdeg import DATA_DIR import os -# %% + +# In[ ]: + + # This information helps with debugging and getting support :) import sys import platform @@ -20,14 +27,19 @@ 
print("Pandas version ", pd.__version__) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # Single location example -# %% +# In[ ]: + + weather_file = os.path.join(DATA_DIR, "psm3_demo.csv") WEATHER, META = pvdeg.weather.read(weather_file, "psm") -# %% + +# In[ ]: + + kwargs = { "tau_0": 115, # us, carrier lifetime in non-degraded states, e.g. LETID/LID states A or C "tau_deg": 55, # us, carrier lifetime in fully-degraded state, e.g. LETID/LID state B @@ -40,13 +52,18 @@ "mechanism_params": "repins", } -# %% + +# In[ ]: + + pvdeg.letid.calc_letid_outdoors(weather_df=WEATHER, meta=META, **kwargs) -# %% [markdown] + # # Start distributed compute cluster - DASK -# %% +# In[ ]: + + local = { "manager": "local", "n_workers": 1, @@ -67,14 +84,17 @@ pvdeg.geospatial.start_dask(hpc=kestrel) -# %% + +# In[ ]: + + # Get weather data weather_db = "NSRDB" weather_arg = { "satellite": "Americas", "names": 2022, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -93,13 +113,22 @@ weather_SW_sub = weather_ds.sel(gid=meta_SW_sub.index) -# %% + +# In[ ]: + + weather_SW_sub -# %% + +# In[ ]: + + meta_df -# %% + +# In[ ]: + + # Define desired analysis geo = { "func": pvdeg.letid.calc_letid_outdoors, @@ -118,10 +147,16 @@ letid_res = pvdeg.geospatial.analysis(**geo) -# %% + +# In[ ]: + + letid_res -# %% + +# In[ ]: + + import datetime ims = [] @@ -147,7 +182,10 @@ # ims = [imageio.imread(f'./images/RH_animation_{n}.png') for n in range(1, 13)] # imageio.mimwrite(f'./images/RH_animation.gif', ims, format='GIF', duration=1000, loop=10) -# %% + +# In[ ]: + + import datetime ims = [] @@ -217,10 +255,14 @@ plt.savefig(f"./images/LETID_plot_animation_{n}.png", dpi=600) -# %% + +# In[ ]: + + import imageio ims = [imageio.imread(f"./images/LETID_plot_animation_{n}.png") for n in range(1, 13)] imageio.mimwrite( "./images/LETID_plot_animation.gif", ims, format="GIF", duration=1000, loop=10 ) + diff --git 
a/tutorials/06_advanced/04_nsrdb_distributed_api.ipynb b/tutorials/06_advanced/04_nsrdb_distributed_api.ipynb index 6f1248dc..f1d73d1a 100644 --- a/tutorials/06_advanced/04_nsrdb_distributed_api.ipynb +++ b/tutorials/06_advanced/04_nsrdb_distributed_api.ipynb @@ -83,12 +83,12 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "**Note on .env file:** Create a file named `.env` in your project root directory (`c:\\Users\\rdaxini\\Documents\\GitHub\\PVDegradationTools_NREL\\`) with the following content:\n", + "**Note on .env file:** Create a file named `.env` in your project root directory (`c:\\Users\\rdaxini\\Documents\\GitHub\\PVDegradationTools_NLR\\`) with the following content:\n", "```\n", - "api_key=YOUR_NREL_API_KEY\n", + "api_key=YOUR_NLR_API_KEY\n", "email=YOUR_EMAIL_ADDRESS\n", "```\n", - "Replace `YOUR_NREL_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NREL developer credentials." + "Replace `YOUR_NLR_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NLR developer credentials." 
] }, { @@ -1952,7 +1952,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -1966,9 +1966,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.12.9" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/06_advanced/README.md b/tutorials/06_advanced/README.md index 06738abe..d63fa216 100644 --- a/tutorials/06_advanced/README.md +++ b/tutorials/06_advanced/README.md @@ -21,7 +21,7 @@ The following tutorials require API access to external data sources: ⚠️ **Note**: Several tutorials require API keys: - **NSRDB API Key**: Required for `03_pysam_api.ipynb` and `04_nsrdb_distributed_api.ipynb` - - Get your free API key at: https://developer.nrel.gov/signup/ + - Get your free API key at: https://developer.nlr.gov/signup/ - Rate limits apply for free tier ## Topics Covered diff --git a/tutorials/06_advanced/scripts/04_nsrdb_distributed_api.py b/tutorials/06_advanced/scripts/04_nsrdb_distributed_api.py index a611a757..f2e6ed22 100644 --- a/tutorials/06_advanced/scripts/04_nsrdb_distributed_api.py +++ b/tutorials/06_advanced/scripts/04_nsrdb_distributed_api.py @@ -1,21 +1,27 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # NSRDB Distributed (API Key Required) -# +# + +# In[1]: + -# %% from dask.distributed import LocalCluster, Client from dotenv import load_dotenv import pvdeg import os -# %% [markdown] + # # Setting Up -# +# # As in [load_pvgis_distributed.ipynb](./load_pvgis_distributed.ipynb), we need to get ready to make our parallelized API calls. The notebook linked here goes through the process in more detail, but we need to import our API key and email. This cell will not work for you unless you replace the `api_key` and `email` with your personal NSRDB API keys. [REQUEST A KEY](https://developer.nrel.gov/signup/). 
-# +# We also need to initialize a dask client. `pvdeg.weather.weather_distributed` will not work without it. It will fail silently and not populate any of the results in the resulting `weather_ds` called `geo_weather` in the example below. It is hard to recognize that this has occurred, so be careful. Make sure to initialize a dask client first. Visiting the link takes you to a dashboard that shows what dask is doing. -# %% +# In[2]: + + load_dotenv() ### REPLACE WITH YOUR API KEY AND EMAIL ### @@ -35,22 +41,23 @@ print("Dashboard link") print(client.dashboard_link) -# %% [markdown] -# **Note on .env file:** Create a file named `.env` in your project root directory (`c:\Users\rdaxini\Documents\GitHub\PVDegradationTools_NREL\`) with the following content: + +# **Note on .env file:** Create a file named `.env` in your project root directory (`c:\Users\rdaxini\Documents\GitHub\PVDegradationTools_NLR\`) with the following content: # ``` -# api_key=YOUR_NREL_API_KEY +# api_key=YOUR_NLR_API_KEY # email=YOUR_EMAIL_ADDRESS # ``` -# Replace `YOUR_NREL_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NREL developer credentials. +# Replace `YOUR_NLR_API_KEY` and `YOUR_EMAIL_ADDRESS` with your actual NLR developer credentials. -# %% [markdown] # # Requesting Weather -# +# # As in the other script [load_pvgis_distributed.ipynb](./load_pvgis_distributed.ipynb), we will create a list of tuple (latitude, longitude) pairs and call the function on all of them at once. `failed` will contain a list of failed gids: unique location IDs that correspond to points in space on the NSRDB. These are different from PVGIS gids, which are arbitrary indexes that do NOT correspond to a spatial location on earth. -# +# # We will request "PSM4" data from the Physical Solar Model that represents a typical meteorological year (TMY) from the NSRDB. We will have to supply the API key and email from above here. Refer to the linked script to see this in further detail. 
The only difference between the scripts lies in the NSRDB/PSM4 data requiring API keys. -# %% +# In[3]: + + coords = [ (25.783388, -80.189029), (24.783388, -80.189029), @@ -60,36 +67,43 @@ database="PSM4", coords=coords, api_key=api_key, email=email ) -# %% [markdown] + # # Viewing Results -# +# # Same as in the other tutorial, our results are stored in an xarray dataset with a dask backend, so you will have to use `.compute()` on the dataset to inspect the individual values of the dask arrays. -# +# # Click on the `Data variables` dropdown to expand the dataset viewer. -# %% +# In[4]: + + geo_weather -# + +# In[5]: + + geo_weather.compute() -# %% [markdown] + # # Spot Check -# +# # We can plot the entire TMY air_temperature to check that our data has loaded correctly. -# +# # Explanation of steps -# +# # `geo_weather` is our weather xarray dataset. We can index into the first entry at the 0th index by using `isel` (index-select). This will grab the data from the first gid. Then we pick the air temperature attribute. This can be replaced with bracket notation, so `.temp_air` becomes `["temp_air"]`. -# +# # This selects a single array from the dataset that is labeled as "temp_air". This array will be a dask array, so its values are stored out of memory. We would have to load it using `.compute()` to inspect it directly, but when plotting with matplotlib the array is loaded for us. -# %% +# In[6]: + + import matplotlib.pyplot as plt plt.plot(geo_weather.isel(gid=0).temp_air) -# %% [markdown] + # # Next Steps -# +# # Now we have data ready to use for geospatial calculations. This is shown in the other distributed script [load_pvgis_distributed.ipynb](./load_pvgis_distributed.ipynb). 
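The `isel`/bracket indexing described above can be sketched with a small in-memory dataset. The variable names mirror the tutorial, but the gid labels and temperature values below are made up, and there is no dask backend here:

```python
import numpy as np
import xarray as xr

# Toy stand-in for `geo_weather`: two gids, three hourly temperatures each.
ds = xr.Dataset(
    {"temp_air": (("gid", "time"), np.array([[20.0, 21.0, 22.0],
                                             [25.0, 26.0, 27.0]]))},
    coords={"gid": [100, 101], "time": [0, 1, 2]},
)

# isel (index-select) picks the first gid by position, not by label.
first = ds.isel(gid=0)

# Attribute access and bracket access return the same DataArray.
assert first.temp_air.equals(first["temp_air"])
```

With a dask-backed dataset the same calls work, but the values stay lazy until `.compute()` (or a plotting call) materializes them.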
You can also see how to do this in [Geospatial Templates.ipynb](../tutorials_and_tools/tutorials_and_tools/Geospatial%20Templates.ipynb) diff --git a/tutorials/10_workshop_demos/01_astm_live_demo.ipynb b/tutorials/10_workshop_demos/01_astm_live_demo.ipynb index c15050f7..41e5a7b6 100644 --- a/tutorials/10_workshop_demos/01_astm_live_demo.ipynb +++ b/tutorials/10_workshop_demos/01_astm_live_demo.ipynb @@ -117,13 +117,13 @@ "\n", "There are many different sources of solar irradiance data. For your projects, these are some of the most common:\n", "\n", - "- [NSRDB](https://maps.nrel.gov/nsrdb-viewer/) - National Solar Radiation Database. You can access data through the website for many locations accross the world, or you can use their [web API](https://developer.nrel.gov/docs/solar/nsrdb/) to download data programmatically. An \"API\" is an [\"application programming interface\"](https://en.wikipedia.org/wiki/API), and a \"web API\" is a programming interface that allows you to write code to interact with web services like the NSRDB.\n", + "- [NSRDB](https://maps.nlr.gov/nsrdb-viewer/) - National Solar Radiation Database. You can access data through the website for many locations across the world, or you can use their [web API](https://developer.nlr.gov/docs/solar/nsrdb/) to download data programmatically. An \"API\" is an [\"application programming interface\"](https://en.wikipedia.org/wiki/API), and a \"web API\" is a programming interface that allows you to write code to interact with web services like the NSRDB.\n", "\n", "- [EPW](https://www.energy.gov/eere/buildings/downloads/energyplus-0) - Energy Plus Weather data is available for many locations across the world. 
It's in its own file format ('EPW') so you can't open it easily in a spreadsheet program like Excel, but you can use [`pvlib.iotools.read_epw()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.read_epw.html) to get it into a dataframe and use it.\n", "\n", "- [PVGIS](https://re.jrc.ec.europa.eu/pvg_tools/en/) - Free global weather data provided by the European Union and derived from many governmental agencies including the NSRDB. PVGIS also provides a web API. You can get PVGIS TMY data using [`pvlib.iotools.get_pvgis_tmy()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_pvgis_tmy.html).\n", "\n", - "Perhaps another useful link: https://sam.nrel.gov/weather-data.html\n", + "Perhaps another useful link: https://sam.nlr.gov/weather-data.html\n", "\n", "## Where else can you get historical irradiance data?\n", "\n", @@ -148,26 +148,26 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# NREL API Key\n", - "At the [NREL Developer Network](https://developer.nrel.gov/), there are [APIs](https://en.wikipedia.org/wiki/API) to a lot of valuable [solar resources](https://developer.nrel.gov/docs/solar/) like [weather data from the NSRDB](https://developer.nrel.gov/docs/solar/nsrdb/), [operational data from PVDAQ](https://developer.nrel.gov/docs/solar/pvdaq-v3/), or indicative calculations using [PVWatts](https://developer.nrel.gov/docs/solar/pvwatts/). In order to use these resources from NREL, you need to [register for a free API key](https://developer.nrel.gov/signup/). You can test out the APIs using the `DEMO_KEY` but it has limited bandwidth compared to the [usage limit for registered users](https://developer.nrel.gov/docs/rate-limits/). 
NREL has some [API usage instructions](https://developer.nrel.gov/docs/api-key/), but pvlib has a few builtin functions, like [`pvlib.iotools.get_psm3()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_psm3.html), that wrap the NREL API, and call them for you to make it much easier to use. Skip ahead to the next section to learn more. But before you do...\n", + "# NLR API Key\n", + "At the [NLR Developer Network](https://developer.nlr.gov/), there are [APIs](https://en.wikipedia.org/wiki/API) to a lot of valuable [solar resources](https://developer.nlr.gov/docs/solar/) like [weather data from the NSRDB](https://developer.nlr.gov/docs/solar/nsrdb/), [operational data from PVDAQ](https://developer.nlr.gov/docs/solar/pvdaq-v3/), or indicative calculations using [PVWatts](https://developer.nlr.gov/docs/solar/pvwatts/). In order to use these resources from NLR, you need to [register for a free API key](https://developer.nlr.gov/signup/). You can test out the APIs using the `DEMO_KEY` but it has limited bandwidth compared to the [usage limit for registered users](https://developer.nlr.gov/docs/rate-limits/). NLR has some [API usage instructions](https://developer.nlr.gov/docs/api-key/), but pvlib has a few built-in functions, like [`pvlib.iotools.get_psm3()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_psm3.html), that wrap the NLR API, and call them for you to make it much easier to use. Skip ahead to the next section to learn more. But before you do...\n", "\n", - "**Please pause now to visit https://developer.nrel.gov/signup/ and get an API key.**\n", + "**Please pause now to visit https://developer.nlr.gov/signup/ and get an API key.**\n", "\n", "## Application Programming Interface (API)\n", "What exactly is an API? Nowadays, the phrase is used interchangeably with a \"web API\" but in general an API is just a recipe for how to interface with an application programmatically, _IE_: in code. 
An API could be as simple as a function signature or its published documentation, _EG_: the API for the `solarposition` function is you give it an ISO8601 formatted date with a timezone, the latitude, longitude, and elevation as numbers, and it returns the zenith and azimuth as numbers.\n", "\n", - "A web API is the same, except the application is a web service, that you access at its URL using web methods. We won't go into too much more detail here, but the most common web method is `GET` which is pretty self explanatory. Look over the [NREL web usage instructions](https://developer.nrel.gov/docs/api-key/) for some examples, but interacting with a web API can be as easy as entering a URL into a browser. Try the URL below to _get_ the PVWatts energy output for a fixed tilt site in [Broomfield, CO](https://goo.gl/maps/awkEcNGzSur9Has18).\n", + "A web API is the same, except the application is a web service that you access at its URL using web methods. We won't go into too much more detail here, but the most common web method is `GET`, which is pretty self-explanatory. Look over the [NLR web usage instructions](https://developer.nlr.gov/docs/api-key/) for some examples, but interacting with a web API can be as easy as entering a URL into a browser. Try the URL below to _get_ the PVWatts energy output for a fixed tilt site in [Broomfield, CO](https://goo.gl/maps/awkEcNGzSur9Has18).\n", "\n", - "https://developer.nrel.gov/api/pvwatts/v6.json?api_key=DEMO_KEY&lat=40&lon=-105&system_capacity=4&azimuth=180&tilt=40&array_type=1&module_type=1&losses=10\n", + "https://developer.nlr.gov/api/pvwatts/v6.json?api_key=DEMO_KEY&lat=40&lon=-105&system_capacity=4&azimuth=180&tilt=40&array_type=1&module_type=1&losses=10\n", "\n", "In addition to just using your browser, you can also access web APIs programmatically. The most popular Python package to interact with web APIs is [requests](https://docs.python-requests.org/en/master/). 
There are also free open-source command-line tools like [cURL](https://curl.se/) and [HTTPie](https://httpie.io/), and a popular nagware/freemium GUI application called [Postman](https://www.postman.com/).\n", "\n", - "**If you have an NREL API key please enter it in the next cell.**" + "**If you have an NLR API key, please enter it in the next cell.**" ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": { "execution": { "iopub.execute_input": "2026-02-03T18:09:11.764380Z", @@ -178,19 +178,19 @@ }, "outputs": [], "source": [ - "NREL_API_KEY = None # <-- please set your NREL API key here\n", + "NLR_API_KEY = None # <-- please set your NLR API key here\n", "\n", "# note you must use \"quotes\" around your key, for example:\n", - "# NREL_API_KEY = 'DEMO_KEY' # single or double both work fine\n", + "# NLR_API_KEY = 'DEMO_KEY' # single or double both work fine\n", "\n", "# during the live tutorial, we've stored a dedicated key on our server\n", - "if NREL_API_KEY is None:\n", + "if NLR_API_KEY is None:\n", " try:\n", - " NREL_API_KEY = os.environ[\n", - " \"NREL_API_KEY\"\n", + " NLR_API_KEY = os.environ[\n", + " \"NLR_API_KEY\"\n", " ] # get dedicated key for tutorial from server\n", " except KeyError:\n", - " NREL_API_KEY = \"DEMO_KEY\" # OK for this demo, but better to get your own key" + " NLR_API_KEY = \"DEMO_KEY\" # OK for this demo, but better to get your own key" ] }, { @@ -235,7 +235,7 @@ "\"\"\"\n", "weather_db = 'PSM4'\n", "weather_id = (33.4484, -112.0740)\n", - "weather_arg = {'api_key': NREL_API_KEY,\n", + "weather_arg = {'api_key': NLR_API_KEY,\n", " 'email': 'user@mail.com',\n", " 'year': '2021',\n", " 'map_variables': True,\n", @@ -247,7 +247,7 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": { "execution": { "iopub.execute_input": "2026-02-03T18:09:11.804287Z", @@ -261,8 +261,8 @@ "weather_df, meta = pvlib.iotools.get_nsrdb_psm4_tmy(\n", " latitude=33.4484,\n", " 
longitude=-112.0740,\n", - " api_key=NREL_API_KEY,\n", - " email=\"silvana.ovaitt@nrel.gov\", # <-- any email works here fine\n", + " api_key=NLR_API_KEY,\n", + " email=\"silvana.ovaitt@nlr.gov\", # <-- any email works here fine\n", " year=\"tmy\",\n", " map_variables=True,\n", " leap_day=False,\n", @@ -1033,7 +1033,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -1047,9 +1047,9 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.13.11" + "version": "3.13.5" } }, "nbformat": 4, - "nbformat_minor": 2 + "nbformat_minor": 4 } diff --git a/tutorials/10_workshop_demos/02_duramat_live_demo.ipynb b/tutorials/10_workshop_demos/02_duramat_live_demo.ipynb index 8cf8d4ca..52701962 100644 --- a/tutorials/10_workshop_demos/02_duramat_live_demo.ipynb +++ b/tutorials/10_workshop_demos/02_duramat_live_demo.ipynb @@ -87,7 +87,7 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": {}, "outputs": [ { @@ -111,7 +111,7 @@ "weather_arg = {\n", " \"satellite\": \"Americas\",\n", " \"names\": 2022,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -209,11 +209,11 @@ "# State bar of new mexico: (35.16482, -106.58979)\n", "\n", "weather_db = \"NSRDB\"\n", - "weather_id = (35.16482, -106.58979) # NREL (39.741931, -105.169891)\n", + "weather_id = (35.16482, -106.58979) # NLR (39.741931, -105.169891)\n", "weather_arg = {\n", " \"satellite\": \"Americas\",\n", " \"names\": 2022,\n", - " \"NREL_HPC\": True,\n", + " \"NLR_HPC\": True,\n", " \"attributes\": [\n", " \"air_temperature\",\n", " \"wind_speed\",\n", @@ -327,7 +327,7 @@ ], "metadata": { "kernelspec": { - "display_name": "pvdeg", + "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, @@ -341,7 +341,7 @@ "name": "python", 
"nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.12.9" + "version": "3.13.5" } }, "nbformat": 4, diff --git a/tutorials/10_workshop_demos/README.md b/tutorials/10_workshop_demos/README.md index 74669f52..63b5d7a0 100644 --- a/tutorials/10_workshop_demos/README.md +++ b/tutorials/10_workshop_demos/README.md @@ -5,7 +5,7 @@ Live demonstration materials from PVDeg workshops and conferences. ## Tutorials 1. **01_astm_live_demo.ipynb** - ASTM workshop live demonstration -2. **02_duramat_live_demo.ipynb** - DuraMAT workshop live demonstration ⚠️ **Requires NREL HPC** +2. **02_duramat_live_demo.ipynb** - DuraMAT workshop live demonstration ⚠️ **Requires NLR HPC** ## Requirements @@ -14,9 +14,9 @@ Live demonstration materials from PVDeg workshops and conferences. - Can run locally or in Google Colab ### DuraMAT Live Demo (02) -⚠️ **NREL HPC Access Required** -- Requires access to NREL's High Performance Computing resources -- NREL network connectivity or VPN required +⚠️ **NLR HPC Access Required** +- Requires access to NLR's High Performance Computing resources +- NLR network connectivity or VPN required - Appropriate computing allocations needed ## About These Demos @@ -37,6 +37,6 @@ Demos without HPC requirements can be launched in Google Colab using the rocket ### HPC Execution For demos requiring HPC (marked with ⚠️), you must: -1. Have NREL HPC account and allocations -2. Be on NREL network or connected via VPN -3. Follow NREL HPC documentation for job submission +1. Have NLR HPC account and allocations +2. Be on NLR network or connected via VPN +3. 
Follow NLR HPC documentation for job submission diff --git a/tutorials/10_workshop_demos/scripts/01_astm_live_demo.py b/tutorials/10_workshop_demos/scripts/01_astm_live_demo.py index 00cd630d..dcc1728b 100644 --- a/tutorials/10_workshop_demos/scripts/01_astm_live_demo.py +++ b/tutorials/10_workshop_demos/scripts/01_astm_live_demo.py @@ -1,36 +1,46 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # ASTM Live Demo -# +# # ![PVDeg Logo](../images/pvdeg_logo.svg) -# -# +# +# # **Steps:** # 1. Import weather data # 2. Calculate installation standoff # 3. Calculate installation standoff - with more detail -# +# # **Background:** -# +# # This example demonstrates the calculation of a minimum standoff distance necessary for roof-mounted PV modules to ensure that the $T_{98}$ operational temperature remains under 70°C, in which case the more rigorous thermal stability testing requirements of IEC TS 63126 would not need to be considered. We use data from [Fuentes, 1987] to model the approximate exponential decay in temperature, $T(X)$, with increasing standoff distance, $X$, as, -# +# # $$ X = -X_0 \ln\left(1-\frac{T_0-T}{\Delta T}\right)$$ -# +# # where $T_0$ is the temperature for $X=0$ (insulated back) and $\Delta T$ is the temperature difference between an insulated back ($X=0$) and open rack mounting configuration ($X=\infty$). -# +# # The following figure showcases this calculation for the entire United States. We used pvlib and data from the National Solar Radiation Database (NSRDB) to calculate the module temperatures for different mounting configurations and applied our model to obtain the standoff distance for roof-mounted PV systems.
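As a quick numeric illustration of the decay relation above, the model can be inverted directly in a few lines. The values for $X_0$, $T_0$, and $\Delta T$ below are made-up placeholders, not fitted parameters — in practice they come from the temperature-model fits performed inside `pvdeg.standards.standoff`:

```python
import math

def standoff_distance(t98, t_insulated, delta_t, x0):
    """Standoff X (cm) at which the module T98 temperature equals t98,
    inverting the Fuentes-style decay model:
        X = -X0 * ln(1 - (T0 - T) / dT)
    """
    return -x0 * math.log(1 - (t_insulated - t98) / delta_t)

# Illustrative placeholder values only (deg C and cm), chosen for the demo:
# T0 = 85 C insulated-back temperature, dT = 20 C insulated-vs-open-rack
# difference, X0 = 6.5 cm thermal decay constant, T98 target of 70 C.
x = standoff_distance(t98=70.0, t_insulated=85.0, delta_t=20.0, x0=6.5)
print(f"minimum standoff ~ {x:.1f} cm")  # -> about 9.0 cm for these inputs
```

Note the singularity as `t98` approaches `t_insulated - delta_t` (open-rack temperature): no finite standoff can reach a target below the open-rack limit, which mirrors the $X \to \infty$ behavior of the formula.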
-# %% +# In[1]: + + # if running on Google Colab, uncomment the next line and execute this cell to install the dependencies and prevent "ModuleNotFoundError" in later cells: -# #!pip install pvdeg +#!pip install pvdeg + + +# In[2]: + -# %% import os import pvlib import pvdeg import pandas as pd import matplotlib.pyplot as plt -# %% + +# In[3]: + + # This information helps with debugging and getting support :) import sys import platform @@ -41,90 +51,91 @@ print("pvlib version ", pvlib.__version__) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # 1. Import Weather Data -# +# # The function has two minimum requirements: # - Weather data containing (at least) DNI, DHI, GHI, Temperature, RH, Wind-Speed # - Site meta-data containing (at least) Latitude, Longitude, Time Zone -# +# -# %% [markdown] # # Where to get _Free_ Solar Irradiance Data? -# +# # There are many different sources of solar irradiance data. For your projects, these are some of the most common: -# -# - [NSRDB](https://maps.nrel.gov/nsrdb-viewer/) - National Solar Radiation Database. You can access data through the website for many locations accross the world, or you can use their [web API](https://developer.nrel.gov/docs/solar/nsrdb/) to download data programmatically. An "API" is an ["application programming interface"](https://en.wikipedia.org/wiki/API), and a "web API" is a programming interface that allows you to write code to interact with web services like the NSRDB. -# +# +# - [NSRDB](https://maps.nlr.gov/nsrdb-viewer/) - National Solar Radiation Database. You can access data through the website for many locations across the world, or you can use their [web API](https://developer.nlr.gov/docs/solar/nsrdb/) to download data programmatically. An "API" is an ["application programming interface"](https://en.wikipedia.org/wiki/API), and a "web API" is a programming interface that allows you to write code to interact with web services like the NSRDB.
+# # - [EPW](https://www.energy.gov/eere/buildings/downloads/energyplus-0) - EnergyPlus Weather data is available for many locations across the world. It's in its own file format ('EPW') so you can't open it easily in a spreadsheet program like Excel, but you can use [`pvlib.iotools.read_epw()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.read_epw.html) to get it into a dataframe and use it. -# +# # - [PVGIS](https://re.jrc.ec.europa.eu/pvg_tools/en/) - Free global weather data provided by the European Union and derived from many governmental agencies including the NSRDB. PVGIS also provides a web API. You can get PVGIS TMY data using [`pvlib.iotools.get_pvgis_tmy()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_pvgis_tmy.html). -# -# - Perhaps another useful link: https://sam.nrel.gov/weather-data.html -# +# +# - Perhaps another useful link: https://sam.nlr.gov/weather-data.html +# # ## Where else can you get historical irradiance data? -# +# # There are several commercial providers of solar irradiance data. Data is available at different spatial and time resolutions. Each provider offers data under subscription that will provide access to irradiance (and other weather variables) via API to leverage in Python.
-# +# # * [SolarAnywhere](https://www.solaranywhere.com/) # * [SolarGIS](https://solargis.com/) # * [Vaisala](https://www.vaisala.com/en) # * [Meteonorm](https://meteonorm.com/en/) # * [DNV Solar Resource Compass](https://src.dnv.com/) -# %% [markdown] -# +# # ![NSRDB Example](../images/tutorial_1_NSRDB_example.PNG) -# - -# %% [markdown] -# # NREL API Key -# At the [NREL Developer Network](https://developer.nrel.gov/), there are [APIs](https://en.wikipedia.org/wiki/API) to a lot of valuable [solar resources](https://developer.nrel.gov/docs/solar/) like [weather data from the NSRDB](https://developer.nrel.gov/docs/solar/nsrdb/), [operational data from PVDAQ](https://developer.nrel.gov/docs/solar/pvdaq-v3/), or indicative calculations using [PVWatts](https://developer.nrel.gov/docs/solar/pvwatts/). In order to use these resources from NREL, you need to [register for a free API key](https://developer.nrel.gov/signup/). You can test out the APIs using the `DEMO_KEY` but it has limited bandwidth compared to the [usage limit for registered users](https://developer.nrel.gov/docs/rate-limits/). NREL has some [API usage instructions](https://developer.nrel.gov/docs/api-key/), but pvlib has a few builtin functions, like [`pvlib.iotools.get_psm3()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_psm3.html), that wrap the NREL API, and call them for you to make it much easier to use. Skip ahead to the next section to learn more. But before you do... 
-# -# **Please pause now to visit https://developer.nrel.gov/signup/ and get an API key.** -# +# + +# # NLR API Key +# At the [NLR Developer Network](https://developer.nlr.gov/), there are [APIs](https://en.wikipedia.org/wiki/API) to a lot of valuable [solar resources](https://developer.nlr.gov/docs/solar/) like [weather data from the NSRDB](https://developer.nlr.gov/docs/solar/nsrdb/), [operational data from PVDAQ](https://developer.nlr.gov/docs/solar/pvdaq-v3/), or indicative calculations using [PVWatts](https://developer.nlr.gov/docs/solar/pvwatts/). In order to use these resources from NLR, you need to [register for a free API key](https://developer.nlr.gov/signup/). You can test out the APIs using the `DEMO_KEY`, but it has limited bandwidth compared to the [usage limit for registered users](https://developer.nlr.gov/docs/rate-limits/). NLR has some [API usage instructions](https://developer.nlr.gov/docs/api-key/), but pvlib has a few built-in functions, like [`pvlib.iotools.get_psm3()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_psm3.html), that wrap the NLR API and call it for you, making it much easier to use. Skip ahead to the next section to learn more. But before you do... +# +# **Please pause now to visit https://developer.nlr.gov/signup/ and get an API key.** +# # ## Application Programming Interface (API) # What exactly is an API? Nowadays, the phrase is used interchangeably with a "web API", but in general an API is just a recipe for how to interface with an application programmatically, _IE_: in code. An API could be as simple as a function signature or its published documentation, _EG_: the API for the `solarposition` function is you give it an ISO8601 formatted date with a timezone, the latitude, longitude, and elevation as numbers, and it returns the zenith and azimuth as numbers. -# -# A web API is the same, except the application is a web service, that you access at its URL using web methods.
We won't go into too much more detail here, but the most common web method is `GET` which is pretty self explanatory. Look over the [NREL web usage instructions](https://developer.nrel.gov/docs/api-key/) for some examples, but interacting with a web API can be as easy as entering a URL into a browser. Try the URL below to _get_ the PVWatts energy output for a fixed tilt site in [Broomfield, CO](https://goo.gl/maps/awkEcNGzSur9Has18). -# -# https://developer.nrel.gov/api/pvwatts/v6.json?api_key=DEMO_KEY&lat=40&lon=-105&system_capacity=4&azimuth=180&tilt=40&array_type=1&module_type=1&losses=10 -# +# +# A web API is the same, except the application is a web service that you access at its URL using web methods. We won't go into too much more detail here, but the most common web method is `GET`, which is pretty self-explanatory. Look over the [NLR web usage instructions](https://developer.nlr.gov/docs/api-key/) for some examples, but interacting with a web API can be as easy as entering a URL into a browser. Try the URL below to _get_ the PVWatts energy output for a fixed tilt site in [Broomfield, CO](https://goo.gl/maps/awkEcNGzSur9Has18). +# +# https://developer.nlr.gov/api/pvwatts/v6.json?api_key=DEMO_KEY&lat=40&lon=-105&system_capacity=4&azimuth=180&tilt=40&array_type=1&module_type=1&losses=10 +# # In addition to just using your browser, you can also access web APIs programmatically. The most popular Python package to interact with web APIs is [requests](https://docs.python-requests.org/en/master/). There are also free, open-source command-line tools like [cURL](https://curl.se/) and [HTTPie](https://httpie.io/), and a popular nagware/freemium GUI application called [Postman](https://www.postman.com/).
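As a minimal sketch of issuing that same `GET` request from Python, the standard library alone can assemble the query string — no API key or network access is needed just to build the URL; the parameter names and values simply mirror the Broomfield example above:

```python
from urllib.parse import urlencode

# Rebuild the PVWatts example query shown above. DEMO_KEY is the public,
# rate-limited demonstration key, not a personal credential.
base = "https://developer.nlr.gov/api/pvwatts/v6.json"
params = {
    "api_key": "DEMO_KEY",
    "lat": 40,
    "lon": -105,
    "system_capacity": 4,
    "azimuth": 180,
    "tilt": 40,
    "array_type": 1,
    "module_type": 1,
    "losses": 10,
}
url = f"{base}?{urlencode(params)}"
print(url)

# With the third-party `requests` package, the call itself would be:
#   resp = requests.get(base, params=params)
#   data = resp.json()
```

Building the URL separately from sending it makes the request easy to inspect or paste into a browser, which is handy when debugging rate-limit or authentication errors.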
-# -# **If you have an NREL API key please enter it in the next cell.** +# +# **If you have an NLR API key, please enter it in the next cell.** + +# In[ ]: + -# %% -NREL_API_KEY = None # <-- please set your NREL API key here +NLR_API_KEY = None # <-- please set your NLR API key here # note you must use "quotes" around your key, for example: -# NREL_API_KEY = 'DEMO_KEY' # single or double both work fine +# NLR_API_KEY = 'DEMO_KEY' # single or double both work fine # during the live tutorial, we've stored a dedicated key on our server -if NREL_API_KEY is None: +if NLR_API_KEY is None: try: - NREL_API_KEY = os.environ[ - "NREL_API_KEY" + NLR_API_KEY = os.environ[ + "NLR_API_KEY" ] # get dedicated key for tutorial from server except KeyError: - NREL_API_KEY = "DEMO_KEY" # OK for this demo, but better to get your own key + NLR_API_KEY = "DEMO_KEY" # OK for this demo, but better to get your own key + -# %% [markdown] # # Fetching TMYs from the NSRDB -# +# # The NSRDB, one of many sources of weather data intended for PV modeling, is free and easy to access using pvlib. As an example, we'll fetch a TMY dataset for Phoenix, AZ at coordinates [(33.4484, -112.0740)](https://goo.gl/maps/hGV92QHCm5FHJKbf9). -# +# # This function uses [`pvdeg.weather.get()`](https://pvdegradationtools.readthedocs.io/en/latest/_autosummary/pvdeg.weather.html#pvdeg.weather.get), which returns a Python dictionary of metadata and a Pandas dataframe of the timeseries weather data. -# +# # This function internally leverages [`pvlib.iotools.get_psm3()`](https://pvlib-python.readthedocs.io/en/stable/reference/generated/pvlib.iotools.get_psm3.html). However, for some of the NSRDB data, relative humidity is not a given parameter, and `pvdeg` calculates the values from the downloaded data as an internal processing step. -# %% +# In[5]: + + # This cell is for documentation only and is not meant to be executed. # The next cell performs the same request directly with pvlib.
""" weather_db = 'PSM4' weather_id = (33.4484, -112.0740) -weather_arg = {'api_key': NREL_API_KEY, +weather_arg = {'api_key': NLR_API_KEY, 'email': 'user@mail.com', 'year': '2021', 'map_variables': True, @@ -133,27 +144,42 @@ weather_df, meta = pvdeg.weather.get(weather_db, weather_id, **weather_arg) """ -# %% + +# In[ ]: + + weather_df, meta = pvlib.iotools.get_nsrdb_psm4_tmy( latitude=33.4484, longitude=-112.0740, - api_key=NREL_API_KEY, - email="silvana.ovaitt@nrel.gov", # <-- any email works here fine + api_key=NLR_API_KEY, + email="silvana.ovaitt@nlr.gov", # <-- any email works here fine year="tmy", map_variables=True, leap_day=False, ) -# %% + +# In[7]: + + weather_df -# %% + +# In[8]: + + meta -# %% + +# In[9]: + + weather_df.head() -# %% + +# In[10]: + + # Choose the date you want to plot date = "2010-01-01" mask = weather_df.index.date == pd.to_datetime(date).date() @@ -170,29 +196,37 @@ plt.title(f"Weather Data for {date}") plt.show() -# %% + +# In[11]: + + print(weather_df.columns) print(weather_df.index.min(), weather_df.index.max()) print(weather_df.head()) -# %% [markdown] + # # 2. Calculate Installation Standoff - Level 1 -# +# # We use [`pvlib.standards.calc_standoff()`](https://pvdegradationtools.readthedocs.io/en/latest/_autosummary/pvdeg.standards.html#pvdeg.standards.calc_standoff) which takes at minimum the weather data and metadata, and returns the minimum installation distance in centimeters. -# -# +# +# + +# In[12]: + -# %% standoff = pvdeg.standards.standoff(weather_df=weather_df, meta=meta) -# %% + +# In[13]: + + print("Minimum installation distance:", standoff["x"]) -# %% [markdown] + # # 3. Calculate Installation Standoff - Level 2 -# +# # Let's take a closer look at the function and some optional parameters. 
-# +# - level : 1 or 2 (see IEC TS 63126) # - tilt and azimuth : tilt from horizontal of PV module and azimuth in degrees from North # - sky_model : pvlib compatible model for generating sky characteristics (Options: 'isotropic', 'klucher', 'haydavies', 'reindl', 'king', 'perez') @@ -201,10 +235,20 @@ # - x_0 : thermal decay constant [cm] (see documentation) # - wind_speed_factor : Wind speed correction factor to account for different wind speed measurement heights between weather database (e.g. NSRDB) and the temperature model (e.g. SAPM) -# %% +# In[14]: + + standoff = pvdeg.standards.standoff(weather_df=weather_df, meta=meta, T98=70) -# %% + +# In[15]: + + print("Minimum installation distance:", standoff["x"]) -# %% + +# In[ ]: + + + + diff --git a/tutorials/10_workshop_demos/scripts/02_duramat_live_demo.py b/tutorials/10_workshop_demos/scripts/02_duramat_live_demo.py index 1918b84a..1d97bd78 100644 --- a/tutorials/10_workshop_demos/scripts/02_duramat_live_demo.py +++ b/tutorials/10_workshop_demos/scripts/02_duramat_live_demo.py @@ -1,23 +1,30 @@ -# %% [markdown] +#!/usr/bin/env python +# coding: utf-8 + # # DuraMAT Live Demo (HPC) -# +# # ![PVDeg Logo](../images/pvdeg_logo.svg) -# -# +# +# # **Steps:** # 1. Initialize weather data into xarray # 2. Calculate installation standoff for New Mexico # 3.
Plot results -# +# # **Xarray: multi-dimensional data frame** -# +# # ![Xarray](../images/xarray.webp) -# %% +# In[1]: + + import pandas as pd import pvdeg -# %% + +# In[2]: + + # This information helps with debugging and getting support :) import sys import platform @@ -27,20 +34,25 @@ print("Pandas version ", pd.__version__) print("pvdeg version ", pvdeg.__version__) -# %% [markdown] + # # 1 Start distributed compute cluster - DASK -# %% +# In[3]: + + pvdeg.geospatial.start_dask() -# %% + +# In[ ]: + + # Get weather data weather_db = "NSRDB" weather_arg = { "satellite": "Americas", "names": 2022, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -53,17 +65,29 @@ weather_ds, meta_df = pvdeg.weather.get(weather_db, geospatial=True, **weather_arg) -# %% + +# In[ ]: + + weather_ds -# %% + +# In[ ]: + + meta_NM = meta_df[meta_df["state"] == "New Mexico"] -# %% + +# In[ ]: + + meta_NM_sub, gids_NM_sub = pvdeg.utilities.gid_downsampling(meta_NM, 4) weather_NM_sub = weather_ds.sel(gid=meta_NM_sub.index) -# %% + +# In[ ]: + + geo = { "func": pvdeg.standards.standoff, "weather_ds": weather_NM_sub, @@ -72,10 +96,16 @@ standoff_res = pvdeg.geospatial.analysis(**geo) -# %% + +# In[ ]: + + standoff_res -# %% + +# In[ ]: + + fig, ax = pvdeg.geospatial.plot_USA( standoff_res["x"], cmap="viridis", @@ -85,18 +115,20 @@ cb_title="Standoff (cm)", ) -# %% [markdown] + # # Relative Humidity Example - Time dimension -# %% +# In[ ]: + + # State bar of new mexico: (35.16482, -106.58979) weather_db = "NSRDB" -weather_id = (35.16482, -106.58979) # NREL (39.741931, -105.169891) +weather_id = (35.16482, -106.58979) # NLR (39.741931, -105.169891) weather_arg = { "satellite": "Americas", "names": 2022, - "NREL_HPC": True, + "NLR_HPC": True, "attributes": [ "air_temperature", "wind_speed", @@ -111,16 +143,28 @@ weather_db, weather_id, geospatial=False, **weather_arg ) -# %% + +# In[ ]: + + RH_module = pvdeg.humidity.module(weather_df=weather_df, 
meta=meta) -# %% + +# In[ ]: + + RH_module -# %% + +# In[ ]: + + RH_module.plot(ls="--") -# %% + +# In[ ]: + + geo = { "func": pvdeg.humidity.module, "weather_ds": weather_NM_sub, @@ -129,10 +173,16 @@ RH_module = pvdeg.geospatial.analysis(**geo) -# %% + +# In[ ]: + + RH_module -# %% + +# In[ ]: + + # from matplotlib.animation import FuncAnimation # from matplotlib.animation import PillowWriter # import matplotlib.animation as animation @@ -156,7 +206,11 @@ # ims = [imageio.imread(f'./images/RH_animation_{n}.png') for n in range(1, 13)] # imageio.mimwrite(f'../images/RH_animation.gif', ims, format='GIF', duration=1000, loop=10) -# %% [markdown] + # -# %% +# In[ ]: + + + + diff --git a/tutorials/_config.yml b/tutorials/_config.yml index eaee60a7..5e2fbf15 100644 --- a/tutorials/_config.yml +++ b/tutorials/_config.yml @@ -54,7 +54,7 @@ bibtex_bibfiles: # Information about where the book exists on the web repository: - url: https://github.com/NREL/PVDegradationTools + url: https://github.com/NatLabRockies/PVDegradationTools path_to_book: tutorials # Optional path to your book, relative to the repository root branch: main diff --git a/tutorials/myst.yml b/tutorials/myst.yml index c6c73d43..218a6a74 100644 --- a/tutorials/myst.yml +++ b/tutorials/myst.yml @@ -13,10 +13,10 @@ project: - .ipynb_checkpoints - '*/scripts' - '*/scripts/*' - github: NREL/PVDegradationTools + github: NatLabRockies/PVDegradationTools thebe: binder: - repo: NREL/PVDegradationTools + repo: NatLabRockies/PVDegradationTools provider: github ref: main bibliography: diff --git a/tutorials/tools/README.md b/tutorials/tools/README.md index f3c9448b..8a8ff329 100644 --- a/tutorials/tools/README.md +++ b/tutorials/tools/README.md @@ -1,6 +1,6 @@ # PVDeg Tools -This folder contains standalone computational tools for specific PV degradation calculations. These tools are included in the [Jupyter Book](https://nrel.github.io/PVDegradationTools/intro.html) under the Tools section. 
+This folder contains standalone computational tools for specific PV degradation calculations. These tools are included in the [Jupyter Book](https://NatLabRockies.github.io/PVDegradationTools/intro.html) under the Tools section. For tutorials and guided examples, see the [tutorials](../tutorials/) directory. @@ -13,14 +13,14 @@ General degradation calculation tools and utilities. Calculations for oxygen ingress through edge seals in PV modules. ### Tools - Module Standoff for IEC TS 63126 -**⚠️ Requires NREL HPC Access** +**⚠️ Requires NLR HPC Access** -Calculation of module standoff distance according to IEC TS 63126. This tool performs geospatial analysis across multiple locations and requires access to NREL's High Performance Computing (HPC) resources for execution. +Calculation of module standoff distance according to IEC TS 63126. This tool performs geospatial analysis across multiple locations and requires access to NLR's High Performance Computing (HPC) resources for execution. ## Requirements Most tools can run locally with standard dependencies. Tools marked with ⚠️ require: -- NREL HPC cluster access +- NLR HPC cluster access - Additional authentication/credentials - Specific environment setup