Public documentation - first stable version (#18)
* Add sphinx generated docs. It includes all the information on the repository README for now
* Add requirements for docs
* First stable docs

---------

Co-authored-by: Leticia Martín-Fuertes Moreno <[email protected]>
chucheria and nimbusaeta authored Mar 20, 2024
1 parent f56711e commit fdb936f
Showing 21 changed files with 816 additions and 288 deletions.
13 changes: 8 additions & 5 deletions Makefile
@@ -10,10 +10,13 @@ format:
	black promptmeteo/
	black tests/

docs:
	mkdocs build
	mkdocs serve
	mkdocs gh-deploy
docsetup:
	pip install -e ".[docs]"
	pip install -e ".[aws]"
	sphinx-apidoc -f -o docs/source/ promptmeteo

html:
	$(MAKE) -C docs html

clean:
	find . -name '*.pyc' -exec rm -f {} +
@@ -22,7 +25,7 @@ clean:
	find . -name '__pycache__' -exec rm -fr {} +
	rm -f .coverage
	rm -f .coverage.*
	rm -rf ./build
	rm -rf ./docs/build
	rm -rf ./.pytest_cache

test:
20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
7 changes: 7 additions & 0 deletions docs/apidoc/conf.rst
@@ -0,0 +1,7 @@
conf module
===========

.. automodule:: conf
   :members:
   :undoc-members:
   :show-inheritance:
7 changes: 7 additions & 0 deletions docs/apidoc/modules.rst
@@ -0,0 +1,7 @@
source
======

.. toctree::
   :maxdepth: 4

   conf
35 changes: 35 additions & 0 deletions docs/make.bat
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=source
set BUILDDIR=build

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)

if "%1" == "" goto help

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
21 changes: 21 additions & 0 deletions docs/source/classes.rst
@@ -0,0 +1,21 @@
👻 Classes
==============================

.. autoclass:: promptmeteo.APIGenerator
   :members:

.. autoclass:: promptmeteo.APIFormatter
   :members:

.. autoclass:: promptmeteo.CodeGenerator
   :members:

.. autoclass:: promptmeteo.DocumentClassifier
   :members:

.. autoclass:: promptmeteo.DocumentQA
   :members:

.. autoclass:: promptmeteo.Summarizer
   :members:

44 changes: 44 additions & 0 deletions docs/source/conf.py
@@ -0,0 +1,44 @@
# Configuration file for the Sphinx documentation builder.
#
# For the full list of built-in configuration values, see the documentation:
# https://www.sphinx-doc.org/en/master/usage/configuration.html
import os
import sys
from datetime import datetime

sys.path.insert(0, os.path.abspath('../..')) # Source code dir relative to this file
# -- Project information -----------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#project-information

project = 'Promptmeteo'
copyright = f"2023 - {datetime.now().year}, Paradigma Digital"
author = 'Ángel Delgado'
release = '0.1.1'

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration

extensions = [
    'sphinx.ext.duration',
    'sphinx.ext.doctest',
    'sphinx.ext.autodoc',
    'sphinx.ext.autosummary',
    'sphinx.ext.napoleon',
]

templates_path = ['_templates']
exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
napoleon_use_admonition_for_examples = True
napoleon_use_admonition_for_notes = True

# -- Options for HTML output -------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#options-for-html-output

html_theme = 'furo'
html_static_path = ['_static']
html_theme_options = {
    "navigation_with_keys": True,
}
html_title = "🔥🧔"
pygments_style = "default"
pygments_dark_style = "monokai"
105 changes: 105 additions & 0 deletions docs/source/definition.rst
@@ -0,0 +1,105 @@
🤔 What is this library for?
==============================

**TL;DR: Industrialize projects powered by LLMs easily.**

LLMs have the capability to address various tasks when provided with specific instructions in the form of input prompts. They can function as a "reasoning engine" for constructing applications. However, deploying and industrializing these applications poses significant challenges for two main reasons.

Firstly, prompts typically encapsulate application logic in their definition, yet they are treated merely as input arguments. This means that a poorly formulated prompt input has the potential to disrupt the application.

Secondly, crafting specific prompts for each task is not only a laborious task but also a complex one. Minor alterations in the input prompt can result in different outcomes, rendering them highly error-prone. Additionally, when composing prompts, considerations extend beyond the task itself to include factors such as the specific LLM being used, the model's capacity, and other relevant aspects.

🚀 How do we do it?
----------------------

**TL;DR: Treating prompts and code equally!!!**

Promptmeteo aims to address the aforementioned issues by dividing the prompt definition into two distinct parts: the task logic, coded within prompt templates, and the concrete problem, included as argument variables. Promptmeteo incorporates high-level objects for various tasks, implemented through `.py` and `.prompt` files.

🏠 Prebuilt tasks
^^^^^^^^^^^^^^^^^^

The project incorporates high-level objects designed to address various NLP tasks, including text classification, named entity recognition, and code generation. These objects only require configuration parameters for execution, eliminating the need to parse the output from the LLMs.

⚙️ Ease of Deployment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Promptmeteo modules adhere to a model interface similar to Scikit-Learn. By defining an interface with independent methods for training, predicting, saving, and loading the model, Promptmeteo enables training in a separate pipeline from prediction. This facilitates the reuse of conventional ML pipelines for LLM projects.
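To make the interface idea concrete, here is a minimal, self-contained sketch of the Scikit-Learn-style pattern described above. The class and method names are illustrative stand-ins, not the real Promptmeteo API; in particular, ``predict`` is a placeholder where a real module would prompt an LLM.

```python
# Illustrative sketch of a Scikit-Learn-style model interface with
# independent train / predict / save / load steps. Hypothetical names:
# this is NOT the actual Promptmeteo API.
import pickle


class TaskModel:
    """Minimal model whose training and prediction can run in separate pipelines."""

    def __init__(self):
        self.examples = []

    def train(self, examples, annotations):
        # A training pipeline can run this step on its own...
        self.examples = list(zip(examples, annotations))
        return self

    def predict(self, texts):
        # ...while a separate prediction pipeline only calls predict().
        # Placeholder logic: a real module would build a prompt and call an LLM.
        return [[self.examples[0][1]] if self.examples else [None] for _ in texts]

    def save_model(self, path):
        # Persist the training result so prediction can reload it later.
        with open(path, "wb") as f:
            pickle.dump(self.examples, f)
        return self

    def load_model(self, path):
        with open(path, "rb") as f:
            self.examples = pickle.load(f)
        return self
```

Because each method stands alone, a conventional ML pipeline (train job, artifact store, inference service) maps onto this interface directly.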

📦 Model Artifacts
^^^^^^^^^^^^^^^^^^^^^^^^^^

To enhance results, LLMs can incorporate examples in their prompts. Promptmeteo supports training with examples, and for reproducibility, the training process can be stored as a binary model artifact. This allows storing and reusing training results multiple times with new data. The training process stores embeddings from the input text in a vector database, such as FAISS.
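The artifact idea can be sketched as follows. Promptmeteo uses a real vector database such as FAISS; the character-count "embedding" below is a hypothetical stand-in so the example stays self-contained.

```python
# Sketch of storing training examples as a reusable binary artifact.
# The embed() function is a toy stand-in for a real embedding model.
import pickle


def embed(text):
    # Toy embedding: a 26-dim character-frequency vector (not a real model).
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - 97] += 1.0
    return vec


class ExampleStore:
    """Stores example embeddings and labels; serializable as one artifact."""

    def __init__(self):
        self.vectors, self.labels = [], []

    def add(self, texts, labels):
        for text, label in zip(texts, labels):
            self.vectors.append(embed(text))
            self.labels.append(label)

    def save(self, path):
        # The whole training result becomes a single binary artifact...
        with open(path, "wb") as f:
            pickle.dump((self.vectors, self.labels), f)

    @classmethod
    def load(cls, path):
        # ...that can be reloaded any number of times to predict on new data.
        store = cls()
        with open(path, "rb") as f:
            store.vectors, store.labels = pickle.load(f)
        return store
```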

⚙️ LLMs Integration
^^^^^^^^^^^^^^^^^^^^^^^^^^

Promptmeteo integrates different LLMs through LangChain. This includes models that can be executed locally and remote API calls from providers like OpenAI and HuggingFace.

📄 Prompt Templating
^^^^^^^^^^^^^^^^^^^^^^^^^^

Establishing a concrete format for creating prompts in Promptmeteo (`.prompt`) not only facilitates programmatic use but also enables versioning of prompts. This approach aids in understanding changes when they occur and allows for the definition of code tests oriented toward prompt testing. This testing encompasses aspects such as validating language use and ensuring the prompt size is appropriate for the model.
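The kind of prompt-oriented test this enables can be sketched as below. The template format and the size budget are assumptions for illustration, not the actual `.prompt` specification.

```python
# Hedged sketch of treating a prompt like code: a template plus tests that
# validate its required variables and its size against a model budget.
# The template text and MAX_PROMPT_CHARS value are hypothetical.
PROMPT_TEMPLATE = """\
You are a document classifier.
Classify the following text into one of: {labels}.
Text: {text}
"""

MAX_PROMPT_CHARS = 2000  # assumed budget for the target model


def validate_template(template):
    """Return a list of problems found in a prompt template (empty if OK)."""
    errors = []
    for var in ("{labels}", "{text}"):
        if var not in template:
            errors.append(f"missing template variable {var}")
    if len(template) > MAX_PROMPT_CHARS:
        errors.append("template exceeds the model's prompt budget")
    return errors
```

Versioning the template file alongside such checks means a prompt change that breaks the task logic fails in CI, just like a code change would.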


📋 Current capabilities
----------------------------

✅ Available tasks
^^^^^^^^^^^^^^^^^^

The current available tasks in Promptmeteo are:

.. list-table:: Tasks
   :header-rows: 1

   * - task type
     - description
   * - ``DocumentQA``
     - Document-level question answering
   * - ``DocumentClassifier``
     - Document-level classification
   * - ``CodeGenerator``
     - Code generation
   * - ``APIGenerator``
     - REST API generation
   * - ``APIFormatter``
     - REST API correction
   * - ``Summarizer``
     - Text summarization

✅ Available Models
^^^^^^^^^^^^^^^^^^^^^^^^

The current available ``model_name`` and ``language`` values are:

.. list-table:: Models
   :header-rows: 1

   * - provider
     - name
     - languages
   * - openai
     - gpt-3.5-turbo-16k
     - es, en
   * - azure
     - gpt-3.5-turbo-16k
     - es, en
   * - hf_hub_api
     - google/flan-t5-xxl
     - es, en
   * - hf_pipeline
     - google/flan-t5-small
     - es, en
   * - google
     - text-bison
     - es, en
   * - google
     - text-bison@001
     - es, en
   * - google
     - text-bison-32k
     - es, en
   * - bedrock
     - anthropic.claude-v2
     - es, en
24 changes: 24 additions & 0 deletions docs/source/index.rst
@@ -0,0 +1,24 @@
.. Promptmeteo documentation master file, created by
   sphinx-quickstart on Fri Mar 1 15:08:31 2024.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Promptmeteo 🔥🧔
=======================================

Promptmeteo is a Python library for prompt engineering built over LangChain. It simplifies the utilization of large language models (LLMs) for various tasks through a low-code interface. To achieve this, Promptmeteo can employ different LLM models and dynamically generate prompts for specific tasks based on just a few configuration parameters.

.. toctree::
   :maxdepth: 2
   :caption: Contents:

   definition
   install
   quickstart


.. toctree::
   :maxdepth: 1
   :caption: Classes

   classes
81 changes: 81 additions & 0 deletions docs/source/install.rst
@@ -0,0 +1,81 @@
⚙️ Installation and configuration
=================================

.. _install:

Install
------------

To install the stable version of the library use `pip`_.

.. _pip: https://pypi.org/

.. code-block:: console

   (.venv) $ pip install promptmeteo

Configure credentials
------------------------

Create a ``.env`` file with the following variables, depending on the LLM provider:

Google Cloud
^^^^^^^^^^^^^^^^

First, create a `Service Account <https://cloud.google.com/vertex-ai/docs/general/custom-service-account#configure>`_ with the role ``Vertex AI User``.

Once created, generate a key, store it locally and reference its path in the ``.env`` file:

.. code-block:: console

   GOOGLE_CLOUD_PROJECT_ID="MY_GOOGLE_LLM_PROJECT_ID"
   GOOGLE_APPLICATION_CREDENTIALS="PATH_TO_SERVICE_ACCOUNT_KEY_FILE.json"

OpenAI
^^^^^^^^^^^^

Create your secret API key on your `User settings page <https://platform.openai.com/account/api-keys>`_.

Set the key's value in your ``.env`` file:

.. code-block:: console

   OPENAI_API_KEY="MY_OPENAI_API_KEY"

You can also pass ``openai_api_key`` as a named parameter.

Hugging Face
^^^^^^^^^^^^^^^^

Create an Access Token on your `User settings page <https://huggingface.co/settings/tokens>`_.

.. code-block:: console

   HUGGINGFACEHUB_API_TOKEN="MY_HF_API_KEY"

You can also pass ``huggingfacehub_api_token`` as a named parameter.

AWS Bedrock
^^^^^^^^^^^^^^^^

Create your access keys under your user's security credentials in AWS.

Then, for Linux and macOS, write the files ``~/.aws/config`` and ``~/.aws/credentials``; for Windows, ``%USERPROFILE%\.aws\config`` and ``%USERPROFILE%\.aws\credentials``:

In ``credentials``:

.. code-block:: console

   [default]
   aws_access_key_id = <YOUR_CREATED_AWS_KEY>
   aws_secret_access_key = <YOUR_CREATED_AWS_SECRET_KEY>

In ``config``:

.. code-block:: console

   [default]
   region = <AWS_REGION>