# Build a book reader (Q&A LLM chain)
├── AUTHORS.md
├── LICENSE
├── README.md
├── config
│ ├── `requirements.txt`: required packages
│ └── `token_access.py`: stores your access tokens (add this filename to `.gitignore` to keep your tokens out of version control)
├── data
├── notebooks
│ ├── `00-book-time-machine_flan_t5_large.ipynb`: using `google/flan-t5-large`
│ ├── `01-book-time-machine-llama2_7B.ipynb`: using `llama2_7B`
│ └── `02-book-time-machine-mistral_7B.ipynb`: using `mistral_7B`
└── model
    └── cache: stores downloaded models
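The tree above mentions `config/token_access.py` for storing tokens. A minimal sketch of what it might contain, assuming you need a Hugging Face access token (the variable name `HF_TOKEN` is an assumption; match it to whatever your notebooks import):

```python
# config/token_access.py
# Keep this file out of version control: add "config/token_access.py" to .gitignore.
# HF_TOKEN is a hypothetical name; use whatever your notebooks expect.
HF_TOKEN = "hf_your_token_here"  # replace with your own Hugging Face token
```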
```shell
# clone the repository
git clone https://github.com/dujm/book-reader.git

# create an env (here I name it "llm") with a stable Python version (e.g. Python 3.8)
conda create -n llm python=3.8

# activate the env
conda activate llm

# install required packages
pip install -r ./config/requirements.txt
```
- If you want to use JupyterLab (alternatively, you can use `jupyter notebook`):

```shell
# install jupyterlab
conda install -c conda-forge jupyterlab

# add the conda environment to jupyter lab
conda install ipykernel
ipython kernel install --user --name=llm

# open jupyter lab
jupyter lab
```
- Ollama is used in:
  - `notebooks/01-book-time-machine-llama2_7B.ipynb`
  - `notebooks/02-book-time-machine-mistral_7B.ipynb`
- The steps below are for macOS. See the Ollama documentation if you use another operating system.
- Download the installer from the Ollama website
- Open the Ollama app
- Select a model from the Model library. I selected the `llama2` and `mistral` models.
- Download them in the terminal:
```shell
# pull llama2 model
ollama pull llama2

# pull mistral model
ollama pull mistral
```
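Once the models are pulled and the app is running, Ollama serves a local REST API at `http://localhost:11434`. A small sketch of building a request payload for its `/api/generate` endpoint (only the payload is constructed here, since sending it requires the Ollama app to be running):

```python
import json

def build_generate_request(model, prompt):
    # Request body for Ollama's local endpoint: POST http://localhost:11434/api/generate
    # "stream": False asks for one complete JSON response instead of streamed chunks.
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_generate_request("llama2", "Who is the Time Traveller in the book?")
print(json.dumps(payload, indent=2))
```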
- Open the Ollama app (it must be running so the notebooks can reach the local models)
- Go to `notebooks/` and run the notebooks
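The notebooks build a Q&A chain over the book text. A dependency-free sketch of the underlying retrieve-then-prompt pattern (the function names are illustrative, and the keyword-overlap scoring below stands in for the embedding search a real chain would use):

```python
def chunk_text(text, size=500):
    # Split the book into fixed-size character chunks.
    return [text[i:i + size] for i in range(0, len(text), size)]

def retrieve(chunks, question, k=2):
    # Score each chunk by word overlap with the question (a stand-in for embeddings).
    q_words = set(question.lower().split())
    scored = sorted(chunks,
                    key=lambda c: len(q_words & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(context_chunks, question):
    # Assemble the prompt the LLM (flan-t5, llama2, or mistral) would answer.
    context = "\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}\nAnswer:"

book = "The Time Traveller explained his theory of the fourth dimension to us."
question = "What is the fourth dimension?"
prompt = build_prompt(retrieve(chunk_text(book, size=40), question), question)
```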