Offline Function Calling Agent

This CLI helps you interact with a function-calling enabled offline LLM using Ollama. It uses the @offline-function-calling/sdk library.

Installation

For now, installation is done by cloning the git repository, installing the dependencies (uv is recommended), and running the main.py script.

git clone https://github.com/offline-function-calling/cli && cd cli
uv run main.py --model gemma3:27b-fc --tools ./tools

Note that the gemma3:*-fc models must be created using the Modelfiles in the models/ directory of this repository:

ollama create gemma3:27b-fc --file models/gemma3-27b-fc.modelfile
ollama create gemma3:12b-fc --file models/gemma3-12b-fc.modelfile

Note that the 27B-parameter model is recommended only if you have 20-24 GB of RAM or more.

Usage

You can add more tools by creating files in the tools directory you pass via the --tools option. Each file must contain one or more Python functions with docstrings that describe what the tool does and what its parameters are.
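A tool file might look like the following. This is a hypothetical sketch: the function name, parameters, and docstring wording are made up for illustration, and the exact docstring conventions the SDK parses may differ.

```python
# tools/weather.py - a hypothetical example tool file.
# The function's docstring is what the model sees: it should describe
# what the tool does and document each parameter.

def get_current_weather(city: str, units: str = "metric") -> dict:
    """Fetches the current weather for a given city.

    Parameters:
        city: the name of the city to look up
        units: "metric" for Celsius or "imperial" for Fahrenheit
    """
    # A real tool would call a weather API here; this sketch returns
    # a fixed response so the example stays self-contained.
    return {"city": city, "temperature": 21, "units": units}
```

Dropping a file like this into the tools directory would make the function available to the model on the next run, assuming the SDK discovers tools by scanning that directory.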
