chore: wip #640
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##             main     #640      +/-   ##
==========================================
- Coverage   69.85%   69.19%   -0.67%
==========================================
  Files          98       98
  Lines       13394    13515     +121
==========================================
- Hits         9357     9352       -5
- Misses       4037     4163     +126
```
```
curl localhost:3000/run --json @map_reduce.json
```
To better encode the [MapReduce][map-reduce] example from the [Crowdforge][crowdforge] paper, I implemented a `prompt_chain` Wasm/WASI function registered on the Host. It takes in a system prompt (e.g. "You are a journalist writing about cities."), an input (e.g. an ongoing article), a map step prompt with a `{{text}}` placeholder that is filled in, a reduce step that folds over (combines) the text(s) generated by the map step, and an optional LLaMA model stored as a [`gguf`][gguf] file. If the optional model path is not provided, the Host falls back to the default `Meta-Llama-3-8B-Instruct.Q4_0.gguf` model.
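For context, the `curl` call above posts a JSON payload describing the chain. Below is a minimal sketch of what `map_reduce.json` might contain, matching the parameters described here; the field names (`function`, `system_prompt`, `input`, `map_prompt`, `reduce_prompt`, `model_path`) are illustrative assumptions, not the actual schema from this PR:

```sh
# Hypothetical map_reduce.json payload; field names are assumptions, not the PR's actual schema.
cat > map_reduce.json <<'EOF'
{
  "function": "prompt_chain",
  "system_prompt": "You are a journalist writing about cities.",
  "input": "An ongoing article about public transit in mid-sized cities...",
  "map_prompt": "Summarize the key facts in the following section: {{text}}",
  "reduce_prompt": "Combine the summaries into a single coherent paragraph.",
  "model_path": "Meta-Llama-3-8B-Instruct.Q4_0.gguf"
}
EOF
curl localhost:3000/run --json @map_reduce.json
```

Here `model_path` stands in for the optional gguf path; omitting it would exercise the default-model fallback described above.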