SDK regeneration
fern-api[bot] committed Sep 13, 2024
1 parent 6ceb338 commit 89b6ebc
Showing 28 changed files with 663 additions and 571 deletions.
307 changes: 154 additions & 153 deletions poetry.lock

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
 [tool.poetry]
 name = "cohere"
-version = "5.9.1"
+version = "5.9.2"
 description = ""
 readme = "README.md"
 authors = []
103 changes: 84 additions & 19 deletions reference.md
@@ -28,11 +28,11 @@ To learn how to use the Chat API with Streaming and RAG follow our [Text Generat

 ```python
 from cohere import (
+    ChatbotMessage,
     ChatConnector,
     ChatStreamRequestConnectorsSearchOptions,
     Client,
-    Message_Chatbot,
-    ResponseFormat_Text,
+    TextResponseFormat,
     Tool,
     ToolCall,
     ToolParameterDefinitionsValue,
@@ -48,7 +48,7 @@ response = client.chat_stream(
     model="string",
     preamble="string",
     chat_history=[
-        Message_Chatbot(
+        ChatbotMessage(
             message="string",
             tool_calls=[
                 ToolCall(
@@ -108,7 +108,7 @@ response = client.chat_stream(
         )
     ],
     force_single_step=True,
-    response_format=ResponseFormat_Text(),
+    response_format=TextResponseFormat(),
     safety_mode="CONTEXTUAL",
 )
 for chunk in response:
@@ -135,6 +135,14 @@ Text input for the model to respond to.
Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>

<dl>
<dd>

**accepts:** `typing.Optional[typing.Literal["text/event-stream"]]` — Pass `text/event-stream` to receive the streamed response as server-sent events. By default, events are `\n`-delimited.

</dd>
</dl>
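The `accepts` parameter above switches the stream away from the default `\n`-delimited framing. As a hedged sketch of what that default framing looks like, newline-delimited JSON can be split and parsed per line (the event payloads below are placeholders, not actual Chat API output):

```python
import json

# Illustrative raw bytes in the default newline-delimited framing; the
# event shapes here are made up for the sketch, not real API payloads.
raw = b'{"event_type": "text-generation", "text": "Hi"}\n{"event_type": "stream-end"}\n'

# Each non-empty line is one JSON event.
events = [json.loads(line) for line in raw.splitlines() if line.strip()]
print(events[0]["event_type"])  # → text-generation
```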

@@ -578,14 +586,15 @@ To learn how to use the Chat API with Streaming and RAG follow our [Text Generat
<dd>

 ```python
-from cohere import Client
+from cohere import Client, ToolMessage

 client = Client(
     client_name="YOUR_CLIENT_NAME",
     token="YOUR_TOKEN",
 )
 client.chat(
     message="Can you give me a global market overview of solar panels?",
+    chat_history=[ToolMessage(), ToolMessage()],
     prompt_truncation="OFF",
     temperature=0.3,
 )
@@ -611,6 +620,14 @@ Text input for the model to respond to.
Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>

<dl>
<dd>

**accepts:** `typing.Optional[typing.Literal["text/event-stream"]]` — Pass `text/event-stream` to receive the streamed response as server-sent events. By default, events are `\n`-delimited.

</dd>
</dl>

@@ -1592,14 +1609,7 @@ client = Client(
     client_name="YOUR_CLIENT_NAME",
     token="YOUR_TOKEN",
 )
-client.embed(
-    texts=["string"],
-    images=["string"],
-    model="string",
-    input_type="search_document",
-    embedding_types=["float"],
-    truncate="NONE",
-)
+client.embed()

```
</dd>
@@ -1615,7 +1625,19 @@ client.embed(
<dl>
<dd>

-**texts:** `typing.Sequence[str]` — An array of strings for the model to embed. Maximum number of texts per call is `96`. We recommend reducing the length of each text to be under `512` tokens for optimal quality.
+**texts:** `typing.Optional[typing.Sequence[str]]` — An array of strings for the model to embed. Maximum number of texts per call is `96`. We recommend reducing the length of each text to be under `512` tokens for optimal quality.

</dd>
</dl>
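Since `texts` is capped at 96 strings per call, longer lists need client-side batching before each `client.embed(texts=batch)` call. A minimal sketch (the `text_batches` helper is ours for illustration, not part of the SDK):

```python
from typing import Iterator, Sequence

def text_batches(texts: Sequence[str], size: int = 96) -> Iterator[Sequence[str]]:
    # Yield successive slices of `texts` no larger than the per-call cap.
    for start in range(0, len(texts), size):
        yield texts[start:start + size]

# 200 texts split into batches of at most 96.
batch_sizes = [len(batch) for batch in text_batches(["some document"] * 200)]
print(batch_sizes)  # → [96, 96, 8]
```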

<dl>
<dd>

**images:** `typing.Optional[typing.Sequence[str]]`

An array of image data URIs for the model to embed. Maximum number of images per call is `1`.

The image must be a valid [data URI](https://developer.mozilla.org/en-US/docs/Web/URI/Schemes/data), in either `image/jpeg` or `image/png` format, and at most 5MB in size.

</dd>
</dl>
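A minimal sketch of building a data URI of the required form from raw PNG bytes (the placeholder bytes stand in for a real image file read from disk):

```python
import base64

# Placeholder bytes standing in for real PNG file contents
# (e.g. open("image.png", "rb").read()); only the URI framing matters here.
png_bytes = b"\x89PNG\r\n\x1a\n"

# data URI scheme: "data:<media type>;base64,<base64-encoded payload>"
data_uri = "data:image/png;base64," + base64.b64encode(png_bytes).decode("ascii")
print(data_uri)
```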
@@ -2312,8 +2334,8 @@ Generates a message from the model in response to a provided conversation. To le
<dd>

 ```python
-from cohere import Client, ResponseFormat2_Text
-from cohere.v2 import ChatMessage2_User, Tool2, Tool2Function
+from cohere import Client, TextResponseFormat2
+from cohere.v2 import Tool2, Tool2Function, UserChatMessage2

 client = Client(
     client_name="YOUR_CLIENT_NAME",
@@ -2322,7 +2344,7 @@ client = Client(
 response = client.v2.chat_stream(
     model="string",
     messages=[
-        ChatMessage2_User(
+        UserChatMessage2(
             content="string",
             documents=[{"string": {"key": "value"}}],
         )
@@ -2337,7 +2359,8 @@ response = client.v2.chat_stream(
         )
     ],
     citation_mode="FAST",
-    response_format=ResponseFormat2_Text(),
+    response_format=TextResponseFormat2(),
+    safety_mode="CONTEXTUAL",
     max_tokens=1,
     stop_sequences=["string"],
     temperature=1.1,
@@ -2408,6 +2431,24 @@ Dictates the approach taken to generating citations as part of the RAG flow by a

**response_format:** `typing.Optional[ResponseFormat2]`

</dd>
</dl>

<dl>
<dd>

**safety_mode:** `typing.Optional[V2ChatStreamRequestSafetyMode]`

Used to select the [safety instruction](/docs/safety-modes) inserted into the prompt. Defaults to `CONTEXTUAL`.
When `NONE` is specified, the safety instruction will be omitted.

Safety modes are not yet configurable in combination with `tools`, `tool_results` and `documents` parameters.

**Note**: This parameter is only compatible with models [Command R 08-2024](/docs/command-r#august-2024-release), [Command R+ 08-2024](/docs/command-r-plus#august-2024-release) and newer.

Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>
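A hedged sketch of the defaulting behaviour described above. `SafetyMode` here is a local stand-in for the SDK's `V2ChatStreamRequestSafetyMode`, covering only the two values the description names:

```python
import typing

# Local stand-in type; only CONTEXTUAL and NONE are taken from the
# parameter description above.
SafetyMode = typing.Literal["CONTEXTUAL", "NONE"]

def resolve_safety_mode(requested: typing.Optional[SafetyMode]) -> SafetyMode:
    # Unset requests fall back to the documented default, CONTEXTUAL;
    # NONE omits the safety instruction entirely.
    return requested if requested is not None else "CONTEXTUAL"

print(resolve_safety_mode(None))  # → CONTEXTUAL
```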

@@ -2557,14 +2598,20 @@ Generates a message from the model in response to a provided conversation. To le

 ```python
 from cohere import Client
+from cohere.v2 import ToolChatMessage2

 client = Client(
     client_name="YOUR_CLIENT_NAME",
     token="YOUR_TOKEN",
 )
 client.v2.chat(
     model="model",
-    messages=[],
+    messages=[
+        ToolChatMessage2(
+            tool_call_id="messages",
+            tool_content=["messages"],
+        )
+    ],
 )

```
@@ -2624,6 +2671,24 @@ Dictates the approach taken to generating citations as part of the RAG flow by a

**response_format:** `typing.Optional[ResponseFormat2]`

</dd>
</dl>

<dl>
<dd>

**safety_mode:** `typing.Optional[V2ChatRequestSafetyMode]`

Used to select the [safety instruction](/docs/safety-modes) inserted into the prompt. Defaults to `CONTEXTUAL`.
When `NONE` is specified, the safety instruction will be omitted.

Safety modes are not yet configurable in combination with `tools`, `tool_results` and `documents` parameters.

**Note**: This parameter is only compatible with models [Command R 08-2024](/docs/command-r#august-2024-release), [Command R+ 08-2024](/docs/command-r-plus#august-2024-release) and newer.

Compatible Deployments: Cohere Platform, Azure, AWS Sagemaker/Bedrock, Private Deployments


</dd>
</dl>

