fix(gen): remove remaining deprecated macros and update #5262

Open · wants to merge 1 commit into base: `main`
3 changes: 2 additions & 1 deletion docs/TROUBLESHOOTING_TEMPLATE.mdx
@@ -9,6 +9,7 @@ dates:
 categories:
 - XXXXX
 ---
+import TroubleshootingTicket from '@macros/general/troubleshooting-ticket.mdx'
 
 ## Problem
 
@@ -45,4 +46,4 @@ Provide links to documentation pages that may be useful if the issue has not bee
 
 If required, add a paragraph containing elements to provide the support with when creating a support ticket.
 
-<Macro id="troubleshooting-ticket" />
+<TroubleshootingTicket />
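Every hunk in this PR applies the same migration: the deprecated `<Macro id="…" />` shortcode, resolved by string id at build time, is replaced by an explicit MDX import and a component reference. A minimal sketch of the pattern, using the paths from the template diff above (the `@macros` path alias is assumed to be configured in the docs site's bundler):

```mdx
{/* Before: deprecated shortcode, looked up by id at build time */}
<Macro id="troubleshooting-ticket" />

{/* After: explicit import plus a standard MDX component */}
import TroubleshootingTicket from '@macros/general/troubleshooting-ticket.mdx'

<TroubleshootingTicket />
```

One likely benefit of the explicit form is that a missing or renamed macro file fails at build time as an unresolved import, rather than rendering silently broken output.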
5 changes: 5 additions & 0 deletions macros/ai/ai-generative-apis.mdx
@@ -0,0 +1,5 @@
+---
+macro: ai-generative-apis
+---
+
+Generative APIs provide access to pre-configured serverless endpoints of the most popular AI models, hosted in European data centers and priced per 1M tokens used.
5 changes: 5 additions & 0 deletions macros/ai/ai-managed-inference.mdx
@@ -0,0 +1,5 @@
---
macro: ai-managed-inference
---

Effortlessly deploy AI models on a sovereign infrastructure, manage and scale inference with full data privacy. Start now with a simple interface for creating dedicated endpoints.
4 changes: 3 additions & 1 deletion pages/key-manager/faq.mdx
@@ -6,6 +6,8 @@ dates:
 productIcon: KmsProductIcon
 ---
 
+import Encryption from '@macros/key-manager/encryption.mdx'
+
 ## Why should you use Scaleway Key Manager?
 
 Key Manager helps organizations achieve secure key management by handling low-level and error-prone cryptographic details for you.
@@ -28,7 +30,7 @@ Key Manager supports the three following cryptographic operations:
 
 ## Which algorithms and key usage does Key Manager support?
 
-<Macro id="encryption" />
+<Encryption />
 
 Keys with a [key usage](/key-manager/concepts/#key-usage) set to `symmetric_encryption` are **used to encrypt and decrypt data**.
 
@@ -27,6 +27,8 @@ import NetworkLoadBalancer from '@macros/network/load-balancer.mdx'
 import NetworkPublicGateways from '@macros/network/public-gateways.mdx'
 import StorageManagedDatabases from '@macros/storage/managed-databases.mdx'
 import StorageObjectStorage from '@macros/storage/object-storage.mdx'
+import AiManagedInference from '@macros/ai/ai-managed-inference.mdx'
+import AiGenerativeApis from '@macros/ai/ai-generative-apis.mdx'
 
 
 Every [Organization](/iam/concepts/#organization) has quotas, which are limits on the number of Scaleway resources they can use. Below is a list of quotas available for each resource.
@@ -173,7 +175,8 @@ At Scaleway, quotas are applicable per [Organization](/iam/concepts/#organizatio
 
 ## Managed Inference
 
-<Macro id="ai-managed-inference" />
+<AiManagedInference />
+
 Managed Inference Deployments are limited to a maximum number of nodes, depending on the node types provisioned.
 
 <Message type="important">
@@ -187,7 +190,7 @@ Managed Inference Deployments are limited to a maximum number of nodes, dependin
 
 ## Generative APIs
 
-<Macro id="ai-generative-apis" />
+<AiGenerativeApis />
 
 Generative APIs are rate limited based on:
 - Tokens per minute (total input and output tokens)