ArthurzKV/tinyai

TinyAI



The 5KB AI SDK. Zero dependencies. Just works.

"The missing middle between raw API calls and over-engineered frameworks."


npm install tinyai

Why TinyAI?

// Vercel AI SDK - 186KB, lots of setup
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: 'Summarize this text...',
});

// TinyAI - 5KB, one line
import { summarize } from 'tinyai';
const summary = await summarize(text);

Quick Start

import { tinyai } from 'tinyai';

// Configure once
const ai = tinyai({
  provider: 'openai',
  apiKey: process.env.OPENAI_API_KEY // or auto-detects from env
});

// Use anywhere
const summary = await ai.summarize(longArticle);
const sentiment = await ai.classify(review, ['positive', 'negative', 'neutral']);
const data = await ai.extract(email, { name: 'string', date: 'date', amount: 'number' });
const spanish = await ai.translate(text, 'spanish');
const answer = await ai.ask("What's the capital of France?");
const embedding = await ai.embed(text);

Free Options

TinyAI supports completely free AI providers:

Groq (Free Cloud)

Fast, free cloud AI with generous limits (14K tokens/min).

import { tinyai } from 'tinyai';

// Get free API key: https://console.groq.com/keys
const ai = tinyai({
  provider: 'groq',
  apiKey: process.env.GROQ_API_KEY
});

const summary = await ai.summarize(text); // Free!

Models: Llama 3.3 70B, Llama 3.1 8B, Mixtral, Gemma2

Ollama (Free Local)

Run AI 100% locally on your machine. No API key needed.

# 1. Install Ollama: https://ollama.ai
# 2. Pull a model
ollama pull llama3.2

import { tinyai } from 'tinyai';

// No API key needed!
const ai = tinyai({
  provider: 'ollama',
  model: 'llama3.2'  // or mistral, codellama, phi, gemma2
});

const summary = await ai.summarize(text); // 100% free, runs locally

Provider Comparison

Provider | Cost      | Speed               | Privacy    | Setup
---------|-----------|---------------------|------------|------------
Ollama   | Free      | Depends on hardware | 100% local | Install app
Groq     | Free tier | Very fast           | Cloud      | Get API key
OpenAI   | Paid      | Fast                | Cloud      | Get API key

Features

Type-Safe Extraction (like Zod, but for AI)

const invoice = await ai.extract(pdfText, {
  vendor: 'string',
  total: 'number',
  items: [{ name: 'string', price: 'number' }],
  dueDate: 'date',
});

// TypeScript knows the exact shape:
// { vendor: string, total: number, items: { name: string, price: number }[], dueDate: Date }
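This kind of schema-to-type inference can be expressed with conditional and mapped types. The sketch below is illustrative only; `Primitive` and `Infer` are hypothetical names, not TinyAI's actual source.

```typescript
// Map schema literals like 'string' to their TypeScript types.
type Primitive = { string: string; number: number; boolean: boolean; date: Date };

// Recursively infer the output type from a schema value:
// a primitive literal, a one-element tuple (array schema), or a nested object.
type Infer<S> =
  S extends keyof Primitive ? Primitive[S] :
  S extends readonly [infer E] ? Infer<E>[] :
  S extends object ? { [K in keyof S]: Infer<S[K]> } :
  never;

// The `as const` assertion preserves the literal types for inference.
const schema = { name: 'string', age: 'number' } as const;
type Person = Infer<typeof schema>;

// Compiles: Person is { name: string; age: number }
const person: Person = { name: 'Ada', age: 36 };
```

A schema like `{ items: [{ price: 'number' }] }` would infer `{ items: { price: number }[] }` through the tuple branch.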

Zero-Config Defaults

// Just set OPENAI_API_KEY env var
import { summarize, classify, extract } from 'tinyai';

const summary = await summarize(text);  // Works instantly

Streaming

for await (const chunk of ai.stream.summarize(text)) {
  process.stdout.write(chunk);
}

Edge-Ready

Works everywhere: Node.js, Deno, Bun, Cloudflare Workers, Vercel Edge.

// Cloudflare Worker
export default {
  async fetch(req: Request) {
    const summary = await summarize(await req.text());
    return new Response(summary);
  }
};

API

tinyai(config?)

Creates a TinyAI instance.

const ai = tinyai({
  provider: 'openai',  // 'openai' | 'anthropic' | 'groq' | 'ollama'
  apiKey: '...',       // Optional, uses env vars by default
  model: 'gpt-4o',     // Optional, defaults to gpt-4o-mini
});

summarize(text, options?)

Summarizes text.

const summary = await ai.summarize(longText);
const brief = await ai.summarize(text, { maxLength: 50 });

classify(text, categories)

Classifies text into one of the provided categories.

const sentiment = await ai.classify(review, ['positive', 'negative', 'neutral']);
const category = await ai.classify(email, ['urgent', 'normal', 'spam']);

extract(text, schema)

Extracts structured data with full TypeScript inference.

const person = await ai.extract(bio, {
  name: 'string',
  age: 'number',
  skills: ['string'],
  contact: {
    email: 'string',
    phone: 'string',
  },
});

Supported types: 'string' | 'number' | 'boolean' | 'date'

translate(text, language)

Translates text to another language.

const spanish = await ai.translate('Hello, world!', 'spanish');

ask(question, options?)

Answers questions, optionally with context.

const answer = await ai.ask("What's 2+2?");
const specific = await ai.ask("What's the total?", { context: invoiceText });

embed(text)

Generates embedding vectors.

const embedding = await ai.embed("Hello, world!");
// Returns number[] with 1536 dimensions (OpenAI)
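Embeddings are typically compared with cosine similarity. TinyAI doesn't document a similarity helper, so here is a plain standalone function (a hypothetical helper, not part of the library) that works on the `number[]` vectors `embed()` returns:

```typescript
// Cosine similarity between two embedding vectors of equal length.
// Returns a value in [-1, 1]; higher means more semantically similar.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical vectors score 1, orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

In practice you would call `ai.embed()` on two texts and compare the resulting vectors with this function.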

generate(prompt, options?)

Low-level text generation.

const poem = await ai.generate("Write a haiku about coding");
const story = await ai.generate("Once upon a time...", {
  system: "You are a creative storyteller"
});

Comparison

Feature              | TinyAI | Vercel AI SDK | LangChain
---------------------|--------|---------------|----------
Bundle size          | 5KB    | 186KB         | 500KB+
Dependencies         | 0      | 12+           | 50+
TypeScript inference | Native | Partial       | Plugin
Setup time           | 1 min  | 10 min        | 30 min
Learning curve       | None   | Medium        | Steep

Standalone Functions

All primitives work standalone without creating an instance:

import { summarize, classify, extract } from 'tinyai';

// Just set OPENAI_API_KEY in your environment
const summary = await summarize(text);
const sentiment = await classify(text, ['positive', 'negative']);
const data = await extract(text, { name: 'string' });

Streaming Helpers

import { toReadableStream, collectStream } from 'tinyai';

// Convert to ReadableStream for HTTP responses
const stream = toReadableStream(ai.stream.summarize(text));
return new Response(stream);

// Collect stream to string
const full = await collectStream(ai.stream.summarize(text));
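Conceptually, collecting a stream just concatenates its chunks. The sketch below shows what such a helper might look like, assuming the stream is an `AsyncIterable<string>`; this is an illustration, not the library's actual source.

```typescript
// Drain an async iterable of string chunks into a single string.
async function collectStreamSketch(stream: AsyncIterable<string>): Promise<string> {
  let result = "";
  for await (const chunk of stream) {
    result += chunk;
  }
  return result;
}

// Usage with a hypothetical chunk generator standing in for ai.stream.summarize():
async function* chunks() {
  yield "Hello, ";
  yield "world!";
}

collectStreamSketch(chunks()).then(console.log); // "Hello, world!"
```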

Roadmap

  • Core client with OpenAI provider
  • summarize() - text summarization
  • classify() - text classification
  • extract() - structured data extraction
  • translate() - translation
  • ask() - Q&A
  • embed() - embeddings
  • TypeScript inference
  • Streaming support
  • Provider: Groq (free cloud)
  • Provider: Ollama (free local)
  • Provider: Anthropic
  • pipe() - composable pipelines
  • CLI tool

License

MIT
