
feat: Multimodal support #1431

Merged
40 commits merged on Mar 29, 2025

Commits (40)
ae5a493
1. Fix the empty-line bug
Mar 4, 2025
d40df00
Update kirara_ai/workflow/implementations/blocks/system/basic.py
chuanSir123 Mar 5, 2025
1107814
Update chat.py
chuanSir123 Mar 5, 2025
0433eac
Update messages.py
chuanSir123 Mar 5, 2025
7797ff0
feat: Add Workflow ID attribute and support image understanding
Mar 12, 2025
9469a06
feat: Add Workflow ID attribute and support image understanding
Mar 12, 2025
789d520
feat: Add Workflow ID attribute and support image understanding
Mar 12, 2025
55a6ffb
Merge remote-tracking branch 'refs/remotes/upstream/master' into master1
Mar 12, 2025
4abe6aa
# Move resource downloading and format detection into message
Mar 12, 2025
68fed72
# bug: give workflow.id a default value
Mar 13, 2025
9ef0865
feat: Introduce MediaManager for media handling and registration
lss233 Mar 13, 2025
64f9075
Add workflow.id and image understanding for the current conversation round (#1430)
lss233 Mar 14, 2025
2c2ca1e
feat: Enhance media management and messaging capabilities
lss233 Mar 14, 2025
d4f9feb
refactor: Simplify media URL generation and enhance OpenAI adapter
lss233 Mar 14, 2025
ad0586f
fix: web api 404 error
lss233 Mar 19, 2025
0d5ba38
feat: Add media management API and enhance media handling
lss233 Mar 22, 2025
82bd651
feat: Enhance memory composition and messaging capabilities
lss233 Mar 23, 2025
0ac617e
refactor: Integrate dependency injection for memory registries and co…
lss233 Mar 23, 2025
799df66
feat: Enhance LLM message handling and media integration
lss233 Mar 23, 2025
4badb69
feat: Enhance media registration and async handling in MediaManager
lss233 Mar 23, 2025
0469869
feat: Improve media registration process with enhanced error handling…
lss233 Mar 24, 2025
c1370a4
feat: Enhance QQBotAdapter to replace URL dots and improve message ha…
lss233 Mar 24, 2025
5a8dbc8
refactor: Simplify log broadcasting by sending recent logs as a list
lss233 Mar 24, 2025
438241a
feat: Add base64 URL support for media and enhance QQBotAdapter URL h…
lss233 Mar 25, 2025
c839ceb
feat: Improve OpenAIAdapter response handling and media date filtering
lss233 Mar 26, 2025
75131cb
feat: Introduce media carrier system for enhanced media management
lss233 Mar 26, 2025
5dc04fc
feat: Add utility function for no-cache file responses
lss233 Mar 27, 2025
329588c
feat: Add references field to MediaMetadata and update media type han…
lss233 Mar 27, 2025
0838137
feat: Add pytz dependency for timezone support in media handling
lss233 Mar 29, 2025
be5fde8
fix: Correct message formatting in ClaudeAdapter request data
lss233 Mar 29, 2025
0be623b
feat: Implement file deletion for media with no references
lss233 Mar 29, 2025
5dca2a8
refactor: Streamline media deletion process for orphaned files
lss233 Mar 29, 2025
892041b
feat: Integrate tracing system for LLM requests and enhance database …
lss233 Mar 29, 2025
3429af0
feat: Enhance LLM tracing with failure handling and cleanup functiona…
lss233 Mar 29, 2025
63db1f8
feat: Add Alembic migration support and initial database migration
lss233 Mar 29, 2025
b7cb667
feat: Introduce tracing configuration and enhance LLM tracing functio…
lss233 Mar 29, 2025
fba5038
Apply suggestions from code review
lss233 Mar 29, 2025
69cef6a
refactor: Improve property injection mechanism in Inject class
lss233 Mar 29, 2025
65f6483
feat: LLM call records (#1439)
lss233 Mar 29, 2025
0f6d990
Merge branch 'master' into feature/media_api
lss233 Mar 29, 2025
Files changed
5 changes: 3 additions & 2 deletions .gitignore
@@ -20,5 +20,6 @@ build/
 *.egg-info/
 .coverage
 /web/
-botpy.log
-data/frpc/
+botpy.log*
+data/frpc/
+data/db/
3 changes: 2 additions & 1 deletion MANIFEST.in
@@ -1,4 +1,5 @@
 recursive-include kirara_ai/plugins/im_http_legacy_adapter/assets *
 recursive-include kirara_ai/plugins/im_qqbot_adapter/assets *
 recursive-include kirara_ai/plugins/im_telegram_adapter/assets *
-recursive-include kirara_ai/plugins/im_wecom_adapter/assets *
+recursive-include kirara_ai/plugins/im_wecom_adapter/assets *
+recursive-include kirara_ai/alembic *
119 changes: 119 additions & 0 deletions alembic.ini
@@ -0,0 +1,119 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
# Use forward slashes (/) also on windows to provide an os agnostic path
script_location = kirara_ai/alembic

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file
# for all available tokens
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python>=3.9 or backports.zoneinfo library and tzdata library.
# Any required deps can installed by adding `alembic[tz]` to the pip requirements
# string value is passed to ZoneInfo()
# leave blank for localtime
# timezone =

# max length of characters to apply to the "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; This defaults
# to alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path.
# The path separator used here should be the separator specified by "version_path_separator" below.
# version_locations = %(here)s/bar:%(here)s/bat:alembic/versions

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep.
# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
# version_path_separator = newline
#
# Use os.pathsep. Default configuration used for new projects.
version_path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = sqlite:///./data/db/kirara.db


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = check --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARNING
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
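
As a quick orientation (not code from this PR), the configuration above can be driven from Alembic's Python command API; Config and command.upgrade are standard Alembic entry points, and the "alembic.ini" path below assumes the project root as the working directory:

from alembic import command
from alembic.config import Config

# Load the file shown above; script_location points at kirara_ai/alembic
# and sqlalchemy.url at the SQLite database under data/db/.
alembic_cfg = Config("alembic.ini")

# Apply every pending revision up to the newest one.
command.upgrade(alembic_cfg, "head")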
4 changes: 2 additions & 2 deletions data/dispatch_rules/rules.yaml
@@ -1,7 +1,7 @@
 - rule_id: chat_normal
   name: 群聊AI对话
   description: 群聊中使用 /chat 开头对话或者 被@ 时触发聊天
-  workflow_id: test:test
+  workflow_id: chat:normal
   priority: 5
   enabled: true
   rule_groups:
@@ -24,7 +24,7 @@
 - rule_id: chat_creative
   name: 私聊AI对话
   description: 私聊时直接发送内容触发对话
-  workflow_id: test:test
+  workflow_id: chat:normal
   priority: 5
   enabled: true
   rule_groups:
2 changes: 2 additions & 0 deletions data/media/.gitignore
@@ -0,0 +1,2 @@
metadata/*
files/*
1 change: 1 addition & 0 deletions kirara_ai/alembic/README
@@ -0,0 +1 @@
Generic single-database configuration.
79 changes: 79 additions & 0 deletions kirara_ai/alembic/env.py
@@ -0,0 +1,79 @@
from logging.config import fileConfig

from alembic import context
from sqlalchemy import engine_from_config, pool

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
from kirara_ai.database.manager import Base
from kirara_ai.tracing.models import LLMRequestTrace  # noqa: F401

target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            connection=connection, target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
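
The offline branch above emits SQL without opening a DBAPI connection; a minimal sketch of exercising it (same assumptions as the earlier snippet: standard Alembic command API, run from the project root):

from alembic import command
from alembic.config import Config

# sql=True puts Alembic in offline mode, so env.py runs
# run_migrations_offline() and prints the SQL instead of executing it.
command.upgrade(Config("alembic.ini"), "head", sql=True)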
28 changes: 28 additions & 0 deletions kirara_ai/alembic/script.py.mako
@@ -0,0 +1,28 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
    """Upgrade schema."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Downgrade schema."""
    ${downgrades if downgrades else "pass"}
62 changes: 62 additions & 0 deletions kirara_ai/alembic/versions/4a364dbb8dab_initial_migration.py
@@ -0,0 +1,62 @@
"""Initial migration

Revision ID: 4a364dbb8dab
Revises:
Create Date: 2025-03-29 13:59:33.243069

"""
from typing import Sequence, Union

import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic.
revision: str = '4a364dbb8dab'
down_revision: Union[str, None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('llm_request_traces',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('trace_id', sa.String(length=64), nullable=False),
    sa.Column('model_id', sa.String(length=64), nullable=False),
    sa.Column('backend_name', sa.String(length=64), nullable=False),
    sa.Column('request_time', sa.DateTime(), nullable=False),
    sa.Column('response_time', sa.DateTime(), nullable=True),
    sa.Column('duration', sa.Float(), nullable=True),
    sa.Column('request_json', sa.Text(), nullable=True),
    sa.Column('response_json', sa.Text(), nullable=True),
    sa.Column('prompt_tokens', sa.Integer(), nullable=True),
    sa.Column('completion_tokens', sa.Integer(), nullable=True),
    sa.Column('total_tokens', sa.Integer(), nullable=True),
    sa.Column('cached_tokens', sa.Integer(), nullable=True),
    sa.Column('error', sa.Text(), nullable=True),
    sa.Column('status', sa.String(length=20), nullable=False),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index('idx_backend_time', 'llm_request_traces', ['backend_name', 'request_time'], unique=False)
    op.create_index('idx_request_model', 'llm_request_traces', ['model_id', 'request_time'], unique=False)
    op.create_index('idx_status_time', 'llm_request_traces', ['status', 'request_time'], unique=False)
    op.create_index(op.f('ix_llm_request_traces_backend_name'), 'llm_request_traces', ['backend_name'], unique=False)
    op.create_index(op.f('ix_llm_request_traces_model_id'), 'llm_request_traces', ['model_id'], unique=False)
    op.create_index(op.f('ix_llm_request_traces_request_time'), 'llm_request_traces', ['request_time'], unique=False)
    op.create_index(op.f('ix_llm_request_traces_trace_id'), 'llm_request_traces', ['trace_id'], unique=True)
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_llm_request_traces_trace_id'), table_name='llm_request_traces')
    op.drop_index(op.f('ix_llm_request_traces_request_time'), table_name='llm_request_traces')
    op.drop_index(op.f('ix_llm_request_traces_model_id'), table_name='llm_request_traces')
    op.drop_index(op.f('ix_llm_request_traces_backend_name'), table_name='llm_request_traces')
    op.drop_index('idx_status_time', table_name='llm_request_traces')
    op.drop_index('idx_request_model', table_name='llm_request_traces')
    op.drop_index('idx_backend_time', table_name='llm_request_traces')
    op.drop_table('llm_request_traces')
    # ### end Alembic commands ###
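
For a sense of what the new table holds, here is a minimal read-only sketch using SQLAlchemy Core; the table and column names come from the migration above, the database URL mirrors sqlalchemy.url in alembic.ini, and none of this code is part of the PR:

import sqlalchemy as sa

engine = sa.create_engine("sqlite:///./data/db/kirara.db")
metadata = sa.MetaData()

# Reflect the table created by this migration.
traces = sa.Table("llm_request_traces", metadata, autoload_with=engine)

# Ten most recent requests with their token usage, newest first
# (served by the ix_llm_request_traces_request_time index).
stmt = (
    sa.select(traces.c.trace_id, traces.c.model_id, traces.c.total_tokens, traces.c.status)
    .order_by(traces.c.request_time.desc())
    .limit(10)
)

with engine.connect() as conn:
    for row in conn.execute(stmt):
        print(row.trace_id, row.model_id, row.total_tokens, row.status)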
9 changes: 6 additions & 3 deletions kirara_ai/config/config_loader.py
@@ -1,7 +1,7 @@
 import os
 import shutil
 from functools import wraps
-from typing import Optional, Type
+from typing import Generic, Optional, TypeVar

 from pydantic import BaseModel, ValidationError
 from pydantic.json_schema import GenerateJsonSchema, JsonSchemaValue
@@ -10,6 +10,9 @@
 from ..logger import get_logger

 CONFIG_FILE = "data/config.yaml"
+
+T = TypeVar("T", bound=BaseModel)
+
 class ConfigLoader:
     """
     配置文件加载器,支持加载和保存 YAML 文件,并保留注释。
@@ -18,7 +21,7 @@ class ConfigLoader:
     yaml = YAML()

     @staticmethod
-    def load_config(config_path: str, config_class: Type[BaseModel]) -> BaseModel:
+    def load_config(config_path: str, config_class: Generic[T]) -> T:
         """
         从 YAML 文件中加载配置,并将其序列化为相应的配置对象。
         :param config_path: 配置文件路径。
@@ -84,4 +87,4 @@ def sort(
         self, value: JsonSchemaValue, parent_key: Optional[str] = None
     ) -> JsonSchemaValue:
         """No-op, we don't want to sort schema values at all."""
-        return value
+        return value
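
A minimal usage sketch of the reworked signature (the call itself is illustrative, not taken from the PR); the intent of the TypeVar-based parameter is that the return type tracks whichever config class is passed in, rather than the bare BaseModel:

from kirara_ai.config.config_loader import CONFIG_FILE, ConfigLoader
from kirara_ai.config.global_config import GlobalConfig

# Returns a GlobalConfig instance loaded from data/config.yaml.
config = ConfigLoader.load_config(CONFIG_FILE, GlobalConfig)
print(config.system.timezone)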
6 changes: 6 additions & 0 deletions kirara_ai/config/global_config.py
@@ -89,6 +89,11 @@ class SystemConfig(BaseModel):

     timezone: str = Field(default="Asia/Shanghai", description="时区")

+class TracingConfig(BaseModel):
+    """Tracing 配置"""
+
+    llm_tracing_content: bool = Field(default=False, description="是否记录 LLM 请求内容")
+
 class GlobalConfig(BaseModel):
     ims: List[IMConfig] = Field(default=[], description="IM配置列表")
     llms: LLMConfig = LLMConfig()
@@ -99,5 +104,6 @@ class GlobalConfig(BaseModel):
     update: UpdateConfig = UpdateConfig()
     frpc: FrpcConfig = FrpcConfig()
     system: SystemConfig = SystemConfig()
+    tracing: TracingConfig = TracingConfig()

     model_config = ConfigDict(extra="allow")
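
A small sketch of the new tracing switch (defaults as declared above; constructing the model directly is only for illustration, since in practice the value would come from the tracing section of data/config.yaml):

from kirara_ai.config.global_config import TracingConfig

# Default: the content of LLM requests/responses is not recorded.
tracing = TracingConfig()
assert tracing.llm_tracing_content is False

# Opting in to recording full LLM request content.
tracing = TracingConfig(llm_tracing_content=True)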
3 changes: 3 additions & 0 deletions kirara_ai/database/__init__.py
@@ -0,0 +1,3 @@
from kirara_ai.database.manager import Base, DatabaseManager, metadata

__all__ = ["Base", "DatabaseManager", "metadata"]