Commit 9bed750: Hidden token credentials
1 parent af468d1

13 files changed (+76, -18 lines)

README.md (10 additions, 1 deletion)

@@ -100,6 +100,9 @@ Remove the Docker Image (if needed)
 docker rmi ghcr.io/adribaeza/llm-tinyllama-backend:latest

+
+
+
 Reasons for sending the full history:
 Context: The model needs the full history to understand the conversation's context and generate appropriate responses.
 Coherence: Keeping the history helps ensure that responses stay consistent with earlier interactions.
@@ -108,4 +111,10 @@ Memory: LLM models have no persistent memory between requests, so
 How to use docker-compose
 Build and run the services:
 Stop the services:
-This docker-compose.yml file lets you bring up both the FastAPI backend and the Streamlit frontend simply and reproducibly.
+This docker-compose.yml file lets you bring up both the FastAPI backend and the Streamlit frontend simply and reproducibly.
+
+kubectl create secret generic llm-service-token-secret --from-literal=SERVICE_TOKEN=myllservicetoken2024
+
+export SERVICE_TOKEN=myllservicetoken2024
+
+docker-compose up
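The "reasons for sending the full history" above come down to one fact: the backend is stateless between requests, so the client resends every prior turn. A minimal sketch of that pattern (the message structure here is an illustrative assumption, not the project's actual API):

```python
# Minimal sketch: an LLM backend keeps no memory between requests,
# so the client accumulates the conversation and resends all of it.
# The role/content dict format is an assumption for illustration.

history = []

def exchange(user_text, assistant_reply):
    """Record one turn; a real client would POST `history` to the backend."""
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_reply})
    return history

exchange("Hello", "Hi! How can I help?")
exchange("What did I just say?", "You said: Hello")

# Every request carries the full context, which is what keeps replies coherent.
assert len(history) == 4
```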

backend/.env (1 addition, 0 deletions)

@@ -0,0 +1 @@
+SERVICE_TOKEN=myllservicetoken2024

backend/Dockerfile (4 additions, 1 deletion)

@@ -5,12 +5,15 @@ FROM python:3.10-slim
 WORKDIR /api

 # Copy the requirements file into the container at /api and install dependencies
-COPY ./requirements.txt /api/requirements.txt
+COPY requirements.txt /api/requirements.txt
 RUN pip install --no-cache-dir -r requirements.txt

 # Copy backend code into the container at /api
 COPY ./api /api

+# Copy .env file into the container at /api
+COPY .env /api
+
 # Expose the port the api runs in
 EXPOSE 8000

backend/api/main.py (14 additions, 7 deletions)

@@ -9,12 +9,22 @@
 from fastapi import FastAPI, HTTPException, APIRouter, Depends
 from fastapi.security import OAuth2PasswordBearer
 from transformers import pipeline
-
 from starlette.middleware.cors import CORSMiddleware # Import the middleware
 import json
 from pydantic import BaseModel
 import asyncio
 import os
+from dotenv import load_dotenv
+
+# Load the environment variables from the .env file
+load_dotenv()
+
+# Static token for the API
+STATIC_TOKEN = os.getenv("SERVICE_TOKEN")
+
+# Verify that the SERVICE_TOKEN is defined in the environment variables
+if STATIC_TOKEN is None:
+    raise ValueError("The SERVICE_TOKEN environment variable is not defined")

 # Default LLM configuration values
 DEFAULT_MAX_NEW_TOKENS = 100
@@ -24,8 +34,8 @@
 DEFAULT_TOP_P = 0.6
 LLM_MODEL = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

-# instance logger
-logger = logging.getLogger(__name__)
+# Set up logging
+logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

 # Set default route for the API with prefix /api/v1
 api_router = APIRouter(prefix="/api/v1")
@@ -38,9 +48,6 @@
 # OAuth2 configuration
 oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

-# Static token for the API
-STATIC_TOKEN = "myllservicetoken2024" # os.getenv("STATIC_TOKEN")
-
 def verify_token(token: str):
     if token != STATIC_TOKEN:
         raise HTTPException(
@@ -62,7 +69,7 @@ async def get_current_user(token: str = Depends(oauth2_scheme)):
     allow_headers=["*"],
 )

-logger.info('Adding v1 endpoints..')
+logging.info('Adding v1 endpoints..')

 # Load the model with the TinyLlama model
 pipe = pipeline("text-generation", model=LLM_MODEL, torch_dtype=torch.bfloat16, device_map="auto")
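The core of this change is a pattern: read the token from the environment once at startup, fail fast if it is missing, and compare it on every request. It can be sketched without FastAPI (a plain exception stands in for `HTTPException`; the environment assignment stands in for `load_dotenv()`):

```python
import os

# Stand-in for load_dotenv(); in the real app the value comes from .env
# or from the Kubernetes secret. For illustration only.
os.environ["SERVICE_TOKEN"] = "myllservicetoken2024"

# Fail fast at startup if the token is not configured.
STATIC_TOKEN = os.getenv("SERVICE_TOKEN")
if STATIC_TOKEN is None:
    raise ValueError("The SERVICE_TOKEN environment variable is not defined")

def verify_token(token: str) -> bool:
    """Accept only the configured token (PermissionError stands in for HTTPException)."""
    if token != STATIC_TOKEN:
        raise PermissionError("Invalid token")
    return True

# The right token passes; anything else is rejected.
assert verify_token("myllservicetoken2024")
rejected = False
try:
    verify_token("wrong-token")
except PermissionError:
    rejected = True
assert rejected
```

The payoff over the previous hard-coded constant is that the secret never appears in source control and can differ per environment.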

backend/k8s/deployment.yaml (7 additions, 1 deletion)

@@ -16,4 +16,10 @@ spec:
       - name: chatllmapi
         image: ghcr.io/adribaeza/llm-tinyllama-backend:latest
         ports:
-          - containerPort: 8000
+          - containerPort: 8000
+        env:
+          - name: SERVICE_TOKEN
+            valueFrom:
+              secretKeyRef:
+                name: llm-service-token-secret
+                key: SERVICE_TOKEN
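A rough illustration of what the `kubectl create secret` command from the README and the `secretKeyRef` above do together: Kubernetes stores the literal base64-encoded inside the Secret object, and the container sees the decoded value as a plain environment variable (a sketch of the data flow, not of the cluster internals):

```python
import base64
import os

token = "myllservicetoken2024"

# kubectl create secret generic ... --from-literal=SERVICE_TOKEN=<token>
# stores the value base64-encoded in the Secret's data field:
stored = base64.b64encode(token.encode()).decode()

# secretKeyRef in the Deployment injects the decoded value into the pod:
os.environ["SERVICE_TOKEN"] = base64.b64decode(stored).decode()

# The application code just reads it like any other environment variable.
assert os.environ["SERVICE_TOKEN"] == token
```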

backend/requirements.txt (2 additions, 1 deletion)

@@ -6,4 +6,5 @@ pydantic==2.8.2
 huggingface_hub==0.24.2
 accelerate==0.33.0
 pytest==8.3.2
-pytest-cov==5.0.0
+pytest-cov==5.0.0
+python-dotenv==1.0.1

dicker-compose-ghimages.yaml (7 additions, 1 deletion)

@@ -8,6 +8,7 @@ services:
     environment:
       - MODULE_NAME=api.main
       - VARIABLE_NAME=api
+      - SERVICE_TOKEN=${SERVICE_TOKEN}

   frontend:
     image: ghcr.io/adribaeza/llm-tinyllama-frontend:latest
@@ -16,4 +17,9 @@ services:
     ports:
       - "8501:8501"
     depends_on:
-      - backend
+      - backend
+    environment:
+      - SERVICE_TOKEN=${SERVICE_TOKEN}
+
+    env_file:
+      - .env

docker-compose.yml (8 additions, 1 deletion)

@@ -9,6 +9,9 @@ services:
     environment:
       - MODULE_NAME=api.main
       - VARIABLE_NAME=api
+      - SERVICE_TOKEN=${SERVICE_TOKEN}
+    env_file:
+      - ./frontend/.env

   frontend:
     build:
@@ -18,4 +21,8 @@ services:
     ports:
       - "8501:8501"
     depends_on:
-      - backend
+      - backend
+    environment:
+      - SERVICE_TOKEN=${SERVICE_TOKEN}
+    env_file:
+      - ./backend/.env
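Compose resolves `${SERVICE_TOKEN}` from the invoking shell's environment (which is why the README adds `export SERVICE_TOKEN=...` before `docker-compose up`), falling back to values from an `env_file`. A rough sketch of that substitution step:

```python
import os
from string import Template

# e.g. set by `export SERVICE_TOKEN=myllservicetoken2024` before docker-compose up
os.environ["SERVICE_TOKEN"] = "myllservicetoken2024"

# Compose replaces ${SERVICE_TOKEN} in the service definition with the
# value from the environment; Template mimics that substitution here.
line = Template("SERVICE_TOKEN=${SERVICE_TOKEN}")
resolved = line.substitute(os.environ)

assert resolved == "SERVICE_TOKEN=myllservicetoken2024"
```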

frontend/.env (1 addition, 0 deletions)

@@ -0,0 +1 @@
+SERVICE_TOKEN=myllservicetoken2024

frontend/Dockerfile (3 additions, 0 deletions)

@@ -11,6 +11,9 @@ COPY . /app
 COPY requirements.txt /app/
 RUN pip install --no-cache-dir -r requirements.txt

+# Copy .env file into the container at /api
+COPY .env /api
+
 # Make port 8501 available to the world outside this container
 EXPOSE 8501

frontend/app/main.py (10 additions, 3 deletions)

@@ -1,5 +1,14 @@
 import streamlit as st
 import requests, logging, os
+from dotenv import load_dotenv
+
+# Load the environment variables from the .env file
+load_dotenv()
+# Static token for the API
+STATIC_TOKEN = os.getenv("SERVICE_TOKEN")
+# Verify that the SERVICE_TOKEN is defined in the environment variables
+if STATIC_TOKEN is None:
+    raise ValueError("The SERVICE_TOKEN environment variable is not defined")

 # Set up logging
 logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
@@ -11,8 +20,6 @@
 DEFAULT_TOP_K = 50
 DEFAULT_TOP_P = 0.9

-DEFAULT_TOKEN = "myllservicetoken2024" # os.getenv("TINYLLAMA_API_TOKEN")
-
 # Function to clear the message history
 def clear_chat():
@@ -83,7 +90,7 @@ def main():
         st.markdown(prompt)

     headers = {
-        "Authorization": f"Bearer {DEFAULT_TOKEN}",
+        "Authorization": f"Bearer {STATIC_TOKEN}",
         "Content-Type": "application/json"
     }
     # Build the conversation history
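On the frontend the change reduces to: the bearer token in the request headers now comes from the environment rather than a hard-coded constant. A minimal sketch of the resulting header assembly (the environment assignment stands in for `load_dotenv()`; the actual request is omitted):

```python
import os

# Stand-in for load_dotenv() reading frontend/.env; illustration only.
os.environ["SERVICE_TOKEN"] = "myllservicetoken2024"

STATIC_TOKEN = os.getenv("SERVICE_TOKEN")

# Same header shape the Streamlit app sends to the backend API.
headers = {
    "Authorization": f"Bearer {STATIC_TOKEN}",
    "Content-Type": "application/json",
}

assert headers["Authorization"] == "Bearer myllservicetoken2024"
```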

frontend/k8s/deployment.yaml (7 additions, 1 deletion)

@@ -16,4 +16,10 @@ spec:
       - name: chatllm-frontend
         image: ghcr.io/adribaeza/llm-tinyllama-frontend:latest
         ports:
-          - containerPort: 8501
+          - containerPort: 8501
+        env:
+          - name: SERVICE_TOKEN
+            valueFrom:
+              secretKeyRef:
+                name: llm-service-token-secret
+                key: SERVICE_TOKEN

frontend/requirements.txt (2 additions, 1 deletion)

@@ -1,3 +1,4 @@
 streamlit==1.37.0
 pytest==8.3.2
-pytest-cov==5.0.0
+pytest-cov==5.0.0
+python-dotenv==1.0.1
