14 changes: 8 additions & 6 deletions .env.example
@@ -10,12 +10,12 @@ DOMAIN=localhost

# DOMAIN=localhost.tiangolo.com

# Environment: local, staging, production
# Environment: "development", "testing", "staging", "production"

ENVIRONMENT=local
ENVIRONMENT=development

PROJECT_NAME="AI Platform"
STACK_NAME=ai-platform
PROJECT_NAME="Kaapi"
STACK_NAME=Kaapi

#Backend
SECRET_KEY=changethis
@@ -24,10 +24,9 @@ FIRST_SUPERUSER_PASSWORD=changethis
EMAIL_TEST_USER="test@example.com"

# Postgres

POSTGRES_SERVER=localhost
POSTGRES_PORT=5432
POSTGRES_DB=ai_platform
POSTGRES_DB=kaapi
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres

@@ -78,3 +77,6 @@ CELERY_TIMEZONE=Asia/Kolkata
# Callback Timeouts (in seconds)
CALLBACK_CONNECT_TIMEOUT = 3
CALLBACK_READ_TIMEOUT = 10

# Required as an env variable if you want to use doc transformation
OPENAI_API_KEY=""
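
For context, a minimal sketch of how these new variables might be read by the backend, assuming pydantic-settings; the class and field handling here are illustrative, not the repository's actual `Settings` class:

```python
# Illustrative sketch only: class and field names are assumptions, not Kaapi's real config.
from pydantic_settings import BaseSettings, SettingsConfigDict


class ExampleSettings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    ENVIRONMENT: str = "development"
    OPENAI_API_KEY: str | None = None  # only needed when doc transformation is used
    CALLBACK_CONNECT_TIMEOUT: int = 3
    CALLBACK_READ_TIMEOUT: int = 10


settings = ExampleSettings()
if not settings.OPENAI_API_KEY:
    print("Doc transformation features will be unavailable without OPENAI_API_KEY.")
```
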
2 changes: 1 addition & 1 deletion .github/workflows/cd-production.yml
@@ -20,7 +20,7 @@ jobs:
uses: actions/checkout@v5

- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4 # More information on this action can be found below in the 'AWS Credentials' section
uses: aws-actions/configure-aws-credentials@v5 # More information on this action can be found below in the 'AWS Credentials' section
with:
role-to-assume: arn:aws:iam::024209611402:role/github-action-role
aws-region: ap-south-1
2 changes: 1 addition & 1 deletion .github/workflows/cd-staging.yml
@@ -21,7 +21,7 @@ jobs:
uses: actions/checkout@v5

- name: Configure AWS credentials
uses: aws-actions/configure-aws-credentials@v4 # More information on this action can be found below in the 'AWS Credentials' section
uses: aws-actions/configure-aws-credentials@v5 # More information on this action can be found below in the 'AWS Credentials' section
with:
role-to-assume: arn:aws:iam::024209611402:role/github-action-role
aws-region: ap-south-1
92 changes: 92 additions & 0 deletions CLAUDE.md
@@ -0,0 +1,92 @@
# CLAUDE.md

This file provides guidance to Claude Code when working with code in this repository.

## Project Overview

Kaapi is an AI platform built with FastAPI and PostgreSQL, containerized with Docker. It provides AI capabilities including OpenAI assistants, fine-tuning, document processing, and collection management.

## Key Commands

### Development

```bash
# Activate virtual environment
source .venv/bin/activate

# Start development server with auto-reload
fastapi run --reload app/main.py

# Run pre-commit hooks
uv run pre-commit run --all-files

# Generate database migration
alembic revision --autogenerate -m 'Description'

# Seed database with test data
uv run python -m app.seed_data.seed_data
```

### Testing

Tests use `.env.test` for environment-specific configuration.

```bash
# Run test suite
uv run bash scripts/tests-start.sh
```

## Architecture

### Backend Structure

The backend follows a layered architecture located in `backend/app/`:

- **Models** (`models/`): SQLModel entities representing database tables and domain objects

- **CRUD** (`crud/`): Database access layer for all data operations

- **Routes** (`api/`): FastAPI REST endpoints organized by domain

- **Core** (`core/`): Core functionality and utilities
- Configuration and settings
- Database connection and session management
- Security (JWT, password hashing, API keys)
- Cloud storage (`cloud/storage.py`)
- Document transformation (`doctransform/`)
- Fine-tuning utilities (`finetune/`)
- Langfuse observability integration (`langfuse/`)
- Exception handlers and middleware

- **Services** (`services/`): Business logic services
- Response service (`response/`): OpenAI Responses API integration, conversation management, and job execution

- **Celery** (`celery/`): Asynchronous task processing with RabbitMQ and Redis
- Task definitions (`tasks/`)
- Celery app configuration with priority queues
- Beat scheduler and worker configuration

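To make this layering concrete, here is a minimal, hypothetical end-to-end slice (model → CRUD → route); the names and modules below are illustrative, not Kaapi's actual code:

```python
# Hypothetical slice illustrating the model -> CRUD -> route layering described above.
from collections.abc import Iterator

from fastapi import APIRouter, Depends
from sqlmodel import Field, Session, SQLModel, create_engine, select


class Collection(SQLModel, table=True):                       # models/: table definition
    id: int | None = Field(default=None, primary_key=True)
    name: str
    status: str = "completed"


def list_collections(session: Session) -> list[Collection]:   # crud/: data access only
    return list(session.exec(select(Collection)).all())


engine = create_engine("sqlite:///./example.db")              # stand-in for core/ db setup
SQLModel.metadata.create_all(engine)


def get_session() -> Iterator[Session]:                       # core/: session management
    with Session(engine) as session:
        yield session


router = APIRouter(prefix="/collections", tags=["collections"])  # api/: HTTP layer


@router.get("/", response_model=list[Collection])
def read_collections(session: Session = Depends(get_session)) -> list[Collection]:
    return list_collections(session)
```

In the real app, the engine setup, session dependency, and router registration live in `core/` and `api/` as described above.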

### Authentication & Security

- JWT-based authentication
- API key support for programmatic access
- Organization and project-level permissions

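A rough sketch of how JWT and API-key authentication are commonly combined in FastAPI dependencies; the header name, secret handling, and claims below are assumptions, not Kaapi's actual implementation:

```python
# Illustrative only: secret, header name, and token claims are placeholders.
import jwt  # PyJWT
from fastapi import Depends, HTTPException, Security, status
from fastapi.security import APIKeyHeader, OAuth2PasswordBearer

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="login/access-token", auto_error=False)
api_key_header = APIKeyHeader(name="X-API-Key", auto_error=False)

SECRET_KEY = "changethis"  # would come from settings in the real app
ALGORITHM = "HS256"


def get_current_identity(
    token: str | None = Depends(oauth2_scheme),
    api_key: str | None = Security(api_key_header),
) -> dict:
    if token:  # interactive users authenticate with a JWT
        try:
            return jwt.decode(token, SECRET_KEY, algorithms=[ALGORITHM])
        except jwt.PyJWTError:
            raise HTTPException(status.HTTP_401_UNAUTHORIZED, "Invalid token")
    if api_key:  # programmatic clients authenticate with an API key
        # The real service would look the key up and resolve its organization/project scope.
        return {"api_key": api_key}
    raise HTTPException(status.HTTP_401_UNAUTHORIZED, "Not authenticated")
```
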
## Environment Configuration

The application uses different environment files:
- `.env` - Application environment configuration (use `.env.example` as template)
- `.env.test` - Test environment configuration


## Testing Strategy

- Tests located in `app/tests/`
- Factory pattern for test fixtures
- Automatic coverage reporting

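As a small illustration of the factory pattern mentioned above (the fixture name and fields are hypothetical, not the repository's actual fixtures):

```python
# Hypothetical factory-style fixture; the user fields below are placeholders.
import uuid

import pytest


@pytest.fixture
def user_factory():
    def make_user(**overrides) -> dict:
        user = {
            "email": f"user-{uuid.uuid4().hex[:8]}@example.com",
            "password": "changethis",
            "is_superuser": False,
        }
        user.update(overrides)
        return user

    return make_user


def test_superuser_flag(user_factory):
    superuser = user_factory(is_superuser=True)
    assert superuser["is_superuser"] is True
```
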
## Code Standards

- Python 3.11+ with type hints
- Pre-commit hooks for linting and formatting
37 changes: 23 additions & 14 deletions README.md
@@ -29,43 +29,52 @@ cp .env.example .env

You can then update configs in the `.env` files to customize your configurations.

Before deploying it, make sure you change at least the values for:

- `SECRET_KEY`
- `FIRST_SUPERUSER_PASSWORD`
- `POSTGRES_PASSWORD`

⚠️ Some services depend on these environment variables being set correctly. Missing or invalid values may cause startup issues.

### Generate Secret Keys

Some environment variables in the `.env` file have a default value of `changethis`.

You have to change them to a secret key. To generate secret keys, you can run the following command:

```bash

python -c "import secrets; print(secrets.token_urlsafe(32))"

```

Copy the output and use it as the password / secret key. Run the command again to generate another secure key.

## Boostrap & development mode
## Bootstrap & development mode

You have two options to start this dockerized setup, depending on whether you want to reset the database:
### Option A: Run migrations & seed data (will reset DB)

Use the prestart profile to automatically run database migrations and seed data.
This profile also resets the database, so use it only when you want a fresh start.
```bash
docker compose --profile prestart up
```
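
For orientation, here is a rough sketch (an assumption, not the project's actual prestart script) of the kind of steps the prestart profile runs, reusing the migration and seed commands documented for this project:

```python
# Hedged sketch of a typical prestart step: apply migrations, then seed the database.
# The real profile also resets the database; that logic lives in the project's own scripts.
import subprocess

subprocess.run(["alembic", "upgrade", "head"], check=True)               # run migrations
subprocess.run(["python", "-m", "app.seed_data.seed_data"], check=True)  # seed data
```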

This is a dockerized setup, hence start the project using below command
### Option B: Start normally without resetting DB

If you don't want to reset the database, start the project directly:
```bash
docker compose watch
```
This will start all services in watch mode for development — ideal for local iterations.

This should start all necessary services for the project and will also mount file system as volume for easy development.
### Rebuilding Images

You verify backend running by doing a health check
While the backend service supports live code reloading via `docker compose watch`, **Celery does not support auto-reload**. When you make changes to Celery tasks, workers, or related code, you need to rebuild the Docker image:

```bash
curl http://[your-domain]:8000/api/v1/utils/health/
docker compose up --build
```

or by visiting: http://[your-domain]:8000/api/v1/utils/health/ in the browser
This is also necessary when:
- Dependencies change in `pyproject.toml` or `uv.lock`
- You modify Dockerfile configurations
- Changes aren't being reflected in the running containers

## Backend Development

8 changes: 8 additions & 0 deletions backend/Dockerfile
@@ -1,3 +1,5 @@
# The same Dockerfile is used to build images for the backend, Celery worker, and Celery Flower services.

# Use Python 3.12 base image
FROM python:3.12

@@ -46,3 +48,9 @@ EXPOSE 80


CMD ["uv", "run", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "80", "--workers", "4"]

# command for Celery worker
# CMD ["uv", "run", "celery", "-A", "app.celery.celery_app", "worker", "--loglevel=info"]

# command for Celery Flower
# CMD ["uv", "run", "celery", "-A", "app.celery.celery_app", "flower", "--port=5555"]
41 changes: 0 additions & 41 deletions backend/Dockerfile.celery

This file was deleted.

@@ -0,0 +1,36 @@
"""delete processing and failed columns from collection table

Revision ID: 7ab577d3af26
Revises: c6fb6d0b5897
Create Date: 2025-10-06 13:59:28.561706

"""
from alembic import op
import sqlalchemy as sa
import sqlmodel.sql.sqltypes


# revision identifiers, used by Alembic.
revision = "7ab577d3af26"
down_revision = "c6fb6d0b5897"
branch_labels = None
depends_on = None


def upgrade():
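    # Note: despite the revision message, this deletes rows, not columns:
    # collections left in 'processing'/'failed' states or missing an llm_service_id.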
    op.execute(
        """
        DELETE FROM collection
        WHERE status IN ('processing', 'failed')
        """
    )
    op.execute(
        """
        DELETE FROM collection
        WHERE llm_service_id IS NULL
        """
    )


def downgrade():
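    # The rows removed in upgrade() cannot be restored, so no downgrade is provided.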
    pass