A chatbot, [Lama Bot (དླ་མ་བོཏ།)](https://t.me/compassion_lama_bot), that helps people cope with depression and burnout, written in Elixir and Phoenix.

Bodhi

To start your Phoenix server:

  • Install dependencies with mix deps.get
  • Create and migrate your database with mix ecto.setup
  • Start Phoenix endpoint with mix phx.server or inside IEx with iex -S mix phx.server

Now you can visit localhost:4000 from your browser.

Ready to run in production? Please [check our deployment guides](https://hexdocs.pm/phoenix/deployment.html).

AI Provider Configuration

Bodhi supports multiple AI providers that can be switched via configuration.

Available Providers

OpenRouter (Default)

  • Module: Bodhi.OpenRouter
  • Default Model: deepseek/deepseek-r1-0528:free
  • Environment Variable: OPENROUTER_API_KEY
  • Website: https://openrouter.ai/

Google Gemini

  • Module: Bodhi.Gemini
  • Model: gemini-2.0-flash
  • Environment Variable: GEMINI_API_KEY
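
Both providers are wired in through the same :ai_client configuration key, which implies they expose a compatible interface. The sketch below is illustrative only: the Bodhi.AiClient behaviour name and the chat/2 callback are assumptions, not necessarily the functions actually defined in Bodhi.OpenRouter and Bodhi.Gemini.

# Illustrative sketch — module and callback names here are assumptions, not Bodhi's actual API
defmodule Bodhi.AiClient do
  @callback chat(prompt :: String.t(), opts :: keyword()) ::
              {:ok, String.t()} | {:error, term()}
end

defmodule Bodhi.Chat do
  # Delegates to whichever client module is configured under :ai_client
  def ask(prompt, opts \\ []) do
    client = Application.get_env(:bodhi, :ai_client, Bodhi.OpenRouter)
    client.chat(prompt, opts)
  end
end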

Switching Providers

To switch AI providers, update config/config.exs:

# Use OpenRouter (default)
config :bodhi, :ai_client, Bodhi.OpenRouter

# Use Google Gemini
config :bodhi, :ai_client, Bodhi.Gemini
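
You can confirm which client is active from an IEx session (iex -S mix). Application.get_env/2 is standard Elixir, and the :bodhi / :ai_client keys are exactly those used in the config above:

iex> Application.get_env(:bodhi, :ai_client)
Bodhi.OpenRouter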

Setting Up API Keys

  1. OpenRouter: create an API key at https://openrouter.ai/ and export it as OPENROUTER_API_KEY (for example in your shell profile or .envrc).

  2. Google Gemini: create a Gemini API key and export it as GEMINI_API_KEY.

  3. Reload environment: direnv allow (if using direnv), then confirm the keys are visible as shown below.
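
A quick way to check that the variables reached the running node (this only verifies they are set, not that the keys are valid):

iex> System.get_env("OPENROUTER_API_KEY") != nil
true
iex> System.get_env("GEMINI_API_KEY") != nil
true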

Changing OpenRouter Model

Edit lib/bodhi/open_router.ex and modify the @default_model attribute:

@default_model "deepseek/deepseek-r1-0528:free"  # Current default

# Other popular models:
# @default_model "anthropic/claude-3.5-sonnet"
# @default_model "openai/gpt-4-turbo"
# @default_model "meta-llama/llama-3.1-70b-instruct"
# @default_model "google/gemini-pro-1.5"

See all available models at: https://openrouter.ai/models
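
Because @default_model is a module attribute, its value is fixed at compile time; restart mix phx.server or recompile from a running IEx session to pick up the change:

iex> recompile()
:ok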

Learn more

  • Official website: https://www.phoenixframework.org/
  • Guides: https://hexdocs.pm/phoenix/overview.html
  • Docs: https://hexdocs.pm/phoenix
  • Forum: https://elixirforum.com/c/phoenix-forum
  • Source: https://github.com/phoenixframework/phoenix
