TweetGPT is a modern, full-stack application that uses Groq or Ollama LLM models to generate, evaluate, and refine tweets automatically, with no manual intervention. It integrates with X (Twitter) OAuth for secure login and posting, features a sleek, mobile-friendly UI built with React + Vite, and deploys via Docker with Nginx for production-ready frontend hosting.
GitHub Repo: https://github.com/ArchitJ6/TweetGPT
- AI-Powered Tweet Generation: generates witty, snappy tweets using Groq or Ollama.
- Iterative Optimization: auto-refines tweets until quality and engagement criteria are met.
- X OAuth Integration: secure sign-in and direct posting to X.
- Automatic Evaluation: AI checks humor, originality, brevity, and style without manual review.
- Fast Model Switching: toggle between Groq and Ollama seamlessly.
- Modern UI: responsive, Twitter-inspired interface.
- Mobile Friendly: optimized for all devices.
- Dockerized Deployment: backend, frontend, and Nginx all containerized.
- Nginx Hosting: production-grade static file serving.
- Autonomous AI Workflow: fully automated generate → evaluate → refine loop.
| Layer | Technology |
|---|---|
| Frontend | React, Vite, JavaScript, CSS |
| Backend | Python, Flask, LangChain, LangGraph |
| AI Models | Groq LLM, Ollama LLM |
| Auth | X (Twitter) OAuth 2.0 |
| Deployment | Docker, Docker Compose, Nginx |
| Version Control | Git, GitHub |
- Python 3.12
- X Developer Account with API access
- Groq API Key
- Ollama installed (if using Ollama mode)
- Docker & Docker Compose
Workflow:
graph TD
A[Start] --> B[Generate Tweet]
B --> C[Evaluate Tweet]
C -->|Approved| E[End]
C -->|Needs Improvement| D[Refine Tweet]
D --> C
Steps:
- Generate: the AI crafts a humorous tweet based on the topic.
- Evaluate: the AI reviews it for creativity, humor, and shareability.
- Refine: the AI improves the tweet if it doesn't pass.
- Loop until the tweet is approved or the maximum number of iterations is reached (see the sketch below).
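Per the tech stack above, the backend builds this loop with LangGraph (llm_workflow.py in the project structure below is the likely home). The following is a minimal, illustrative sketch rather than the repo's actual code; the state fields, node names, and the iteration cap of 5 are all assumptions:

```python
# Sketch of a generate -> evaluate -> refine loop in LangGraph.
# All identifiers here (TweetState, node functions, the cap of 5)
# are illustrative, not the actual names in llm_workflow.py.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class TweetState(TypedDict):
    topic: str
    tweet: str
    approved: bool
    iterations: int

def generate(state: TweetState) -> dict:
    # In the real app, an LLM (Groq or Ollama) drafts the tweet here.
    return {"tweet": f"A witty tweet about {state['topic']}", "iterations": 0}

def evaluate(state: TweetState) -> dict:
    # In the real app, an LLM scores humor, originality, and brevity.
    return {"approved": len(state["tweet"]) <= 280}

def refine(state: TweetState) -> dict:
    # In the real app, an LLM rewrites the draft using evaluator feedback.
    return {"tweet": state["tweet"] + " (sharper)",
            "iterations": state["iterations"] + 1}

def route(state: TweetState) -> str:
    # Exit when the tweet is approved or the iteration budget is spent.
    return "done" if state["approved"] or state["iterations"] >= 5 else "refine"

workflow = StateGraph(TweetState)
workflow.add_node("generate", generate)
workflow.add_node("evaluate", evaluate)
workflow.add_node("refine", refine)
workflow.set_entry_point("generate")
workflow.add_edge("generate", "evaluate")
workflow.add_conditional_edges("evaluate", route, {"done": END, "refine": "refine"})
workflow.add_edge("refine", "evaluate")

app = workflow.compile()
print(app.invoke({"topic": "Mondays", "tweet": "", "approved": False, "iterations": 0}))
```

The conditional edge out of evaluate mirrors the Approved / Needs Improvement branch in the diagram above.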
TweetGPT/
├── backend/
│   ├── .env.example
│   ├── app.py
│   ├── config.py
│   ├── Dockerfile
│   ├── llm_workflow.py
│   └── requirements.txt
├── frontend/
│   ├── .env.example
│   ├── Dockerfile
│   ├── eslint.config.js
│   ├── index.html
│   ├── package-lock.json
│   ├── package.json
│   ├── vite.config.js
│   ├── public/
│   │   └── bot.svg
│   └── src/
│       ├── App.jsx
│       ├── index.css
│       ├── main.jsx
│       ├── components/
│       │   ├── index.js
│       │   ├── NavigationBar.jsx
│       │   └── NotificationBar.jsx
│       ├── pages/
│       │   ├── GeneratorPage.jsx
│       │   ├── HomePage.jsx
│       │   └── index.js
│       └── services/
│           ├── api.js
│           └── index.js
├── .dockerignore
├── .gitignore
└── docker-compose.yml
git clone https://github.com/ArchitJ6/TweetGPT.git
cd TweetGPT
Backend (backend/.env):
cd backend
cp .env.example .env
X_CLIENT_ID=""
X_CLIENT_SECRET=""
X_REDIRECT_URI="http://localhost:5000/callback"
X_AUTH_URL="https://twitter.com/i/oauth2/authorize"
X_TOKEN_URL="https://api.twitter.com/2/oauth2/token"
GROQ_API_KEY="your_groq_api_key"
USE_OLLAMA="true_or_false"
GROQ_LLM_MODEL_NAME="your_groq_llm_model_name"
OLLAMA_MODEL_NAME="your_ollama_model_name"
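The USE_OLLAMA flag drives the fast model switching mentioned above. Here is a minimal sketch of how config.py could pick a model from these variables, assuming the langchain_groq and langchain_ollama packages; the repo's actual wiring may differ:

```python
# Hypothetical model selection driven by the env vars above.
# The actual logic in backend/config.py may differ.
import os
from langchain_groq import ChatGroq
from langchain_ollama import ChatOllama

def get_llm():
    # USE_OLLAMA toggles between a local Ollama model and the Groq API.
    if os.getenv("USE_OLLAMA", "false").lower() == "true":
        return ChatOllama(model=os.environ["OLLAMA_MODEL_NAME"])
    return ChatGroq(
        model=os.environ["GROQ_LLM_MODEL_NAME"],
        api_key=os.environ["GROQ_API_KEY"],
    )
```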
Frontend (frontend/.env):
cd frontend
cp .env.example .env
VITE_API_BASE_URL="http://localhost:5000"
Run the backend:
cd backend
pip install -r requirements.txt
python app.py
Run the frontend:
cd frontend
npm install
npm run dev
Run with Docker:
docker-compose up --build
The frontend will be served via Nginx at http://localhost:5173 and the backend will run at http://localhost:5000.
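With the stack running, a one-off request confirms the backend is reachable. The /generate route and payload below are assumptions for illustration; check backend/app.py for the real endpoints:

```python
# Hypothetical smoke test against the running backend.
# The /generate route and its payload are assumptions; see backend/app.py
# for the actual endpoints.
import requests

resp = requests.post(
    "http://localhost:5000/generate",
    json={"topic": "coffee"},
    timeout=30,
)
print(resp.status_code, resp.text)
```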
MIT License: free to use and modify.
PRs welcome! Please follow the existing code style.
Made with ❤️ by Archit Jain


