Last Updated: December 11, 2025 | Version: 3.1.0
Tech Stack: React 19.2.1 | Vite 7.2.7 | TypeScript 5.9.3 | Cloudflare Workers + Zero Trust
R2 Bucket Manager for Cloudflare — A full-featured, self-hosted web app to manage Cloudflare R2 buckets and objects. Supports job history tracking, AI-powered search, drag-and-drop uploads with verification, batch copy/move/delete, multi-file ZIP downloads, signed share links, folder hierarchies, advanced search + filters (extension, size, date), S3 import, and GitHub SSO via Cloudflare Zero Trust.
Live Demo • Docker • Wiki • Changelog • Release Article
- 🎯 NEW! Custom Metadata - User-defined, searchable bucket tags
- 🚀 NEW! S3 Import [BETA] - Migrate data from Amazon S3, Google Cloud Storage, and any S3-compatible bucket to R2 using Cloudflare's Super Slurper API
- 📊 Metrics Dashboard - Monitor R2 bucket usage, request counts, and storage analytics in real-time
- 🪝 WebHooks - Set up event notifications for bucket operations (uploads, deletes, migrations)
- 📋 Job History & Audit Logging - Complete audit trail for all operations (bulk and individual) with filterable job list and event timeline
- 🤖 AI Search Integration - Connect R2 buckets to Cloudflare AI Search for semantic search and RAG capabilities
- 🔎 Cross-Bucket Search - Search for files across all buckets with advanced filtering
- 🪣 Bucket Management - Create, rename, and delete R2 buckets (with bulk delete support)
- 📦 Multi-Bucket Download - Select and download multiple buckets as a single ZIP archive with a "Select All" button
- 🧭 Bucket Filtering - Filter buckets by name, size, and creation date with preset and custom ranges
- 📁 Folder Management - Create, rename, copy, move, and delete folders with hierarchical navigation
- 📄 File Management - Rename files via right-click context menu with validation
- 🔍 Smart Filtering - Real-time client-side filtering by filename/folder name with type filters (All/Files/Folders)
- 🎯 Advanced Filtering - Filter files by extension, size ranges, and upload dates with preset and custom options
- 📤 Smart Uploads - Chunked uploads with automatic retry and integrity verification (10MB chunks, up to 500MB files)*
- ✓ Upload Verification - MD5 checksum verification ensures uploaded files match stored files exactly
- 📥 Bulk Downloads - Download multiple files as ZIP archives
- 🔗 Shareable Links - Generate signed URLs to share files securely
- 🔄 Advanced File Operations - Move and copy files/folders between buckets and to specific folders within buckets
- 🗑️ Bulk Bucket Delete - Select and force delete multiple buckets at once with progress tracking
- 🧭 Breadcrumb Navigation - Navigate through folder hierarchies with ease
- 🔐 Enterprise Auth - GitHub SSO via Cloudflare Access Zero Trust
- 🛡️ Rate Limiting - Tiered API rate limits (100/min reads, 30/min writes, 10/min deletes) with automatic enforcement
- ⚡ Edge Performance - Deployed on Cloudflare's global network with intelligent client-side caching (5-min TTL)
- 🔄 Smart Retry Logic - Automatic exponential backoff for rate limits and transient errors (429/503/504); a sketch follows this list
- 🎨 Modern UI - Beautiful, responsive interface built with React 19
- 🌓 Light/Dark Mode - Auto-detects system preference with manual toggle (System → Light → Dark)
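The retry behavior described above can be illustrated with a small client-side wrapper around `fetch`. This is a minimal sketch rather than the app's actual implementation; the status codes and the use of `Retry-After` follow the feature description, while the delay values are illustrative.

```typescript
// Minimal sketch of retrying fetch on 429/503/504 with exponential backoff.
// Not the app's actual implementation; delay values are illustrative.
async function fetchWithRetry(
  input: RequestInfo,
  init?: RequestInit,
  maxRetries = 3
): Promise<Response> {
  const retryable = new Set([429, 503, 504]);
  for (let attempt = 0; ; attempt++) {
    const res = await fetch(input, init);
    if (!retryable.has(res.status) || attempt >= maxRetries) return res;
    // Honor Retry-After when present; otherwise back off exponentially (1s, 2s, 4s, ...).
    const retryAfter = Number(res.headers.get("Retry-After"));
    const delayMs = retryAfter > 0 ? retryAfter * 1000 : 1000 * 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```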
Upload Size Limits: This application supports uploads up to 500MB per file. However, Cloudflare enforces plan-based limits:
- Free/Pro Plans: 100MB maximum per file
- Business Plan: 200MB maximum per file
- Enterprise Plan: 500MB maximum per file
Accepted file types and size limits:
- Archives (7Z, GZ, RAR, TAR, ZIP) - up to 500MB
- Audio (AAC, FLAC, M4A, MP3, OGG, OPUS, WAV) - up to 100MB
- Code (CSS, GO, HTML, Java, JS, Rust, TS, Python, etc.) - up to 10MB
- Config & Metadata (CONF, ENV, INI, JSON, JSONC, LOCK, TOML, etc.) - up to 10MB
- Data Formats (AVRO, FEATHER, NDJSON) - up to 50MB
- Databases (DB, PARQUET, SQL) - up to 50MB
- Dev Environment (Dockerfile, editorconfig, .gitignore, nvmrc, etc.) - up to 1MB
- Documents (CSV, Excel, Markdown, PDF, PowerPoint, TXT, Word, etc.) - up to 50MB
- Documentation (NFO) - up to 10MB
- Fonts (EOT, OTF, TTF, WOFF, WOFF2) - up to 10MB
- Images (AVIF, BMP, GIF, HEIC, JPG, PNG, PSD, SVG, WebP) - up to 15MB
- Jupyter Notebooks (.ipynb) - up to 10MB
- Videos (3GP, AVI, FLV, M4V, MKV, MOV, MP4, MPEG, OGG, WebM, WMV) - up to 500MB
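A pre-upload check against these limits might look like the sketch below. The category names and helper are illustrative assumptions; the actual mapping lives in `src/services/api.ts` (see the `FILE_TYPES` object described later in this README).

```typescript
// Illustrative only: the real mapping lives in src/services/api.ts (FILE_TYPES).
const MAX_SIZE_MB: Record<string, number> = {
  archive: 500, // 7Z, GZ, RAR, TAR, ZIP
  audio: 100,   // AAC, FLAC, M4A, MP3, ...
  video: 500,   // MKV, MOV, MP4, WebM, ...
  image: 15,    // AVIF, GIF, JPG, PNG, ...
  document: 50, // CSV, PDF, Word, ...
  code: 10,     // CSS, JS, TS, Python, ...
};

// Returns an error message, or null when the file is within its category limit.
function validateUploadSize(file: File, category: keyof typeof MAX_SIZE_MB): string | null {
  const limitMb = MAX_SIZE_MB[category];
  if (file.size > limitMb * 1024 * 1024) {
    return `${file.name} exceeds the ${limitMb}MB limit for ${category} files`;
  }
  return null;
}
```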
- Cloudflare account (Free tier works!)
- Node.js 18+ and npm
- Wrangler CLI (4.36.0+ for rate limiting)
- Domain managed by Cloudflare (optional - can use Workers.dev subdomain)
Note: Rate limiting requires a Cloudflare Workers paid plan. All other features work on the free tier.
1. Clone and install:

   ```bash
   git clone https://github.com/neverinfamous/R2-Manager-Worker.git
   cd R2-Manager-Worker
   npm install
   ```

2. Configure environment:

   ```bash
   cp .env.example .env
   cp wrangler.toml.example wrangler.toml
   ```

   Edit both files with your settings.

   Optional - Enable Rate Limiting:
   - Rate limiting is pre-configured in `wrangler.toml.example`
   - Requires Wrangler 4.36.0+ and a Workers paid plan
   - If you don't have a paid plan, the app works fine without it

3. Create R2 bucket:

   ```bash
   npx wrangler login
   npx wrangler r2 bucket create your-bucket-name
   ```

4. Configure Cloudflare Access:
   - Set up GitHub OAuth in Zero Trust
   - Create an Access application for your domain
   - Copy the Application Audience (AUD) Tag

5. Set Worker secrets:

   ```bash
   npx wrangler secret put ACCOUNT_ID
   npx wrangler secret put CF_EMAIL
   npx wrangler secret put API_KEY
   npx wrangler secret put TEAM_DOMAIN
   npx wrangler secret put POLICY_AUD
   ```

6. Deploy:

   ```bash
   npm run build
   npx wrangler deploy
   ```
📖 For detailed instructions, see the Installation & Setup Guide.
If you already have Job History set up and want to enable individual action logging (file uploads, downloads, deletes, etc.), run the schema migration to add the new audit_log table:
```bash
npx wrangler d1 execute r2-manager-metadata --remote --file=worker/schema.sql
```

Note: This command is safe to run multiple times - it uses `CREATE TABLE IF NOT EXISTS`, so existing tables are not affected. If you don't run the migration, the application will continue to work normally, but individual actions won't be logged (only bulk operations).
📖 See the Job History Guide for complete documentation.
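For orientation, individual actions become rows in the `audit_log` table via the D1 binding. The snippet below is a hedged sketch of what such a write might look like inside the Worker; the authoritative column set is defined in `worker/schema.sql`, so treat the field names here as assumptions.

```typescript
// Sketch only: column names are assumptions; the authoritative schema is worker/schema.sql.
interface Env {
  METADATA: D1Database; // D1 binding declared in wrangler.toml
}

async function logAction(
  env: Env,
  entry: { operationType: string; bucketName: string; objectKey?: string; userEmail: string; status: string }
): Promise<void> {
  await env.METADATA
    .prepare(
      `INSERT INTO audit_log (operation_type, bucket_name, object_key, user_email, status, created_at)
       VALUES (?1, ?2, ?3, ?4, ?5, datetime('now'))`
    )
    .bind(entry.operationType, entry.bucketName, entry.objectKey ?? null, entry.userEmail, entry.status)
    .run();
}
```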
For local development and testing, use the Docker image:
```bash
docker pull writenotenow/r2-bucket-manager:latest
docker run -p 8787:8787 writenotenow/r2-bucket-manager:latest
```

Access the development server at http://localhost:8787.
Note: Docker deployment is for development/testing only. For production, deploy to Cloudflare Workers using the instructions above.
📖 See the Docker Hub page for complete Docker documentation.
Download multiple buckets simultaneously as a single, timestamped ZIP archive.
- Zero Dependencies: Generates archives on the fly without size limits.
- Smart Structure: Maintains the full folder hierarchy (`/bucket-name/folder/file.ext`).
- Bulk Actions: Dedicated toolbar for "Select All" and batch processing.
Select and force-delete multiple buckets in one operation.
- Safety First: Enhanced confirmation modal calculates total files and size before deletion.
- Force Delete: Automatically empties buckets (recursively) before removing them.
- Progress Tracking: Visual feedback for long-running deletion tasks.
Real-time, server-side parallel search across all your buckets.
- Performance: Queries thousands of files in seconds with minimal overhead.
- Direct Actions: Move, copy, download, or delete files directly from search results.
- Deep Linking: Click bucket badges to navigate directly to the source.
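Conceptually, the Worker fans out one listing per bucket and merges the matches. The sketch below shows that pattern under the assumption of a hypothetical `listObjects(bucket)` helper; it is not the app's actual search code.

```typescript
// Rough sketch of parallel cross-bucket search; listObjects() is a hypothetical helper.
interface ObjectInfo { key: string; size: number; uploaded: string }

async function searchAllBuckets(
  buckets: string[],
  query: string,
  listObjects: (bucket: string) => Promise<ObjectInfo[]>
): Promise<Array<ObjectInfo & { bucket: string }>> {
  const results = await Promise.allSettled(
    buckets.map(async (bucket) => {
      const objects = await listObjects(bucket);
      return objects
        .filter((o) => o.key.toLowerCase().includes(query.toLowerCase()))
        .map((o) => ({ ...o, bucket }));
    })
  );
  // Skip buckets that failed to list instead of failing the whole search.
  return results.flatMap((r) => (r.status === "fulfilled" ? r.value : []));
}
```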
Comprehensive client-side filtering for large buckets.
- Smart Filters: Filter by File Type (Images, Code, Docs), Size (Presets or custom ranges), and Date (Upload time).
- Context Aware: Toggle between "Files Only," "Folders Only," or "All."
- Persistent: Active filters remain applied during batch operations and navigation.
Track all operations with a comprehensive audit trail and event timeline.
- Bulk Operation Tracking: Automatically track bulk downloads, uploads, deletions, moves, and copies
- Individual Action Logging: Every file upload, download, delete, rename, move, and copy is logged
- Folder & Bucket Auditing: Track folder creation, deletion, renaming, and all bucket operations
- Filterable List: Filter by status (success, failed, running), operation type, bucket, and date range
- Grouped Operations: Operation types organized by category (Bulk, File, Folder, Bucket, AI)
- Event Timeline: View detailed progress history for bulk jobs in a modal dialog
- Real-time Progress: See percentage completion and item counts during bulk operations
- Job Search: Quickly find jobs by ID
- Sorting Options: Sort by date, item count, or error count
Job history requires a D1 database. Add the binding to your wrangler.toml:
```toml
[[d1_databases]]
binding = "METADATA"
database_name = "r2-manager-metadata"
database_id = "your-database-id"
```

Create the database and run the schema:

```bash
npx wrangler d1 create r2-manager-metadata
npx wrangler d1 execute r2-manager-metadata --remote --file=worker/schema.sql
```

Connect your R2 buckets to Cloudflare AI Search (formerly AutoRAG) for powerful semantic search and AI-powered question answering.
- Compatibility Analysis: See which files in your bucket can be indexed by AI Search
- Visual Reports: Donut chart showing indexable vs non-indexable file ratios
- Dual Search Modes:
- AI Search: Get AI-generated answers based on your data
- Semantic Search: Retrieve relevant documents without AI generation
- Instance Management: List, sync, and query AI Search instances from the UI
- Direct Dashboard Link: Quick access to Cloudflare Dashboard for instance creation
AI Search can index these file types (up to 4MB each):
- Text: `.txt`, `.md`, `.rst`, `.log`
- Config: `.json`, `.yaml`, `.yml`, `.toml`, `.ini`, `.conf`, `.env`
- Code: `.js`, `.ts`, `.py`, `.html`, `.css`, `.xml`
- Documents: `.tex`, `.latex`, `.sh`, `.bat`, `.ps1`
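The compatibility report is essentially a check along these lines: a file is indexable when its extension is in the supported set and it is no larger than 4MB. This is a sketch mirroring the list above, not the worker's exact code.

```typescript
// Sketch of an AI Search indexability check based on the extension list above.
const INDEXABLE_EXTENSIONS = new Set([
  "txt", "md", "rst", "log",
  "json", "yaml", "yml", "toml", "ini", "conf", "env",
  "js", "ts", "py", "html", "css", "xml",
  "tex", "latex", "sh", "bat", "ps1",
]);
const MAX_INDEXABLE_BYTES = 4 * 1024 * 1024; // 4MB per file

function isIndexable(key: string, sizeBytes: number): boolean {
  const ext = key.split(".").pop()?.toLowerCase() ?? "";
  return INDEXABLE_EXTENSIONS.has(ext) && sizeBytes <= MAX_INDEXABLE_BYTES;
}
```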
Add the AI binding to your wrangler.toml:
```toml
[ai]
binding = "AI"
```

📖 See the AI Search Guide for complete setup instructions.
Intelligent, per-user rate limiting prevents abuse while ensuring fair resource access. Limits are applied based on the authenticated user's email.
| Tier | Operations | Limit | Period | Scope |
|---|---|---|---|---|
| READ | List, Search, Signed URLs | 100 req | 60s | High-volume access |
| WRITE | Upload, Rename, Move | 30 req | 60s | Modification safety |
| DELETE | Remove Files/Buckets | 10 req | 60s | Destructive actions |
Note: Rate limiting returns standard 429 Too Many Requests headers and can be configured or disabled via wrangler.toml.
When rate limited, responses include:
- `Retry-After` - Seconds to wait before retrying
- `X-RateLimit-Limit` - Maximum requests allowed in the period
- `X-RateLimit-Period` - Time period in seconds
- `X-RateLimit-Tier` - Which tier was exceeded (READ/WRITE/DELETE)
Rate limits are defined in wrangler.toml using Cloudflare Workers Rate Limiting API bindings. To modify limits, edit the configuration and redeploy:
```toml
[[ratelimits]]
name = "RATE_LIMITER_READ"
namespace_id = "1001"
simple = { limit = 100, period = 60 }
```

Requirements:
- Wrangler CLI version 4.36.0 or later
- Cloudflare Workers paid plan (rate limiting not available on free plan)
Note: Rate limiting is optional. If not configured in wrangler.toml, the application will function normally without rate limiting protection.
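For reference, Workers Rate Limiting bindings expose a `limit()` call that reports whether a request is allowed. Below is a minimal sketch of how the Worker might consult the READ limiter declared above, keyed by the authenticated user's email; the 429 response body is illustrative, not the app's exact output.

```typescript
// Sketch of consulting a Workers Rate Limiting binding; the 429 body is illustrative.
interface Env {
  RATE_LIMITER_READ: { limit(options: { key: string }): Promise<{ success: boolean }> };
}

async function enforceReadLimit(env: Env, userEmail: string): Promise<Response | null> {
  const { success } = await env.RATE_LIMITER_READ.limit({ key: userEmail });
  if (!success) {
    return new Response(JSON.stringify({ error: "Rate limit exceeded" }), {
      status: 429,
      headers: {
        "Content-Type": "application/json",
        "Retry-After": "60",
        "X-RateLimit-Limit": "100",
        "X-RateLimit-Period": "60",
        "X-RateLimit-Tier": "READ",
      },
    });
  }
  return null; // allowed; continue handling the request
}
```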
Terminal 1: Frontend dev server (Vite)
```bash
npm run dev
```

- Runs on: http://localhost:5173
- Hot Module Replacement (HMR) enabled
- Watches for file changes automatically
Terminal 2: Worker dev server (Wrangler)
```bash
npx wrangler dev --config wrangler.dev.toml --local
```

- Runs on: http://localhost:8787
- Uses local bindings with mock data (no secrets required)
- Automatically reloads on code changes
- Note: Returns mock bucket data for testing the UI without Cloudflare API access
Open your browser to http://localhost:5173 - the frontend will automatically communicate with the Worker API on port 8787.
Note: Authentication and rate limiting are disabled on localhost for easier development. No Cloudflare Access configuration needed for local dev.
- `.env` - Frontend environment variables (points to `http://localhost:8787`)
- `wrangler.dev.toml` - Development-specific Worker config (skips frontend build, adds mock data support)
- `wrangler.toml` - Production Worker config (includes build step)
- Authentication: Automatically disabled for localhost requests
- Rate Limiting: Automatically bypassed for localhost requests
- CORS: Configured to allow `http://localhost:5173` with credentials
- Mock Data: Returns simulated responses (no real Cloudflare API calls)
- No Secrets Required: Works without `ACCOUNT_ID`, `CF_EMAIL`, or `API_KEY`
The following operations return simulated success responses for UI testing:
- ✅ List buckets (returns `dev-bucket`)
- ✅ Create bucket
- ✅ Rename bucket
- ✅ List files (returns empty array)
- ✅ Upload files (simulates success, files not stored)
- ✅ Create folders
Note: Files and folders are not actually stored. Local development is for UI/UX testing only. For full functionality, deploy to Cloudflare Workers.
📖 For more details, see the Development Guide.
| Component | Technology | Version |
|---|---|---|
| Frontend | React | 19.2.1 |
| Build Tool | Vite | 7.2.7 |
| Language | TypeScript | 5.9.3 |
| Backend | Cloudflare Workers | Runtime API |
| Storage | Cloudflare R2 | S3-compatible |
| Auth | Cloudflare Access | Zero Trust |
├── src/ # Frontend source code
│ ├── app.tsx # Main UI component
│ ├── filegrid.tsx # File browser with grid/list views
│ └── services/ # API client and auth utilities
├── worker/
│ ├── index.ts # Worker runtime & API endpoints
│ ├── routes/ # API route handlers
│ └── utils/ # Helper utilities
├── wrangler.toml.example # Wrangler configuration template
└── .env.example # Environment variables template
📖 For complete API documentation, see the API Reference.
- `GET /api/buckets` - List all buckets
- `POST /api/buckets` - Create a new bucket
- `DELETE /api/buckets/:bucketName` - Delete a bucket (with optional `?force=true`)
- `PATCH /api/buckets/:bucketName` - Rename a bucket

- `GET /api/files/:bucketName` - List files in a bucket (supports `?cursor`, `?limit`, `?prefix`, `?skipCache`)
- `POST /api/files/:bucketName/upload` - Upload a file (supports chunked uploads)
- `GET /api/files/:bucketName/signed-url/:fileName` - Generate a signed download URL
- `POST /api/files/:bucketName/download-zip` - Download multiple files as ZIP
- `DELETE /api/files/:bucketName/delete/:fileName` - Delete a file
- `POST /api/files/:bucketName/:fileName/copy` - Copy a file to another bucket or folder (supports `destinationPath`)
- `POST /api/files/:bucketName/:fileName/move` - Move a file to another bucket or folder (supports `destinationPath`)
- `PATCH /api/files/:bucketName/:fileName/rename` - Rename a file within the same bucket

- `POST /api/folders/:bucketName/create` - Create a new folder
- `PATCH /api/folders/:bucketName/rename` - Rename a folder (batch operation)
- `POST /api/folders/:bucketName/:folderPath/copy` - Copy a folder to another bucket or folder (supports `destinationPath`)
- `POST /api/folders/:bucketName/:folderPath/move` - Move a folder to another bucket or folder (supports `destinationPath`)
- `DELETE /api/folders/:bucketName/:folderPath` - Delete a folder and its contents (with optional `?force=true`)

- `GET /api/ai-search/compatibility/:bucketName` - Analyze bucket files for AI Search indexability
- `GET /api/ai-search/instances` - List AI Search instances
- `POST /api/ai-search/instances` - Create an AI Search instance
- `DELETE /api/ai-search/instances/:name` - Delete an AI Search instance
- `POST /api/ai-search/instances/:name/sync` - Trigger instance re-indexing
- `POST /api/ai-search/:instanceName/search` - Semantic search (retrieval only)
- `POST /api/ai-search/:instanceName/ai-search` - AI-powered search with generated response

- `GET /api/jobs` - List jobs with filtering (supports `?status`, `?operation_type`, `?bucket_name`, `?start_date`, `?end_date`, `?job_id`, `?min_errors`, `?limit`, `?offset`, `?sort_by`, `?sort_order`)
- `GET /api/jobs/:jobId` - Get job status and details
- `GET /api/jobs/:jobId/events` - Get job event timeline
- `GET /api/audit` - List audit log entries with filtering (supports `?operation_type`, `?bucket_name`, `?status`, `?start_date`, `?end_date`, `?user_email`, `?limit`, `?offset`, `?sort_by`, `?sort_order`)
- `GET /api/audit/summary` - Get operation counts grouped by type
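For a quick illustration of calling these endpoints from a script: the base URL below is a placeholder, and requests must still pass Cloudflare Access (for example with a valid `CF_Authorization` cookie or an Access service token), so treat this as a sketch rather than a ready-made client.

```typescript
// Illustrative client calls; BASE_URL is a placeholder and requests must still
// satisfy Cloudflare Access (cookie or service token).
const BASE_URL = "https://r2-manager.example.com";

async function listBuckets(): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/api/buckets`, { credentials: "include" });
  if (!res.ok) throw new Error(`List buckets failed: ${res.status}`);
  return res.json();
}

async function getSignedUrl(bucket: string, fileName: string): Promise<unknown> {
  const res = await fetch(
    `${BASE_URL}/api/files/${encodeURIComponent(bucket)}/signed-url/${encodeURIComponent(fileName)}`,
    { credentials: "include" }
  );
  if (!res.ok) throw new Error(`Signed URL request failed: ${res.status}`);
  return res.json();
}
```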
- ✅ Zero Trust Architecture - All requests authenticated by Cloudflare Access
- ✅ JWT Validation - Tokens verified on every API call
- ✅ Rate Limiting - Tiered API rate limits prevent abuse and ensure fair usage
- ✅ HTTPS Only - All traffic encrypted via Cloudflare's edge network
- ✅ Signed URLs - Download operations use HMAC-SHA256 signatures (see the sketch after this list)
- ✅ No Stored Credentials - No user passwords stored anywhere
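The HMAC-SHA256 signatures behind signed URLs can be produced with the Web Crypto API available in Workers. The following is a hedged sketch; the app's actual payload format and query parameters may differ.

```typescript
// Sketch of HMAC-SHA256 URL signing with Web Crypto; the app's real URL format may differ.
async function signDownloadUrl(
  secret: string,
  bucket: string,
  key: string,
  expiresAt: number // unix seconds
): Promise<string> {
  const encoder = new TextEncoder();
  const cryptoKey = await crypto.subtle.importKey(
    "raw", encoder.encode(secret), { name: "HMAC", hash: "SHA-256" }, false, ["sign"]
  );
  const payload = `${bucket}/${key}:${expiresAt}`;
  const sigBuffer = await crypto.subtle.sign("HMAC", cryptoKey, encoder.encode(payload));
  const signature = [...new Uint8Array(sigBuffer)]
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
  // Hypothetical URL shape for illustration only.
  return `/api/download/${bucket}/${key}?expires=${expiresAt}&sig=${signature}`;
}
```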
📖 Learn more in the Authentication & Security Guide.
- File Versioning - Track and restore previous file versions
- Offline Upload Queue - Resumable uploads with service workers
📖 See the full Roadmap for details.
You can configure R2 Bucket Manager to hide specific buckets from the UI (e.g., system buckets, internal buckets, or buckets managed by other applications).
- Edit `worker/index.ts`:
  - Locate the `systemBuckets` array (around line 373)
  - Add your bucket name(s) to the array:

  ```typescript
  const systemBuckets = ['r2-bucket', 'sqlite-mcp-server-wiki', 'your-bucket-name'];
  ```

- Deploy the changes:

  ```bash
  npx wrangler deploy
  ```

To hide buckets named `blog-wiki` and `internal-data`:

```typescript
const systemBuckets = ['r2-bucket', 'sqlite-mcp-server-wiki', 'blog-wiki', 'internal-data'];
```

Note: Hidden buckets are completely filtered from the API response and won't appear in the bucket list or be accessible through the UI.
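At runtime the Worker simply drops these names before returning the bucket list; conceptually something like the sketch below (not the exact code in `worker/index.ts`):

```typescript
// Sketch: hidden buckets are filtered out before the list is returned to the UI.
function filterHiddenBuckets<T extends { name: string }>(
  allBuckets: T[],
  systemBuckets: string[]
): T[] {
  return allBuckets.filter((bucket) => !systemBuckets.includes(bucket.name));
}
```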
📖 See the Configuration Reference for more details.
To add support for new file extensions:
1. Update `src/services/api.ts`:
   - Add the file type category to the `FILE_TYPES` object
   - Map the extensions in `getConfigByExtension()`

2. Update `src/filegrid.tsx`:
   - Add custom icon rendering for the new extensions

3. Update `src/app.tsx`:
   - Add the file type to the upload instructions
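As a rough illustration of step 1, adding a new category might look like the sketch below. The exact shape of `FILE_TYPES` and `getConfigByExtension()` in `src/services/api.ts` may differ, so the field names here are assumptions.

```typescript
// Illustrative only: the real FILE_TYPES object and getConfigByExtension()
// in src/services/api.ts may use different field names.
const FILE_TYPES = {
  // ...existing categories...
  ebook: { extensions: ["epub", "mobi", "azw3"], maxSizeMB: 50 },
};

function getConfigByExtension(fileName: string) {
  const ext = fileName.split(".").pop()?.toLowerCase() ?? "";
  for (const [category, config] of Object.entries(FILE_TYPES)) {
    if (config.extensions.includes(ext)) return { category, ...config };
  }
  return null; // unsupported extension
}
```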
📖 See the Development Guide for complete instructions.
Common issues and solutions:
- Authentication errors: Verify the `TEAM_DOMAIN` and `POLICY_AUD` secrets
- Bucket not found: Check that the `wrangler.toml` bucket name matches exactly
- Upload failures: Verify your Cloudflare plan's upload size limits
- Deployment issues: Re-authenticate with `npx wrangler login`
📖 For detailed solutions, see the Troubleshooting Guide.
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
📖 See the Development Guide for setup instructions.
This project is licensed under the MIT License - see the LICENSE file for details.
- 🐛 Bug Reports: GitHub Issues
- 💡 Feature Requests: GitHub Discussions
- 📖 Documentation: GitHub Wiki
- ❓ Questions: FAQ
- Cloudflare Workers Documentation
- Cloudflare R2 Storage Documentation
- Cloudflare Access (Zero Trust) Documentation
- React 19 Documentation
- Vite Documentation
Made with ❤️ for the Cloudflare community