
Pulse is a real-time trend and data monitoring engine that tracks live business metrics, detects emerging patterns, and generates executive-grade insight briefs. Built under the Algorzen Research Division by Rishi Singh, Pulse acts as the heartbeat of your analytics ecosystem.



🌐 Algorzen Pulse

Real-Time Trend & Data Monitoring Engine

Algorzen Research Division © 2025 | Author: Rishi Singh


*[Image: Algorzen Pulse Banner]*

"The heartbeat of your data."

Algorzen Pulse is the third release in the Algorzen Intelligence Suite, providing continuous real-time monitoring and trend analysis for business data streams. It detects patterns, forecasts shifts, and delivers actionable insights before they appear in traditional dashboards.


🎯 Overview

*[Image: System Architecture]*

Algorzen Pulse extends the Algorzen ecosystem by adding the heartbeat layer: a continuous awareness system that watches for shifts, spikes, and opportunities in your data streams.

The Algorzen Intelligence Trilogy

  1. Eviden → Insight Generation
  2. Vigil → Anomaly Detection
  3. Pulse → Real-Time Monitoring ✨

✨ Features

*[Image: Features Overview]*

🔄 Live Data Stream Simulation

  • Continuously fetch or simulate real-time data points
  • Support for multiple KPIs (sales, engagement, conversions, revenue)
  • Realistic time-series generation with trends, seasonality, and noise

📊 Advanced Trend Analysis

  • Moving Averages: Short-term and long-term trend detection
  • Rate-of-Change: Momentum and acceleration analysis
  • Direction Detection: Automatic classification (↑ increasing, ↓ declining, ⚖️ stable)
  • Spike/Dip Detection: Statistical anomaly identification
  • Forecasting: Next-value prediction using exponential smoothing
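The forecasting bullet above can be sketched in a few lines of plain Python. This is an illustrative stand-alone version of simple exponential smoothing, not the actual `trend_analyzer.py` implementation; the smoothing factor `alpha` is a made-up default:

```python
def forecast_next(values, alpha=0.3):
    """Simple exponential smoothing: blend each new observation with the
    previous smoothed level, then use the final level as the forecast."""
    level = values[0]
    for v in values[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

print(forecast_next([100, 102, 101, 105, 110]))  # smoothed forecast near the recent values
```

Larger `alpha` weights recent points more heavily, so the forecast reacts faster to new trends at the cost of more noise.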

🤖 AI-Powered Insights

  • GPT-4 Integration: Executive-style narrative summaries
  • Intelligent Fallback: Rule-based summarization when AI is unavailable
  • Priority Classification: Automatic insight ranking (high/medium/low)

📄 Professional Reporting

  • PDF Reports: Executive-grade reports with visualizations and branding
  • HTML Reports: Interactive web-based reports
  • Metadata Tracking: Full reproducibility with JSON metadata
  • Custom Branding: Algorzen Research Division styling throughout

📈 Interactive Dashboard

  • Streamlit Integration: Real-time visualization interface
  • Dynamic Charts: Interactive Plotly visualizations
  • Trend Gauges: Visual trend strength indicators
  • Data Export: Download reports and data

🧱 Architecture

```
algorzen-pulse/
├── main.py                     # CLI orchestration script
├── data_stream.py              # Data fetching & simulation engine
├── trend_analyzer.py           # Pattern detection & forecasting
├── ai_summary.py               # GPT-4 / fallback summary engine
├── report_generator.py         # PDF + HTML report generator
├── app/
│   └── streamlit_app.py        # Interactive dashboard
├── data/
│   └── sample_live_feed.csv    # Sample dataset
├── reports/
│   ├── Pulse_Report_*.pdf      # Generated reports
│   ├── Pulse_Report_*.html
│   └── report_metadata.json
├── requirements.txt
├── .env.example
└── README.md
```

βš™οΈ Tech Stack

  • Python 3.10+
  • Data Processing: Pandas, NumPy
  • Visualization: Matplotlib, Seaborn, Plotly
  • Analysis: Statsmodels, Prophet
  • Reporting: ReportLab, Pillow
  • AI: OpenAI API (GPT-4)
  • Dashboard: Streamlit
  • Configuration: python-dotenv

🚀 Setup

1. Clone the Repository

```bash
git clone <repository-url>
cd algorzen-pulse
```

2. Create Virtual Environment

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```

3. Install Dependencies

```bash
pip install -r requirements.txt
```

4. Configure Environment (Optional)

For OpenAI integration:

```bash
cp .env.example .env
# Edit .env and add your OpenAI API key
```

Note: The system also runs without OpenAI; it falls back to rule-based summaries automatically.


💻 Usage

Quick Start

Generate sample data and run analysis:

```bash
# Step 1: Generate sample data (30 days of metrics)
python main.py --generate-sample

# Step 2: Run analysis and generate reports
python main.py --input data/sample_live_feed.csv
```

Command-Line Options

```bash
# Basic analysis
python main.py --input data/sample_live_feed.csv

# With OpenAI integration
python main.py --input data/my_data.csv --use-openai

# Custom analysis windows
python main.py --input data/my_data.csv --window-short 12 --window-long 72

# Generate custom sample data
python main.py --generate-sample --num-days 60 --output data/custom_feed.csv

# View all options
python main.py --help
```

Available Arguments

| Argument | Description | Default |
|----------|-------------|---------|
| `--input` | Path to input CSV file | `data/sample_live_feed.csv` |
| `--use-openai` | Use GPT-4 for summaries | `False` |
| `--window-short` | Short-term window (hours) | `24` |
| `--window-long` | Long-term window (hours) | `168` (1 week) |
| `--generate-sample` | Generate sample data | - |
| `--num-days` | Days of sample data | `30` |
| `--output` | Sample data output path | `data/sample_live_feed.csv` |
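As a rough sketch, the arguments in the table above map onto a standard `argparse` definition like the following. This is illustrative only; the real `main.py` may wire these options differently:

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(
        description="Algorzen Pulse CLI (illustrative sketch, not the real main.py)")
    parser.add_argument("--input", default="data/sample_live_feed.csv",
                        help="Path to input CSV file")
    parser.add_argument("--use-openai", action="store_true",
                        help="Use GPT-4 for summaries")
    parser.add_argument("--window-short", type=int, default=24,
                        help="Short-term window in hours")
    parser.add_argument("--window-long", type=int, default=168,
                        help="Long-term window in hours (1 week)")
    parser.add_argument("--generate-sample", action="store_true",
                        help="Generate sample data instead of analyzing")
    parser.add_argument("--num-days", type=int, default=30,
                        help="Days of sample data to generate")
    parser.add_argument("--output", default="data/sample_live_feed.csv",
                        help="Output path for generated sample data")
    return parser

args = build_parser().parse_args(["--window-short", "12", "--use-openai"])
print(args.window_short, args.use_openai)  # → 12 True
```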

📊 Interactive Dashboard

*[Image: Dashboard Overview]*

Launch the Streamlit dashboard for real-time visualization:

```bash
streamlit run app/streamlit_app.py
```

Dashboard Features:

  • Upload custom CSV data or use sample data
  • Interactive time-series charts
  • Trend strength gauges per metric
  • Real-time insight generation
  • Executive summary display
  • PDF report generation
  • Data export functionality

*[Image: Dashboard Insights]*


πŸ“ Input Data Format

Your CSV file should have this structure:

```csv
timestamp,sales,engagement,conversions,revenue
2025-01-01 00:00:00,10234.56,5123.45,512.34,51234.56
2025-01-01 01:00:00,10456.78,5234.56,523.45,52345.67
...
```

Requirements:

  • timestamp column (datetime format)
  • One or more metric columns (numeric values)
  • Hourly or finer granularity recommended
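A quick sanity check for a feed in this shape can be written with pandas. This is a sketch; the inline CSV stands in for your real file:

```python
import io
import pandas as pd

# Inline sample standing in for data/sample_live_feed.csv
CSV = """timestamp,sales,engagement
2025-01-01 00:00:00,10234.56,5123.45
2025-01-01 01:00:00,10456.78,5234.56
2025-01-01 02:00:00,10311.02,5180.20
"""

df = pd.read_csv(io.StringIO(CSV), parse_dates=["timestamp"])

# The timestamp column must parse as datetimes ...
assert pd.api.types.is_datetime64_any_dtype(df["timestamp"])
# ... and hourly (or finer) spacing keeps moving-average windows meaningful
step = df["timestamp"].diff().dropna().max()
assert step <= pd.Timedelta(hours=1)
print("feed OK:", len(df), "rows")
```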

📄 Output Files

After running analysis:

1. PDF Report (reports/Pulse_Report_YYYYMMDD.pdf)

*[Image: PDF Report Sample]*

  • Executive summary
  • Key metrics overview
  • Trend visualizations
  • Priority insights
  • Algorzen branding

2. HTML Report (reports/Pulse_Report_YYYYMMDD.html)

*[Image: HTML Report Sample]*

  • Interactive web-based report
  • Same content as PDF with responsive design
  • Embedded charts

3. Metadata (reports/report_metadata.json)

```json
{
  "project": "Algorzen Pulse",
  "report_id": "PULSE-2025-Q4-001",
  "generated_by": "Rishi Singh",
  "created_at": "2025-11-15T14:30:00",
  "tone": "Executive Business",
  "openai_used": false,
  "company": "Algorzen Research Division",
  "year": "2025"
}
```
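The metadata file is plain JSON, so reading or writing it needs only the standard library. A minimal sketch using a subset of the fields above:

```python
import json
from datetime import datetime

# Field names mirror the report_metadata.json example above
metadata = {
    "project": "Algorzen Pulse",
    "report_id": "PULSE-2025-Q4-001",
    "created_at": datetime(2025, 11, 15, 14, 30).isoformat(),
    "openai_used": False,
}

blob = json.dumps(metadata, indent=2)   # what would be written to disk
restored = json.loads(blob)             # what a later run would read back
print(restored["report_id"], restored["openai_used"])
```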

🧩 Example Workflow

Complete Analysis Pipeline

```bash
# 1. Generate fresh sample data
python main.py --generate-sample --num-days 45

# 2. Run analysis with OpenAI
python main.py --input data/sample_live_feed.csv --use-openai

# 3. Launch dashboard for interactive exploration
streamlit run app/streamlit_app.py
```

Python API Usage

```python
from data_stream import DataStream
from trend_analyzer import TrendAnalyzer
from ai_summary import AISummaryEngine
from report_generator import ReportGenerator

# Load or generate data
stream = DataStream()
df = stream.generate_live_feed(
    output_path='data/my_feed.csv',
    num_days=30,
    metrics=['sales', 'engagement', 'conversions']
)

# Analyze trends
analyzer = TrendAnalyzer(df)
trends = analyzer.analyze_all_metrics()
insights = analyzer.get_top_insights()
summary_stats = analyzer.get_summary_stats()

# Generate AI summary
ai_engine = AISummaryEngine(use_openai=True)
executive_summary = ai_engine.generate_executive_summary(
    trends, insights, summary_stats
)

# Create report
report_gen = ReportGenerator()
report_files = report_gen.generate_report(
    df, trends, insights, summary_stats,
    executive_summary, use_openai=True
)

print(f"Report generated: {report_files['pdf']}")
```

🎨 Customization

Adding Custom Metrics

Edit your input CSV to include new metrics:

```csv
timestamp,sales,engagement,conversions,revenue,churn_rate,nps_score
```

The system automatically detects and analyzes all numeric columns.
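One common way to implement that auto-detection is pandas' `select_dtypes`; this sketch illustrates the idea rather than the project's exact code:

```python
import pandas as pd

df = pd.DataFrame({
    "timestamp": pd.to_datetime(["2025-01-01 00:00", "2025-01-01 01:00", "2025-01-01 02:00"]),
    "sales": [100.0, 110.0, 105.0],
    "nps_score": [41, 43, 44],
    "region": ["us", "eu", "us"],   # non-numeric, so it is skipped
})

# Every numeric column becomes a monitored metric; timestamps and text are ignored
metrics = df.select_dtypes("number").columns.tolist()
print(metrics)  # → ['sales', 'nps_score']
```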

Adjusting Analysis Windows

```bash
# Short-term: 12 hours, Long-term: 3 days (72 hours)
python main.py --input data/my_data.csv --window-short 12 --window-long 72
```

Custom Thresholds

Modify trend_analyzer.py to adjust spike/dip detection:

```python
spikes = self._detect_spikes(series, threshold=2.5)  # higher threshold = fewer, stronger anomalies flagged
```

🧠 How It Works

Analysis Pipeline

1. Data Stream Simulation

Generates realistic time-series data with:

  • Baseline trends: Gradual upward/downward movement
  • Seasonality: Daily and weekly patterns
  • Noise: Random variation (±5% of baseline)
  • Spikes: Occasional anomalies (3% probability)
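The recipe above (baseline drift + seasonality + noise + rare spikes) can be sketched in plain Python. The constants here are illustrative, not the values `data_stream.py` actually uses:

```python
import math
import random

def simulate_feed(hours=72, base=1000.0, drift=0.5, seed=42):
    """Synthetic hourly series: linear drift + daily seasonality
    + ~5%-scale noise + rare (3%) spikes."""
    rng = random.Random(seed)
    points = []
    for h in range(hours):
        trend = base + drift * h                              # gradual baseline movement
        season = 0.1 * base * math.sin(2 * math.pi * h / 24)  # daily cycle
        noise = rng.gauss(0, 0.05 * base)                     # random variation
        spike = 0.3 * base if rng.random() < 0.03 else 0.0    # occasional anomaly
        points.append(trend + season + noise + spike)
    return points

feed = simulate_feed()
print(len(feed), "points generated")
```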

2. Trend Analysis

For each metric:

  • Calculate short-term and long-term moving averages
  • Compute rate-of-change (momentum)
  • Detect direction via MA crossover
  • Identify spikes (> mean + 2σ) and dips (< mean - 2σ)
  • Forecast next value using exponential smoothing
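The moving-average, crossover, and spike-detection steps above can be illustrated stand-alone. Window sizes and the small ±0.1% stability band are invented for this sketch; the real `trend_analyzer.py` may differ:

```python
from statistics import mean, stdev

def moving_average(values, window):
    """Trailing moving average; early points use a shorter window."""
    return [mean(values[max(0, i - window + 1): i + 1]) for i in range(len(values))]

def analyze(values, short=3, long=6, sigma=2.0):
    ma_short = moving_average(values, short)
    ma_long = moving_average(values, long)
    # Direction via MA crossover, with a small band treated as "stable"
    if ma_short[-1] > ma_long[-1] * 1.001:
        direction = "increasing"
    elif ma_short[-1] < ma_long[-1] * 0.999:
        direction = "declining"
    else:
        direction = "stable"
    # Spikes/dips: points beyond `sigma` standard deviations from the mean
    mu, sd = mean(values), stdev(values)
    spikes = [i for i, v in enumerate(values) if v > mu + sigma * sd]
    dips = [i for i, v in enumerate(values) if v < mu - sigma * sd]
    return direction, spikes, dips

print(analyze([100, 160, 101, 102, 103, 104, 105, 106, 107, 108]))  # spike at index 1
```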

3. Insight Generation

  • Classify trends by priority (high/medium/low)
  • Generate human-readable messages
  • Sort by business impact
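Priority classification can be as simple as thresholds on momentum; the cutoffs below are invented for illustration:

```python
def classify_priority(rate_of_change):
    """Bucket a percent rate-of-change into a priority level
    (thresholds are illustrative, not the project's)."""
    magnitude = abs(rate_of_change)
    if magnitude >= 10.0:
        return "high"
    if magnitude >= 3.0:
        return "medium"
    return "low"

# Rank metrics by business impact, proxied here by momentum magnitude
insights = {"sales": 12.4, "engagement": -4.2, "conversions": 0.8}
ranked = sorted(insights.items(), key=lambda kv: abs(kv[1]), reverse=True)
print([(metric, classify_priority(roc)) for metric, roc in ranked])
```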

4. AI Summarization

  • With OpenAI: GPT-4 creates executive narrative
  • Without OpenAI: Rule-based summary with business context

5. Report Generation

  • Create visualizations (time-series, bar charts, pie charts)
  • Build PDF with ReportLab
  • Generate responsive HTML
  • Save metadata for reproducibility
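The HTML side of step 5 needs nothing beyond string templating. A minimal sketch with the standard library; the real `report_generator.py` presumably embeds charts and Algorzen branding as well:

```python
from string import Template

# Hypothetical template; real reports would be much richer
PAGE = Template("""<!doctype html>
<html><head><title>$title</title></head>
<body><h1>$title</h1><p>$summary</p><ul>$items</ul></body></html>""")

def render_report(title, summary, insights):
    """Fill the template with a summary paragraph and a list of insights."""
    items = "".join(f"<li>{text}</li>" for text in insights)
    return PAGE.substitute(title=title, summary=summary, items=items)

html = render_report("Pulse Report", "Sales are trending up.", ["sales: increasing"])
print(html.splitlines()[0])  # → <!doctype html>
```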

🔒 Privacy & Security

  • Local-First: All processing happens locally
  • API Key Safety: OpenAI keys stored in .env (gitignored)
  • No Data Sharing: Your data never leaves your machine
  • Open Source: Full transparency of algorithms

πŸ› Troubleshooting

Issue: "OpenAI package not installed"

```bash
pip install openai
```

Issue: "No module named 'reportlab'"

```bash
pip install reportlab
```

Issue: Sample data not generating

Ensure data/ directory exists:

```bash
mkdir -p data
python main.py --generate-sample
```

Issue: Dashboard not loading

```bash
# Install Streamlit
pip install streamlit

# Run from project root
streamlit run app/streamlit_app.py
```

📈 Performance

  • Data Processing: ~1000 records/second
  • Report Generation: ~5-10 seconds for 30 days of data
  • Memory Usage: <500MB for typical datasets
  • Recommended Data Size: 7-90 days of hourly data

πŸ—ΊοΈ Roadmap

  • Real-time streaming from APIs (Stripe, Google Analytics, etc.)
  • Advanced forecasting with Prophet/ARIMA
  • Multi-dimensional correlation analysis
  • Alert system (email/Slack notifications)
  • Database integration (PostgreSQL, MongoDB)
  • Custom dashboard themes
  • Export to PowerPoint

🤝 Contributing

This is a proprietary research project by Algorzen Research Division.
For collaboration inquiries, contact: Rishi Singh


📜 License

Proprietary Software
© 2025 Algorzen Research Division. All rights reserved.
Author: Rishi Singh

Unauthorized copying, distribution, or modification is prohibited.


πŸ™ Acknowledgments

Part of the Algorzen Intelligence Suite:

  1. Algorzen Eviden - Insight Generation Engine
  2. Algorzen Vigil - Anomaly Detection System
  3. Algorzen Pulse - Real-Time Monitoring Platform

📞 Contact

Author: Rishi Singh
Organization: Algorzen Research Division
Project: Algorzen Pulse (Drop 003)
Year: 2025


Algorzen Research Division © 2025
The heartbeat of your data.
