
📔 Autopoesis

Recursive autojournaling system for self-creation through activity tracking and LLM-powered generation.
[Work in Progress] The framework is currently in the specification phase. The reference implementation is my personal site/repo at github.com/SyntaxAsSpiral/SyntaxAsSpiral.

Concept

Autopoesis abstracts the patterns from pulse-log into a reusable framework where you can define:

  • Activity sources: Memory stores (where AI agents write events), git commits, shell history, chat logs, or custom sources
  • Field schemas: What content to generate (status, narrative, glyphs, etc.)
  • Output targets: Static HTML, markdown, social media, or custom destinations

The system creates a recursive feedback loop where generated content feeds back as examples, developing a consistent voice over time.
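A minimal sketch of that loop, assuming a cache of previously generated entries and a hypothetical `build_prompt` helper (the names and sampling strategy are illustrative, not part of the spec):

```python
import random

def build_prompt(activity: list[str], cached_entries: list[str], k: int = 3) -> str:
    """Assemble an LLM prompt that mixes fresh activity with past outputs.

    Sampling previously generated entries back into the prompt is what makes
    the loop recursive: earlier entries become the style examples for new ones.
    """
    examples = random.sample(cached_entries, min(k, len(cached_entries)))
    return (
        "Write a short journal entry in the same voice as these examples:\n\n"
        + "\n---\n".join(examples)
        + "\n\nRecent activity:\n"
        + "\n".join(f"- {item}" for item in activity)
    )
```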

Memory System Integration

Autopoesis connects to your existing memory systems - it doesn't build its own. It works whether you use:

  • Markdown vaults (Obsidian, Logseq, Dendron, plain files)
  • LLM memory systems (mem0, Zep, Letta/MemGPT)
  • MCP memory servers (semantic graphs, knowledge bases)
  • Databases (SQLite, Postgres, MongoDB)
  • Simple logs (JSON, JSONL, CSV files)

The system provides adapters to pull data from these sources. The key requirement: timestamps. Narrative generation is driven by the most recent memories, ordered chronologically. Each adapter knows how to:

  1. Extract timestamps from your memory format
  2. Query within a time window
  3. Return ordered activity data

This means AI agents can write to whatever memory system works for you (Obsidian daily notes, a mem0 instance, an MCP server, etc.) and Autopoesis will read from it to generate journal entries.
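A minimal sketch of what such an adapter could look like; the `MemoryAdapter` protocol and `ActivityEvent` type are assumptions for illustration, not the framework's actual API:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Iterable, Protocol

@dataclass
class ActivityEvent:
    timestamp: datetime  # extracted from the source's native format
    source: str          # e.g. "obsidian", "git", "shell"
    content: str         # raw text of the memory or event

class MemoryAdapter(Protocol):
    """Anything that can query a time window and yield chronologically ordered events."""

    def events(self, since: datetime, until: datetime) -> Iterable[ActivityEvent]:
        ...
```

A markdown-vault adapter might parse dates from daily-note filenames, while a database adapter would translate the window into a query; both would satisfy the same interface.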

🚧 Current Status

Specification phase. Requirements and design documents are being developed. The working implementation exists in my personal site repo as a concrete instance.

See .kiro/specs/autopoesis/ for detailed requirements and design documentation.

🎯 Planned Architecture

Core Components

  • Memory System Adapters: Connect to existing memory systems (markdown, MCP, databases, etc.)
  • Activity Sources: Git, shell history, chat logs, and other input streams
  • Timestamp Extraction: Parse dates from various formats to order activities chronologically
  • Context Aggregation: Session grouping and LLM-ready formatting
  • Field Generation: Batch LLM calls with recursive cache feedback
  • Output Targets: Static HTML, markdown, social media, custom destinations

Flow

AI Agents → Memory System → Autopoesis Adapter → Context → LLM → Journal Entry → Output
   ↓            ↓              ↓                    ↓        ↓         ↓            ↓
 (write)    (your choice)  (timestamp aware)   (temporal) (generate) (narrative) (publish)

The system pulls the most recent memories from your memory system, orders them chronologically, generates narrative fields, and publishes to your chosen outputs.
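A sketch of that flow end to end, reusing the illustrative names above and assuming `llm` and `publish` are user-supplied callables:

```python
from datetime import datetime, timedelta

def run_pipeline(adapter, llm, publish, cache: list[str], window_hours: int = 24) -> str:
    """Pull recent memories, generate a journal entry, publish it, and cache it."""
    now = datetime.now()
    events = sorted(
        adapter.events(since=now - timedelta(hours=window_hours), until=now),
        key=lambda e: e.timestamp,          # chronological order drives the narrative
    )
    activity = "\n".join(f"- {e.content}" for e in events)
    examples = "\n---\n".join(cache[-3:])   # recursive feedback: recent outputs as style examples
    entry = llm(f"Past entries:\n{examples}\n\nRecent activity:\n{activity}")
    publish(entry)                          # e.g. render static HTML or markdown
    cache.append(entry)                     # the new entry becomes a future example
    return entry
```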

Example Instance: pulse-log

My personal implementation generates:

  • status: Brief update on current work
  • subject: Main focus area
  • mode: Work mode (building, exploring, refining, integrating)
  • glyph: Emoji representing the session essence
  • narrative: Poetic description of the work

Output: Static HTML homepage updated with each journal entry.
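For illustration only, a single generated entry might look like the following; the values are invented, not actual output from pulse-log:

```python
# Hypothetical entry; every field value here is invented for illustration.
entry = {
    "status": "wiring the memory adapter layer",
    "subject": "autopoesis framework",
    "mode": "building",
    "glyph": "🛠️",
    "narrative": "The adapters learn to read time; the journal learns to read me back.",
}
```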

🔗 Related Projects

  • pulse-log: First concrete instance (dev journaling → HTML homepage)
  • collectivist: Complementary input side (curates static collections → living indices)
  • amexsomnemon: Memory substrate layer (the living archive)

The trinity:

  • collectivist: Past (what you've collected)
  • autopoesis: Present (what you're creating now)
  • amexsomnemon: Memory substrate (the living archive)

🎮 Future Installation

# Planned: pip install autopoesis
# autopoesis init my-journal --config journal.yaml
# autopoesis run

The generic framework is in development. For now, see the reference implementation.

Configuration Model (Planned)

Each autopoesis instance will have (see the sketch after this list):

  • Source config: Which activity sources, time windows, filters
  • Field schema: What fields to generate, formats, constraints
  • Generation config: Models, temperatures, batch groups, prompt templates
  • Output config: Where to send content, formatting rules
  • Cache config: Recursive feedback settings, sampling strategies
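A hypothetical shape for such a config, shown as a Python dict standing in for a journal.yaml file; every key and value below is an assumption, since the real schema is still being specified:

```python
config = {
    "sources": [
        {"type": "markdown_vault", "path": "~/vault/daily", "window_hours": 24},
        {"type": "git", "repos": ["~/code/my-project"]},
    ],
    "fields": {
        "status": {"max_words": 20},
        "narrative": {"style": "poetic"},
    },
    "generation": {"model": "<your-llm>", "temperature": 0.9, "batch": ["status", "subject"]},
    "output": [{"type": "static_html", "path": "public/index.html"}],
    "cache": {"feedback_examples": 3},
}
```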

Example Instances (Planned)

pulse-log (dev journaling)

  • Sources: Obsidian vault (AI agents write daily notes), git commits, shell history
  • Fields: status, subject, mode, glyph, narrative
  • Output: HTML homepage
  • Aesthetic: pneumastructural/operator vocabulary
  • Memory flow: Chat agents → Obsidian daily notes → Autopoesis → HTML

social-pulse (social media presence)

  • Sources: mem0 memory system (AI insights), git commits, collectivist indices
  • Fields: status, insight, link
  • Output: X/Twitter, Discord
  • Aesthetic: terse, hyperstitional
  • Memory flow: Multiple AI agents → mem0 → Autopoesis → Social media

project-dashboard (project status)

  • Sources: MCP memory server (AI analysis), git, issue trackers, CI/CD logs
  • Fields: status, blockers, progress, next_steps
  • Output: Markdown, HTML dashboard
  • Aesthetic: technical, clear
  • Memory flow: Coding agents → MCP server → Autopoesis → Dashboard

⚠️ Development Warning

This is experimental software in an early specification phase. The architecture is being designed for bespoke personal systems, not for enterprise scale.

Contributing

Issues and PRs welcome once the framework is implemented! For now, this is design exploration.

License

Private research tool - not for distribution.


Self-making through recursive poetic generation 🜍
