
Network Troubleshooting Assistant

A Flask-based web application that provides an interactive network troubleshooting assistant, powered by Ollama for LLM capabilities and ChromaDB for document retrieval.

Features

  • Interactive chat interface for network troubleshooting
  • Intelligent problem type detection
  • Guided information gathering system
  • Document indexing and management for knowledge retrieval
  • Context-aware responses leveraging your network documentation
  • Environment-based configuration system
  • Built-in document viewer and editor for your network documentation

Note: Sample documents are available in the network_doc_samples directory. All of the sample documents were generated by claude.ai and are not based on any real network.

Setup and Installation

Prerequisites

  • Python 3.8 or higher
  • Ollama running locally or on a remote server
  • Network documentation (optional, but recommended)

Model Setup with Ollama

The application uses a custom Ollama model specifically tailored for network troubleshooting:

  1. Install Ollama from the official website (https://ollama.com/)

  2. Create the custom model using the included Modelfile:

    ollama create network-assistant -f Modelfile
  3. Verify the model is created:

    ollama list

The model is based on Llama 3.2 or phi4 and configured with a specialized system prompt to provide expert-level network troubleshooting assistance, helping IT professionals diagnose and resolve networking issues across various domains. You can use any model you like, but only Llama 3.2 and phi4 have been tested.
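For illustration, a minimal Modelfile for a setup like this might look as follows. The system prompt below is a hypothetical stand-in, not the one shipped in the repository's Modelfile:

```
FROM llama3.2

# Lower temperature keeps troubleshooting answers focused and technical
PARAMETER temperature 0.4

# Hypothetical system prompt -- the repository's Modelfile defines its own
SYSTEM """You are an expert network engineer. Help IT professionals diagnose
and resolve networking issues step by step, asking clarifying questions
before recommending fixes."""
```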

Installation Steps

  1. Clone the repository:

    git clone https://github.com/jtbwatson/network-assistant.git
    cd network-assistant
  2. Create a virtual environment and activate it:

    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  3. Install the required packages:

    pip install -r requirements.txt
  4. Create a .env file by copying the sample:

    cp .env.sample .env
  5. Edit the .env file to match your environment:

    # Primary configuration
    OLLAMA_HOST=http://localhost:11434
    OLLAMA_MODEL=network-assistant
    
  6. Start the application:

    python app.py
  7. Open a web browser and go to http://localhost:5000

Model Details

Base Model

  • Utilizes Llama 3.2 or phi4 as the foundational language model
  • Customized with a specialized system prompt for network troubleshooting
  • Configured to provide technical, contextually relevant assistance

Key Capabilities

The model is engineered to:

  • Diagnose complex network issues
  • Provide step-by-step technical guidance
  • Explain networking concepts clearly
  • Offer both immediate and long-term solutions
  • Adapt responses to the user's technical skill level

Configuration

The application uses environment variables for configuration, which can be set in a .env file:

| Variable | Description | Default Value |
|----------|-------------|---------------|
| DOCS_DIR | Directory for storing network documentation | ./network_docs |
| DB_DIR | Directory for ChromaDB storage | ./chroma_db |
| CHUNK_SIZE | Size of text chunks for indexing | 512 |
| CHUNK_OVERLAP | Overlap between chunks | 50 |
| SEARCH_RESULTS | Number of search results to retrieve | 5 |
| OLLAMA_HOST | URL of your Ollama instance | http://localhost:11434 |
| OLLAMA_MODEL | Name of the Ollama model to use | network-assistant |
| USE_OLLAMA_EMBEDDINGS | Whether to use Ollama for generating embeddings | true |
| OLLAMA_EMBEDDING_MODEL | Model to use for embeddings when using Ollama | nomic-embed-text |
| OLLAMA_EMBEDDING_BATCH_SIZE | Batch size for embedding generation | 10 |
| PORT | Port to run the application on | 5000 |
| DEBUG_MODE | Enable Flask debug mode | False |
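To see how CHUNK_SIZE and CHUNK_OVERLAP interact, here is a minimal sketch of character-based overlapping chunking with the defaults above. The application's actual splitter may tokenize differently; this only illustrates the overlap mechanics:

```python
def chunk_text(text: str, chunk_size: int = 512, chunk_overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks; each new chunk starts
    chunk_size - chunk_overlap characters after the previous one."""
    if chunk_size <= chunk_overlap:
        raise ValueError("chunk_size must be larger than chunk_overlap")
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - chunk_overlap, 1), step)]
```

With the defaults, each 512-character chunk repeats the last 50 characters of the previous one, so a sentence cut at a chunk boundary still appears intact in one of the two chunks.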

Document Indexing and Management

For best results, place your network documentation in the network_docs directory (or your custom configured directory). Supported formats:

  • Markdown (.md)
  • Text files (.txt)
  • YAML configuration files (.yaml, .yml)
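File discovery for indexing can be sketched as a simple extension filter over the documentation directory. This is an illustration of the format list above, not the application's actual code:

```python
from pathlib import Path

# Extensions the assistant can index (per the supported-formats list)
SUPPORTED_EXTENSIONS = {".md", ".txt", ".yaml", ".yml"}

def indexable_files(docs_dir: str) -> list[Path]:
    """Recursively list files under docs_dir with a supported extension."""
    return sorted(
        p for p in Path(docs_dir).rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED_EXTENSIONS
    )
```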

Document Features:

  • Indexing: Click the "Index Documents" button in the UI to make your documents searchable
  • Viewing: Browse and view documents directly in the application
  • Editing: Edit and update documents through the built-in document viewer/editor
  • Reindexing: Use "Force Reindex All" when you want to refresh the entire document database

Embedding Options

The application supports two methods for generating embeddings:

  1. Ollama Embeddings (Default): Uses the specified Ollama model to generate embeddings

    • Provides consistent embedding quality across different environments
    • Set USE_OLLAMA_EMBEDDINGS=true and specify OLLAMA_EMBEDDING_MODEL
  2. Local Embeddings: Uses SentenceTransformer locally

    • Works without additional Ollama configuration
    • Set USE_OLLAMA_EMBEDDINGS=false to use this method
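The choice between the two backends is driven by the USE_OLLAMA_EMBEDDINGS variable. A sketch of how that selection might be read from the environment follows; the function name and return values are illustrative, not the app's actual API:

```python
import os

def embedding_backend() -> str:
    """Return which embedding backend the config selects
    (defaults assumed from the configuration table above)."""
    use_ollama = os.getenv("USE_OLLAMA_EMBEDDINGS", "true").strip().lower() in ("1", "true", "yes")
    if use_ollama:
        # Embeddings would then come from the Ollama instance at OLLAMA_HOST
        return os.getenv("OLLAMA_EMBEDDING_MODEL", "nomic-embed-text")
    # Otherwise fall back to a local SentenceTransformer model
    return "sentence-transformers (local)"
```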

Conversation Management

The application maintains conversation history within each session to provide context-aware responses. This helps the assistant remember previous questions and build on prior interactions.
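Per-session history of this kind can be pictured as a map from session ID to an ordered list of chat turns, with the most recent turns sent along with each new prompt. This is a simplified illustration, not the application's actual data model:

```python
from collections import defaultdict

# session_id -> ordered list of {"role": ..., "content": ...} turns
histories: dict[str, list[dict[str, str]]] = defaultdict(list)

def add_turn(session_id: str, role: str, content: str) -> None:
    """Record one chat turn for a session."""
    histories[session_id].append({"role": role, "content": content})

def context_for(session_id: str, max_turns: int = 10) -> list[dict[str, str]]:
    """Return the most recent turns to include as context for the next prompt."""
    return histories[session_id][-max_turns:]
```

Capping the context at a fixed number of turns keeps prompts within the model's context window while still letting the assistant build on prior interactions.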

Troubleshooting

ChromaDB Issues

If you see "ChromaDB not available" messages:

  • Make sure you've installed the required packages
  • Check if there are compatibility issues with your Python version
  • Consider manually installing ChromaDB: pip install chromadb

Ollama Connection Issues

If you see "Could not connect to Ollama":

  • Verify that Ollama is running
  • Check that the OLLAMA_HOST setting points to the correct URL
  • Make sure your network allows connections to the Ollama server
  • Verify that the model specified in OLLAMA_MODEL is available on your Ollama instance

Document Indexing Problems

If documents aren't being indexed properly:

  • Check file permissions on the DOCS_DIR directory
  • Ensure documents are in supported formats (.md, .txt, .yaml, .yml)
  • Try using "Force Reindex All" to rebuild the index
  • Check the application logs for specific error messages

Security Notes

  • This application is designed for internal network use only
  • There is no authentication built into the application
  • Do not expose this service to the public internet without adding proper security measures

License

MIT License

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
