
🤖 MCP Code Analysis Server

A Kotlin server application that analyzes GitHub repositories using AI models through the Model Context Protocol (MCP).

Features

  • Clone and analyze GitHub repositories
  • Extract code structure and relationships
  • Process code using Model Context Protocol
  • Generate detailed insights and summaries
  • Multiple server modes (stdio, SSE)

Getting Started

Prerequisites

  • JDK 23 or higher
  • Kotlin 2.2.x
  • Gradle 9.0 or higher
  • Ollama with the llama3.2 model pulled (for the model API)
  • MCP Inspector (for model context protocol)

Installation

  1. Clone this repository.
  2. Build the project using Gradle:
./gradlew build
  3. Start the Ollama server:
ollama run llama3.2:latest
  4. Start the MCP Inspector:
export DANGEROUSLY_OMIT_AUTH=true; npx @modelcontextprotocol/inspector@0.16.2
  5. Open the MCP Inspector at http://127.0.0.1:6274/ and configure the Arguments to start the server:

Connect

Use the following arguments:

-jar ~/mcp-github-code-analyzer/build/libs/mcp-github-code-analyzer-0.1.0-SNAPSHOT.jar --stdio

  6. Click Connect to start the MCP Server.

  7. Open the Tools tab to discover the available tools. The analyze-repository tool should be listed and ready to use; click it to see its details and parameters:

Tools Tab

  8. Finally, enter the repoUrl and branch parameters and click Run Tool to start the analysis:

Run Tool

Configuration

The application uses environment variables for configuration:

  • SERVER_PORT: The port for the server (default: 3001)
  • GITHUB_TOKEN: GitHub token for API access (optional)
  • WORKING_DIRECTORY: Directory for cloning repositories (default: system temp + "/mcp-code-analysis")
  • MODEL_API_URL: URL for the model API (default: "http://localhost:11434/api")
  • MODEL_API_KEY: API key for the model service (optional)
  • MODEL_NAME: Name of the model to use (default: "llama3.2")
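The variables above could be collected into an immutable configuration class roughly like the following sketch. The class and function names here are illustrative assumptions (the project's actual `AppConfig.kt` may differ); only the variable names and defaults come from the list above.

```kotlin
// Hypothetical sketch: mapping the documented environment variables to an
// immutable config data class. Names other than the env vars are assumptions.
data class AppConfig(
    val serverPort: Int,
    val githubToken: String?,      // optional
    val workingDirectory: String,
    val modelApiUrl: String,
    val modelApiKey: String?,      // optional
    val modelName: String,
) {
    companion object {
        // Accepts a map so tests can inject values instead of touching the real env.
        fun fromEnv(env: Map<String, String> = System.getenv()): AppConfig = AppConfig(
            serverPort = env["SERVER_PORT"]?.toIntOrNull() ?: 3001,
            githubToken = env["GITHUB_TOKEN"],
            workingDirectory = env["WORKING_DIRECTORY"]
                ?: (System.getProperty("java.io.tmpdir") + "/mcp-code-analysis"),
            modelApiUrl = env["MODEL_API_URL"] ?: "http://localhost:11434/api",
            modelApiKey = env["MODEL_API_KEY"],
            modelName = env["MODEL_NAME"] ?: "llama3.2",
        )
    }
}
```

Passing the environment in as a plain `Map` keeps the config pure and easy to unit-test.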

Running the Application

The server supports multiple modes:

# Default: Run as SSE server with Ktor plugin on port 3001
./gradlew run

# Run with standard input/output
./gradlew run --args="--stdio"

# Run as SSE server with Ktor plugin on custom port
./gradlew run --args="--sse-server-ktor 3002"

# Run as SSE server with plain configuration
./gradlew run --args="--sse-server 3002"

# With custom environment variables:
SERVER_PORT=3002 MODEL_NAME=mistral ./gradlew run

Model Context Protocol

This server implements the Model Context Protocol (MCP) and provides the following tool:

  • analyze-repository: Analyzes a GitHub repository to provide code insights and a structure summary.

Required parameters:

  • repoUrl: URL of the GitHub repository to analyze

Optional parameters:

  • branch: Branch to analyze (default: main)
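For clients that talk to the server directly rather than through the Inspector, an MCP tool invocation travels as a JSON-RPC `tools/call` request. The sketch below builds such a request for this tool; the JSON-RPC envelope follows the MCP specification, while the helper function and example URL are invented for illustration.

```kotlin
// Illustrative helper (not part of the project) that builds the JSON-RPC
// message an MCP client would send to invoke the analyze-repository tool.
fun analyzeRepositoryRequest(repoUrl: String, branch: String = "main", id: Int = 1): String =
    """{"jsonrpc":"2.0","id":$id,"method":"tools/call","params":""" +
    """{"name":"analyze-repository","arguments":{"repoUrl":"$repoUrl","branch":"$branch"}}}"""
```

When run over stdio mode, one such message per line on standard input is enough to exercise the tool without the Inspector UI.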

Project Structure

  • Main.kt: Application entry point
  • config/: Configuration classes
    • AppConfig.kt: Immutable configuration data class
  • server/: MCP server implementation
    • Mcp.kt: Functional MCP server with multiple run modes
  • processor/: Code analysis and content processing
    • CodeAnalyzer.kt: Analyzes code structure
    • CodeContentProcessor.kt: Processes code files
  • service/: Core services for repository analysis
    • GitService.kt: Handles repository cloning
    • ModelContextService.kt: Generates insights using AI models
    • RepositoryAnalysisService.kt: Coordinates the analysis process

All services are implemented as functional data classes with explicit dependency injection.
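The "functional data classes with explicit dependency injection" pattern can be sketched as follows. The interfaces and service shown are simplified stand-ins for illustration, not the project's actual code; only the service and dependency names echo the structure above.

```kotlin
// Simplified sketch of the pattern: services are data classes whose
// collaborators are passed in explicitly, so tests can substitute fakes.
fun interface Cloner { fun clone(repoUrl: String, branch: String): String }  // returns local checkout path
fun interface Insights { fun summarize(path: String): String }               // returns model-generated summary

data class RepositoryAnalysisService(
    val git: Cloner,      // e.g. backed by GitService
    val model: Insights,  // e.g. backed by ModelContextService
) {
    // Coordinates the analysis: clone first, then summarize the checkout.
    fun analyze(repoUrl: String, branch: String = "main"): String =
        model.summarize(git.clone(repoUrl, branch))
}
```

Because the dependencies are plain function interfaces, a unit test can wire the service with lambdas and never touch git or a model endpoint.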

License

This project is open source and available under the MIT License.