This project provides an intelligent agent for the 6G-Sandbox platform, enabling users to interactively configure, validate, and deploy Trial Networks (TNs) using natural language. The agent leverages LangChain, LangGraph, Azure OpenAI, and GitHub integration to automate the selection of components, resolution of dependencies, and generation of YAML descriptors for deployment, using TNLCM (Trial Network Lifecycle Manager) as the validation and deployment tool.
The project is composed of:
This folder contains the main API and orchestration logic for the 6G-Sandbox Agent. Its structure is as follows:
- `__init__.py`: Marks the folder as a Python package.
- `main.py`: Entry point for the FastAPI application, exposing REST endpoints for interacting with the agent. The main endpoints are:
  - `GET /` – Basic welcome endpoint to verify the API is reachable.
  - `GET /health` – Liveness/health check for monitoring and deployments.
  - `POST /api/login` – Simple credential check used by the UI.
  - `POST /process_query` – Core endpoint that processes a natural-language query and generates/modifies a Trial Network (TN) YAML descriptor.
  - `POST /stream` – Streaming (SSE) version of `POST /process_query`, sending incremental updates.
  - `POST /get_all_components_from_session` – Returns all components discovered for a given session.
  - `POST /get_selected_components_from_session` – Returns the subset of components selected in the session.
  - `POST /get_state_session` – Returns the full internal graph state associated with a session.
  - `POST /get_messages_session` – Returns all messages exchanged between internal agents for a session.
  - `GET /get_session_ids` – Lists all stored session IDs from the LangGraph checkpointer.
  - `DELETE /delete_session_id` – Deletes the state associated with a single session ID.
  - `DELETE /delete_all_session_ids` – Deletes the state for all sessions.
- `models.py`: Orchestrates the LangGraph workflow by constructing a state graph with multiple nodes representing different stages of the agent's pipeline. This module:
  - Initializes the database connection and memory checkpointer for maintaining conversation state
  - Defines the complete graph structure with nodes for initial configuration, orchestration, YAML generation, validation, consulting, and TNLCM integration
  - Configures conditional edges that route execution based on user intent (consult, modify, create, validate)
  - Compiles the graph with checkpointing capabilities to enable stateful multi-turn conversations
  - Serves as the central configuration point that brings together all workflow components from `sandbox_llm.py`
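The conditional routing described above can be illustrated with a toy sketch. The node names mirror the project's graph, but the routing logic itself is a simplified assumption, not the actual implementation:

```python
# Toy emulation of the LangGraph conditional edges: the orchestrator picks
# the next node from the user's intent. This is an illustrative assumption,
# not the real models.py code.
def route_from_orchestrator(intent: str) -> str:
    routes = {
        "consult": "consulting_node",
        "modify": "modify_yaml_template",
        "create": "check_yaml_example",
        "tnlcm": "tnlcm_agent_node",
    }
    return routes.get(intent, "__end__")

def run(intent: str) -> list:
    # Walk the happy-path "create" pipeline for illustration.
    path = ["initial_configuration", "orchestrator"]
    node = route_from_orchestrator(intent)
    path.append(node)
    if node == "check_yaml_example":
        path += ["parse_components", "check_dependencies",
                 "create_yaml_template", "validate_yaml_tnlcm"]
    return path

print(run("create"))
```

In the real graph these decisions are conditional edges compiled with a checkpointer, so each turn resumes from the stored session state rather than starting from scratch.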
This folder implements the backend logic that powers the agent’s API and workflow orchestration.
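The `POST /stream` endpoint emits Server-Sent Events. A minimal sketch of parsing one SSE line follows; the `{"status": ...}` payload shape is an assumption for illustration:

```python
import json

# Minimal parser for one Server-Sent Events line, as POST /stream would emit.
# The {"status": ...} payload shape is an assumed example, not the documented schema.
def parse_sse_line(line: str):
    if line.startswith("data: "):
        return json.loads(line[len("data: "):])
    return None  # comments, blank lines, or other SSE fields

for raw in ('data: {"status": "parsing_components"}', ": keep-alive", ""):
    event = parse_sse_line(raw)
    if event is not None:
        print(event["status"])
```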
This folder contains configuration files and templates used by the 6G-Sandbox Agent.
- `.env`: Environment variables for credentials and configuration (such as Azure OpenAI, GitHub and Ansible tokens). This file is not included in the repo since it contains private information.
- `sample_tnlcm_descriptor.yaml`: Example YAML descriptor for a Trial Network, used for testing the Agent.
- `template.yaml`: Base YAML template for generating new Trial Network descriptors.
This folder centralizes all configuration and template files required for the agent’s operation and deployment.
This folder contains auxiliary documentation (diagrams, images, etc.) to help understand the 6G-Sandbox AI Agent.
This folder contains the core logic and utilities for the 6G-Sandbox Agent.
- `__init__.py`: Marks the folder as a Python package.
- `sandbox_llm.py`: Main Python module implementing the agent's orchestration and reasoning logic. It:
  - Integrates with LangChain, LangGraph, Azure OpenAI, GitHub, Ansible Vault, and TNLCM tools
  - Defines the LangGraph node functions (e.g., `check_initial_configuration`, `orchestrator_node`, `consulting_node`, `check_yaml_example_node`, `parse_components_node`, `check_dependencies_node`, `create_yaml_template_node`, `modify_yaml_template_node`, `validate_yaml_tnlcm_node`, `tnlcm_agent_node`, `tnlcm_tools_node`)
  - Drives the full lifecycle of a Trial Network descriptor: component discovery, dependency resolution, YAML example retrieval, template creation, modification, and validation/deployment via TNLCM
  - Provides database helpers (`create_database`, `cleanup_database`) used to back LangGraph's SQLite checkpointer
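A minimal illustration of what such database helpers might look like, using only Python's stdlib `sqlite3`. The real helpers back LangGraph's SQLite checkpointer; the signatures and schema here are assumptions:

```python
import os
import sqlite3
import tempfile

# Hypothetical stand-ins for create_database/cleanup_database. The real
# helpers support LangGraph's SQLite checkpointer; this schema is assumed.
def create_database(path: str) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS checkpoints "
        "(session_id TEXT PRIMARY KEY, state TEXT)"
    )
    conn.commit()
    return conn

def cleanup_database(conn: sqlite3.Connection, path: str) -> None:
    # Close the connection and remove the on-disk database file.
    conn.close()
    if os.path.exists(path):
        os.remove(path)

db_path = os.path.join(tempfile.gettempdir(), "agent_checkpoints.db")
conn = create_database(db_path)
conn.execute("INSERT OR REPLACE INTO checkpoints VALUES (?, ?)", ("s1", "{}"))
conn.commit()
print(conn.execute("SELECT session_id FROM checkpoints").fetchall())
cleanup_database(conn, db_path)
```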
This folder is the main location for the agent’s business logic, workflow orchestration, and interactive experimentation.
This folder contains the different tools that the agent can use to interact with external systems like GitHub and TNLCM.
- `git_tools.py`: Provides tools for interacting with the `6G-Library` and `6G-Sandbox-Sites` GitHub repositories.
  - `get_libraries()`: Fetches the list of available component libraries from the repository.
  - `get_components_branch(repo, branch)`: Retrieves a list of component directories from a specified repository and branch.
  - `get_components_from_site(repo, site)`: Fetches the available components for a specific site, decrypting the configuration using Ansible Vault.
  - `yaml_retrieval(repo, branch, components)`: Retrieves and parses the `public.yaml` file for a given list of components.
  - `yaml_example_retrieval(repo, branch, component)`: Fetches a sample `tnlcm_descriptor.yaml` for a component, using an LLM to select the best match if multiple examples exist.
- `other_tools.py`: Contains other miscellaneous tools.
- `tnlcm_tools.py`: Provides tools for interacting with the Trial Network Lifecycle Manager (TNLCM) API.
  - `tnlcm_validate_yaml()`: Validates a given TNLCM descriptor YAML against the TNLCM API.
  - `activate_tnlcm_trial_network()`: Activates a trial network using its ID.
  - `delete_tnlcm_trial_network()`: Deletes a trial network.
  - `purge_tnlcm_trial_network()`: Permanently purges a trial network.
  - `get_tnlcm_all_trial_networks()`: Lists all existing trial networks.
  - `get_tnlcm_trial_network()`: Retrieves detailed information for a specific trial network.
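To make the TNLCM tool set concrete, here is a hypothetical mapping of these tools to HTTP calls. The endpoint paths are guesses derived from the tool names, not the real TNLCM API:

```python
# Hypothetical client showing how each TNLCM tool could map to an HTTP
# request. Paths are illustrative assumptions, not the actual TNLCM routes.
class TnlcmClient:
    def __init__(self, base_url: str):
        self.base_url = base_url.rstrip("/")

    def request_for(self, action: str, tn_id: str = ""):
        table = {
            "validate": ("POST", "/trial-networks/validate"),
            "activate": ("POST", f"/trial-networks/{tn_id}/activate"),
            "delete": ("DELETE", f"/trial-networks/{tn_id}"),
            "purge": ("DELETE", f"/trial-networks/{tn_id}/purge"),
            "list": ("GET", "/trial-networks"),
            "get": ("GET", f"/trial-networks/{tn_id}"),
        }
        method, path = table[action]
        return method, self.base_url + path

client = TnlcmClient("http://tnlcm.example:5000")
print(client.request_for("activate", "tn-42"))
```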
ℹ️ Note: The currently deployed UI (http://10.95.82.65:8081/) uses the repository frontend-chatbot-sandbox.
This folder contains the web-based user interface for the 6G-Sandbox Agent, which connects through Telefónica's Model Gateway.
- `Dockerfile`: Builds a Docker image for serving the web-based user interface of the 6G-Sandbox Agent with NGINX, using a custom NGINX configuration.
- `docker-compose.yml`: Provides the same functionality as the `Dockerfile`, but makes it easy to build and run the UI container, mapping it to a local port for development or deployment.
- `nginx/`: Used to serve the static web content.
  - `default.conf`: NGINX configuration.
- `www/`: Contains the main web assets and frontend code.
  - `index.html`: Entry point for the web application.
  - `main.js`: Main JavaScript logic for the UI.
  - `assets/`: CSS files and other static assets for styling the interface.
  - `img/`: Logos and images used in the UI.
  - `skin/`: Custom CSS themes for light and dark modes.
This folder provides all the resources needed to run and customize the graphical interface for interacting with the 6G-Sandbox Agent.
This file lists the libraries that must be installed in the environment to run the Sandbox AI Agent.
The following section presents the internal architecture of the AI Agent developed for the 6G Sandbox project, as shown in Figure 1. This agent is designed to assist users in creating, modifying, validating, and consulting YAML-based network descriptors used for trial deployments across federated testbed sites.
In its first iteration, the AI Agent establishes an interaction loop between the user and the system, in which the user selects:
- A target branch of the 6G-Library repository.
- A specific site where the deployment will be executed.
Based on the user's intent and context, the orchestrator component analyzes the prompt and autonomously chooses one of five high-level actions:
- Lazy Creation – Simplified YAML generation using prefilled component templates.
- Zero-Touch Creation – Fully automated descriptor generation from high-level user goals.
- Modify – Update or extend an existing YAML file.
- Consult – Retrieve information about network components or deployment constraints within 6G-Library.
- Validate – Check the structure and dependencies of an existing or generated YAML descriptor using TNLCM tools.
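The real orchestrator relies on an LLM to detect intent; the keyword heuristic below is only a stand-in to make the five actions concrete:

```python
# Keyword-based stand-in for the LLM-driven intent detection performed by
# the orchestrator. The rules are illustrative assumptions only.
ACTIONS = ("lazy_creation", "zero_touch_creation", "modify", "consult", "validate")

def classify_intent(prompt: str) -> str:
    p = prompt.lower()
    if "validate" in p or "check" in p:
        return "validate"
    if "modify" in p or "change" in p or "add" in p:
        return "modify"
    if "what" in p or "which" in p or "info" in p:
        return "consult"
    if "from scratch" in p or "automatically" in p:
        return "zero_touch_creation"
    return "lazy_creation"  # default: template-based simplified creation

print(classify_intent("Validate my descriptor"))
```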
graph TD;
__start__([<p>__start__</p>]):::first
initial_configuration(initial_configuration)
orchestrator(orchestrator)
check_yaml_example(check_yaml_example)
parse_components(parse_components)
check_dependencies(check_dependencies)
create_yaml_template(create_yaml_template)
consulting_node(consulting_node)
validate_yaml_tnlcm(validate_yaml_tnlcm)
modify_yaml_template(modify_yaml_template)
tnlcm_agent_node(tnlcm_agent_node)
tnlcm_tools_node(tnlcm_tools_node)
__end__([<p>__end__</p>]):::last
__start__ --> initial_configuration;
check_dependencies --> create_yaml_template;
check_yaml_example -. end .-> __end__;
check_yaml_example -.-> parse_components;
create_yaml_template --> validate_yaml_tnlcm;
initial_configuration --> orchestrator;
modify_yaml_template --> validate_yaml_tnlcm;
orchestrator -. end .-> __end__;
orchestrator -.-> check_yaml_example;
orchestrator -. consult .-> consulting_node;
orchestrator -. modify .-> modify_yaml_template;
orchestrator -.-> parse_components;
orchestrator -. tnlcm_agent .-> tnlcm_agent_node;
parse_components --> check_dependencies;
tnlcm_agent_node -.-> __end__;
tnlcm_agent_node -. tools .-> tnlcm_tools_node;
tnlcm_tools_node --> tnlcm_agent_node;
validate_yaml_tnlcm -. end .-> __end__;
validate_yaml_tnlcm -. modify .-> modify_yaml_template;
consulting_node --> __end__;
classDef default fill:#f2f0ff,line-height:1.2
classDef first fill-opacity:0
classDef last fill:#bfb6fc
Figure 1: Architecture of the Sandbox Agent — this diagram illustrates the core components and data flow.
- Clone the repository:
git clone https://github.com/Telefonica/fnl-sandbox-agent.git
cd fnl-sandbox-agent

- Create a virtual environment and activate it:
python3 -m venv env
source env/bin/activate # On Linux/macOS
# Or on Windows: env\Scripts\activate

- Install dependencies:
pip install -r requirements.txt

Create a `.env` file in the project root with the following structure:
# Azure OpenAI Configuration
AZURE_OPENAI_API_KEY=your_api_key_here
AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=your_deployment_name
AZURE_OPENAI_API_VERSION=2024-02-15-preview
# GitHub Configuration
GITHUB_TOKEN=your_github_personal_access_token
# Ansible Configuration
ANSIBLE_VAULT_PASSWORD=your_vault_password
DEFAULT_SITE=UMA
# TNLCM Configuration
TNLCM_API_URL=http://your-tnlcm-instance
TNLCM_API_TOKEN=your_tnlcm_token

Never commit the `.env` file to version control. It's already included in `.gitignore`.
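The agent presumably loads these variables with a dotenv-style helper; the stdlib-only parser below is a simplified sketch of the same `KEY=VALUE` behaviour, not the project's actual loading code:

```python
# Simplified .env parser: skips comments and blank lines, splits on the
# first "=", and trims whitespace. A sketch, not the project's loader.
def load_env(text: str) -> dict:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = "# TNLCM Configuration\nTNLCM_API_URL=http://your-tnlcm-instance\nDEFAULT_SITE=UMA\n"
cfg = load_env(sample)
print(cfg["DEFAULT_SITE"])  # → UMA
```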
This project includes an API powered by FastAPI and a separate UI component. Below are the steps to deploy each part of the system, both in development mode and using Docker.
To start the API server from the root of the project:
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

To run the server in the background and log output to `uvicorn.log`:

nohup uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload > uvicorn.log 2>&1 &

To deploy the API using Docker in daemon mode:

docker-compose -f docker-compose.yml up -d --build

View logs:

docker-compose logs -f

Stop the API:

docker-compose down

The UI has its own `docker-compose.yml` located in the `ui/` directory.
Start the UI:
docker-compose -f ui/docker-compose.yml up -d --build

View logs:

docker-compose -f ui/docker-compose.yml logs -f

Stop the UI:

docker-compose -f ui/docker-compose.yml down

Alternatively, from within the `ui/` directory, start the UI:

docker-compose up -d --build

View logs:

docker-compose logs -f

Stop the UI:

docker-compose down

At this moment, deployments requiring descriptor decryption are limited to the UMA site. UMA is available because the corresponding Ansible decryption key is provided via the `.env` configuration. Other sites are not yet enabled for decryption, so they are currently unavailable for automated deployments through the agent and for consulting. To change the site, the Ansible vault key in the environment configuration must be updated manually.
The Model Gateway is proprietary software developed by Telefónica. Its interface and connection endpoints are private at this time and cannot be exposed publicly. Any integration in this project assumes access to Telefónica’s internal Model Gateway and valid credentials configured in the .env file.
That said, the agent exposes a REST API, so the Model Gateway can be replaced by other frontend interfaces (e.g., Streamlit, custom web apps, or CLI clients) that consume the agent's endpoints. You can build your own UI that calls the API directly without relying on the Model Gateway.
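For example, a minimal stdlib client could target the core endpoint directly. The payload field names (`session_id`, `query`) are assumptions for illustration:

```python
import json
import urllib.request

# Builds (but does not send) a POST to the agent's /process_query endpoint.
# The payload field names are assumed, not taken from the API schema.
def build_request(base_url: str, session_id: str, query: str) -> urllib.request.Request:
    data = json.dumps({"session_id": session_id, "query": query}).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/process_query",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://localhost:8000", "cli-1", "List components at UMA")
print(req.method, req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would then return the agent's response, with no Model Gateway involved.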
This project is proprietary software developed by Telefónica Innovación Digital (TID). All rights reserved ©.