Add backend generation feature#22

Open
GYFX35 wants to merge 2 commits into main from add-backend-generation-feature

Conversation

GYFX35 (Owner) commented Aug 20, 2025

Summary by Sourcery

Add backend and ML model code generation capabilities, a task orchestrator, corresponding API endpoints, and update the frontend to showcase new services.

New Features:

  • Generate Flask-based backend code from user prompts
  • Generate boilerplate ML model code with customizable parameters
  • Orchestrate multiple generation tasks based on prompt keywords
  • Expose POST endpoints for task orchestration, backend development, and ML model generation
  • Add frontend service cards for AI Orchestrator, backend generation, and ML model generation

This commit introduces two new generative features to the AI agent:

1.  **AI-Powered Backend Generation:**
    - A `generate_backend` function in `app.py` creates a simple Python Flask backend from a prompt.
    - A new API endpoint, `/api/v1/develop/backend`, exposes this functionality.

2.  **AI/ML Model Generation:**
    - A `generate_ml_model` function in `app.py` creates boilerplate `scikit-learn` model code.
    - A new API endpoint, `/api/v1/ml/model`, exposes this functionality.

The frontend (`index.html`) has been updated to include service cards for both new features, making them visible to users.

This commit introduces three major new features to the AI agent, significantly expanding its capabilities:

1.  **AI Orchestrator:**
    - Implements an `orchestrate_task` function that can parse high-level user goals and call other generative tools to achieve them.
    - Exposed via a new `/api/v1/orchestrate` endpoint.
    - This feature addresses the user's request for "AGI capacities" by providing a more general, multi-step problem-solving capability.

2.  **AI-Powered Backend Generation:**
    - Adds a `generate_backend` function to create simple Python Flask backends from a prompt.
    - Exposed via a new `/api/v1/develop/backend` endpoint.

3.  **AI/ML Model Generation:**
    - Adds a `generate_ml_model` function to create boilerplate `scikit-learn` model code for classification or regression tasks.
    - Exposed via a new `/api/v1/ml/model` endpoint.

All new features are advertised on the frontend with corresponding service cards in `index.html`.
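The orchestration described above boils down to keyword dispatch over the prompt. A minimal sketch of that pattern follows; the function names `orchestrate_task`, `generate_backend`, and `generate_ml_model` come from the PR, but the bodies here are simplified stand-ins, not the actual `app.py` implementation.

```python
# Illustrative sketch of the keyword-dispatch pattern; the real
# generators in app.py produce full code templates.

def generate_backend(prompt):
    # Stand-in for the real backend generator.
    return f"# Flask backend generated for: {prompt}"

def generate_ml_model(prompt):
    # Stand-in for the real ML model generator.
    return f"# scikit-learn model generated for: {prompt}"

def orchestrate_task(prompt):
    """Detect keywords in the prompt and invoke the matching generators."""
    # Keyword -> generator mapping; the PR also checks for website,
    # game, and app, but only two generators are sketched here.
    dispatch = {
        "backend": generate_backend,
        "model": generate_ml_model,
    }
    lowered = prompt.lower()
    parts = [gen(prompt) for keyword, gen in dispatch.items() if keyword in lowered]
    if not parts:
        return "No matching task found for the given prompt."
    # Aggregate the individual responses into a single message.
    return "\n\n".join(parts)
```

As the review below notes, plain substring checks like these can misclassify intents; the sketch only mirrors the approach the PR takes.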
@sourcery-ai
Copy link
Copy Markdown

sourcery-ai bot commented Aug 20, 2025

Reviewer's Guide

This PR introduces backend and ML model generation functions, an orchestration mechanism, corresponding API endpoints, and frontend updates to showcase new AI services.

Sequence diagram for orchestrate API endpoint

sequenceDiagram
    participant User as actor User
    participant Frontend
    participant Backend
    participant AI as AI Logic
    User->>Frontend: Sends request to /api/v1/orchestrate with prompt
    Frontend->>Backend: POST /api/v1/orchestrate
    Backend->>AI: orchestrate_task(prompt)
    AI->>AI: Calls appropriate generation functions (e.g., generate_backend)
    AI-->>Backend: Returns generated content
    Backend-->>Frontend: Responds with generated message
    Frontend-->>User: Displays result
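The call chain in the diagram above can be traced with plain functions, one per participant. This is a dependency-free simulation, not the real app; the payload keys `prompt` and `message` are assumptions about the JSON shape.

```python
# Each participant in the sequence diagram becomes a function, so the
# chain User -> Frontend -> Backend -> AI Logic is easy to follow.

def orchestrate_task(prompt):
    # AI Logic: stand-in for the real generator dispatch.
    return f"Generated plan for: {prompt}"

def backend_handle_orchestrate(payload):
    # Backend: what the POST /api/v1/orchestrate handler would do.
    return {"message": orchestrate_task(payload["prompt"])}

def frontend_submit(prompt):
    # Frontend: forwards the user's prompt and displays the result.
    response = backend_handle_orchestrate({"prompt": prompt})
    return response["message"]

result = frontend_submit("build a backend")
```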

Sequence diagram for backend generation API endpoint

sequenceDiagram
    participant User as actor User
    participant Frontend
    participant Backend
    participant AI as AI Logic
    User->>Frontend: Sends request to /api/v1/develop/backend with prompt
    Frontend->>Backend: POST /api/v1/develop/backend
    Backend->>AI: generate_backend(prompt)
    AI-->>Backend: Returns backend code
    Backend-->>Frontend: Responds with generated code
    Frontend-->>User: Displays backend code

Sequence diagram for ML model generation API endpoint

sequenceDiagram
    participant User as actor User
    participant Frontend
    participant Backend
    participant AI as AI Logic
    User->>Frontend: Sends request to /api/v1/ml/model with prompt
    Frontend->>Backend: POST /api/v1/ml/model
    Backend->>AI: generate_ml_model(prompt)
    AI-->>Backend: Returns ML model code
    Backend-->>Frontend: Responds with generated code
    Frontend-->>User: Displays ML model code

Class diagram for new backend and ML model generation functions

classDiagram
    class App
    class AI_Logic {
        +generate_backend(prompt)
        +generate_ml_model(prompt)
        +orchestrate_task(prompt)
    }
    class FlaskApp {
        +orchestrate_endpoint()
        +develop_backend_endpoint()
        +ml_model_endpoint()
    }
    App <|-- FlaskApp
    FlaskApp o-- AI_Logic
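The class diagram maps onto three POST routes calling into the generator functions. A minimal wiring sketch is shown below; the route paths match the PR, while the handler names, payload key `prompt`, and response keys `message`/`code` are assumptions, and the generators are stand-ins.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-ins for the generators defined in app.py.
def orchestrate_task(prompt):
    return f"orchestrated: {prompt}"

def generate_backend(prompt):
    return f"backend code for: {prompt}"

def generate_ml_model(prompt):
    return f"ml model code for: {prompt}"

@app.route("/api/v1/orchestrate", methods=["POST"])
def orchestrate_endpoint():
    prompt = request.get_json(force=True).get("prompt", "")
    return jsonify({"message": orchestrate_task(prompt)})

@app.route("/api/v1/develop/backend", methods=["POST"])
def develop_backend_endpoint():
    prompt = request.get_json(force=True).get("prompt", "")
    return jsonify({"code": generate_backend(prompt)})

@app.route("/api/v1/ml/model", methods=["POST"])
def ml_model_endpoint():
    prompt = request.get_json(force=True).get("prompt", "")
    return jsonify({"code": generate_ml_model(prompt)})
```

The endpoints can then be exercised with Flask's test client, e.g. `app.test_client().post("/api/v1/orchestrate", json={"prompt": "..."})`.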

File-Level Changes

Added `generate_backend` function to produce Flask backend code from prompts (`app.py`)
  • define `generate_backend` with route and message parsing from the prompt
  • construct a Flask app template based on the parsed values
  • format the response as a code snippet for `backend.py`
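The steps above (parse the prompt, fill a Flask template) can be sketched as follows. This is an illustrative reimplementation, not the code in `app.py`: the parsing conventions (`route /path`, `message "..."`) and the defaults are assumptions.

```python
import re

def generate_backend(prompt):
    """Sketch: derive a route and a message from the prompt, then fill
    a Flask template. Defaults apply when the prompt says nothing."""
    route = "/"
    message = "Hello from your generated backend!"
    # Hypothetical parsing rules: look for 'route /foo' and 'message "..."'.
    route_match = re.search(r"route\s+(/\S*)", prompt)
    if route_match:
        route = route_match.group(1)
    message_match = re.search(r'message\s+"([^"]+)"', prompt)
    if message_match:
        message = message_match.group(1)
    template = f'''from flask import Flask

app = Flask(__name__)

@app.route("{route}")
def index():
    return "{message}"

if __name__ == "__main__":
    app.run()
'''
    # Format the response as a snippet for backend.py.
    return "### backend.py\n" + template
```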
Added `generate_ml_model` function to produce ML model boilerplate (`app.py`)
  • parse model type, library, features, and target from the prompt
  • assemble the scikit-learn data pipeline and training code based on those settings
  • format the response as a code snippet for `ml_model.py`
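A compact sketch of the boilerplate generator follows. Only the classification/regression split is shown; the feature, target, and library parsing from the real `app.py` is omitted, and the chosen estimators are illustrative defaults, not necessarily what the PR emits.

```python
def generate_ml_model(prompt):
    """Sketch: pick classification vs. regression from the prompt and
    emit scikit-learn boilerplate as a string."""
    if "regression" in prompt.lower():
        estimator = "LinearRegression"
        module = "sklearn.linear_model"
    else:
        # Default to classification, the other task the PR supports.
        estimator = "RandomForestClassifier"
        module = "sklearn.ensemble"
    # The returned string is generated code, not executed here.
    return f'''from {module} import {estimator}
from sklearn.model_selection import train_test_split

# Placeholder data loading; replace X and y with your dataset.
X, y = [[0], [1], [2], [3]], [0, 0, 1, 1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

model = {estimator}()
model.fit(X_train, y_train)
print("test score:", model.score(X_test, y_test))
'''
```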
Introduced `orchestrate_task` to coordinate multiple generators (`app.py`)
  • detect keywords (website, game, app, backend) to invoke the matching generators
  • aggregate the individual responses into a single message
Exposed new API endpoints for orchestration, backend, and ML services (`app.py`)
  • add `/api/v1/orchestrate` endpoint invoking `orchestrate_task`
  • add `/api/v1/develop/backend` endpoint for backend generation
  • add `/api/v1/ml/model` endpoint for ML model generation
Updated frontend to advertise the new AI services (`frontend/templates/index.html`)
  • add an AI Orchestrator card in the services section
  • add an AI-Powered Backend card
  • add an AI/ML Model Generation card

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.


@sourcery-ai sourcery-ai bot left a comment


Hey there - I've reviewed your changes and found some issues that need to be addressed.

  • Refactor the repeated prompt parsing loops in generate_backend and generate_ml_model into a shared utility to avoid duplicated code.
  • Since you’re embedding prompt-derived strings directly into generated code templates, add input sanitization or escaping to avoid unexpected syntax errors or injection issues.
  • The simple substring checks in orchestrate_task may misclassify user intents—consider a more robust command or intent parsing approach rather than basic keyword matching.
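On the sanitization point, one possible approach is to whitelist characters in prompt-derived identifiers and escape string literals before templating them into generated code. The helper names below are hypothetical, not part of the PR:

```python
import re

def sanitize_identifier(value, default="generated_value"):
    """Allow only a conservative character set in prompt-derived
    identifiers before splicing them into generated code."""
    cleaned = re.sub(r"[^A-Za-z0-9_]", "", value)
    if not cleaned or cleaned[0].isdigit():
        return default
    return cleaned

def sanitize_string_literal(value):
    """Escape characters that would break out of a double-quoted
    string in the generated code."""
    return (value.replace("\\", "\\\\")
                 .replace('"', '\\"')
                 .replace("\n", "\\n"))
```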
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- Refactor the repeated prompt parsing loops in generate_backend and generate_ml_model into a shared utility to avoid duplicated code.
- Since you’re embedding prompt-derived strings directly into generated code templates, add input sanitization or escaping to avoid unexpected syntax errors or injection issues.
- The simple substring checks in orchestrate_task may misclassify user intents—consider a more robust command or intent parsing approach rather than basic keyword matching.

## Individual Comments

### Comment 1
<location> `app.py:422` </location>
<code_context>
+def generate_ml_model(prompt):
</code_context>

<issue_to_address>
ML model code generation assumes classification or regression only and does not validate model type or library.

Add checks to ensure only supported model types and libraries are accepted, and handle unsupported cases gracefully.
</issue_to_address>
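One way to address this comment is an explicit validation step before generation, so unsupported requests fail with a clear error rather than producing wrong code. The supported sets and the helper name below are illustrative, not taken from the PR:

```python
# Hypothetical whitelists; extend as app.py gains support.
SUPPORTED_MODEL_TYPES = {"classification", "regression"}
SUPPORTED_LIBRARIES = {"sklearn"}

def validate_model_request(model_type, library):
    """Return a list of error messages; empty means the request is valid."""
    errors = []
    if model_type not in SUPPORTED_MODEL_TYPES:
        errors.append(f"unsupported model type: {model_type!r} "
                      f"(supported: {sorted(SUPPORTED_MODEL_TYPES)})")
    if library not in SUPPORTED_LIBRARIES:
        errors.append(f"unsupported library: {library!r} "
                      f"(supported: {sorted(SUPPORTED_LIBRARIES)})")
    return errors
```

`generate_ml_model` could call this first and return the errors to the client instead of generating code.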


Comment thread app.py
@@ -375,6 +375,38 @@ def generate_website(prompt):
"""
return response_message.strip()

suggestion (bug_risk): ML model code generation assumes classification or regression only and does not validate model type or library.

Add checks to ensure only supported model types and libraries are accepted, and handle unsupported cases gracefully.
