A local ChatGPT-style web app for Babson College students, built with Python, Flask, and the OpenAI API.
I am taking your Web Tech class next semester, so in anticipation of that I wanted to do something more focused on the front-end side of development. I still had some leftover OpenAI API credits from my last project, so I thought it could be fun to use them on something front-end related. After weighing a few ideas of my own and asking ChatGPT for suggestions, I decided to recreate the ChatGPT website (albeit much simpler) with Babson branding (I tracked down the exact hexadecimal green from Babson's official Canva templates) and call it something fun like "BabsonGPT". With my knowledge from the course and my previous projects, alongside quite a bit of help from ChatGPT while learning some HTML/CSS and even a little JavaScript, I managed to get my vision up and running.
Thus, the goal of this project is to recreate the experience of ChatGPT as a local web app built with Flask, HTML/CSS, Python, and the OpenAI API.
BabsonGPT has:
- A left sidebar with recent chats
- A main chat panel that looks and feels like ChatGPT
- Babson College branding (core green `#006644`)
- Support for multiple chats that are saved locally in the browser
The main purpose is to:
- Practice working with external APIs (OpenAI)
- Build a small, realistic web application with a client–server architecture
- Explore how to maintain multi-turn conversation history in a simple and safe way
A Babson student opens BabsonGPT and:
- Starts a new chat called "OIM HW 3".
- Asks: "Explain arrays in simple terms and give me a small example."
- Gets a step-by-step explanation from BabsonGPT in the main panel.
- Later clicks back on that chat in the sidebar to revisit the explanation or ask follow-up questions.
All of this runs locally at http://127.0.0.1:5000
- Python 3.9+
- A terminal (Command Prompt, PowerShell, or similar)
- An OpenAI API key with access to the `gpt-5-nano` model
Download the code from my GitHub (https://github.com/simonjeast/OIM3640-Final-Project/tree/main)
Open the code in VS Code (or another IDE)
Make sure all necessary packages are installed (Flask, openai, python-dotenv, etc.)
Create a file named .env in the project root:
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx
Run the app (e.g., `python app.py`) from the project root. You should see output similar to:
- Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
Open a browser and go to http://127.0.0.1:5000/. You will see the BabsonGPT interface.
- Type a message in the input box ("Message BabsonGPT…").
- Press Send or hit Enter (Shift+Enter for a newline).
- BabsonGPT replies in the main chat window.
- Use + New chat in the sidebar to start a fresh conversation.
- Click on items under Recent to switch between saved chats.
Chats are saved locally in your browser using `localStorage`. They persist across page reloads on the same browser/computer rather than being uploaded to any server. (Maybe I will buy the 'BabsonGPT.com' domain in the future!)
This section describes the system at an architectural level.
BabsonGPT is a classic small web app:
- Frontend (browser)
  - HTML/CSS/JavaScript in `templates/index.html` and `static/style.css`
  - Renders the chat UI and sidebar
  - Manages chat state in JavaScript and `localStorage`
  - Sends requests to the Flask backend for model responses
- Backend (Flask)
  - `app.py` provides:
    - `GET /` – serves `index.html`
    - `POST /chat` – receives the conversation, calls the OpenAI API, returns the reply
    - `GET /health` – a simple health check to make sure the API is working
  - Loads `OPENAI_API_KEY` from `.env` using `python-dotenv`
  - Uses the OpenAI Python client to call `gpt-5-nano`
- Very rough diagram (courtesy of ChatGPT!):

  ```
  Browser (HTML/CSS/JS) <--HTTP--> Flask (app.py) <--API--> OpenAI
          |                            |
     localStorage           .env / OPENAI_API_KEY
  ```
Key pieces:
- Loads environment variables via `load_dotenv()` and reads `OPENAI_API_KEY`.
- Initializes the `OpenAI()` client once at the top.
- `/chat` endpoint:
  - Expects JSON `{ "messages": [...] }`.
  - Calls `client.chat.completions.create(...)` with:
    - `model="gpt-5-nano"`
    - A list of `{"role": ..., "content": ...}` messages (system + user + assistant)
    - A modest `temperature` for slightly creative but not wild responses
  - Returns `{"reply": "...text from model..."}` or `{"error": "..."}`
- Error handling:
  - Catches exceptions and sends back an error JSON
Layout
- Left sidebar (solid Babson green `#006644`):
  - Logo and "BabsonGPT" title
  - "+ New chat" button
  - "Recent" list, showing saved conversations
  - Account label ("Student · Babson College")
- Main area:
  - Top green banner header with the title and subtitle.
  - Large white chat window in the center.
  - Bottom input box and "Send" button.
Chat State and Logic (JavaScript in a `<script>` tag)

- Uses a `SYSTEM_MESSAGE` to make it appear more Babson-related:

  ```js
  const SYSTEM_MESSAGE = {
    role: "system",
    content: "You are BabsonGPT, a helpful assistant for a Babson College student.",
  };
  ```

- Maintains an array of chats:

  ```js
  chats = [
    {
      id: "timestamp",
      title: "First question about ...",
      messages: [SYSTEM_MESSAGE, {role: "assistant", ...}, {role: "user", ...}, ...],
      createdAt: 1700000000000
    },
    ...
  ];
  ```

- Also tracks `activeChatId` so it knows which chat is open.
- Uses `localStorage`:

  ```js
  localStorage.setItem("babsongpt_chats_v2", JSON.stringify({ chats, activeChatId }));
  ```

- On page load:
  - Tries to load chats from `localStorage`.
  - If found, renders the sidebar list and the active chat.
  - If not found, creates an initial "Welcome chat" with a greeting.
- When the user sends a message:
  - Adds the message to the current chat's `messages` array.
  - If it's the first user message, uses it to auto-generate a short chat title.
  - Sends `{ messages }` to `/chat` via `fetch("/chat", {...})`.
  - Displays the assistant's response and saves back to `localStorage`.
- When the user clicks + New chat:
  - Creates a new chat with a default greeting from BabsonGPT.
  - Sets it as the active chat and updates the sidebar and main view.
- When the user clicks a chat in the Recent list:
  - Switches `activeChatId` to that chat.
  - Re-renders the main chat log with its messages.
Some explicit choices:
- Flask vs. other frameworks: I chose Flask for simplicity and because it's what I'm used to from class.
- `localStorage` instead of a database: The project is meant to run on a student's laptop, and it is definitely not production-ready. Storing chats locally is a nice touch, in my opinion, and it is enough to persist them between page reloads without requiring any extra infrastructure.
- `gpt-5-nano` model: This model is inexpensive and good enough for class-level chatbot tasks. It keeps the cost very low even if many prompts are sent during testing.
- Hold multi-turn conversations with context, similar to ChatGPT.
- Save multiple chats and let you switch between them in the Recent sidebar.
- Maintain a visually clean, ChatGPT-like interface using Babson’s green as the primary color.
- Run entirely on localhost, with only the model calls going over the network to OpenAI.
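The multi-turn behavior comes from resending the entire history with every request. Here is a small pure-data sketch (no API call; the `build_payload` helper name is my own, not the project's) of how the payload for `/chat` accumulates context turn by turn:

```python
# The full list (system prompt + all prior turns) is sent on every request,
# which is how the model "remembers" earlier parts of the conversation.
SYSTEM_MESSAGE = {
    "role": "system",
    "content": "You are BabsonGPT, a helpful assistant for a Babson College student.",
}

def build_payload(history, user_text):
    """Return the JSON body the frontend would POST to /chat (hypothetical helper)."""
    return {"messages": history + [{"role": "user", "content": user_text}]}

history = [SYSTEM_MESSAGE]
payload = build_payload(history, "Explain arrays in simple terms.")
# Once the model replies, the assistant turn is appended so the next
# request carries the full context:
history = payload["messages"] + [{"role": "assistant", "content": "An array is..."}]
```

Each subsequent call to `build_payload(history, ...)` therefore grows by two messages per turn (one user, one assistant).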
A few example prompts:
- Course help
  - User: "Explain the difference between gross margin and net margin in simple terms."
  - BabsonGPT: Provides definitions, formulas, and a small example.
- Brainstorming
  - User: "Give me 5 startup ideas related to sustainability on a college campus."
  - BabsonGPT: Produces a list of ideas (e.g., reusable container system, energy dashboard, etc.).
- Programming help
  - User: "Help me understand what a Flask route is and show a tiny example."
  - BabsonGPT: Explains the concept and gives an `@app.route` snippet.
These conversations remain accessible in the sidebar until the user clears browser storage.
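For the programming-help example, the kind of `@app.route` snippet meant there is something like the following (my illustration, not actual model output):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/hello")
def hello():
    # Visiting http://127.0.0.1:5000/hello returns this string.
    return "Hello from Flask!"
```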
- BabsonGPT depends on internet access and a valid API key.
- It is not affiliated with Babson College; they may be mad at me for using their branding.
- As with any large language model (BabsonGPT is just a thin interface on top of the OpenAI model), it can still hallucinate or give incorrect answers.
The project evolved in several stages:
- Basic Flask + API call
  - Started with a minimal Flask app returning "Hello, World."
  - Added a `/chat` route that accepted a fixed prompt and returned a model response.
- Simple chat page
  - Created `index.html` with a basic text area and "Send" button.
  - Connected frontend to backend using `fetch` and JSON.
  - Implemented a basic message history array in JavaScript to maintain context.
- Styling and Babson branding
  - Added `style.css` to move toward a ChatGPT-like layout.
  - Introduced a left sidebar and main chat area.
  - Switched the color palette to match Babson's green (`#006644`) and made the chat window white for readability.
- Multiple chats + saving
  - Initially, only one conversation existed and resetting the app cleared it.
  - Upgraded to a multi-chat model:
    - `chats = [...]` array
    - `activeChatId` to track which chat is open
    - Sidebar "Recent" list populated from `chats`
  - Added `localStorage` persistence so chats stay around across page reloads.
- Polish and error handling
  - Introduced a simple loading state (`Send` → `Thinking...`).
  - Displayed error messages inside the chat if an API call failed.
  - Cleaned up titles and made the first user message set the chat name.
Overall, the project moved from "a single API call demo" to a small but realistic product.
This project builds on several tools, docs, and resources:
- OpenAI Python Client & API Docs – for understanding how to call `gpt-5-nano` and structure `messages`.
- Flask Documentation – for setting up routes, templates, and basic server behavior.
- ChatGPT – for extensive help throughout the project, especially with front-end development: JavaScript features (`fetch`, event listeners), `localStorage`, and so on.
- CSS reference material / layout patterns – for inspiration on building a two-column layout with a header and content area.
No external icon sets or custom fonts were imported; the app uses system fonts and simple text-based UI.
AI was very useful in making this project come to life, so it deserves credit as well.
- I used ChatGPT as a coding assistant to:
  - Brainstorm the project requirements for a local ChatGPT clone.
  - Help draft early versions of the Flask app, HTML structure, and CSS styling.
  - Figure out how best to implement multi-chat support.
- I reviewed, edited, and integrated all AI-generated code:
  - I frequently asked questions along the way about how certain features worked and, most importantly, how they worked together.
  - I debugged errors (e.g., import issues, missing keys, API errors) on my own.
  - I made decisions about architecture (Flask vs. other frameworks, using `localStorage`, model choice).
- I understand how the AI-assisted code works:
  - I can explain how `/chat` processes requests and calls the OpenAI API.
  - I can explain the structure of `messages`, the role of the system prompt, and how context is built.
  - I learned how to use `localStorage` to store and restore chats.
Ideas for further improvement:
- Streaming responses – Show the model’s answer as it generates text, rather than all at once.
- Settings panel – Allow the user to change the model, temperature, and maximum token settings from the UI.
- Export / import chats – Let the user download a chat transcript as a `.txt` or `.md` file.
- Authentication & multi-user mode – If this were deployed, add user accounts and server-side storage (e.g., a small database) instead of `localStorage`.
- Content filters / safety – Add additional checks to encourage safe and appropriate use.
Overall, BabsonGPT is a compact, local tool that demonstrates how to tie together Python, web development, and the OpenAI API in a way that covers the scope of our course.