

Dev Flux


ℹ What is Dev Flux?

Dev Flux is a centralized, software-defined platform designed to simulate, monitor, and record realistic edge device behavior under both normal and anomalous conditions.

Instead of relying on expensive physical hardware such as routers, IoT devices, or authentication servers, Dev Flux creates virtual sensors that behave like real systems using:

  • Mathematical models
  • Time-series equations
  • Controlled randomness
  • Scenario-driven state transitions

The platform enables reproducible experimentation, machine-learning-ready dataset generation, and multi-device anomaly research, all through an intuitive web-based interface.
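The core idea behind these virtual sensors can be sketched in a few lines. The following is a minimal illustration, not Dev Flux's actual model: a time-series equation (a sinusoidal baseline) combined with controlled randomness (seeded Gaussian jitter) and a state-driven shift for anomalous behavior. All names and constants here are hypothetical.

```python
import math
import random

def virtual_sensor_reading(t, anomalous=False, seed=None):
    """Illustrative reading for a hypothetical network sensor at time t (seconds)."""
    rng = random.Random(seed)
    baseline = 50.0 + 20.0 * math.sin(2 * math.pi * t / 300.0)  # periodic trend
    noise = rng.gauss(0, 2.0)                                   # controlled randomness
    value = baseline + noise
    if anomalous:
        value = value * 1.8 + 40.0                              # state-driven anomaly shift
    return round(value, 2)

print(virtual_sensor_reading(0, seed=1))                  # normal reading near 50
print(virtual_sensor_reading(0, anomalous=True, seed=1))  # clearly elevated reading
```

Because the randomness is seeded, the same inputs always produce the same outputs, which is what makes runs reproducible.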

🛠️ Why Dev Flux Exists

Modern cybersecurity and edge computing research depends heavily on realistic testbeds. However, most researchers and students face serious limitations:

  • Physical devices are expensive
  • Testbed setup is complex and time-consuming
  • Existing simulators generate unrealistic random data
  • Multi-device and coordinated attack scenarios are hard to model
  • Reproducible datasets are difficult to produce

Dev Flux solves these problems by providing a fully software-based alternative that produces near-realistic, time-correlated, labeled data suitable for research, teaching, and experimentation.

🌟 What This Project Enables

Dev Flux allows users to:

  • Simulate realistic device-level behavior
  • Inject controlled anomalies
  • Monitor behavior live through a centralized dashboard
  • Replay historical runs for inspection and analysis
  • Automatically generate ML-ready datasets with proper labeling
  • Study multi-device and distributed attack patterns
  • Experiment without any physical hardware

🔑 Key Benefits

  • Hardware-Free Testbed
    No routers, IoT boards, or servers required.

  • Equation-Driven Sensor Models
    Data evolves over time instead of appearing as random noise.

  • Scenario-Based Experimentation
    Attacks can be injected gradually, suddenly, or in coordinated waves.

  • Reproducibility
    Same scenario → same behavior → consistent datasets.

  • ML Readiness
    Clean labels, timestamps, and structured outputs for anomaly detection research.

  • Educational Value
    Ideal for demonstrations, labs, and learning environments.


🗂 Project Evolution and Versioning

Dev Flux is a single evolving project, developed iteratively to improve realism, usability, and research capability. Each version represents a clear milestone, not a separate idea. The repository reflects this journey transparently and intentionally. Below is a structured overview of each version.

Note: the hosted sites may go to sleep when idle. If a site does not respond immediately, open it and give it a minute to wake up; it will be online again shortly.

🔹 Version 0 – Core System Foundation



⏩ Overview

Dev Flux V0 establishes the functional foundation of the project. All core ideas, sensor models, dataset generation logic, and system behavior are present in this version.

Access Site: Dev Flux (Version 0).

⏩ What Was Implemented

  • Virtual sensors for:
    • Network (normal + anomalous)
    • Authentication (normal + anomalous)
  • Equation-driven, time-correlated data generation
  • Threaded sensor execution
  • Live monitoring through a web interface
  • Historical run replay
  • Dataset generation and download
  • Clean labeling for ML use

⏩ User Manual

  1. Launch the web application.
  2. Open the Sensors tab.
  3. Toggle individual sensors:
    • Network
    • Network Anomaly
    • Authentication
    • Authentication Anomaly
  4. Observe real-time data in live tables.
  5. Ensure all sensors are OFF.
  6. Go to the Dataset Generator, enter a dataset name, and turn dataset mode ON.
  7. The system automatically:
    • Starts all sensors
    • Synchronizes logs
    • Applies ground-truth labels
  8. Stop dataset mode to:
    • Package logs
    • Generate metadata
    • Create a ZIP dataset. [Download datasets directly from the UI]
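The "stop dataset mode" step (package logs, generate metadata, create a ZIP) can be sketched roughly as below. This is a hypothetical illustration; `package_dataset`, the file layout, and the metadata fields are assumptions, not Dev Flux's actual code.

```python
import json
import tempfile
import zipfile
from pathlib import Path

def package_dataset(log_paths, name, out_dir="."):
    """Bundle labeled JSONL logs plus a metadata file into one ZIP archive."""
    out = Path(out_dir) / f"{name}.zip"
    metadata = {"dataset": name, "files": [Path(p).name for p in log_paths]}
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in log_paths:
            zf.write(p, arcname=Path(p).name)       # package the logs
        zf.writestr("metadata.json", json.dumps(metadata, indent=2))  # generate metadata
    return out

# Demo: one labeled JSONL log line, packaged into a downloadable archive.
tmp = Path(tempfile.mkdtemp())
log = tmp / "network.jsonl"
log.write_text(json.dumps({"ts": 0, "value": 51.2, "label": "normal"}) + "\n")
archive = package_dataset([log], "demo_run", out_dir=tmp)
print(archive.name)  # demo_run.zip
```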

Best for: ML pipelines, academic experiments, reproducible research.

⏩ Technical Characteristics

  • Flask backend
  • JSONL-based logging
  • Mathematical and probabilistic sensor models
  • Bootstrap-based frontend
  • Premium UI with animated interactions

⏩ Purpose of V0

  • Prove feasibility of software-defined edge sensors
  • Demonstrate realistic data generation
  • Establish the complete end-to-end pipeline

🔹 Version 1 – UI Refinement & Code Stabilization



⏩ Overview

Dev Flux V1 is functionally equivalent to V0. The system behavior, data models, and outputs remain the same.

Access Site: Dev Flux (Version 1).

⏩ What Changed

  • Minor UI interaction refinements:
    • Smoother animations
    • Enhanced visual feedback
  • Internal code cleanup and structure improvements
  • Improved readability and maintainability
  • More consistent naming and organization

⏩ User Manual

  1. Launch the web application.
  2. Open the Sensors tab.
  3. Toggle individual sensors:
    • Network
    • Network Anomaly
    • Authentication
    • Authentication Anomaly
  4. Observe real-time data in live tables.
  5. Ensure all sensors are OFF.
  6. Go to the Dataset Generator, enter a dataset name, and turn dataset mode ON.
  7. The system automatically:
    • Starts all sensors
    • Synchronizes logs
    • Applies ground-truth labels
  8. Stop dataset mode to:
    • Package logs
    • Generate metadata
    • Create a ZIP dataset. [Download datasets directly from the UI]

Best for: ML pipelines, academic experiments, reproducible research.

⏩ What Remained the Same

  • Same sensor logic
  • Same equation-based models
  • Same dataset generation pipeline
  • Same capabilities and outputs

⏩ Purpose of V1

  • Polish the system without altering behavior
  • Improve maintainability and clarity
  • Prepare the project for further architectural expansion

🔹 Version 2 – Scenario-Driven Testbed Introduction



⏩ Overview

Dev Flux V2 marks the first real architectural shift. The project transitions from manual sensor control to a scenario-driven testbed model.

Access Site: Dev Flux (Version 2).

⏩ What Was Implemented

  • Scenario engine with time-based phases
  • Automated switching between normal and attack states
  • Coordinated behavior across multiple sensors
  • Central testbed controller
  • Testbed execution UI

⏩ User Manual

  1. Select and upload a scenario JSON file.
  2. Start the testbed.
  3. The system automatically:
    • Switches between normal and attack phases
    • Coordinates multiple sensors
  4. Monitor:
    • Live sensor data
    • Detected anomalies
  5. Stop the testbed to save the session.
  6. Load past sessions for analysis.

Best for: IDS testing, attack pattern evaluation, security research.

⏩ UI Scope

  • Testbed fully implemented and operational
  • Other system sections shown as informational placeholders

⏩ Technical Focus

  • Scenario JSON definitions
  • Timeline-driven state management
  • Centralized logging
  • Testbed-oriented UI layout
  • TCP-based internal sensor transport
  • Uploaded-scenario execution and past-session replay
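A scenario definition and its timeline-driven state lookup might look like the following sketch. The JSON schema shown here is illustrative only; the real Dev Flux scenario format may use different field names.

```python
import json

# Illustrative scenario: three time-based phases, one coordinated attack window.
scenario_json = """
{
  "name": "ddos_wave_demo",
  "phases": [
    {"start": 0,   "end": 60,  "state": "normal"},
    {"start": 60,  "end": 120, "state": "ddos_attack", "sensors": ["network"]},
    {"start": 120, "end": 180, "state": "normal"}
  ]
}
"""

def phase_at(scenario, elapsed_seconds):
    """Return the phase active at a given elapsed time, or None if past the end."""
    for phase in scenario["phases"]:
        if phase["start"] <= elapsed_seconds < phase["end"]:
            return phase
    return None

scenario = json.loads(scenario_json)
print(phase_at(scenario, 90)["state"])  # ddos_attack
```

A controller polling `phase_at` on a timer is enough to switch sensors between normal and attack states automatically, which is the essence of the orchestration shift introduced in V2.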

⏩ Repository & Source Note

The repository includes scenario files and the materials needed to understand or demonstrate the Version 2 flow from the live site.

If you need the full local source package, deployment-ready files, or the complete runnable Version 2 bundle, please contact me directly.

⏩ Purpose of V2

  • Introduce automation and repeatability
  • Enable structured experiments
  • Shift from manual control to orchestration

🔹 Version 3 – Testbed-Centric Research Platform



⏩ Overview

Dev Flux V3 refines V2 by narrowing focus entirely to the testbed. All non-essential UI sections are removed to create a clean, research-oriented interface.

Access Site: Dev Flux (Version 3).

⏩ What Changed

  • Testbed becomes the sole active UI module
  • Simplified interface for live monitoring and replay
  • Clear separation between:
    • Monitoring
    • Anomaly detection
    • Historical analysis
  • Same scenario-driven experiment flow with a cleaner research presentation

⏩ User Manual

  1. Select and upload a scenario JSON file.
  2. Start the testbed.
  3. The system automatically:
    • Switches between normal and attack phases
    • Coordinates multiple sensors
  4. Monitor:
    • Live sensor data
    • Detected anomalies
  5. Stop the testbed to save the session.

Best for: IDS testing, attack pattern evaluation, security research.

⏩ What Remained the Same

  • Same scenario engine
  • Same testbed controller
  • Same sensor logic and data generation
  • Same local testbed-style experiment workflow introduced in V2

⏩ Repository & Source Note

The repository and live site make it possible to understand the Version 3 interface and scenario-based workflow.

If you need the full local source package, full local run files, or the original runnable Version 3 build, please contact me directly.

⏩ Purpose of V3

  • Reduce UI noise
  • Emphasize experimentation and observation
  • Support academic and research demonstrations

🔹 Version 4 – Three-Sensor Clean UDP Testbed



⏩ Overview

Dev Flux V4 marks a major runtime expansion. This version introduces a three-sensor clean testbed built around a more complete STGen-style execution flow, covering Network, Authentication, and Host sensors.

Unlike the earlier web-first TCP scenario testbed, Version 4 shifts into a UDP-based local runtime model with separate server, sensors, client, and live browser showcase components. It also introduces a full training-and-detection workflow using generated server-side logs and an XGBoost multiclass model.

⏩ Access & Preview

Version 4 is handled differently from the earlier web-hosted versions.

  • The full Version 4 runtime is intended to be run locally on your PC.
  • Because this version depends on local UDP runtime processes and local log files, the full testbed is not exposed as a normal public access site like Versions 0–3.
  • If you need the full source code and local testbed package, please contact me directly.

Anomaly Detection Preview: Dev Flux Anomaly Detection

You can also open the anomaly detector locally using anomoly/index.html if that local copy is available in your package.

In this repository, check the Version 4 folder. There you can find:

  • the trained model file,
  • the label file,
  • and sample data.

You can upload those assets to the anomaly detection preview to test how the detector works, even without running the full local testbed.

⏩ What Was Implemented

  • Three-sensor runtime for:
    • Network
    • Authentication
    • Host
  • UDP-based sensor-to-server communication
  • Separate server, sensors, and client runtime roles
  • JSON-controlled sensor behavior and client actions
  • Local live browser showcase for current runtime visualization
  • Server-side log generation for dataset creation
  • XGBoost-based multiclass training workflow
  • Local and hosted anomaly detection workflow using trained model artifacts
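The UDP-based sensor-to-server transport can be illustrated with a minimal loopback exchange. This is a self-contained sketch of the pattern, not STGen's actual wire format; the field names in the reading are hypothetical.

```python
import json
import socket

# "Server" socket: receives sensor readings over UDP on an ephemeral port.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(2.0)
server_addr = server.getsockname()

# "Sensor" socket: sends one JSON-encoded reading to the server.
sensor = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reading = {"sensor_type": "host", "seq_no": 1, "cpu_load": 0.42}
sensor.sendto(json.dumps(reading).encode(), server_addr)

data, _ = server.recvfrom(4096)
received = json.loads(data.decode())
print(received["sensor_type"])  # host

sensor.close()
server.close()
```

In the real runtime the server, sensors, and client are separate processes; UDP keeps each sensor's send path fire-and-forget, which is why the roles can be split cleanly.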

⏩ How V4 Differs From Earlier Versions

  • Adds a third sensor family (Host) beyond the earlier Network and Authentication focus
  • Moves from the earlier TCP web-testbed style to a UDP-based STGen runtime
  • Splits the system into dedicated runtime components instead of a single hosted testbed page
  • Introduces a stronger data generation → training → retraining → detection loop
  • Supports a cleaner, more controlled baseline dataset before realism is increased further in V5

⏩ User Manual

  1. Run the Version 4 runtime locally.
  2. The system launches:
    • STGen Server
    • STGen Sensors
    • STGen Client
    • STGen Live Showcase
  3. Let the scenario complete and generate logs.
  4. Collect logs from launcher/server_sensor_log/.
  5. Train the XGBoost model using the generated logs.
  6. Save:
    • stgen_model_xgboost.json
    • stgen_label_classes.json
  7. Generate fresh logs again for evaluation.
  8. Optionally use remove_event.py to create unlabeled testing logs.
  9. Test the detector either:
    • locally through anomoly/index.html, or
    • through the hosted anomaly detection preview
  10. Upload the model, label file, and sample/generated logs to verify predictions.

Best for: clean dataset generation, baseline anomaly detection training, reproducible multi-sensor demonstrations.

⏩ Training Pipeline

Version 4 uses a dedicated XGBoost multiclass training pipeline that:

  • parses structured server-side logs from the three sensors
  • removes timestamp and seq_no from model input
  • one-hot encodes sensor_type
  • trains one shared model across all three sensor types
  • exports the model, label file, prediction rows, confusion matrix, metrics, and classification report

This creates a repeatable clean-data baseline that users can retrain later with new data in Kaggle, local environments, or Google Colab.
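The preprocessing half of that pipeline (drop `timestamp` and `seq_no`, one-hot encode `sensor_type`) can be sketched as below. Field names like `event`, `bytes_in`, and `packet_rate` are illustrative assumptions, not the actual log schema.

```python
import json

SENSOR_TYPES = ["network", "authentication", "host"]  # assumed category order

def row_to_features(jsonl_line):
    """Convert one server-side JSONL log line into (feature_vector, label)."""
    rec = json.loads(jsonl_line)
    label = rec.pop("event", None)     # assumed ground-truth label field
    rec.pop("timestamp", None)         # removed from model input
    rec.pop("seq_no", None)            # removed from model input
    stype = rec.pop("sensor_type", None)
    one_hot = [1.0 if stype == s else 0.0 for s in SENSOR_TYPES]
    numeric = [float(v) for _, v in sorted(rec.items())]  # stable feature order
    return one_hot + numeric, label

line = json.dumps({"timestamp": "2024-01-01T00:00:00", "seq_no": 7,
                   "sensor_type": "network", "bytes_in": 1200.0,
                   "packet_rate": 85.0, "event": "normal"})
features, label = row_to_features(line)
print(features, label)  # [1.0, 0.0, 0.0, 1200.0, 85.0] normal
```

Rows prepared this way from all three sensors can then be stacked into one matrix and fed to a single shared XGBoost multiclass model, as the pipeline describes.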

⏩ Repository & Source Note

The public repository is intended to provide the assets needed to test the detection workflow for Version 4.

For the full local runtime, source code bundle, and original local execution files, please contact me directly.

⏩ Purpose of V4

  • Extend Dev Flux into a more complete three-sensor runtime
  • Support full anomaly-detection experimentation beyond the earlier hosted testbed focus
  • Establish a clean baseline before moving toward more realistic messy-data generation in V5

🔹 Version 5 – Realism-Focused Messy Data Testbed



⏩ Overview

Dev Flux V5 builds directly on V4, but shifts the focus from a clean baseline into more realistic synthetic behavior generation.

The core three-sensor structure remains the same, but Version 5 introduces configurable realism styles that allow the data to become clean, semi-messy, messy, or noisy depending on scenario configuration. The result is a version that better reflects the irregularity and instability often found in real systems while still remaining controllable for research and ML experiments.

⏩ Access & Preview

Version 5 also relies on the local UDP runtime model and is meant to be run locally for full functionality.

  • The full Version 5 runtime is intended to be run locally on your PC.
  • Because this version depends on local UDP runtime processes, local scenario files, and local log outputs, it is not exposed as a normal public hosted testbed site like Versions 0–3.
  • If you need the full source code and local Version 5 package, please contact me directly.

Anomaly Detection Preview: Dev Flux Anomaly Detection

You can also run the detector locally with anomoly/index.html if that local copy is available in your package.

In this repository, check the Version 5 folder. There you can find:

  • the trained model file,
  • the label file,
  • and sample data.

You can upload those assets to the anomaly detection preview to test how the Version 5 detection flow works without needing to run the full local runtime first.

⏩ What Was Implemented

  • Retains the three-sensor runtime for:
    • Network
    • Authentication
    • Host
  • Keeps the UDP-based STGen-style execution model introduced in V4
  • Adds configurable realism styles:
    • clean
    • semi-messy
    • messy
    • noisy
  • Adds time-varying drift, controlled instability, and more natural transitions between states
  • Adds transition blending between previous and current phases
  • Adds jittered sample intervals instead of perfectly fixed timing
  • Adds configurable spikes, bursts, and recovery bias
  • Supports per-phase and per-sensor behavior control through scenario configuration
  • Upgrades the training workflow with more advanced split, evaluation, and export logic
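A subset of these realism mechanics (jittered sample intervals, time-varying drift, transient spikes, style presets) can be sketched as follows. The preset values and field names are illustrative guesses, not Dev Flux's actual configuration; transition blending is omitted for brevity.

```python
import math
import random

STYLES = {  # illustrative presets, not the real parameters
    "clean":      {"jitter": 0.00, "drift": 0.00, "spike_p": 0.00},
    "semi-messy": {"jitter": 0.10, "drift": 0.02, "spike_p": 0.01},
    "messy":      {"jitter": 0.25, "drift": 0.05, "spike_p": 0.03},
    "noisy":      {"jitter": 0.40, "drift": 0.10, "spike_p": 0.08},
}

def next_sample(t, base_interval, style, rng):
    """Produce the next (time, value) sample under a given realism style."""
    cfg = STYLES[style]
    # Jittered sampling interval instead of perfectly fixed timing.
    interval = base_interval * (1 + rng.uniform(-cfg["jitter"], cfg["jitter"]))
    # Baseline signal plus slow time-varying drift.
    value = 50.0 + cfg["drift"] * t + 5.0 * math.sin(t / 30.0)
    # Occasional transient spike.
    if rng.random() < cfg["spike_p"]:
        value *= rng.uniform(1.5, 3.0)
    return t + interval, value

rng = random.Random(42)
t = 0.0
for _ in range(3):
    t, v = next_sample(t, base_interval=1.0, style="messy", rng=rng)
    print(round(t, 2), round(v, 1))
```

With the "clean" preset all of these effects collapse to zero, which is how a single generator can cover the whole clean-to-noisy range through configuration alone.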

⏩ What Makes the Data More Realistic

Version 5 can simulate behavior that is less perfectly separated than the clean baseline. Instead of only switching modes, it can also represent:

  • drift over time
  • blended transitions between phases
  • irregular sampling intervals
  • transient spikes
  • burst behavior
  • recovery effects after abnormal periods
  • different realism levels depending on the configured style

This makes Version 5 more suitable for robustness testing, harder anomaly detection experiments, and more realistic dataset generation.

⏩ How V5 Differs From V4

  • V4 is the clean baseline; V5 is the realism-focused extension
  • V4 emphasizes structured, clearly separated data; V5 introduces controlled irregularity and realism presets
  • V5 supports semi-messy, messy, and noisy behavior profiles on top of the clean mode
  • V5 uses a more advanced training workflow with stronger split handling, evaluation exports, and model-tracking outputs
  • V5 is better suited for testing model robustness on synthetic data that is closer to real operational messiness

⏩ User Manual

  1. Run the Version 5 runtime locally.
  2. Configure the desired realism level through the sensor scenario settings.
  3. Launch the runtime and generate messy or realism-enhanced server-side logs.
  4. Collect logs from launcher/server_sensor_log/.
  5. Train or retrain the Version 5 model using the messy-data training pipeline.
  6. Save the exported model and labels.
  7. Use the anomaly detection preview to upload:
    • the trained model,
    • the label file,
    • and sample/generated logs
  8. Compare how the detector behaves on cleaner data versus more realism-heavy data.

Best for: realism-aware anomaly detection, robustness testing, messy synthetic data experiments, advanced multi-sensor evaluation.

⏩ Advanced Training Pipeline

Version 5 uses a more advanced XGBoost-based training workflow than Version 4. The pipeline:

  • sorts rows by sensor stream, timestamp, sequence number, and source line
  • assigns stable row identifiers for tracking
  • supports chronological per-sensor splitting (earliest 70% train, next 15% validation, last 15% test)
  • can automatically fall back to label-stratified random splitting if a strict time split fails to preserve all classes
  • uses early stopping during XGBoost training
  • exports richer evaluation artifacts such as:
    • feature names
    • feature importance
    • per-sensor metrics
    • top confusions
    • split assignments
    • confusion matrix
    • classification report
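The chronological per-sensor split described above can be sketched as follows. This is a minimal illustration of the 70/15/15 time-ordered split only; the label-stratified random fallback is omitted, and the row fields are assumed.

```python
def chronological_split(rows, train=0.70, val=0.15):
    """Split rows per sensor stream by time: earliest to train, then val, then test.

    Each row is a dict with at least "sensor_type" and "timestamp".
    """
    by_sensor = {}
    for row in rows:
        by_sensor.setdefault(row["sensor_type"], []).append(row)
    splits = {"train": [], "val": [], "test": []}
    for stream in by_sensor.values():
        stream.sort(key=lambda r: r["timestamp"])   # chronological order per sensor
        n = len(stream)
        n_train, n_val = int(n * train), int(n * val)
        splits["train"] += stream[:n_train]
        splits["val"] += stream[n_train:n_train + n_val]
        splits["test"] += stream[n_train + n_val:]
    return splits

rows = [{"sensor_type": "network", "timestamp": i} for i in range(20)]
s = chronological_split(rows)
print(len(s["train"]), len(s["val"]), len(s["test"]))  # 14 3 3
```

Splitting by time rather than at random matters here because the V5 data drifts: a random split would leak future behavior into training, inflating the reported metrics.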

In the provided Version 5 training results, the detector reaches near-perfect performance, with only a very small confusion between DDoS and UDP_Flood in the reported top confusions. This highlights both the strength of the structured pipeline and the fact that the messy data still remains coherent enough for meaningful training and testing.

⏩ Repository & Source Note

The public repository is intended to provide the assets needed to test the Version 5 detection flow, including model artifacts and sample data.

For the full local runtime, source code bundle, and original local execution package, please contact me directly.

⏩ Purpose of V5

  • Move Dev Flux closer to realistic synthetic data behavior
  • Support stronger robustness testing for anomaly detection
  • Preserve reproducibility while introducing more operational messiness
  • Provide a better bridge between clean testbeds and real-world edge behavior

🔹 Version S – Streamlit Demo & Concept Showcase



⏩ Overview

Dev Flux VS is not a separate implementation, but a demonstration layer built on the same conceptual model.

Access Site: Dev Flux (Streamlit Version).

⏩ What It Provides

  • Embedded, predefined scenarios
  • Timeline-driven attack visualization
  • Simplified controls for non-technical audiences
  • No setup or backend required

⏩ User Manual

  1. Open the Streamlit app.
  2. Select scenario duration (1, 2, or 3 minutes).
  3. Click Run.
  4. Watch:
    • Timeline progression
    • Active attack indicators
    • Attack summary table
  5. Pause, resume, or reset the simulation.

Best for: Quick demos, non-technical audiences.

⏩ Technical Characteristics

  • Streamlit-based interface
  • Auto-refreshing simulation timeline
  • Visual attack indicators
  • Stateless demo execution

⏩ Purpose of VS

  • Public demonstration
  • Quick understanding of the system concept
  • Basic presentation and showcase use

📩 Source Code, Local Runtime, and Collaboration

The public repository is designed to showcase the evolution of Dev Flux and provide selected materials for testing or demonstration.

What is publicly testable

  • Versions 0–3: the hosted sites allow direct interaction with the live web builds.
  • Version S: the Streamlit demo is publicly accessible for simplified concept demonstration.
  • Versions 4–5: the anomaly detection preview can be tested using the provided model, label, and sample data assets from the corresponding repository folders.

What is available on request

For users who need any of the following:

  • full source code packages,
  • local execution files,
  • original runtime bundles,
  • deployment-ready copies,
  • or research/demo collaboration support,

please contact me directly.

Contact

💙 Happy Coding 💙

