Dev Flux is a centralized, software-defined platform designed to simulate, monitor, and record realistic edge device behavior under both normal and anomalous conditions.
Instead of relying on expensive physical hardware such as routers, IoT devices, or authentication servers, Dev Flux creates virtual sensors that behave like real systems using:
- Mathematical models
- Time-series equations
- Controlled randomness
- Scenario-driven state transitions
The platform enables reproducible experimentation, machine-learning-ready dataset generation, and multi-device anomaly research, all through an intuitive web-based interface.
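As an illustrative sketch of how these ingredients can combine (the function and parameters below are hypothetical, not Dev Flux's actual internals), an equation-driven virtual sensor is essentially a time-series equation plus seeded, controlled randomness:

```python
import math
import random

def virtual_sensor(seed=42, n=10):
    """Hypothetical sketch of an equation-driven sensor: a daily-cycle
    sine baseline plus seeded Gaussian noise, so the same seed always
    reproduces the same run."""
    rng = random.Random(seed)  # controlled randomness
    samples = []
    for t in range(n):
        baseline = 50 + 10 * math.sin(2 * math.pi * t / 24)  # time-series equation
        noise = rng.gauss(0, 1.5)
        samples.append(round(baseline + noise, 2))
    return samples

# Same seed -> same behavior -> consistent datasets.
assert virtual_sensor(seed=42) == virtual_sensor(seed=42)
```

Because the randomness is seeded, reruns of the same scenario stay reproducible while still looking organic rather than uniformly random.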
Modern cybersecurity and edge computing research depends heavily on realistic testbeds. However, most researchers and students face serious limitations:
- Physical devices are expensive
- Testbed setup is complex and time-consuming
- Existing simulators generate unrealistic random data
- Multi-device and coordinated attack scenarios are hard to model
- Reproducible datasets are difficult to produce
Dev Flux addresses these limitations by providing a fully software-based alternative that produces near-realistic, time-correlated, and labeled data suitable for research, teaching, and experimentation.
Dev Flux allows users to:
- Simulate realistic device-level behavior
- Inject controlled anomalies
- Monitor behavior live through a centralized dashboard
- Replay historical runs for inspection and analysis
- Automatically generate ML-ready datasets with proper labeling
- Study multi-device and distributed attack patterns
- Experiment without any physical hardware
- Hardware-Free Testbed: no routers, IoT boards, or servers required.
- Equation-Driven Sensor Models: data evolves over time instead of appearing as random noise.
- Scenario-Based Experimentation: attacks can be injected gradually, suddenly, or in coordinated waves.
- Reproducibility: same scenario → same behavior → consistent datasets.
- ML Readiness: clean labels, timestamps, and structured outputs for anomaly detection research.
- Educational Value: ideal for demonstrations, labs, and learning environments.
Dev Flux is a single evolving project, developed iteratively to improve realism, usability, and research capability. Each version represents a clear milestone, not a separate idea. The repository reflects this journey transparently and intentionally. Below is a structured overview of each version.
Hosted sites may go to sleep when not in use. Just open the site and give it a minute to wake up; it will be online again shortly.
Dev Flux V0 establishes the functional foundation of the project. All core ideas, sensor models, dataset generation logic, and system behavior are present in this version.
Access Site: Dev Flux (Version 0).
- Virtual sensors for:
- Network (normal + anomalous)
- Authentication (normal + anomalous)
- Equation-driven, time-correlated data generation
- Threaded sensor execution
- Live monitoring through a web interface
- Historical run replay
- Dataset generation and download
- Clean labeling for ML use
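A labeled, timestamped JSONL record might look like the following sketch (the field names are illustrative; check a generated dataset for the real schema):

```python
import json
import time

def log_reading(path, sensor, value, label):
    """Append one labeled, timestamped reading as a JSON line (JSONL).
    Field names here are hypothetical, not Dev Flux's exact schema."""
    record = {
        "timestamp": time.time(),
        "sensor": sensor,   # e.g. "network" or "auth"
        "value": value,
        "label": label,     # ground-truth: "normal" or "anomaly"
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

log_reading("run.jsonl", "network", 53.2, "normal")
log_reading("run.jsonl", "auth", 1, "anomaly")
```

One JSON object per line keeps logs appendable while sensors run and trivially parseable for ML pipelines afterward.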
- Open the web application.
- Open the Sensors tab.
- Toggle individual sensors:
- Network
- Network Anomaly
- Authentication
- Authentication Anomaly
- Observe real-time data in live tables.
- Ensure all sensors are OFF.
- Go to Dataset Generator, enter a dataset name, and turn dataset mode ON.
- The system automatically:
- Starts all sensors
- Synchronizes logs
- Applies ground-truth labels
- Stop dataset mode to:
- Package logs
- Generate metadata
- Create a ZIP dataset. [Download datasets directly from the UI]
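The packaging step can be sketched roughly as follows (the metadata fields and file layout are illustrative, not Dev Flux's actual format):

```python
import json
import zipfile

def package_dataset(name, log_paths):
    """Sketch: bundle log files plus a metadata record into one ZIP.
    Metadata fields and layout are hypothetical."""
    meta = {"dataset": name, "files": log_paths, "labels": "ground-truth"}
    zip_path = f"{name}.zip"
    with zipfile.ZipFile(zip_path, "w") as z:
        z.writestr("metadata.json", json.dumps(meta, indent=2))
        for p in log_paths:
            z.write(p)  # include each synchronized log file
    return zip_path

with open("demo_log.jsonl", "w") as f:  # stand-in for a sensor log
    f.write('{"label": "normal"}\n')
print(package_dataset("demo_run", ["demo_log.jsonl"]))
```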
Best for: ML pipelines, academic experiments, reproducible research.
- Flask backend
- JSONL-based logging
- Mathematical and probabilistic sensor models
- Bootstrap-based frontend
- Premium UI with animated interactions
- Prove feasibility of software-defined edge sensors
- Demonstrate realistic data generation
- Establish the complete end-to-end pipeline
Dev Flux V1 is functionally equivalent to V0. The system behavior, data models, and outputs remain the same.
Access Site: Dev Flux (Version 1).
- Minor UI interaction refinements:
- Smoother animations
- Enhanced visual feedback
- Internal code cleanup and structure improvements
- Improved readability and maintainability
- More consistent naming and organization
- Open the web application.
- Open the Sensors tab.
- Toggle individual sensors:
- Network
- Network Anomaly
- Authentication
- Authentication Anomaly
- Observe real-time data in live tables.
- Ensure all sensors are OFF.
- Go to Dataset Generator, enter a dataset name, and turn dataset mode ON.
- The system automatically:
- Starts all sensors
- Synchronizes logs
- Applies ground-truth labels
- Stop dataset mode to:
- Package logs
- Generate metadata
- Create a ZIP dataset. [Download datasets directly from the UI]
Best for: ML pipelines, academic experiments, reproducible research.
- Same sensor logic
- Same equation-based models
- Same dataset generation pipeline
- Same capabilities and outputs
- Polish the system without altering behavior
- Improve maintainability and clarity
- Prepare the project for further architectural expansion
Dev Flux V2 marks the first real architectural shift. The project transitions from manual sensor control to a scenario-driven testbed model.
Access Site: Dev Flux (Version 2).
- Scenario engine with time-based phases
- Automated switching between normal and attack states
- Coordinated behavior across multiple sensors
- Central testbed controller
- Testbed execution UI
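A phase-based scenario file for this kind of engine might look like the sketch below (this schema is hypothetical; consult the scenario files bundled in the repository for the real format):

```python
import json

# Hypothetical scenario: 60 s normal, 30 s coordinated attack, 30 s recovery.
scenario = {
    "name": "demo_ddos_wave",
    "phases": [
        {"state": "normal", "duration_s": 60, "sensors": ["network", "auth"]},
        {"state": "attack", "duration_s": 30, "attack": "ddos",
         "sensors": ["network", "auth"]},  # coordinated across sensors
        {"state": "normal", "duration_s": 30},
    ],
}
print(json.dumps(scenario, indent=2))
```

A timeline of phases like this is what lets the engine switch every sensor between normal and attack states automatically and repeatably.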
- Select & Upload a scenario JSON file.
- Start the testbed.
- The system automatically:
- Switches between normal and attack phases
- Coordinates multiple sensors
- Monitor:
- Live sensor data
- Detected anomalies
- Stop the testbed to save the session.
- Load past sessions for analysis.
Best for: IDS testing, attack pattern evaluation, security research.
- Testbed fully implemented and operational
- Other system sections shown as informational placeholders
- Scenario JSON definitions
- Timeline-driven state management
- Centralized logging
- Testbed-oriented UI layout
- TCP-based internal sensor transport
- Uploaded-scenario execution and past-session replay
The repository includes scenario files and the materials needed to understand or demonstrate the Version 2 flow from the live site.
If you need the full local source package, deployment-ready files, or the complete runnable Version 2 bundle, please contact me directly.
- Introduce automation and repeatability
- Enable structured experiments
- Shift from manual control to orchestration
Dev Flux V3 refines V2 by narrowing focus entirely to the testbed. All non-essential UI sections are removed to create a clean, research-oriented interface.
Access Site: Dev Flux (Version 3).
- Testbed becomes the sole active UI module
- Simplified interface for live monitoring and replay
- Clear separation between:
- Monitoring
- Anomaly detection
- Historical analysis
- Same scenario-driven experiment flow with a cleaner research presentation
- Select & Upload a scenario JSON file.
- Start the testbed.
- The system automatically:
- Switches between normal and attack phases
- Coordinates multiple sensors
- Monitor:
- Live sensor data
- Detected anomalies
- Stop the testbed to save the session.
Best for: IDS testing, attack pattern evaluation, security research.
- Same scenario engine
- Same testbed controller
- Same sensor logic and data generation
- Same local testbed-style experiment workflow introduced in V2
The repository and live site make it possible to understand the Version 3 interface and scenario-based workflow.
If you need the full local source package, full local run files, or the original runnable Version 3 build, please contact me directly.
- Reduce UI noise
- Emphasize experimentation and observation
- Support academic and research demonstrations
Dev Flux V4 marks a major runtime expansion. This version introduces a three-sensor clean testbed built around a more complete STGen-style execution flow, covering Network, Authentication, and Host sensors.
Unlike the earlier web-first TCP scenario testbed, Version 4 shifts into a UDP-based local runtime model with separate server, sensors, client, and live browser showcase components. It also introduces a full training-and-detection workflow using generated server-side logs and an XGBoost multiclass model.
Version 4 is handled differently from the earlier web-hosted versions.
- The full Version 4 runtime is intended to be run locally on your PC.
- Because this version depends on local UDP runtime processes and local log files, the full testbed is not exposed as a normal public access site like Versions 0–3.
- If you need the full source code and local testbed package, please contact me directly.
Anomaly Detection Preview: Dev Flux Anomaly Detection
You can also open the anomaly detector locally using anomoly/index.html if that local copy is available in your package.
In this repository, check the Version 4 folder. There you can find:
- the trained model file,
- the label file,
- and sample data.
You can upload those assets to the anomaly detection preview to test how the detector works, even without running the full local testbed.
- Three-sensor runtime for:
- Network
- Authentication
- Host
- UDP-based sensor-to-server communication
- Separate server, sensors, and client runtime roles
- JSON-controlled sensor behavior and client actions
- Local live browser showcase for current runtime visualization
- Server-side log generation for dataset creation
- XGBoost-based multiclass training workflow
- Local and hosted anomaly detection workflow using trained model artifacts
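The sensor-to-server transport can be sketched with plain UDP sockets (addresses, ports, and message fields below are illustrative, not the actual STGen configuration):

```python
import json
import socket

# Server side: bind a UDP socket; the OS picks a free port here.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
server.settimeout(2)
addr = server.getsockname()

# Sensor side: send one JSON-encoded reading as a datagram.
sensor = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reading = {"sensor_type": "host", "cpu": 37.5, "seq_no": 1}
sensor.sendto(json.dumps(reading).encode(), addr)

# Server receives and decodes the reading.
data, _ = server.recvfrom(4096)
received = json.loads(data)
print(received["sensor_type"])
sensor.close()
server.close()
```

Connectionless UDP keeps each sensor process independent of the server's lifecycle, which fits a runtime with separately launched components.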
- Adds a third sensor family (Host) beyond the earlier Network and Authentication focus
- Moves from the earlier TCP web-testbed style to a UDP-based STGen runtime
- Splits the system into dedicated runtime components instead of a single hosted testbed page
- Introduces a stronger data generation → training → retraining → detection loop
- Supports a cleaner, more controlled baseline dataset before realism is increased further in V5
- Run the Version 4 runtime locally.
- The system launches:
- STGen Server
- STGen Sensors
- STGen Client
- STGen Live Showcase
- Let the scenario complete and generate logs.
- Collect logs from `launcher/server_sensor_log/`.
- Train the XGBoost model using the generated logs.
- Save:
  - `stgen_model_xgboost.json`
  - `stgen_label_classes.json`
- Generate fresh logs again for evaluation.
- Optionally use `remove_event.py` to create unlabeled testing logs.
- Test the detector either:
  - locally through `anomoly/index.html`, or
  - through the hosted anomaly detection preview
- Upload the model, label file, and sample/generated logs to verify predictions.
Best for: clean dataset generation, baseline anomaly detection training, reproducible multi-sensor demonstrations.
Version 4 uses a dedicated XGBoost multiclass training pipeline that:
- parses structured server-side logs from the three sensors
- removes `timestamp` and `seq_no` from model input
- one-hot encodes `sensor_type`
- trains one shared model across all three sensor types
- exports the model, label file, prediction rows, confusion matrix, metrics, and classification report
This creates a repeatable clean-data baseline that users can retrain later with new data in Kaggle, local environments, or Google Colab.
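The feature handling described above can be sketched in plain Python (the real pipeline trains an XGBoost model; this only illustrates dropping `timestamp`/`seq_no` and one-hot encoding `sensor_type`, with illustrative field names):

```python
SENSOR_TYPES = ["network", "auth", "host"]  # assumed category order

def to_features(row):
    """Drop timestamp/seq_no and one-hot encode sensor_type so that a
    single shared model can serve all three sensor families."""
    feats = {k: v for k, v in row.items()
             if k not in ("timestamp", "seq_no", "sensor_type", "label")}
    for s in SENSOR_TYPES:
        feats[f"sensor_type_{s}"] = 1 if row["sensor_type"] == s else 0
    return feats

row = {"timestamp": 1700000000, "seq_no": 7, "sensor_type": "host",
       "cpu": 91.0, "label": "CPU_Spike"}
print(to_features(row))
```

Removing `timestamp` and `seq_no` keeps the model from memorizing run position instead of learning actual behavior patterns.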
The public repository is intended to provide the assets needed to test the detection workflow for Version 4.
For the full local runtime, source code bundle, and original local execution files, please contact me directly.
- Extend Dev Flux into a more complete three-sensor runtime
- Support full anomaly-detection experimentation beyond the earlier hosted testbed focus
- Establish a clean baseline before moving toward more realistic messy-data generation in V5
Dev Flux V5 builds directly on V4, but shifts the focus from a clean baseline into more realistic synthetic behavior generation.
The core three-sensor structure remains the same, but Version 5 introduces configurable realism styles that allow the data to become clean, semi-messy, messy, or noisy depending on scenario configuration. The result is a version that better reflects the irregularity and instability often found in real systems while still remaining controllable for research and ML experiments.
Version 5 also relies on the local UDP runtime model and is meant to be run locally for full functionality.
- The full Version 5 runtime is intended to be run locally on your PC.
- Because this version depends on local UDP runtime processes, local scenario files, and local log outputs, it is not exposed as a normal public hosted testbed site like Versions 0–3.
- If you need the full source code and local Version 5 package, please contact me directly.
Anomaly Detection Preview: Dev Flux Anomaly Detection
You can also run the detector locally with anomoly/index.html if that local copy is available in your package.
In this repository, check the Version 5 folder. There you can find:
- the trained model file,
- the label file,
- and sample data.
You can upload those assets to the anomaly detection preview to test how the Version 5 detection flow works without needing to run the full local runtime first.
- Retains the three-sensor runtime for:
- Network
- Authentication
- Host
- Keeps the UDP-based STGen-style execution model introduced in V4
- Adds configurable realism styles:
- clean
- semi-messy
- messy
- noisy
- Adds time-varying drift, controlled instability, and more natural transitions between states
- Adds transition blending between previous and current phases
- Adds jittered sample intervals instead of perfectly fixed timing
- Adds configurable spikes, bursts, and recovery bias
- Supports per-phase and per-sensor behavior control through scenario configuration
- Upgrades the training workflow with more advanced split, evaluation, and export logic
Version 5 can simulate behavior that is less perfectly separated than the clean baseline. Instead of only switching modes, it can also represent:
- drift over time
- blended transitions between phases
- irregular sampling intervals
- transient spikes
- burst behavior
- recovery effects after abnormal periods
- different realism levels depending on the configured style
This makes Version 5 more suitable for robustness testing, harder anomaly detection experiments, and more realistic dataset generation.
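A minimal sketch of what a realism-style generator could look like, assuming hypothetical internals (the style names mirror the document; the noise levels, spike rates, and jitter factors are invented for illustration):

```python
import random

def realism_stream(style="messy", n=20, seed=7):
    """Sketch of realism styles: a drifting baseline, jittered sample
    intervals, and occasional transient spikes, all scaled by the
    configured style. Parameters are illustrative only."""
    level = {"clean": 0.0, "semi-messy": 0.3, "messy": 0.6, "noisy": 1.0}[style]
    rng = random.Random(seed)
    t, value, out = 0.0, 50.0, []
    for _ in range(n):
        value += rng.gauss(0, 0.2 + level)           # drift over time
        sample = value + rng.gauss(0, 1 + 3 * level)
        if rng.random() < 0.05 * level:              # transient spike
            sample += rng.uniform(20, 40)
        out.append((round(t, 2), round(sample, 2)))
        t += 1.0 + rng.uniform(-0.3, 0.3) * level    # jittered interval
    return out

print(realism_stream("clean", n=3))
```

Note that because the randomness stays seeded, even "noisy" runs remain reproducible, which is what keeps the messier styles usable for controlled ML experiments.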
- V4 is the clean baseline; V5 is the realism-focused extension
- V4 emphasizes structured, clearly separated data; V5 introduces controlled irregularity and realism presets
- V5 supports semi-messy, messy, and noisy behavior profiles on top of the clean mode
- V5 uses a more advanced training workflow with stronger split handling, evaluation exports, and model-tracking outputs
- V5 is better suited for testing model robustness on synthetic data that is closer to real operational messiness
- Run the Version 5 runtime locally.
- Configure the desired realism level through the sensor scenario settings.
- Launch the runtime and generate messy or realism-enhanced server-side logs.
- Collect logs from `launcher/server_sensor_log/`.
- Train or retrain the Version 5 model using the messy-data training pipeline.
- Save the exported model and labels.
- Use the anomaly detection preview to upload:
- the trained model,
- the label file,
- and sample/generated logs.
- Compare how the detector behaves on cleaner data versus more realism-heavy data.
Best for: realism-aware anomaly detection, robustness testing, messy synthetic data experiments, advanced multi-sensor evaluation.
Version 5 uses a more advanced XGBoost-based training workflow than Version 4. The pipeline:
- sorts rows by sensor stream, timestamp, sequence number, and source line
- assigns stable row identifiers for tracking
- supports chronological per-sensor splitting (earliest 70% train, next 15% validation, last 15% test)
- can automatically fall back to label-stratified random splitting if a strict time split fails to preserve all classes
- uses early stopping during XGBoost training
- exports richer evaluation artifacts such as:
- feature names
- feature importance
- per-sensor metrics
- top confusions
- split assignments
- confusion matrix
- classification report
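The splitting logic described above can be sketched as follows (a hypothetical simplification: the real pipeline splits per sensor stream and uses richer criteria; here the fallback triggers whenever the strict time split would drop a class from any split):

```python
import random
from collections import defaultdict

def split_rows(rows, train=0.70, val=0.15, seed=0):
    """Sketch of a chronological 70/15/15 split over time-sorted rows,
    falling back to label-stratified random splitting when the time
    split fails to preserve all classes. Illustrative only."""
    n = len(rows)
    a, b = int(n * train), int(n * (train + val))
    parts = [rows[:a], rows[a:b], rows[b:]]
    classes = {r["label"] for r in rows}
    if all(classes == {r["label"] for r in p} for p in parts):
        return parts  # strict chronological split preserved every class
    # Fallback: group by label, shuffle within each class, split per class.
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for r in rows:
        by_label[r["label"]].append(r)
    out = [[], [], []]
    for grp in by_label.values():
        rng.shuffle(grp)
        i, j = int(len(grp) * train), int(len(grp) * (train + val))
        out[0] += grp[:i]
        out[1] += grp[i:j]
        out[2] += grp[j:]
    return out
```

The chronological-first design matters for time-correlated data: evaluating on strictly later samples is a harder, more honest test than a random shuffle.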
In the provided Version 5 training results, the detector reaches near-perfect performance, with only a very small confusion between DDoS and UDP_Flood in the reported top confusions. This highlights both the strength of the structured pipeline and the fact that the messy data still remains coherent enough for meaningful training and testing.
The public repository is intended to provide the assets needed to test the Version 5 detection flow, including model artifacts and sample data.
For the full local runtime, source code bundle, and original local execution package, please contact me directly.
- Move Dev Flux closer to realistic synthetic data behavior
- Support stronger robustness testing for anomaly detection
- Preserve reproducibility while introducing more operational messiness
- Provide a better bridge between clean testbeds and real-world edge behavior
Dev Flux VS is not a separate implementation, but a demonstration layer built on the same conceptual model.
Access Site: Dev Flux (Streamlit Version).
- Embedded, predefined scenarios
- Timeline-driven attack visualization
- Simplified controls for non-technical audiences
- No setup or backend required
- Open the Streamlit app.
- Select scenario duration (1, 2, or 3 minutes).
- Click Run.
- Watch:
- Timeline progression
- Active attack indicators
- Attack summary table
- Pause, resume, or reset the simulation.
Best for: Quick demos, non-technical audiences.
- Streamlit-based interface
- Auto-refreshing simulation timeline
- Visual attack indicators
- Stateless demo execution
- Public demonstration
- Quick understanding of the system concept
- Basic presentation and showcase use
The public repository is designed to showcase the evolution of Dev Flux and provide selected materials for testing or demonstration.
- Versions 0–3: the hosted sites allow direct interaction with the live web builds.
- Version S: the Streamlit demo is publicly accessible for simplified concept demonstration.
- Versions 4–5: the anomaly detection preview can be tested using the provided model, label, and sample data assets from the corresponding repository folders.
For users who need any of the following:
- full source code packages,
- local execution files,
- original runtime bundles,
- deployment-ready copies,
- or research/demo collaboration support,
please contact me directly.
- GitHub: Shourav Deb
- LinkedIn: Shourav Deb
- Email: heyneeddev@gmail.com
💙 Happy Coding 💙