404-GeniusNotFound/Quantageddon_1.0


πŸ“ˆ Quantageddon 1.0

A Time-Series Feature Engineering and Ridge Regression Project


🎯 Objective

Quantageddon 1.0 is a machine learning project designed to explore feature engineering and regression modeling for time-series financial data. The focus is on predicting the target variable "Open" (a stock's opening price) by generating meaningful features and applying a Ridge regression model with hyperparameter tuning.


πŸ§ͺ Pipeline Overview

πŸ“Š Phase 1: Feature Engineering

  • Generated time-based features from datetime columns (e.g., day of week, day of year).
  • Created interaction features by combining original variables.
  • Computed Pearson correlations between features and target (Open) to evaluate usefulness.
  • Removed features with low or negative correlation to reduce noise and overfitting.
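The steps above can be sketched as follows. This is a minimal illustration on synthetic data, since the competition dataset is not included here; the column names, the synthetic price generation, and the 0.1 correlation cutoff are assumptions, not values taken from the notebook:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the competition data (real columns may differ)
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "Date": pd.date_range("2023-01-02", periods=n, freq="D"),
    "High": rng.normal(105, 5, n),
    "Low": rng.normal(95, 5, n),
})
df["Open"] = (df["High"] + df["Low"]) / 2 + rng.normal(0, 1, n)

# Time-based features from the datetime column
df["day_of_week"] = df["Date"].dt.dayofweek
df["day_of_year"] = df["Date"].dt.dayofyear

# Interaction feature combining original variables
df["high_low_range"] = df["High"] - df["Low"]

# Pearson correlation of each candidate feature with the target
candidates = ["High", "Low", "day_of_week", "day_of_year", "high_low_range"]
corr = df[candidates + ["Open"]].corr()["Open"].drop("Open")

# Drop features with low or negative correlation (hypothetical 0.1 cutoff)
selected = corr[corr > 0.1].index.tolist()
print(selected)
```

Filtering on the signed correlation (rather than its absolute value) matches the bullet above: negatively correlated features are dropped along with weakly correlated ones.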

πŸ› οΈ Tools Used:

  • pandas, numpy for data handling
  • seaborn, matplotlib for visualization
  • sklearn for modeling and evaluation

πŸ€– Phase 2: Modeling with Ridge Regression

  • Used Ridge Regression (L2-regularized linear model) from sklearn.linear_model.
  • Implemented GridSearchCV for hyperparameter tuning of the alpha parameter.
  • Trained the model on engineered features.
  • Evaluated performance using standard regression metrics.
  • Final predictions were exported as a .csv file for submission or further use.
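A hedged sketch of this modeling phase, again on synthetic stand-in features; the alpha grid, the RMSE scoring choice, and the 5-fold split are assumptions and may differ from the notebook:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

# Synthetic stand-in for the engineered feature matrix
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = X @ np.array([1.5, -2.0, 0.5, 0.0]) + rng.normal(scale=0.1, size=200)

# L2-regularized linear model with cross-validated search over alpha
grid = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [0.01, 0.1, 1.0, 10.0, 100.0]},
    scoring="neg_root_mean_squared_error",
    cv=5,
)
grid.fit(X, y)

# GridSearchCV refits the best estimator on all data by default
preds = grid.predict(X)

# Export predictions in the submission format
pd.DataFrame({"Open": preds}).to_csv("submission.csv", index=False)
print(grid.best_params_)
```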

πŸ“€ Output

  • submission.csv: contains predicted Open values on the test set
  • Feature importance & correlation plots for interpretability

πŸ“ Project Structure

  • quantageddon-1-0.ipynb: Jupyter notebook with full code and markdown explanations
  • submission.csv: Final model predictions
  • Plots and metrics embedded in notebook

🧠 Key Takeaways

  • Feature engineering significantly boosted model performance
  • Time-series context was captured using periodic features
  • Ridge Regression provided a stable and interpretable baseline
  • GridSearchCV helped fine-tune model parameters efficiently

πŸš€ Future Scope

  • Explore other regularized models (Lasso, ElasticNet)
  • Add lag-based features or rolling window statistics
  • Use tree-based models (e.g., XGBoost, LightGBM) for comparison
  • Extend to multi-target prediction (e.g., Open, Close, High, Low)
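The lag-based features and rolling-window statistics mentioned above could look like this in pandas; the specific lags and the 5-observation window are arbitrary illustrations, not project choices:

```python
import numpy as np
import pandas as pd

# Toy series standing in for the "Open" price history
s = pd.Series(np.arange(10, dtype=float), name="Open")

# Lag features: past values of the target as predictors
lag_1 = s.shift(1)
lag_5 = s.shift(5)

# Rolling statistics over a trailing 5-observation window
roll_mean_5 = s.rolling(window=5).mean()
roll_std_5 = s.rolling(window=5).std()

features = pd.DataFrame({
    "lag_1": lag_1,
    "lag_5": lag_5,
    "roll_mean_5": roll_mean_5,
    "roll_std_5": roll_std_5,
}).dropna()  # drop warm-up rows that lack a full history
```

Note that `shift` and `rolling` only look backward, so these features avoid leaking future information into the model.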

About

A stock market prediction challenge.
