--- a
+++ b/03-Experiments/Experiments.md
@@ -0,0 +1,59 @@
+## Experiment Set 1 - Dummy Classifier
+**Description:** This set establishes a naive baseline: a simple decision tree built on level 1 preprocessed features. Level 1 preprocessing refers to the minimal preprocessing required for the algorithm to function at all.
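+
+A minimal sketch of such a baseline, assuming scikit-learn; the synthetic `make_classification` data stands in for the project's level 1 preprocessed features:
+
+```python
+from sklearn.datasets import make_classification
+from sklearn.model_selection import train_test_split
+from sklearn.tree import DecisionTreeClassifier
+from sklearn.metrics import accuracy_score
+
+# Synthetic stand-in for the level 1 preprocessed features and target.
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=42)
+
+# A shallow tree keeps the baseline deliberately weak and interpretable.
+baseline = DecisionTreeClassifier(max_depth=3, random_state=42)
+baseline.fit(X_train, y_train)
+print("baseline accuracy:", accuracy_score(y_valid, baseline.predict(X_valid)))
+```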
+
+## Experiment Set 2 - XGBoost Without FE
+**Description:** This set employs the XGBoost algorithm without any feature engineering (FE) techniques.
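+
+A minimal training sketch under the same assumptions (xgboost's scikit-learn-style wrapper, synthetic stand-in data):
+
+```python
+from sklearn.datasets import make_classification
+from sklearn.model_selection import train_test_split
+from xgboost import XGBClassifier
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=42)
+
+# Raw features go straight into the model; no engineered columns are added.
+model = XGBClassifier(n_estimators=300, learning_rate=0.05, eval_metric="logloss")
+model.fit(X_train, y_train)
+print("validation accuracy:", model.score(X_valid, y_valid))
+```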
+
+## Experiment Set 3 - XGBoost With FE
+**Description:** XGBoost is utilized in this set with feature engineering techniques applied to enhance model performance.
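+
+The exact engineered features are dataset-specific; the sketch below shows typical transformations on hypothetical columns (all names here are illustrative, not the project's real schema):
+
+```python
+import numpy as np
+import pandas as pd
+
+# Hypothetical raw columns standing in for the project's real schema.
+df = pd.DataFrame({
+    "amount": [120.0, 80.0, 40.0],
+    "n_items": [3, 2, 1],
+    "signup_date": pd.to_datetime(["2021-01-05", "2021-03-10", "2021-06-01"]),
+})
+
+df["amount_per_item"] = df["amount"] / df["n_items"]  # ratio feature
+df["signup_month"] = df["signup_date"].dt.month       # date decomposition
+df["amount_log"] = np.log1p(df["amount"])             # tame right-skewed values
+```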
+
+## Experiment Set 4 - XGBoost With FE & Optuna
+**Description:** XGBoost is employed with feature engineering techniques, and Optuna is used for hyperparameter optimization.
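+
+A sketch of the tuning loop, assuming Optuna's standard study/trial API and synthetic stand-in data; the real search space and metric may differ:
+
+```python
+import optuna
+from sklearn.datasets import make_classification
+from sklearn.model_selection import cross_val_score
+from xgboost import XGBClassifier
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+
+def objective(trial):
+    # Illustrative search space; the project's actual ranges may differ.
+    params = {
+        "n_estimators": trial.suggest_int("n_estimators", 100, 600),
+        "max_depth": trial.suggest_int("max_depth", 3, 10),
+        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
+        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
+    }
+    model = XGBClassifier(**params, eval_metric="logloss")
+    return cross_val_score(model, X, y, cv=3, scoring="roc_auc").mean()
+
+study = optuna.create_study(direction="maximize")
+study.optimize(objective, n_trials=20)
+print(study.best_params)
+```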
+
+## Experiment Set 5 - LightGBM Without FE
+**Description:** LightGBM is utilized in this set without any feature engineering.
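+
+The setup mirrors Experiment Set 2, swapping in LightGBM's scikit-learn wrapper (a sketch with synthetic stand-in data):
+
+```python
+from lightgbm import LGBMClassifier
+from sklearn.datasets import make_classification
+from sklearn.model_selection import train_test_split
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=42)
+
+# LightGBM grows trees leaf-wise; num_leaves is its main capacity knob.
+model = LGBMClassifier(n_estimators=300, learning_rate=0.05, num_leaves=31)
+model.fit(X_train, y_train)
+print("validation accuracy:", model.score(X_valid, y_valid))
+```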
+
+## Experiment Set 6 - LightGBM With FE
+**Description:** LightGBM is employed with feature engineering techniques to improve model performance.
+
+## Experiment Set 7 - LightGBM With FE & Optuna
+**Description:** LightGBM is used with feature engineering techniques, and Optuna is employed for hyperparameter tuning.
+
+## Experiment Set 8 - LightGBM With custom loss function (focal loss)
+**Description:** This set trains LightGBM with a custom loss function, the focal loss, which down-weights easy examples so that training concentrates on hard-to-classify samples and mitigates class imbalance.
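+
+A sketch of wiring a focal loss into LightGBM as a custom objective, assuming LightGBM 4.x (where a callable can be passed as the `objective` parameter). Gradients and Hessians are approximated numerically here for brevity; an analytic derivation would be preferable in practice:
+
+```python
+import numpy as np
+import lightgbm as lgb
+from sklearn.datasets import make_classification
+
+# Synthetic imbalanced stand-in for the project's data.
+X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=42)
+
+def focal_objective(alpha=0.25, gamma=2.0):
+    def loss(x, t):
+        p = 1.0 / (1.0 + np.exp(-x))            # sigmoid of the raw score
+        pt = t * p + (1 - t) * (1 - p)          # probability of the true class
+        w = t * alpha + (1 - t) * (1 - alpha)   # class-balancing weight
+        return -w * (1 - pt) ** gamma * np.log(np.clip(pt, 1e-9, 1.0))
+
+    def objective(preds, train_data):
+        t, eps = train_data.get_label(), 1e-4
+        # Central finite differences; analytic grad/hess is faster and more stable.
+        grad = (loss(preds + eps, t) - loss(preds - eps, t)) / (2 * eps)
+        hess = (loss(preds + eps, t) - 2 * loss(preds, t) + loss(preds - eps, t)) / eps**2
+        return grad, np.maximum(hess, 1e-6)     # keep the Hessian positive
+
+    return objective
+
+booster = lgb.train({"objective": focal_objective(), "verbosity": -1},
+                    lgb.Dataset(X, label=y), num_boost_round=100)
+# Note: with a custom objective, booster.predict returns raw scores, not probabilities.
+```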
+
+## Experiment Set 9 - CatBoost With FE
+**Description:** The CatBoost algorithm is employed with feature engineering techniques.
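+
+CatBoost can also consume categorical columns directly, which pairs well with engineered features. A sketch on hypothetical columns (names are illustrative only):
+
+```python
+import pandas as pd
+from catboost import CatBoostClassifier, Pool
+
+# Hypothetical engineered frame; column names are illustrative only.
+df = pd.DataFrame({
+    "amount_per_item": [40.0, 35.5, 12.0, 18.7, 44.0, 9.9],
+    "city": ["a", "b", "a", "c", "b", "c"],
+    "target": [1, 0, 1, 0, 1, 0],
+})
+
+# Pool tells CatBoost which columns to treat as categorical (no manual encoding).
+pool = Pool(df[["amount_per_item", "city"]], label=df["target"], cat_features=["city"])
+model = CatBoostClassifier(iterations=200, verbose=False)
+model.fit(pool)
+```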
+
+## Experiment Set 10 - CatBoost With FE & Optuna
+**Description:** The CatBoost algorithm is combined with feature engineering techniques, and Optuna is employed for hyperparameter optimization.
+
+## Experiment Set 11 - PyCaret
+**Description:** This set utilizes the PyCaret library, which automates the machine learning workflow, including data preprocessing, model selection, and evaluation.
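+
+A sketch of the typical flow, assuming the `pycaret.classification` module; `df` is a synthetic stand-in for the project's DataFrame:
+
+```python
+import pandas as pd
+from sklearn.datasets import make_classification
+from pycaret.classification import setup, compare_models, tune_model, finalize_model
+
+# Synthetic stand-in for the project's preprocessed DataFrame.
+X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
+df = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])
+df["target"] = y
+
+setup(data=df, target="target", session_id=42)
+best = compare_models()        # train and rank a library of candidate models
+tuned = tune_model(best)       # hyperparameter search on the top candidate
+final = finalize_model(tuned)  # refit the tuned model on the full dataset
+```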
+
+## Experiment Set 12 - AutoGluon
+**Description:** This set uses AutoGluon, an automated machine learning library that handles model selection, ensembling, and hyperparameter tuning with minimal configuration.
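+
+A minimal sketch with AutoGluon's tabular API; the train/test split and the 10-minute budget are assumptions:
+
+```python
+import pandas as pd
+from sklearn.datasets import make_classification
+from autogluon.tabular import TabularPredictor
+
+# Synthetic stand-in data split into train and test frames.
+X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
+df = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])
+df["target"] = y
+train_df, test_df = df.iloc[:800], df.iloc[800:]
+
+predictor = TabularPredictor(label="target").fit(train_df, time_limit=600)
+print(predictor.leaderboard(test_df))  # models ranked by validation score
+preds = predictor.predict(test_df.drop(columns=["target"]))
+```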
+
+## Experiment Set 13 - AutoXGB
+**Description:** AutoXGB automates the training and tuning of an XGBoost model end to end, running preprocessing and hyperparameter search (via Optuna) internally.
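+
+A sketch of the intended usage, assuming the interface of the open-source `autoxgb` package; the file path, output directory, and target column name are placeholders:
+
+```python
+from autoxgb import AutoXGB
+
+# Paths and target name are placeholders for the project's real files.
+axgb = AutoXGB(
+    train_filename="train.csv",
+    output="autoxgb_output",
+    targets=["target"],
+    num_folds=5,
+    seed=42,
+)
+axgb.train()  # runs preprocessing, cross-validated training, and tuning
+```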
+
+## Experiment Set 14 - AutoLGBM
+**Description:** AutoLGBM applies the same automated training-and-tuning approach to a LightGBM model.
+
+## Experiment Set 15 - Advanced LightGBM
+**Description:** This set applies more advanced LightGBM techniques and configurations than the earlier LightGBM experiments to push model performance further.
+
+## Experiment Set 16 - XGB Stacked
+**Description:** In this set, XGBoost models are stacked: a meta-learner is trained on their out-of-fold predictions to improve overall predictive performance.
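+
+A sketch of one way to stack XGBoost models with scikit-learn; the base-learner settings are illustrative:
+
+```python
+from sklearn.datasets import make_classification
+from sklearn.ensemble import StackingClassifier
+from sklearn.linear_model import LogisticRegression
+from xgboost import XGBClassifier
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+
+stack = StackingClassifier(
+    estimators=[
+        ("xgb_deep", XGBClassifier(max_depth=8, n_estimators=200, eval_metric="logloss")),
+        ("xgb_shallow", XGBClassifier(max_depth=3, n_estimators=500, eval_metric="logloss")),
+    ],
+    final_estimator=LogisticRegression(),
+    cv=5,  # the meta-learner sees out-of-fold predictions, avoiding training-set leakage
+)
+stack.fit(X, y)
+```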
+
+## Experiment Set 17 - Neural Network
+**Description:** This set uses a neural network for modeling, which can capture complex non-linear relationships in the data.
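+
+A minimal sketch using scikit-learn's MLP; the actual experiment may use a deep learning framework instead. Input scaling matters for neural networks, hence the pipeline:
+
+```python
+from sklearn.datasets import make_classification
+from sklearn.model_selection import train_test_split
+from sklearn.neural_network import MLPClassifier
+from sklearn.pipeline import make_pipeline
+from sklearn.preprocessing import StandardScaler
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=42)
+
+# Standardize inputs first; MLPs are sensitive to feature scale.
+nn = make_pipeline(StandardScaler(),
+                   MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=42))
+nn.fit(X_train, y_train)
+print("validation accuracy:", nn.score(X_valid, y_valid))
+```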
+
+## Experiment Set 18 
+**Description:** *(Fill in the description of Experiment Set 18)*
+
+## Experiment Set 19 
+**Description:** *(Fill in the description of Experiment Set 19)*
+
+## Experiment Set 20 - XGB with NN probabilities
+**Description:** This set combines XGBoost with the class probabilities produced by a neural network, likely by feeding those probabilities to XGBoost as additional input features in a simple blending scheme.
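+
+Under that reading, a sketch; out-of-fold probabilities avoid leaking the target into the augmented features:
+
+```python
+import numpy as np
+from sklearn.datasets import make_classification
+from sklearn.model_selection import cross_val_predict
+from sklearn.neural_network import MLPClassifier
+from xgboost import XGBClassifier
+
+X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
+
+# Out-of-fold NN probabilities: each row's probability comes from a model
+# that never saw that row during training.
+nn = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=42)
+nn_probs = cross_val_predict(nn, X, y, cv=5, method="predict_proba")[:, 1]
+
+X_aug = np.column_stack([X, nn_probs])  # NN probability appended as one extra feature
+model = XGBClassifier(n_estimators=300, eval_metric="logloss")
+model.fit(X_aug, y)
+```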