## Experiment Set 1 - Dummy Classifier

**Description:** This set establishes a baseline: a "dummy" classifier, implemented as a decision tree built on level 1 preprocessed features. Level 1 preprocessing refers to the minimal preprocessing required for the algorithm to function at all (e.g., imputing missing values and encoding categoricals).
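
A minimal sketch of this baseline, assuming a binary-classification task with feature matrix `X` and labels `y` already at level 1 preprocessing; the metric is illustrative:

```python
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# X, y: level 1 preprocessed features and labels (assumed already loaded)
baseline = DecisionTreeClassifier(random_state=42)

# 5-fold cross-validated ROC-AUC as a reference point for the later sets
scores = cross_val_score(baseline, X, y, cv=5, scoring="roc_auc")
print(f"Baseline ROC-AUC: {scores.mean():.4f} +/- {scores.std():.4f}")
```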

## Experiment Set 2 - XGBoost Without FE

**Description:** This set trains an XGBoost model without any feature engineering (FE), establishing what gradient boosting achieves on the unmodified features.
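
A sketch of the training step under the same assumptions as above; the hyperparameter values are placeholders rather than the ones used in the experiment:

```python
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hold out a validation split; this X_tr/X_val split is reused in later sketches
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2, random_state=42)

# Placeholder hyperparameters; no feature engineering is applied
model = XGBClassifier(n_estimators=500, learning_rate=0.05, max_depth=6,
                      eval_metric="logloss")
model.fit(X_tr, y_tr, eval_set=[(X_val, y_val)], verbose=100)

print("Validation ROC-AUC:", roc_auc_score(y_val, model.predict_proba(X_val)[:, 1]))
```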

## Experiment Set 3 - XGBoost With FE

**Description:** XGBoost is trained with feature engineering applied, to measure how much the engineered features improve performance over Set 2.
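
The engineered features themselves are dataset-specific; the sketch below only illustrates the kinds of level 2 transformations this set refers to, with entirely hypothetical column names:

```python
import numpy as np
import pandas as pd

def engineer_features(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative feature engineering; all column names are hypothetical."""
    out = df.copy()
    out["amount_per_item"] = out["amount"] / (out["n_items"] + 1)  # ratio feature
    out["log_amount"] = np.log1p(out["amount"])                    # tame skewed values
    out["dayofweek"] = pd.to_datetime(out["date"]).dt.dayofweek    # calendar signal
    out["cat_freq"] = out["category"].map(out["category"].value_counts())  # frequency encoding
    return out
```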

## Experiment Set 4 - XGBoost With FE & Optuna

**Description:** XGBoost is trained on the engineered features, with Optuna used for hyperparameter optimization.
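
A sketch of the Optuna search, assuming `X_fe` denotes the engineered feature matrix and cross-validated ROC-AUC is the objective; search ranges and trial count are illustrative:

```python
import optuna
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

def objective(trial):
    # Illustrative search space, not the one recorded for this experiment
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 200, 1500),
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
        "colsample_bytree": trial.suggest_float("colsample_bytree", 0.5, 1.0),
    }
    model = XGBClassifier(**params, eval_metric="logloss")
    return cross_val_score(model, X_fe, y, cv=5, scoring="roc_auc").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print("Best params:", study.best_params)
```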

## Experiment Set 5 - LightGBM Without FE

**Description:** LightGBM is trained without any feature engineering, the LightGBM counterpart of Set 2.
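
A sketch using LightGBM's native training API, reusing the `X_tr`/`X_val` split from the Set 2 sketch; parameters are again placeholders:

```python
import lightgbm as lgb

train_set = lgb.Dataset(X_tr, label=y_tr)
valid_set = lgb.Dataset(X_val, label=y_val, reference=train_set)

params = {"objective": "binary", "metric": "auc",
          "learning_rate": 0.05, "num_leaves": 63}
booster = lgb.train(
    params, train_set, num_boost_round=2000, valid_sets=[valid_set],
    # Stop when validation AUC plateaus; log progress every 100 rounds
    callbacks=[lgb.early_stopping(100), lgb.log_evaluation(100)],
)
```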

## Experiment Set 6 - LightGBM With FE

**Description:** LightGBM is trained with the engineered features to improve model performance over Set 5.

## Experiment Set 7 - LightGBM With FE & Optuna

**Description:** LightGBM is trained with the engineered features, and Optuna is used for hyperparameter tuning, analogous to Set 4.

## Experiment Set 8 - LightGBM With Custom Loss Function (Focal Loss)

**Description:** This set trains LightGBM with a custom objective, the focal loss, which down-weights easy examples so that training concentrates on hard-to-classify samples and class imbalance is mitigated.
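
A sketch of a focal-loss objective for LightGBM. `alpha=0.25` and `gamma=2` are the common defaults from the focal loss paper, not necessarily the experiment's values; the gradient and Hessian are approximated numerically for clarity (an analytic form would be faster), and passing a callable `objective` in the params dict assumes LightGBM >= 4:

```python
import lightgbm as lgb
import numpy as np

def focal_loss_objective(alpha=0.25, gamma=2.0):
    """Binary focal loss as a LightGBM custom objective (raw-score space)."""
    def loss(z, y):
        p = 1.0 / (1.0 + np.exp(-z))            # sigmoid of the raw score
        pt = y * p + (1 - y) * (1 - p)          # probability of the true class
        at = y * alpha + (1 - y) * (1 - alpha)  # per-class weight
        return -at * (1.0 - pt) ** gamma * np.log(np.clip(pt, 1e-9, 1.0))

    def objective(z, data):
        y = data.get_label()
        eps = 1e-3  # step kept large enough for a stable second difference
        grad = (loss(z + eps, y) - loss(z - eps, y)) / (2 * eps)
        hess = (loss(z + eps, y) - 2 * loss(z, y) + loss(z - eps, y)) / eps**2
        return grad, np.maximum(hess, 1e-6)  # floor the Hessian for stability
    return objective

params = {"objective": focal_loss_objective(), "learning_rate": 0.05}
booster = lgb.train(params, lgb.Dataset(X_tr, label=y_tr), num_boost_round=500)

# With a custom objective the booster outputs raw scores, so apply the
# sigmoid manually to recover probabilities.
proba = 1.0 / (1.0 + np.exp(-booster.predict(X_val)))
```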

## Experiment Set 9 - CatBoost With FE

**Description:** The CatBoost algorithm is trained with the engineered features; CatBoost additionally handles categorical features natively, without explicit encoding.
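
A sketch, assuming the engineered frames are DataFrames that retain raw categorical columns (names hypothetical) for CatBoost to consume directly:

```python
from catboost import CatBoostClassifier, Pool

cat_features = ["category", "region"]  # hypothetical categorical columns
train_pool = Pool(X_tr, y_tr, cat_features=cat_features)
valid_pool = Pool(X_val, y_val, cat_features=cat_features)

# Placeholder hyperparameters; CatBoost encodes cat_features internally
model = CatBoostClassifier(iterations=1000, learning_rate=0.05, depth=6,
                           eval_metric="AUC", verbose=100)
model.fit(train_pool, eval_set=valid_pool)
```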

## Experiment Set 10 - CatBoost With FE & Optuna

**Description:** CatBoost is trained with the engineered features, and Optuna is used for hyperparameter optimization, analogous to Sets 4 and 7.

## Experiment Set 11 - PyCaret

**Description:** This set uses the PyCaret library, a low-code tool that automates the machine learning workflow, including data preprocessing, model selection, and evaluation.
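
A sketch of the typical PyCaret classification flow; `train_df` and the target column name are placeholders:

```python
from pycaret.classification import setup, compare_models, finalize_model

# Builds the preprocessing pipeline and registers the experiment
s = setup(data=train_df, target="target", session_id=42)

best = compare_models()        # trains and ranks a library of candidate models
final = finalize_model(best)   # refits the winning pipeline on the full data
```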

## Experiment Set 12 - AutoGluon

**Description:** This set uses AutoGluon, an automated machine learning library that provides easy-to-use, efficient model selection and hyperparameter tuning.
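
A sketch of the usual AutoGluon entry point; the label name, time budget, and frames are placeholders:

```python
from autogluon.tabular import TabularPredictor

# Fit an ensemble of models within a one-hour budget (placeholder)
predictor = TabularPredictor(label="target").fit(train_df, time_limit=3600)

print(predictor.leaderboard())        # models ranked by validation score
proba = predictor.predict_proba(test_df)
```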

## Experiment Set 13 - AutoXGB

**Description:** This set uses AutoXGB, an AutoML tool that automatically trains an XGBoost model and tunes its hyperparameters.

## Experiment Set 14 - AutoLGBM

**Description:** This set uses AutoLGBM, the LightGBM counterpart of AutoXGB, which automatically trains and tunes a LightGBM model.

## Experiment Set 15 - Advanced LightGBM

**Description:** This set explores more advanced LightGBM techniques and configurations to push performance beyond the earlier LightGBM sets.
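
The document does not record which techniques this set covers; the sketch below shows one plausible reading, a DART-boosted configuration with feature and row subsampling, and every setting in it is an assumption:

```python
import lightgbm as lgb

# One plausible "advanced" configuration (assumption, not the recorded setup):
# DART boosting with column/row subsampling for extra regularization.
params = {
    "objective": "binary",
    "metric": "auc",
    "boosting_type": "dart",   # dropout-regularized boosting
    "drop_rate": 0.1,
    "num_leaves": 127,
    "feature_fraction": 0.8,   # column subsampling
    "bagging_fraction": 0.8,   # row subsampling
    "bagging_freq": 1,
    "learning_rate": 0.03,
}
booster = lgb.train(params, lgb.Dataset(X_tr, label=y_tr), num_boost_round=2000)
```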

## Experiment Set 16 - XGB Stacked

**Description:** This set uses a stacked ensemble in which several XGBoost models are combined through a meta-learner to improve overall predictive performance.
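
A sketch of one way to realize the stack with scikit-learn; the two base configurations and the logistic-regression meta-learner are illustrative choices:

```python
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from xgboost import XGBClassifier

stack = StackingClassifier(
    estimators=[
        ("xgb_shallow", XGBClassifier(max_depth=4, n_estimators=600)),
        ("xgb_deep", XGBClassifier(max_depth=8, n_estimators=300, subsample=0.8)),
    ],
    final_estimator=LogisticRegression(),  # meta-learner on out-of-fold preds
    stack_method="predict_proba",
    cv=5,
)
stack.fit(X_fe, y)
```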

## Experiment Set 17 - Neural Network

**Description:** This set uses a neural network for modeling, which can capture complex, non-linear relationships in the data.
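
A sketch of a simple feed-forward network for tabular data, assuming Keras and standardized numeric inputs; the architecture is illustrative:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X_tr.shape[1],)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(0.3),                    # regularization
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.fit(X_tr, y_tr, validation_data=(X_val, y_val),
          epochs=50, batch_size=256)
```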

## Experiment Set 18

**Description:** *(Fill in the description of Experiment Set 18)*

## Experiment Set 19

**Description:** *(Fill in the description of Experiment Set 19)*

## Experiment Set 20 - XGB With NN Probabilities

**Description:** This set combines XGBoost with the class probabilities produced by the neural network from Set 17, presumably as additional model inputs, as a form of ensembling.
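
A sketch of one way to read this: the network's probabilities become an extra input column for XGBoost. To avoid leakage, the training-set column should come from out-of-fold predictions; that bookkeeping is omitted here:

```python
import numpy as np
from xgboost import XGBClassifier

# nn_model: the trained network from Experiment Set 17 (assumed available).
# In practice, derive the training column from out-of-fold predictions.
nn_tr = nn_model.predict(X_tr).ravel()
nn_val = nn_model.predict(X_val).ravel()

X_tr_aug = np.column_stack([X_tr, nn_tr])    # original features + NN probability
X_val_aug = np.column_stack([X_val, nn_val])

blend = XGBClassifier(n_estimators=500, learning_rate=0.05)
blend.fit(X_tr_aug, y_tr)
proba = blend.predict_proba(X_val_aug)[:, 1]
```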