# AI-GO
## Single models
### LightGBM
RMSE: 1.2757990684513854
==MAPE: 9.728077113409674==
R2 Score: 0.9231276653366925

### XGBoost
RMSE: 1.2779197732038254
==MAPE: 9.778535044965956==
R2 Score: 0.9220757353304995

### GBR
RMSE: 1.2826926709967383
==MAPE: 9.719456877777773==
R2 Score: 0.9196885834358467

### RF
RMSE: 1.2879899598410829
==MAPE: 9.848582647868867==
R2 Score: 0.917007621015441

### SVR
RMSE: 1.5675110747490548
MAPE: 18.329026672495026
R2 Score: 0.7382111829983058

### Ridge
RMSE: 1.7233624089238477
MAPE: 25.319515863373482
R2 Score: 0.6161573412665121
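The three metrics above can be reproduced with a small helper; this is a sketch (the `evaluate` name and the percent-scale MAPE convention are assumptions, chosen to match the values reported in this note):

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

def evaluate(y_true, y_pred):
    """Return (RMSE, MAPE in percent, R2) for one model's predictions."""
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    # Percent MAPE, matching the ~9-25 range reported above.
    # Assumes y_true contains no zeros.
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100
    r2 = r2_score(y_true, y_pred)
    return rmse, mape, r2
```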

## Blend
### Manual
```python=
def blended_predictions(X):
    return ((0.3 * lightgbm_model.predict(X)) +
            (0.2 * xgb_model.predict(X)) +
            (0.1 * rf_model.predict(X)) +
            (0.4 * gbr_model.predict(X)))
```
RMSE: 1.267029696563372
==MAPE: 9.322803235045656==
R2 Score: 0.9274196833275169
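The weights 0.3/0.2/0.1/0.4 above are hand-picked. One way to tune them instead of guessing is to minimize validation RMSE over the weight simplex; the sketch below uses `scipy.optimize.minimize` (an assumption, not part of the original pipeline) on held-out predictions:

```python
import numpy as np
from scipy.optimize import minimize

def find_blend_weights(preds, y_true):
    """Find non-negative blend weights (summing to 1) that minimize RMSE.

    preds: list of per-model prediction arrays on a validation set.
    """
    P = np.column_stack(preds)          # shape (n_samples, n_models)
    n = P.shape[1]

    def rmse(w):
        return np.sqrt(np.mean((P @ w - y_true) ** 2))

    res = minimize(rmse,
                   x0=np.full(n, 1.0 / n),             # start from equal weights
                   bounds=[(0.0, 1.0)] * n,
                   constraints={"type": "eq",
                                "fun": lambda w: w.sum() - 1.0})
    return res.x
```

The resulting weights can then replace the hard-coded 0.3/0.2/0.1/0.4 in `blended_predictions`.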

### VotingRegressor
`from sklearn.ensemble import VotingRegressor`
**vote6**
**LIGHTGBM, XGB, GBR, svr, ridge, RF**
RMSE: 1.3130068694154173
MAPE: 11.499622070440328
R2 Score: 0.9039115104720675
**vote5**
**LIGHTGBM, XGB, GBR, svr, RF**
RMSE: 1.2818552827758867
MAPE: 10.037882106547935
R2 Score: 0.9201093595923333
**vote4**
**LIGHTGBM, XGB, GBR, RF**
RMSE: 1.2669795869271623
==MAPE: 9.321660127793326==
R2 Score: 0.9274439384013801
**vote3**
**LIGHTGBM, XGB, GBR**
RMSE: 1.2665374528730549
MAPE: 9.33450084630452
R2 Score: 0.9276578148688028
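Each voteN ensemble is just a `VotingRegressor` over the named estimators, which averages their predictions. A minimal self-contained sketch (synthetic data, small `n_estimators`, and only the two scikit-learn members so it runs without lightgbm/xgboost installed; `("lgbm", LIGHTGBM)` and `("xgb", XGB)` are appended the same way to get vote4):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import (GradientBoostingRegressor,
                              RandomForestRegressor, VotingRegressor)

# Toy data standing in for the real training set.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=42)

vote = VotingRegressor(estimators=[
    ("gbr", GradientBoostingRegressor(n_estimators=50, random_state=50)),
    ("rf", RandomForestRegressor(n_estimators=50, random_state=42)),
])
vote.fit(X, y)
pred = vote.predict(X)   # unweighted average of the members' predictions
```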
## Model parameters
```python=
import numpy as np
import xgboost as xgb
from lightgbm import LGBMRegressor
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import KFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import RobustScaler
from sklearn.svm import SVR

# CV splitter for RidgeCV (kf was undefined in the original snippet;
# a 5-fold split is assumed here)
kf = KFold(n_splits=5, shuffle=True, random_state=42)

# Light Gradient Boosting Regressor
LIGHTGBM = LGBMRegressor(objective='regression',
                         learning_rate=0.08,
                         n_estimators=9500,
                         num_leaves=8,
                         min_data_in_leaf=15,
                         feature_fraction=0.5,
                         feature_fraction_seed=12,
                         bagging_fraction=1,
                         bagging_freq=4,
                         max_bin=300,
                         bagging_seed=8,
                         min_sum_hessian_in_leaf=17,
                         verbose=-1,
                         random_state=42)

# XGBoost Regressor
XGB = xgb.XGBRegressor(learning_rate=0.1,
                       n_estimators=100,
                       max_depth=8,
                       subsample=0.8,
                       gamma=0.01,
                       seed=47,
                       reg_alpha=0.00007,
                       random_state=42)

# Gradient Boosting Regressor
GBR = GradientBoostingRegressor(n_estimators=7000,
                                learning_rate=0.1,
                                max_depth=30,
                                max_features='sqrt',
                                min_samples_leaf=30,
                                min_samples_split=30,
                                loss='huber',
                                random_state=50)

# Random Forest Regressor
RF = RandomForestRegressor(n_estimators=1000,
                           max_depth=30,
                           min_samples_split=5,
                           min_samples_leaf=5,
                           max_features=None,
                           oob_score=True,
                           random_state=42,
                           n_jobs=-1)

# Ridge Regressor (alpha chosen by RidgeCV over a log-spaced grid)
ridge_alphas = np.logspace(-15, 15, 31)
ridge = make_pipeline(RobustScaler(), RidgeCV(alphas=ridge_alphas, cv=kf))

# Support Vector Regressor
svr = make_pipeline(RobustScaler(), SVR(C=10000, epsilon=0.1, gamma=0.001))
```
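The RMSE figures above can be reproduced for any of these models with a cross-validated run; a minimal sketch on toy data (the `KFold` setup and `neg_root_mean_squared_error` scorer are assumptions about the evaluation protocol, which the note does not spell out):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold, cross_val_score

# Toy data standing in for the real training set.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=42)

kf = KFold(n_splits=5, shuffle=True, random_state=42)
scores = cross_val_score(
    RandomForestRegressor(n_estimators=50, random_state=42),
    X, y, cv=kf,
    scoring="neg_root_mean_squared_error")  # sklearn scorers are negated
rmse = -scores.mean()
```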