A collection of notebooks with notes on DS libraries, methods, and implementations, studied through DeepLearningSchool and other sources.
Basic implementation in LinearRegression.ipynb. I also implemented Ridge and Lasso regularization, and random mini-batch sampling without splitting the data into epochs.
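A minimal sketch of that approach: SGD on freshly sampled random batches with optional L1/L2 penalties. The function name `fit_linear` and all hyperparameters here are illustrative assumptions, not the notebook's actual code.

```python
import numpy as np

def fit_linear(X, y, lr=0.01, n_steps=1000, batch_size=32, l1=0.0, l2=0.0, seed=0):
    """Linear regression via SGD on randomly sampled batches (no epoch splitting).

    l1 > 0 gives Lasso, l2 > 0 gives Ridge; both at once gives ElasticNet.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_steps):
        idx = rng.choice(len(X), size=batch_size, replace=False)  # fresh random batch
        Xb, yb = X[idx], y[idx]
        err = Xb @ w + b - yb
        grad_w = Xb.T @ err / batch_size + 2 * l2 * w + l1 * np.sign(w)
        grad_b = err.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Sampling a new random batch at every step is what replaces the usual division into epochs.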
Basic implementation in LogisticRegression.ipynb
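A minimal sketch of what such a from-scratch logistic regression typically looks like (names and hyperparameters are assumptions, not taken from the notebook):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, n_steps=1000):
    """Binary logistic regression trained with full-batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_steps):
        p = sigmoid(X @ w + b)           # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)  # gradient of the log-loss
        grad_b = (p - y).mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```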
Basic implementation in bagging.py
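A minimal bagging sketch under the usual scheme (bootstrap samples + majority vote, binary 0/1 labels); the helper name and the choice of scikit-learn trees as base estimators are assumptions:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def bagging_predict(X_train, y_train, X_test, n_estimators=10, seed=0):
    """Bagging: train each tree on a bootstrap sample, predict by majority vote."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_estimators):
        idx = rng.choice(len(X_train), size=len(X_train), replace=True)  # bootstrap
        tree = DecisionTreeClassifier().fit(X_train[idx], y_train[idx])
        preds.append(tree.predict(X_test))
    # majority vote over binary 0/1 predictions
    return (np.stack(preds).mean(axis=0) >= 0.5).astype(int)
```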
Basic implementation in boosting.py
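A minimal gradient-boosting sketch for regression with squared loss, where each tree fits the current residuals; again, names and hyperparameters are illustrative, not copied from boosting.py:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_boosting(X, y, n_estimators=100, lr=0.1, max_depth=3):
    """Gradient boosting for regression: each tree fits the current residuals."""
    base = y.mean()                 # initial constant prediction
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_estimators):
        residuals = y - pred        # negative gradient of the squared loss
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred += lr * tree.predict(X)
        trees.append(tree)
    return base, trees

def predict_boosting(base, trees, X, lr=0.1):
    """lr must match the value used during fitting."""
    return base + lr * sum(t.predict(X) for t in trees)
```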
Not the best, but a working solution to the Kaggle Titanic problem. The solution is based on logistic regression with ElasticNet regularization. The coefficients were found with a self-implemented values_grid search (sketched below), so the result could be improved by changing the underlying model or by working with the data in more detail. Maybe the 'Rich' feature isn't as useful as predicted, or maybe I shouldn't have thrown away all the rows with NaN values ;)))
Implementation in kaggle\titanic\main.ipynb, solution in submit.csv.
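A sketch of what such a grid search over the ElasticNet hyperparameters could look like; the actual values_grid helper in the notebook may differ, and the grids, function name, and use of scikit-learn's LogisticRegression here are assumptions:

```python
import itertools
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def grid_search_elasticnet(X, y, Cs=(0.01, 0.1, 1.0), l1_ratios=(0.1, 0.5, 0.9)):
    """Pick the best (C, l1_ratio) for ElasticNet logistic regression by CV accuracy."""
    best_params, best_score = None, -np.inf
    for C, l1_ratio in itertools.product(Cs, l1_ratios):
        model = LogisticRegression(penalty="elasticnet", solver="saga",
                                   C=C, l1_ratio=l1_ratio, max_iter=5000)
        score = cross_val_score(model, X, y, cv=5).mean()
        if score > best_score:
            best_params, best_score = (C, l1_ratio), score
    return best_params, best_score
```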
The algorithm is based on a comparison of Random Forest, CatBoost, and LogisticRegression (see the sketch below). Performance could be improved by rebalancing the classes in train.csv.
Implementation in kaggle\PredictionOfTheChurnOfClients\main.ipynb, solution in submit.csv.
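A sketch of such a comparison using cross-validated accuracy; the model settings and the helper name are assumptions (CatBoostClassifier follows the scikit-learn estimator interface, so cross_val_score works with it):

```python
from catboost import CatBoostClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def compare_models(X, y, cv=5):
    """Compare the three candidate models by cross-validated accuracy."""
    models = {
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "catboost": CatBoostClassifier(verbose=0, random_state=0),
        "logistic_regression": LogisticRegression(max_iter=5000),
    }
    return {name: cross_val_score(m, X, y, cv=cv).mean() for name, m in models.items()}
```

For the rebalancing idea, a simple first step would be class_weight="balanced" on the scikit-learn models, or oversampling the minority class in train.csv before fitting.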