@@ -162,6 +164,9 @@ Below is a list of the methods currently implemented in this module.

1. Easy Ensemble classifier [13]_
2. Balanced Random Forest [16]_
3. Balanced Bagging
4. RUSBoost [18]_

* Mini-batch resampling for Keras and Tensorflow

The different algorithms are presented in the sphinx-gallery_.
@@ -202,3 +207,7 @@ References:

.. [15] : H. He, Y. Bai, E. A. Garcia, S. Li, "ADASYN: Adaptive synthetic sampling approach for imbalanced learning," In Proceedings of the 5th IEEE International Joint Conference on Neural Networks, pp. 1322-1328, 2008.

.. [16] : C. Chen, A. Liaw, and L. Breiman. "Using random forest to learn imbalanced data." University of California, Berkeley 110 (2004): 1-12.

.. [17] : Felix Last, Georgios Douzas, Fernando Bacao, "Oversampling for Imbalanced Learning Based on K-Means and SMOTE"

.. [18] : Seiffert, C., Khoshgoftaar, T. M., Van Hulse, J., & Napolitano, A. "RUSBoost: A hybrid approach to alleviating class imbalance." IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans 40.1 (2010): 185-197.