Do you have any idea how to incorporate a different weight for each data point, in the sense that data points with low weights should matter less to the model fit than those with high weights?
For instance, by specifying the loss as WeightedLogitLogLoss(weights), etc.
LossFunctions.jl has some support for this, e.g.
value(LogitLogLoss(), Y, Y, AggMode.WeightedSum(ones(length(Y))))
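In spirit, a weighted aggregate like the one above computes a per-point loss scaled by each point's weight. Below is a self-contained sketch of that computation (not LossFunctions.jl's actual implementation; `logitlogloss` and `weighted_loss` are hypothetical names):

```julia
# Logit log-loss for a label y ∈ {0,1} and a raw score fhat.
# For y = 1 this is -log(σ(fhat)); for y = 0 it is -log(1 - σ(fhat)).
logitlogloss(y, fhat) = log(1 + exp(-(2y - 1) * fhat))

# Weighted aggregate: each observation's loss is scaled by its weight,
# so low-weight points contribute less to the total.
weighted_loss(ys, fhats, ws) =
    sum(w * logitlogloss(y, f) for (y, f, w) in zip(ys, fhats, ws))
```

With all weights equal to one this reduces to the usual sum of losses, which is what `AggMode.WeightedSum(ones(length(Y)))` would give.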
Thanks for the question. I had a think about this and went through the code. Due to how the XGBoost algorithm works, weights cannot be supplied through the loss function as you suggested; instead, the jlboost function needs a dedicated weights parameter.
See dmlc/xgboost#144 (comment) for a discussion of scaling the gradient and hessian by the weights; based on my understanding, that would be the approach here too.
I will support weights soon, once I am done with my other open-source responsibilities.
For now, unfortunately, it's not possible unless you hack the code: add weights to the g and h functions, and pass the weights through wherever g and h are called.
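To make the "hack" concrete, here is a minimal sketch of what weighting g and h would look like for the logit log-loss. This is not JLBoost.jl's actual code; `weighted_g` and `weighted_h` are hypothetical names, and the idea is simply the gradient/hessian scaling from the linked XGBoost discussion:

```julia
sigmoid(x) = 1 / (1 + exp(-x))

# Unweighted first and second derivatives of the logit log-loss
# with respect to the raw score fhat, for a label y ∈ {0,1}.
g(y, fhat) = sigmoid(fhat) - y
h(y, fhat) = sigmoid(fhat) * (1 - sigmoid(fhat))

# Weighted versions: each observation's gradient and hessian are
# scaled by its weight w, so low-weight points pull less on the
# split gains and leaf values.
weighted_g(y, fhat, w) = w * g(y, fhat)
weighted_h(y, fhat, w) = w * h(y, fhat)
```

A point with weight 0 then contributes nothing to either statistic, and weight 1 recovers the unweighted behaviour.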