linjing-lab changed the title from "Design suggestions of _IterationBuilder._best_loss from adanet.core.iteration.py" to "Design suggestions of _IterationBuilder._best_loss from adanet.core.iteration" on Dec 3, 2023.
I have designed some operators for model loss monitoring, like the following contiguous fragment:
https://github.com/linjing-lab/easy-pytorch/blob/9651774dcc4581104f914980baf2ebc05f96fd85/released_box/perming/_utils.py#L269-L281
This approach takes only a small proportion of CPU runtime. I don't want to burden the CPU unnecessarily just to detect the optimal model after the training run has finished.
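For illustration, the monitoring idea above can be sketched as a constant-overhead tracker that records the best loss while training is still running, so no post-training scan is needed. This is a minimal sketch in plain Python; the class name and fields are my own and are not taken from the linked perming code:

```python
import math

class BestLossMonitor:
    """Track the lowest loss seen so far in O(1) per update,
    so the optimal step is already known when training ends."""

    def __init__(self):
        self.best_loss = math.inf
        self.best_step = -1

    def update(self, step, loss):
        """Record a new loss; return True if it is the best so far."""
        if loss < self.best_loss:
            self.best_loss = loss
            self.best_step = step
            return True
        return False

monitor = BestLossMonitor()
for step, loss in enumerate([0.9, 0.7, 0.75, 0.6, 0.65]):
    if monitor.update(step, loss):
        pass  # e.g. save a checkpoint for this step here

print(monitor.best_step, monitor.best_loss)  # -> 3 0.6
```

Because each `update` is O(1), the monitor adds a negligible fraction of CPU time per training step, which is the property argued for above.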
adanet/adanet/core/iteration.py, lines 743 to 747 in 0364cc4
I don't think the highlighted function responds quickly, because it selects the optimal model in at least O(n) runtime over all trained candidate ensembles, by loss evaluation alone. It's an implementation based on the tf API, but it doesn't take into account how users can reach the ideal combination, under whatever strategy they are using, in as few trials as possible.
adanet/adanet/core/iteration.py, lines 1089 to 1109 in 0364cc4
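To make the O(n) claim concrete, the selection in question amounts to an argmin over the losses of all trained candidates. This is a rough sketch of that pattern, not adanet's actual code (adanet builds the comparison out of TensorFlow ops rather than a Python loop):

```python
def best_candidate_index(losses):
    """Pick the candidate with the minimal loss via a linear scan.

    Every candidate's loss must be evaluated, so the cost grows
    linearly with the number of trained candidate ensembles.
    """
    if not losses:
        raise ValueError("no candidate losses to compare")
    best_i = 0
    for i, loss in enumerate(losses):
        if loss < losses[best_i]:
            best_i = i
    return best_i

print(best_candidate_index([0.41, 0.22, 0.35]))  # -> 1
```

The point of the suggestion above is that this scan only happens after every candidate has been fully trained; a strategy-aware search could prune or rank candidates earlier and reach the ideal combination in fewer trials.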