Hi, when using Ray I get the following warning:

2022-06-23 16:24:11,299 WARNING function_runner.py:603 -- Function checkpointing is disabled. This may result in unexpected behavior when using checkpointing features or certain schedulers. To enable, set the train function arguments to be `func(config, checkpoint_dir=None)`.

I was wondering whether the "checkpointing" functionality is available in this setup. Thank you very much for your help and all the great work!
Currently, the way to do a warm start is via the starting_points argument. Are you thinking of a normal termination of an AutoML run followed by another AutoML run, or a forced termination of an AutoML run followed by another AutoML run that recovers from the failure? For the former, warm start + logging can work. For the latter, it also works when not using Ray. But when using Ray, the log is not written when the termination is forced, and that needs improvement.
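To make the warm-start pattern concrete, here is a minimal sketch assuming the FLAML AutoML API (starting_points, log_file_name, and best_config_per_estimator); the dataset, task, and time budgets below are placeholders, not part of the original question.

```python
# Minimal sketch: warm-starting a second AutoML run from a first one,
# assuming the FLAML AutoML API; dataset and budgets are placeholders.
from sklearn.datasets import load_iris
from flaml import AutoML

X_train, y_train = load_iris(return_X_y=True)

# First run: keep a log so the search history is persisted.
automl_1 = AutoML()
automl_1.fit(
    X_train,
    y_train,
    task="classification",
    time_budget=60,
    log_file_name="automl_run1.log",
)

# Second run: warm-start from the best configurations found by the first run,
# passed in via the starting_points argument.
automl_2 = AutoML()
automl_2.fit(
    X_train,
    y_train,
    task="classification",
    time_budget=60,
    starting_points=automl_1.best_config_per_estimator,
    log_file_name="automl_run2.log",
)
```

If the first run is forcibly terminated while using Ray, its log may be incomplete or missing, which is the limitation mentioned above.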