mirror of https://github.com/commaai/tinygrad.git
commit 9f66dcf718

* working PolynomialDecayWithWarmup + tests; add lars_util.py, oops
* keep lars_util.py as intact as possible, simplify our interface
* whitespace
* clean up
* clean up
* asserts
* test polylr for full resnet training run
* add comment
* rename
* fix do_optim
* don't cast lr
* info
* calculate from train_files
* skip it
Files:

lars_optimizer.py
lars_util.py
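The commit message above references a PolynomialDecayWithWarmup learning-rate schedule. As a rough illustration of that idea (not the actual tinygrad/lars_util.py implementation; the function name, signature, and power value here are assumptions), a polynomial-decay schedule with linear warmup can be sketched as:

```python
# Hypothetical sketch of a polynomial-decay LR schedule with linear warmup.
# Not the actual lars_util.py code; names and defaults are illustrative.
def poly_lr(step, base_lr, end_lr, warmup_steps, total_steps, power=2.0):
    if step < warmup_steps:
        # linear warmup: ramp from base_lr/warmup_steps up to base_lr
        return base_lr * (step + 1) / warmup_steps
    # polynomial decay from base_lr down to end_lr over the remaining steps
    frac = 1.0 - (step - warmup_steps) / (total_steps - warmup_steps)
    return (base_lr - end_lr) * frac ** power + end_lr

# e.g. LR ramps up for the first 10 steps, then decays quadratically to 0
lrs = [poly_lr(s, base_lr=1.0, end_lr=0.0, warmup_steps=10, total_steps=100)
       for s in range(100)]
```

A schedule like this is commonly paired with LARS for large-batch ResNet training, which would explain the lars_optimizer.py / lars_util.py split above.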