README

Each model should be a clean single file.
They are imported from the top-level `models` directory.

Each model should be capable of loading weights from the reference implementation.
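Loading reference weights usually comes down to remapping the key names in the reference checkpoint's state dict onto our model's parameter names. A minimal sketch of that remapping in plain Python; the key patterns shown are hypothetical examples, not the actual reference checkpoint layout:

```python
# Sketch: map reference-implementation checkpoint keys to our parameter names.
# The concrete prefixes below are hypothetical, not the real checkpoint layout.
def remap_state_dict(ref_state, rename_rules):
    """Apply (old_prefix, new_prefix) rules to every key in the checkpoint."""
    out = {}
    for key, tensor in ref_state.items():
        new_key = key
        for old, new in rename_rules:
            if new_key.startswith(old):
                new_key = new + new_key[len(old):]
                break
        out[new_key] = tensor
    return out

# Example: a reference checkpoint might wrap everything in a "module." prefix.
ref = {"module.conv1.weight": "W0", "module.fc.weight": "W1"}
rules = [("module.", "")]
print(remap_state_dict(ref, rules))
```

In practice the remapped dict would then be fed into the model's state-dict loader; the real rename rules depend on the reference framework's naming scheme.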

We will focus on these 5 models:

# Resnet50-v1.5 (classic) -- 8.2 GOPS/input
# Retinanet
# 3D UNET (upconvs)
# RNNT
# BERT-large (transformer)
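The 8.2 GOPS/input figure for ResNet50-v1.5 gives a quick back-of-the-envelope bound on training throughput. A sketch of that arithmetic, assuming a hypothetical accelerator sustaining 100 TFLOPS and the usual rough ~3x cost of a training step (forward + backward) relative to the forward pass alone:

```python
gops_per_input = 8.2       # ResNet50-v1.5 forward pass, from the list above
sustained_tflops = 100.0   # hypothetical sustained throughput (assumption)
train_mult = 3.0           # rough forward+backward cost vs forward-only

# Compute-bound upper limit; ignores dataloading, memory, and launch overhead.
imgs_per_sec = sustained_tflops * 1e12 / (gops_per_input * 1e9 * train_mult)
print(f"{imgs_per_sec:.0f} images/sec")
```

Numbers like these are only a ceiling; real throughput depends on how well the kernels utilize the hardware.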

They are used in both the training and inference benchmarks:
https://mlcommons.org/en/training-normal-21/
https://mlcommons.org/en/inference-edge-30/
And we will submit to both.

NOTE: we submit in the Edge category since we don't have ECC RAM