<div align="center">

[![logo](https://raw.githubusercontent.com/tinygrad/tinygrad/master/docs/logo.png)](https://tinygrad.org)

tinygrad: For something between [PyTorch](https://github.com/pytorch/pytorch) and [karpathy/micrograd](https://github.com/karpathy/micrograd). Maintained by [tiny corp](https://tinygrad.org).

<h3>

[Homepage](https://github.com/tinygrad/tinygrad) | [Documentation](/docs) | [Examples](/examples) | [Showcase](/docs/showcase.md) | [Discord](https://discord.gg/ZjZadyC7PK)

</h3>

[![GitHub Repo stars](https://img.shields.io/github/stars/tinygrad/tinygrad)](https://github.com/tinygrad/tinygrad/stargazers)
[![Unit Tests](https://github.com/tinygrad/tinygrad/actions/workflows/test.yml/badge.svg)](https://github.com/tinygrad/tinygrad/actions/workflows/test.yml)
[![Discord](https://img.shields.io/discord/1068976834382925865)](https://discord.gg/ZjZadyC7PK)
[![Lines of code](https://img.shields.io/tokei/lines/github/tinygrad/tinygrad)](https://github.com/tinygrad/tinygrad)

</div>

---
This may not be the best deep learning framework, but it is a deep learning framework.

Due to its extreme simplicity, it aims to be the easiest framework to add new accelerators to, with support for both inference and training. If XLA is CISC, tinygrad is RISC.

tinygrad is still alpha software, but we [raised some money](https://geohot.github.io/blog/jekyll/update/2023/05/24/the-tiny-corp-raised-5M.html) to make it good. Someday, we will tape out chips.

## Features
### LLaMA and Stable Diffusion

tinygrad can run [LLaMA](/docs/showcase.md#llama) and [Stable Diffusion](/docs/showcase.md#stable-diffusion)!
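If you want to try them, the entry points live in [examples/](/examples). A rough sketch of how you might launch them from the repo root (weights and exact flags vary; see the showcase docs above for the details):

```sh
python3 examples/stable_diffusion.py
python3 examples/llama.py
```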
### Laziness

Try a matmul. See how, despite the eager-looking style, it is fused into one kernel with the power of laziness.
```sh
DEBUG=3 python3 -c "from tinygrad.tensor import Tensor;
N = 1024; a, b = Tensor.rand(N, N), Tensor.rand(N, N);
c = (a.reshape(N, 1, N) * b.permute(1,0).reshape(1, N, N)).sum(axis=2);
print((c.numpy() - (a.numpy() @ b.numpy())).mean())"
```
And we can change `DEBUG` to `4` to see the generated code.
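Laziness also means nothing runs until you actually ask for data. A minimal sketch using only the public `Tensor` API (the tensor shapes here are just for illustration):

```py
from tinygrad.tensor import Tensor

a = Tensor.rand(256, 256)
b = (a + a).relu()      # no kernel has run yet; this only builds the graph
print(b.numpy().sum())  # the fused kernel executes here, when the data is needed
```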
### Neural networks

As it turns out, 90% of what you need for neural networks is a decent autograd/tensor library.
Throw in an optimizer, a data loader, and some compute, and you have all you need.

#### Neural network example (from test/models/test_mnist.py)
```py
from tinygrad.tensor import Tensor
import tinygrad.nn.optim as optim

class TinyBobNet:
  def __init__(self):
    # a two-layer net: 784 (28x28 pixels) -> 128 hidden -> 10 classes
    self.l1 = Tensor.uniform(784, 128)
    self.l2 = Tensor.uniform(128, 10)

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).log_softmax()

model = TinyBobNet()
optim = optim.SGD([model.l1, model.l2], lr=0.001)

# ... complete data loader here

out = model.forward(x)
loss = out.mul(y).mean()  # simple NLL-style loss (y encodes the targets)
optim.zero_grad()
loss.backward()
optim.step()
```
## Accelerators

tinygrad already supports numerous accelerators, including:

- [x] CPU
- [x] GPU (OpenCL)
- [x] C Code (Clang)
- [x] LLVM
- [x] METAL
- [x] CUDA
- [x] Triton
- [x] PyTorch

And it is easy to add more! Your accelerator of choice only needs to support a total of 26 (optionally 27) low-level ops.

More information can be found in the [documentation for adding new accelerators](/docs/adding_new_accelerators.md).
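For a feel of what that means, here is a hedged sketch of the general shape of a backend: a type that knows how to run a handful of primitive ops on flat buffers, with everything else composed on top. The class name and op choice below are hypothetical illustration, not tinygrad's actual interface; the real op set is documented in the link above.

```py
import numpy as np

# Hypothetical sketch (NOT tinygrad's real interface): a backend is basically
# a buffer type plus an implementation of each primitive op.
class NumpyBuffer:
  def __init__(self, data):
    self.data = np.asarray(data, dtype=np.float32)
  def relu(self):        # an elementwise unary op
    return NumpyBuffer(np.maximum(self.data, 0))
  def add(self, other):  # an elementwise binary op
    return NumpyBuffer(self.data + other.data)
  def sum(self, axis=None):  # a reduce op
    return NumpyBuffer(self.data.sum(axis=axis))

x = NumpyBuffer([[1.0, -2.0], [3.0, -4.0]])
print(x.relu().add(x).sum().data)  # prints 2.0
```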
## Installation

The current recommended way to install tinygrad is from source.

### From source
```sh
git clone https://github.com/tinygrad/tinygrad.git
cd tinygrad
python3 -m pip install -e .
```
Don't forget the `.` at the end! It points pip at the current directory, and `-e` makes the install editable, so local changes take effect without reinstalling.
## Documentation

Documentation along with a quick start guide can be found in the [docs/](/docs) directory.

### Quick example comparing to PyTorch
```py
from tinygrad.tensor import Tensor

x = Tensor.eye(3, requires_grad=True)
y = Tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad.numpy())  # dz/dx
print(y.grad.numpy()) # dz/dy
```
The same thing but in PyTorch:
```py
import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0,0,-2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad.numpy())  # dz/dx
print(y.grad.numpy()) # dz/dy
```
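Both scripts print the same gradients: since `x` is the 3x3 identity, `dz/dx` is `[[2, 2, 2], [0, 0, 0], [-2, -2, -2]]` (row `i` is `y[0][i]` broadcast across the row) and `dz/dy` is `[[1, 1, 1]]` (the row sums of `x`).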
## Contributing

There has been a lot of interest in tinygrad lately. Here are some basic guidelines for contributing:

- Bug fixes are the best and always welcome! Like [this one](https://github.com/tinygrad/tinygrad/pull/421/files).
- If you don't understand the code you are changing, don't change it!
- All code golf PRs will be closed, but [conceptual cleanups](https://github.com/tinygrad/tinygrad/pull/372/files) are great.
- Features are welcome, though if you are adding one, you need to include tests.
- Improving test coverage is great, with reliable non-brittle tests.

Additional guidelines can be found in [CONTRIBUTING.md](/CONTRIBUTING.md).
### Running tests

For more examples on how to run the full test suite, please refer to the [CI workflow](.github/workflows/test.yml).

Some examples:
```sh
python3 -m pip install -e '.[testing]'  # install the extra testing dependencies
python3 -m pytest                       # run the full test suite
python3 -m pytest -v -k TestTrain       # verbosely run only the tests matching TestTrain
python3 ./test/models/test_train.py TestTrain.test_efficientnet  # run a single test
```