You like pytorch? You like micrograd? You love tinygrad! ❤️

# tinygrad

For something in between a pytorch and a karpathy/micrograd.

This may not be the best deep learning framework, but it is a deep learning framework.

The Tensor class is a wrapper around a numpy array, except it does Tensor things.
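
To make the "wrapper" idea concrete, here is a minimal sketch. The `.data` and `.grad` attribute names and the initial `None` gradient are assumptions based on the example below (where `x.grad` is printed after `backward()`), not documented API:

```python
import numpy as np
from tinygrad.tensor import Tensor

t = Tensor(np.eye(3))
print(type(t.data))  # the wrapped numpy array (attribute name assumed)
print(t.grad)        # no gradient until backward() runs (assumed to start as None)
```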

### Example

```python
import numpy as np
from tinygrad.tensor import Tensor

x = Tensor(np.eye(3))
y = Tensor(np.array([[2.0, 0, -2.0]]))
z = y.dot(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
```

### Same example in torch

```python
import torch

x = torch.eye(3, requires_grad=True)
y = torch.tensor([[2.0, 0, -2.0]], requires_grad=True)
z = y.matmul(x).sum()
z.backward()

print(x.grad)  # dz/dx
print(y.grad)  # dz/dy
```

### TODO (to make this a real neural network library)

- Implement gradcheck (numeric); a rough sketch of the idea follows this list
- Implement convolutions
- Implement Adam optimizer
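
As a rough illustration of the gradcheck item, the sketch below compares tinygrad's analytic gradient against central finite differences computed in plain numpy. It only uses the API shown in the example above; `numeric_grad_x`, the tolerance, and the assumption that `x.grad` is a numpy array are illustrative, not part of the library:

```python
import numpy as np
from tinygrad.tensor import Tensor

def f(x_np, y_np):
  # Plain-numpy forward pass matching the example above: z = sum(y @ x)
  return (y_np @ x_np).sum()

def numeric_grad_x(x_np, y_np, eps=1e-6):
  # Central finite differences with respect to x
  grad = np.zeros_like(x_np)
  for idx in np.ndindex(*x_np.shape):
    xp, xm = x_np.copy(), x_np.copy()
    xp[idx] += eps
    xm[idx] -= eps
    grad[idx] = (f(xp, y_np) - f(xm, y_np)) / (2 * eps)
  return grad

x_np = np.eye(3)
y_np = np.array([[2.0, 0, -2.0]])

# Analytic gradient from tinygrad's backward pass
x, y = Tensor(x_np), Tensor(y_np)
y.dot(x).sum().backward()

# x.grad is assumed to be a numpy array here, as the example above suggests
print(np.allclose(x.grad, numeric_grad_x(x_np, y_np), atol=1e-4))
```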