George Hotz
121d5a17ee
use tinynn for Conv2d
2021-10-30 19:40:44 -07:00
George Hotz
114f6ca3fd
more readme cleanup
2021-10-30 16:51:25 -07:00
George Hotz
effd0dc833
update readme
2021-10-30 16:34:00 -07:00
George Hotz
2e71ae33f6
max op works
2021-06-17 17:01:21 -07:00
George Hotz
e8eb7d1b7e
max op
2021-06-17 16:20:56 -07:00
George Hotz
c1d469d440
sum op
2021-06-17 16:19:35 -07:00
George Hotz
ff3fdc58e5
risk -> cherry
2021-06-16 09:59:48 -07:00
George Hotz
1e62e45d67
better todo
2021-06-15 10:30:16 -07:00
George Hotz
9ca4388695
debug
2021-06-15 10:24:21 -07:00
George Hotz
3d44aab52c
more
2021-06-15 10:23:57 -07:00
George Hotz
4850d6eb43
update todo
2021-06-15 10:22:39 -07:00
George Hotz
508ced114c
readme
2021-06-13 17:17:44 -07:00
George Hotz
77ba198b57
Revert "Update README.md ( #259 )" ( #260 )
This reverts commit 5a69c5db6d.
2021-06-04 14:41:41 -07:00
Gabriel Rojas
5a69c5db6d
Update README.md (#259)
2021-06-04 14:41:07 -07:00
George Hotz
0702e0c763
nah, no sign, it's not what you want. use relu
2021-01-03 09:30:33 -08:00
George Hotz
c2eeb6950b
add support for sign. technically relu can be second class now
2021-01-03 08:29:57 -08:00
George Hotz
92abe43683
reduce before binary because of unbroadcasting
2020-12-31 09:49:52 -05:00
George Hotz
de7fe085de
no read out of bounds
2020-12-31 09:41:36 -05:00
George Hotz
30f8132646
reorder ops in ops cpu
2020-12-30 11:00:01 -05:00
George Hotz
e5b2803b5d
ops in readme
2020-12-30 10:48:55 -05:00
George Hotz
2d44bf7f1a
Dot -> Matmul
2020-12-30 10:41:51 -05:00
George Hotz
fcfe3dae01
write slice for CPU
2020-12-30 10:32:53 -05:00
George Hotz
1f5c9618ef
refactor in readme and issue #225
2020-12-29 17:30:04 -05:00
George Hotz
4bbad11afe
link to papers
2020-12-29 14:15:46 -05:00
George Hotz
3f8e137b6f
extra/transformer
2020-12-29 14:14:00 -05:00
George Hotz
8f9232d59b
readme
2020-12-29 13:40:34 -05:00
George Hotz
837aaacfbf
Unpad2D on GPU:
2020-12-29 13:16:14 -05:00
George Hotz
02655c07d5
break maxpool2d on GPU
2020-12-29 13:05:57 -05:00
George Hotz
061e37de39
touchups
2020-12-29 12:41:21 -05:00
George Hotz
a2e6562330
fix max op, less lines
2020-12-29 10:47:04 -05:00
George Hotz
628d21f899
doc touchup
2020-12-28 10:45:26 -05:00
George Hotz
fafece9db7
avgpool2d is a second class op
2020-12-28 10:41:59 -05:00
George Hotz
593233b668
log and exp are first class ops
2020-12-28 10:00:30 -05:00
Liam
bcf1518309
All devices are equal! (#196)
* Update all devices to be tested
ANE, CPU and OCL all now support all tests.
However, tests are not currently passing on GPU, and I cannot test on CPU.
The failing GPU tests are not an issue caused by this update; tests have not
been passing due to a missing "six" install requirement.
OpenCL tests have not been run since commit 1a1c63a08b.
Devices have 3 types and are handled by a new DeviceTypes enum. (The goal
is to revert to Tensor.<type>, but the current setup allows for keyword
argument defaults: `device=DeviceType.CPU`.)
All references to Tensor.GPU/CPU/ANE have been converted to the
corresponding `DeviceTypes` enum.
Refactored the conversion code to allow any-device-to-any-device conversion.
* Add six dependency in requirements.txt
* Resolve failure to run tests
Move six into the GPU required installs. Remove six from the standard
installation.
* Remove repeated data conversion
* Refactor method names
Also reduce code with .to and .to_
* Dynamic device handlers
* Refactor DeviceTypes -> Device
* Add mem copy profiling back
* test_backward_pass_diamond_model passing
* Resolve Sum issue on GPU
* Revert batchnorm2d tests
* Update README with updated API
* ANE testing with
* Last minute line gains
2020-12-15 23:44:08 -08:00
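The #196 entry above describes the device refactor only in prose. As an illustrative sketch (the Device enum, the `device=` keyword, and the .to / .to_ methods are taken from the PR description above; the import path and constructor usage are assumptions, not verified against the tree at this commit), the resulting API would look roughly like:

```python
# Hedged sketch of the device handling described in #196: a Device enum
# (renamed from DeviceTypes), a `device=` keyword defaulting to CPU, and
# .to / .to_ for copy vs. in-place moves. Names follow the PR text; the
# import path is an assumption, not checked at this commit.
import numpy as np
from tinygrad.tensor import Tensor, Device

x = Tensor(np.random.randn(3, 3).astype(np.float32), device=Device.CPU)

y = x.to(Device.GPU)   # returns a copy of the tensor on the GPU
x.to_(Device.GPU)      # moves the tensor to the GPU in place
```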
George Hotz
b86bbd2e72
readmes
2020-12-13 21:32:20 -08:00
George Hotz
4d8235d5f7
readme update
2020-12-13 20:24:33 -08:00
NeuralLink
1a1c63a08b
GAN is real... Look what tiny just generated! (#192)
* mode collapse solved
* info add
* delete unnecessary imports
* readme
2020-12-13 20:23:12 -08:00
George Hotz
f95e79dab7
update readme
2020-12-12 17:14:10 -08:00
George Hotz
a5aced8d47
30 MEGAReLUs. we need to lose 12 lines
2020-12-12 17:07:34 -08:00
WillemKauf
49da969d25
Fixed a typo. (#189)
2020-12-12 16:25:33 -08:00
George Hotz
bc5df477de
readme and .ane()
2020-12-12 16:15:38 -08:00
George Hotz
c63f950348
need zero grad now
2020-12-07 23:10:43 -08:00
George Hotz
102e6356e9
replace layer_init_uniform with .uniform
2020-12-06 13:44:31 -08:00
George Hotz
888689b57b
protip
2020-12-04 09:24:46 -08:00
George Hotz
2862b42bac
install from github
2020-12-04 09:06:25 -08:00
George Hotz
1290e01e2c
all ops supported on GPU now
2020-12-03 10:43:11 -08:00
George Hotz
621a93b777
ane in readme
2020-12-03 10:40:31 -08:00
baplou
c83cebccda
Made the readme more consistent (#136)
2020-11-28 08:20:02 -06:00
Marcel Bischoff
541330c42a
Update README.md (#133)
Should we put `ipython3`? Otherwise the path doesn't work, or we have to add the env; not sure which is nicer.
2020-11-25 07:53:54 -08:00
George Hotz
2d4a5d5950
readme
2020-11-10 01:27:04 -08:00
George Hotz
53157fb876
add back scale
2020-11-09 10:20:56 -08:00
George Hotz
75d69e956f
readme more
2020-11-07 21:58:20 -08:00
Dimitar Vagalinski
35a5c82a2a
done as he said (#71)
2020-11-07 18:28:39 -08:00
George Hotz
ce6c408d78
readme
2020-11-07 12:26:57 -08:00
George Hotz
5486135f2d
readme
2020-11-07 11:41:27 -08:00
George Hotz
bc7758cc5b
getting convs to work on gpu
2020-11-07 09:17:57 -08:00
Rakib Fiha
f40dbd791c
Use --upgrade since it's in active dev (#63)
2020-11-07 07:15:05 -08:00
George Hotz
3efb4f4df4
chicken.jpg
2020-11-04 11:20:22 -08:00
George Hotz
940e14c6ca
more readme
2020-11-02 08:33:48 -08:00
George Hotz
1e6bbdf4f8
readme updates
2020-11-02 08:30:43 -08:00
George Hotz
2e7f16bf3f
the power of cheating
2020-11-02 07:42:11 -08:00
George Hotz
1f544d6ece
test mnist on GPU
2020-11-01 07:46:17 -08:00
George Hotz
7c0dc8f48b
more whitespace
2020-10-31 11:05:11 -07:00
George Hotz
e01e35e545
14 ops to write for GPU
2020-10-31 10:59:30 -07:00
George Hotz
8d75c9e4c4
we have matmul too
2020-10-27 08:57:17 -07:00
George Hotz
6b5982b6b3
push pypi
2020-10-27 08:13:15 -07:00
George Hotz
716f86a572
sections
2020-10-27 08:10:51 -07:00
=
6b44a7f729
adds beautiful and meaningful logo
2020-10-26 18:12:49 +01:00
George Hotz
c5db2cf517
readme more
2020-10-26 09:21:48 -07:00
George Hotz
43591a1e71
make the example simpler
2020-10-26 09:19:20 -07:00
George Hotz
1f0514e5df
pip
2020-10-26 09:15:31 -07:00
George Hotz
c74764bac3
oops, set to None
2020-10-25 08:28:18 -07:00
George Hotz
49ae15a450
i like that comma
2020-10-23 06:12:04 -07:00
f0ti
0b87aaca1e
update rmsprop
2020-10-23 14:46:45 +02:00
f0ti
6a38ccb6b0
update rmsprop and readme
2020-10-23 11:49:43 +02:00
George Hotz
ecdf2239fc
todo
2020-10-21 09:14:37 -07:00
Oren Amsalem
a3839f0fef
easier to find what micrograd is...
Added pytorch link just for fun
2020-10-20 08:29:27 +03:00
George Hotz
4cace5f798
pytorch
2020-10-18 18:15:47 -07:00
George Hotz
dba362e65e
update readme
2020-10-18 16:40:42 -07:00
George Hotz
4019c38942
more readme
2020-10-18 14:38:20 -07:00
George Hotz
cc9054e3ec
refactor into utils
2020-10-18 14:36:29 -07:00
George Hotz
28100c741f
update readme
2020-10-18 14:32:45 -07:00
George Hotz
28c9d31e49
Merge pull request #3 from dewpey/master
Test badge in readme
2020-10-18 14:02:33 -07:00
George Hotz
26ce2d93c3
add support for adam
2020-10-18 13:50:23 -07:00
Drew Patel
caf03cf5d6
Added a little test badge
2020-10-18 15:41:51 -05:00
George Hotz
f4e0cb5945
refactor tinygrad to be more tiny
2020-10-18 13:19:19 -07:00
George Hotz
54eafe6c12
update readme
2020-10-18 13:08:14 -07:00
George Hotz
83417d4b4c
readme and dirs
2020-10-18 12:48:17 -07:00
George Hotz
19b3b85b23
readme
2020-10-18 11:27:37 -07:00