Normal Heading
This is a paragraph. This is a paragraph. This is a paragraph. This is a paragraph.
Discrete convolutions in 1D
$$(f * g)(x) \triangleq \sum_{i=-\infty}^{\infty} f(i) \cdot g(x - i) \tag{1}$$
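As a concrete illustration of (1), here is a minimal Python sketch for finite signals. The helper name `conv1d` and the convention of treating both signals as zero outside their listed support are my own assumptions, not part of the original text.

```python
def conv1d(f, g):
    """Full discrete convolution of two finite signals, per equation (1).

    Both signals are treated as zero outside their listed support,
    so the infinite sum collapses to a finite one.
    """
    out_len = len(f) + len(g) - 1
    out = [0] * out_len
    for x in range(out_len):
        for i in range(len(f)):
            if 0 <= x - i < len(g):
                out[x] += f[i] * g[x - i]
    return out


# Tiny check: convolving with [1, 1] sums adjacent samples.
print(conv1d([1, 2, 3], [1, 1]))  # [1, 3, 5, 3]
```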
Fundamental Theorem of Calculus
$$\int_a^b f'(x)\,dx = f(b) - f(a) \qquad \frac{d}{dx}\int_a^x f(t)\,dt = f(x) \tag{2}$$
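As a quick worked instance of (2) (the integrand below is my own example, not from the original text), take $f(x) = x^2$, so $f'(x) = 2x$:

$$\int_0^1 2x\,dx = \left[x^2\right]_0^1 = 1^2 - 0^2 = 1.$$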
an array
$$f = [\,10,\ 9,\ 10,\ 9,\ 10,\ 1,\ 10,\ 9,\ 10,\ 8\,] \tag{3}$$
a function
$$f = \{(0,10),\ (1,9),\ (2,10),\ (3,9),\ (4,10),\ (5,1),\ (6,10),\ (7,9),\ (8,10),\ (9,8)\} \tag{4}$$
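The two views in (3) and (4) are interchangeable. Below is a small Python sketch (my own illustration, not from the original) of the same signal as a list indexed by position and as an explicit index-to-value mapping:

```python
# The signal as an array: the index is implicit in the position.
f_array = [10, 9, 10, 9, 10, 1, 10, 9, 10, 8]

# The same signal as a function: an explicit index -> value mapping.
f_function = {0: 10, 1: 9, 2: 10, 3: 9, 4: 10,
              5: 1, 6: 10, 7: 9, 8: 10, 9: 8}

# Both representations agree at every index.
assert all(f_array[i] == f_function[i] for i in range(10))
print(f_array[5], f_function[5])  # 1 1 -- the dip in the signal
```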
Below is a graph block: a plot of the signal f.
Below is how to do multi-line equation numbering:
$$
\begin{aligned}
P(\{x_i, x_j\} \subseteq X) &= \begin{vmatrix} K_{ii} & K_{ij} \\ K_{ji} & K_{jj} \end{vmatrix} \\
&= K_{ii}K_{jj} - K_{ij}K_{ji} \\
&= P(\{x_i\} \subseteq X)\,P(\{x_j\} \subseteq X) - K_{ij}^2
\end{aligned}
\tag{5}
$$
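As a numerical sanity check on (5) (my own illustration; the kernel entries below are arbitrary and not from the original), a symmetric 2×2 block of a marginal kernel K makes the determinant and the marginal form agree:

```python
# Sanity check of equation (5) on an arbitrary symmetric 2x2 block of K.
K_ii, K_jj, K_ij = 0.6, 0.5, 0.2
K_ji = K_ij  # symmetry of the marginal kernel

det = K_ii * K_jj - K_ij * K_ji              # determinant of the 2x2 block
via_marginals = K_ii * K_jj - K_ij ** 2      # right-hand side of (5)

print(det, via_marginals)  # identical up to rounding, roughly 0.26
assert abs(det - via_marginals) < 1e-12
```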
from tinygrad.tensor import Tensor
import tinygrad.nn.optim as optim

class TinyBobNet:
  def __init__(self):
    self.l1 = Tensor.uniform(784, 128)
    self.l2 = Tensor.uniform(128, 10)

  def forward(self, x):
    return x.dot(self.l1).relu().dot(self.l2).log_softmax()

model = TinyBobNet()
optim = optim.SGD([model.l1, model.l2], lr=0.001)

# ... complete data loader here

out = model.forward(x)
loss = out.mul(y).mean()
optim.zero_grad()
loss.backward()
optim.step()
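For completeness, below is one hypothetical way to stand in for the omitted data loader so the training step above can run end to end; it would go before the `model.forward(x)` call. The random batch, the batch size of 64, and the negative one-hot targets (which turn `out.mul(y).mean()` into an NLL-style loss on the `log_softmax` output) are my own assumptions, not part of the original snippet, and assume tinygrad's `Tensor` (imported at the top of the snippet) accepts NumPy arrays.

```python
import numpy as np

# Hypothetical stand-in for the data loader (not from the original):
# one random batch of 64 flattened 28x28 inputs and negative one-hot
# targets, so out.mul(y).mean() penalizes the log-probability of the
# labeled class.
batch_size, num_classes = 64, 10
x = Tensor(np.random.rand(batch_size, 784).astype(np.float32))

labels = np.random.randint(0, num_classes, size=batch_size)
y_np = np.zeros((batch_size, num_classes), dtype=np.float32)
y_np[np.arange(batch_size), labels] = -1.0
y = Tensor(y_np)
```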
$$
f = \begin{bmatrix}
1 & 2 & 3 & 4 & 5 \\
6 & 7 & 8 & 9 & 10 \\
11 & 12 & 13 & 14 & 15 \\
16 & 17 & 18 & 19 & 20
\end{bmatrix}
\qquad
g = \begin{bmatrix}
a & b & c \\
d & e & f \\
g & h & i
\end{bmatrix}
$$
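The original text does not state a 2D formula here, so the following is only a sketch under the assumption that f is a 2D input and g a 3×3 kernel for the two-dimensional analogue of (1). With the kernel flipped (convolution rather than cross-correlation), the first output element at which the kernel fully overlaps f works out to

$$(f * g)(2,2) = \sum_{m=0}^{2}\sum_{n=0}^{2} f(m,n)\, g(2-m,\ 2-n) = 13a + 12b + 11c + 8d + 7e + 6f + 3g + 2h + 1i,$$

where the lowercase letters denote the entries of the kernel g, indexed from zero.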