Mini-Model From Scratch: Two-Layer MLP
Here is a two-layer MLP for binary classification, implemented on toy data with no external libraries. We’ll mimic TensorFlow’s logic: forward pass, backward gradients, and parameter updates. (This helps you understand what TF automates.)
In TensorFlow you would: (1) define layers, (2) run the forward pass, (3) use a GradientTape to get gradients, (4) call optimizer.apply_gradients. Everything else (placement, kernels, shapes) comes “for free.”
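To make step (3) concrete, here is a toy reverse-mode autodiff sketch in plain Python. The `Var` class is hypothetical (not TensorFlow's actual API); it records each operation's inputs and local gradients on the way forward, then replays them backward with the chain rule, which is the essence of what GradientTape automates:

```python
# Toy reverse-mode autodiff: a hypothetical stand-in for GradientTape.
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def __add__(self, other):
        # d(a+b)/da = d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def backward(self, seed=1.0):
        # chain rule: push the upstream gradient to each parent
        # (sufficient for expressions where each Var appears once)
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

w, x, b = Var(3.0), Var(2.0), Var(1.0)
y = w * x + b              # forward pass: y = 3*2 + 1 = 7
y.backward()               # backward pass fills in .grad
print(w.grad, x.grad, b.grad)   # dy/dw = x = 2, dy/dx = w = 3, dy/db = 1
```

Real tapes record onto an explicit stack instead of recursing, and handle variables reused across operations, but the record-then-replay idea is the same.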
import random, math

def sigmoid(x): return 1/(1+math.exp(-x))
def dsigmoid(y): return y*(1-y)  # expects y = sigmoid(x); folded into p - y below
def relu(x): return x if x > 0 else 0.0
def drelu(x): return 1.0 if x > 0 else 0.0
def dot(w, x): return sum(wi*xi for wi, xi in zip(w, x))

def make_blobs(n=200, seed=0):
    random.seed(seed)
    X, Y = [], []
    for _ in range(n//2):
        X.append([random.gauss(-1, 0.5), random.gauss(0, 0.5)]); Y.append(0.0)
    for _ in range(n//2):
        X.append([random.gauss( 1, 0.5), random.gauss(0, 0.5)]); Y.append(1.0)
    return X, Y

def train(hidden=8, epochs=800, lr=0.05):
    X, Y = make_blobs()
    in_dim = 2
    W1 = [[random.uniform(-0.5, 0.5) for _ in range(in_dim)] for _ in range(hidden)]
    b1 = [0.0]*hidden
    W2 = [random.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for ep in range(epochs):
        dW1 = [[0.0]*in_dim for _ in range(hidden)]
        db1 = [0.0]*hidden
        dW2 = [0.0]*hidden
        db2 = 0.0
        for x, y in zip(X, Y):
            # forward: ReLU hidden layer, sigmoid output
            z1 = [dot(W1[j], x) + b1[j] for j in range(hidden)]
            h = [relu(z) for z in z1]
            p = sigmoid(dot(W2, h) + b2)
            # backward: with sigmoid + cross-entropy, dL/dlogit = p - y
            g = p - y
            for j in range(hidden):
                dW2[j] += g*h[j]
                dh = g*W2[j]*drelu(z1[j])
                for i in range(in_dim): dW1[j][i] += dh*x[i]
                db1[j] += dh
            db2 += g
        # gradient-descent step, averaged over the batch
        step = lr/len(X)
        for j in range(hidden):
            W2[j] -= step*dW2[j]; b1[j] -= step*db1[j]
            for i in range(in_dim): W1[j][i] -= step*dW1[j][i]
        b2 -= step*db2
    return W1, b1, W2, b2
train()
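As a sanity check (a sketch, not part of the code above), hand-written derivatives like dsigmoid can be verified against a central finite difference, which is a good habit whenever you write backward passes by hand:

```python
import math

def sigmoid(x): return 1/(1+math.exp(-x))
def dsigmoid(y): return y*(1-y)   # takes the *output* of sigmoid

# central finite difference: f'(x) ≈ (f(x+h) - f(x-h)) / (2h)
def numeric_grad(f, x, h=1e-5):
    return (f(x+h) - f(x-h)) / (2*h)

x = 0.7
analytic = dsigmoid(sigmoid(x))      # note: pass sigmoid(x), not x
numeric = numeric_grad(sigmoid, x)
assert abs(analytic - numeric) < 1e-8
print("sigmoid gradient check passed")
```

The central difference has O(h²) error, so a tolerance of 1e-8 comfortably catches sign errors and off-by-one-layer mistakes while tolerating floating-point noise.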