Hello, world

This is the first post on the site. Mostly a placeholder so the page renders end-to-end before I write anything real.

The plan is to use this space for short notes on what I’m learning — machine learning, data, and the messy parts in between.

Why a separate blog

Projects belong on GitHub. A resume belongs in a PDF. But the in-between thinking — what worked, what broke, what I’d do differently — needs somewhere quieter to live. That’s what this is.

Math

Inline: $f(x) = ax^2 + bx + c$.

Block:

$$\nabla_\theta J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x^{(i)}$$
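This is the batch gradient of the mean-squared-error cost for linear regression, with $h_\theta(x) = \theta^\top x$. As a quick sketch (the function and variable names here are mine, not from any particular library), it is one line of NumPy:

```python
import numpy as np

def grad_J(theta, X, y):
    """Gradient of J(theta) = (1/2m) * sum((X @ theta - y)**2).

    X has one training example per row, so X.T @ residuals
    computes sum_i (h(x_i) - y_i) * x_i in a single matmul.
    """
    m = X.shape[0]
    return X.T @ (X @ theta - y) / m
```

At an exact fit (where $X\theta = y$), the residuals vanish and the gradient is the zero vector, which is a cheap sanity check for the implementation.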

Callouts

Code

import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability;
    # this shifts the inputs without changing the result.
    e = np.exp(x - x.max())
    return e / e.sum()
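For example, softmax maps any real vector to a probability distribution: entries are positive, sum to 1, and larger inputs get larger probabilities. A minimal check (self-contained copy of the function above):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())  # max-shift keeps exp() from overflowing
    return e / e.sum()

p = softmax(np.array([1.0, 2.0, 3.0]))
# p is a valid probability distribution, ordered like its inputs:
# p.sum() == 1 (up to floating point), and p[0] < p[1] < p[2].
```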

Embeds


More soon.
