hi there! this is a 20+ part blog series where i explain, in detail, everything i've learnt in deep learning so far.
we will start from the basics and build things up step by step.
it's not going to be a lame surface-level theoretical blog, but an in-depth explanation with the math to back it all up.
i have referenced mitesh khapra's deep learning lectures, ian goodfellow's deep learning book, and a few other sources.
Posts
- ensemble learning (the more the merrier)
- implementing all ML algos from scratch (without sklearn)
- part III: errors and error surfaces
- part II: brains to bytes - journey from neurons to perceptrons