Deep Learning from scratch, a Journey of Discovery
I dive into the NumPy foundations of neural networks with Python.
Erik Recinos Jr.
1/22/2025
2 min read
Today, I started a garden, a foray unlike any other, into the roots of deep learning. My tools were curiosity and determination, and I used them to excavate the mechanics of chaining functions, dissect NumPy arrays, and refine my Python development skills. This was not simply a technical exercise; it was an opportunity to refine my understanding of how neural networks work from the roots up! In the upcoming sections, I will share the key lessons and insights I gained, which can inspire anyone delving into deep learning and Python programming.
Blast off! Here we go! To the realm of deep learning!
One of the most profound (and coolest) concepts I practiced today was chaining functions: executing multiple operations on a dataset sequentially. This directly mirrors how data flows through the layers of a neural network, with each function composing with, or transforming, the input before it passes to the next function.
Here is the optimized version of a function chaining pipeline I crafted:
from typing import Callable, List

import numpy as np
from numpy import ndarray

# Define a type alias for a function that takes and returns an ndarray
Array_function = Callable[[ndarray], ndarray]

# Function to chain
def chain_any_length(chain: List[Array_function], a: ndarray) -> ndarray:
    '''
    Evaluates all functions in a chain, sequentially.
    '''
    for func in chain:
        a = func(a)
    return a
The true power of this function lies in its flexibility. Using that flexibility, I created a chain of trigonometric transformations (sin, cos, and tan) and applied it to a NumPy array:
a = np.array([1, 2, 3])
result = chain_any_length([np.sin, np.cos, np.tan], a)
print(result)
# The output in my notebook is: [0.78635739 0.70533911 1.52387302]
I then used the Data Wrangler extension in VS Code, an awesome feature for debugging arrays in tabular form!
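If you want to see every intermediate step of the chain laid out as columns, here is a minimal sketch of how you might capture them for inspection. The helper name chain_with_trace and the use of a pandas DataFrame are my own illustrative choices, not part of the pipeline above.

import numpy as np
import pandas as pd

def chain_with_trace(chain, a):
    '''
    Applies each function in the chain, recording every intermediate result.
    '''
    trace = {"input": a}
    for func in chain:
        a = func(a)
        trace[func.__name__] = a  # ufuncs like np.sin expose a __name__ attribute
    return pd.DataFrame(trace)

df = chain_with_trace([np.sin, np.cos, np.tan], np.array([1, 2, 3]))
print(df)  # open df in Data Wrangler to inspect each column of intermediate values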
The journey was not just about Python, but about understanding how data flows through a network. Each function in the chain mirrors a corresponding layer in a neural network:
np.sin as the activation function,
np.cos as a normalization layer,
np.tan as an additional transformation
By chaining these operations, I could see how intermediate transformations shape the final output. This lays the foundation for backpropagation and other concepts.
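To make the connection to backpropagation concrete, here is a minimal sketch of the chain rule applied to a two-function chain. The helpers deriv and chain_deriv_2 and the central-difference approximation are my own assumptions for illustration, not code from the post; the Array_function alias is redefined here so the snippet runs on its own.

from typing import Callable

import numpy as np
from numpy import ndarray

Array_function = Callable[[ndarray], ndarray]  # same alias as defined earlier

def deriv(func: Array_function, a: ndarray, delta: float = 1e-3) -> ndarray:
    '''
    Central-difference approximation of func's derivative at each element of a.
    '''
    return (func(a + delta) - func(a - delta)) / (2 * delta)

def chain_deriv_2(f1: Array_function, f2: Array_function, a: ndarray) -> ndarray:
    '''
    Chain rule for f2(f1(a)): the derivative is f2'(f1(a)) * f1'(a).
    '''
    return deriv(f2, f1(a)) * deriv(f1, a)

a = np.array([1.0, 2.0, 3.0])
print(chain_deriv_2(np.sin, np.cos, a))       # chain-rule derivative of cos(sin(a))
print(deriv(lambda x: np.cos(np.sin(x)), a))  # direct check: nearly identical values

The two printouts agree to several decimal places, which is exactly the guarantee the chain rule provides: the derivative of a chain is the product of the derivatives of its links, evaluated at the right intermediate values.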
Conclusion:
Today was a reminder that true mastery comes from love of the subject, curiosity, experimentation, and most of all, determination. There are days when I donβt want to do this, but after I start feeling my fingers on the keyboard, typing away, I feel an intense surge of curiosity and love engulf me. No matter what, never give up, and you will find your Sadhana, your calling. This is mine.