Hey, everyone! In my journey to become the master of all things NLP, I’d like to begin with the beginning. This underlying foundation is referred to as Neural Networks. In the first installment of this series, I’ll dive into a bit of introduction on the concept as a whole.

*What are Neural Networks?*

Neural networks (sometimes referred to as NNs, artificial neural networks, or ANNs) are a class of machine learning models and the foundation of deep learning. A deep neural network is defined as a network with an input layer, an output layer, and at least one hidden layer in between.

These layers transform our data step by step to produce the desired results. The concept of a neural network in programming is inspired by the neural networks in our brains: the ideas of activation and interconnectivity drive the comparison. But that is pretty much where the similarity ends.

Each layer in the network is connected to its preceding and following layers through links between neurons. Each link carries a "weight", a trainable factor that determines how much of the input to use. As input enters a neuron, it is multiplied by the weight value; a bias is then added, and the result is either observed as the network's output or passed on to the next layer in the neural network.
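The computation described above can be sketched in a few lines of Python. The numbers here are made up purely for illustration:

```python
def neuron(x, weight, bias):
    """Compute one neuron's raw output: multiply the input by the
    weight, then add the bias."""
    return weight * x + bias

# An arbitrary input value of 0.5, with a weight of 2.0 and a bias of 1.0:
output = neuron(x=0.5, weight=2.0, bias=1.0)
print(output)  # 2.0
```

In a full network, this raw output would typically also pass through an activation function before reaching the next layer.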

The purpose of a bias, which is another trainable parameter, is to offset the output. Because data is rarely perfect, the bias makes the model better suited for real-world analysis. It levels the playing field, so to speak.

*A math visualization for weights and bias:*

Weights and biases are both trainable parameters. In a neural network, the model's optimizer trains thousands, sometimes millions, of these parameters. Each affects the neuron's output in a different way. For instance, weights are multiplied with the input, so they change the magnitude of the output or flip its sign from positive to negative, or vice versa.

It can be helpful to visualize the equation Output = Weight*Input + Bias as the equation for a line, y = mx + b. If we graph a neuron with a single input, a weight of 1, and a bias of 0, we get a straight line through the origin with a slope of 1.

If the weight is increased to, let’s say 2.00, the slope of the function changes. The slope will get steeper as the weight increases.

Conversely, if the weight is decreased the slope will decrease as well. Negating the weight will turn the entire slope negative.

The bias, however, offsets the line itself rather than its slope. Increasing the bias shifts the line upward. If we have a weight of 1 and a bias of 2, the slope stays at 1 but the line crosses the y-axis at 2.
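We can confirm all of this numerically. Treating Output = Weight*Input + Bias as the line y = mx + b and evaluating it at a few sample points shows the weight steepening or flipping the slope and the bias shifting the line upward:

```python
def line(x, weight, bias):
    """The neuron's output, viewed as a line: y = weight * x + bias."""
    return weight * x + bias

xs = [0, 1, 2, 3]

print([line(x, 1, 0) for x in xs])   # [0, 1, 2, 3]    baseline: slope 1, no offset
print([line(x, 2, 0) for x in xs])   # [0, 2, 4, 6]    larger weight -> steeper slope
print([line(x, -1, 0) for x in xs])  # [0, -1, -2, -3] negated weight -> negative slope
print([line(x, 1, 2) for x in xs])   # [2, 3, 4, 5]    bias shifts the whole line up
```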

As we can see, weights and biases influence the output in different but very important ways.

*A simplistic example of a neural network in action:*

In reality, neural networks can be extremely large. Our basic example will have a simple architecture: an input layer, two hidden layers with 4 neurons each, and an output layer.

The input layer represents the input data. For this example, the input data could be the pixels of a photo. Let’s say we want to classify if a photo is of a red rose or a white lotus.

The output layer produces the prediction for the class of the input image. An output layer can have as many neurons as the training dataset has classes. For this binary classification example, one neuron in the output layer will represent "red rose" and the other will represent "white lotus".

Every image that we pass through our network will produce a final calculated value for both the "red rose" neuron and the "white lotus" neuron. The neuron with the highest score becomes the class prediction for the input image.
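Picking the winning neuron is a one-liner. The scores below are made-up values standing in for the network's two output neurons:

```python
class_names = ["red rose", "white lotus"]
scores = [0.31, 0.84]  # hypothetical output-neuron values for one image

# The neuron with the highest score determines the predicted class.
prediction = class_names[scores.index(max(scores))]
print(prediction)  # white lotus
```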

Neural networks can of course be used for much more than classification. They perform tasks like clustering, regression and more. In this article, we had a brief look at the overall concept of neural networks. Stay tuned for the next installment where we will implement some Python code and take a look at some of the libraries built for these tasks.

## Learn More

To learn more, feel free to reach out to me @MyricksZenzele on Twitter, connect with me on LinkedIn, and join our Discord. Remember to follow the blog to stay updated with cool Python projects and ways to level up your Software and Python skills! If you liked this article, please Tweet it, share it on LinkedIn, or tell your friends!

I run this site to help you and others like you find cool projects and practice software skills. If this is helpful for you and you enjoy your ad free site, please help fund this site by donating below! If you can’t donate right now, please think of us next time.
