Introduction to Neural Network: Feedforward

You can find this article and source code at my GitHub repo

A Dead Simple Neural Network

If you have ever heard of neural networks or read some material about them, a model like the one in the figure below should not be a stranger to you.

An artificial neural network model

Well, that's a little bit complicated, so what about this one?

Diagram of a simple neural network

Let's briefly explain each component in the figure. Each circle represents a unit (a neuron), and each square represents a calculation. The leftmost three units form the input layer, and the neuron with an h inside is the only neuron in the output layer of this neural network.

The input to the output unit

Recall that a biological neuron has a threshold that must be reached before it fires. In our neural network, the neuron applies an activation function to its input and sends the result as its output. One big advantage here is that the activation function can be any function you choose: a step function, a polynomial, a sigmoid, and so on. The output unit returns the result of f(h), where h is the input to the output unit and y is the output of the neural network.
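Written out in symbols, the input to the output unit is the weighted sum of the inputs (plus a bias term b, matching the code later in this section), and the network applies the activation function to it:

```latex
h = \sum_{i} w_i x_i + b, \qquad y = f(h)
```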

If you let f(h) = h be your activation function, the output of the network looks like this (note that here y = f(h)):

The output

You are correct if you think this is just the linear regression model. Once you start using activation functions that are continuous and differentiable, it becomes possible to train the network with gradient descent, for which we will need the first derivative of the activation function. Before we dive into the training process, let's code this dead simple neural network in Python, using the sigmoid function as the activation function. Don't worry: for now this network only makes predictions by feedforward; learning (getting trained) via backpropagation comes later.

The sigmoid function
import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1/(1 + np.exp(-x))

inputs = np.array([0.7, -0.3])
weights = np.array([0.1, 0.8])
bias = -0.1

# calculate the output
output = sigmoid(np.dot(weights, inputs) + bias)

print('Output:')
print(output)

You can find the code here.
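Since gradient descent will need the first derivative of the activation function (as mentioned above), here is a small sketch of it; the helper name `sigmoid_prime` is mine, not from the repo. It uses the well-known identity σ'(x) = σ(x)(1 − σ(x)), which makes the derivative cheap to compute once you have the sigmoid value.

```python
import numpy as np

def sigmoid(x):
    # sigmoid activation: 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))

def sigmoid_prime(x):
    # derivative of the sigmoid, using f'(x) = f(x) * (1 - f(x))
    s = sigmoid(x)
    return s * (1 - s)

# the derivative peaks at x = 0, where sigmoid(0) = 0.5
print(sigmoid_prime(0.0))  # 0.25
```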

Your First 2-Layer NN

By now you should have a basic idea of how a neural network makes predictions. For a real-world problem, though, such a simple network may not be very helpful, so a new concept needs to be introduced here: the hidden layer.

A network with 3 input units, 2 hidden units and 1 output unit

In the previous simple network, the weights formed a vector, but in the more common case the weights form a matrix like the one below (this is the weight matrix for the network in the figure above).

Weights matrix for 3 input units and 2 hidden units

You can probably already see how to calculate h1 from the structure of the 2-layer neural network. Let's write it as a mathematical formula.

The formula of calculating hidden layer inputs
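In symbols, the input to hidden unit j is the weighted sum over all input units, with w_ij the weight from input i to hidden unit j:

```latex
h_j = \sum_{i} x_i \, w_{ij}
```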

And for our case,

The matrix multiplication for calculating hidden layer inputs of our network

Note: The weight indices have changed in the above image and no longer match up with the labels used in the earlier diagrams. That's because, in matrix notation, the row index always precedes the column index, so it would be misleading to label them the way we did in the neural net diagram.

Weight matrix shown with labels matching earlier diagrams.

Remember, the above is not a correct view of the indices, but it uses the labels from the earlier neural net diagrams to show you where each weight ends up in the matrix.
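To make the row-before-column convention concrete, here is a tiny sketch (the weight values are made up for illustration): in NumPy, `weights[i, j]` is the weight from input unit i to hidden unit j.

```python
import numpy as np

# rows index the input units, columns index the hidden units,
# so weights[i, j] is the weight from input i to hidden unit j
weights = np.array([[0.1, 0.4],
                    [0.2, 0.5],
                    [0.3, 0.6]])

# weight from input unit 0 to hidden unit 1
print(weights[0, 1])  # 0.4

# shape is (n_inputs, n_hidden)
print(weights.shape)  # (3, 2)
```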

Combining this with the formula we learned in the first section, we can implement the 2-layer neural network! The activation function used here is again the sigmoid function.

Things to do:

  • Calculate the input to the hidden layer.
  • Calculate the hidden layer output.
  • Calculate the input to the output layer.
  • Calculate the output of the network.
import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1/(1+np.exp(-x))

# Network size
N_input = 3
N_hidden = 2
N_output = 1

np.random.seed(42)
# Make some fake input data (one example with N_input features)
X = np.random.randn(N_input)

weights_in_hidden = np.random.normal(0, scale=0.1, size=(N_input, N_hidden))
weights_hidden_out = np.random.normal(0, scale=0.1, size=(N_hidden, N_output))

hidden_layer_in = np.dot(X, weights_in_hidden)
hidden_layer_out = sigmoid(hidden_layer_in)

print('Hidden-layer Output:')
print(hidden_layer_out)

output_layer_in = np.dot(hidden_layer_out, weights_hidden_out)
output_layer_out = sigmoid(output_layer_in)

print('Output-layer Output:')
print(output_layer_out)

You can find the code here.
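As a side note (a sketch of my own, not part of the linked code), the same two lines of matrix math handle a whole batch of examples at once if X has one row per example; the shapes then chain as (4, 3) · (3, 2) → (4, 2) and (4, 2) · (2, 1) → (4, 1).

```python
import numpy as np

def sigmoid(x):
    # sigmoid function
    return 1 / (1 + np.exp(-x))

np.random.seed(42)

# a batch of 4 examples, each with 3 features (one row per example)
X = np.random.randn(4, 3)

weights_in_hidden = np.random.normal(0, scale=0.1, size=(3, 2))
weights_hidden_out = np.random.normal(0, scale=0.1, size=(2, 1))

# the same two matrix products now process all 4 examples at once
hidden_out = sigmoid(np.dot(X, weights_in_hidden))
output = sigmoid(np.dot(hidden_out, weights_hidden_out))

print(output.shape)  # (4, 1)
```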

Thanks for reading. If you find any mistake/typo in this blog, please don't hesitate to let me know, you can reach me by email: jyang7[at]ualberta.ca
