Multi Layer Perceptron - Neural Network
For the data shown in the following table, show the first iteration of computing the
membership values of the input variables x1, x2, x3, and x4 in the regions R1 and R2. Use a
4×3×2 neural network with a random set of weights.
x1 x2 x3 x4 R1 R2
4 2 -4 1 0 1
SOLUTION NO 6
[Figure: the 4×3×2 feed-forward network. Inputs X1–X4 feed the three hidden nodes through
first-layer weights W1_ij (input i to hidden node j), and the hidden nodes feed the outputs
R1 and R2 through second-layer weights W2_jk (hidden node j to output node k).]
Suppose the threshold value t = 0 and the activation function is the sigmoid
f(net) = 1 / (1 + EXP[-(net - t)]). The outputs of the 2nd (hidden) layer and the 3rd
(output) layer can then be determined as below.
Output of the 2nd layer
O2_1 = 1 / (1 + EXP[-((X1*W1_11 + X2*W1_21 + X3*W1_31 + X4*W1_41) - t)])
     = 1 / (1 + EXP[-((4*0.2) + (2*0.4) + (-4*0.6) + (1*0.7))])
     = 0.475021

O2_2 = 1 / (1 + EXP[-((X1*W1_12 + X2*W1_22 + X3*W1_32 + X4*W1_42) - t)])
     = 1 / (1 + EXP[-((4*0.5) + (2*0.3) + (-4*0.8) + (1*0.2))])
     = 0.401312

O2_3 = 1 / (1 + EXP[-((X1*W1_13 + X2*W1_23 + X3*W1_33 + X4*W1_43) - t)])
     = 1 / (1 + EXP[-((4*0.4) + (2*0.3) + (-4*0.2) + (1*0.3))])
     = 0.845535
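The three hidden-layer activations above can be checked with a short script. This is a sketch that assumes the standard logistic sigmoid and the first-layer weight values used in the worked numbers:

```python
import math

def sigmoid(net, t=0.0):
    """Logistic activation with threshold t: 1 / (1 + exp(-(net - t)))."""
    return 1.0 / (1.0 + math.exp(-(net - t)))

x = [4, 2, -4, 1]  # input pattern X1..X4 from the table

# First-layer weights W1[i][j]: input i -> hidden node j
# (the "random" values used in the worked solution)
W1 = [[0.2, 0.5, 0.4],
      [0.4, 0.3, 0.3],
      [0.6, 0.8, 0.2],
      [0.7, 0.2, 0.3]]

hidden = []
for j in range(3):
    net = sum(x[i] * W1[i][j] for i in range(4))
    hidden.append(sigmoid(net))

# Values agree with O2_1, O2_2, O2_3 in the worked solution
print([round(o, 6) for o in hidden])
```

Running it reproduces the three hidden activations 0.475021, 0.401312, and 0.845535 to six decimal places.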
Output of the 3rd layer

O3_1 = 1 / (1 + EXP[-((O2_1*W2_11 + O2_2*W2_21 + O2_3*W2_31) - t)])
     = 1 / (1 + EXP[-((0.475021*0.9) + (0.401312*0.2) + (0.845535*0.1))])
     = 0.643901

O3_2 = 1 / (1 + EXP[-((O2_1*W2_12 + O2_2*W2_22 + O2_3*W2_32) - t)])
     = 1 / (1 + EXP[-((0.475021*0.2) + (0.401312*0.3) + (0.845535*0.5))])
     = 0.654339
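The output-layer step can be verified the same way. This sketch assumes the hidden activations computed above and the second-layer weight values that appear in the worked numbers:

```python
import math

def sigmoid(net, t=0.0):
    """Logistic activation with threshold t: 1 / (1 + exp(-(net - t)))."""
    return 1.0 / (1.0 + math.exp(-(net - t)))

# Hidden-layer activations from the previous step
O2 = [0.475021, 0.401312, 0.845535]

# Second-layer weights W2[j][k]: hidden node j -> output node k (R1, R2)
W2 = [[0.9, 0.2],
      [0.2, 0.3],
      [0.1, 0.5]]

outputs = []
for k in range(2):
    net = sum(O2[j] * W2[j][k] for j in range(3))
    outputs.append(sigmoid(net))

# Values agree with O3_1 (membership in R1) and O3_2 (membership in R2)
print([round(o, 6) for o in outputs])
```

The computed memberships match 0.643901 for R1 and 0.654339 for R2; since the target row is (R1, R2) = (0, 1), later iterations would adjust the weights by back-propagating this error.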
For the detailed calculations behind this answer, see the Excel file “Check_For_Number_6.xlsx”.