Artificial Neural Networks (CS5652): Homework 3
Due date: April 13, 2001
- (15%)
Use the data set x = [0; 1; 2; 3; 4; 5; 6; 7; 8; 9] and
y = [0; 1.5; 3.5; 6.5; 7.5; 10.5; 11.5; 14.5; 15.5; 18.5]
to perform polynomial fitting with orders 6, 7, 8, and 9.
Submit a hardcopy of your MATLAB script and the plot it generates.
(Hint: try spring.m.)
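If spring.m is not at hand, the following is a minimal MATLAB sketch of one way to produce the fits with the built-in polyfit and polyval; the 2-by-2 subplot layout is only an assumption, and the plot the assignment expects may differ.

    % Polynomial fits of orders 6-9 to the given data (sketch only).
    x = [0; 1; 2; 3; 4; 5; 6; 7; 8; 9];
    y = [0; 1.5; 3.5; 6.5; 7.5; 10.5; 11.5; 14.5; 15.5; 18.5];
    xf = linspace(0, 9, 200)';          % fine grid for smooth curves
    orders = [6 7 8 9];
    for k = 1:length(orders)
        p = polyfit(x, y, orders(k));   % least-squares polynomial coefficients
        subplot(2, 2, k);
        plot(x, y, 'o', xf, polyval(p, xf), '-');
        title(sprintf('Order %d fit', orders(k)));
        xlabel('x'); ylabel('y');
    end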
- (15%)
The following figure shows an MLP with a single hidden neuron and
shortcut connections that go directly from the inputs to the output.
(a) Construct a truth table for all variables x1, x2, x3, and
x4. Show that the network solves the XOR problem.
(b) Plot the decision boundary of x3 in the x1-x2 plane.
(c) Plot the decision boundary of x4 in the x1-x2 plane and
explain how you derive it. (Note that your decision boundary
should not be limited to the unit square.)
[Figure: the MLP with one hidden neuron and direct input-to-output connections]
Note that the number on top of each neuron is its threshold,
which is subtracted from the net input.
For instance, the equation for x3 is x3 = step(x1+x2-1.5).
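As a quick illustration of this convention, the small MATLAB sketch below tabulates x3 = step(x1+x2-1.5) over the four binary inputs; a step function that outputs 1 for non-negative net input is assumed, and the weights feeding x4 must still be read off the figure.

    step = @(v) double(v >= 0);          % assumed convention: step = 1 for net >= 0
    for x1 = 0:1
        for x2 = 0:1
            x3 = step(x1 + x2 - 1.5);    % threshold 1.5 subtracted from the net input
            fprintf('x1=%d  x2=%d  x3=%d\n', x1, x2, x3);
        end
    end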
- (30%)
A 2-2-1 MLP has the following configuration:
[Figure: the 2-2-1 MLP configuration]
Suppose that the above MLP has the following decision boundary
for the XOR problem:
[Figure: the decision boundary for the XOR problem]
Find the missing weights of the MLP.
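One way to check a candidate answer is to run the four XOR patterns through a small step-unit forward pass, as in the MATLAB sketch below; the weights and thresholds shown are placeholders only, not the values asked for in the figure.

    step = @(v) double(v >= 0);
    W1 = [1 1; 1 1];      % hypothetical input-to-hidden weights (row i = hidden unit i)
    t1 = [0.5; 1.5];      % hypothetical hidden thresholds
    w2 = [1 -1];          % hypothetical hidden-to-output weights
    t2 = 0.5;             % hypothetical output threshold
    for x1 = 0:1
        for x2 = 0:1
            h = step(W1 * [x1; x2] - t1);    % hidden activations
            o = step(w2 * h - t2);           % network output
            fprintf('x1=%d x2=%d -> o=%d (target %d)\n', x1, x2, o, xor(x1, x2));
        end
    end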
- (20%)
Finish designing the perceptron with step-function threshold
units in the following figure by determining the three threshold
values θi (i = 1, 2, 3) of the hidden nodes and the six connection
weights between the input layer and the hidden layer that enable
your perceptron to recognize correctly whether an arbitrary point
(x, y) is in the shaded triangular area T or not.
The perceptron is supposed to produce the output o:
o = 1 if (x, y) is in T, and o = 0 otherwise.
Figure: (a) a classification problem, and (b) the perceptron
designed to solve the problem.
You should put your answer into a compact matrix of the form

    [ wx1  wy1  θ1
      wx2  wy2  θ2
      wx3  wy3  θ3 ]

where wxi and wyi are the connection weights from inputs x and y,
respectively, to hidden node i, and θi is the threshold of hidden node i.
(Note that there are six sets of solutions. You only need to find one
of them. If you find all of them, you will get extra credit.)
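A candidate answer matrix can be sanity-checked numerically. The MATLAB sketch below assumes the three hidden nodes carve out three half-planes that the output unit ANDs together; the matrix S, the output unit, and the test points are made-up placeholders, not the values given in the figure.

    step = @(v) double(v >= 0);
    S = [ 1  0  0;        % hypothetical answer matrix: row i = [wxi wyi thetai]
          0  1  0;
         -1 -1 -1];
    % Hidden node i fires when wxi*x + wyi*y - thetai >= 0; the output unit below
    % ANDs the three half-planes (its actual weights/threshold come from the figure).
    in_T = @(x, y) step(sum(step(S(:,1)*x + S(:,2)*y - S(:,3))) - 2.5);
    in_T(0.2, 0.2)        % 1: inside the hypothetical triangle x >= 0, y >= 0, x + y <= 1
    in_T(2.0, 2.0)        % 0: outside it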