TensorFlow: A Complete Neural Network Example from Scratch, with Simple Data Set Visualization

Training a neural network on a simulated data set to solve a simple binary classification problem, and visualizing the data set.

matplotlib's pyplot.scatter() can be used to visualize the data set, as the drawing code below shows.

This is the data set used for testing; the sample solves a binary classification problem.
There are two inputs, x1 and x2. A point with x1 + x2 < 1 is considered a positive sample; all other points are negative samples.
The input passes through the weighted sums of one hidden layer and one output layer, and the resulting y value is then made nonlinear with a sigmoid. Repeatedly running the backpropagation optimization algorithm updates the parameters w1 and w2; the cross-entropy gradually decreases, which shows that the predictions fit the true labels better and better. A sketch of this computation appears below.
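
For intuition, here is a minimal NumPy sketch of the computation the network performs. This is not part of the original sample; sigmoid, forward and cross_entropy are illustrative helper names, and the weight shapes match the w1 and w2 defined in the full code below.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, w1, w2):
    a = X @ w1              # hidden layer: weighted sum, (n, 2) -> (n, 3)
    return sigmoid(a @ w2)  # output layer: weighted sum + sigmoid, (n, 3) -> (n, 1)

def cross_entropy(y_true, y_pred, eps=1e-10):
    # clipping avoids log(0), mirroring tf.clip_by_value in the code below
    return -np.mean(y_true * np.log(np.clip(y_pred, eps, 1.0))
                    + (1 - y_true) * np.log(np.clip(1.0 - y_pred, eps, 1.0)))

rng = np.random.RandomState(1)
print(forward(rng.rand(4, 2), rng.randn(2, 3), rng.randn(3, 1)).shape)  # (4, 1)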

The role of the batch
At the beginning of each iteration, a small amount of training data, called a batch, is selected.
The batch is passed through the forward-propagation algorithm to obtain predictions, which are compared against the true labels to compute the loss; backpropagation then updates the parameters accordingly.
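
As a small standalone sketch (dataset_size and batch_size match the values used in the full code), this is how the batch indices cycle through the data set:

dataset_size = 128
batch_size = 8
for i in range(4):
    # the start index wraps around once the end of the data set is passed
    start = (i * batch_size) % dataset_size
    end = min(start + batch_size, dataset_size)
    print(start, end)   # prints 0 8, then 8 16, 16 24, 24 32

The complete sample code follows.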

import tensorflow as tf
import numpy
import matplotlib.pyplot as plt

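# Trainable parameters: hidden-layer weights w1 (2x3) and output-layer weights w2 (3x1),
# initialized from a normal distribution with a fixed seed for reproducibility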
w1 = tf.Variable(tf.random_normal([2, 3], stddev=1, seed=1))
w2 = tf.Variable(tf.random_normal([3, 1], stddev=1, seed=1))

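# Placeholders for the input features and the true labels; the first dimension is
# None so that batches of any size can be fed in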
x = tf.placeholder(tf.float32, shape=(None, 2), name="x-input")
y_ = tf.placeholder(tf.float32, shape=(None, 1), name='y-input')

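# Forward propagation: two weighted sums (matrix multiplications)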
a = tf.matmul(x, w1)
y = tf.matmul(a, w2)

# Apply the sigmoid nonlinearity so the output can be read as a probability
y = tf.sigmoid(y)

# Cross-entropy loss; clipping avoids taking log(0)
cross_entropy = -tf.reduce_mean(
    y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0))
    + (1 - y_) * tf.log(tf.clip_by_value(1 - y, 1e-10, 1.0)))

learning_rate = 0.001

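# The Adam optimizer updates the parameters by backpropagation to minimize the cross-entropy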
train_step = tf.train.AdamOptimizer(learning_rate).minimize(cross_entropy)

# Generating a simulated data set from random numbers
rdm = numpy.random.RandomState(1)
dataset_size = 128
X = rdm.rand(dataset_size, 2)

# x1 + x2 < 1 is considered a positive sample
Y = [[int(x1 + x2 < 1)] for (x1, x2) in X]

# Drawing
plot_x1 = []
plot_y1 = []

plot_x2 = []
plot_y2 = []

fig = plt.figure()
for i in range(len(Y)):
    if Y[i][0]:
        plot_x1.append(X[i, 0])
        plot_y1.append(X[i, 1])
    else:
        plot_x2.append(X[i, 0])
        plot_y2.append(X[i, 1])

plt.scatter(plot_x1, plot_y1, marker='o', color='green', s=40, label='Ok')
plt.scatter(plot_x2, plot_y2, marker='x', color='red', s=40, label='No')
plt.legend(loc='best')    # place the legend at the automatically chosen best location
plt.show()

# Create a session
with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)

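    # Print the initial parameter values before training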
    print(sess.run(w1))
    print(sess.run(w2))

    # Set the number of training iterations
    STEP = 5000
    # Set the batch size
    batch_size = 8
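    # Train for STEP iterations, cycling through the data set one batch at a time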
    for i in range(STEP):
        start = (i * batch_size) % dataset_size
        end = min(start + batch_size, dataset_size)

        # Train the network on the selected samples and update the parameters
        sess.run(train_step,
                 feed_dict={x: X[start:end], y_: Y[start:end]})

        # Output cross-entropy at intervals
        if i % 1000 == 0:
            total_cross_entropy = sess.run(
                cross_entropy, feed_dict={x: X, y_: Y})
            print("After %d training step(s), cross entropy on all data is %g" % (i, total_cross_entropy))
    # Print the parameters of the trained neural network
    print(sess.run(w1))
    print(sess.run(w2))

Output results

[[-0.8113182   1.4845988   0.06532937]
 [-2.4427042   0.0992484   0.5912243 ]]
[[-0.8113182 ]
 [ 1.4845988 ]
 [ 0.06532937]]
After 0 training step(s), cross entropy on all data is 1.89805
After 1000 training step(s), cross entropy on all data is 0.655075
After 2000 training step(s), cross entropy on all data is 0.626172
After 3000 training step(s), cross entropy on all data is 0.615096
After 4000 training step(s), cross entropy on all data is 0.610309
[[ 0.02476983  0.56948674  1.6921941 ]
 [-2.1977348  -0.23668918  1.1143897 ]]
[[-0.45544705]
 [ 0.49110925]
 [-0.98110336]]

It can be seen that the cross-entropy decreases gradually over the 5000 iterations, and that w1 and w2 have been updated to new values.
