Teach Neural Networks to identify sequences of values

Off We Go!

The compile() method then readies the model for training, setting mean_squared_error (deviations from the target values are measured as the mean of the squared errors) as the loss function and rmsprop, an optimizer commonly used with neural networks, as the optimization algorithm.
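A minimal sketch of this step could look as follows; the layer sizes and input shape are my assumptions, since Listing 1 is not reproduced here:

```python
# Sketch: build and compile a small LSTM model for sequence prediction.
# Layer sizes and window length are illustrative, not the article's Listing 1.
from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(10, input_shape=(3, 1)))  # 3 time steps, 1 feature each
model.add(Dense(1))                      # one output: the predicted next value

# mean_squared_error as the loss, rmsprop as the optimizer
model.compile(loss='mean_squared_error', optimizer='rmsprop')
```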

Line 45 then calls the model's fit() method, passes the training data to it, and sets the number of learning iterations with epochs=500. In my lab, a lower number affected the results negatively, but larger values did not achieve any greater learning success, because the system reached a steady state, and the learning process visibly stagnated at a constant value of the loss function.

The section starting in line 48 of Listing 1 then checks the model's predictions at the current stage of training, both for the training data and for previously unseen sequences, which the system has to guess without any precedent in the training data.
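As a rough sketch of the whole train-and-predict cycle for the cyclic 1,2,3 sequence (window size, layer sizes, and the scaling step are my assumptions; the article's Listing 1 may differ in the details):

```python
# Sketch: train an LSTM on the cyclic sequence 1,2,3,1,2,3,... and let it
# predict the value that follows. Hyperparameters here are illustrative.
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense

seq = np.array([1, 2, 3] * 10, dtype=float)
scaled = seq / 3.0  # scale values into [0, 1] for the network

# Turn the series into (window of 3 values) -> (next value) samples
window = 3
X = np.array([scaled[i:i + window] for i in range(len(scaled) - window)])
y = scaled[window:]
X = X.reshape((X.shape[0], window, 1))  # samples x time steps x features

model = Sequential()
model.add(LSTM(10, input_shape=(window, 1)))
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='rmsprop')
model.fit(X, y, epochs=500, verbose=0)

# Ask what follows ...1,2,3 -- ideally something close to 1
probe = np.array([1, 2, 3], dtype=float).reshape((1, window, 1)) / 3.0
print(model.predict(probe, verbose=0)[0][0] * 3.0)
```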

Figure 7 shows that the network does a good job, even for cyclic data (1,2,3,1,2,3), regardless of the length of the signal period, which is a major advantage over traditional neural networks, for which you have to state the periodicity in advance for reliable predictions.

Figure 7: In a cyclic sequence, the algorithm predicts the next values more or less correctly.

Fast as a Snail

To run the keras examples, you need to install the following packages from the Python treasure trove:

pip3 install --user keras pandas tensorflow sklearn numpy
sudo apt-get install python-tk

The TensorFlow back end used by keras is not exactly speedy; it took a good 10 seconds for the program to get going and start the training on my five-year-old PC.

At the end of the day, the network did not perform particularly well in the intelligence test (Figure 8; see also the "Online Example" box). As you will have guessed, you need to alternately add 3 and 2 to the existing numbers in the 2,5,7,10,12 series. Since the jump from 10 to 12 was two units, three needs to be added to predict the next number: 15 is the correct result. The network tended toward 14 most of the time, so it cannot compete with human intelligence (yet).
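The rule the network was supposed to discover fits into a few lines of plain Python (this checker is my own illustration, not part of Listing 1):

```python
# The 2,5,7,10,12 series alternates between adding 3 and adding 2;
# since the increments alternate, the next increment is the one from
# two steps back.
def next_value(seq):
    """Continue a series whose increments alternate between two values."""
    return seq[-1] + (seq[-2] - seq[-3])

print(next_value([2, 5, 7, 10, 12]))  # → 15
```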

Figure 8: The LSTM network did not do well in the intelligence test, because it failed to identify the different growth rates.

Online Example

Mike Schilli demonstrates the example in a screencast (in German) at: http://www.linux-magazin.de/Ausgaben/2017/10/plus


  1. Number sequence test: https://www.fibonicci.com/numerical-reasoning/number-sequences-test/easy/
  2. Listings for this article: ftp://ftp.linux-magazine.com/pub/listings/linux-magazine.com/206/
  3. Brownlee, Jason. Long Short-Term Memory Networks with Python. Machine Learning Mastery, 2017, https://machinelearningmastery.com/lstms-with-python/

The Author

Mike Schilli works as a software engineer in the San Francisco Bay Area, California. Each month in his column, which has been running since 1997, he researches practical applications of various programming languages. He will gladly answer any questions sent to mailto:mschilli@perlmeister.com.


