A Time Series Forecasting Method Using DBN and Adam Optimization

  • Conference paper
  • Artificial Intelligence for Communications and Networks (AICON 2022)

Abstract

Deep Belief Net (DBN) was applied to the field of time series forecasting in our earlier works. In this paper, we propose adopting the Adaptive Moment Estimation (Adam) optimization method for the fine-tuning process of the DBN, instead of the conventional Error Back-Propagation (BP) method. Meta-parameters, such as the number of Restricted Boltzmann Machine (RBM) layers, the number of units in each layer, and the learning rate, are optimized by Random Search (RS) or Particle Swarm Optimization (PSO). Comparison experiments showed the superiority of the proposed method, in terms of not only prediction precision but also learning performance, on both the benchmark dataset CATS, an artificial time series used in competitions for long-term forecasting, and the Lorenz chaos for short-term forecasting.
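
The full paper gives the fine-tuning details; as a minimal, hedged illustration of how an Adam update step replaces plain gradient descent in the supervised stage that follows DBN pre-training, the NumPy sketch below fits a single linear output layer on a toy one-step-ahead forecasting task. The data, layer shape, and hyperparameter values are illustrative assumptions, not the authors' implementation.

    # Minimal NumPy sketch (not the authors' code): Adam-based fine-tuning of a
    # single linear output layer, standing in for the supervised fine-tuning
    # stage that follows DBN pre-training. All sizes and settings are assumed.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy one-step-ahead forecasting data: predict x[t] from the previous 4 values.
    series = np.sin(np.linspace(0, 20 * np.pi, 2000))
    lag = 4
    X = np.stack([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]

    # Output-layer parameters (in the full method these would come from the
    # pre-trained DBN rather than random initialization).
    w = rng.normal(scale=0.1, size=lag)
    b = 0.0

    # Adam state (Kingma & Ba, 2014): per-parameter first/second moment estimates.
    alpha, beta1, beta2, eps = 1e-3, 0.9, 0.999, 1e-8
    m = np.zeros(lag + 1)
    v = np.zeros(lag + 1)

    for t in range(1, 2001):
        pred = X @ w + b
        err = pred - y
        # Gradient of mean squared error with respect to [w, b].
        g = np.concatenate([2 * X.T @ err / len(y), [2 * err.mean()]])
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)   # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)   # bias-corrected second moment
        step = alpha * m_hat / (np.sqrt(v_hat) + eps)
        w -= step[:lag]
        b -= step[lag]

    print("final MSE:", float(np.mean((X @ w + b - y) ** 2)))

In the proposed method, Adam would update all weights of the pre-trained DBN, and the meta-parameters (number of RBM layers, units per layer, learning rate) would themselves be chosen by RS or PSO; the sketch above isolates only the Adam update rule.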

References

  1. Engle, R.F.: Autoregressive conditional heteroscedasticity with estimates of the variance of United Kingdom inflation. Econometrica 50(4), 987–1007 (1982)

  2. Kuremoto, T., Obayashi, M., Kobayashi, K.: Neural forecasting systems. In: Weber, C., Elshaw, M., Mayer, N.M. (eds.) Reinforcement Learning, Theory and Applications, Chapter 1, pp. 1–20, INTECH (2008)

  3. Kuremoto, T., Kimura, S., Kobayashi, K., Obayashi, M.: Time series forecasting using restricted Boltzmann machine. In: Huang, D.-S., Gupta, P., Zhang, X., Premaratne, P. (eds.) ICIC 2012. CCIS, vol. 304, pp. 17–22. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-31837-5_3

  4. Kuremoto, T., Kimura, S., Kobayashi, K., Obayashi, M.: Time series forecasting using a deep belief network with restricted Boltzmann machines. Neurocomputing 137(5), 47–56 (2014)

  5. Kuremoto, T., Obayashi, M., Kobayashi, K., Hirata, T., Mabu, S.: Forecast chaotic time series data by DBNs. In: Proceedings of the 7th International Congress on Image and Signal Processing (CISP 2014), pp. 1304–1309 (2014)

  6. Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: Forecasting real time series data using deep belief net and reinforcement learning. J. Robotics Netw. Artif. Life 4(4), 260–264 (2018)

  7. Hinton, G.E., Salakhutdinov, R.R.: Reducing the dimensionality of data with neural networks. Science 313(5786), 504–507 (2006)

  8. Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: Time series prediction using DBN and ARIMA. In: International Conference on Computer Application Technologies (CCATS 2015), pp. 24–29. Matsue, Japan (2015)

  9. Hirata, T., Kuremoto, T., Obayashi, M., Mabu, S., Kobayashi, K.: A novel approach to time series forecasting using deep learning and linear model. IEEJ Trans. Electron. Inf. Syst. 136(3), 348–356 (2016)

  10. Zhang, G.P.: Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing 50, 159–175 (2003)

  11. Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323(6088), 533–536 (1986)

  12. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014)

  13. Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time series prediction competition: the CATS benchmark. In: Proceedings of International Joint Conference on Neural Networks (IJCNN 2004), pp. 1615–1620 (2004)

  14. Lendasse, A., Oja, E., Simula, O., Verleysen, M.: Time series prediction competition: the CATS benchmark. Neurocomputing 70(13–15), 2325–2329 (2007)

  15. Bergstra, J., Bengio, Y.: Random search for hyper-parameter optimization. J. Mach. Learn. Res. 13(2), 281–305 (2012)

  16. Kuremoto, T., Hirata, T., Obayashi, M., Kobayashi, K., Mabu, S.: Search heuristics for the optimization of DBN for time series forecasting. In: Iba, H., Noman, N. (eds.) Deep Neural Evolution. NCS, pp. 131–152. Springer, Singapore (2020). https://doi.org/10.1007/978-981-15-3685-4_5

Acknowledgement

This work was supported by JSPS KAKENHI Grant Nos. 22H03709 and 22K12152.

Author information

Corresponding author

Correspondence to Takashi Kuremoto.

Copyright information

© 2023 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

Kuremoto, T., Furuya, M., Mabu, S., Kobayashi, K. (2023). A Time Series Forecasting Method Using DBN and Adam Optimization. In: Kambayashi, Y., Nguyen, N.T., Chen, SH., Dini, P., Takimoto, M. (eds) Artificial Intelligence for Communications and Networks. AICON 2022. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 477. Springer, Cham. https://doi.org/10.1007/978-3-031-29126-5_8

  • DOI: https://doi.org/10.1007/978-3-031-29126-5_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-29125-8

  • Online ISBN: 978-3-031-29126-5

  • eBook Packages: Computer Science; Computer Science (R0)
