
Optimally Weighted Ensembles for Efficient Multi-objective Optimization

  • Conference paper
Machine Learning, Optimization, and Data Science (LOD 2021)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 13163)


Abstract

The process of industrial design engineering often involves the simultaneous optimization of multiple expensive objectives. The surrogate-assisted multi-objective S-Metric Selection – Efficient Global Optimization (SMS-EGO) algorithm is one of the most popular algorithms for solving this kind of problem. We propose an extension of SMS-EGO with optimally weighted, linearly combined ensembles of regression models to improve its objective modelling capabilities. Multiple (different) surrogates are combined into one optimally weighted ensemble per objective, using a model-agnostic uncertainty quantification method to balance exploration and exploitation. The performance of the proposed algorithm is evaluated on a diverse set of benchmark problems with a small initial sample and an additional budget of 25 evaluations of the real objective functions. The results show that the proposed Ensemble-based S-Metric Selection – Efficient Global Optimization (E-SMS-EGO) algorithm outperforms state-of-the-art algorithms in terms of efficiency, robustness, and spread across the objective space.
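
The core modelling idea, an optimally weighted, linearly combined ensemble of surrogates per objective, can be sketched in a few lines of code. The snippet below is a minimal illustration only, not the authors' implementation (that is available in the repository linked under Notes): it assumes scikit-learn regressors as surrogates and chooses convex weights by minimizing the cross-validated error of the combined prediction, while the model-agnostic uncertainty quantification that E-SMS-EGO uses to balance exploration and exploitation is omitted. The surrogate choices and the helper names fit_weighted_ensemble and ensemble_predict are assumptions made for this sketch.

    # Minimal sketch: one optimally weighted surrogate ensemble for a single objective.
    # Assumes scikit-learn surrogates; weights are convex and minimize cross-validated
    # MSE of the combined prediction. Illustrative only, not the E-SMS-EGO implementation.
    import numpy as np
    from scipy.optimize import minimize
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.model_selection import cross_val_predict
    from sklearn.svm import SVR

    def fit_weighted_ensemble(X, y, models=None, cv=5):
        """Fit several surrogates and return them with convex weights that
        minimize the cross-validated MSE of their linear combination."""
        if models is None:
            models = [GaussianProcessRegressor(normalize_y=True),
                      RandomForestRegressor(n_estimators=100),
                      SVR()]
        # Out-of-fold predictions per model serve as a proxy for generalization error.
        P = np.column_stack([cross_val_predict(m, X, y, cv=cv) for m in models])

        def cv_mse(w):
            return np.mean((P @ w - y) ** 2)

        k = len(models)
        res = minimize(cv_mse, np.full(k, 1.0 / k),
                       bounds=[(0.0, 1.0)] * k,
                       constraints=({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},))
        weights = np.clip(res.x, 0.0, None)
        weights /= weights.sum()  # enforce a clean convex combination

        # Refit each surrogate on all data for use inside the optimization loop.
        for m in models:
            m.fit(X, y)
        return models, weights

    def ensemble_predict(models, weights, X_new):
        """Weighted mean prediction of the ensemble at candidate points."""
        preds = np.column_stack([m.predict(X_new) for m in models])
        return preds @ weights

In the full algorithm, one such weighted ensemble would take the place of the single per-objective surrogate, and its predictions, together with an uncertainty estimate, would feed the S-metric based infill criterion.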

Notes

  1. https://github.com/Gitdeon/E-SMS-EGO.



Author information

Correspondence to Gideon Hanse.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Hanse, G., de Winter, R., van Stein, B., Bäck, T. (2022). Optimally Weighted Ensembles for Efficient Multi-objective Optimization. In: Nicosia, G., et al. (eds.) Machine Learning, Optimization, and Data Science. LOD 2021. Lecture Notes in Computer Science, vol. 13163. Springer, Cham. https://doi.org/10.1007/978-3-030-95467-3_12


  • DOI: https://doi.org/10.1007/978-3-030-95467-3_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95466-6

  • Online ISBN: 978-3-030-95467-3

  • eBook Packages: Computer Science, Computer Science (R0)
