PlumX Metrics

Sobolev trained neural network surrogate models for optimization

Computers & Chemical Engineering, ISSN: 0098-1354, Vol: 153, Page: 107419
2021
  • Citations: 12
  • Usage: 0
  • Captures: 24
  • Mentions: 0
  • Social Media: 0

Metrics Details

  • Citations: 12
    • Citation Indexes: 11
    • Patent Family Citations: 1
      • Patent Families: 1
  • Captures: 24

Article Description

Neural network surrogate models are often used to replace complex mathematical models in black-box and grey-box optimization. This strategy essentially uses samples generated from a complex model to fit a data-driven, reduced-order model more amenable to optimization. Neural network models can be trained in Sobolev spaces, i.e., models are trained to match the complex function not only in terms of output values, but also the values of their derivatives to arbitrary degree. This paper examines the direct impacts of Sobolev training on neural network surrogate models embedded in optimization problems, and proposes a systematic strategy for scaling Sobolev-space targets during NN training. In particular, it is shown that Sobolev training results in surrogate models with more accurate derivatives (in addition to more accurately predicting outputs), with direct benefits in gradient-based optimization. Three case studies demonstrate the approach: black-box optimization of the Himmelblau function, and grey-box optimizations of a two-phase flash separator and two flashes in series. The results show that the advantages of Sobolev training are especially significant in cases of low data volume and/or optimal points near the boundary of the training dataset—areas where NN models traditionally struggle.
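The core idea of Sobolev training described above—fitting a surrogate to both function values and derivative values—can be sketched in a few lines. The paper trains neural network surrogates; the sketch below instead uses a polynomial basis (an assumption made purely for simplicity, since the fit then reduces to a stacked linear least-squares problem) and a hypothetical 1-D target function. With only two sample points, value-only fitting of a cubic is underdetermined, but adding the two derivative targets pins down all four coefficients, illustrating the low-data-volume benefit the abstract highlights.

```python
import numpy as np

# Sobolev-style surrogate fitting: match BOTH function values and
# first derivatives at the sample points.
# NOTE: the paper uses neural network surrogates; a cubic polynomial
# basis is assumed here only so the fit is a linear least-squares solve.

def f(x):
    return x**3 - 2.0 * x          # hypothetical "complex" model

def df(x):
    return 3.0 * x**2 - 2.0        # its exact derivative

# Low data volume: only 2 sample points.
x = np.array([-1.0, 1.0])

# Cubic basis phi(x) = [1, x, x^2, x^3] and its derivative rows.
Phi = np.vander(x, 4, increasing=True)
dPhi = np.column_stack(
    [np.zeros_like(x), np.ones_like(x), 2 * x, 3 * x**2]
)

lam = 1.0  # weight on the derivative (Sobolev) term of the loss
A = np.vstack([Phi, lam * dPhi])           # 4 equations, 4 unknowns
b = np.concatenate([f(x), lam * df(x)])

theta, *_ = np.linalg.lstsq(A, b, rcond=None)

# The true coefficients [0, -2, 0, 1] are recovered; a value-only fit
# from 2 points could not determine them.
xt = np.linspace(-1.5, 1.5, 7)
Phit = np.vander(xt, 4, increasing=True)
err = np.max(np.abs(Phit @ theta - f(xt)))
print(theta, err)
```

The derivative rows play the role of the scaled Sobolev-space targets: `lam` weights how strongly derivative mismatch is penalized relative to output mismatch, which is the quantity the paper's scaling strategy tunes.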
