2010-05-01
Comparison of universal approximators incorporating partial monotonicity by structure
Publication
Neural Networks, Volume 23, Issue 4, p. 471–475
Neural networks applied in control loops and safety-critical domains have to meet more requirements than just the overall best function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input–output relations have to be guaranteed. Otherwise, the stability of most of the control laws is lost. In this article we compare two neural network-based approaches incorporating partial monotonicity by structure, namely the Monotonic Multi-Layer Perceptron (MONMLP) network and the Monotonic MIN–MAX (MONMM) network. We show the universal approximation capabilities of both types of network for partially monotone functions. On a number of datasets, we investigate the advantages and disadvantages of these approaches related to approximation performance, training of the model and convergence.
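As an illustration of what "monotonicity by structure" means, the following sketch (not the authors' code; group and plane counts are arbitrary choices) implements a MIN–MAX network in which all weights are constrained non-negative, so the output is monotonically increasing in every input by construction:

```python
import numpy as np

# Illustrative sketch of a monotone MIN-MAX network: non-negative weights
# make the output monotonically increasing in every input by structure.
rng = np.random.default_rng(0)
n_groups, n_planes, n_inputs = 3, 4, 2

# Non-negative weights guarantee monotonicity regardless of training.
W = rng.uniform(0.0, 1.0, size=(n_groups, n_planes, n_inputs))
b = rng.uniform(-1.0, 1.0, size=(n_groups, n_planes))

def min_max(x):
    """y(x) = max over groups of the min over hyperplanes of (w.x + b)."""
    planes = W @ x + b               # shape (n_groups, n_planes)
    return planes.min(axis=1).max()  # min within each group, max across groups

# Monotonicity check: increasing any coordinate never decreases the output.
x = np.array([0.2, -0.5])
assert min_max(x + np.array([0.3, 0.0])) >= min_max(x)
```

Since min and max both preserve monotonicity, the sign constraint on the weights is the only structural ingredient needed; a MONMLP enforces the same property in a multi-layer perceptron by keeping selected weights positive.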
Additional Metadata |
---|---
Persistent URL | doi.org/10.1016/j.neunet.2009.09.002, hdl.handle.net/1765/19402
Series | ERIM Article Series (EAS)
Journal | Neural Networks
Organisation | Erasmus Research Institute of Management
Minin, A., van Bruggen, G., & Daniels, H. (2010). Comparison of universal approximators incorporating partial monotonicity by structure. Neural Networks, 23(4), 471–475. doi:10.1016/j.neunet.2009.09.002