Please use this identifier to cite or link to this item:
http://hdl.handle.net/10773/29044
Title: Automatic adjoint differentiation for gradient descent and model calibration
Authors: Goloubentsev, Dmitri; Lakshtanov, Evgeny
Keywords: Automatic adjoint differentiation; Automatic vectorization; Single instruction multiple data; AAD-compiler
Issue Date: 14-Jul-2020
Publisher: World Scientific Publishing
Abstract: In this work, we discuss Automatic Adjoint Differentiation (AAD) for functions of the form G = (1/2) ∑_{i=1}^{m} (E y_i − C_i)^2, which often appear in the calibration of stochastic models. We demonstrate that it allows a perfect SIMD(a) parallelization and provide its relative computational cost. In addition, we demonstrate that this theoretical result is in concordance with numerical experiments. (a) Single Instruction, Multiple Data.
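The objective described in the abstract can be illustrated with a minimal sketch of adjoint (reverse-mode) differentiation. This is not the paper's AAD-compiler; the toy model y_i(theta, z) = theta_i · exp(z) and all function names below are hypothetical stand-ins, chosen only to show how the adjoint dG/dE[y_i] = E[y_i] − C_i is propagated back through a Monte Carlo estimate, and why the per-path work parallelizes.

```python
import math

# Sketch of adjoint-mode differentiation for the calibration objective
#     G(theta) = 1/2 * sum_i (E[y_i] - C_i)^2,
# with E[y_i] estimated by Monte Carlo over samples zs.
# Toy model (hypothetical): y_i(theta, z) = theta_i * exp(z).

def expectations(theta, zs):
    """Forward pass: Monte Carlo estimates of E[y_i] for each i."""
    n = len(zs)
    return [sum(t * math.exp(z) for z in zs) / n for t in theta]

def objective(theta, C, zs):
    """G(theta) = 1/2 * sum_i (E[y_i] - C_i)^2."""
    Ey = expectations(theta, zs)
    return 0.5 * sum((e - c) ** 2 for e, c in zip(Ey, C))

def adjoint_gradient(theta, C, zs):
    """Reverse pass: propagate the adjoint dG/dE[y_i] = E[y_i] - C_i
    back through the simulation. Each path contributes independently,
    which is what makes the computation SIMD-friendly."""
    n = len(zs)
    Ey = expectations(theta, zs)
    bar_Ey = [e - c for e, c in zip(Ey, C)]          # adjoints of E[y_i]
    mean_exp = sum(math.exp(z) for z in zs) / n      # dE[y_i]/dtheta_i
    return [b * mean_exp for b in bar_Ey]
```

A quick sanity check is to compare `adjoint_gradient` against central finite differences of `objective`; for this linear-in-theta toy model the two agree to machine precision.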
Peer review: yes
URI: http://hdl.handle.net/10773/29044
DOI: 10.1142/S0219691320400044
ISSN: 0219-6913
Publisher Version: https://www.worldscientific.com/doi/abs/10.1142/S0219691320400044
Appears in Collections: CIDMA - Artigos; AGG - Artigos; DMat - Artigos
Files in This Item:
File | Description | Size | Format
---|---|---|---
1901.04200.pdf | | 105.15 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.