Drawing from Barron, Hornik, and Telgarsky, it proves neural networks yield superior efficiency in higher‑dimensional pricing tasks.

Mathematics of Differential Machine Learning in Derivative Pricing and Hedging: Choice of Basis

Table of Links

Abstract

  1. Keywords and 2. Introduction

  3. Set up

  4. From Classical Results into Differential Machine Learning

    4.1 Risk Neutral Valuation Approach

    4.2 Differential Machine Learning: Building the Loss Function

  5. Example: Digital Options

  6. Choice of Basis

    6.1 Limitations of the Fixed-basis

    6.2 Parametric Basis: Neural Networks

  7. Simulation: European Call Option

    7.1 Black-Scholes

    7.2 Hedging Experiment

    7.3 Least Squares Monte Carlo Algorithm

    7.4 Differential Machine Learning Algorithm

  8. Numerical Results

  9. Conclusion

  10. Conflict of Interests Statement and References

Notes

6 Choice of Basis

A parametric basis can be thought of as a set of functions made up of linear combinations of relatively few basis functions that have a simple structure and depend non-linearly on a set of “inner” parameters, e.g., feed-forward neural networks with one hidden layer and linear output units. In contrast, classical approximation schemes use no inner parameters but employ fixed basis functions, so the corresponding approximators depend only linearly on the external parameters.
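To make the contrast concrete, here is a minimal NumPy sketch (my own illustration; the function names are hypothetical, not from the paper): the fixed basis is fitted by a single linear least-squares solve over its outer coefficients, whereas the one-hidden-layer network depends non-linearly on its inner weights and biases.

```python
import numpy as np

def fixed_basis_features(x, d):
    # Fixed monomial basis {1, x, ..., x^(d-1)}: the approximator
    # sum_k c_k x^k depends only linearly on the outer coefficients c_k.
    return np.stack([x**k for k in range(d)], axis=-1)

def fixed_basis_fit(x, y, d):
    # Fitting reduces to linear least squares: a convex problem.
    Phi = fixed_basis_features(x, d)
    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return coef

def one_hidden_layer(x, W, b, c):
    # Parametric basis: f(x) = sum_j c_j * tanh(W_j * x + b_j).
    # The inner parameters (W, b) enter non-linearly, so fitting them
    # requires non-convex optimization (e.g., gradient descent).
    return np.tanh(np.outer(x, W) + b) @ c

x = np.linspace(-1.0, 1.0, 50)
y = np.sin(3.0 * x)
c_fixed = fixed_basis_fit(x, y, d=6)   # one lstsq call suffices
```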

However, experience has shown that optimizing functionals over a variable basis, such as feed-forward neural networks, often provides surprisingly good suboptimal solutions.

A well-known functional-analytic fact is that, by employing the Stone-Weierstrass theorem, it is possible to construct several examples of a fixed basis, such as the monomial basis: a set that is dense in the space of continuous functions, whose completion is L2. The limitations of the fixed basis are well studied and can be summarized as follows.


6.1 Limitations of the Fixed-basis

The bias-variance trade-off translates into two major problems:


  1. Underfitting occurs because high bias can cause an algorithm to miss the relevant relations between features and target outputs. It happens with a small number of parameters; in the previous terminology, that corresponds to a low d value (see Equation 4).


  2. Variance is the error due to sensitivity to small fluctuations in the training set; it is a measure of the spread of our predictions. High variance can cause an algorithm to model the random noise in the training data rather than the intended outputs, which is known as overfitting. It happens with a high number of parameters; in the previous terminology, that corresponds to a high d value (see Equation 4). Both regimes appear in the numerical sketch after this list.
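A minimal numerical sketch of both failure modes, assuming a monomial fixed basis and a noisy sine target (my own example, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 40)
y = np.sin(np.pi * x) + rng.normal(0.0, 0.3, x.size)  # noisy training data
x_test = np.linspace(-1.0, 1.0, 200)
y_true = np.sin(np.pi * x_test)                       # noise-free target

for d in (2, 6, 20):  # number of monomial basis functions
    coef = np.polyfit(x, y, d - 1)                    # fixed basis of size d
    mse = np.mean((np.polyval(coef, x_test) - y_true) ** 2)
    print(f"d = {d:2d}: test MSE = {mse:.3f}")
# Typical outcome: d = 2 underfits (high bias), d = 20 overfits
# (high variance), and an intermediate d balances the trade-off.
```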

The following result summarizes the problem just discussed. I state it as in Barron, 1993; the proof can be found in Barron, 1993 and Gnecco et al., 2012.

[Proposition omitted: the dimension-dependent lower bound on the approximation error of any fixed basis, stated in Barron, 1993.]

So there is a need to study classes of bases that can adjust to the data. That is the case with the parametric basis.


6.2 Parametric Basis: Neural Networks

From Hornik et al., 1989, we find the following relevant results:
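For reference, a standard paraphrase of the main result (my wording; the paper's formal statement may differ in detail):

```latex
% Paraphrase of Hornik et al., 1989 (universal approximation).
\textbf{Theorem (Hornik--Stinchcombe--White, 1989).}
Let $\sigma:\mathbb{R}\to[0,1]$ be a squashing function (non-decreasing,
$\sigma(t)\to 0$ as $t\to-\infty$ and $\sigma(t)\to 1$ as $t\to+\infty$).
Then for every compact $K\subset\mathbb{R}^n$, every $f\in C(K)$, and every
$\varepsilon>0$, there exist $d\in\mathbb{N}$ and parameters
$c_j,b_j\in\mathbb{R}$, $w_j\in\mathbb{R}^n$ such that
\[
  \sup_{x\in K}\Big|\,f(x)-\sum_{j=1}^{d} c_j\,
      \sigma\big(w_j^{\top}x+b_j\big)\Big| \;<\; \varepsilon .
\]
```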

The flexibility and approximation power of neural networks make them an excellent choice as the parametric basis.

6.2.1 Depth

In practical applications, it has been observed that multi-layer neural networks outperform single-layer ones. This is still a question under investigation, since state-of-the-art mathematical theories cannot account for the comparative success of multi-layer networks. However, it is possible to construct counter-examples in which a single-layer neural network would fail to approximate the target function, as in the following proposition:

[Proposition omitted: a depth-separation counter-example.]

Therefore, it is beneficial, or at least risk-averse, to select a multi-layer feed-forward neural network instead of a single-layer feed-forward neural network.
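One well-known construction behind such depth separations is Telgarsky's sawtooth: composing a two-unit ReLU "tent" layer with itself k times produces 2^(k-1) oscillations, while a single hidden layer needs a number of units exponential in k to match them. The sketch below (my own minimal illustration, not code from the paper) verifies the oscillation count:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def tent(x):
    # A "tent" map written as a two-unit ReLU layer on [0, 1]:
    # tent(x) = 2*relu(x) - 4*relu(x - 1/2).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def sawtooth(x, k):
    # Composing the tent layer k times = a depth-k ReLU network with
    # only 2 hidden units per layer, yet 2^(k-1) teeth on [0, 1].
    for _ in range(k):
        x = tent(x)
    return x

x = np.linspace(0.0, 1.0, 1001)
for k in (1, 3, 5):
    s = np.sign(sawtooth(x, k) - 1.0 / 3.0)
    crossings = int(np.sum(s[:-1] * s[1:] < 0))   # ~2^k level crossings
    print(f"depth {k}: {crossings} crossings of level 1/3")
```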

6.2.2 Width

This section draws inspiration from the works of Barron, 1993 and Telgarsky, 2020. Its primary objective is to investigate the approximation capabilities of a neural network as a function of its number of nodes, or neurons. I elaborate on this result because it is not widely known and, unlike the result in Barron, 1994, it requires no assumption on the activation function.

Following the probabilistic argument in Telgarsky, 2020, the target function f is written as an expectation over randomly drawn single neurons, and a width-d network is obtained by averaging d neurons sampled independently from the representing distribution. This sampling procedure correctly represents the mean: the empirical average f̂(x) = (1/d) ∑_{j=1}^{d} g_{θ_j}(x) is an unbiased estimator of f(x), and its mean-squared error decays at the Monte Carlo rate O(1/d).

As the number of nodes, d, increases, the approximation capability improves. This result, contrary to Proposition 5.1, establishes an upper bound that is independent of the dimension of the target function. By comparing both theorems, it can be argued that there is a clear advantage for feed-forward neural networks when the input dimension n > 2, n ∈ N.
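The Monte Carlo character of this bound can be checked numerically. The sketch below is my own illustration under a simplifying assumption: the representing distribution is replaced by a large finite pool of random tanh neurons, whose exact mean defines the target; averaging d sampled neurons then plays the role of a width-d network.

```python
import numpy as np

rng = np.random.default_rng(1)
n, D = 10, 20_000                       # input dimension; size of neuron pool
W = rng.normal(size=(D, n))             # fixed random inner weights
b = rng.normal(size=D)                  # fixed random biases

X = rng.normal(size=(200, n))           # evaluation points
H = np.tanh(X @ W.T + b)                # all pool activations, shape (200, D)
f_target = H.mean(axis=1)               # target = exact mean over the pool

for d in (10, 100, 1_000, 10_000):
    idx = rng.integers(0, D, size=d)    # sample d neurons i.i.d. from the pool
    f_hat = H[:, idx].mean(axis=1)      # width-d network: empirical mean
    rmse = np.sqrt(np.mean((f_hat - f_target) ** 2))
    print(f"d = {d:6d}: RMSE = {rmse:.4f}   (expected ~ 1/sqrt(d))")
```

Re-running with a different input dimension n leaves the observed ~1/sqrt(d) decay unchanged, which is precisely the dimension independence contrasted with the fixed-basis lower bound.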


:::info Author:

(1) Pedro Duarte Gomes, Department of Mathematics, University of Copenhagen.

:::


:::info This paper is available on arxiv under CC by 4.0 Deed (Attribution 4.0 International) license.

:::

