
Talk:Deep backward stochastic differential equation method


Good job!


Very good job. The article serves as a good introductory resource on the deep BSDE method, giving a clear and concise overview of this advanced numerical technique, which integrates deep learning with BSDEs to solve the high-dimensional problems commonly encountered in financial derivatives pricing and risk management. JohnWYu (talk) 13:11, 12 July 2024 (UTC)

Quite nice!


Clear Definition and Background: The article begins with a clear definition of Deep Backward Stochastic Differential Equations (Deep BSDEs) and provides a background on their development. This sets a solid foundation for readers to understand the core concepts and applications.

Wide Range of Applications: It details the diverse applications of the Deep BSDE method in fields such as financial engineering, quantum mechanics, and control theory, demonstrating its broad applicability and significance.

Detailed Mathematical Principles: The article delves into the mathematical foundations of the Deep BSDE method, including its relationship with traditional BSDEs, algorithm derivation, and theoretical proofs. This is particularly valuable for readers with a mathematical background, aiding in a deeper understanding of the method's workings.

Algorithm and Implementation: It provides the specific algorithmic steps of the Deep BSDE method and discusses practical implementation details. This information is crucial for researchers and engineers looking to apply the method in real-world projects. Daath3 (talk) 13:16, 12 July 2024 (UTC)
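For readers of this discussion who want a concrete picture of the algorithmic steps mentioned above, here is a minimal sketch of a deep BSDE-style training loop in PyTorch. It rests on purely illustrative assumptions rather than the article's exact formulation: a drift-free forward SDE dX = sigma dW, zero generator f, and terminal condition g(x) = ||x||^2. A trainable initial value y0 and one small network per time step (approximating Z) are optimized by simulating the forward process with Euler-Maruyama and penalizing the mismatch with g(X_T) at maturity.

import torch

torch.manual_seed(0)
d, N, T, batch = 10, 20, 1.0, 256       # dimension, time steps, horizon, batch size (toy values)
dt = T / N
sigma = 1.0

def g(x):                                # illustrative terminal condition g(x) = ||x||^2
    return (x ** 2).sum(dim=1, keepdim=True)

y0 = torch.zeros(1, requires_grad=True)  # trainable estimate of u(0, x_0)
z_nets = torch.nn.ModuleList(            # one small network per time step for Z_t
    [torch.nn.Sequential(torch.nn.Linear(d, 32), torch.nn.ReLU(),
                         torch.nn.Linear(32, d)) for _ in range(N)])
optimizer = torch.optim.Adam([y0, *z_nets.parameters()], lr=1e-2)

for step in range(3000):
    x = torch.zeros(batch, d)                         # X_0 = 0
    y = y0.expand(batch, 1)
    for n in range(N):
        dw = torch.randn(batch, d) * dt ** 0.5        # Brownian increments
        z = z_nets[n](x)
        y = y + (z * dw).sum(dim=1, keepdim=True)     # BSDE step with generator f = 0
        x = x + sigma * dw                            # Euler-Maruyama forward step
    loss = ((y - g(x)) ** 2).mean()                   # penalize mismatch at maturity
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(y0.item())  # should move toward E[||X_T||^2] = sigma^2 * d * T = 10.0

With these toy choices, y0 converges toward the known expectation 10; a real implementation would of course follow the article's own scheme, generator, and hyperparameters.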

Nice article!


The subject is fascinating and the exposition is clear! I really enjoy seeing all these math formulas! In general, though, I feel that many parts overlap with other Wikipedia articles. Of course, some parts can be repeated to make the explanation clearer, but, for example, there is no need to repeat the history of Deep Learning (DL); I would concentrate only on the aspects of DL most relevant to its integration with the stochastic differential equation method. I would also remove the 40-minute video introducing deep learning, as it is not strictly related to the exposition of the main topic. The same holds for the neural network architecture and its algorithms. Simply referencing and highlighting the aspects at the intersection of the stochastic differential equation method and deep learning would make the article leaner. Take this as just a suggestion; other than that, great work, and thank you! HeyGio (talk) 14:03, 12 July 2024 (UTC)

I'm very grateful for your reading and comments, but I believe it is necessary to provide a proper introduction to the history and theory of deep learning for readers with a mathematical background in backward stochastic differential equations but no deep learning background. Most existing methods for solving stochastic differential equations, and further the HJB equations, are based on earlier numerical methods for PDEs, and the use of deep learning methods is still quite rare. If we introduce only the overlapping parts, readers may find it harder to approach the problem from both directions and to gain a detailed understanding of this method and how it compares with others.
For most practitioners in the numerical solution of differential equations, there may be limited understanding of how deep learning is applied to stochastic differential equations, or even of deep learning itself. I do not fully agree with the suggestion to remove the deep-learning content on the grounds that you are already familiar with it; notably, you did not suggest removing the concepts related to backward stochastic differential equations (perhaps because that area is less familiar). For those who are not familiar with deep learning, a simple introduction is indeed necessary.
Thank you once again for your suggestions! I really appreciate it. AzzurroLan (talk) 16:32, 12 July 2024 (UTC)

Observations and suggestions for improvements


The following observations and suggestions for improvements were collected following an expert review of the article within the Science, Technology, Society and Wikipedia course at the Politecnico di Milano, in July 2024.

I would remove the algorithms for training the network (e.g. Adam) and instead link to the existing page: https://wiki.riteme.site/wiki/Stochastic_gradient_descent.

--Aandurro (talk) 15:45, 29 August 2024 (UTC)
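To illustrate the suggestion above: in practice the training step is usually a single call to an off-the-shelf optimizer, so linking the stochastic gradient descent article would typically cover it. A hedged PyTorch sketch with a placeholder model and dummy data, not tied to the article's code:

import torch

model = torch.nn.Linear(10, 1)                             # placeholder network
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # standard library Adam

x, target = torch.randn(64, 10), torch.randn(64, 1)        # dummy batch
loss = torch.nn.functional.mse_loss(model(x), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()                                            # one Adam update step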