Understanding Stochastic Integration: A Comprehensive Guide

Stochastic Integration: An Overview

This article serves as a gentle introduction to stochastic integration with Python, building on the concepts discussed in my previous piece about stochastic differentiation (read it [here](https://oscarnieves100.medium.com/stochastic-differentiation-5480d33ac8b8)). We will explore key aspects of stochastic integrals and their applications, particularly those involving the Wiener process, denoted as W(t).

Properties of Stochastic Integrals

To begin, let's consider a general stochastic integral relative to a Wiener process:

I(t) = ∫₀ᵗ X(s) dW(s)

In this equation, X(s) represents a stochastic process, and integration occurs from time 0 to time t. While integrals can also be defined for other processes, such as Poisson processes, this discussion will primarily focus on Wiener process integrals for simplicity.

It's important to remember that the Wiener process W(t) scales as the square root of time t, which means that dW(s) scales as the square root of the infinitesimal time increment ds. This characteristic implies that traditional integration methods cannot be applied here, as these increments do not comply with standard calculus rules. Nonetheless, stochastic integrals maintain a linearity property similar to that of conventional integrals:
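As a quick Monte Carlo sanity check of this scaling (my own sketch, not part of the original article), we can build many samples of W(t) from independent normal increments and confirm that Var[W(t)] = t, so that typical values of W(t) grow like √t:

```python
import numpy as np

rng = np.random.default_rng(0)
t, N, MC = 3.0, 1000, 20000
dt = t / N

# MC independent samples of W(t): each is the sum of N increments ~ N(0, dt)
increments = np.sqrt(dt) * rng.standard_normal((MC, N))
W_t = increments.sum(axis=1)
print(np.var(W_t))  # close to t = 3, confirming Var[W(t)] = t
```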

∫₀ᵗ [a X(s) + b Y(s)] dW(s) = a ∫₀ᵗ X(s) dW(s) + b ∫₀ᵗ Y(s) dW(s)

One notable feature is the Itô isometry property, which allows the expected value of the square of a stochastic integral to be calculated by integrating the expected value of the squared process X²(t):

E[ ( ∫₀ᵗ X(s) dW(s) )² ] = ∫₀ᵗ E[X²(s)] ds

Similarly, for the product of two statistically independent integrals:

E[ ( ∫₀ᵗ X(s) dW(s) ) ( ∫₀ᵗ Y(s) dW(s) ) ] = ∫₀ᵗ E[X(s)] E[Y(s)] ds

Another significant quantity is the RMS integral:

RMS[I] = √(E[I²]) = √( ∫₀ᵗ E[X²(s)] ds )

When the integrand is deterministic, the expected value of the integral is zero:

E[ ∫₀ᵗ f(s) dW(s) ] = 0

This occurs because each dW(s) increment has a mean of zero, resulting in the integral being an infinite sum of increments with zero mean.

In terms of differentials, the following properties apply:

(dW)² = dt,  dW·dt = 0,  (dt)² = 0
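The rule (dW)² = dt can be checked numerically (a sketch of my own, not from the original text): the sum of squared increments of a simulated path, the quadratic variation, recovers t almost exactly, while the mixed dW·dt terms vanish as the grid is refined:

```python
import numpy as np

rng = np.random.default_rng(1)
t, N = 3.0, 100000
dt = t / N

# One finely discretized Wiener path
dW = np.sqrt(dt) * rng.standard_normal(N)
print(np.sum(dW**2))   # quadratic variation: close to t = 3
print(np.sum(dW * dt)) # mixed dW*dt terms: negligible
```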

Numerical Evaluation of Stochastic Integrals

Stochastic integrals can be approximated as the limit of a Riemann sum, similar to ordinary integrals. Consequently, we can estimate the stochastic integral using discrete time steps of size Δt via the Euler-Maruyama scheme:

I(t) ≈ Σ_{i=0}^{N−1} X(tᵢ) ΔWᵢ,  where ΔWᵢ = √(Δt) Zᵢ

Here,

Zᵢ ~ N(0, 1) are independent standard normal random variables.

Higher-order schemes exist, but their convergence behavior differs from that of deterministic integrals due to the randomness inherent in the integral, particularly when computing statistical moments. For this article, we will concentrate on this straightforward approach.
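The scheme above can be wrapped in a small reusable helper (my own sketch; the function name and interface are not from the article). It evaluates the integrand at the left endpoint of each step, which is the defining convention of the Itô integral:

```python
import numpy as np

def ito_integral(X_func, t, N=1000, MC=10000, rng=None):
    """Approximate I = int_0^t X(s, W(s)) dW(s) via Euler-Maruyama.

    X_func(s, W) receives the left-endpoint time grid s and the path
    values W(s) there, and returns the integrand. Returns MC samples of I.
    """
    if rng is None:
        rng = np.random.default_rng()
    dt = t / N
    s = np.linspace(0, t, N, endpoint=False)         # left endpoints of each step
    dW = np.sqrt(dt) * rng.standard_normal((MC, N))  # Wiener increments
    W = np.cumsum(dW, axis=1) - dW                   # W(s) just before each increment
    return np.sum(X_func(s, W) * dW, axis=1)

# Usage sketch: I = int_0^3 3 s^2 dW(s), for which Var[I] = (9/5) t^5 = 437.4
I = ito_integral(lambda s, W: 3 * s**2, t=3.0, rng=np.random.default_rng(2))
print(I.mean(), I.var())  # mean near 0, variance near 437.4
```

Evaluating W at the left endpoint (rather than the midpoint or right endpoint) matters: a different evaluation point yields a different integral, such as the Stratonovich one.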

Example 1:

To illustrate, consider the integral:

I = ∫₀ᵗ 3s² dW(s)

As the integrand is deterministic, the expected value is zero:

E[I] = 0

Using the isometry property, we find:

E[I²] = ∫₀ᵗ (3s²)² ds = ∫₀ᵗ 9s⁴ ds = (9/5) t⁵

Thus, the variance is:

Var[I] = E[I²] − (E[I])² = (9/5) t⁵

Next, we will simulate this in Python by setting an arbitrary time t = 3:

import numpy as np

# Input parameters
N = 101
t = 3
MC = 10000
tvector = np.linspace(0, t, N)
dt = tvector[1] - tvector[0]

# Analytic results
E = 0
Var = 9/5 * t**5

# Numeric results
N_array = np.random.normal(loc=0, scale=1, size=[MC, N])
Ones = np.ones([MC, 1])
integrand = 3 * tvector**2
func = np.kron(integrand, Ones)
I = np.sqrt(dt) * np.sum(func * N_array, axis=1)
E_num = np.mean(I, axis=0)
Var_num = np.mean(I**2, axis=0) - E_num**2

# Compare results
print("Analytic values:")
print("E[I] =", E)
print("Var[I] =", Var)
print("--------------")
print("Numeric values:")
print("E[I] =", E_num)
print("Var[I] =", Var_num)

Running this code multiple times yields outputs similar to the following:

Analytic values:
E[I] = 0
Var[I] = 437.40000000000003
-----------------------------------------------
Numeric values:
E[I] = 0.14522140709630404
Var[I] = 442.5864059756218

As expected, the numeric values of E[I] fluctuate around zero, while the values of Var[I] oscillate around the exact value of 437.4. The size of these fluctuations depends on the number of Monte Carlo samples (MC) and the number of time steps (N), illustrating that stochastic integrals converge less favorably than ordinary integrals.

Example 2:

Next, we analyze the integral:

I = ∫₀ᵗ s W(s) dW(s)

In this instance, the integrand is itself stochastic. Its expected value is still zero, since the Itô integral of an adapted process is a martingale, but the more informative quantity here is E[I²]: we can apply Itô's isometry to determine it and then compute the integral's RMS value as follows:

E[I²] = ∫₀ᵗ E[s² W²(s)] ds = ∫₀ᵗ s² E[W²(s)] ds

Using the property:

E[W²(s)] = s

We obtain:

E[I²] = ∫₀ᵗ s³ ds = t⁴/4

Ultimately, this leads to:

RMS[I] = √(E[I²]) = t²/2

We will validate this in Python using t = 3:

import numpy as np

# Input parameters
N = 101
t = 3
MC = 10000
tvector = np.linspace(0, t, N)
dt = tvector[1] - tvector[0]

# Analytic results
RMS = 1/2 * t**2

# Numeric results
# The same Wiener increments dW must both build the path W(s) and drive
# the integral, with W evaluated before each increment (Ito convention)
dW = np.sqrt(dt) * np.random.normal(loc=0, scale=1, size=[MC, N])
W = np.cumsum(dW, axis=1) - dW
integrand = tvector * W
I = np.sum(integrand * dW, axis=1)
RMS_num = np.sqrt(np.mean(I**2, axis=0))

# Compare results
print("RMS analytic =", RMS)
print("RMS numeric =", RMS_num)

When executing this program multiple times, we receive results like:

RMS analytic = 4.5
RMS numeric = 4.684039937790411

While the RMS numeric values display some variation, they consistently hover around 4.5.

Example 3:

Now, we will determine the RMS value of:

I = ∫₀ᵗ exp(βW(s)) dW(s)

Utilizing Itô's isometry, we obtain:

E[I²] = ∫₀ᵗ E[exp(2βW(s))] ds

We recognize that this expectation is the moment-generating function of W(s), for which the answer was already established in my prior article [here](https://oscarnieves100.medium.com/the-building-blocks-of-stochastic-calculus-part-i-d06c87916070):

E[exp(λW(s))] = exp(λ²s/2), so that E[exp(2βW(s))] = exp(2β²s)

Thus, we arrive at:

E[I²] = ∫₀ᵗ exp(2β²s) ds = (exp(2β²t) − 1)/(2β²), so RMS[I] = (1/(β√2)) √(exp(2β²t) − 1)
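The moment-generating-function identity itself is easy to verify numerically (a sketch of my own, using the same β and t as below): sample W(s) ~ N(0, s) directly and compare the Monte Carlo average of exp(βW(s)) against the closed form:

```python
import numpy as np

rng = np.random.default_rng(3)
beta, s, MC = 0.55, 3.0, 200000

# W(s) is normally distributed with mean 0 and variance s
W_s = np.sqrt(s) * rng.standard_normal(MC)
print(np.mean(np.exp(beta * W_s)))  # Monte Carlo estimate of the MGF
print(np.exp(beta**2 * s / 2))      # closed form exp(beta^2 s / 2)
```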

The Python code for this example is as follows:

import numpy as np

# Input parameters
N = 101
beta = 0.55
t = 3
MC = 10000
tvector = np.linspace(0, t, N)
dt = tvector[1] - tvector[0]

# Analytic results
RMS = 1/beta/np.sqrt(2) * np.sqrt(np.exp(2 * beta**2 * t) - 1)

# Numeric results
# The same Wiener increments build W(s) and drive the integral (Ito convention)
dW = np.sqrt(dt) * np.random.normal(loc=0, scale=1, size=[MC, N])
W = np.cumsum(dW, axis=1) - dW
integrand = np.exp(beta * W)
I = np.sum(integrand * dW, axis=1)
RMS_num = np.sqrt(np.mean(I**2, axis=0))

# Compare results
print("RMS analytic =", RMS)
print("RMS numeric =", RMS_num)

The outputs from this code will resemble:

RMS analytic = 2.9150723103019516
RMS numeric = 2.8871054741989113

Example 4:

Next, we will find the RMS value of:

I = ∫₀ᵗ cos(W(s)) dW(s)

Using Itô's isometry, we have:

E[I²] = ∫₀ᵗ E[cos²(W(s))] ds

The squared cosine can be simplified through a trigonometric identity:

cos²(x) = (1 + cos(2x))/2

This leads us to:

E[I²] = (1/2) ∫₀ᵗ (1 + E[cos(2W(s))]) ds

Next, we can apply Euler’s formula to express the cosine function as a sum of exponentials and utilize the moment-generating function of W(s) to arrive at:

cos(2W(s)) = (exp(2iW(s)) + exp(−2iW(s)))/2, so E[cos(2W(s))] = exp(−2s)

Ultimately, this leads to:

E[I²] = t/2 + (1 − exp(−2t))/4, so RMS[I] = √( t/2 + (1 − exp(−2t))/4 )
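The intermediate identity E[cos(2W(s))] = exp(−2s) can also be checked directly (my own sketch, with an arbitrary value of s): sample W(s) ~ N(0, s) and compare the Monte Carlo average against the closed form:

```python
import numpy as np

rng = np.random.default_rng(5)
s, MC = 1.5, 200000

# W(s) is normally distributed with mean 0 and variance s
W_s = np.sqrt(s) * rng.standard_normal(MC)
print(np.mean(np.cos(2 * W_s)))  # Monte Carlo estimate
print(np.exp(-2 * s))            # closed form exp(-2s)
```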

The corresponding Python code is as follows:

import numpy as np

# Input parameters
N = 101
t = 5
MC = 10000
tvector = np.linspace(0, t, N)
dt = tvector[1] - tvector[0]

# Analytic results
RMS = np.sqrt(0.5 * t + 0.25 * (1 - np.exp(-2 * t)))

# Numeric results
# The same Wiener increments build W(s) and drive the integral (Ito convention)
dW = np.sqrt(dt) * np.random.normal(loc=0, scale=1, size=[MC, N])
W = np.cumsum(dW, axis=1) - dW
integrand = np.cos(W)
I = np.sum(integrand * dW, axis=1)
RMS_num = np.sqrt(np.mean(I**2, axis=0))

# Compare results
print("RMS analytic =", RMS)
print("RMS numeric =", RMS_num)

The output from this code will appear as follows:

RMS analytic = 1.6583089730257023
RMS numeric = 1.6678332264282443

Application: The Ornstein-Uhlenbeck Process

The Ornstein-Uhlenbeck process (OU process) is a widely utilized stochastic process in both finance and physics, characterizing systems that approach a steady state over time. It is defined by the stochastic differential equation:

dX(t) = θ(μ − X(t)) dt + σ dW(t)

Here, θ is a positive rate parameter, μ represents the drift (long-term mean) parameter, and σ indicates the volatility parameter. This SDE can be solved to yield:

X(t) = X₀ exp(−θt) + μ(1 − exp(−θt)) + σ ∫₀ᵗ exp(−θ(t − s)) dW(s)

We can confirm this solution by differentiating X(t) according to Itô's formula. By leveraging the properties of stochastic integrals we have just discussed, we can easily derive the expected value and variance of the process:

E[X(t)] = X₀ exp(−θt) + μ(1 − exp(−θt))

Var[X(t)] = (σ²/(2θ)) (1 − exp(−2θt))

X(t) ~ N(E[X(t)], Var[X(t)])

Consequently, X(t) is normally distributed at every time t.
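As a final numerical check (my own sketch, with arbitrary parameter values not taken from the article), we can simulate the OU process with the Euler-Maruyama scheme and compare the sample mean and variance at time t against the formulas above:

```python
import numpy as np

# Arbitrary example parameters
theta, mu, sigma = 1.2, 0.5, 0.8
X0, t, N, MC = 2.0, 3.0, 1000, 20000

rng = np.random.default_rng(4)
dt = t / N
X = np.full(MC, X0)
for _ in range(N):
    dW = np.sqrt(dt) * rng.standard_normal(MC)
    X = X + theta * (mu - X) * dt + sigma * dW  # Euler-Maruyama step

# Closed-form mean and variance at time t
E_exact = X0 * np.exp(-theta * t) + mu * (1 - np.exp(-theta * t))
Var_exact = sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * t))
print(X.mean(), E_exact)
print(X.var(), Var_exact)
```

A histogram of the final values of X would likewise confirm the normal shape of the distribution.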
