Generalized Functions & Differential Equations: Exploring the Infinite & the Unexpected
Differential Equations for Generalized Functions: When Calculus Meets the Infinite and the Weird
In the previous blog, we explored Understanding Delta Function Approximations: Lorentzian, Gaussian, and Sinc Compared. Let's take one more step and look at Differential Equations for Generalized Functions: what happens when calculus meets the infinite and the weird.
What happens when you mix the familiar world of differential equations with the strange universe of generalized functions—those magical creatures that extend what we usually call a function?
It turns out, you get a whole new playground where classical rules meet curious exceptions, and where solutions might look familiar—or downright surprising.
Building Differential Expressions with Generalized Functions
In the land of generalized functions (also known as distributions), we can still do calculus—just a bit differently. Differentiation works, multiplication by smooth functions is allowed, and addition behaves as expected.
So we can still write differential expressions like:\[ a_0(x)y^{(n)}(x) + a_1(x)y^{(n-1)}(x) + \dots + a_n(x)y(x) = b(x) \] where:
- The \( a_i(x) \) are smooth functions (infinitely differentiable),
- \(y(x)\) and \(b(x)\) are generalized functions.
Our goal? Solve for \(y\), but now in the world of generalized functions.
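Before we start solving, here is a quick SageMath warm-up (using Sage's built-in heaviside and dirac_delta) showing the three operations above applied to a genuinely generalized function:
# Distributional calculus in Sage: differentiate the Heaviside step,
# multiply by a smooth coefficient, and add generalized functions.
theta = heaviside(x)
print(diff(theta, x))                    # dirac_delta(x): the distributional derivative
print((x^2 + 1) * diff(theta, x))        # multiplication by a smooth function is allowed
print(diff(theta, x) + dirac_delta(x))   # addition works as expected: 2*dirac_delta(x)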
Let's Start Simple: \( \frac{dy}{dx}=0\)
What’s the solution to the most basic differential equation?
You might say: “The solution is a constant.”
And you’d be right—even in the generalized setting.
Why?
We rewrite the equation using distributional derivatives: \[ (y', \varphi) = (y, -\varphi') = 0, \quad \forall \varphi \in K \] This means that the generalized function \(y\) vanishes on every test function that is the derivative of another test function.
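Here is a small numerical sketch of that pairing in SageMath (the bump function \(e^{-1/(1-x^2)}\) is my own choice of test function; Sage's numerical_integral evaluates the integral):
# Numerical check that a constant y = C satisfies (y', phi) = -(y, phi') = 0
# for a compactly supported test function phi.
C = 3
phi = exp(-1/(1 - x^2))                                    # bump test function on (-1, 1)
pairing = numerical_integral(C * diff(phi, x), -1, 1)[0]   # (y, phi') = ∫ C*phi'(x) dx
print(pairing)                                             # ≈ 0, so (y', phi) = 0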
Integrals Reveal Hidden Structure
A test function \( \varphi_0 \) is the derivative of another test function \(\varphi_1 \) if and only if: \[ \int_{-\infty}^{\infty} \varphi_0(x) \,dx = 0 \] Why? If \( \varphi_0 = \varphi_1'\), then integration and vanishing at infinity give: \[ \int_{-\infty}^{\infty} \varphi_0(x) \,dx = \varphi_1(\infty) - \varphi_1(-\infty) = 0 \] Conversely, if a test function integrates to zero, you can construct a suitable \( \varphi_1 \) via: \[ \varphi_1(x) = \int_{-\infty}^{x} \varphi_0(\xi) \,d\xi \] The zero total integral guarantees that \(\varphi_1\) vanishes to the left and to the right of the support of \(\varphi_0\), so \(\varphi_1 \in K\) and \( \varphi_0 = \varphi_1'\): every test function with zero integral is the derivative of another test function.
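A quick numerical illustration (the odd bump \(x\,e^{-1/(1-x^2)}\) is my own choice of zero-integral test function):
# A test function with zero integral is the derivative of another test function:
# its running integral vanishes again outside the original support.
phi0 = x * exp(-1/(1 - x^2))                          # odd bump on (-1, 1), total integral 0
phi1 = lambda t: numerical_integral(phi0, -1, t)[0]   # phi1(x) = ∫_{-1}^x phi0(ξ) dξ
print(numerical_integral(phi0, -1, 1)[0])             # ≈ 0: zero total integral
print(phi1(1.0))                                      # ≈ 0: phi1 vanishes to the right of the support
print(phi1(0.5))                                      # nonzero in between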
Decomposing Every Test Function
Now let \(\varphi_1(x)\) be a fixed function in \(K\) with \[ \int_{-\infty}^{\infty} \varphi_1(x) \,dx = 1. \] For an arbitrary test function \(\varphi(x)\in K\), write \[ \varphi(x)=\varphi_1(x)\int_{-\infty}^{\infty} \varphi(\xi) \,d\xi + \varphi_0(x), \] where \(\varphi_0(x)\) is defined by this equation. Then clearly \[ \int_{-\infty}^{\infty} \varphi_0(x) \,dx = 0, \] so \(\varphi_0\) is the derivative of a test function and \((y,\varphi_0)=0\). Therefore \[ (y,\varphi)=(y,\varphi_1)\int_{-\infty}^{\infty} \varphi(x) \,dx. \] Denote the number \((y,\varphi_1)=C_1\). Then \[ (y,\varphi)=\int_{-\infty}^{\infty} C_1\varphi(x) \,dx, \] which means \(y\) is the constant generalized function \(C_1\). No surprises here: generalized or not, the only solution is a constant.
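The decomposition is easy to play with numerically (the bump functions below are my own choices):
# Decompose an arbitrary test function phi as phi = phi1*∫phi + phi0 with ∫phi0 = 0.
bump = exp(-1/(1 - x^2))                          # supported on (-1, 1)
I1 = numerical_integral(bump, -1, 1)[0]
phi1 = bump / I1                                  # normalized so that ∫ phi1 = 1
phi = (1 + x)^2 * bump                            # an arbitrary test function with the same support
Iphi = numerical_integral(phi, -1, 1)[0]          # ∫ phi
phi0 = phi - phi1 * Iphi                          # the remainder
print(numerical_integral(phi0, -1, 1)[0])         # ≈ 0, as claimed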
Translation Invariance ⇒ Constant
Here’s a quirky perspective: suppose a generalized function \(f\) is translation invariant:\[f(x + \Delta x) - f(x) = 0\] Then its derivative vanishes: \[ \quad f'(x) = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x} = 0\] Again, we conclude: \(f\) must be a constant.
Homogeneous Systems Behave Classically
Systems like:\[\frac{d y_1}{dx} = a_{11} y_1 + \dots + a_{1m} y_m\] \[\vdots \]\[\frac{d y_m}{dx} = a_{m1} y_1 + \dots + a_{mm} y_m\] with smooth coefficients \(a_{ij}\) also don’t gain new generalized solutions.
Why?
Write the system in vector form: \[\frac{d y}{dx} = A y \] Let \( U \) be a fundamental matrix of classical solutions (invertible, with \(U' = AU\)). Define the transformation \(y = U z\). Then \[\frac{d}{dx}(Uz) = U'z + Uz' = AUz + Uz',\] and since the equation requires \(\frac{d}{dx}(Uz) = AUz\), we get \[U \frac{d z}{dx} = 0 \Rightarrow \frac{d z}{dx} = 0 \Rightarrow z = \text{constant}.\] So \(y = Uz\) is a classical linear combination of the columns of \(U\): no new generalized behavior.
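To make this concrete, here is a small SageMath check with a \(2\times 2\) system of my own choosing (\(y_1' = y_2\), \(y_2' = -y_1\)), exhibiting a fundamental matrix \(U\):
# A homogeneous system dy/dx = A*y and a classical fundamental matrix U.
A = matrix([[0, 1], [-1, 0]])
U = matrix([[cos(x), sin(x)], [-sin(x), cos(x)]])
dU = U.apply_map(lambda e: diff(e, x))            # entrywise derivative U'
print(dU - A * U)                                 # zero matrix: U' = A*U
print(U.det().simplify_full())                    # 1, so U is invertible
# With y = U*z the system forces dz/dx = 0, so z is constant and y is classical.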
But Singular Coefficients? Here’s Where the Weird Happens
If coefficients become singular (say, not defined at a point), all bets are off.
Example 1:
\[x \frac{d y}{dx} = 0\] Classically: \(y\) is constant on \(x<0\) and on \(x>0\), possibly with a jump at \(x=0\).
Generalized solutions include:
- \(y_1 = 1\)
- \(y_2 = \theta(x)\), the Heaviside step function (checked distributionally below)
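Why does \(\theta(x)\) qualify? Pairing \(x\,\theta'(x)\) with any test function \(\varphi\) gives \((x\theta', \varphi) = -(\theta, (x\varphi)') = -\int_0^\infty (x\varphi)'\,dx = 0\). Here is a quick numerical sketch in SageMath (the bump test function is my own choice):
# Check that y2 = theta(x) solves x*dy/dx = 0 in the distributional sense:
# (x*theta', phi) = -(theta, (x*phi)') = -∫_0^∞ (x*phi(x))' dx = 0.
phi = exp(-1/(1 - x^2))                                 # bump test function on (-1, 1)
pairing = -numerical_integral(diff(x*phi, x), 0, 1)[0]
print(pairing)                                          # ≈ 0: theta(x) is a generalized solution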
Example 2:
\[-2x^3 \frac{dy}{dx} = y \] Classical solution (for \(x \neq 0\)): \[ y = C\,e^{1/(4x^2)}, \] which blows up at \(0\) faster than any power of \(1/|x|\). In the generalized setting: no nonzero solution exists, because such growth cannot be regularized into a distribution.
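Both claims are easy to check symbolically in Sage (the test power \(x^{100}\) below is just an arbitrary illustration of "faster than any power"):
# Verify the classical solution of -2*x^3*dy/dx = y away from x = 0,
# and observe that it blows up at 0 faster than any power of 1/|x|.
y = exp(1/(4*x^2))
print(bool(-2*x^3*diff(y, x) == y))       # True: it solves the equation for x ≠ 0
print(limit(x^100 * y, x=0, dir='+'))     # +Infinity: no power of x can tame the blow-up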
The Indefinite Integral: Always Possible in the Generalized World
Can every generalized function \(f\) be the derivative of some generalized function \(g\)?
Yes!
For the equation \[\frac{d g}{dx} = f,\] we first define \(g\) on derivatives of test functions via \[ (g, -\varphi') = (f, \varphi).\] Since the functions \(-\varphi'\) are exactly the test functions with zero integral, this determines \(g\) on all of them. We then extend \(g\) to all of \(K\) using the same decomposition trick: \[\varphi = \varphi_1 \int \varphi + \varphi_0,\] so that \[(g, \varphi) = (g, \varphi_0) + C \int \varphi.\] Hence \[g = g_0 + C,\] where \(g_0\) is the particular primitive determined by its action on zero-integral test functions and \(C\) is an arbitrary constant generalized function.
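As a sanity check, \(\theta(x)\) is a primitive of \(\delta(x)\) in exactly this sense: \((\theta, -\varphi') = -\int_0^\infty \varphi'(x)\,dx = \varphi(0) = (\delta, \varphi)\). A quick numerical sketch (with my usual bump test function):
# g = theta(x) is a primitive of f = dirac_delta(x): (g, -phi') = phi(0) = (f, phi).
phi = exp(-1/(1 - x^2))                              # bump test function on (-1, 1)
lhs = -numerical_integral(diff(phi, x), 0, 1)[0]     # (theta, -phi') = -∫_0^∞ phi'(x) dx
rhs = phi.subs(x=0).n()                              # (delta, phi) = phi(0) = e^(-1)
print(lhs, rhs)                                      # both ≈ 0.3679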
Inhomogeneous Systems and Higher-Order Equations: Reduce to Integration
Any inhomogeneous system \[\frac{d y_i}{dx} + \sum_j a_{ij} y_j = f_i\] can be handled with the same trick. Let \(y = U z\) again, where \(U\) is a fundamental matrix of the corresponding homogeneous system, so that the terms coming from the derivative of \(U\) cancel the coefficient terms and \[U \frac{dz}{dx} = f \Rightarrow \frac{dz}{dx} = U^{-1} f.\] This reduces the problem to finding primitives of generalized functions. The same idea works for higher-order equations: \[y^{(m)} + a_1 y^{(m-1)} + \dots + a_m y = f.\] A concrete one-dimensional example is sketched below.
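Here is that one-dimensional example, entirely my own illustration: for \(y' + y = \delta(x)\), the homogeneous equation has fundamental solution \(U = e^{-x}\), so \(z' = U^{-1}\delta = e^{x}\delta(x) = \delta(x)\), whose primitive is \(\theta(x) + C\), giving \(y = e^{-x}\big(\theta(x) + C\big)\). Sage confirms the particular solution with \(C = 0\):
# Particular solution of y' + y = delta(x), found by reducing to a primitive:
# z' = e^x*delta(x) = delta(x), so z = theta(x) and y = e^(-x)*theta(x).
y = exp(-x) * heaviside(x)
print(diff(y, x) + y)    # e^(-x)*dirac_delta(x), which equals dirac_delta(x) as a distribution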
Summary: When Classical Meets Generalized
- We can differentiate, add, and multiply by smooth functions in the generalized function setting.
- For smooth coefficients, generalized solutions match classical ones.
- For singular coefficients, generalized solutions can be richer—or nonexistent!
- Inhomogeneous systems and higher-order equations ultimately reduce to finding indefinite integrals of generalized functions.
Generalized functions don’t "break" calculus—they complete it.
They allow us to solve equations that classical functions can’t handle, give meaning to impulses and jumps, and bridge the finite with the infinite.
Heaviside Function \(\theta(x)\) and Its Derivative (Dirac Delta)
# Define the Heaviside step function using Sage's piecewise
theta = piecewise([((-5, 0), 0), ((0, 5), 1)])
# Plot the Heaviside function
p1 = plot(theta, (x, -5, 5), color='blue', legend_label='Heaviside θ(x)')
# Approximate its derivative, the Dirac delta, by a narrow Gaussian of width eps
eps = 0.1
delta_approx = 1/(sqrt(pi)*eps) * exp(-(x/eps)^2)
# Plot the delta approximation near zero
p2 = plot(delta_approx, (x, -1, 1), color='red', legend_label='approximate δ(x)')
# Show both plots
show(p1)
show(p2)
First-Order ODE with Singular Coefficient — New Generalized Solutions
# Classical solution of x*dy/dx = 0: the constant function y = C
C1 = 3 # example constant
p1 = plot(C1, (x, -5, 5), color='blue', legend_label='y = 3 (classical)')
# Define Heaviside function theta(x)
theta = piecewise([((-5,0), 0), ((0,5), 1)])
# Plot Heaviside function (generalized solution)
p2 = plot(theta, (x, -5, 5), color='red', legend_label='θ(x) (generalized)')
# Show both plots
show(p1 + p2, figsize=4)
💡 Try It Yourself! You can copy and paste the code above directly into the SageMath cell here: Run SageMath Code Here