Understanding Delta Function Approximations: Sinc-Based Approximation (Fourier Kernel)

Delta-Convergent Sequences — Refined Blog with SageMath Symbolics, Physics Insights, and Cleaner Code

In the previous blog, we covered the Lorentzian Delta Sequence (Cauchy Kernel) and the Gaussian Approximation (Heat Kernel). Let's take one more step and explore the Sinc-Based Approximation (Fourier Kernel).

Why Study These Approximations?

Delta functions are central in many fields:

  • Signal Processing: Ideal impulse, filter response
  • Physics: Point charges/masses, Green's functions
  • Spectral Theory: Lorentzian profiles in resonance
  • Diffusion Models: Gaussians arise from the heat equation
  • Numerics: Regularizing singular integrals

Each kernel has a story to tell.

Sinc-Based Approximation (Fourier Kernel)

Formula (with the value at \( x = 0 \) defined separately): \[ f_{\nu}(x) = \begin{cases} \frac{\sin(\nu x)}{\pi x}, & x \neq 0 \\ \frac{\nu}{\pi}, & x = 0 \end{cases} \]

  • Oscillatory, arising from Fourier analysis
  • Not always positive, unlike the Lorentzian and Gaussian kernels (see the quick check right after this list)
  • Still integrates to 1
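A quick check of the "not always positive" claim (an illustrative snippet, not from the original post): evaluate the kernel at a point where \( \sin(\nu x) < 0 \), for example \( x = 3\pi/(2\nu) \).

# The kernel sin(nu*x)/(pi*x) is negative wherever sin(nu*x) < 0,
# e.g. at x = 3*pi/(2*nu), where it equals -2*nu/(3*pi^2)
nu0 = 30
x0 = 3*pi/(2*nu0)
(sin(nu0*x0)/(pi*x0)).n()   # ≈ -2.026 < 0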

#Define the Function

var('x nu')
# Defined away from x = 0; the value at x = 0 is nu/pi (recovered below via a limit)
f_sinc(x, nu) = (1/pi) * (sin(nu * x) / x)
f_sinc(x, nu)

#Symbolic Integration Check
var('xi')
assume(nu > 0)  # Ensure nu is positive
integral(f_sinc(xi, nu), xi, -oo, oo).simplify_full()  # expected result: 1

#Limit at x = 0
limit(f_sinc(x, nu), x=0)  # expected: nu/pi

#Alternative Approach: Numerical Evaluation
x_vals = [0.1, 0.01, 0.001, 0.0001]
[f_sinc(xv, 30).n() for xv in x_vals]  # values approach nu/pi = 30/pi ≈ 9.549

#Integral Test (Distributional Behavior)
var('a b')
assume(a < 0, b > 0)  # Ensure a < 0 < b to match delta behavior
integral(f_sinc(xi, nu), xi, a, b).simplify_full()
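Over a finite interval with \( a < 0 < b \), the antiderivative is the sine integral, so the symbolic result can be written as \[ \int_a^b \frac{\sin(\nu x)}{\pi x}\,dx = \frac{\operatorname{Si}(\nu b) - \operatorname{Si}(\nu a)}{\pi} \;\longrightarrow\; 1 \quad (\nu \to \infty), \] since \( \operatorname{Si}(t) \to \pm\pi/2 \) as \( t \to \pm\infty \). Any interval containing the origin therefore captures the full unit mass in the limit, exactly as a delta function would.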

#Numerical Verification

import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import quad

def sinc_integral(nu, a=-1, b=1):
    # np.sinc(t) = sin(pi*t)/(pi*t), so this equals sin(nu*x)/(pi*x)
    # while handling the removable singularity at x = 0
    integrand = lambda x: (nu / np.pi) * np.sinc(nu * x / np.pi)
    return quad(integrand, a, b, limit=200)[0]  # extra subdivisions for the oscillatory integrand

# Test for different ν values
nu_values = np.linspace(10, 100, 50)
integral_values = [sinc_integral(nu) for nu in nu_values]

# Plotting
plt.figure(figsize=(8, 5))
plt.plot(nu_values, integral_values, marker='o', linestyle='-', color='blue')
plt.axhline(y=1, color='r', linestyle='--', label="Expected Limit (1)")
plt.xlabel(r"$\nu$")
plt.ylabel(r"Integral Value")
plt.title("Numerical Verification: Sinc Integral Convergence")
plt.legend()
plt.grid(True)
plt.show()
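Beyond integrating to 1, the defining feature of a delta sequence is the sifting property: \( \int f_\nu(x)\,\varphi(x)\,dx \to \varphi(0) \) for smooth test functions \( \varphi \). Here is a minimal numerical sketch (not from the original post) using the example test function \( \varphi(x) = e^{-x^2} \), so the target value is \( \varphi(0) = 1 \).

import numpy as np
from scipy.integrate import quad

def sift(nu, a=-5, b=5):
    # integral of f_nu(x) * phi(x); should approach phi(0) = 1 as nu grows
    phi = lambda x: np.exp(-x**2)                        # example test function
    kernel = lambda x: (nu / np.pi) * np.sinc(nu * x / np.pi)
    return quad(lambda x: kernel(x) * phi(x), a, b, limit=1000)[0]

for nu_val in (10, 30, 100):
    print(nu_val, sift(nu_val))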

#Plot the Sinc Function

p1 = plot(f_sinc(x, 10), (x, -5, 5), color='red', legend_label="ν=10") + \
     plot(f_sinc(x, 30), (x, -5, 5), color='blue', legend_label="ν=30") + \
     plot(f_sinc(x, 100), (x, -5, 5), color='green', legend_label="ν=100")

p1.show(title="Sinc Approximation to δ(x)", ymin=-1, ymax=3)

#First & Second Derivative Computation

f_sinc_prime(x, nu) = diff(f_sinc(x, nu), x)
f_sinc_double_prime(x, nu) = diff(f_sinc_prime(x, nu), x)

f_sinc_prime(x, nu), f_sinc_double_prime(x, nu)
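In the distributional sense these approximate \( \delta'(x) \) and \( \delta''(x) \): for a smooth test function \( \varphi \), integration by parts gives \[ \int f_\nu'(x)\,\varphi(x)\,dx = -\int f_\nu(x)\,\varphi'(x)\,dx \;\longrightarrow\; -\varphi'(0), \] and likewise the second derivative picks out \( +\varphi''(0) \).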

#Plot the Derivatives
p1 = plot(f_sinc_prime(x, 10), (x, -5, 5), color='red', legend_label="ν=10") + \
     plot(f_sinc_prime(x, 30), (x, -5, 5), color='blue', legend_label="ν=30") + \
     plot(f_sinc_prime(x, 100), (x, -5, 5), color='green', legend_label="ν=100")

p1.show(title="First Derivative of Sinc Approximation")

p2 = plot(f_sinc_double_prime(x, 10), (x, -5, 5), color='red', legend_label="ν=10") + \
     plot(f_sinc_double_prime(x, 30), (x, -5, 5), color='blue', legend_label="ν=30") + \
     plot(f_sinc_double_prime(x, 100), (x, -5, 5), color='green', legend_label="ν=100")

p2.show(title="Second Derivative of Sinc Approximation")

p2.show(title="Second Derivative of Gaussian Delta Approximation")

#Integration of the Sinc Sequence
# Compute the symbolic integral over a finite range a to b, with a < 0 < b
var('a b')
assume(a < 0, b > 0)
integral(f_sinc(x, nu), x, a, b).simplify_full()

#Plotting the Integrated Sequences
# Plot the cumulative integral from -5 to x; it approaches a unit step at the origin
var('t')
F = integral(f_sinc(t, 30), t, -5, x)
p1 = plot(F, (x, -5, 5), color='red', legend_label="Sinc, ν=30")
p1.show(title="Integrated Delta Approximation")

💡 Try It Yourself! You can copy and paste the code above directly into an online SageMath cell: Run SageMath Code Here

Physics Note
The sinc kernel comes straight from Fourier theory: it is the inverse Fourier transform of the indicator function of the frequency band \( [-\nu, \nu] \) (up to normalization), and it is the basis of Shannon's sampling theorem.
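As a small illustration of that connection (a sketch, not from the original post): a band-limited signal sampled with spacing \( T \) can be reconstructed by sinc interpolation, \( f(t) = \sum_n f(nT)\,\mathrm{sinc}\!\big((t - nT)/T\big) \). The signal, spacing, and test point below are just example choices.

import numpy as np

T = 0.1                                   # sample spacing (example; Nyquist rate is 5 Hz)
n = np.arange(-50, 51)                    # finite window of sample indices
f = lambda t: np.sin(2*np.pi*1.5*t) + 0.5*np.cos(2*np.pi*3.0*t)   # band-limited example signal
samples = f(n*T)

def reconstruct(t):
    # Whittaker–Shannon interpolation: sum of shifted sinc kernels weighted by the samples
    return np.sum(samples * np.sinc((t - n*T) / T))

t_test = 0.237
print(reconstruct(t_test), f(t_test))     # should agree closely (small error from the finite window)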
