Financial Signal Processing in Python XIII - Signal Spikes
Part XIII of Decomposing Time Series and Understanding their Components
This article is part XIII of a series of articles that present the signal processing field in an easy and straightforward manner. Financial Signal Processing (FSP) is the application of signal processing techniques to financial time series, such as stock prices, returns, volatility, or economic indicators.
It treats price movements like signals—similar to audio—and applies filters, transformations, and models to extract hidden patterns, reduce noise, or make forecasts.
There are many concepts in FSP that are worth discussing, and throughout these special articles, I will try to present each one with functioning code that shows how to use and interpret it:
Decomposition
Empirical Mode Decomposition (EMD)
Principal Component Analysis (PCA)
Filters
Moving Averages
Kalman Filter
Wiener Filter
Spectral Analysis
Fourier Transform
Wavelet Transform
Denoising
Wavelet Denoising
Savitzky–Golay Filter
Anomaly Detection
Recurrence Plots
Entropy
Signal Spikes 👈🏻
Introduction to Signal Spikes
A spike is a short-lived, high-amplitude excursion that stands out from the background dynamics. Intuitively, it’s the kind of event you’d notice if you skim a plot: a needle in a sea of slower drift and noise. Spikes are common across domains—neural action potentials, dropouts in sensor feeds, network traffic bursts, order-book glitches, seismic picks, or price jumps.
Let x_t be a discrete-time signal. A useful model decomposes it into:

x_t = b_t + η_t + Σ_k a_k κ(t − τ_k)

where:
b_t: slowly varying baseline (trend/seasonality),
η_t: background noise (often heavy-tailed or heteroskedastic),
κ(⋅): spike shape (an impulse δ, a narrow Gaussian, or an instrument response),
a_k, τ_k: spike amplitude and time.
Detecting spikes means estimating the event set {τ_k} (and optionally a_k) despite drift b_t and noise η_t.
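To make the model concrete, here is a minimal sketch that synthesizes a signal from this decomposition, assuming a sinusoidal baseline and a narrow Gaussian kernel for κ (all parameter values are illustrative, not prescriptive):

import numpy as np

# Minimal sketch of x_t = b_t + eta_t + sum_k a_k * kappa(t - tau_k)
rng = np.random.default_rng(0)
N = 1000
t = np.arange(N)
b = 0.5 * np.sin(2 * np.pi * t / 500)   # slowly varying baseline b_t
eta = rng.normal(0, 0.3, size=N)        # background noise eta_t

def kappa(lag, width=2.0):
    # Narrow Gaussian spike shape kappa(.)
    return np.exp(-0.5 * (lag / width) ** 2)

taus = np.array([150, 420, 800])        # spike times tau_k
amps = np.array([4.0, -3.0, 5.0])       # spike amplitudes a_k

x = b + eta
for a_k, tau_k in zip(amps, taus):
    x += a_k * kappa(t - tau_k)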
Random walks, seasonality, and volatility swings change the local mean and variance. If you standardize with a global mean and variance, you will either miss spikes during high-volatility periods or flag too many during quiet periods. Robust detection uses local, outlier-resistant estimates of level and scale.
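A quick synthetic illustration of this pitfall (the exact numbers depend on the random seed):

import numpy as np

rng = np.random.default_rng(1)
# Two volatility regimes: a quiet first half and a noisy second half.
quiet = rng.normal(0, 0.2, 500)
loud = rng.normal(0, 2.0, 500)
y = np.concatenate([quiet, loud])
y[100] += 1.5   # a spike that is large relative to the quiet regime

# Global standardization: one mean and std for the whole series.
z_global = (y[100] - y.mean()) / y.std()
print("global z at spike:", round(z_global, 2))   # ~1: missed at a threshold of 3

# Standardizing with the quiet regime's own scale rates it correctly.
z_local = (y[100] - quiet.mean()) / quiet.std()
print("local z at spike:", round(z_local, 2))     # well above 3: detected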
For a window W_t around time t, compute the rolling median:

m_t = median{ x_s : s ∈ W_t }

and the Median Absolute Deviation (MAD):

MAD_t = median{ |x_s − m_t| : s ∈ W_t }

Convert MAD to a Gaussian-equivalent scale via σ_t = 1.4826 · MAD_t. The robust z-score is:

r_t = (x_t − m_t) / σ_t

Positive spikes: r_t ≥ τ
Negative spikes: r_t ≤ −τ
Keep only local extrema and enforce a minimum separation (refractory period) to avoid double-counting broad peaks.
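To make the formulas concrete, here is a tiny worked example on a single window (values chosen arbitrarily):

import numpy as np

# Toy window: quiet values plus one obvious outlier.
w = np.array([10.0, 10.2, 9.9, 10.1, 14.0])
m = np.median(w)                   # median -> 10.1
mad = np.median(np.abs(w - m))     # MAD -> 0.1
sigma = 1.4826 * mad               # Gaussian-equivalent scale -> ~0.148
r = (w - m) / sigma                # robust z-scores
print(r.round(2))                  # the 14.0 point scores ~26.3, far above a threshold of 3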
This type of algorithm is commonly used for event detection in logs, outlier cleaning before modeling, triggered sampling, volatility monitoring, neural spike sorting (as a first pass), seismology picks, and anomaly alerts.
Do you want to master Deep Learning techniques tailored for time series, trading, and market analysis?
My book breaks it all down from basic Machine Learning to complex multi-period LSTM forecasting while going through concepts such as Fractional Differentiation and Forecasting Thresholds. Get your copy here 📖!
Application of the Algorithm
The snippet below:
Builds a random walk,
Injects a few synthetic spikes (for illustration),
Detects spikes using the rolling median + MAD method,
Plots the signal, robust z-scores, and local scale.
import numpy as np
import matplotlib.pyplot as plt
np.random.seed(7)
# --- 1) Random walk (integrated white noise) ---
N = 2500
noise = np.random.normal(0, 1.0, size=N)
x = np.cumsum(noise)
# Inject a few ground-truth spikes (both polarities) for visualization
spike_idx = np.array([300, 550, 900, 1325, 1700, 1975])
spike_amp = np.array([12, -10, 15, -8, 9, -11])
x_with_spikes = x.copy()
x_with_spikes[spike_idx] += spike_amp
def robust_rolling_median_MAD(y, window):
"""
Rolling median and MAD with reflect padding.
Simple O(n*window) implementation for clarity.
"""
if window % 2 == 0:
window += 1
pad = window // 2
y_pad = np.pad(y, pad_width=pad, mode='reflect')
med = np.zeros_like(y, dtype=float)
mad = np.zeros_like(y, dtype=float)
for i in range(len(y)):
w = y_pad[i:i+window]
m = np.median(w)
med[i] = m
mad[i] = np.median(np.abs(w - m))
return med, mad
def detect_spikes(y, window=81, threshold=3.0, min_separation=20, both_polarities=True):
"""
Robust local z-score spike detector.
Returns (indices, r, med, sigma).
"""
if window % 2 == 0:
window += 1
med, mad = robust_rolling_median_MAD(y, window)
sigma = 1.4826 * mad + 1e-12
r = (y - med) / sigma
# Candidates crossing threshold
over = np.where(np.abs(r) >= threshold)[0] if both_polarities else np.where(r >= threshold)[0]
# Keep only local extrema and enforce minimum separation
keep = []
last = -np.inf
for i in over:
if i == 0 or i == len(y)-1:
continue
is_peak = (r[i] > 0 and r[i] >= r[i-1] and r[i] >= r[i+1]) or \
(r[i] < 0 and r[i] <= r[i-1] and r[i] <= r[i+1] and both_polarities)
if not is_peak:
continue
if i - last < min_separation:
# Keep the stronger one if too close
if abs(r[i]) > abs(r[int(last)]):
keep[-1] = i
last = i
continue
keep.append(i)
last = i
return np.array(keep, dtype=int), r, med, sigma
# --- 2) Detect ---
idxs, r, med, sigma = detect_spikes(
x_with_spikes, window=81, threshold=3.0, min_separation=20, both_polarities=True
)
# --- 3) Plot ---
fig = plt.figure(figsize=(12, 8))
gs = fig.add_gridspec(3, 1, height_ratios=[2,1,1], hspace=0.35)
ax0 = fig.add_subplot(gs[0])
ax0.plot(x_with_spikes, color='#1f77b4', lw=1.2, label='Signal (random walk + spikes)')
ax0.plot(med, color='orange', lw=1.0, alpha=0.9, label='Rolling median')
ax0.scatter(idxs, x_with_spikes[idxs], color='crimson', s=50, zorder=3, label='Detected spikes')
ax0.set_title('Spike Detection on a Random Walk (Rolling Median + MAD)')
ax0.set_xlabel('Time index'); ax0.set_ylabel('Amplitude')
ax0.legend(loc='upper left', ncol=3, fontsize=9); ax0.grid(alpha=0.3)
ax1 = fig.add_subplot(gs[1])
ax1.plot(r, color='purple', lw=1.0, label='Robust z-score $r_t$')
thr = 3.0
ax1.axhline(thr, color='red', ls='--', lw=1.0, label='± threshold')
ax1.axhline(-thr, color='red', ls='--', lw=1.0)
ax1.scatter(idxs, r[idxs], color='black', s=20, zorder=3)
ax1.set_ylabel('$r_t$'); ax1.legend(loc='upper left', ncol=2, fontsize=9); ax1.grid(alpha=0.3)
ax2 = fig.add_subplot(gs[2])
ax2.plot(sigma, color='teal', lw=1.0, label=r'Local scale $\hat{\sigma}_t = 1.4826\cdot MAD$')
ax2.set_xlabel('Time index'); ax2.set_ylabel(r'$\hat{\sigma}_t$')
ax2.legend(loc='upper left', fontsize=9); ax2.grid(alpha=0.3)
fig.suptitle('Signal spikes: robust detection with local median and MAD', y=0.98, fontsize=14)
plt.show()
print("Detected indices:", idxs.tolist())The following chart is the output of the code.
✨ The Weekly Market Sentiment Report is evolving into The Signal Beyond 🚀.
This isn’t just a sentiment check anymore. It’s becoming a full market intelligence package with expanded technical scorecards, refined sentiment models, and machine learning forecasts. From classic tools that have stood the test of time to fresh innovations like multi-market RSI heatmaps, volatility regime dashboards, and a pairs trading recommendation system, the new report is designed to give you a sharper edge in navigating the markets.
Free trial available.

Rolling z-score (momentum 81). Source: TradingView.