Geophysical time series are sometimes sampled irregularly along the time axis. This situation is particularly frequent in palaeoclimatology. Yet there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular.

Here we provide such a framework. To this end, we define the scalogram as the
continuous-wavelet-transform equivalent of the extended Lomb–Scargle
periodogram defined in Part 1 of this study

The continuous wavelet transform (CWT) is widely used for the time–frequency
analysis of geophysical time series, mainly through its scalogram, which is
the squared modulus of the CWT. The CWT relies on a probing function, called
the mother wavelet. A common choice for the mother wavelet is the Morlet
wavelet
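As an illustration, here is a minimal numerical sketch of the Morlet wavelet and of a scalogram computed by direct convolution. This is a generic construction, not the estimator developed in this article; the simplified Morlet wavelet (without the admissibility correction term) is assumed, with angular-frequency parameter omega0 = 6.

```python
import numpy as np

def morlet(t, omega0=6.0):
    """Simplified Morlet mother wavelet: a complex exponential under a
    Gaussian envelope. The correction term making it strictly admissible
    is negligible for omega0 >= 6 and is omitted here."""
    return np.pi**-0.25 * np.exp(1j * omega0 * t - t**2 / 2.0)

def cwt_morlet(x, dt, scales, omega0=6.0):
    """CWT of a regularly sampled series, by direct convolution at each
    scale, with L2-normalised daughter wavelets."""
    n = len(x)
    t = (np.arange(n) - n // 2) * dt   # kernel time axis, centred on zero
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # The Gaussian envelope has standard deviation s at scale s, so the
        # effective support is about 6*s (the 6-sigma rule quoted below).
        kernel = np.conj(morlet(-t / s, omega0)) / np.sqrt(s)
        W[i] = np.convolve(x, kernel, mode="same") * dt
    return W

def scalogram(x, dt, scales, omega0=6.0):
    """Squared modulus of the CWT."""
    return np.abs(cwt_morlet(x, dt, scales, omega0))**2
```

For a pure cosine, the scalogram peaks at the scale corresponding to the cosine's period through the usual Morlet scale-to-period relation.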

Studies in climate and weather: analysis of the El Niño–Southern Oscillation in

Studies in palaeoclimate: analysis of the astronomical forcing in

This problem was addressed by Foster in a series of articles that share a
common thread with our two papers: he first generalises the Lomb–Scargle
periodogram, based on orthogonal projection methods, in

An application of Foster's formulas to palaeoclimate
data is found in

In this article, we extend the analysis tools that we derived in the first
part of this study

Most of the mathematical concepts and notations are introduced in the first
part of this study

The mathematical background to Fourier analysis is given in
Appendix

The wavelet power spectrum (WPS) of a continuous-time stochastic process

In this article, we choose the mother wavelet

The
admissibility criterion is required for

We have

The effective length of the support of the Gaussian may be approximated by 6 times its standard deviation.

There are two common choices for

The parameter

The Morlet wavelet is often used to detect periodicities in a signal, and
it is therefore convenient to convert scales
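A widely used scale-to-period conversion for the Morlet wavelet is the one of Meyers et al. (1993) and Torrence and Compo (1998); the exact relation used in this article may be stated differently, but the following form is standard:

```python
import numpy as np

def scale_to_period(s, omega0=6.0):
    """Equivalent Fourier period of the Morlet wavelet at scale s,
    T = 4*pi*s / (omega0 + sqrt(2 + omega0**2)), following the common
    Meyers et al. (1993) / Torrence & Compo (1998) convention."""
    return 4.0 * np.pi * np.asarray(s) / (omega0 + np.sqrt(2.0 + omega0**2))
```

For omega0 = 6, a scale of 1 corresponds to a period of about 1.033, so scales and periods are nearly interchangeable.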

Reconstruction of a signal can be performed with the CWT along the

Another type of ridge is the

This can be extended to signals with slowly varying amplitude and phase

By construction, ridge filtering is well adapted to filtering a
multi-periodic signal, even when it is embedded in a noisy environment
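The idea can be sketched as follows (a toy implementation, not the ridge-filtering formulas of this article): at each time, follow the scale maximising the CWT modulus, and take the real part of the CWT along that ridge. The simplified Morlet wavelet and an L2-normalised convolution CWT are assumed.

```python
import numpy as np

def morlet(t, omega0=6.0):
    # Simplified Morlet wavelet (admissibility correction omitted).
    return np.pi**-0.25 * np.exp(1j * omega0 * t - t**2 / 2.0)

def cwt(x, dt, scales, omega0=6.0):
    n = len(x)
    t = (np.arange(n) - n // 2) * dt
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        kernel = np.conj(morlet(-t / s, omega0)) / np.sqrt(s)
        W[i] = np.convolve(x, kernel, mode="same") * dt
    return W

def ridge_filter(x, dt, scales, omega0=6.0):
    """Amplitude-ridge filtering: at each time, pick the scale maximising
    |W| and return the real part of the CWT along that ridge (up to a
    scale-dependent normalisation, dropped in this sketch)."""
    W = cwt(x, dt, scales, omega0)
    ridge = np.argmax(np.abs(W), axis=0)   # ridge scale index at each time
    return np.real(W[ridge, np.arange(len(x))])
```

For a noisy sinusoid, the reconstruction is strongly correlated with the clean component, since the narrow-band wavelet suppresses most of the broadband noise.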

Finally, we mention that the scalogram can be written within the formalism of
orthonormal projections. Indeed, defining

We consider the same model as in Part 1:

When applying the CWT to finite discrete time series, a choice for the
discretisation must be made. In the influential paper of
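A common choice is the dyadic discretisation s_j = s0 * 2**(j*dj) popularised by Torrence and Compo (1998); a minimal sketch (the variable names are ours):

```python
import numpy as np

def dyadic_scales(s0, dj, n, dt):
    """Dyadic scale discretisation s_j = s0 * 2**(j*dj), j = 0..J, with J
    chosen so that the largest scale does not exceed the record length
    n*dt (one conventional choice among others)."""
    j_max = int(np.floor(np.log2(n * dt / s0) / dj))
    return s0 * 2.0 ** (dj * np.arange(j_max + 1))
```

Smaller dj gives a finer (more redundant) sampling of the scale axis; dj = 0.25 or less is typical for the Morlet wavelet.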

The convolution theorem for continuous-time functions is
given in Appendix

When the time series is regularly sampled, the scalogram, given by Eq. (

It is computed from the R package
PALINSOL (

Equation (

where

Analogously to Sect. 4.3 of Part 1, we extend the scalogram to take into
account the presence of a polynomial trend of degree

The scalogram suffers from the same inconsistency issue as the periodogram,
in the sense that it remains very noisy regardless of the number of data
points we have at our disposal
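The same inconsistency is easily checked numerically on the classical periodogram of Gaussian white noise: the spread of the raw periodogram around its mean does not shrink as the record grows (an illustrative sketch, not the estimator of this article).

```python
import numpy as np

rng = np.random.default_rng(1)

def periodogram_spread(n, reps=200):
    """Standard deviation of the raw periodogram of unit-variance white
    noise at interior frequencies, pooled over many realisations. For an
    Exp(1)-distributed ordinate this stays near 1 whatever n is."""
    vals = []
    for _ in range(reps):
        x = rng.standard_normal(n)
        p = np.abs(np.fft.rfft(x))**2 / n   # raw periodogram, mean ~ 1
        vals.append(p[1:-1])                # drop the DC and Nyquist bins
    return np.std(vals)
```

Lengthening the record refines the frequency resolution but leaves the pointwise variance of the raw periodogram unchanged, which is why smoothing is needed.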

The scalogram often looks

Smoothing is traditionally performed by averaging the scalogram over
neighbouring points in the time–scale plane, either by averaging over times
followed by averaging over scales
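A minimal sketch of such a smoothing, with fixed box-car windows (practical schemes, including the one adopted below, typically make the time window scale-dependent):

```python
import numpy as np

def smooth_scalogram(power, n_time, n_scale):
    """Box-car smoothing of an (n_scales x n_times) scalogram: a moving
    average over neighbouring times, then over neighbouring scales.
    Fixed window widths are assumed to keep the sketch short."""
    kt = np.ones(n_time) / n_time
    ks = np.ones(n_scale) / n_scale
    out = np.apply_along_axis(np.convolve, 1, power, kt, mode="same")
    out = np.apply_along_axis(np.convolve, 0, out, ks, mode="same")
    return out
```

Smoothing preserves the mean level of the scalogram (away from the edges) while reducing its variance, at the price of resolution in the time–scale plane.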

We adopt here the smoothing procedure of

We want to estimate the amplitude

Formula (I,118) is applied with the left-hand-side term changed to encompass
wavelet formalism.

Formula (I,129) is applied with the left-hand-side term changed to encompass
wavelet formalism:

As in Part 1, estimating the amplitude is more robust against noise when a
smoothing procedure is performed. We apply to the squared amplitude,
Eq. (

The weighted smoothed scalogram is the analogue of the

provide an estimation of the squared amplitude of a signal, locally in the time–frequency plane, by weighting the scalogram like in
Eq. (

conserve the advantage of the formalism of orthogonal projections in order to avoid the matrix inversions required for the computation of the
amplitude scalogram (see, for example, Eq.

When the wave packets

This has another implication: the maximal scale available by the analysis is

When probing the irregularly sampled time series with the wavelet packet, it
may happen that the period of the oscillation inside the packet,

Weighted (unsmoothed) scalogram of the time series presented in Fig.

We now justify Formula (

The SNEZ is applied to all the analysis tools defined above. When smoothing
is to be applied, it is performed on the areas outside of the SNEZ, since the
scalogram is not computed in the SNEZ. In the neighbourhood of the SNEZ,
adjustments of the smoothing procedure are therefore necessary, as explained
in Appendix

Scalogram of the time series

Scale-to-period conversion is performed in the continuous limit, with Eq. (

As illustrated in Fig.

in which the unknown is

With regularly sampled data, the discretised variable

The integrals in Eqs. (

We test for the presence of periodic components, locally in the
time–frequency plane. Significance testing is mathematically expressed as a
hypothesis test. Taking our model, Eq. (

To perform significance testing, we thus need

to estimate the parameters of the process under the null hypothesis (this is studied in Sect. 5.2 of Part 1);

to estimate the distribution of the scalogram under the null hypothesis (this is studied in Sect.
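The Monte Carlo route can be sketched as follows, here with a raw periodogram and an AR(1) null standing in for the article's (smoothed) scalogram and CARMA null; the fitting and simulation steps are deliberately simplistic.

```python
import numpy as np

rng = np.random.default_rng(2)

def ar1_fit(x):
    # Lag-1 regression estimate of the AR(1) coefficient and innovation std.
    x = x - x.mean()
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    sig = np.std(x[1:] - phi * x[:-1])
    return phi, sig

def mc_confidence(x, n_surr=300, q=95):
    """q-th percentile periodogram level under an AR(1) null, by Monte
    Carlo: fit the null model, simulate surrogates, take pointwise
    percentiles of their periodograms."""
    n = len(x)
    phi, sig = ar1_fit(x)
    surr_p = np.empty((n_surr, n // 2 + 1))
    for i in range(n_surr):
        e = rng.standard_normal(n) * sig
        s = np.empty(n)
        s[0] = e[0] / np.sqrt(1.0 - phi**2)   # stationary initialisation
        for k in range(1, n):
            s[k] = phi * s[k - 1] + e[k]
        surr_p[i] = np.abs(np.fft.rfft(s))**2 / n
    return np.percentile(surr_p, q, axis=0)
```

A genuine periodic component then shows up as a periodogram ordinate far above the confidence level, while the background stays mostly below it.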

Analytical confidence levels as a function of the number of conserved
moments, at six particular pairs

The results obtained for the periodogram in Sect. 5.3 of Part 1 are valid for
the scalogram, with minor changes that we detail below.

Monte Carlo approach: the same procedure as in Part 1 is applied to the (weighted) smoothed scalogram, Eq. (

Analytical approach (with a unique set of CARMA parameters):

Theorem 1 of Part 1 can be applied to the (weighted) smoothed scalogram, as follows.

The (weighted) smoothed scalogram, defined in Eq. (

The symbol

The

For a Gaussian white noise background with variance

where

The variance of the distribution of the (weighted) smoothed scalogram at

We approximate the linear combination of the independent chi-square distributions, appearing in Eq. (

We observe, however, that the convergence of the percentiles (as the number
of conserved moments grows) strongly depends on the smoothing coefficient
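The classical two-moment special case (the Welch–Satterthwaite approximation) is easy to sketch; the article conserves a larger number of moments to better capture the tail of the distribution.

```python
import numpy as np

def match_two_moments(a):
    """Approximate Q = sum_i a_i * chi2_1 (independent terms, a_i > 0) by a
    single scaled chi-square g * chi2_h, conserving the first two moments:
    E[Q] = sum(a) = g*h  and  Var[Q] = 2*sum(a**2) = 2*g**2*h."""
    a = np.asarray(a, dtype=float)
    g = np.sum(a**2) / np.sum(a)
    h = np.sum(a)**2 / np.sum(a**2)
    return g, h
```

Percentiles of Q are then read off the chi-square distribution with (non-integer) h degrees of freedom, scaled by g; conserving more moments refines this approximation.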

A comparison between the computing times of the Monte Carlo approach and the analytical approach is presented in Appendix

From Sect.

Consider a signal

Analogously to the

Weighted smoothed scalogram (left) and its global scalogram (right)
with

Weighted smoothed scalogram (left) and its global scalogram (right)
with

The time series we use to illustrate the theoretical results is the benthic
foraminiferal

The weighted smoothed scalogram
(Sect.

The unsmoothed estimated amplitude (which is the square root of the
amplitude scalogram, Eq.

The parameters are

Filtered signal in the band [35, 45] kyr.

As explained in Sect.

WAVEPAL is a package, written in Python 2.X, that performs frequency and
time–frequency analyses of irregularly sampled time series, significance
testing against a stationary Gaussian CARMA(

We defined the scalogram as an extension of the generalised Lomb–Scargle
periodogram developed in Part 1. This analysis tool is well suited for
irregularly sampled time series which can be modelled as a locally periodic
component in the time–frequency plane, plus a polynomial trend, plus a
Gaussian CARMA stochastic process. In the particular case of trendless
regularly sampled time series, we showed that the unsmoothed scalogram gives the same results as the
traditional algorithms such as in

The Python code generating the figures of this article is available in the Supplement.

Strictly speaking, the Fourier transform and the convolution product cannot
be defined on

Parseval–Plancherel identities:

Convolution theorem:

Translation–modulation: the Fourier transform of

Dilation: the Fourier transform of
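For the reader's convenience, these standard identities can be written out explicitly; the ordinary-frequency convention below is an assumption and may differ from the one adopted in this appendix.

```latex
% Convention assumed here:
\hat f(\xi) = \int_{\mathbb{R}} f(t)\, e^{-2\pi i \xi t}\, \mathrm{d}t .
% Parseval--Plancherel:
\int_{\mathbb{R}} f(t)\, \overline{g(t)}\, \mathrm{d}t
  = \int_{\mathbb{R}} \hat f(\xi)\, \overline{\hat g(\xi)}\, \mathrm{d}\xi ,
\qquad \|f\|_2 = \|\hat f\|_2 .
% Convolution theorem:
\widehat{f * g} = \hat f\, \hat g .
% Translation--modulation:
\widehat{f(\cdot - b)}(\xi) = e^{-2\pi i \xi b}\, \hat f(\xi) ,
\qquad
\widehat{e^{2\pi i \xi_0 \cdot} f}(\xi) = \hat f(\xi - \xi_0) .
% Dilation (a > 0):
\widehat{a^{-1/2} f(\cdot / a)}(\xi) = a^{1/2}\, \hat f(a \xi) .
```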

The Fourier uncertainty principle states that the temporal variance and the
frequency variance of a function

In that book,
the Fourier uncertainty principle is called

We saw in Appendix

In that book,
the author calls them

Scale discretisation is naturally based on the geometry of the boxes. We can,
for example, require that the frequency component of the centre of mass of
the box corresponding to scale

Uncertainty boxes for the Morlet wavelet, with

Example of rule for the discretisation of scales taking into account
the geometry of the uncertainty boxes.

In the formulas of the smoothed (amplitude) scalogram, Eqs. (

Keep the length of integration equal to

Shorten the interval of integration so as not to exclude any extra region of the time–frequency plane from the analysis.

We review the only two rigorous studies that we have found in the literature about the estimation of the scalogram for irregularly sampled time series. Like our theory, they are based on the Lomb–Scargle periodogram and define a form of scalogram of the CWT with the Morlet mother wavelet. However, these theories are too restrictive to be of interest in geophysical applications.

In this section, we derive and comment on the formulas published in

Let us start with the approximation made in

In

Now, we derive Foster's scalogram from our theory. The diagonal elements of
the weight matrix

Note that Eq. (5-10) of

Below, we comment on the WWT and make a comparison with our formulas.

The WWT is built on the assumption that the time series holds a Gaussian-shaped trend centred at the probed translation
time

The WWT under the null hypothesis is only

The estimation of the variance of the white noise,

From our point of view, when working with weighted inner products, approximations like the one in Eq. (

The

Under the null hypothesis that the data

Time step of the

Scalogram of the time series presented in
Fig.

This appendix compares the scalograms and their confidence levels in the case
of interpolated and non-interpolated time series. The time series we consider
is the

Finally, we observe that, in this example, the power of the scalogram of the data is weakly affected by the interpolation.

Computing times for generating the scalogram with analytical (blue)
and MCMC (green) confidence levels, as a function of the number of data points
(placed on a regular time grid). Log–log scale.

A comparison of the computing times for generating the scalogram with
the analytical and with the MCMC confidence levels, under the hypothesis
of a red-noise background, is presented in Fig.

CPU type: SandyBridge 2.3 GHz. RAM: 64 GB.

With this parametrisation, and within this interval of the number of data
points, we see that the analytical approach is faster than the MCMC approach.
The analytical approach delivers computing times of the same order of
magnitude regardless of the percentile (the two blue curves in Fig.

The authors declare that they have no conflict of interest.

The authors are very grateful to Jean-Pierre Antoine, Reik Donner, Laurent Jacques, Lilian Vanderveken, and Samuel Nicolay for their comments on a preliminary version of the paper. This work is supported by the Belgian Federal Science Policy Office under contract BR/12/A2/STOCHCLIM. Guillaume Lenoir is currently supported by the FSR-FNRS grant PDR T.1056.15 (HOPES).

Edited by: Jinqiao Duan
Reviewed by: two anonymous referees