Table 1
Number of non-trivial floating-point operations required by different 1D FFT algorithms to compute the transform of a length-N data vector: Radix-2 (Rad2), Radix-4 (Rad4), Rader–Brenner (RB) [2, 3], Split-Radix (SR) [4, 5], and Quick Fourier Transform (QFT) [6, 7].
| N | Rad2 | Rad4 | RB | SR | QFT |
|---|---|---|---|---|---|
| 2⁴ | 176 | 168 | 168 | 168 | 77 |
| 2⁵ | 496 | – | 492 | 456 | – |
| 2⁶ | 1296 | 1184 | 1300 | 1160 | 587 |
| 2⁷ | 3216 | – | 3236 | 2824 | – |
| 2⁸ | 7696 | 6880 | 7748 | 6664 | 3491 |
| 2⁹ | 17936 | – | 17972 | 15368 | – |
| 2¹⁰ | 40976 | 36232 | 41016 | 34824 | 18293 |
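The Rad2 and SR columns follow the standard closed-form operation counts for these algorithms, 5N·log₂N − 10N + 16 and 4N·log₂N − 6N + 8 real operations respectively. A small sketch (function names are illustrative, not part of any library) that reproduces those two columns:

```cpp
#include <cassert>

// Standard closed-form counts of non-trivial real floating-point
// operations for a complex 1D FFT of length N = 2^m:
//   radix-2:      5*N*m - 10*N + 16
//   split-radix:  4*N*m -  6*N +  8
// These reproduce the Rad2 and SR columns of Table 1.
long radix2_flops(long N, long m)      { return 5 * N * m - 10 * N + 16; }
long split_radix_flops(long N, long m) { return 4 * N * m -  6 * N +  8; }
```

For example, N = 2⁴ gives 176 (Rad2) and 168 (SR), and N = 2¹⁰ gives 40976 and 34824, matching the table.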

Figure 1
Schematic of the eFFT-C++ system architecture. The flow begins with asynchronous events, which are converted into stimulus abstractions and processed by the core eFFT&lt;N&gt; library, built on a Radix-2 quadtree structure. The architecture supports operating on both single stimuli and stimulus batches. Intermediate coefficients are handled through the Eigen3 backend, and the current spectrum is exposed via getFFT(). Validation and benchmarking modules (Google Test and Google Benchmark) operate independently of the core, and a CMake-orchestrated build system provides header-only distribution and Python bindings for seamless integration.
Table 2
Concise public API of eFFT-C++. Types: cfloat = std::complex<float> and cfloatmat = Eigen::Matrix<cfloat, Eigen::Dynamic, Eigen::Dynamic>.
| Class | Method | Input | Output | Summary |
|---|---|---|---|---|
| eFFT<N> | eFFT | – | – | Builds lookup twiddles and allocates quadtree buffers. |
| eFFT<N> | ∼eFFT | – | – | Releases FFTW plans when enabled (no-op otherwise). |
| eFFT<N> | framesize | – | unsigned int | Compile-time frame size N as a runtime integer. |
| eFFT<N> | initialize | – | void | Initialize internal state from a zero image. |
| eFFT<N> | initialize | cfloatmat& | void | Initialize from an N×N complex image (Eigen matrix). |
| eFFT<N> | update | Stimulus& | bool | Apply one stimulus. Returns true if spectrum changed. |
| eFFT<N> | update | Stimuli& | bool | Apply a batch of stimuli; prunes redundancies. |
| eFFT<N> | getFFT | – | cfloatmat& | Current Fourier spectrum. |
| eFFT<N> | initializeGT | cfloatmat& | void | Prepare FFTW plan and set input image. |
| eFFT<N> | updateGT | Stimulus& | bool | Apply one stimulus. Returns true if spectrum changed. |
| eFFT<N> | updateGT | Stimuli& | bool | Apply a batch of stimuli; prunes redundancies. |
| eFFT<N> | getGTFFT | – | cfloatmat | Ground-truth FFT with FFTW (if enabled). |
| eFFT<N> | check | – | double | Norm of difference: ∥getFFT() - getGTFFT()∥. |
| Stimulus | on | – | Stimulus& | Set state to true. |
| Stimulus | off | – | Stimulus& | Set state to false. |
| Stimulus | set | bool | Stimulus& | Explicitly set state. |
| Stimulus | toggle | – | Stimulus& | Flip state (on/off). |
| Stimuli | on | – | void | Set all contained stimuli to true. |
| Stimuli | off | – | void | Set all contained stimuli to false. |
| Stimuli | set | bool | void | Apply same state to all contained stimuli. |
| Stimuli | toggle | – | void | Flip state of all contained stimuli. |
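The update/getFFT contract in Table 2 can be illustrated with a deliberately naive, self-contained sketch. This is not eFFT's Radix-2 quadtree algorithm (which avoids the O(N²) per-event cost shown here), and the class and parameter names below are hypothetical; it only demonstrates the underlying identity that toggling one binary pixel (x, y) shifts every DFT bin F(u, v) by ±e^(−2πi(ux+vy)/N), so the spectrum can be maintained incrementally instead of being recomputed:

```cpp
#include <cmath>
#include <complex>
#include <vector>

// Naive event-driven spectrum maintenance (illustrative only; the real
// eFFT<N> uses a quadtree over Radix-2 butterflies for efficiency).
class NaiveEventDFT {
public:
    explicit NaiveEventDFT(int n) : N(n), img(n * n, 0.f), F(n * n) {}

    // Apply one "stimulus": set binary pixel (x, y) to state s.
    // Returns true if the spectrum changed, mirroring update() above.
    bool update(int x, int y, bool s) {
        float delta = (s ? 1.f : 0.f) - img[y * N + x];
        if (delta == 0.f) return false;          // redundant event: no-op
        img[y * N + x] += delta;
        const float w = -2.f * std::acos(-1.f) / static_cast<float>(N);
        for (int u = 0; u < N; ++u)              // shift every bin by
            for (int v = 0; v < N; ++v)          // delta * e^{-2pi i (ux+vy)/N}
                F[v * N + u] +=
                    delta * std::polar(1.f, w * static_cast<float>(u * x + v * y));
        return true;
    }

    const std::vector<std::complex<float>>& getFFT() const { return F; }

private:
    int N;
    std::vector<float> img;                      // current binary image
    std::vector<std::complex<float>> F;          // current spectrum
};
```

In the real API, a Stimulus presumably carries this position/state information, and a Stimuli batch lets redundant events (such as the second identical update above) be pruned before touching the tree.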

Figure 2
Event-by-event benchmark (Benchmark 1) results. Google Benchmark time per iteration (wall-clock) versus frame size. Each iteration processes Ne = 250 events, including random-event generation, updates, and spectrum retrieval after every event.

Figure 3
Packet-based benchmark (Benchmark 2) results. Google Benchmark time per iteration (wall-clock) versus packet size for frame sizes 128 (top) and 256 (bottom). Each iteration integrates a fixed total number of events. Timing includes random-event generation, packet updates, and spectrum retrieval after each packet.

Figure 4
Examples of applications of eFFT. From top to bottom: denoising, pattern analysis, and registration. (1) Denoising: A low-pass filter in the frequency domain is used to suppress high-frequency noise. Artificial noise (5000 random noise events; see top-left) was added to the original events. (2) Pattern analysis: A directional edge filter in the frequency domain is applied to enhance edges within a specific angular range (∼90°). Green lines (middle-right) have been thickened for better visualization. (3) Registration: Two event slices from different time instants are aligned via phase cross-correlation. The sequence used is urban from the Event Camera Dataset [23].
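The denoising step in application (1) amounts to multiplying the spectrum by a frequency-domain mask before inverting it. A minimal sketch of such a low-pass mask (the function name, cutoff parameter, and DC-at-(0,0) layout are illustrative assumptions, unrelated to eFFT internals):

```cpp
#include <algorithm>
#include <complex>
#include <vector>

// Zero out all bins whose wrapped frequency radius exceeds `cutoff`,
// keeping only low frequencies. DC is assumed at bin (0, 0), so the
// frequency index along each axis is min(k, N - k) to account for the
// spectrum's periodicity.
void lowPass(std::vector<std::complex<float>>& F, int N, int cutoff) {
    for (int v = 0; v < N; ++v) {
        int fv = std::min(v, N - v);
        for (int u = 0; u < N; ++u) {
            int fu = std::min(u, N - u);
            if (fu * fu + fv * fv > cutoff * cutoff)
                F[v * N + u] = {0.f, 0.f};   // suppress high-frequency bin
        }
    }
}
```

The directional edge filter in application (2) follows the same pattern, with the radial test replaced by an angular one on atan2(fv, fu).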
