This article proposes an efficient and accurate embedded motor imagery-based brain–computer interface (MI-BCI) that meets the requirements of wearable and real-time applications. To achieve suitable accuracy under hardware constraints, we explore BCI transducer algorithms, among which infinite impulse response (IIR) filtering, common spatial pattern (CSP), and support vector machine (SVM) are used to preprocess the data, extract features, and classify them, respectively. With our hardware implementation of these tasks, we achieve a classification accuracy of 77%. Our system is designed at the register transfer level (RTL) targeting an ASIC implementation, which significantly decreases power consumption, latency, and area compared to state-of-the-art (SoA) architectures for embedded BCI systems. To this end, we fold the IIR filters using time-shared and RAM-based techniques and use hardware-friendly algorithms for the implementation of the other tasks. The RTL design is realized in 45 nm CMOS technology, consuming 4 mW of power and 0.25 mm² of area, which outperforms SoA platforms for embedded BCI systems. To further demonstrate the advantages of our design, the proposed architecture is also implemented on a Virtex-7 field-programmable gate array as a prototyping platform, consuming 6 μJ of energy with 1.52% area utilization.
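As a rough software reference for the transducer pipeline named above (IIR band-pass preprocessing, CSP feature extraction, SVM classification), the sketch below shows one conventional way these stages fit together. The sampling rate, the 8–30 Hz band, the filter order, and the number of CSP components are illustrative assumptions, not values taken from the paper, and the fixed-point, folded RTL realization is not represented here.

```python
# Minimal sketch of an MI-BCI transducer chain: IIR filter -> CSP -> SVM.
# All numeric settings below are assumptions for illustration only.
import numpy as np
from scipy.signal import butter, lfilter
from scipy.linalg import eigh
from sklearn.svm import SVC

FS = 250            # assumed EEG sampling rate (Hz)
BAND = (8.0, 30.0)  # assumed mu/beta motor-imagery band
N_CSP = 4           # assumed number of CSP spatial filters kept


def bandpass(trials, fs=FS, band=BAND, order=4):
    """Apply a Butterworth IIR band-pass filter along the sample axis."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return lfilter(b, a, trials, axis=-1)


def fit_csp(trials, labels, n_components=N_CSP):
    """Learn CSP spatial filters from two-class trials (n_trials x channels x samples)."""
    covs = []
    for cls in (0, 1):
        x = trials[labels == cls]
        c = np.mean([t @ t.T / np.trace(t @ t.T) for t in x], axis=0)
        covs.append(c)
    # Generalized eigen-decomposition of the two class covariance matrices.
    eigvals, eigvecs = eigh(covs[0], covs[0] + covs[1])
    order = np.argsort(eigvals)
    pick = np.concatenate([order[: n_components // 2], order[-(n_components // 2):]])
    return eigvecs[:, pick].T  # shape: (n_components, channels)


def csp_features(trials, filters):
    """Log-variance of the CSP-projected signals, the standard CSP feature."""
    projected = np.einsum("fc,ncs->nfs", filters, trials)
    var = projected.var(axis=-1)
    return np.log(var / var.sum(axis=1, keepdims=True))


# Tiny synthetic example (random data, for shape/flow illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8, 500))   # 40 trials, 8 channels, 2 s at 250 Hz
y = rng.integers(0, 2, size=40)

Xf = bandpass(X)
W = fit_csp(Xf, y)
feats = csp_features(Xf, W)
clf = SVC(kernel="linear").fit(feats, y)
print("Training accuracy on synthetic data:", clf.score(feats, y))
```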