Matches in SemOpenAlex for { <https://semopenalex.org/work/W4283325968> ?p ?o ?g. }
- W4283325968 endingPage "625" @default.
- W4283325968 startingPage "609" @default.
- W4283325968 abstract "During the past two decades, epileptic seizure detection and prediction algorithms have evolved rapidly. However, despite significant performance improvements, their hardware implementation using conventional technologies, such as Complementary Metal-Oxide-Semiconductor (CMOS), in power- and area-constrained settings remains challenging, especially when many recording channels are used. In this paper, we propose a novel low-latency parallel Convolutional Neural Network (CNN) architecture that has between 2x and 2,800x fewer network parameters than State-Of-The-Art (SOTA) CNN architectures and achieves a 5-fold cross-validation accuracy of 99.84% for epileptic seizure detection, and 99.01% and 97.54% for epileptic seizure prediction, when evaluated on the University of Bonn Electroencephalogram (EEG), CHB-MIT, and SWEC-ETHZ seizure datasets, respectively. We subsequently implement our network onto analog crossbar arrays comprising Resistive Random-Access Memory (RRAM) devices, and provide a comprehensive benchmark by simulating, laying out, and determining the hardware requirements of the CNN component of our system. We parallelize the execution of convolution-layer kernels on separate analog crossbars to enable a two-orders-of-magnitude reduction in latency compared to SOTA hybrid Memristive-CMOS Deep Learning (DL) accelerators. Furthermore, we investigate the effects of non-idealities on our system and apply Quantization-Aware Training (QAT) to mitigate the performance degradation due to low Analog-to-Digital Converter (ADC)/Digital-to-Analog Converter (DAC) resolution. Finally, we propose a stuck-weight offsetting methodology to mitigate the performance degradation due to stuck R_ON/R_OFF memristor weights, recovering up to 32% accuracy without requiring retraining. The CNN component of our platform is estimated to consume approximately 2.791 W of power while occupying an area of 31.255 mm^2 in a 22 nm FDSOI CMOS process." @default.
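The stuck-weight offsetting idea described in the abstract can be sketched as follows. This is a hedged illustration only: it assumes a differential (G+, G-) conductance encoding of signed weights and arbitrary conductance values, and is not the paper's exact methodology.

```python
# Hypothetical conductance range for an RRAM device (arbitrary units);
# these values and the differential (G+, G-) weight encoding are
# illustrative assumptions, not the scheme used in the paper.
G_ON, G_OFF = 1.0, 0.01

def weights_to_conductances(weights):
    """Map signed weights onto differential conductance pairs (g_pos, g_neg).

    Positive weights are encoded on the G+ array and negative weights on
    the G- array, scaled into the realisable range [G_OFF, G_ON].
    """
    w_max = max((abs(w) for w in weights), default=1.0) or 1.0
    scale = (G_ON - G_OFF) / w_max
    g_pos = [G_OFF + w * scale if w > 0 else G_OFF for w in weights]
    g_neg = [G_OFF - w * scale if w < 0 else G_OFF for w in weights]
    return g_pos, g_neg

def offset_stuck(g_pos, g_neg, stuck, stuck_value):
    """Offset devices stuck at a fixed conductance without retraining.

    For each stuck G+ device, shift the complementary G- device so the
    differential weight (G+ - G-) is preserved, clipped to the range the
    hardware can actually realise.
    """
    out_pos, out_neg = [], []
    for gp, gn, is_stuck in zip(g_pos, g_neg, stuck):
        if is_stuck:
            target = gp - gn                      # intended effective weight
            gp = stuck_value                      # fault: device is stuck
            gn = min(max(stuck_value - target, G_OFF), G_ON)
        out_pos.append(gp)
        out_neg.append(gn)
    return out_pos, out_neg
```

Because the compensation only rewrites the complementary device, it can be applied post-deployment; weights whose offset would fall outside [G_OFF, G_ON] are clipped and remain partially degraded.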
- W4283325968 created "2022-06-24" @default.
- W4283325968 creator A5009413337 @default.
- W4283325968 creator A5012864861 @default.
- W4283325968 creator A5021668898 @default.
- W4283325968 creator A5051308921 @default.
- W4283325968 creator A5054032094 @default.
- W4283325968 creator A5067564596 @default.
- W4283325968 date "2022-08-01" @default.
- W4283325968 modified "2023-10-15" @default.
- W4283325968 title "Seizure Detection and Prediction by Parallel Memristive Convolutional Neural Networks" @default.
- W4283325968 cites W1005811612 @default.
- W4283325968 cites W1906184128 @default.
- W4283325968 cites W1981141987 @default.
- W4283325968 cites W1989915444 @default.
- W4283325968 cites W2004718447 @default.
- W4283325968 cites W2010855873 @default.
- W4283325968 cites W2018642663 @default.
- W4283325968 cites W2019448024 @default.
- W4283325968 cites W2019664149 @default.
- W4283325968 cites W2024952598 @default.
- W4283325968 cites W2025668297 @default.
- W4283325968 cites W2025990419 @default.
- W4283325968 cites W2030250405 @default.
- W4283325968 cites W2035987281 @default.
- W4283325968 cites W2038606869 @default.
- W4283325968 cites W2042682017 @default.
- W4283325968 cites W2043748792 @default.
- W4283325968 cites W2053744708 @default.
- W4283325968 cites W2055724818 @default.
- W4283325968 cites W2063047541 @default.
- W4283325968 cites W2069671647 @default.
- W4283325968 cites W2073223058 @default.
- W4283325968 cites W2077746856 @default.
- W4283325968 cites W2086659790 @default.
- W4283325968 cites W2099169745 @default.
- W4283325968 cites W2099986910 @default.
- W4283325968 cites W2100557841 @default.
- W4283325968 cites W2112796928 @default.
- W4283325968 cites W2122982148 @default.
- W4283325968 cites W2127261457 @default.
- W4283325968 cites W2138160374 @default.
- W4283325968 cites W2150313118 @default.
- W4283325968 cites W2152282628 @default.
- W4283325968 cites W2152703446 @default.
- W4283325968 cites W2172124081 @default.
- W4283325968 cites W2272508502 @default.
- W4283325968 cites W2342906508 @default.
- W4283325968 cites W2518281301 @default.
- W4283325968 cites W2549139847 @default.
- W4283325968 cites W2549630556 @default.
- W4283325968 cites W2573857124 @default.
- W4283325968 cites W2625840880 @default.
- W4283325968 cites W2725159389 @default.
- W4283325968 cites W2736148182 @default.
- W4283325968 cites W2768104155 @default.
- W4283325968 cites W2768359949 @default.
- W4283325968 cites W2775771159 @default.
- W4283325968 cites W2777670961 @default.
- W4283325968 cites W2799610518 @default.
- W4283325968 cites W2799714341 @default.
- W4283325968 cites W2801752756 @default.
- W4283325968 cites W2810916204 @default.
- W4283325968 cites W2820981935 @default.
- W4283325968 cites W2830509727 @default.
- W4283325968 cites W2887092592 @default.
- W4283325968 cites W2899212146 @default.
- W4283325968 cites W2909800597 @default.
- W4283325968 cites W2910334812 @default.
- W4283325968 cites W2911226018 @default.
- W4283325968 cites W2914588126 @default.
- W4283325968 cites W2915149867 @default.
- W4283325968 cites W2916001803 @default.
- W4283325968 cites W2940235710 @default.
- W4283325968 cites W2941297993 @default.
- W4283325968 cites W2943354300 @default.
- W4283325968 cites W2943369412 @default.
- W4283325968 cites W2946641467 @default.
- W4283325968 cites W2962222774 @default.
- W4283325968 cites W2962984603 @default.
- W4283325968 cites W2964137095 @default.
- W4283325968 cites W2964267916 @default.
- W4283325968 cites W2973010960 @default.
- W4283325968 cites W2979411460 @default.
- W4283325968 cites W3013080934 @default.
- W4283325968 cites W3015660683 @default.
- W4283325968 cites W3015929423 @default.
- W4283325968 cites W3017900531 @default.
- W4283325968 cites W3030870789 @default.
- W4283325968 cites W3035485822 @default.
- W4283325968 cites W3040302020 @default.
- W4283325968 cites W3047394346 @default.
- W4283325968 cites W3080915835 @default.
- W4283325968 cites W3081302630 @default.
- W4283325968 cites W3083860807 @default.
- W4283325968 cites W3097843109 @default.
- W4283325968 cites W3097855118 @default.
- W4283325968 cites W3103969290 @default.