{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,26]],"date-time":"2026-02-26T05:27:01Z","timestamp":1772083621735,"version":"3.50.1"},"reference-count":0,"publisher":"European Alliance for Innovation n.o.","issue":"4","license":[{"start":{"date-parts":[[2023,9,6]],"date-time":"2023-09-06T00:00:00Z","timestamp":1693958400000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["EAI Endorsed Trans e-Learn"],"abstract":"<jats:p>A multi-view self-attention module is proposed and paired with a multi-scale convolutional model to build a multi-view self-attention convolutional network for multi-channel EEG emotion recognition. First, time and frequency domain characteristics are extracted from multi-channel EEG signals, and a three-dimensional feature matrix is built using spatial mapping connections. Then, a multi-scale convolutional network extracts the high-level abstract features from the feature matrix, and a multi-view self-attention network strengthens the features. Finally, a multilayer perceptron is used for emotion classification. The experimental results reveal that the multi-view self-attention convolutional network can effectively integrate the time domain, frequency domain, and spatial domain elements of EEG signals on the DEAP public emotion dataset. 
The multi-view self-attention module can eliminate superfluous data, apply attention weights to the network to hasten network convergence, and enhance model recognition precision.<\/jats:p>","DOI":"10.4108\/eetel.3722","type":"journal-article","created":{"date-parts":[[2023,9,6]],"date-time":"2023-09-06T12:31:41Z","timestamp":1694003501000},"page":"e4","source":"Crossref","is-referenced-by-count":2,"title":["EEG Emotion Recognition based on Multi scale Self Attention Convolutional Networks"],"prefix":"10.4108","volume":"8","author":[{"given":"Hao","family":"Chao","sequence":"first","affiliation":[]},{"given":"Fang","family":"Yuan","sequence":"additional","affiliation":[]}],"member":"2587","published-online":{"date-parts":[[2023,9,6]]},"container-title":["EAI Endorsed Transactions on e-Learning"],"original-title":[],"link":[{"URL":"https:\/\/publications.eai.eu\/index.php\/el\/article\/download\/3722\/2496","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/publications.eai.eu\/index.php\/el\/article\/download\/3722\/2496","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,9,21]],"date-time":"2024-09-21T17:07:11Z","timestamp":1726938431000},"score":1,"resource":{"primary":{"URL":"https:\/\/publications.eai.eu\/index.php\/el\/article\/view\/3722"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,9,6]]},"references-count":0,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2023,8,15]]}},"URL":"https:\/\/doi.org\/10.4108\/eetel.3722","relation":{},"ISSN":["2032-9253"],"issn-type":[{"value":"2032-9253","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,9,6]]}}}