Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313452837> ?p ?o ?g. }
Showing items 1 to 52 of 52, with 100 items per page.
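The listing below enumerates every predicate/object pair matching the graph pattern above. As a minimal sketch of how the same triples could be retrieved programmatically, the snippet below uses the SPARQLWrapper library against what is assumed to be the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (verify the endpoint URL against the SemOpenAlex documentation before relying on it):

```python
# Hedged sketch: fetch all predicate/object pairs for this work from SemOpenAlex.
# The endpoint URL is an assumption; only the work URI comes from the listing above.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W4313452837"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o WHERE {{
        <{WORK}> ?p ?o .
    }}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```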
- W4313452837 abstract "Sound is an important modality for perceiving and understanding the spatial environment. With the development of digital technology, massive numbers of smart devices in use around the world can collect sound data. Auditory spatial scenes, the spatial environments in which sound is understood and distinguished, are important to detect by analyzing the sounds collected via those devices. Given limited annotated auditory spatial samples, the current best-performing model can predict an auditory scene with an accuracy of 73%. We propose a novel yet simple Sliding Window based Convolutional Neural Network, SlideCNN, which requires no manually designed features. SlideCNN leverages a windowing operation to increase the number of samples for limited-annotation problems and improves prediction accuracy by over 12% compared to the current best-performing models. It can detect real-life indoor and outdoor scenes with an 85% accuracy. The results will enhance practical applications of ML for analyzing auditory scenes with limited annotated samples. They will further improve the recognition of environments that may influence the safety of people, especially people with hearing aids and cochlear implant processors." @default.
- W4313452837 created "2023-01-06" @default.
- W4313452837 creator A5054281414 @default.
- W4313452837 creator A5057389363 @default.
- W4313452837 creator A5058915438 @default.
- W4313452837 date "2022-12-01" @default.
- W4313452837 modified "2023-10-06" @default.
- W4313452837 title "Poster: SlideCNN: Deep Learning for Auditory Spatial Scenes with Limited Annotated Data" @default.
- W4313452837 cites W2008142581 @default.
- W4313452837 cites W2021036184 @default.
- W4313452837 cites W2922137896 @default.
- W4313452837 cites W4226200412 @default.
- W4313452837 doi "https://doi.org/10.1109/sec54971.2022.00044" @default.
- W4313452837 hasPublicationYear "2022" @default.
- W4313452837 type Work @default.
- W4313452837 citedByCount "0" @default.
- W4313452837 crossrefType "proceedings-article" @default.
- W4313452837 hasAuthorship W4313452837A5054281414 @default.
- W4313452837 hasAuthorship W4313452837A5057389363 @default.
- W4313452837 hasAuthorship W4313452837A5058915438 @default.
- W4313452837 hasConcept C108583219 @default.
- W4313452837 hasConcept C154945302 @default.
- W4313452837 hasConcept C2776321320 @default.
- W4313452837 hasConcept C2780226545 @default.
- W4313452837 hasConcept C28490314 @default.
- W4313452837 hasConcept C31972630 @default.
- W4313452837 hasConcept C41008148 @default.
- W4313452837 hasConcept C81363708 @default.
- W4313452837 hasConceptScore W4313452837C108583219 @default.
- W4313452837 hasConceptScore W4313452837C154945302 @default.
- W4313452837 hasConceptScore W4313452837C2776321320 @default.
- W4313452837 hasConceptScore W4313452837C2780226545 @default.
- W4313452837 hasConceptScore W4313452837C28490314 @default.
- W4313452837 hasConceptScore W4313452837C31972630 @default.
- W4313452837 hasConceptScore W4313452837C41008148 @default.
- W4313452837 hasConceptScore W4313452837C81363708 @default.
- W4313452837 hasLocation W43134528371 @default.
- W4313452837 hasOpenAccess W4313452837 @default.
- W4313452837 hasPrimaryLocation W43134528371 @default.
- W4313452837 hasRelatedWork W2731899572 @default.
- W4313452837 hasRelatedWork W2999805992 @default.
- W4313452837 hasRelatedWork W3111140902 @default.
- W4313452837 hasRelatedWork W3116150086 @default.
- W4313452837 hasRelatedWork W3126776812 @default.
- W4313452837 hasRelatedWork W3133861977 @default.
- W4313452837 hasRelatedWork W3166467183 @default.
- W4313452837 hasRelatedWork W4200173597 @default.
- W4313452837 hasRelatedWork W4312417841 @default.
- W4313452837 hasRelatedWork W4321369474 @default.
- W4313452837 isParatext "false" @default.
- W4313452837 isRetracted "false" @default.
- W4313452837 workType "article" @default.
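The abstract above describes SlideCNN's key idea: slicing each recording into overlapping windows so that a small annotated set yields many training examples. The sketch below only illustrates that windowing step; it is not the authors' implementation, and the function name `slide_windows`, the window length, hop size, and sample rate are all assumed values chosen for the example:

```python
# Illustrative sketch (not the paper's code): cut a labelled recording into
# overlapping fixed-length windows, each inheriting the recording's scene label.
# win_sec, hop_sec, and sr are assumed values, not taken from the paper.
import numpy as np

def slide_windows(audio: np.ndarray, sr: int = 16000,
                  win_sec: float = 1.0, hop_sec: float = 0.5) -> np.ndarray:
    """Split a mono waveform into overlapping windows of shape (n_windows, win_len)."""
    win_len = int(win_sec * sr)
    hop_len = int(hop_sec * sr)
    n = 1 + max(0, (len(audio) - win_len) // hop_len)
    return np.stack([audio[i * hop_len : i * hop_len + win_len]
                     for i in range(n)])

# Example: a 10-second clip becomes 19 one-second windows, all sharing the
# original clip's scene label, multiplying the labelled samples for training.
clip = np.random.randn(16000 * 10)            # 10 s of placeholder audio
windows = slide_windows(clip)                 # -> shape (19, 16000)
labels = np.full(len(windows), fill_value=3)  # same scene label for every window
print(windows.shape, labels.shape)
```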