Matches in SemOpenAlex for { <https://semopenalex.org/work/W4378175040> ?p ?o ?g. }
Showing items 1 to 96 of 96, with 100 items per page.
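The listing below can be reproduced programmatically. The following is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the standard SPARQL JSON results format; for simplicity it drops the graph variable ?g from the pattern shown above. These are conventions-based assumptions, not details taken from this page.

```python
# Minimal sketch: fetch the predicate/object pairs for this work from the
# SemOpenAlex SPARQL endpoint. Endpoint URL and JSON results format are
# assumed from standard SPARQL conventions.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4378175040> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```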
- W4378175040 endingPage "e46970" @default.
- W4378175040 startingPage "e46970" @default.
- W4378175040 abstract "Background Even before the onset of the COVID-19 pandemic, children and adolescents were experiencing a mental health crisis, partly due to a lack of quality mental health services. The rate of suicide among Black youth has increased by 80%. By 2025, the health care system will face a shortage of 225,000 therapists, further exacerbating the current crisis. It is therefore of utmost importance that schools, youth mental health services, and pediatric medical providers integrate digital mental health innovations to identify problems proactively and rapidly and to collaborate effectively with other health care providers. Such approaches can help identify robust, reproducible, and generalizable predictors and digital biomarkers of treatment response in psychiatry. Among the many digital innovations currently pursued to identify biomarkers for psychiatric diseases as part of the macrolevel digital health transformation, speech stands out as an attractive candidate because it is affordable, noninvasive, and nonintrusive. Objective This protocol aims to develop speech emotion recognition algorithms, leveraging artificial intelligence/machine learning, that can establish links between trauma, stress, and voice types, including disruptions in speech-based characteristics, and detect clinically relevant emotional distress and functional impairments in children and adolescents. Methods Informed by theoretical foundations (the Theory of Psychological Trauma Biomarkers and Archetypal Voice Categories), our methodology focuses on 5 emotions: anger, happiness, fear, neutral, and sadness. Participants will be recruited from 2 local mental health centers that serve urban youths. Speech samples, along with responses to the Symptom and Functioning Severity Scale, the Patient Health Questionnaire-9, and the Adverse Childhood Experiences scale, will be collected using an Android mobile app. Our model development pipeline is informed by Gaussian mixture models (GMMs), recurrent neural networks, and long short-term memory networks. Results We tested our model on a public data set. A GMM with 128 clusters showed evenly distributed accuracy across all 5 emotions. Using utterance-level features, the GMM achieved an overall accuracy of 79.15%, and frame selection increased accuracy to 85.35%. This demonstrates that the GMM is a robust model for classifying all 5 emotions and that emotion frame selection enhances accuracy, which is significant for scientific evaluation. Recruitment and data collection were initiated in August 2021 and are currently underway. The study results are expected to be published in 2024. Conclusions This study contributes to the literature by addressing the need for speech-focused digital health tools that detect clinically relevant emotional distress and functional impairments in children and adolescents. The preliminary results show that our algorithm has the potential to improve outcomes, and the findings will contribute to the broader digital health transformation. International Registered Report Identifier (IRRID) DERR1-10.2196/46970" @default.
- W4378175040 created "2023-05-26" @default.
- W4378175040 creator A5011375257 @default.
- W4378175040 creator A5020052749 @default.
- W4378175040 creator A5058249531 @default.
- W4378175040 creator A5068386655 @default.
- W4378175040 creator A5082173308 @default.
- W4378175040 creator A5091262280 @default.
- W4378175040 date "2023-06-23" @default.
- W4378175040 modified "2023-10-01" @default.
- W4378175040 title "Detecting Clinically Relevant Emotional Distress and Functional Impairment in Children and Adolescents: Protocol for an Automated Speech Analysis Algorithm Development Study" @default.
- W4378175040 cites W1049068570 @default.
- W4378175040 cites W1485702413 @default.
- W4378175040 cites W1618306604 @default.
- W4378175040 cites W1940107713 @default.
- W4378175040 cites W1964469912 @default.
- W4378175040 cites W2003502731 @default.
- W4378175040 cites W2045123881 @default.
- W4378175040 cites W2049033841 @default.
- W4378175040 cites W2050390617 @default.
- W4378175040 cites W2077424595 @default.
- W4378175040 cites W2091298722 @default.
- W4378175040 cites W2096066003 @default.
- W4378175040 cites W2122066333 @default.
- W4378175040 cites W2160216316 @default.
- W4378175040 cites W2166496785 @default.
- W4378175040 cites W2167797544 @default.
- W4378175040 cites W2787378487 @default.
- W4378175040 cites W2803193013 @default.
- W4378175040 cites W2901855206 @default.
- W4378175040 cites W2952320437 @default.
- W4378175040 cites W3004047823 @default.
- W4378175040 cites W3022135431 @default.
- W4378175040 cites W3089273598 @default.
- W4378175040 cites W3171083191 @default.
- W4378175040 cites W3190876559 @default.
- W4378175040 cites W3198532692 @default.
- W4378175040 cites W4224284209 @default.
- W4378175040 cites W4286705866 @default.
- W4378175040 cites W4293778617 @default.
- W4378175040 doi "https://doi.org/10.2196/46970" @default.
- W4378175040 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37351936" @default.
- W4378175040 hasPublicationYear "2023" @default.
- W4378175040 type Work @default.
- W4378175040 citedByCount "1" @default.
- W4378175040 countsByYear W43781750402023 @default.
- W4378175040 crossrefType "journal-article" @default.
- W4378175040 hasAuthorship W4378175040A5011375257 @default.
- W4378175040 hasAuthorship W4378175040A5020052749 @default.
- W4378175040 hasAuthorship W4378175040A5058249531 @default.
- W4378175040 hasAuthorship W4378175040A5068386655 @default.
- W4378175040 hasAuthorship W4378175040A5082173308 @default.
- W4378175040 hasAuthorship W4378175040A5091262280 @default.
- W4378175040 hasBestOaLocation W43781750401 @default.
- W4378175040 hasConcept C118552586 @default.
- W4378175040 hasConcept C134362201 @default.
- W4378175040 hasConcept C15744967 @default.
- W4378175040 hasConcept C160735492 @default.
- W4378175040 hasConcept C162324750 @default.
- W4378175040 hasConcept C27415008 @default.
- W4378175040 hasConcept C2779302386 @default.
- W4378175040 hasConcept C2779812673 @default.
- W4378175040 hasConcept C50522688 @default.
- W4378175040 hasConcept C70410870 @default.
- W4378175040 hasConcept C71924100 @default.
- W4378175040 hasConceptScore W4378175040C118552586 @default.
- W4378175040 hasConceptScore W4378175040C134362201 @default.
- W4378175040 hasConceptScore W4378175040C15744967 @default.
- W4378175040 hasConceptScore W4378175040C160735492 @default.
- W4378175040 hasConceptScore W4378175040C162324750 @default.
- W4378175040 hasConceptScore W4378175040C27415008 @default.
- W4378175040 hasConceptScore W4378175040C2779302386 @default.
- W4378175040 hasConceptScore W4378175040C2779812673 @default.
- W4378175040 hasConceptScore W4378175040C50522688 @default.
- W4378175040 hasConceptScore W4378175040C70410870 @default.
- W4378175040 hasConceptScore W4378175040C71924100 @default.
- W4378175040 hasLocation W43781750401 @default.
- W4378175040 hasLocation W43781750402 @default.
- W4378175040 hasOpenAccess W4378175040 @default.
- W4378175040 hasPrimaryLocation W43781750401 @default.
- W4378175040 hasRelatedWork W1976840597 @default.
- W4378175040 hasRelatedWork W1977676411 @default.
- W4378175040 hasRelatedWork W1991155834 @default.
- W4378175040 hasRelatedWork W2040296542 @default.
- W4378175040 hasRelatedWork W2048291412 @default.
- W4378175040 hasRelatedWork W2131788636 @default.
- W4378175040 hasRelatedWork W2317019403 @default.
- W4378175040 hasRelatedWork W2728515003 @default.
- W4378175040 hasRelatedWork W2773343537 @default.
- W4378175040 hasRelatedWork W4236194524 @default.
- W4378175040 hasVolume "12" @default.
- W4378175040 isParatext "false" @default.
- W4378175040 isRetracted "false" @default.
- W4378175040 workType "article" @default.
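As context for the abstract quoted above, its Methods and Results describe a 128-cluster Gaussian mixture model over utterance-level speech features reaching 79.15% accuracy, improved to 85.35% with frame selection. The sketch below is a minimal, illustrative version of that style of classifier, not the authors' pipeline: it assumes MFCC features extracted with librosa, one scikit-learn GaussianMixture per emotion, and classification by average frame log-likelihood; the file layout in the usage comment is hypothetical, and the frame-selection step credited with the accuracy gain is not shown.

```python
# Minimal sketch of GMM-based speech emotion recognition in the style the
# protocol describes: one 128-component GMM per emotion, scored by mean
# frame log-likelihood. Feature choice (MFCCs) and data layout are assumptions.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

# The 5 emotion labels named in the protocol.
EMOTIONS = ["anger", "happiness", "fear", "neutral", "sadness"]

def utterance_features(wav_path: str, sr: int = 16000) -> np.ndarray:
    """Frame-level MFCCs for one utterance, shape (n_frames, n_mfcc)."""
    signal, _ = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=13)
    return mfcc.T

def train_models(files_by_emotion: dict[str, list[str]]) -> dict[str, GaussianMixture]:
    """Fit one 128-component GMM per emotion on pooled frame features."""
    models = {}
    for emotion, paths in files_by_emotion.items():
        frames = np.vstack([utterance_features(p) for p in paths])
        models[emotion] = GaussianMixture(
            n_components=128, covariance_type="diag", max_iter=200
        ).fit(frames)
    return models

def classify(wav_path: str, models: dict[str, GaussianMixture]) -> str:
    """Return the emotion whose GMM assigns the highest mean log-likelihood."""
    frames = utterance_features(wav_path)
    return max(models, key=lambda e: models[e].score(frames))

# Hypothetical usage with per-emotion folders of WAV files:
#   models = train_models({e: glob.glob(f"data/{e}/*.wav") for e in EMOTIONS})
#   print(classify("data/test/clip.wav", models))
```

Under this sketch's assumptions, the frame-selection step reported in the abstract would correspond to filtering the frames returned by utterance_features down to emotionally salient ones before scoring.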