Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385695358> ?p ?o ?g. }
Showing items 1 to 77 of 77, with 100 items per page.
- W4385695358 abstract "Every day, many individuals face online trolling and hate on social media platforms such as Twitter and Instagram. These comments, which often involve racial abuse or hate based on religion or caste, are typically posted anonymously, and keeping them under control is a considerable challenge. The objective of this work was therefore to develop a machine learning model to help identify such comments. A deep learning model (a sequential model) was built and trained to classify a comment as acceptable or not. LSTM (Long Short-Term Memory) is a type of recurrent neural network (RNN) that is particularly well suited to modeling sequential data such as text. LSTMs can capture long-term dependencies in sequential data; for text classification, this means they can take into account the context of a word or phrase within a sentence, paragraph, or even an entire document. They can also learn to selectively forget or remember information from the past, which is useful for filtering out noise or irrelevant information in text. LSTMs are well established in natural language processing (NLP) and have been shown to be effective for various NLP tasks, including sentiment analysis and text classification. Binary cross-entropy is a commonly used loss function in deep learning models for binary classification problems, such as predicting whether a comment is toxic or not. It is designed to optimize the model's predictions given the binary nature of the task: it penalizes the model for assigning a low probability to the correct class and rewards it for assigning a high probability to the correct class. The loss function is differentiable, so gradient-based optimization methods can be used during training to minimize the loss and improve the model's performance. Binary cross-entropy is well established in deep learning and supported by many tools and frameworks, making it easy to implement in practice. It also has a probabilistic interpretation, which can be useful in some applications; for example, it can be used to estimate the probability that a given comment is toxic. Hence, binary cross-entropy was chosen as the loss function for the deep learning model." @default.
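  The abstract describes an LSTM-based sequential model trained with binary cross-entropy to flag toxic comments, but the record itself gives no architecture details. The following is only a minimal sketch of such a model in Keras; the vocabulary size, sequence length, layer sizes, and the TextVectorization preprocessing step are illustrative assumptions, not taken from the work.

  ```python
  # Minimal sketch of an LSTM toxic-comment classifier trained with binary
  # cross-entropy, as described in the abstract. Hyperparameters (vocab size,
  # sequence length, embedding/LSTM dimensions) are assumptions, not from the paper.
  import tensorflow as tf
  from tensorflow.keras import layers, Sequential

  MAX_TOKENS = 20_000   # assumed vocabulary size
  SEQ_LEN = 200         # assumed maximum comment length in tokens

  # Turns raw comment strings into padded integer token sequences.
  vectorizer = layers.TextVectorization(
      max_tokens=MAX_TOKENS, output_sequence_length=SEQ_LEN)

  model = Sequential([
      layers.Embedding(input_dim=MAX_TOKENS, output_dim=64),  # learn token embeddings
      layers.LSTM(64),                                         # model long-range context in the comment
      layers.Dense(64, activation="relu"),
      layers.Dense(1, activation="sigmoid"),                   # probability that the comment is toxic
  ])

  # Binary cross-entropy penalizes confident wrong predictions, per the abstract.
  model.compile(optimizer="adam",
                loss="binary_crossentropy",
                metrics=["accuracy"])

  # Example usage (train_texts and 0/1 train_labels are placeholders):
  # vectorizer.adapt(train_texts)
  # model.fit(vectorizer(train_texts), train_labels, epochs=5, validation_split=0.1)
  ```

  The sigmoid output pairs naturally with binary cross-entropy: the model emits a probability in [0, 1], matching the probabilistic interpretation the abstract mentions.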
- W4385695358 created "2023-08-10" @default.
- W4385695358 creator A5012716210 @default.
- W4385695358 creator A5026992912 @default.
- W4385695358 date "2023-06-09" @default.
- W4385695358 modified "2023-09-25" @default.
- W4385695358 title "Deep Learning Model for Identification and Classification of Web based Toxic Comments" @default.
- W4385695358 cites W2540646130 @default.
- W4385695358 cites W2595653137 @default.
- W4385695358 cites W2901372316 @default.
- W4385695358 cites W2952914260 @default.
- W4385695358 cites W2954992865 @default.
- W4385695358 cites W3096445376 @default.
- W4385695358 cites W3116724594 @default.
- W4385695358 doi "https://doi.org/10.1109/apsit58554.2023.10201794" @default.
- W4385695358 hasPublicationYear "2023" @default.
- W4385695358 type Work @default.
- W4385695358 citedByCount "0" @default.
- W4385695358 crossrefType "proceedings-article" @default.
- W4385695358 hasAuthorship W4385695358A5012716210 @default.
- W4385695358 hasAuthorship W4385695358A5026992912 @default.
- W4385695358 hasConcept C108583219 @default.
- W4385695358 hasConcept C119857082 @default.
- W4385695358 hasConcept C12267149 @default.
- W4385695358 hasConcept C136764020 @default.
- W4385695358 hasConcept C137293760 @default.
- W4385695358 hasConcept C147168706 @default.
- W4385695358 hasConcept C151730666 @default.
- W4385695358 hasConcept C154945302 @default.
- W4385695358 hasConcept C167981619 @default.
- W4385695358 hasConcept C204321447 @default.
- W4385695358 hasConcept C2776224158 @default.
- W4385695358 hasConcept C2777206241 @default.
- W4385695358 hasConcept C2777530160 @default.
- W4385695358 hasConcept C2779343474 @default.
- W4385695358 hasConcept C41008148 @default.
- W4385695358 hasConcept C50644808 @default.
- W4385695358 hasConcept C66402592 @default.
- W4385695358 hasConcept C66905080 @default.
- W4385695358 hasConcept C86803240 @default.
- W4385695358 hasConcept C9679016 @default.
- W4385695358 hasConceptScore W4385695358C108583219 @default.
- W4385695358 hasConceptScore W4385695358C119857082 @default.
- W4385695358 hasConceptScore W4385695358C12267149 @default.
- W4385695358 hasConceptScore W4385695358C136764020 @default.
- W4385695358 hasConceptScore W4385695358C137293760 @default.
- W4385695358 hasConceptScore W4385695358C147168706 @default.
- W4385695358 hasConceptScore W4385695358C151730666 @default.
- W4385695358 hasConceptScore W4385695358C154945302 @default.
- W4385695358 hasConceptScore W4385695358C167981619 @default.
- W4385695358 hasConceptScore W4385695358C204321447 @default.
- W4385695358 hasConceptScore W4385695358C2776224158 @default.
- W4385695358 hasConceptScore W4385695358C2777206241 @default.
- W4385695358 hasConceptScore W4385695358C2777530160 @default.
- W4385695358 hasConceptScore W4385695358C2779343474 @default.
- W4385695358 hasConceptScore W4385695358C41008148 @default.
- W4385695358 hasConceptScore W4385695358C50644808 @default.
- W4385695358 hasConceptScore W4385695358C66402592 @default.
- W4385695358 hasConceptScore W4385695358C66905080 @default.
- W4385695358 hasConceptScore W4385695358C86803240 @default.
- W4385695358 hasConceptScore W4385695358C9679016 @default.
- W4385695358 hasLocation W43856953581 @default.
- W4385695358 hasOpenAccess W4385695358 @default.
- W4385695358 hasPrimaryLocation W43856953581 @default.
- W4385695358 hasRelatedWork W1835566166 @default.
- W4385695358 hasRelatedWork W2117774119 @default.
- W4385695358 hasRelatedWork W2364529976 @default.
- W4385695358 hasRelatedWork W2620094948 @default.
- W4385695358 hasRelatedWork W3192794374 @default.
- W4385695358 hasRelatedWork W3209984204 @default.
- W4385695358 hasRelatedWork W4223943233 @default.
- W4385695358 hasRelatedWork W4312200629 @default.
- W4385695358 hasRelatedWork W4360585206 @default.
- W4385695358 hasRelatedWork W4380075502 @default.
- W4385695358 isParatext "false" @default.
- W4385695358 isRetracted "false" @default.
- W4385695358 workType "article" @default.