Matches in SemOpenAlex for { <https://semopenalex.org/work/W3148364978> ?p ?o ?g. }
- W3148364978 endingPage "52943" @default.
- W3148364978 startingPage "52926" @default.
- W3148364978 abstract "The textual content of a document is supplemented by the graphical information in it. To make communication easier, documents contain tables, charts and images. However, this excludes a section of our population - the visually impaired. With technological advancements, blind users can access documents through text-to-speech software solutions. In this way, even images can be conveyed by reading out the figure captions. However, charts and other statistical comparisons, which often carry critical information, are difficult to “read” out this way. The aim of this paper is to analyse the various methods available to solve this vexatious issue. We survey state-of-the-art works that do the exact opposite of graphing tools. In this paper, we explore the existing literature on understanding graphs and extracting the visual encodings from them. We classify these approaches into modality-based, conventional, and deep-learning-based methods. The survey also compares and analyses the relevant study datasets. As an outcome of this survey, we observe that: (i) all existing works, in each category, still need to handle decoding across a variety of graphs; (ii) among the approaches, deep learning performs remarkably well in localisation and classification, but needs further improvement in reasoning from chart images; and (iii) research on accessing data from vector images is still in progress, and recreating data from raster images has unresolved issues. Based on this study, the various applications of decoding graphs, the challenges, and future possibilities are also discussed. This paper explores current work on the extraction of chart data, which seeks to enable researchers in Human Computer Interaction to achieve human-level perception of visual data by machines. In this era of visual summarisation of data, AI approaches can automate the underlying data extraction and hence provide natural language descriptions to support visually disabled users." @default.
- W3148364978 created "2021-04-13" @default.
- W3148364978 creator A5010786471 @default.
- W3148364978 creator A5077794382 @default.
- W3148364978 date "2021-01-01" @default.
- W3148364978 modified "2023-10-18" @default.
- W3148364978 title "Towards Assisting the Visually Impaired: A Review on Techniques for Decoding the Visual Data From Chart Images" @default.
- W3148364978 cites W1536680647 @default.
- W3148364978 cites W1594084088 @default.
- W3148364978 cites W1933349210 @default.
- W3148364978 cites W1986860025 @default.
- W3148364978 cites W1990809386 @default.
- W3148364978 cites W1993032559 @default.
- W3148364978 cites W2000892322 @default.
- W3148364978 cites W2020813568 @default.
- W3148364978 cites W2027929866 @default.
- W3148364978 cites W2032227036 @default.
- W3148364978 cites W2034685859 @default.
- W3148364978 cites W2034994693 @default.
- W3148364978 cites W2043622810 @default.
- W3148364978 cites W2053604034 @default.
- W3148364978 cites W2068842640 @default.
- W3148364978 cites W2088452021 @default.
- W3148364978 cites W2091169704 @default.
- W3148364978 cites W2091343463 @default.
- W3148364978 cites W2097117768 @default.
- W3148364978 cites W2097817347 @default.
- W3148364978 cites W2108598243 @default.
- W3148364978 cites W2117539524 @default.
- W3148364978 cites W2126667706 @default.
- W3148364978 cites W2127368507 @default.
- W3148364978 cites W2128523760 @default.
- W3148364978 cites W2135231474 @default.
- W3148364978 cites W2139808046 @default.
- W3148364978 cites W2152763748 @default.
- W3148364978 cites W2155953728 @default.
- W3148364978 cites W2156279557 @default.
- W3148364978 cites W2194187530 @default.
- W3148364978 cites W2315491708 @default.
- W3148364978 cites W2342096063 @default.
- W3148364978 cites W2343052201 @default.
- W3148364978 cites W2416987009 @default.
- W3148364978 cites W2471094925 @default.
- W3148364978 cites W2513263058 @default.
- W3148364978 cites W2534375750 @default.
- W3148364978 cites W2561715562 @default.
- W3148364978 cites W2565639579 @default.
- W3148364978 cites W2587292159 @default.
- W3148364978 cites W2597425697 @default.
- W3148364978 cites W2725765016 @default.
- W3148364978 cites W2739300322 @default.
- W3148364978 cites W2769986739 @default.
- W3148364978 cites W2783231089 @default.
- W3148364978 cites W2784053407 @default.
- W3148364978 cites W2795424778 @default.
- W3148364978 cites W2897518192 @default.
- W3148364978 cites W2939836861 @default.
- W3148364978 cites W2959442326 @default.
- W3148364978 cites W2963037989 @default.
- W3148364978 cites W2963150697 @default.
- W3148364978 cites W2963420691 @default.
- W3148364978 cites W2963954913 @default.
- W3148364978 cites W2971712385 @default.
- W3148364978 cites W2977761974 @default.
- W3148364978 cites W2978554907 @default.
- W3148364978 cites W2996238229 @default.
- W3148364978 cites W3008893318 @default.
- W3148364978 cites W3009518609 @default.
- W3148364978 cites W3014820286 @default.
- W3148364978 cites W3028907449 @default.
- W3148364978 cites W3030888285 @default.
- W3148364978 cites W3099484189 @default.
- W3148364978 cites W3104085139 @default.
- W3148364978 cites W4236324440 @default.
- W3148364978 cites W4250891033 @default.
- W3148364978 cites W634106523 @default.
- W3148364978 doi "https://doi.org/10.1109/access.2021.3069205" @default.
- W3148364978 hasPublicationYear "2021" @default.
- W3148364978 type Work @default.
- W3148364978 sameAs 3148364978 @default.
- W3148364978 citedByCount "11" @default.
- W3148364978 countsByYear W31483649782021 @default.
- W3148364978 countsByYear W31483649782022 @default.
- W3148364978 countsByYear W31483649782023 @default.
- W3148364978 crossrefType "journal-article" @default.
- W3148364978 hasAuthorship W3148364978A5010786471 @default.
- W3148364978 hasAuthorship W3148364978A5077794382 @default.
- W3148364978 hasBestOaLocation W31483649781 @default.
- W3148364978 hasConcept C105795698 @default.
- W3148364978 hasConcept C136197465 @default.
- W3148364978 hasConcept C154945302 @default.
- W3148364978 hasConcept C181844469 @default.
- W3148364978 hasConcept C190812933 @default.
- W3148364978 hasConcept C205208641 @default.
- W3148364978 hasConcept C23123220 @default.
- W3148364978 hasConcept C2522767166 @default.
- W3148364978 hasConcept C33923547 @default.
- W3148364978 hasConcept C41008148 @default.
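The matches above can also be retrieved programmatically. The sketch below is a minimal, hedged example: it assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and uses the standard SPARQL 1.1 protocol over HTTP via Python's requests library. The graph variable ?g from the pattern in the header is omitted, since every triple in the listing sits in the default graph; adapt the query or result handling as needed.

```python
# Minimal sketch: fetch the ?p ?o matches for W3148364978 from SemOpenAlex.
# Assumes the public SPARQL endpoint at https://semopenalex.org/sparql (an
# assumption, not confirmed by the listing) and the standard SPARQL 1.1
# protocol: query posted as a form field, JSON results requested.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3148364978> ?p ?o .
}
"""

def fetch_matches():
    # POST the query and ask for SPARQL JSON results.
    resp = requests.post(
        ENDPOINT,
        data={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]["bindings"]

if __name__ == "__main__":
    for row in fetch_matches():
        # Each binding maps a variable name to {"type": ..., "value": ...}.
        print(row["p"]["value"], row["o"]["value"])
```

Each printed row corresponds to one line of the listing above, e.g. the predicate https://semopenalex.org/property/citedByCount with the literal value "11".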