Matches in SemOpenAlex for { <https://semopenalex.org/work/W1554726024> ?p ?o ?g. }
Showing items 1 to 64 of 64, with 100 items per page.
- W1554726024 abstract "Many navigation devices for blind or visually impaired individuals have been developed (e.g., Johnson, 2006; Velazquez, 2006). The creation of these devices requires the management of three critical research elements: first, detecting obstacles; second, measuring the present location of the blind or impaired individual for route navigation; and third, informing the individual of direction and distance. Regarding obstacle detection, we expect this task to be performed using robot technology for autonomous movement. As for location measurement, recent studies indicate that location can be determined via GPS (Global Positioning System) or RFID (Radio Frequency Identification) to guide the blind or impaired person (Miyanaga, 2008; Jongwhoa, 2006; Ding, 2007). However, GPS cannot be used in underground locations or in urban areas surrounded by tall buildings because of poor signal reception. In addition, RFID requires many ID chips to be buried under roads, so location measurement using RFID is likely to be limited. To avoid these difficulties, we proposed a measurement method based on foot movement, which we detail further in this article. Concerning the method of presenting information to the blind and visually impaired, we believe the device should provide the individual with the necessary information about the direction and distance of obstacles and destinations. Judging from the natural human reaction to sound sources, we hypothesized that humans grasp direction based on the position of the head. Consequently, we proposed a head-centered direction display method. Our previous experimental results show that human beings can comprehend directions based simply on head position (Asonuma, Matsumoto & Wada, 2005). 
Other similar systems currently use speech audio to allow the blind or impaired person to determine distance, yet individuals still find it difficult to assess distances accurately from speech audio. For example, such a system may notify the person that, to arrive at an entrance, he or she must turn right 3 meters ahead; however, the impaired individual may not be able to imagine a distance of 3 meters due to the lack of a prior physical reference. Even with these difficulties in assessing distance, speech audio is thought to be a better presentation method than a tactile sensation such as vibration, at least from the standpoint of the training required. Thus, we are attempting to examine the optimal" @default.
- W1554726024 created "2016-06-24" @default.
- W1554726024 creator A5002120968 @default.
- W1554726024 date "2008-10-01" @default.
- W1554726024 modified "2023-09-25" @default.
- W1554726024 title "Investigation of a Distance Presentation Method using Speech Audio Navigation for the Blind or Visually Impaired" @default.
- W1554726024 cites W1573496470 @default.
- W1554726024 cites W2115458032 @default.
- W1554726024 cites W2588133075 @default.
- W1554726024 cites W1520653268 @default.
- W1554726024 doi "https://doi.org/10.5772/5906" @default.
- W1554726024 hasPublicationYear "2008" @default.
- W1554726024 type Work @default.
- W1554726024 sameAs 1554726024 @default.
- W1554726024 citedByCount "0" @default.
- W1554726024 crossrefType "book-chapter" @default.
- W1554726024 hasAuthorship W1554726024A5002120968 @default.
- W1554726024 hasBestOaLocation W15547260241 @default.
- W1554726024 hasConcept C107457646 @default.
- W1554726024 hasConcept C116834253 @default.
- W1554726024 hasConcept C154945302 @default.
- W1554726024 hasConcept C166957645 @default.
- W1554726024 hasConcept C204222849 @default.
- W1554726024 hasConcept C205649164 @default.
- W1554726024 hasConcept C2776650193 @default.
- W1554726024 hasConcept C31972630 @default.
- W1554726024 hasConcept C38652104 @default.
- W1554726024 hasConcept C41008148 @default.
- W1554726024 hasConcept C59822182 @default.
- W1554726024 hasConcept C60229501 @default.
- W1554726024 hasConcept C76155785 @default.
- W1554726024 hasConcept C86803240 @default.
- W1554726024 hasConceptScore W1554726024C107457646 @default.
- W1554726024 hasConceptScore W1554726024C116834253 @default.
- W1554726024 hasConceptScore W1554726024C154945302 @default.
- W1554726024 hasConceptScore W1554726024C166957645 @default.
- W1554726024 hasConceptScore W1554726024C204222849 @default.
- W1554726024 hasConceptScore W1554726024C205649164 @default.
- W1554726024 hasConceptScore W1554726024C2776650193 @default.
- W1554726024 hasConceptScore W1554726024C31972630 @default.
- W1554726024 hasConceptScore W1554726024C38652104 @default.
- W1554726024 hasConceptScore W1554726024C41008148 @default.
- W1554726024 hasConceptScore W1554726024C59822182 @default.
- W1554726024 hasConceptScore W1554726024C60229501 @default.
- W1554726024 hasConceptScore W1554726024C76155785 @default.
- W1554726024 hasConceptScore W1554726024C86803240 @default.
- W1554726024 hasLocation W15547260241 @default.
- W1554726024 hasLocation W15547260242 @default.
- W1554726024 hasOpenAccess W1554726024 @default.
- W1554726024 hasPrimaryLocation W15547260241 @default.
- W1554726024 hasRelatedWork W1867797766 @default.
- W1554726024 hasRelatedWork W2049139666 @default.
- W1554726024 hasRelatedWork W2076462394 @default.
- W1554726024 hasRelatedWork W2198043339 @default.
- W1554726024 hasRelatedWork W2222788649 @default.
- W1554726024 hasRelatedWork W2299059641 @default.
- W1554726024 hasRelatedWork W2362290929 @default.
- W1554726024 hasRelatedWork W3112945917 @default.
- W1554726024 hasRelatedWork W3183741207 @default.
- W1554726024 hasRelatedWork W373795050 @default.
- W1554726024 isParatext "false" @default.
- W1554726024 isRetracted "false" @default.
- W1554726024 magId "1554726024" @default.
- W1554726024 workType "book-chapter" @default.
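The listing above corresponds to the quad pattern in the page header, `{ <https://semopenalex.org/work/W1554726024> ?p ?o ?g. }`. A minimal sketch of retrieving the same triples programmatically, assuming the public SemOpenAlex SPARQL endpoint at `https://semopenalex.org/sparql` (the endpoint URL and the exact query shape are assumptions, not confirmed by this page):

```python
# Sketch: query all predicate/object pairs for a SemOpenAlex work.
# ENDPOINT and the query shape are assumptions; adjust to the actual
# SemOpenAlex SPARQL service if they differ.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL


def build_query(work_id: str) -> str:
    """Build a SPARQL query for all triples with the given work as subject."""
    return (
        f"SELECT ?p ?o WHERE {{ "
        f"<https://semopenalex.org/work/{work_id}> ?p ?o }}"
    )


def fetch_triples(work_id: str):
    """POST the query and return the parsed JSON bindings (needs network)."""
    body = urllib.parse.urlencode({"query": build_query(work_id)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


print(build_query("W1554726024"))
```

The `?g` variable in the header suggests the triples are stored in named graphs; wrapping the pattern in `GRAPH ?g { ... }` and adding `?g` to the `SELECT` clause would recover the graph component as well.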