Essay: Enhancing Navigation for the Visually Impaired Using Haptic Touchscreen Technology


1.0 Introduction

The client, Tanvas, is a company that designs and develops haptic ("involving tactile sense" [1]) touchscreen technology. Their technology uses electrostatics to generate controlled resistive forces between the user's fingers and the tablet's surface [2]. This gives users a virtual touch experience, such as the feel of clothing material or a rough piece of virtual sandpaper [2]. Furthermore, 60% of people in the world will be using smartphones by 2020 [3]. A smartphone is defined as "a mobile phone able to function like a computer that is typically touch screen" [4]. Based on this information, it can be assumed that smartphone users rely heavily on their sight to interact with their mobile devices, which visually impaired (VI) individuals struggle to do. Visual impairment is defined as vision of 6/60 or worse in the better eye that is not correctable by standard glasses, contact lenses, medicine, or surgery [5]. In this project we will focus on individuals with vision between 6/60 and 3/60. Visual impairment makes a person's everyday activities challenging [5]. Hence, Tanvas sees their technology being used to ease VI individuals' lives.

2.0 Problem Statement

Tanvas tasked the Engineering Strategies and Practice teams at the University of Toronto with utilizing their technology to enhance an existing app's method of communicating with VI users. Tanvas specifically asked us to apply their technology to apps relating to health and wellness, time, the operating system, or navigation [6]. Since navigation is an important part of daily life, as a group we chose this topic as the scope of our project.

Current navigation technologies used by the VI often give users information through auditory communication; however, audio cues are not always reliable or effective. Background noise in a loud, urban environment, for example, can prevent one from hearing the device clearly without headphones [7]. Headphones, on the other hand, can also be detrimental in a busy and loud environment, as they prevent other sources of sound from reaching the user. Visually impaired people rely heavily on their auditory senses to maintain spatial awareness [8].

Therefore, the gap in technology is a navigation app that communicates with users through the sense of touch [9] and does not rely only on visual or auditory feedback.
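The touch-based communication the gap describes could be sketched as a mapping from navigation events to distinguishable haptic textures. The event names and texture parameters below are illustrative assumptions for this sketch, not Tanvas' actual API; the real hardware renders textures via electrostatic friction, so the parameters stand in for whatever controls it exposes.

```python
from enum import Enum, auto

class NavEvent(Enum):
    """Navigation events the app would need to convey without audio."""
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    INTERSECTION = auto()
    STREETCAR_CROSSING = auto()
    BLOCKED_AREA = auto()

# Hypothetical texture parameters: (spatial frequency in cycles/cm,
# relative friction amplitude from 0.0 to 1.0). Each pair is distinct
# so each event feels different under a moving finger.
HAPTIC_TEXTURES = {
    NavEvent.TURN_LEFT: (2.0, 0.4),
    NavEvent.TURN_RIGHT: (2.0, 0.8),
    NavEvent.INTERSECTION: (6.0, 0.6),
    NavEvent.STREETCAR_CROSSING: (10.0, 0.6),
    NavEvent.BLOCKED_AREA: (14.0, 1.0),
}

def haptic_texture(event: NavEvent) -> tuple:
    """Return the texture parameters assigned to a navigation event."""
    return HAPTIC_TEXTURES[event]
```

Keeping every event's texture distinct matches the later constraint that all haptics must be distinguishable under constant finger movement.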

3.0 Detailed Requirements

The design will focus on improving the Google Maps app by converting the navigation route, streetcar, and intersection information into a format that can be conveyed through Tanvas' haptic feedback [6]. Success will be measured through tests and objectives.

3.1 Functions

Through research on how VI users use Google Maps [10],[11],[12],[13], we created a Task Analysis of their actions, which can be seen in Appendix A. From it, we discovered that the Google Maps app is missing information on the physical environment that may endanger users. This led to the chosen functions of the navigation app, shown in Table 3.1.

Table 3.1. The primary functions and their respective secondary functions

Primary function: Indicate current location and destination
Secondary functions:
  • Display current location
  • Calculate current location through satellites
  • Indicate current location through haptic touch

Primary function: Generate directions
Secondary functions:
  • Locate directions from Google Maps calculations
  • Display the directional path
  • Prevent the user from entering an area that is inaccessible or blocked

Primary function: Generate locations of streetcars and intersections close to the user
Secondary functions:
  • Locate streetcar lines and road crossings in the direction of the user
  • Display the streetcar lines and the intersections

Primary function: Convert directions, intersections, and streetcar locations to haptic touch
Secondary functions:
  • Make directions, streetcar crossings, and intersections felt through haptic touch
  • Indicate areas which the user is not permitted to enter
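One secondary function above, preventing the user from entering a blocked or inaccessible area, amounts to a proximity check against known blocked zones. A minimal sketch follows, assuming blocked zones are given as GPS centre points and a warning radius; both the zone format and the radius are hypothetical choices for illustration, not data from Google Maps or Tanvas.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS coordinates."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def near_blocked_area(user, blocked_zones, warn_radius_m=25.0):
    """Return True if the user is within warn_radius_m of any blocked zone centre.

    user: (lat, lon) tuple; blocked_zones: iterable of (lat, lon) tuples.
    """
    return any(haversine_m(*user, *zone) <= warn_radius_m for zone in blocked_zones)
```

When this check returns True, the app would render the blocked-area haptic cue instead of routing the user forward.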

3.2 Objectives (how well the app gives information to the user)

The general objectives of the navigation app for VI users are the ability to navigate while using the app and better accessibility output than Google Maps, determined through research and consideration of the service environment. The how-why tree in Appendix B shows how sub-objectives, such as providing enough information to the user, were generated from the general objectives. The respective metrics, goals, and priorities are shown in Table 3.2.

Table 3.2 The objectives with their respective metrics, goals, and priorities.

Objective: Ability to navigate

  Sub-objective: Enough information given to the user
  Metric: Number of information prompts while walking
  Goal: At least 1 every time a street is passed; at least 1 to indicate a turn [12]
  Priority: 2

  Sub-objective: Ability to accommodate errors or specific preferences
  Metric: Number of alternate path choices
  Goal: At least 3 paths [14],[15]
  Priority: 3

  Sub-objective: Fewer steps needed during navigation
  Metric: Number of tasks needed while navigating
  Goal: Fewer than 2 tasks [12],[16]
  Priority: 4

Objective: Accessibility

  Sub-objective: Able to distinguish streetcar crossings and intersections
  Metric: Amount of electrostatic force on the screen [17],[18],[19]
  Goal: At least the strength indicated by the half-way point of the strength bar in the Tanvas demonstration app, based on user experience
  Priority: 1

  Sub-objective: Multilingual
  Metric: Number of languages
  Goal: At least 50 different languages (counted from the Google Maps app)
  Priority: 5
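The "enough information" goal in Table 3.2 is directly testable: log the haptic prompts delivered during a test walk and check that every street passed and every turn made received at least one prompt. The log format below is an assumption made for this sketch, not part of the actual app.

```python
def meets_information_goal(streets_passed, turns_made, prompts):
    """Check the 'enough information' goal from a test-walk log.

    streets_passed / turns_made: identifiers observed during the walk.
    prompts: dict mapping ("street", name) or ("turn", name) to the
    number of haptic prompts delivered for that event.
    """
    streets_ok = all(prompts.get(("street", s), 0) >= 1 for s in streets_passed)
    turns_ok = all(prompts.get(("turn", t), 0) >= 1 for t in turns_made)
    return streets_ok and turns_ok
```

A walk where "Bloor St" was passed without a prompt, for example, would fail the check and flag the objective as unmet.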

3.3 Constraints

The absolute limits for this navigation app result from Tanvas, the client, and the law. Tanvas stated the limitations of their tablet in the client statement and during an interview with Ed Colgate, the CEO of Tanvas [6],[20]. Legal codes restrict actions of the app, such as disclosing users' information. Table 3.3 shows the constraints and their metrics.

Table 3.3 The constraints with their respective metrics and limits.

Section: Tanvas
Constraints and metrics:
  • Must use the Tanvas tablet [6]
  • Must not use digital braille (uses of digital braille = 0)
  • All haptics must be distinguishable with constant movement of the finger (frequency of finger movement ≥ 1)
  • Must not use haptics for identifying letters or numbers (uses of haptics for identifying letters or numbers = 0)

Section: Standards/codes/regulations
Constraints:
  • Must abide by the Freedom of Information and Protection of Privacy Act [21]
  • Must follow these sections of the act (shown in Appendix C): "Personal Privacy s.12(1) FIPPA / s.14(1) MFIPPA" and "Consent s.21(1)(a) FIPPA / s.14(1)(a) MFIPPA"
  • Under these acts, the application must not send users' information to outside organizations without user consent [22],[23]

4.0 Service Environment

The navigation app will operate on the Tanvas tablet while users travel outdoors. The locations where the app will run include both virtual and physical environments. The service environment is presented in Table 3.4 in three sections.

Table 3.4 A description of the service environments in which the app will operate.

Section: Physical
  • Precipitation: the average daily precipitation in Toronto is 2.19 mm [24],[25],[26],[27]
  • Noise level: in Toronto, the mean day-night noise level was 62.9 dBA (A-weighted decibels) in 2016 [28],[29]

Section: Living things
  • Pedestrians and cars: the app will operate in downtown Toronto, where on average 15,755 vehicles and 1,990 pedestrians cross an intersection during the eight peak hours [30]

Section: Virtual
  • GPS coverage: Google Maps satellite coverage spans all of Canada [31]

5.0 Stakeholders

Given the functions and service environment of our app, three categories of stakeholders should be taken into account: first, others on the road, mainly sighted pedestrians and vehicle drivers; second, the civil public sector, which includes some government institutions and urban planners; and third, market competitors in the same field.

Table 3.5 Stakeholders and their interests.

Stakeholder: Sighted pedestrians
Interest: The navigation app would provide users with reliable information about their surroundings, reducing the risk of navigation-related accidents and, in turn, increasing users' confidence to travel independently [32].

Stakeholder: Government institutions
Interest: Government institutions plan and permit events that can close roads. The application will require information about road closures so that it does not route users through areas inaccessible to them [33].

Stakeholder: Vehicle drivers
Interest: Some studies state that wearing earphones or headphones dramatically increases the risk of traffic accidents. Replacing auditory instructions with haptic messages would therefore positively affect vehicle drivers [34].

Stakeholder: Other navigation app design teams
Interest: Due to market competition, the emergence of the new navigation app would take away part of the market share of other design teams. Some individuals working as call centre assistants might lose their jobs [35].

Stakeholder: Urban planners
Interest: Provided with navigational information through an online platform, the visually impaired would travel over a larger area. This requires urban planners to ensure that accessibility infrastructure is in place in those regions [36].

6.0 Conclusion

Tanvas desires an improvement to a navigation application for VI users through the use of their haptic technology. We are focusing on converting navigation information to haptic touch with Tanvas' technology. We will hand in the problem statement document by November 2, 2018. During the next stage, idea generation, we will come up with as many solutions as possible that fall within the constraints and perform the functions specified in this document.
