
Essay: Create a Touch Feeling of Virtual Reality with a Head-Mount Display: Maximizing Immersion in VR w/ Haptic Technology

Published: 1 April 2019

1. Introduction

1.1 Problem Summary

Virtual Reality (VR) equipment is becoming increasingly popular. A head-mounted display (HMD) is worn on the user's head and places one or two screens in front of the user's eyes to deliver the virtual reality experience. Displays of this type have many commercial applications involving the simulation of virtual reality, including video games, medicine, sports training and entertainment. In gaming, they are used to render three-dimensional (3D) virtual game worlds. VR simulates physical presence in real or imagined places, sometimes lets the user interact with the virtual environment, and artificially creates sensory experiences such as sight and hearing.

Although head-mounted devices and virtual reality have advanced considerably, the technology still needs improvement to bring physical realism to user interactions with virtual objects rendered in virtual environments. The proposed system provides a method for simulating the touch feeling of contact with a virtual object in a virtual environment presented on a head-mounted display. It also creates elements in real time in virtual reality, and it safeguards the user from colliding with nearby objects while immersed: visual and audible notifications let the user sense the real-world objects around them without taking off the VR gear or switching to a camera view. Real-world objects are superimposed onto objects in the virtual world, and the system determines the direction and position of each detected object relative to the user. Detection runs in real time, so even if an object's position changes, the device tracks its current position and protects the user from collision. The system can also place user-chosen elements into the virtual reality environment in real time.
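As a rough illustration of the collision safeguard described above, the sketch below (pure Python with illustrative names; the actual system works on the live camera feed) flags real-world objects that enter a warning radius around the user and reports their direction, so a notification can be placed at the right spot in the HMD view:

```python
import math

def proximity_alerts(user_pos, obstacles, warn_radius=1.0):
    """Return (name, distance, bearing_deg) for each tracked real-world
    object close enough to warrant a visual/audible warning.

    obstacles maps an object name to its (x, y) position in metres;
    the bearing lets the HMD superimpose the warning in the right place."""
    alerts = []
    for name, (ox, oy) in obstacles.items():
        dx, dy = ox - user_pos[0], oy - user_pos[1]
        dist = math.hypot(dx, dy)
        if dist <= warn_radius:
            bearing = round(math.degrees(math.atan2(dy, dx)))
            alerts.append((name, round(dist, 2), bearing))
    return alerts
```

Because detection runs every frame, an object that is moved is simply re-checked at its new position, matching the real-time tracking described above.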

1.2 Aims and Objectives of the work

The principal objective of this system is to provide a peripheral device for interfacing with a virtual reality scene generated by a computer for presentation on a head-mounted display. The peripheral device includes a haptic device capable of being placed in contact with a user, and a controller that processes instructions for outputting a signal to the haptic device. The haptic device changes to correspond to the user's virtual interactions with a virtual object in the virtual reality scene presented on the head-mounted display.

Another objective is the capability of the system to impart one or more tactile sensations to the user. The haptic device can simulate the texture of the virtual object, its shape, the amount of pressure it virtually exerts on the user, its movements (such as vibrations), or its proximity to a second virtual object. The device can also detect motion of the peripheral itself and output a motion signal to the feedback controller and/or the computer.
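A minimal sketch of how such sensations might be mapped onto an actuator, assuming a single vibration motor driven by an 8-bit duty cycle (the function and its weighting are illustrative assumptions, not the system's actual control law):

```python
def haptic_intensity(penetration_depth, texture_roughness, max_depth=0.05):
    """Map a virtual contact to a motor duty cycle in 0-255.

    penetration_depth: how far (in metres) the fingertip has entered
    the virtual object. texture_roughness: 0.0 (smooth) to 1.0 (rough),
    adding a vibration component on top of the steady pressure."""
    pressure = min(penetration_depth / max_depth, 1.0)  # clamp at full depth
    duty = pressure * (0.7 + 0.3 * texture_roughness)
    return round(duty * 255)
```

Pressing deeper into the object, or touching a rougher surface, raises the duty cycle and hence the strength of the sensation delivered by the glove.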

Another objective is to use a glove (or gloves) as a haptic device that can apply dynamically adjustable pressure and/or texture to the user's hand(s). The glove can squeeze the user's hand, or flex the hand and fingers, to correspond to the feel and texture of the virtual object in the three-dimensional (3D) virtual environment presented in the head-mounted display.

A further objective is to create and place elements in the virtual world. The system provides elements that can be placed into the virtual environment in real time to enhance the user's experience and stimulate the imagination. These elements are designed around the scene the user is experiencing, and the user can place them into the virtual environment as required.

1.3 Usefulness of the project to the industry/user/society

  • Medical industry: It is difficult to practise operations on cadavers, so with our product students can perform surgery freely in simulation.
  • Aviation industry: It is not possible to let every new student fly a plane, so with our product's simulation one can learn easily without causing any damage.
  • Defence industry: By creating a war-like environment, soldiers can train and gain experience.
  • Gaming industry: Virtual reality is widely seen as the future of gaming, so our product focuses on VR games to give users the best gaming experience.

1.4 Materials and methods used

  • Smartphone app:
      • Unity game engine
      • Leap Motion SDK
      • Google VR SDK
  • Server:
      • MEAN stack
      • 3D Blender object hosting
      • Mosca MQTT server
  • Other software used:
      • OpenCV

2. Literature Review

The basic idea of this project came from a work of fiction describing "full-dive" technology: the immersion of the user's consciousness into virtual reality, letting the user experience the virtual world as part of reality. The work depicts nerve signals transmitted from the headset to simulate the senses. Research along those lines made it clear that such a model is not feasible with present technology, so the focus shifted to simulating the senses through the means available. Vision and hearing are already simulated; taste and smell remain untapped, and work on touch is limited by the lack of technology and methodology. Our focus therefore turned to simulating the sense of touch for a near-real virtual reality experience. Further research showed that existing work on touch simulation relied on bulky hardware setups that were not at all practical for everyday use.

The research also opened up the possibility of remote surgery that mimics the surgeon's hand movements and provides responses in real time. The existing da Vinci telesurgery system works along the same lines but lacks the feedback mechanism needed to simulate the feeling of touch. The same line of thought also led us to superimpose the virtual world onto the real one, providing a virtual experience along with real touch.

Some related patents are summarized below:

Integrated virtual reality rehabilitation system

US 5429140 A

A rehabilitation system employs a force feedback system, such as a force feedback glove, to simulate virtual deformable objects. Prior to rehabilitation, the patient places his or her hand in a sensing glove which measures the force exertable by the patient's digits. Information from the sensing glove is received by an interface and transmitted to a computer, where it can be used to diagnose the patient's manual capability. The computer generates rehabilitation control signals for a force feedback glove. The patient places his or her hand in the force feedback glove and attempts to bring the digits together as though grasping the virtual object. The force feedback glove resists the squeezing movement of the digits in a manner that simulates the tactile feel of the virtual object. The force exerted by the patient's fingers is fed back to the computer control system, where it can be recorded and/or used to modify future rehabilitation control signals. The basic concept of rehabilitation in a virtual environment with force feedback can also be applied to other parts of the human body, including arms, legs, neck, knees, elbows and other articulated joints.

Figure 1) Sensing glove

Tactile feedback mechanism for a data processing system

US 5986643 A

An apparatus for providing a tactile stimulus to a part of the body of a physical operator when a virtual operator, created by movements of the physical operator, encounters a virtual object defined by a computer. A signalling unit communicates with the computer and emits a signal when the virtual operator encounters a virtual object. A stimulus unit responsive to the signalling unit is disposed in close proximity to a part of the body of the physical operator for providing a tactile stimulus when the virtual operator encounters a virtual object. The stimulus unit may comprise a segment of memory metal which undergoes a martensitic transformation to a different form or a solenoid having a member which moves in response to a signal emitted by the signalling unit. A vibrating member, such as a piezoceramic bender may be used instead of or in addition to the solenoid or memory metal.

Figure 2) Tactile stimulus

Tactile feedback man-machine interface device

US 6424333 B1

A man-machine interface which provides tactile feedback to various sensing body parts is disclosed. The device employs one or more vibrotactile units, where each unit comprises a mass and a mass-moving actuator. As the mass is accelerated by the mass-moving actuator, the entire vibrotactile unit vibrates. Thus, the vibrotactile unit transmits a vibratory stimulus to the sensing body part to which it is affixed. The vibrotactile unit may be used in conjunction with a spatial placement sensing device which measures the spatial placement of a measured body part. A computing device uses the spatial placement of the measured body part to determine the desired vibratory stimulus to be provided by the vibrotactile unit. In this manner, the computing device may control the level of vibratory feedback perceived by the corresponding sensing body part in response to the motion of the measured body part. The sensing body part and the measured body part may be separate or the same body part.

Figure 3) Mass-moving actuator

Mixed reality space image providing apparatus

US 20140085298 A1

A mixed reality space image providing apparatus configured to provide a user with a mixed reality space image, in which a virtual object image is superimposed on a real space image, is provided. It includes a selection unit configured to select simulation processing from among a plurality of types of simulation processing based on an instruction from the user, a simulation processing unit configured to perform the selected simulation processing on the virtual object image, and a providing unit configured to generate the mixed reality space image by superimposing the simulation-processed virtual object image on the real space image and to provide it to the user.

Figure 4) Mixed reality space

3. Plan of our work

3.1 System Architecture

3.1.1 Connection – Connection establishment between smartphone, server and gloves.

  • The smartphone initiates the process and sends a connection request to the server.
  • The gloves also connect to the available server and remain open until paired with a smartphone.
  • After connecting to the server, the smartphone requests a glove from the available ones.
  • Pairing is done on the server side and the network is created.
  • Finally, all connections are verified and the network is locked.
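The pairing steps above can be sketched as a small server-side state machine. This is a plain in-memory illustration (the class and method names are ours); in deployment the messages would travel over the Mosca MQTT broker rather than direct calls:

```python
class PairingServer:
    """In-memory sketch of the connection flow in section 3.1.1."""

    def __init__(self):
        self.available_gloves = []   # gloves waiting to be paired
        self.pairs = {}              # smartphone id -> glove id
        self.locked = False

    def glove_connect(self, glove_id):
        # Gloves connect first and remain open until paired.
        self.available_gloves.append(glove_id)

    def phone_request(self, phone_id):
        # The smartphone asks for a glove from the available ones;
        # pairing is done on the server side. Returns None if none free.
        if not self.available_gloves:
            return None
        glove_id = self.available_gloves.pop(0)
        self.pairs[phone_id] = glove_id
        return glove_id

    def lock(self):
        # Finally, the verified network is locked against new pairings.
        self.locked = True
        return dict(self.pairs)
```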

3.1.2 Interaction – Virtual object interaction feedback returned to the gloves through the server (Phase 1).

  • The smartphone triggers the server when the user enters the buffer area around a virtual object.
  • The server holds the data received from the smartphone.
  • The server transmits the data to the gloves on receiving an interaction trigger from the smartphone.
  • Otherwise, the server discards the data.
  • The gloves (NodeMCU) execute the commands on receiving the trigger from the server and drive the motors accordingly.
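The hold/forward/discard logic can be sketched as follows (an illustrative reduction with names of our own choosing; in the real system the trigger and data arrive as messages from the smartphone):

```python
class InteractionRelay:
    """Server-side relay for Phase 1 (section 3.1.2): buffer-area data is
    held until an interaction trigger arrives, otherwise it is discarded."""

    def __init__(self):
        self.pending = None

    def on_buffer_enter(self, feedback_data):
        # The server holds the data received from the smartphone.
        self.pending = feedback_data

    def on_interaction_trigger(self):
        # On a trigger, forward the held data to the glove (NodeMCU).
        data, self.pending = self.pending, None
        return data

    def on_buffer_exit(self):
        # No interaction happened: the server discards the data.
        self.pending = None
```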

3.1.3 Uplink – Transmitting the smartphone video feed to the server.

  • Send the video feed from the smartphone camera to the server at a frame rate matched to the available bandwidth.
  • Scan for objects using image processing techniques.
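One part of this step that is easy to show in isolation is matching the uplink frame rate to the available bandwidth. The sketch below (an illustrative pure-Python stand-in; the real feed would be encoded and streamed) keeps only frames spaced far enough apart for a target fps:

```python
def throttle_frames(timestamps, max_fps):
    """Keep only frames at least 1/max_fps seconds apart, so the
    smartphone-to-server uplink stays within its bandwidth budget.

    timestamps: capture times in seconds, in ascending order."""
    min_gap = 1.0 / max_fps
    kept, last = [], None
    for t in timestamps:
        if last is None or t - last >= min_gap:
            kept.append(t)
            last = t
    return kept
```

Dropping frames on the phone, before transmission, is what keeps the object-scanning pipeline on the server from falling behind the live feed.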

3.1.4 Morphosis – Scan the environment and overlap the real objects with the virtual objects of the environment.

  • Call the Uplink module.
  • Transform the environment into the selected virtual environment.
  • Map the real objects to corresponding virtual objects based on the selected virtual environment.
  • Overlay the mapped virtual objects onto the corresponding real objects in the transformed video feed.
  • Send the transformed video to the smartphone.
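A stripped-down sketch of the mapping step, using a hypothetical theme table (the theme names and object labels are illustrative; the actual mapping depends on the virtual environment the user selects):

```python
# Hypothetical mapping from detected real-object labels to themed
# virtual stand-ins; the real system would load these per environment.
THEME_MAP = {
    "medieval": {"chair": "wooden throne", "table": "stone altar"},
    "sci-fi": {"chair": "pilot seat", "table": "console desk"},
}

def morph(detections, theme):
    """Replace each detected real object with its themed virtual object,
    keeping its position so the overlay lands on the physical item."""
    mapping = THEME_MAP[theme]
    return [
        {"virtual": mapping.get(d["label"], d["label"]), "pos": d["pos"]}
        for d in detections
    ]
```

Keeping the real object's position in the output is what lets the virtual stand-in double as a collision guard: the user steers around the virtual object and thereby around the physical one.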

3.1.5 Identification – Fetches the details of the real objects.

  • Call the Uplink module.
  • Fetch the data corresponding to the real objects.
  • Send the data to the smartphone as required.

3.2 Sequence Diagram

Figure 5) Sequence Diagram

   

3.3 Flow Diagram

Figure 6) System Flow Diagram

4. Design

4.1 AEIOU Summary Canvas

Just as the vowels "AEIOU" are an important part of the English language, the AEIOU sheets are important for designing the product. We selected a gaming parlour as our project spot.

ACTIVITIES

We listed the general impression of the activities we expected and the actual activities happening there.

ENVIRONMENT

The atmosphere we found there was a bit different from the world outside.

INTERACTION

Several interactions were taking place; some were useful, others not. The interactions that drew our attention to the project were people ranting at each other while playing, and people adjusting controls and setting up.

OBJECT

Objects play an important role in understanding the problem statement, as they are actively involved in the activities and interactions taking place. Objects related to our domain included computers, headphones, joysticks, a PlayStation and a TV screen.

USER

The user is the one doing the activities and having the interactions; we are building a solution to ease their work. The user categories we found would benefit from the product were students, gamers, teachers and others.

Figure 7) AEIOU Summary

4.2 Empathy Mapping Canvas

The empathy mapping canvas includes the users and stakeholders who interact, directly or indirectly, at the site.

In the empathy canvas, we first selected users so as to focus on the particular problems they face. It then lists the activities those users perform; in our case these include aviation simulation, defence simulation and surgery simulation.

Finally, there is storyboarding, which contains two happy stories and two sad stories describing the need for this project and its context.

Figure 8) Empathy Mapping

4.3 Ideation Canvas

The ideation canvas consists of sections for people, activities, situation/context/location and props. The people section includes those who will use our product: individuals, business organizations, administrators and so on.

Next is the activities section, which lists the activities that can be performed with our product, such as aviation simulation, defence simulation and surgery simulation.

Then come the situations/contexts/locations where the product will be useful: the gaming industry, defence industry, aviation industry, education and so on.

Lastly are the props the product will need, such as a Leap Motion sensor, NodeMCU, gloves, a VR headset, a smartphone and a server.

Figure 9) Ideation Canvas

4.4 Product Development Canvas

Moving on to the Product Development Canvas, we analysed, studied and noted various aspects of "Vii Touch", including its purpose, people, product experience, product functions and product features.

The product development canvas describes the purpose of the product, the people involved, and the product's features, functions and components.

Figure 10) Product Development Canvas

5. Implementation of the System

We created some 3D objects using Blender and Unity, which are used in our application. Below are some snapshots of the objects created:

Figure 11) Cube

Figure 12) Sphere

Figure 13) Tyre

Figure 14) 3D Character

Figure 15) Environment

Figure 16) Environment with object

6. Conclusion

This system consists of a peripheral device for interfacing with a virtual reality scene generated by a computer for presentation on a head-mounted display. The peripheral device can be placed in contact with the user and receives feedback, outputting a touch signal to the wearable device so the user can sense objects in the virtual environment. A sensing feedback controller receives instructions from the hand-worn device corresponding to the user's virtual interactions with a virtual object in the virtual reality scene as presented on the head-mounted display. The system also enhances the experience by allowing additional, imaginative elements to be placed into the virtual world, and it protects the user from colliding with objects in the physical world by superimposing a virtual object, in the same orientation, on top of the image of its physical counterpart.

7. References

Scanning Reality into Virtual Reality

https://patents.google.com/patent/US6072466A/en?q=virtual&q=object&q=manipulation

https://patents.google.com/patent/US20040238732A1/en?q=overlap&q=real&q=objects&q=virtual&q=objects&q=virtual+reality

