1. Tagging of metadata: This involves creating a text file containing the sequence of events along with their corresponding timestamps. A program takes this file and the video as input and generates an XML file containing the video's metadata (see the first sketch after this list).
2. The haptic vest: The vest should simulate the appropriate sensation at the correct location on the user's body. It contains six vibrating motors, five on the front and one on the back. For example, if an action movie is playing and a character is shot on the right side of the chest, the motor at the top right of the vest vibrates to give the user the sensation of a gunshot (see the vest-mapping sketch after this list).
3. Couch and lighting control: This subsystem should provide the appropriate lighting and sound environment based on the sequence playing on the TV. For example, if a soccer game is on and a goal is scored, the lights in the room flash to mirror the celebration in the game.
4. Open Sound Control implementation: Develop a message API based on the Open Sound Control (OSC) protocol so that the message format is generalized. This makes adding new nodes easy, since most of the work is done by the main node and the slave nodes simply follow its commands (see the OSC sketch after this list).
Reach goal: We intend to develop an ambient backlight system driven by the video sequence currently playing, to help the eye adapt to changes in light intensity when switching between dark and bright scenes.
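To make the tagging step concrete, here is a minimal sketch in Python. The event-file layout (one timestamp/event pair per line), the tag names, and the file names are assumptions made for illustration, not the project's actual format.

```python
# Sketch: convert a plain-text event log into an XML metadata file.
# Assumed input format (hypothetical): one "timestamp<TAB>event" pair per line, e.g.
#   00:01:23    gunshot_right_chest
#   00:04:10    goal_scored
import xml.etree.ElementTree as ET

def tag_metadata(event_file, video_name, output_xml):
    root = ET.Element("metadata", video=video_name)
    with open(event_file) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            timestamp, event = line.split("\t", 1)
            ET.SubElement(root, "event", time=timestamp, name=event)
    ET.ElementTree(root).write(output_xml, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    tag_metadata("events.txt", "action_movie.mp4", "action_movie_meta.xml")
```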
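The vest needs a mapping from the body location named in the metadata to one of the six motors. A minimal sketch follows; the motor numbering (0-4 on the front, 5 on the back) and the location names are assumptions for illustration only.

```python
# Sketch: map a body location from the metadata to one of the six vest motors.
# Motor numbering (0-4 front, 5 back) and location names are assumptions.
MOTOR_MAP = {
    "chest_right_top": 0,
    "chest_left_top": 1,
    "chest_right_bottom": 2,
    "chest_left_bottom": 3,
    "abdomen": 4,
    "back": 5,
}

def motor_for_event(location, default=None):
    """Return the motor index for a body location, or default if unknown."""
    return MOTOR_MAP.get(location, default)

# Example: a gunshot on the right side of the chest drives motor 0.
assert motor_for_event("chest_right_top") == 0
```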
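The couch/lighting control and the OSC message API come together in the main node, which translates tagged events into commands for the slave nodes. Here is a minimal sketch using the python-osc package; the node IPs, ports, address patterns (/vest/motor, /lights/flash), and argument layouts are illustrative assumptions, not the project's actual API.

```python
# Sketch: main node sending commands to slave nodes over OSC.
# Requires the python-osc package (pip install python-osc).
# IPs, ports, and address patterns below are illustrative assumptions.
from pythonosc.udp_client import SimpleUDPClient

VEST_NODE = SimpleUDPClient("192.168.1.20", 9000)   # haptic vest controller
LIGHT_NODE = SimpleUDPClient("192.168.1.21", 9001)  # couch/lighting controller

def handle_event(event_name):
    """Translate a tagged video event into OSC messages for the slave nodes."""
    if event_name == "gunshot_right_chest":
        # motor index 0, vibrate for 500 ms (hypothetical argument layout)
        VEST_NODE.send_message("/vest/motor", [0, 500])
    elif event_name == "goal_scored":
        # flash the room lights three times
        LIGHT_NODE.send_message("/lights/flash", 3)

if __name__ == "__main__":
    handle_event("gunshot_right_chest")
    handle_event("goal_scored")
```

Because every slave node only parses OSC messages addressed to it, adding a new node amounts to assigning it an address pattern and extending the main node's event handling.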