My three keywords
Sensory Visualisation, Power Balance and Future Space Robots
Speaking with Ben, our advisor, we got direction on a vision for our program and how to take it into real life.
We conducted a bare-minimum physical run of what we think our game will be. These are some of the rules we came up with:
- 2 People
- Initially, 1 will be blind and 1 will be deaf (there will be no switching of senses for this experiment)
- No outsider can help the two
- There have to be light obstacles to cross
- Have an objective, in this case a red shiny button
- Only the blind player can reach the button/objective
Unique to this run:
- The speaker is stationary (can't move along with the blind player)
- Information is shared using spoken English
What we observed and found out during this run:
- Breakdown in interpretation – since the English language is diverse, commands like "small", "a little", "turn" and "rotate" are very subjective; "a little" can mean different actions to different people
- Trust issues – our initial guinea pigs knew each other fairly well, but we discovered that the blind player simply has to trust the voice to take them to the destination. Walking blindfolded is hard even if you know the room well, let alone walking blindfolded over obstacles
- Maybe don't use words – if we are running with robots, beeps could work, but they can still be interpreted differently
- Have instructions that can't be misinterpreted – but that removes the player's unique style
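As a thought experiment on the "instructions that can't be misinterpreted" idea, here is a minimal sketch (all names here are hypothetical, not part of our game yet) of a three-word command vocabulary on a grid. Each word has exactly one meaning, so unlike "turn a little", there is nothing for the blind player to interpret:

```python
# Hypothetical sketch: a fixed, unambiguous command vocabulary.
# The guide may only say "step", "left", or "right"; each command
# maps to exactly one movement, so two players can never disagree
# on what an instruction means.

TURNS = {"left": -1, "right": 1}
HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W as (dx, dy)

def walk(commands, start=(0, 0), heading=0):
    """Replay a command list; return the blind player's final (x, y)."""
    x, y = start
    for cmd in commands:
        if cmd == "step":
            dx, dy = HEADINGS[heading]
            x, y = x + dx, y + dy
        elif cmd in TURNS:
            heading = (heading + TURNS[cmd]) % 4
        else:
            raise ValueError(f"unknown command: {cmd}")
    return x, y

# Starting at (0, 0) facing north:
# walk(["step", "right", "step"]) → (1, 1)
```

The trade-off we noted shows up clearly here: the protocol is foolproof, but every guide sounds identical, so the player's unique style of giving directions is gone.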