Players of the virtual reality game Star Trek Bridge Crew will be able to control the Starship Enterprise using voice commands, following a collaboration with IBM, whose Watson system interprets the players' speech.
IBM Watson works with a program called Conversation to interpret the commands.
The game was released last month, but the new voice command feature will be unlocked on Thursday.
One player takes the role of captain, joined by two other crew members who are controlled either by other people or by the computer.
“The idea is you can now talk to your bridge crew, and that part has been powered by Watson,” said Joshua Carr, technical liaison at IBM.
“Originally, there was a set of menus to click to instruct the helm and so on.
“It works fairly well, but it is the lowest common denominator: we are using our hands to give out instructions. But this is virtual reality, and this is Star Trek.
“When we think about some of the incredible lines by Patrick Stewart’s Jean-Luc Picard or [William Shatner’s] Captain Kirk, it’s all about your voice – how you communicate.”
Piers Harding-Rolls, a research director for IHS Markit, told the BBC News website: “In your average video game experience, you don’t have things like voice control.
“When it comes to virtual reality, you’re looking for something to keep you in the experience.
“Using your voice to engage with characters in the game is a step further; it adds believability to the experience you’re having, unlike if you had to use a controller.”
Trying it out
If you’ve ever wondered what it’s like to utter the immortal words “Engage” and “Warp speed ahead” and then zoom off around outer space, then you are in for a bit of a treat.
Star Trek Bridge Crew is a triple-A title, which means it had a huge budget, and it shows.
The graphics are impressive and the production values are high.
It retails at about £35 in the UK, but you also need a high-end VR headset to enjoy it – and they cost considerably more.
Crucially, using the new voice control, you don’t have to use set phrases to communicate – you can use your own words. Or, at least, that’s the plan.
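Accepting free phrasing rather than set commands is the job of an intent-classification system such as Watson Conversation, which maps many different utterances to the same underlying action. As a purely illustrative sketch (this is not IBM's implementation; the intents, keywords, and function names below are hypothetical), a crude keyword-overlap matcher conveys the basic idea:

```python
# Toy intent matcher: maps free-form phrases to bridge commands.
# Real systems such as Watson Conversation use trained NLP models,
# not keyword lists; every name here is hypothetical.

INTENT_KEYWORDS = {
    "engage_warp": {"warp", "engage", "speed"},
    "beam_aboard": {"beam", "transport", "teleport"},
    "scan_vessel": {"scan", "sensors", "analyse"},
}

def classify_command(utterance: str) -> str:
    """Return the intent whose keyword set best overlaps the utterance."""
    words = set(utterance.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(classify_command("Warp speed ahead"))  # engage_warp
print(classify_command("Beam them up"))      # beam_aboard
```

A keyword matcher like this illustrates why such systems can fail on valid phrasings (as the hands-on test below found): any wording outside the known vocabulary falls through to "unknown" with no explanation of what went wrong.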
Being captain of the USS Enterprise for a short while was great fun, but the demo model I tried wasn’t entirely able to follow my commands.
At one point, it felt more like a game of charades as I struggled to think of as many different ways as possible of telling my crew to beam aboard the crew of a stricken vessel we were supposed to rescue – it failed to understand the essential Trekkie phrase “beam them up”.
This turned out to be a rookie mistake on my part, as I’d forgotten to scan the virtual vessel first. But my virtual crew had no response when I asked them why they didn’t understand me; it would have been useful if they had been able to tell me what I’d done wrong.
Joshua Carr said this would be part of the learning process for the Watson-enabled software as it gets to grips with human speech.