How can Stephen Hawking talk?
As his condition worsened, Hawking started slurring his words due to his loss of muscle control; he completely lost the ability to speak after a surgery, according to Biography.

At first, Hawking relied on a hand-held clicker to choose his words, which were then synthesized into speech. When he lost the use of his hands, he needed an alternative to the clicker, and he switched to a system that detects facial movement.

The program, run by Intel, that allowed Hawking to select characters and words is called ACAT, or Assistive Context-Aware Toolkit. An infrared switch attached to Hawking's glasses detected the movements he made with his cheek. The program automatically scanned a cursor across an on-screen keypad, and Hawking moved his cheek to stop the cursor on the character he wanted, he wrote on his website.
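The scan-and-stop selection described above is a standard single-switch input technique. Here is a minimal sketch of how row/column scanning works, with a list of switch-trigger positions standing in for cheek movements; this is an illustration of the general technique, not Intel's actual ACAT code.

```python
# Toy single-switch scanning keyboard: the cursor steps through rows,
# a switch trigger picks the row, then the cursor steps through keys
# in that row, and a second trigger picks the key.

ROWS = ["abcdef", "ghijkl", "mnopqr", "stuvwx", "yz_.,?"]

def scan_select(switch_presses):
    """Simulate one selection. `switch_presses` holds the scan step at
    which the user triggered the switch: first for the row, then for
    the key within that row."""
    row_step, key_step = switch_presses
    row = ROWS[row_step % len(ROWS)]
    return row[key_step % len(row)]
```

With one binary input, every character costs only two well-timed triggers, which is why scanning interfaces suit users with very limited movement.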

By Dave Gershgorn, artificial intelligence reporter.

Published March 14. Before the Intel project, Hawking had tested EEG caps that could read his brainwaves and potentially transmit commands to his computer, but the researchers were never able to get a strong enough signal-to-noise ratio. After returning to Intel Labs, and after months of research, Denman prepared a video to send to Hawking, delineating which new user-interface prototypes the team wanted to implement and soliciting his feedback.

The changes included additions such as a "back button," which Hawking could use not only to delete characters but to navigate a step back in his user interface; a predictive-word algorithm; and next-word navigation, which would let him choose words one after another rather than typing them.
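Next-word navigation can be pictured as a menu loop: after each chosen word, the interface offers a handful of likely continuations, and the user picks one instead of typing. The sketch below is a toy illustration of that idea; the offer table, function names, and selection indices are assumptions, not details of Intel's implementation.

```python
# Toy "next-word navigation": the user composes a sentence by repeatedly
# choosing one of a few offered next words. Offers come from a hard-coded
# table here; a real system would use a learned language model.

OFFERS = {
    "<start>": ["the", "a", "i"],
    "the": ["universe", "black", "theory"],
    "black": ["hole", "board"],
    "universe": ["is", "began"],
    "is": ["expanding", "finite"],
}

def compose(choices):
    """Build a sentence from a sequence of menu picks (indices into the
    offer list shown after the previously selected word)."""
    word, sentence = "<start>", []
    for pick in choices:
        word = OFFERS[word][pick]
        sentence.append(word)
    return " ".join(sentence)
```

Each pick costs one selection rather than one selection per letter, which is the whole point of choosing words "one after another rather than typing them."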

The main change, in Denman's view, was a prototype that tackled the biggest problem Hawking had with his user interface: missed key-hits. The system was unbearably slow, and Hawking would get frustrated; he is not somebody who just wants to get the gist of a message across, but somebody who really wants it to be perfect. To address the missed key-hits, the Intel team added a prototype that would interpret Hawking's intentions, rather than his actual input, using an algorithm similar to those used in word processors and mobile phones.
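Interpreting intention rather than literal input is the same idea behind phone autocorrect: map whatever was actually typed to the closest word in a vocabulary. A minimal sketch of that technique, using Levenshtein edit distance over an illustrative vocabulary (not Intel's actual algorithm or data):

```python
# Map noisy typed input to the nearest vocabulary word by edit distance,
# so a missed or spurious key-hit still resolves to the intended word.

def edit_distance(a, b):
    # One-row dynamic-programming Levenshtein distance.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # delete from a
                                     dp[j - 1] + 1,    # insert into a
                                     prev + (ca != cb))  # substitute
    return dp[-1]

def interpret(typed, vocabulary):
    """Return the vocabulary word closest to what was actually typed."""
    return min(vocabulary, key=lambda w: edit_distance(typed, w))
```

The trade-off the article describes follows directly: the user must "release control" and trust the corrector, because the system may output a word the user never literally typed.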

The video explained: "The problem is that it takes a little time to get used to, and you have to release control to let the system do the work. The addition of this feature could increase your speed and let you concentrate on content."

The video concluded: "What's your level of excitement or apprehension?" The team implemented the new user interface on Hawking's computer, and Denman thought they were on the right path. By September, they began to get feedback: Hawking wasn't adapting to the new system.

It was too complicated. Prototypes such as the back button, and the one addressing "missed key-hits," proved confusing and had to be scrapped.

They were trying to teach the world's most famous and smartest grandfather a new way of interacting with technology.

Denman and the rest of the team realized that they had to start thinking differently about the problem; they had to point a laser at one individual and study him. At the end of the year, the Intel team set up a system that recorded how Hawking interacted with his computer. They recorded tens of hours of video covering a range of situations: Stephen typing, Stephen typing when tired, Stephen using the mouse, Stephen trying to get a window to just the right size.

By September, now with the assistance of Jonathan Wood, Hawking's graduate assistant, the team implemented another iteration of the user interface on Hawking's computer.

However, by the following month, it became clear that, again, Hawking was having trouble adapting. It was many more months before the Intel team came up with a version that pleased Hawking.

For instance, Hawking now uses an adaptive word predictor from London startup SwiftKey, which allows him to select a word after typing a letter, whereas his previous system required him to navigate to the bottom of his user interface and select a word from a list. In the beginning he complained about it, and only later did the team realize why: he already knew which words his previous system would predict.

He was used to predicting his own word predictor. Selecting 'black' automatically predicts 'hole'. The new version of Hawking's user interface, now called ACAT (Assistive Context-Aware Toolkit), includes contextual menus that provide Hawking with various shortcuts to speak, search, or email, and a new lecture manager, which gives him control over the timing of his delivery during talks.
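An adaptive predictor of the kind described here can be sketched as a next-word table that learns from what the user actually writes, so 'black' comes to predict 'hole'. This is a hedged illustration of the general technique, with made-up training text; it is not SwiftKey's actual model, which is far more sophisticated.

```python
# Toy adaptive word predictor: count which words follow which in the
# user's own text, then rank suggestions by those counts, optionally
# filtered by the first letters the user has typed.
from collections import Counter, defaultdict

class AdaptivePredictor:
    def __init__(self):
        self.after = defaultdict(Counter)  # previous word -> next-word counts

    def train(self, text):
        """Update counts from text the user has written."""
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.after[prev][nxt] += 1

    def suggest(self, prev_word, prefix="", k=3):
        """Suggest up to k next words, most frequent first, matching prefix."""
        counts = self.after[prev_word.lower()]
        ranked = [w for w, _ in counts.most_common() if w.startswith(prefix)]
        return ranked[:k]
```

Because the rankings shift as counts accumulate, the suggestions drift toward the user's own phrasing over time, which is exactly why a new predictor initially frustrated someone who had memorized his old one.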

It also has a mute button, a curious feature that allows Hawking to turn off his speech synthesizer.


