He takes a more "humanistic" view of the future of human-machine interfaces, one that frees us to be more expressive and requires computers to communicate on our level, not the other way around. That means software that can understand our speech, facial expressions, gestures, and handwriting. These technologies already exist, but have a lot of room for improvement.
One example he gives is holding up your hand to pause a video.
The hack exploits the new "Night Mode" in the firmware, which lets you set a maximum volume for specific hours of the day, creating silent (but still-active) music streaming. "Yep, a hack, but it works," writes Lauren. "And it's the closest we've gotten to a real sleep timer on Google Home so far."
Any other Slashdot readers have their own favorite personal assistant tricks?
It's not just the home, either; Amazon announced a deal to make Alexa available in BMW and Mini vehicles from the middle of next year, allowing drivers to use the digital assistant to get directions, play music or control smart home devices while travelling, without having to use a separate app. Travellers will also have access to Alexa skills from third-party developers like Starbucks, allowing them to order their coffee while driving and thus skip the line. Back in January, Amazon and Ford said they were working together to allow voice commands to turn on the engine, lock or unlock the doors as well as play music and use other skills...
It's still early days but I think Alexa has a good shot at becoming one of the standard interfaces, certainly for consumers -- an operating system for the home, if not more, if the automotive tie-ups take off too. All of this will make Amazon a serious force to be reckoned with. Windows has the desktop, and Android and iOS can fight it out for the smartphone, but right now Alexa has a lock on the smart home.
Last week, at the International Conference on Sampling Theory and Applications, researchers from MIT and the Technical University of Munich presented a technique that they call unlimited sampling, which can accurately digitize signals whose voltage peaks are far beyond an ADC's voltage limit. The consequence could be cameras that capture all the gradations of color visible to the human eye, audio that doesn't skip, and medical and environmental sensors that can handle both long periods of low activity and the sudden signal spikes that are often the events of interest.
One of the paper's authors explains that "The idea is very simple. If you have a number that is too big to store in your computer memory, you can take the modulo of the number."
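That modulo-folding idea can be illustrated with a small sketch (a hypothetical toy model, not the researchers' actual algorithm): a "self-reset" ADC wraps any out-of-range voltage back into its range, and the original signal is recovered by unwrapping the jumps between consecutive samples -- which works as long as the signal moves by less than half the ADC range per sample.

```python
import numpy as np

ADC_RANGE = 2.0  # the simulated ADC outputs values in [-1.0, 1.0)

def fold(signal):
    """Simulate a self-reset ADC: wrap voltages into [-1, 1)."""
    return np.mod(signal + 1.0, ADC_RANGE) - 1.0

def unfold(folded):
    """Recover the signal by detecting wrap-around jumps between
    consecutive samples (the same idea as phase unwrapping)."""
    diffs = np.diff(folded)
    # A sample-to-sample jump near the full ADC range signals a wrap.
    wraps = -np.round(diffs / ADC_RANGE) * ADC_RANGE
    correction = np.concatenate(([0.0], np.cumsum(wraps)))
    return folded + correction

t = np.linspace(0, 1, 500)
signal = 3.5 * np.sin(2 * np.pi * 2 * t)  # peaks far beyond the ADC range
recovered = unfold(fold(signal))
print(np.allclose(recovered, signal))  # True: the full swing is recovered
```

The constraint is the sampling rate: oversample enough that the true signal never changes by more than half the ADC range between samples, and the wraps are unambiguous -- which is why the technique is framed as a sampling theorem rather than a hardware trick alone.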
The Model F Keyboards project, now taking preorders for the new line of authentic retro-boards, was started by Joe Strandberg, a Cornell University grad who's taken up keyboard wizardry as a nights-and-weekends hobby. He started as a collector and restorer of genuine Model F keyboards -- originally produced from 1981 to 1994 -- a process that familiarized him with their virtues and their flaws... Working with a factory in China, Strandberg has carefully overseen the reproduction process one step at a time, from the springs to the unique powder-coating on the keyboard's zinc case. Despite the expense (Strandberg estimates spending $100,000 to revive the tooling necessary for the production run), it was the only viable option given the kind of abuse your average keyboard takes on a daily basis. "With 3D printing," he says, "the keyboard wouldn't last a year."
The first prototypes have just left the assembly line, and he's already racked up over a quarter of a million dollars in pre-orders. Does anyone else fondly remember IBM's hefty and trusty old keyboards?
In a blog post the company emphasized that "We're still absolutely committed to growing the VR film and creative content ecosystem."
U.C. Santa Cruz also remembers Huskey's work on the Bendix G-15 in 1954, "a 950-pound predecessor to today's laptops" which is sometimes hailed as the first personal computer (since it didn't require a separate technician to run) -- though each one cost over $50,000. The idea of an "electronic brain" was still so new, it led Huskey to an appearance on Groucho Marx's radio show You Bet Your Life, where Groucho warned him that "They're pretty tricky those machines! I wouldn't trust 'em... They'll turn on you like a mad dog, doctor!"
A half-decade later, at Xerox's storied Palo Alto Research Center, Mr. Taylor was instrumental in another technological breakthrough: funding the design of the Alto computer, which is widely viewed as the forerunner of the modern personal computer. Mr. Taylor even had a vital role in the invention of the computer mouse. In 1961, at the dawn of the Space Age, he was about a year into his job as a project manager at NASA in Washington when he learned about the work of a young computer scientist at Stanford Research Institute, later called SRI International... Mr. Taylor decided to pump more money into the work, and the financial infusion led directly to Engelbart's invention of the mouse, a computer control technology that would be instrumental in the design of both Macintosh and Microsoft Windows-based computers.
Taylor had become fascinated with human-computer interactions in the 1950s during his graduate work at the University of Texas at Austin, and was "appalled" that performing data calculations required submitting his punch cards to a technician running the school's mainframe computers. Years later, it was Taylor's group at PARC that Steve Jobs visited in 1979, which inspired the "desktop" metaphor for the Macintosh's graphical user interface. And Charles Simonyi eventually left PARC to join Microsoft, where he developed the Office suite of applications.
Taylor died Thursday at his home in Woodside, California, from complications of Parkinson's disease, at the age of 85.
Peggy Gullick, business process improvement director with AGCO, says the addition of Google Glass has been "a total game changer." Quality checks are now 20 percent faster, she says, and it's also helpful for on-the-job training of new employees... Tiffany Tsai, who writes about technology, says AGCO is one of a growing number of companies -- including General Electric and Boeing -- testing it out... Companies working in the health care, entertainment and energy industries are listed as some of the Google Glass certified partners.
AGCO plans to have 200 workers using Google Glass by the end of this year.
According to neuroscientists, several figures from the tech sector are currently scouring labs across the U.S. for technology that might fuse human and artificial intelligence. In addition to Johnson, Elon Musk has been teasing a project called "neural lace," which he said at a 2016 conference will lead to "symbiosis with machines." And Mark Zuckerberg declared in a 2015 Q&A that people will one day be able to share "full sensory and emotional experiences," not just photos. Facebook has been hiring neuroscientists for an undisclosed project at Building 8, its secretive hardware division.
Elon Musk complains that the current speeds for transferring signals from brains are "ridiculously slow".
His conclusion? "This is probably the first AAA game that actually works on the Vive."
"No proprietary software," explains their campaign's video. "No backdoors. No spyware. No NDAs." They envision a world where users upgrade their computers by simply popping in a new card -- reducing electronic waste -- or print new laptop casings to repair defects or swap in different colors. (And they also hope to eventually see the cards working with cameras, phones, tablets, and gaming consoles.) Rhombus Tech CTO Luke Leighton did a Slashdot interview in 2012, and contacted Slashdot this weekend to announce: A live-streamed video from Hope2016 explains what it's about, and there is a huge range of discussions and articles online. The real burning question is: if a single Software Libre Engineer can teach themselves PCB design and bring modular computing to people on the budget available from a single company, why are there not already a huge number of companies doing modular upgradeable hardware?
In December 2014 the company had introduced their first product, a dock that used the MHL standard to output to an external monitor. That campaign failed. Their newest creation, the Superbook, however, smashed its Kickstarter goal in just over 20 minutes.
And within their first 38 hours, they'd crowdfunded $500,000. In an intriguing side note, Andromium "says it'll open its SDK so developers can tailor their apps for Andromium, too, though how much support that gets remains to be seen," reports Tech Insider. But more importantly, "Andromium says its prototypes are finished, and that it hopes to ship the Superbook to backers by February 2017."
The Register points out that since Windows RT is "a dead-end operating system" which Microsoft has announced they'll stop developing, "mainstream support for Surface RT tablets runs out in 2017 and Windows RT 8.1 in 2018. This is why a means to bypass its boot mechanisms is highly sought."
The app's popularity has created lagging servers and forced Niantic to delay its international roll-out, meaning "Those who have already downloaded the game in the U.S., Australia and New Zealand can still play it, while those in the U.K., the Netherlands and other countries will have to wait." Meanwhile, Motherboard warns that a malicious sideloaded version of Pokemon Go is being distributed that actually installs a backdoor on Android devices, and also reports that some players are already spoofing their GPS coordinates in order to catch Pokemon without leaving their house.
Last November searches for the term experienced the "spike of all spikes", according to a post on the VR Talk forum, which also identifies the top cities (three in Australia) for the searches -- Helsinki, Melbourne, Sydney, Brisbane, Singapore, Tel Aviv, and Seoul.