AI

The New iPad Pro's LIDAR Sensor Is An AR Hardware Solution In Search of Software (theverge.com)

One of the biggest additions to Apple's new iPad Pro is a "Light Detection and Ranging" (LIDAR) system on the rear camera, which Apple argued was the missing piece for revolutionary augmented reality applications. "It claims that by combining the depth information from the LIDAR scanner with camera data, motion sensors, and computer vision algorithms, the new iPad Pro will be faster and better at placing AR objects and tracking the location of people," reports The Verge. "But it doesn't change the fact that, right now, there still aren't a lot of compelling reasons to actually use augmented reality apps on a mobile device beyond the cool, tech-demo-y purposes that already exist." From the report: AR apps on iOS today are a thing you try out once, marvel at how novel an idea it is, and move on -- they're not essential parts of how we use our phones. And nearly three years into Apple's push for AR, there's still no killer app that makes the case for why customers -- or developers -- should care. Maybe the LIDAR sensor really is the missing piece of the puzzle. Apple certainly has a few impressive tech demos showing off applications of the LIDAR sensor, like its Apple Arcade Hot Lava game, which can use the data to more quickly and accurately model a living room to generate the gameplay surface. There's a CAD app that can scan and make a 3D model of a room to see how additions will look. Another demo promises accurate determinations of the range of motion of your arm.

The fact that Apple is debuting this AR hardware on the iPad doesn't help the case, either. While Apple has been rumored to be working on a proper augmented reality headset or glasses for years -- a kind of product that could make augmented digital overlays a seamless part of your day-to-day life -- the iPad (in 11-inch and 12.9-inch sizes) is effectively the opposite of that idea. It has the same awkwardness as the man who holds up an iPad to film an entire concert; holding a hardcover-book-sized display in front of your face for the entire time you're using it just isn't a very natural use case.

It's possible that Apple is just laying the groundwork here, and more portable LIDAR-equipped AR devices (like a new iPhone or even a head-mounted display) are on their way in the future. Maybe the LIDAR sensor is the key to making more immersive, faster, and better augmented apps. Apple might be right, and the next wave of AR apps really will turn the gimmicks into a critical part of day-to-day life. But right now, it's hard not to look at Apple's LIDAR-based AR push as another hardware feature looking for the software to justify it.

Apple

On iPad Getting a Trackpad (learningbyshipping.com)

Apple on Wednesday announced the Magic Keyboard, featuring a trackpad, that will work with newly unveiled iPad Pro models and some previous generation iPads. Is this the "convergence" everyone had been waiting for? A "2 in 1" or a tablet or a toaster-refrigerator? Did Apple capitulate? Some context on the evolution of devices, from Steven Sinofsky, former President of the Windows Division at Microsoft. He writes: Hardware evolves just like software, but we don't often see it the same way. We're used to talking about the cycle of software bundling and unbundling, but hardware does the same thing. Every new generation of hardware begins this cycle anew. Certainly we're used to hardware adding ports or absorbing new technologies over time. Where things get really interesting with hardware is when a new "form" is introduced; often the first step is jettisoning many features from the leader. With the introduction of a form, the debate immediately begins over whether the new form can take over or whether it is a substitute for the old one. Tech dialog is rather divisive over these questions (dodged by marketing): "It can never work" or "It will eventually work." The industry works hard to create these dividing lines, first because there are usually new manufacturers making the new form.

Second, pundits attach labels to form factors and begin a process of very specific definitions (dimensions, peripherals, and so on). The first of these transitions I remember is the introduction of portable computers (1987). Out of the gate, these were far less powerful than "PCs." The debate over whether a portable could "replace" a "PC" was in full force. Quickly the portable form factor evolved, and with it came all sorts of labels: luggable, portable, notebook, desktop, sub-notebook, and so on. This continued all the way until the introduction of "ultra-books." If you're a maker, these labels are annoying at best. Quite often they are marketing at work -- manufacturers looking to differentiate an otherwise commodity product create a new name for the old thing done slightly differently. Under the hood, however, the forms are evolving, and often in surprising ways. The evolution of new forms almost always follows the surprising pattern of *adding back* all those things from the old form factor. So all those portables added floppies, then hard disks, then expansion through ports and docks, and ultimately CPUs as powerful as a desktop's.

Then we wake up one day, look at the "new" form, and realize it seems to have morphed into the old form, capabilities and all. All along the way, the new form is editing, innovating, and reimagining how those old things should be expressed in the new one. These innovations can change software or hardware. This is where hardware interfaces like USB come from -- the needs of the new form dictate new types of hardware even if they solve the same problem again. The evolution of PCs into servers offers an interesting arc. PCs were literally created to be smaller and less complex computers. They eliminated all the complexity of mainframes at every level while making computing accessible and cheap. When PCs first began to do server tasks, they did them in an entirely different way than the mainframes that were the servers of the day. They used commodity desktop PCs -- literally the same as a desktop running in an office. That was the big advantage: cheap, ubiquitous, open! Mainframe people balked at this crazy notion. It was an obvious moment of "that is a toy." Then the age of client-server computing was before us, starting in the late 1990s. But what followed was a classic case of convergent evolution. PC servers started to add attributes of mainframes. At first this seemed totally crazy -- redundant power supplies, RAID, multiple processors, etc. THAT was crazy stuff for those $1M mainframes. Pretty soon, telling a PC server from a mainframe at the hardware level became a vocabulary exercise. And here we are today, where servers have stripped away the very elements rooted in the PC (like monitors and keyboards!). Guess what? That's a mainframe! On Twitter, this would be: "Mainframe, you've invented a mainframe."

Except the operating system and software platform are entirely different. The evolution was not a copy, but a useful convergence: an early series of steps copying the old form, followed by distinct and innovative approaches that created new value ... a new form factor. So here we are today with an iPad that has a trackpad. Many are chuckling at the capitulation -- that the iPad was never a real computer and finally Apple admitted it. Laptop, Apple has invented the laptop. This was always going to happen. From the earliest days there were keyboard cases that turned iPads into "laptops" (without trackpads), and these could be thought of as experiments copying the past. It took time (too much?) to invent the expression of the old in the new. The PC server everyone uses in the cloud today is no mainframe. It is vastly cheaper, more accessible, more scalable, and runs different software (yes, people will fight me on these in some way, but the pedantic argument isn't the point). Adding a trackpad to the iPad was done in a way that reimagined not just the idea of a pointer, but the entire package -- hardware and software. That's what makes this interesting. To think of it as capitulation would be to do so independent of how computing has evolved over decades.

Entertainment

Netflix Urged To Slow Down Streaming To Stop the Internet From Breaking (cnn.com)

The European Union is urging Netflix and other streaming platforms to stop showing video in high definition to prevent the internet from breaking under the strain of unprecedented usage due to the coronavirus pandemic. From a report: With so many countries on forced lockdowns to fight the spread of the virus, hundreds of millions working from home and even more children out of school, EU officials are concerned about the huge strain on internet bandwidth. European Commissioner Thierry Breton, who is responsible for the EU internal market covering more than 450 million people, tweeted Wednesday evening that he had spoken with Netflix CEO Reed Hastings. Breton called on people and companies to "#SwitchtoStandard definition when HD is not necessary" in order to secure internet access for all. "Commissioner Breton is right to highlight the importance of ensuring that the internet continues to run smoothly during this critical time," a Netflix spokesperson said. "We've been focused on network efficiency for many years, including providing our Open Connect service for free to telecommunications companies."
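To see why the request matters, here is a rough back-of-envelope sketch in Python. The bitrates are assumed typical values (about 5 Mbps for an HD stream and 1 Mbps for SD; real Netflix bitrates vary by codec and title), and the concurrent-stream count is purely hypothetical, not a figure from the report:

```python
# Assumed per-stream bitrates; illustrative only, not official Netflix numbers.
HD_MBPS = 5.0
SD_MBPS = 1.0

def aggregate_gbps(concurrent_streams: int, mbps_per_stream: float) -> float:
    """Total network load in Gbps for a given number of simultaneous streams."""
    return concurrent_streams * mbps_per_stream / 1000

streams = 10_000_000  # hypothetical concurrent streams across Europe

hd_load = aggregate_gbps(streams, HD_MBPS)
sd_load = aggregate_gbps(streams, SD_MBPS)
print(f"HD: {hd_load:,.0f} Gbps")
print(f"SD: {sd_load:,.0f} Gbps")
print(f"Freed by switching: {hd_load - sd_load:,.0f} Gbps")
```

Under these assumptions the switch frees roughly 80% of the streaming load, which is why a single quality setting was worth a commissioner's phone call.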

Education

Code.org: 'Our Team Will Teach Your Children At Home While School Is Closed'

theodp writes: In a Medium post, tech-backed Code.org explains how it will be "supporting our community during school closures," which includes "a major investment in online education without an in-person instructor" and other offerings. From the signup form for Code Break: "With schools closed and tens of millions of students at home, Code.org is launching Code Break -- a live weekly webcast where our team will teach your children [K-12 computer science] at home while school is closed, and a weekly challenge to engage students of all abilities, even those without computers. [...] Computer science is foundational to all fields of study, but since many schools don't offer it yet, this could be a unique chance to support your child in a fun new learning opportunity."

Interestingly, Code.org will be competing with its own corporate donors for homebound kids' attention. Microsoft is offering limited-time free Minecraft: Education Edition licenses as its way "to help teachers and students stay connected to the classroom" during school closures. And Google has come up with a curated list of distance learning resources for schools affected by COVID-19 (think Google Hangouts and Chromebooks), as has Facebook for Education ("If school is closed, Messenger Kids is a way to continue the social interactions the students might have at school"). Amazon is also pitching CS study for homebound kids: "As classrooms across the U.S. experience educational disruption during the pandemic, Amazon Future Engineer will initially provide free access to our sponsored computer science courses in the United States [thru Aug. 31]. These courses are for independent learners from 6th to 12th grade, or teachers who are teaching remotely to this age group."

Moon

Can Astronauts Use GPS To Navigate On the Moon? (ieee.org)

schwit1 shares a report from IEEE Spectrum: Here on Earth, our lives have been transformed by the Global Positioning System, fleets of satellites operated by the United States and other countries that are used in myriad ways to help people navigate. Down here, GPS is capable of pinpointing locations with accuracy measured in centimeters. Could it help astronauts on lunar voyages? Kar-Ming Cheung and Charles Lee of NASA's Jet Propulsion Laboratory in California did the math, and concluded that the answer is yes: Signals from existing global navigation satellites near the Earth could be used to guide astronauts in lunar orbit, 385,000 km away. The researchers presented their newest findings at the IEEE Aerospace Conference in Montana this month.

Cheung and Lee plotted the orbits of navigation satellites from the United States's Global Positioning System and two of its counterparts, Europe's Galileo and Russia's GLONASS system -- 81 satellites in all. Most of them have directional antennas transmitting toward Earth's surface, but their signals also radiate into space. Those signals, say the researchers, are strong enough to be read by spacecraft with fairly compact receivers near the moon. Cheung, Lee and their team calculated that a spacecraft in lunar orbit would be able to "see" between five and 13 satellites' signals at any given time -- enough to accurately determine its position in space to within 200 to 300 meters. In computer simulations, they were able to implement various methods for improving the accuracy substantially from there.
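The core obstacle the researchers had to weigh can be sketched with the standard free-space path-loss formula, FSPL = 20 log10(4 pi d f / c). The figures below are illustrative assumptions (the GPS L1 carrier frequency, a roughly 20,200 km range for a receiver near Earth, and the average Earth-Moon distance), not numbers taken from the paper:

```python
import math

C = 299_792_458.0   # speed of light, m/s
F_L1 = 1575.42e6    # GPS L1 carrier frequency, Hz

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for an isotropic link."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d_earth = 20.2e6  # assumed range to a GPS satellite from near Earth, m
d_moon = 385e6    # assumed average Earth-Moon distance, m

loss_earth = fspl_db(d_earth, F_L1)
loss_moon = fspl_db(d_moon, F_L1)
print(f"Path loss near Earth: {loss_earth:.1f} dB")
print(f"Path loss near Moon:  {loss_moon:.1f} dB")
print(f"Extra attenuation:    {loss_moon - loss_earth:.1f} dB")
```

Since the loss delta is 20 log10(d2/d1), a lunar receiver sees a signal roughly 25-26 dB weaker than one on Earth, which is why the study hinges on the side-lobe energy that spills past Earth and on receivers sensitive enough to work with it.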
