Anti-Google Glass attack in San Francisco highlights tension over wearables

San Francisco social media consultant Sarah Slocum says she was attacked this weekend for using Google Glass in a local bar. On her Facebook profile, Slocum said she was “verbally and physically assaulted” by bar patrons who jeered at her for using Glass and, she alleges, snatched the device off of her face. Slocum also said that she was robbed after leaving her purse in the bar as she pursued her alleged attackers.

She has posted a video of part of the incident to YouTube, though it doesn’t show all the details of the alleged attack.

Another video, shared with San Francisco television station KRON4, shows one of the alleged attackers telling Slocum that people like her are “destroying” the city of San Francisco — a comment that ties this incident into a growing backlash against tech companies such as Google, Twitter and others in the city.

San Francisco Police Department Officer Albie Esparza confirmed that a woman reported being involved in an altercation at a bar at 1:45 a.m. Saturday, after other patrons believed she was recording them with Google Glass without their consent. He said that no suspects are in custody.

Wearable technology is moving people into a brave new world, with new rules about how to use ever more ubiquitous gadgets. The path to acceptance, however, is not a smooth one. Google’s head-mounted Glass computer, for example, faces a long road to convince the general public that smart glasses will eventually be as accepted as camera phones and earbuds. The device has many of the same functions as a smartphone, but lets users read e-mail, see notifications, get directions and take calls via a screen in front of their eyes rather than on a handheld display.

The San Francisco incident follows a handful of high-profile reports of negative, sometimes violent reactions that Google Glass users have faced as the new technology causes rifts with restaurant owners, law enforcement officials and movie theater owners.

From a legal standpoint, those fights have largely come out in Google’s favor. Last month, a San Diego court dismissed a case against a woman who had been pulled over for using Glass while driving. But more cases could be on their way, as a handful of states have discussed legislation to ban Glass behind the wheel amid worries that a small screen hovering in users’ peripheral vision will pose a danger on the roads.

Education has been the main tool in Google’s arsenal as it faces questions about the implications Glass has for privacy, distracted driving, piracy and a general unease among some folks who see a person walking around with a head-mounted computer.

The firm, which has already taken the tech for demonstrations on Capitol Hill, has also been touring the country and showing Glass to mayors, state legislators and the general public in order to let people know more about the benefits it sees in the technology.

Google has also placed considerable weight on its beta testers — which it calls Explorers — to act as ambassadors for the technology. Last week, Google released a list of dos and don’ts for Explorers to give them basic etiquette and safety tips on using the device and to remind them to be respectful when facing Glass doubters.

“New technology raises new concerns which is why educating Explorers and those around them is a top priority for the Glass team,” Google said in a statement. “The point of the Explorer program is to get Glass in the hands of people from all walks of life and see how they use it out in the world. Our Explorers provide us with continuous feedback and on the whole, they act as positive ambassadors for Glass on a daily basis. While Glass is currently in the hands of this small group, we find that when people try it for themselves they realize Glass connects people more with the world around them than distracts them from it.”

Joseph White, a Rockville-based consultant and Explorer, said that he’s only faced one negative reaction to his use of Glass since he started using it in December — and it was not nearly as confrontational as Slocum’s alleged encounter.

“The closest experience that I have had to [Slocum’s] is someone coming up to me at an Organizing for America event . . . and asking me ‘What are you recording right now?’ ” he said. White, who is in his 60s, said that while he’s had some conversations with people about the privacy implications of Glass, those same people have also asked to try on the device and have their picture taken with him.

“I have never been asked to take them off,” he said. “And I have been in restaurants, some bars — just out in public at different functions.”

How Wetware Can Help Hardware Makers Beat Moore’s Law, Save Energy

Humans may not use all of the brain’s computing capacity, but they do use most of the cranium for computing, according to biophysicist Bruno Michel at IBM’s Zurich research laboratory. Co-located cortices and capillaries keep our neurons powered and cooled with minimal fuss. Yet today’s computers dedicate around 60 percent of their volume to getting electricity in and heat out, compared to perhaps 1 percent in a human brain, Michel estimates. Last week in Zurich, he told journalists about IBM’s long-term plan to help computers achieve human-like space and energy efficiency. The tool: a kind of electronic blood.

Liquid cooling has been on the market since Cray tried it on for size in 1985. The technique hit consumer computers in the 1990s, though it remains a custom, quirky, or high-end option. During the era of Moore’s Law, the easiest way to bring down the price of computing has been to fit more transistors on a chip. But a lesser-known limit lurks in the background: Rent’s Rule, an empirical power-law relationship between the number of logic gates in a block of circuitry and the number of connections needed to carry signals in and out of it. In other words, packing in more logic requires packing in a lot more communication. More communication means a need for more energy in and more heat out.
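
To get a rough feel for what that power-law scaling implies, the short Python sketch below plugs hypothetical values into Rent’s Rule; the terminals-per-gate factor and the Rent exponent are made up for illustration, not taken from any real design:

```python
# Rent's Rule (empirical): T ~= t * g**p, where g is the number of logic gates
# in a block, t is the average terminals per gate, and p is the Rent exponent.
# The values of t and p here are hypothetical, chosen only for illustration.

def rent_terminals(gates, t=2.5, p=0.6):
    """Estimate the external connections a block of `gates` logic gates needs."""
    return t * gates ** p

for gates in (1_000, 1_000_000, 1_000_000_000):
    print(f"{gates:>13,} gates -> ~{rent_terminals(gates):,.0f} connections")
```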

Michel describes that as the bottleneck for supercomputing: “In the future we are worried about the energy bill and not about the purchase cost of a computer. So we want to squeeze as much as we can out of a watt or joule and not out of a silicon chip.”

Today’s computer chips get their electric current from tiny copper wires and dissipate heat into surrounding air that bulky and energy-hungry air conditioners must whisk away. The need for continuous air flow means chip designers have stuck to a more or less flat design. Water’s much higher heat-carrying capacity makes it possible to shunt heat away from chip components that are packed much, much closer together. The upshot: chips could be stacked on top of one another in a three-dimensional arrangement much like the brain’s. That would shorten communication distances, too, further reducing the amount of heat generated.
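
A back-of-the-envelope comparison, using standard textbook properties for the two fluids rather than any figures from IBM, shows why water is so much better suited to the job:

```python
# Volumetric heat capacity = density * specific heat capacity.
# Approximate room-temperature values; for illustration only.
water = 1000.0 * 4186.0   # kg/m^3 * J/(kg*K)  ->  ~4.2e6 J/(m^3*K)
air   = 1.2    * 1005.0   #                        ~1.2e3 J/(m^3*K)

ratio = water / air
print(f"water carries roughly {ratio:,.0f}x more heat per cubic metre per kelvin than air")
# The same temperature rise in a litre of water removes about as much heat as in
# several cubic metres of air, which is why water-cooled parts can sit so close together.
```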

Cool. But not that cool. IBM’s SuperMUC cools its chips with water at 45 degrees Celsius. You wouldn’t want to spill any on your hand, but it’s still cool enough to bring down a computer chip’s temperature. Michel says that the hot water from a water-cooled 10-megawatt data center could heat 700 homes via district heating, which is common in Europe. IBM set up a first working example of a heat-recycling liquid-cooled supercomputer, called Aquasar, at the Swiss Federal Institute of Technology Zurich (ETH). If it works on a larger scale, it would save energy in two places: less energy would be needed to cool the computer in the first place, and some of the energy spent could be reused to heat homes.

Getting electricity in via a liquid is more complicated. Patrick Ruch, a colleague of Michel’s at IBM in Zurich, and others are examining vanadium as a potential electrolyte for a redox flow battery. The concept is more often used for storing energy from variable power sources such as solar panels and wind turbines, thanks to its low cost and scalability, but it has a low energy density compared with other battery chemistries. Ruch, who began the project this year, has his work cut out for him. “The goal is to be able to provide a substantial amount of the power that the microprocessor needs by this electrochemical solution,” he says.
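
The energy-density gap Ruch is up against is easy to put in rough numbers. The figures below are approximate literature values chosen only to show the scale of the challenge, not data from IBM’s project:

```python
# Rough volumetric energy densities in Wh per litre; illustrative literature values,
# not figures from IBM's project.
vanadium_flow_wh_per_l = 25    # vanadium redox flow electrolytes: ~15-35 Wh/L
lithium_ion_wh_per_l   = 400   # lithium-ion cells: ~250-700 Wh/L

gap = lithium_ion_wh_per_l / vanadium_flow_wh_per_l
print(f"lithium-ion stores roughly {gap:.0f}x more energy per litre than a vanadium flow electrolyte")
# The flow battery's appeal here is not density but that the liquid can be pumped
# through a 3-D chip stack, delivering power and carrying away heat in one stream.
```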

Rudimentary Computer Built From Nanowires

A rudimentary nanocomputer built from an array of tiny wires could point the way to skirting the limits of Moore’s Law and lead to new kinds of biosensors or microcontrollers for robots. So say scientists and engineers from Harvard and Mitre, a nonprofit R&D company based in Bedford, Mass., and McLean, Va. The team describes its nanowire nanocomputer this week in Proceedings of the National Academy of Sciences.

The device is based on 15-nanometer-wide wires with a germanium core and a silicon outer layer. The wires are laid out in parallel on a surface of silicon coated with silicon dioxide, with layers of dielectric material on top and chromium and gold contacts laid perpendicularly across the nanowires. Each point where the contacts cross the nanowires is a programmable transistor node, which can be switched between an active and an inactive state by applying voltage so that the structure forms a “crossbar array.”
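
A toy software model of such a crossbar, written only to make the idea concrete and not derived from the Harvard/Mitre design, treats every wire-contact crossing as an addressable node that can be programmed on or off:

```python
# Toy model of a crossbar array: rows are nanowires, columns are metal contacts.
# Each crossing is a programmable node that can be switched active or inactive.

class Crossbar:
    def __init__(self, nanowires, contacts):
        self.active = [[False] * contacts for _ in range(nanowires)]

    def program(self, wire, contact, active=True):
        """Switch one crossing on or off, as the programming voltage does in hardware."""
        self.active[wire][contact] = active

    def wire_output(self, wire, driven):
        """Crude node model: the wire responds only if every driven contact
        crosses it at an active node (a logical AND along the row)."""
        return all(self.active[wire][c] for c, d in enumerate(driven) if d)

xbar = Crossbar(nanowires=4, contacts=5)
xbar.program(0, 1)
xbar.program(0, 3)
print(xbar.wire_output(0, [False, True, False, True, False]))   # True
print(xbar.wire_output(0, [False, True, True,  False, False]))  # False: contact 2 is inactive
```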

The researchers arranged a total of 180 such transistor nodes in six crossbar arrays and divided them into three separate tiles to make a finite-state machine capable of performing arithmetical operations. In one configuration, the first tile does the math, while the second tile holds one bit in memory and the third tile holds a second bit. Chemist Charles Lieber, head of the Harvard team, says more tiles can be added in a Lego-like fashion. The version the team built was a two-bit adder. Four tiles would make a four-bit adder, and so on. A four-by-four array of tiles “could function as a pretty sophisticated microprocessor,” he says.
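
The finite-state-machine style of computation the tiles perform can be sketched functionally in a few lines. The code below is a plain bit-serial adder in which the carry plays the role of the stored state; it illustrates the behavior, not the nanowire hardware itself:

```python
# Functional sketch of a bit-serial adder as a finite-state machine.
# The carry is the stored state (what the memory tiles hold); one pair of input
# bits is processed per step (what the arithmetic tile does), LSB first.

def serial_add(a_bits, b_bits):
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)            # sum bit for this step
        carry = (a & b) | (carry & (a ^ b))  # next state
    out.append(carry)                        # final carry out
    return out  # least significant bit first

# Two-bit example: 3 + 2 = 5, i.e. [1, 1] + [0, 1] -> [1, 0, 1] (LSB first)
print(serial_add([1, 1], [0, 1]))
```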

Researchers concerned with the eventual end of Moore’s Law have been experimenting with nanowires as an alternative to CMOS technology, at least for some applications. Nanowires can easily be grown from a solution on a substrate, using a variety of techniques. However, they’re so small that placing them exactly where they need to be is challenging, and if they accidentally cross each other, they cause a short circuit.

Lieber and his team overcame those problems with a technique they call “deterministic nanocombing.” After growing nanowires on one substrate, they coat a second substrate, which will form the base of the array, with a thin layer of resist, then use electron-beam lithography to open narrow slots in the resist where they want the wires to go. After a chemical treatment that makes the wires more likely to stick to the exposed silicon oxide surface, the researchers scrape the growth substrate across the slotted surface of the second substrate. The wires settle into the exposed areas, and once they are in place, the team washes away the remaining resist. Lieber says the approach has yielded an order-of-magnitude improvement in wire alignment and an order-of-magnitude reduction in short circuits, compared with previous techniques.

Instead of having to figure out where the wires ended up in order to connect to them, the researchers count on the statistical likelihood that enough of them are in the right place. “We never have to find the wires afterward. We just assume they’re going to be at these positions,” Lieber says.
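
That statistical bet is straightforward to quantify. With made-up numbers for the slot count and per-slot capture probability (the paper’s actual yields are not reproduced here), the chance that enough wires land where they are needed is a binomial tail:

```python
from math import comb

def prob_at_least(k, n, p):
    """Probability that at least k of n independent slots capture a usable wire."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: 30 slots patterned, 20 good wires needed, and each slot
# captures a well-aligned wire with probability 0.9.
print(f"chance of a working array: {prob_at_least(20, 30, 0.9):.4f}")
```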

The circuits built through this process are small and have low power requirements, which opens up new possibilities for how they can be used. They won’t require bulky power supplies and cooling equipment, which will allow engineers to shrink sensors and microcontrollers. “We’re really going for ‘invisible’ computing and control,” says James Ellenbogen, chief scientist for nanotechnology at Mitre. “You don’t even know that it’s there or that it’s computing, because the energy requirement is so small.”

The Mitre researchers think the technology could be used in biosensors for monitoring a person’s health; because these nanocomputers use little power, they could take far more frequent and detailed measurements than existing devices, says Shamik Das, head of the company’s nanosystems group. Such nanocomputers could also be used for environmental sensors or as controllers for drones and other robotic systems, adds James Klemic, director of Mitre’s nanotechnology laboratory.

André DeHon, an electrical engineer at the University of Pennsylvania, calls this research a step forward for nanocomputing. He’d still like to see the wires packed closer together to increase the transistor density. And he and others are developing new fabrication techniques for contacts, which he says will be needed to make the manufacturing truly bottom-up. “One question is whether or not this will compete with or exceed silicon lithography, or provide the techniques to continue pushing silicon scaling further,” he says.

Lieber says the nanowire device will likely surpass CMOS chips only in very specific applications that benefit from low power consumption and operate at megahertz rather than gigahertz clock rates. But Ellenbogen hopes the research will provide hints for where CMOS technology can be improved. “We think some of the approaches are more generally applicable,” he says.

Carbon Nanotubes Could Solve Overheating Problem for Next-Generation Computer Chips

Computer chips used in next-generation smartphones and supercomputers can’t get much faster without overheating. That’s why engineers hope carbon nanotubes can offer a cooling solution that would let processing speeds keep climbing.

The overheating problem has become steadily worse as engineers cram more power-hungry transistors into the same microchip space, because much of the electricity that powers the transistors is wasted as heat. Carbon nanotubes conduct heat extremely well and could carry that excess heat away from microchips, but only if engineers can figure out how to improve the heat transfer at the point of contact between the nanotubes and the chips.

“The thermal conductivity of carbon nanotubes exceeds that of diamond or any other natural material but because carbon nanotubes are so chemically stable, their chemical interactions with most other materials are relatively weak, which makes for high thermal interface resistance,” said Frank Ogletree, a physicist with the Lawrence Berkeley National Laboratory’s Materials Sciences Division, in a news release.

Ogletree and his colleagues worked with two former Intel researchers to figure out how to make a six-fold improvement in the heat flow between metal and carbon nanotubes. Their work is detailed in the 22 January issue of the journal Nature Communications.

The new study’s success rests upon using organic molecules as a bridge between the carbon nanotubes and metal—a method that greatly reduces the interface resistance that would otherwise prevent heat from flowing more efficiently between the materials. The organic molecules, including aminopropyl-trialkoxy-silane (APS) and cysteamine, create strong covalent bonds between the carbon nanotubes and the metal used in microchips. (The same bonding technique pioneered by the researchers can also work with graphene—a promising material for complementing silicon transistors.)

“With carbon nanotubes, thermal interface resistance adds something like 40 [micrometers] of distance on each side of the actual carbon nanotube layer,” said Sumanjeet Kaur, lead author of the Nature Communications paper and an industrial postdoctoral scientist at Porifera. “With our technique, we’re able to decrease the interface resistance so that the extra distance is around seven microns at each interface.”
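
Kaur’s framing of interface resistance as “extra distance” makes the improvement easy to sanity-check. Using only the two figures quoted above, the ratio of the old to the new equivalent length comes out close to the reported six-fold gain in heat flow:

```python
# Interface resistance expressed as an equivalent extra length of nanotube material.
old_equiv_um = 40.0   # micrometres per interface before the organic bonding layer
new_equiv_um = 7.0    # micrometres per interface with APS/cysteamine bonds

# Heat-flow resistance of the interface scales with this equivalent length, so the
# per-interface improvement is just the ratio of the two figures.
print(f"~{old_equiv_um / new_equiv_um:.1f}x lower interface resistance")  # ~5.7x
```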

This success will help pave the way for carbon nanotubes’ use in this application, but there is still a ways to go before we see them in commercially available gadgets. One problem is that most nanotubes, grown in vertically aligned arrays on silicon wafers, don’t make contact with the metal surfaces. The Berkeley team hopes to improve the density of the contacts between the nanotubes and metal over time.

D-Wave’s Quantum Computing Claim Disputed Again

The strongest scientific evidence for D-Wave’s claim to have built commercial quantum computers just got weaker. A new paper finds that classical computing can explain the performance patterns of D-Wave’s machines just as well as quantum computing can—a result that undermines crucial support for D-Wave’s claim from a previous study.

Quantum computing offers the possibility of doing many calculations in parallel by using quantum bits that can exist as both a 1 and 0 at the same time, as opposed to classical computing bits that exist as only a 1 or 0. Such mind-bending quantum physics could allow quantum computers to outperform classical computers in tackling challenging problems that would take today’s hardware practically forever to solve. But D-Wave’s quantum computing claims remain controversial in the scientific community even as the Canadian company has attracted high-profile clients such as Lockheed Martin and Google. (See IEEE Spectrum’s overview of the evidence for and against D-Wave’s machines from the December 2013 issue.)

A growing number of independent researchers have examined the Canadian company’s claim to have built quantum computers at scales well beyond anything seen in academic labs. Rather than build machines based on the traditional logic-gate model of computing, D-Wave built 512-qubit machines that supposedly perform quantum annealing—a method for tackling optimization problems. The best optimization solutions represent the lowest “valley” of a problem landscape resembling peaks and valleys.
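
For readers unfamiliar with annealing-style optimization, the classical cousin of the idea is easy to sketch. The snippet below runs ordinary simulated annealing on a toy Ising-style problem; it only illustrates the notion of settling into the lowest valley of an energy landscape and is not a model of D-Wave’s hardware or of quantum annealing:

```python
import math
import random

# Toy Ising-style problem: random +/-1 couplings between 20 spins; lower energy is better.
random.seed(0)
N = 20
J = {(i, j): random.choice([-1, 1]) for i in range(N) for j in range(i + 1, N)}

def energy(spins):
    return sum(J[i, j] * spins[i] * spins[j] for (i, j) in J)

spins = [random.choice([-1, 1]) for _ in range(N)]
T = 5.0
for step in range(20000):
    i = random.randrange(N)
    # Energy change from flipping spin i.
    delta = -2 * spins[i] * sum(J[min(i, k), max(i, k)] * spins[k] for k in range(N) if k != i)
    # Always accept downhill moves; accept uphill moves with a probability that shrinks as T falls.
    if delta <= 0 or random.random() < math.exp(-delta / T):
        spins[i] = -spins[i]
    T *= 0.9995  # slow cooling lets the system settle into a deep valley rather than a nearby one

print("final energy:", energy(spins))
```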

One key to making quantum computing practical involves harnessing the quantum physics phenomenon of entanglement—separate qubits sharing the same quantum state—so that quantum computers can scale up effectively to tackle more complex challenges. A previous paper used simulations to suggest D-Wave’s machine performance patterns did show evidence of large-scale quantum entanglement across more than one hundred qubits. But the new paper published on the arXiv preprint server came up with a classical computing model that could explain the same patterns formerly attributed to quantum annealing, according to the Physics arXiv Blog.

The new paper drew a critical response from Geordie Rose, founder and chief technology officer of D-Wave, who suggested that the new paper’s classical model did not fit other experiments involving the performance of D-Wave machines. But Scott Aaronson, a theoretical computer scientist at MIT and a D-Wave critic, said the classical model’s applicability in this particular case still undermines the biggest piece of evidence for large-scale entanglement in D-Wave’s machines. He spoke recently with Matthias Troyer, a computational physicist at ETH Zurich and a lead author of the previous paper (Boixo et al.), about how the new research (Shin et al.) had changed Troyer’s mind:

"Most tellingly, Matthias Troyer—one of the lead authors of the Boixo et al. paper, and someone I trust as possibly the most evenhanded authority in this entire business—tells me that the Shin et al. paper caused him to change his views: he didn’t know if the correlation patterns could be reproduced in any reasonable model without entanglement, and now he knows that it’s possible."

When IEEE Spectrum spoke with Aaronson last year, he was willing to admit that D-Wave’s machines may exhibit quantum entanglement, despite his overall skepticism toward D-Wave’s endeavors. Even now, he noted that evidence still exists for small-scale entanglement in D-Wave’s machines. But he added that the evidence from the past year shows that D-Wave still has not demonstrated a speedup over classical computing. That is, D-Wave hasn’t proved its machines can outperform classical computers on increasingly challenging problems.

The lack of evidence for such a speedup has not stopped Google from enthusiastically testing out its recently purchased D-Wave machine, but it does mean D-Wave still has a ways to go before it can show that quantum computing has truly arrived. In a previous IEEE Spectrum interview, Troyer pointed out how independent researchers have only gained access to D-Wave machines in the past few years—a change that has allowed them to begin testing the machines’ scientific merits. The results of such scientific investigations so far have not supported most of D-Wave’s claims.