Transportation weekly: Nuro dreams of autonomous lattes, what is a metamaterial, Volvo takes the wheel

Welcome back to Transportation Weekly; I’m your host Kirsten Korosec, senior transportation reporter at TechCrunch. We love the reader feedback. Keep it coming.

Never heard of TechCrunch’s Transportation Weekly? Read the first edition here. As I’ve written before, consider this a soft launch. Follow me on Twitter @kirstenkorosec to ensure you see it each week. An email subscription is coming!

This week, we’re shoving as much transportation news, tidbits and insights in here as possible in hopes that it will satiate you through the end of the month. That’s right, TechCrunch’s mobility team is on vacation next week.

You can expect to learn about metamaterials, how traffic is creating genetic peril, the rise of scooter docks in a dockless world, new details on autonomous delivery startup Nuro and a look back at the first self-driving car fatality.


ONM …

There are OEMs in the automotive world. And here (wait for it) there are ONMs — original news manufacturers. (Cymbal clash!) This is where investigative reporting, enterprise pieces and analysis on transportation live.

Mark Harris is here again with an insider look into autonomous vehicle delivery bot startup Nuro. The 3-year-old company recently announced that it raised $940 million in financing from the SoftBank Vision Fund.

Harris, during his typical gumshoeing, uncovers what Nuro might do with all that capital. It’s more than just “scaling up” and “hiring talent” — the go-to declarations from startups flush with venture funding. No, Nuro’s founders have some grand ideas, from automated kitchens and autonomous latte delivery to smaller robots that can cross lawns or climb stairs to drop off packages. Nuro recently told the National Highway Traffic Safety Administration that it wants to introduce up to 5,000 upgraded vehicles, called the R2X, over the next two years.

The company’s origin story and how it’s tied to autonomous trucking startup Ike is just as notable as its “big ideas.”

Come for the autonomous lattes; stay for the story … How Nuro plans to spend SoftBank’s money


Dig In

What do metamaterials and Volvo have in common? Absolutely nothing. Except they’re both worth highlighting this week.

First up is an article by TechCrunch’s Devin Coldewey on a company called Lumotive that has backing from Bill Gates and Intellectual Ventures. The names Bill Gates and Intellectual Ventures aren’t the most interesting components of the story. Nope, it’s metamaterials.

Let us explain. Most autonomous vehicles, robots and drones use lidar (light detection and ranging) to sense their surroundings. Lidar basically works by bouncing light off the environment and measuring how and when it returns; in short, lidar helps create a 3D map of the world. (Here’s a complete primer on WTF is lidar.)
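To make the time-of-flight idea concrete, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not code from Lumotive or any lidar vendor):

```python
# Illustrative time-of-flight math only.
C = 299_792_458  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Convert a pulse's round-trip time into distance: d = c * t / 2."""
    return C * t_seconds / 2

# A return arriving ~667 nanoseconds after the pulse left corresponds to a
# target roughly 100 meters away.
print(distance_from_round_trip(667e-9))  # ~99.98 meters
```

The math is the easy part; the hard part is steering the beam quickly and precisely, which is where mechanical scanning hardware, and the metamaterials meant to replace it, come in.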

However, there are limitations to lidar sensors, which rely on mechanical platforms to move the laser emitter or mirror. That’s where metamaterials come in. In simple terms, metamaterials are specially engineered surfaces that have embedded microscopic structures and work as a single device. Metamaterials remove the mechanical piece of the problem, and allow lidar to scan when and where it wants within its field of view.

Metamaterials deliver the whole package: they’re durable and compact, solve problems with existing lidar systems, and are not prohibitively expensive.

If they’re so great, why isn’t everyone using them? For one, it’s a new and emerging technology; Lumotive’s product is just a prototype. And Intellectual Ventures (IV) holds the patents for known techniques, Coldewey recently explained to me. IV is granting Lumotive an exclusive license to the tech — something it has done with other metamaterial-based startups it has spun out.

Shifting gears to Volvo

Automakers are rolling out increasingly robust advanced driver assistance systems in production cars. These new levels of automation are creating a conflict of sorts. On one hand, features like adaptive cruise control and lane steering can make commutes less stressful and arguably safer. And yet, they can also breed overconfidence in the system and complacency among drivers. (Even Tesla CEO Elon Musk has noted that complacency is a problem among owners using the company’s advanced ADAS feature, Autopilot.) (And yes, I wrote advanced ADAS; it sounds repetitive, but it’s meant to express higher levels of automation, and it’s a term I recently encountered from two respected sources.)

Some argue that automakers shouldn’t deploy these kinds of automated features unless vehicles are equipped with driver-monitoring systems (a DMS is essentially an in-car camera and accompanying software) that can ensure drivers are paying attention. Volvo is taking that a step further.

Driver Monitoring Camera in a Volvo research vehicle

The company announced this week that it will integrate DMS into its next-gen, SPA2-based vehicles beginning in the early 2020s and, more importantly, enable its system to take action if the driver is distracted or intoxicated. The camera and other sensors will monitor the driver and intervene if a clearly intoxicated or distracted driver does not respond to warning signals and is risking an accident involving serious injury or death. Under this scenario, Volvo could limit the car’s speed, call the Volvo on Call service on behalf of the driver or have the vehicle slow down and park itself on the roadside.
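Volvo hasn’t said how that escalation will be implemented, but the behavior it describes reads like a simple state machine. Here is a purely hypothetical sketch; every name and threshold below is invented for illustration and is not from Volvo:

```python
# Hypothetical escalation logic based only on Volvo's public description;
# the states, function and thresholds are invented for illustration.
from enum import Enum, auto

class Intervention(Enum):
    NONE = auto()
    WARN = auto()
    LIMIT_SPEED = auto()
    CALL_VOLVO_ON_CALL = auto()  # summon assistance on the driver's behalf
    SLOW_AND_PARK = auto()       # bring the car to a safe stop on the roadside

def choose_intervention(driver_impaired: bool, ignored_warnings: int) -> Intervention:
    """Escalate only if the driver appears impaired and keeps ignoring warnings."""
    if not driver_impaired:
        return Intervention.NONE
    if ignored_warnings == 0:
        return Intervention.WARN
    if ignored_warnings == 1:
        return Intervention.LIMIT_SPEED
    if ignored_warnings == 2:
        return Intervention.CALL_VOLVO_ON_CALL
    return Intervention.SLOW_AND_PARK
```

The real system would be fed by camera-based gaze and behavior detection rather than a simple flag, but the privacy and liability questions below apply either way.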

Volvo’s plans raise all kinds of questions, including privacy concerns and liability. The intent is to add a layer of safety, but it also adds complexity, which could compromise Volvo’s mission. On the Autonocast, the podcast I co-host with Alex Roy and Ed Niedermeyer, we talk about Volvo’s plans in our latest episode. Check it out.


A little bird …

We hear a lot. But we’re not selfish. Let’s share.

Remember two weeks ago when we dug into Waymo’s laser bears and wondered whether we had reached “peak” LiDAR? (Last year, there were 28 VC deals in LiDAR technology valued at $650 million. The number of deals was slightly lower than in 2017, but the values jumped by nearly 34 percent.)

It doesn’t look like we have. We’re hearing about several funding deals in the works or recently closed, a revelation that shows investors still see opportunity in startups trying to bring the next generation of light ranging and detection sensors to market.

Spotted … Former Zoox CEO and co-founder Tim Kentley-Klay at the Self-Racing Car event at Thunderhill Raceway near Willows, Calif., this weekend.

Got a tip or overheard something in the world of transportation? Email me or send a direct message to @kirstenkorosec.


Deal of the week

Lyft set the terms for its highly anticipated initial public offering and announced it will kick off the roadshow for its IPO. That means the offering will likely occur in the next two weeks. Here’s the S-1 that Lyft filed in early March. This latest announcement also revealed new details, including that its ticker symbol will be “LYFT” — as one might expect — and that the price range is set at $62 to $68 per share for 30,770,000 shares of Class A common stock. Lyft could raise up to $2.1 billion at the higher end of that range, or $1.9 billion at the lower end.
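Those proceeds figures follow directly from the share count and price range; here’s the quick arithmetic (illustrative only, and it ignores any underwriter over-allotment and fees):

```python
# Back-of-the-envelope check of the reported IPO proceeds.
shares = 30_770_000              # Class A shares offered
low_price, high_price = 62, 68   # price range per share, in dollars

print(f"Low end:  ${shares * low_price / 1e9:.2f}B")   # ~ $1.91B
print(f"High end: ${shares * high_price / 1e9:.2f}B")  # ~ $2.09B
```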

The Lyft news was big — and it’s a story we’ll be following for a while. However, we wanted to highlight another one of Ingrid Lunden’s articles because it underscores a point I’ve been pushing for a while: not every important move in the world of autonomous vehicles occurs in the big three of Detroit, Pittsburgh and Silicon Valley.

This week, Yandex, the Russian search giant that has been working on self-driving car technology, inked a partnership with Hyundai to develop software and hardware for autonomous car systems. This is Yandex’s first partnership with an OEM. But it’s not Hyundai’s first collaboration with an autonomous vehicle startup. (Hyundai has a partnership with Aurora, too.)

Yandex will work with Hyundai Mobis, the car giant’s OEM parts and service division, “to create a self-driving platform that can be used by any car manufacturer or taxi fleet” that will cover both a prototype as well as parts for other car-makers.

Other deals:


Snapshot

One year ago, I parked on a small rise overlooking Mill Avenue in Tempe, Arizona. The mostly dirt knoll, dotted with some trees and a handful of structures known out here as ramadas, was hardly remarkable. Just one other car sat in the disintegrating asphalt parking lot, the result of so many sun-baked days. A group of homeless people had set up at the picnic tables under a few of the structures, their dogs lolling nearby.

And yet, it was here, or specifically on the gleaming road below, that something extraordinary had indeed happened. Just days before, Elaine Herzberg was crossing Mill Avenue south of Curry Road when an Uber self-driving vehicle struck and killed her. The vehicle was in autonomous mode at the time of the collision, with a human test driver behind the wheel.

I had been in the Phoenix area, a hub for testing autonomous vehicle technology, to moderate a panel on that very subject. But the panel had been hastily canceled by organizers worried about the optics of such a discussion. And so I picked up Starsky Robotics CEO Stefan Seltz-Axmacher, who was also in town for the now-canceled panel, and we drove to the site where Herzberg had died.

I wrote at the time that “March 18 changed everything—and nothing—in the frenzied and nascent world of autonomous vehicles.” One year later, those words are still correct. The incident dumped a bucket of ice water over the figurative heads of autonomous vehicle developers. Everyone, it seemed, had sobered up. Testing was paused; dozens of companies assessed their own safety protocols. Earnest blogs were written. Lawsuits were filed.

And yet, the cogs on the AV machine haven’t stopped turning. That’s not necessarily a bad thing. Innovation can sometimes “make the world a better place.” But it’s rarely delivered in a neat little package, no strings attached.

I’m hardly the first to reflect or write about this one-year anniversary. There are many takes, some of them hot, others not so much. And there are a few insightful ones; Autonocast co-host Niedermeyer has one entitled 10 Lessons from Uber’s Fatal Self-Driving Car Crash that’s worth reading.

Right now, I’m more interested in those lessons that haven’t been learned yet. It’s partly what prompted us to launch this newsletter, a weekly post that aims to be more than a historical record or a medium to evangelize AV technology.


Tiny but mighty micromobility

It’s been said before, but we’ll say it again: data is queen. This past week, mobility management startup Passport partnered with Lime and the cities of Charlotte, N.C., Detroit, Mich., and Omaha, Neb., to create a framework to apply parking principles, data analysis and more to the plethora of shared micromobility services.

And, in case you missed it, Bird had to let some people go late last week. We’ve learned a few more details since the news broke: the cuts affected about 40 people at the roughly 900-person company. The layoffs were part of Bird’s annual performance review process and only affected U.S.-based employees, TechCrunch learned. Those laid off are eligible for severance, including health and medical benefits. Despite the layoffs, Bird is actively looking to hire for more than 100 positions throughout the company.

Meanwhile, Ford-owned Spin partnered with mobility startup Zagster to deploy scooters in 100+ new cities and campuses by the end of this year.

Megan Rose Dickey


Notable reads

Traffic affects more than people. Take a look at the map pictured above. See the red line? That’s Interstate 15 in Southern California. To the east are inland communities and, eventually, the San Bernardino National Forest and San Jacinto Mountains.

To the west are the Santa Ana Mountains and an increasingly isolated family of 20 cougars, the Los Angeles Times reports this week. The 15 and the heavy traffic on it are putting pressure on the gene pool. In the past 15 years, at least seven cougars have crossed the 15. Just one sired 11 kittens. This lack of genetic diversity — the lowest documented for the species outside of the endangered Florida panther — could have devastating effects on mountain lions here. A study published in the journal Ecological Applications predicts extinction probabilities of 16 percent to 28 percent over the next 50 years for these lions.

In this specific case, the last natural wildlife corridor in the area — and perhaps the difference between survival and extinction —  is little Temecula Creek.

This phenomenon is happening in other areas as well, prompting communities to toy with possible solutions. One option: shuttling the lions over to the other side, a move that could cause all sorts of problems. In other places, such as an area near the Santa Monica Mountains, a wildlife overpass has been proposed.

Transit pain points

Meanwhile, digital and mobile ticketing and payment company CellPoint Mobile released a report this week that examines the rising cost of acquiring new riders, mobile technology limitations and outdated procurement processes. The titillatingly named report — Challenges Facing Municipal, Regional and National Transit Agencies in the United States — surveyed 103 ground and mass transit operators in the United States.

Some takeaways and key findings:

  • 30 percent of mass transit providers collect fares through a mobile app; only 39 percent have an app at all
  • 26 percent of transit operators say costs are their biggest challenges. Among metro mass transit agencies, that concern jumps to 40 percent
  • Nearly a quarter (23 percent) of national operators and 24 percent of large transit agencies (1,000 to 10,000 employees) say that implementing mobile technology is their single biggest challenge.
  • Customer acquisition is the second-most common challenge in U.S. transportation, cited by 23 percent of national, 33 percent of regional and 17 percent of private operators.

Other items of note:


Testing and deployments

Lyft scooter docks

Dockless scooters have been all the rage; now it seems that cities and scooter startups are considering whether free-floating micromobility might need to be reined in a skosh.

Lyft, which has scooters in 13 cities, recently experimented with parking racks. These parking racks or docks are designed specifically for scooters. The company set up these docking stations in Austin during SXSW and released a handy Guide to Good Scootiquette to encourage better and safer rider behavior.

Meanwhile, an industry around scooter management is emerging. Swiftmile, a startup that developed light electric vehicle charging systems for bike share, has new solar-powered charging platforms for scooters. TechCrunch met Swiftmile CEO Colin Roche in Austin earlier this month and learned that a number of cities are interested in deploying these systems. Swiftmile’s system not only charges the scooters, it also can provide scooter companies with diagnostics and keep the device locked in the dock if it’s malfunctioning. The docks can be programmed to lock the scooters up during certain hours — bar closing time would seem like an optimal time — to keep them from being misused. Systems like these could help scooter companies like Bird and Lime extend the life of their scooters and keep local officials happy.

Autonomous street sweepers

ENWAY and Nanyang Technological University are deploying autonomous street sweepers in the inner city of Singapore as part of a project with Singapore’s National Environment Agency. The project began this month and will run into September 2020.

Under the pilot, ENWAY’s autonomous sweeper will clean more than 12 kilometers of roads every day. The sweeper is equipped with numerous sensors, including 2D and 3D lidars, 3D cameras and GNSS. The base vehicle is a retrofitted all-electric compact road sweeper from Swiss manufacturer Bucher Municipal.

The company aims to commercialize autonomous cleaning on public ground in Singapore and abroad.

A demo of the sweeper is in the video below.

Silvercar scales up

On the other end of the transportation spectrum, Silvercar by Audi has rolled out a delivery and pickup service in downtown locations in New York and San Francisco. Silvercar customers can request that their rental be dropped off and picked up at home or a location of their choosing for an additional fee. Silvercar also announced plans to bring its premium rental experience to Boston’s Logan International Airport on April 15.

If you’ve never heard of Silvercar, you’re forgiven. It’s not exactly widespread. The company aims to remove the headache of traditional car rental. I recently tried it out in Austin during SXSW and found that it is convenient, and works pretty well, but doesn’t remove some of the annoying pinch points of car rentals. Yes, there are no lines. When I got off the plane in Austin, I received a message that my car was ready and to hail my driver who picked me up curbside, drove me to the Silvercar operation, and brought me to my Audi. I used the app to unlock the vehicle.

That’s cool. What would be even better is skipping all those steps and being able to access the vehicle right there in the airport without interacting with anyone. (Granted, not everyone wants that) This new delivery and pickup service in New York and San Francisco gets closer to that sweet spot.

Other stuff:


On our radar

The New York Auto Show is coming up and I’ll be in the city right before the show. But then it’s back to the West Coast for TC Sessions: Robotics + AI, a one-day event held April 18 at UC Berkeley. I’ll be interviewing Anthony Levandowski on stage and moderating a panel with Aurora co-founder Sterling Anderson and Uber ATG Toronto chief Raquel Urtasun to talk about building the self-driving stack and how AI is used to help vehicles understand and predict what’s happening in the world around them and make the right decisions.

Also, the PAVE Coalition is hosting its first public demonstration event April 5-7 at the Cobo Center in downtown Detroit. The public will have an opportunity to ride in a self-driving car, and interactive displays will help visitors understand the technology behind self-driving cars and their potential benefits.

Finally, one electric vehicle thing we’ve been following: Columbus, Ohio, won the U.S. Department of Transportation’s first-ever Smart City Challenge, and we’ve been tracking the city’s progress and its efforts to increase electric vehicle adoption.

One of the organizers told TechCrunch that since the beginning of 2017, the cumulative new EV registrations in the Columbus metropolitan area have increased by 121 percent. New EV registrations over this period outpaced the 82 percent expansion in the Midwest region and the 94 percent growth seen across the U.S. over the same time period.

Thanks for reading. There might be content you like or something you hate. Feel free to reach out to me at [email protected] to share those thoughts, opinions or tips. 

See you in two weeks.

How Nuro plans to spend SoftBank’s $940 million

Autonomous delivery startup Nuro is bursting with ideas since SoftBank invested nearly $1 billion in February, new filings reveal.

A recent patent application details how its R1 self-driving vehicle could carry smaller robots to cross lawns or climb stairs to drop off packages. The company has even taken the step of trademarking the name “Fido” for delivery services.

“We think there’s something neat about that name,” Nuro founder Dave Ferguson told TechCrunch. “It’s friendly, neighborly and embodies the spirit of a helper that brings you things. It wasn’t intended to extend towards literal robot dogs, although some of the legged platforms that others are building could be very interesting for this last 10-foot problem.”

Another section of Nuro’s patent shows the R1 delivering piping hot pizza and beverages, prepared en route in automated kitchens.

“We tried to build a lot of flexibility into the R1’s compartment so we could serve all the applications that people will be able to think of,” Ferguson said. “A coffee machine is actually a pretty good one. If you go to your local barista, those machines are incredibly expensive. Amortizing them over an entire neighborhood makes sense.”

As automated technologies mature, companies are focusing less on simply getting around and more on how services will connect with actual customers. Delivering goods instead of passengers also means fewer regulations to navigate.

That opportunity has prompted a number of companies, including e-commerce and logistics giant Amazon, FedEx and numerous startups, to explore autonomous delivery. At CES this year, Continental unveiled a prototype dog-shaped robot for last-yard deliveries, while Amazon has shown off a sidewalk robot called Scout that is already delivering packages to homes.

The first company to scale automated driving and delivery could start building revenue while those aiming for autonomous taxis are stuck in a maze of laws, safety concerns and consumer skepticism.

Origin story

SoftBank’s capital allows Nuro’s founders to run with their many ideas. But even in its earliest days, the company benefited from an early injection of cash.

Nuro was founded in June 2016 by Ferguson and another former Google engineer, Jiajun Zhu, after they received multi-million dollar payouts from the company’s infamous Chauffeur bonus plan. Chauffeur bonuses were intended to incentivize engineers who stuck with Google’s self-driving car project. However, the plan’s structure meant that anyone who left after the first payout in 2015 would also receive a large lump sum.

Lead engineer Anthony Levandowski appears to have earned over $125 million from the plan. He used some of the money to start Otto, a self-driving truck company that was acquired by Uber and subsequently became the focus of an epic patent and trade secrets theft lawsuit.

Court filings from that case suggest that Ferguson and Zhu received around $40 million each, although Ferguson would not confirm this. (Another Chauffeur alum, Russell Smith, got a smaller payout and quickly joined Nuro as its hardware lead).

Nuro completed its first Series A funding round in China just three months later, in a previously unreported deal that gave NetEase founder Ding Lei (aka William Ding) a seat on Nuro’s board. Ding was China’s first Internet and gaming billionaire, and was reportedly once the wealthiest person in China. However, his business empire, which spans e-commerce, education and pig farming, recently laid off large numbers of staff.

“William has been a board member and a strong supporter from the very start. But he’s not directing company decisions,” says Ferguson.

A second, U.S.-based round in June 2017 raised Nuro’s total Series A funding to $92 million.

A Nuro spinout

Nuro started pilot grocery deliveries last summer with a Kroger supermarket affiliate in the Phoenix suburb of Scottsdale. The pilot initially used modified Toyota Prius sedans and transitioned in December to its R1 vehicle. “We’re super excited about the application area,” says Ferguson. “87 percent of commerce is still local and 43 percent of all personal vehicle trips in the U.S. are for shopping and running errands.”

Meanwhile, Uber’s self-driving truck program, which had begun with the acquisition of Otto, was on its last legs. Although the program was not publicly canned until July 2018, many of its key personnel left in May. The LinkedIn profiles of engineers Jur van den Berg, Nancy Sun and Alden Woodrow show them going straight from Uber to found Ike, another self-driving truck startup, the same month.

When Ike came out of stealth mode in October, Nuro characterized its relationship with the new company as a partnership, where “we gave Ike a copy of our autonomy and infrastructure software and, in exchange, Nuro got an equity stake in Ike.”

In reality, Ike was more of a spinout. California and Delaware business records show that Ike was not incorporated until July, and shared office space with Nuro until at least the beginning of September. Ike’s founding engineers actually worked at Nuro after leaving Uber. Van den Berg can even be seen in a Nuro team photo that was shot in June and reproduced in Nuro’s Safety Report, wearing a Nuro T-shirt.

Ferguson confirmed that all three Ike founders had worked at Nuro before starting Ike.

“We are always looking for opportunities where the tech that we’ve built could help,” Ferguson said. “Trucking was a really good example, but we recognized that as a company, we couldn’t spread ourselves too thin. It made sense for both sides for the Ike co-founders to build their own independent company.”

Ike CEO Woodrow told TechCrunch recently that it’s using Nuro’s hardware designs and autonomous software, as well as data logging, maps and simulation systems. It raised $52 million in its own Series A in February.

Not to be outdone, Nuro quickly followed with an announcement of a $940 million investment by the SoftBank Vision Fund, in exchange for what Ferguson calls a “very, very significant ownership stake.” Nuro had been introduced to SoftBank after talks with Cruise fell through.

Thousands of bots

Apart from robotic dogs, what does the future hold for a newly cash-rich Nuro?

“We’re very excited about the Scottsdale pilot, but it’s basically one grocery store in one ZIP code,” says Ferguson. Shortly after our interview, Nuro announced that it would be expanding its delivery service to four more ZIP codes in Houston, Texas.

“Next year and onwards, we want to start to realize the potential of what we’re building to eventually service millions of people,” Ferguson said. “We’re aggressively expanding the number of partners we’re working with and we’re working on how we manufacture a vehicle at a large scale.”

Nuro will likely partner with an established auto OEM to build a fleet of what Ferguson hopes will become tens or hundreds of thousands of driverless vehicles. Last week, it petitioned the National Highway Traffic Safety Administration (NHTSA) for exemptions to safety standards that do not make sense for a driverless vehicle, like having to install a windshield or rearview mirrors.

Nuro told NHTSA that it wants to introduce up to 5,000 upgraded vehicles, called the R2X, over the next two years. The electric vehicles would have a top speed of 25 miles per hour and appear very similar to the R1 prototype operating in Arizona and Texas today. The R2X will have 12 high-def cameras, radars and a top-mounted lidar sensor. Nuro said it would not sell the vehicle but “own and centrally operate the entire fleet of R2Xs through partnerships with local businesses.”

“Providing services is also very expensive,” Ferguson explained. “Look at Uber or Lyft. As we scale up to the population we’re trying to serve and the number of verticals we’re looking at, it requires capital to operate until we’re profitable, which will not happen this year.”

Gates-backed Lumotive upends lidar conventions using metamaterials

Pretty much every self-driving car on the road, not to mention many a robot and drone, uses lidar to sense its surroundings. But useful as lidar is, it also involves physical compromises that limit its capabilities. Lumotive is a new company with funding from Bill Gates and Intellectual Ventures that uses metamaterials to exceed those limits, perhaps setting a new standard for the industry.

The company is just now coming out of stealth, but it’s been in the works for a long time. I actually met with them back in 2017 when the project was very hush-hush and operating under a different name at IV’s startup incubator. If the terms “metamaterials” and “Intellectual Ventures” tickle something in your brain, it’s because the company has spawned several startups that use intellectual property developed there, building on the work of materials scientist David Smith.

Metamaterials are essentially specially engineered surfaces with microscopic structures — in this case, tunable antennas — embedded in them, working as a single device.

Echodyne is another company that used metamaterials to great effect, shrinking radar arrays to pocket size by engineering a radar transceiver that’s essentially 2D and can have its beam steered electronically rather than mechanically.

The principle works for pretty much any wavelength of electromagnetic radiation — i.e. you could use X-rays instead of radio waves — but until now no one has made it work with visible light. That’s Lumotive’s advance, and the reason it works so well.

Flash, 2D and 1D lidar

Lidar basically works by bouncing light off the environment and measuring how and when it returns; this can be accomplished in several ways.

Flash lidar basically sends out a pulse that illuminates the whole scene with near-infrared light (905 nanometers, most likely) at once. This provides a quick measurement of the whole scene, but limited distance as the power of the light being emitted is limited.

2D or raster scan lidar takes an NIR laser and plays it over the scene incredibly quickly, left to right, down a bit, then does it again, again and again… scores or hundreds of times. Focusing the power into a beam gives these systems excellent range, but similar to a CRT TV with an electron beam tracing out the image, it takes rather a long time to complete the whole scene. Turnaround time is naturally of major importance in driving situations.

1D or line scan lidar strikes a balance between the two, using a vertical line of laser light that only has to go from one side to the other to complete the scene. This sacrifices some range and resolution but significantly improves responsiveness.
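A rough way to see the trade-off is to assume each laser measurement takes a fixed slice of time; a raster scanner pays for every point in the frame serially, while a line scanner only steps across columns. The numbers below are invented for illustration, not specs from any vendor:

```python
# Illustrative only: assumed timing, not real lidar specs.
POINT_TIME_S = 1e-6        # assumed time per laser measurement (1 microsecond)
COLS, ROWS = 1000, 256     # points across and down a hypothetical frame

raster_frame_s = COLS * ROWS * POINT_TIME_S  # every point measured one by one
line_frame_s = COLS * POINT_TIME_S           # a whole column measured per step

print(f"2D raster scan: {raster_frame_s * 1000:.0f} ms per frame (~{1 / raster_frame_s:.1f} Hz)")
print(f"1D line scan:   {line_frame_s * 1000:.0f} ms per frame (~{1 / line_frame_s:.0f} Hz)")
```

That responsiveness gap is what the line scanner buys by giving up some range and resolution.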

Lumotive offered the following diagram, which helps visualize the systems, although obviously “suitability” and “too short” and “too slow” are somewhat subjective:

The main problem with the latter two is that they rely on a mechanical platform to actually move the laser emitter or mirror from place to place. It works fine for the most part, but there are inherent limitations. For instance, it’s difficult to stop, slow or reverse a beam that’s being moved by a high-speed mechanism. If your 2D lidar system sweeps over something that could be worth further inspection, it has to go through the rest of its motions before coming back to it… over and over.

This is the primary advantage offered by a metamaterial system over existing ones: electronic beam steering. In Echodyne’s case the radar could quickly sweep over its whole range like normal, and upon detecting an object could immediately switch over and focus 90 percent of its cycles tracking it in higher spatial and temporal resolution. The same thing is now possible with lidar.

Imagine a deer jumping out around a blind curve. Every millisecond counts because the earlier a self-driving system knows the situation, the more options it has to accommodate it. All other things being equal, an electronically steered lidar system would detect the deer at the same time as the mechanically steered ones, or perhaps a bit sooner; upon noticing this movement, it could not just make more time for evaluating it on the next “pass,” but a microsecond later be backing up the beam and specifically targeting just the deer with the majority of its resolution.

Just for illustration. The beam isn’t some big red thing that comes out.

Targeted illumination would also improve the estimation of direction and speed, further improving the driving system’s knowledge and options — meanwhile, the beam can still dedicate a portion of its cycles to watching the road, requiring no complicated mechanical hijinks to do so. The system also has an enormous aperture, allowing high sensitivity.

In terms of specs, it depends on many things, but if the beam is just sweeping normally across its 120×25 degree field of view, the standard unit will have about a 20Hz frame rate, with a 1000×256 resolution. That’s comparable to competitors, but keep in mind that the advantage is in the ability to change that field of view and frame rate on the fly. In the example of the deer, it may maintain a 20Hz refresh for the scene at large but concentrate more beam time on a 5×5 degree area, giving it a much faster rate.
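Those figures imply a fixed measurement budget that electronic steering can redistribute. Here is a rough, illustrative calculation; the field of view, frame rate and resolution come from the specs quoted above, while the region-of-interest split at the end is an invented example:

```python
# Rough arithmetic from the quoted specs: a 120x25 degree field of view,
# 1000x256 points per frame, 20 Hz frame rate.
points_per_frame = 1000 * 256
frame_rate_hz = 20

points_per_second = points_per_frame * frame_rate_hz
print(points_per_second)              # 5,120,000 measurements per second
print(1 / points_per_second * 1e9)    # ~195 ns of beam time per measurement

# A 5x5 degree region of interest is a tiny slice of the 120x25 degree scene.
roi_fraction = (5 * 5) / (120 * 25)
roi_points = points_per_frame * roi_fraction
print(roi_fraction, roi_points)       # ~0.83% of the scene, ~2,133 points

# Hypothetically redirecting half the budget to that region would let it be
# revisited on the order of a thousand times per second.
print((points_per_second / 2) / roi_points)  # ~1,200 Hz revisit rate
```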

Meta doesn’t mean mega-expensive

Naturally one would assume that such a system would be considerably more expensive than existing ones. Pricing is still a ways out — Lumotive just wanted to show that its tech exists for now — but this is far from exotic tech.

CG render of a lidar metamaterial chip.

The team told me in an interview that their engineering process was tricky specifically because they designed it for fabrication using existing methods. It’s silicon-based, meaning it can use cheap and ubiquitous 905nm lasers rather than the rarer 1550nm, and its fabrication isn’t much more complex than making an ordinary display panel.

CTO and co-founder Gleb Akselrod explained: “Essentially it’s a reflective semiconductor chip, and on the surface we fabricate these tiny antennas to manipulate the light. It’s made using a standard semiconductor process, then we add liquid crystal, then the coating. It’s a lot like an LCD.”

An additional bonus of the metamaterial basis is that it works the same regardless of the size or shape of the chip. While an inch-wide rectangular chip is best for automotive purposes, Akselrod said, they could just as easily make one a quarter the size for robots that don’t need the wider field of view, or a larger or custom-shape one for a specialty vehicle or aircraft.

The details, as I said, are still being worked out. Lumotive has been working on this for years and decided it was time to just get the basic information out there. “We spend an inordinate amount of time explaining the technology to investors,” noted CEO and co-founder Bill Colleran. He, it should be noted, is a veteran innovator in this field, having most recently headed Impinj and, before that, worked at Broadcom, but he is perhaps best known for being CEO of Innovent when it created the first CMOS Bluetooth chip.

Right now the company is seeking investment after running on a 2017 seed round funded by Bill Gates and IV, which (as with other metamaterial-based startups it has spun out) is granting Lumotive an exclusive license to the tech. There are partnerships and other things in the offing, but the company wasn’t ready to talk about them; the product is currently in prototype but very showable form for the inevitable meetings with automotive and tech firms.

Tech regulation in Europe will only get tougher

European governments have been bringing the hammer down on tech in recent months, slapping record fines and stiff regulations on the largest imports out of Silicon Valley. Despite pleas from the world’s leading companies and Europe’s eroding trust in government, European citizens’ staunch support for regulation of new technologies points to an operating environment that is only getting tougher.

According to a roughly 25-page report recently published by a research arm of Spain’s IE University, European citizens remain skeptical of tech disruption and want its operators handled with a firm hand, even at a cost to the economy.

The survey was led by the IE’s Center for the Governance of Change — an IE-hosted research institution focused on studying “the political, economic, and societal implications of the current technological revolution and advances solutions to overcome its unwanted effects.” The “European Tech Insights 2019” report surveyed roughly 2,600 adults from various demographics across seven countries (France, Germany, Ireland, Italy, Spain, The Netherlands, and the UK) to gauge ground-level opinions on ongoing tech disruption and how government should deal with it.

The report does its fair share of fear-mongering and some of its major conclusions come across as a bit more “clickbaity” than insightful. However, the survey’s more nuanced data and line of questioning around specific forms of regulation offer detailed insight into how the regulatory backdrop and operating environment for European tech may ultimately evolve.

 

Distractions

Aurora’s Sterling Anderson, Uber ATG’s Raquel Urtasun to discuss self-driving cars and AI at TC Sessions

We’re just weeks away from our TC Sessions: Robotics + AI event at UC Berkeley on April 18.

Some of the best and brightest minds are joining us for the day-long event, including Marc Raibert, Colin Angle, Melonee Wise and Anthony Levandowski. Last week, we added to the list of marquee guests and announced a panel with roboticist Ken Goldberg, who is chief scientist at Ambidextrous Robotics and William S. Floyd Jr. Distinguished Chair in Engineering at UC Berkeley, and Michael I. Jordan, the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at UC Berkeley.

Today we’ve got another exciting panel to unveil.

This is the first time artificial intelligence has joined robotics at this TC Sessions event. And what better way to discuss the intersection of AI and robotics than a panel on autonomous-vehicle technology.

Today we’re revealing two people who are among the top thinkers focused on autonomous-vehicle development: Sterling Anderson, co-founder and chief product officer of Aurora, and Uber ATG chief scientist Raquel Urtasun.

The pair will dig into the self-driving stack and how AI is used to help vehicles understand and predict what’s happening in the world around them and make the right decisions.

Sterling Anderson

In the brain trust of self-driving car developers, Anderson is highly regarded. Prior to founding Aurora with Chris Urmson and Drew Bagnell, Anderson was director of Tesla’s Autopilot program. He also led the design, development and launch of the Tesla Model X, an all-electric SUV that launched in 2015.

Anderson has a PhD in robotics from MIT. After finishing his doctorate, he founded a startup called Gimlet Systems with another self-driving car pioneer, Karl Iagnemma. He also worked at McKinsey & Co.

Raquel Urtasun

Urtasun, chief scientist and head of Uber ATG Toronto, is also an associate professor in the Department of Computer Science at the University of Toronto.

Urtasun is a co-founder of the Vector Institute for AI and a leading expert in machine perception for self-driving cars. She earned her degree from the computer science department at École Polytechnique Fédérale de Lausanne and did postdoctoral work at MIT and UC Berkeley. Her research interests include machine learning, computer vision, robotics and remote sensing.

General Admission ($349) tickets are on sale now. Prices go up at the door, so book today!

Students, grab your discounted $45 tickets here.

Startups, make sure to check out our demo table packages, which include three tickets, for just $1,500.

How Amazon and Walmart are putting robots to work behind the scenes

Extra Crunch offers members the opportunity to tune into conference calls led and moderated by the TechCrunch writers you read every day.

This week Brian Heater, fresh off a trip to Pittsburgh to visit a handful of robotics companies, led a discussion about the current state of robotics and how startups are integrating the machines into our lives. When it comes to our home lives, we really only have the Roomba, that circular disc that moves about our floors on its own, sweeping up dust and dirt. In fact, the jobs being performed behind the scenes are the ones robots are really digging into.

Obviously we’ve got some fairly unrealistic expectations about robotics that have been served up to us by sci-fi and things like that. And when we take away the state of consumer robotics and household robotics, the best we can do at the moment is the Roomba, which is obviously quite far away from the Rosie the Robot idea that has been promised to us since the 1960s. The rub of all this, however, is that we tend to not actually see these robots in action. In automation, there’s a concept of the three Ds: dull, dirty and dangerous. Those are the jobs that these robots are basically designed to take on.

He also touches upon the fear of robots taking our jobs. What he found is that, no, you don’t have anything to fear — unless you’re an elevator operator, he says, and even that’s not across the board. But there is a political response to that by Rep. Alexandria Ocasio-Cortez, who said at SXSW last week: “We should not be haunted by the specter of being automated out of work. We should be excited by that. But the reason that we’re not excited is that we live in a society where if you don’t have a job, you’ll have to die. And at its core, that’s the problem.”

And it’s not a robotics discussion without mentioning Amazon. Heater recently visited an Amazon fulfillment center on Staten Island to give you a peek at how robots help get your packages to you on time.

For access to the full transcription and the call audio, and for the opportunity to participate in future conference calls, become a member of Extra Crunch. Learn more and try it for free. 

Discount student tickets still available for TC Sessions: Robotics + AI 2019

Here’s a shout-out to students fascinated by the world of robots and artificial intelligence. There’s a limited supply of discounted tickets left for TechCrunch Sessions: Robotics + AI, a one-day conference that takes place at UC Berkeley on April 18. Apply now to reserve your $45 student ticket, and spend a full day learning from the best minds, makers and investors in the business.

What can you expect at TC Sessions: Robotics + AI? A jam-packed day of panel discussions, workshops, Q&As, demos and world-class networking. Here’s a sample:

  • A panel discussion on human-robot interaction with Anca Dragan (UC Berkeley’s Interact Lab), Rana el Kaliouby (Affectiva) and Matt Willis (SoftBank Robotics).
  • A workshop with Eric Migicovsky (Y Combinator): How to Launch a Robotics Startup
  • A Q&A with startup founders gives you the opportunity to ask questions of some of the greatest minds in technology
  • A robot demo by Marc Raibert featuring SpotMini, Boston Dynamics’ latest creation

In a classic “but wait, there’s more” moment, we’re holding back some juicy agenda updates, so keep checking back for the latest.

For students looking for an internship, or a job, in robotics or AI, networking is key. TC Sessions: Robotics + AI draws more than 1,000 of the industry’s top technologists, founders and investors for one powerful day. You’ll have ample opportunity to talk, connect and rub elbows with these dream-makers and game-changers.

You might wonder how you’ll connect with the people you’re most interested in meeting. Good news on that front. We’re making CrunchMatch, TC’s free business match-making service, available to all attendees. The CrunchMatch platform (powered by Brella) helps you find and connect with people based on specific mutual criteria, goals and interests. It makes effective networking easier and more efficient.

Just one month to go before TechCrunch Sessions: Robotics + AI takes place on April 18 at UC Berkeley’s Zellerbach Hall. Apply for your $45 student ticket today and, once we verify your student status, we’ll release your ticket. Can’t wait to see you in Berkeley!

Trump’s views about ‘crazy’ self-driving cars are at odds with his DOT

President Donald Trump is an automated-vehicle skeptic, a point of view that lies in stark contrast with agencies within his own administration, including the U.S. Department of Transportation.

According to a recent scoop by Axios, Trump has privately said he thinks the autonomous-vehicle revolution is “crazy.” Trump’s point of view isn’t exactly surprising. His recent tweets about airplanes becoming too complex illustrate his Luddite leanings.

The interesting bit — beyond a recounting of Trump pantomiming self-driving cars veering out of control — is how his personal views compare to the DOT.

Just last week during SXSW in Austin, Secretary of Transportation Elaine Chao announced the creation of the Non-Traditional and Emerging Transportation Technology (NETT) Council, an internal organization designed to resolve jurisdictional and regulatory gaps that may impede the deployment of new technology, such as tunneling, hyperloop, autonomous vehicles and other innovations.

“New technologies increasingly straddle more than one mode of transportation, so I’ve signed an order creating a new internal Department council to better coordinate the review of innovations that have multi-modal applications,” Chao said in a prepared statement at the time.

Meanwhile, other AV-related policies and legislation are in various stages of review.

The DOT’s National Highway Traffic Safety Administration (NHTSA) announced Friday that automated-vehicle petitions from Nuro and General Motors are advancing to the Federal Register for public review and comment.

The parallel viewpoints have yet to collide. There’s no evidence that Trump’s personal views on autonomous-vehicle technology have been inserted into DOT policy. Of course, that doesn’t mean they won’t be.

AV companies are hip to this eventuality and are taking steps now to educate the masses — and Trump. Take the Partners for Automated Vehicle Education (PAVE) coalition, as one example. PAVE launched in January with a founding group that included a number of major automakers, technology companies and organizations with a stake in autonomous vehicles, including Audi, Aurora, Cruise, GM, Mobileye, Nvidia, Toyota, Waymo and Zoox, to spread the word about advanced vehicle technologies and self-driving vehicles. Their message: This tech can transform transportation and make it safer and more sustainable.

Waymo has also teamed up with AAA on a public education campaign to spread the word about autonomous-vehicle technology and how it could impact safety and help people get around. The partnership, announced recently, is with AAA Northern California, Nevada & Utah (AAA NCNU), a regional organization that oversees operations in seven markets, including well-known hubs of autonomous vehicle development such as Arizona and California.

These are the robots that help you get your Amazon packages on time

Months before the hard-fought battle for its second global headquarters in Queens, Amazon planted a massive, 855,000-square-foot flag in Staten Island. The $100 million JFK8 fulfillment center opened last fall, after an “on the spot” hiring spree, aimed at employing an eventual 2,250 people.

The new factory smell still permeated the air when we visited the space in February. Things were shiny and new, but still humming. It’s a 24-hour process designed to meet the standards of rapid package delivery the company has — for better and worse — set for the entire e-commerce world.

JFK8 stands as a kind of church to the world of early 21st century capitalism, and wherever you happen to land on the politics of Amazon, it’s an impressive sight to behold, as packages zip by on complex highway systems of conveyor belts, en route to next-day deliveries.

The space also offers a compelling glimpse into the factories of the future, where humans and robots work hand in hand, so to speak. The company currently has around 100,000 robotic systems deployed across more than 25 fulfillment centers, a number that it says has helped the company store 40 percent more inventory in its fulfillment centers.

Those on display at the Staten Island facility run the gamut, from the ship sorters that whiz across conveyor belts booting packages into their proper chutes to giant palletizer robotic arms, developed in conjunction with Japanese automation giant, Fanuc.

All are working to the same end, with Amazon’s own in-house robots forming the system’s core. Up a few flights, robots zoom around the floor in a tightly controlled space, like giant-sized Roombas in closely choreographed movements.

The mobile robots were the heart of the company’s $775 million 2012 acquisition of Kiva, a Massachusetts-based startup it would rename Amazon Robotics three years later. The Kiva name still appears in some of the legacy signage, including labels that appear around the perimeter of the robot’s enclosed space, but Amazon was quick to incorporate what was at the time its second-largest acquisition.

“I think by the time Amazon looked at us, they were quite interested in the technology that we had developed and acquired us because they were interested in taking them into the fulfillment centers as we do today,” Scott Dresser, Amazon Robotics’ director of Software, Systems and Solutions tells TechCrunch. “It is the core storage system in a fulfillment center. It houses all of the inventory.”

The army of robots is walled in by fences that bring to mind nothing more than indoor batting cages. Around the edges, Amazon employees staff pick-and-stow stations, working alongside the robots to determine how best to store inventory on the shelving pods and how items will be shipped together.

Dresser quickly rejects the premise that robots will outright replace their human co-workers in the short term, noting that each has a separate, but complementary skill set.

“Associates and people in our builds are really good at finding where to put products in storage shelving,” he says. “Systems are not good at doing that. We’re able to leverage the things that people are good for and leverage the things that systems are good for. We see that pattern playing out in a lot of different applications. We certainly see that being an augmentation to what people are doing and helping them be as efficient as possible with what they’re doing.”

Floor safety is an increasing concern. A recent story about an exploding bear mace canister that sent 24 employees to the hospital at a Robbinsville Township, New Jersey, warehouse has once again made the issue top of mind. While the incident was initially reported to involve an Amazon robot, the company denies that any were involved.

The fencing around the robots is designed specifically to keep associates out of harm’s way — an increasingly important concern as large machinery becomes an everyday part of life at these sorts of factories. Human employees are generally not allowed in this enclosed space for reasons of safety and efficiency, but an imperfect system requires the occasional interaction. Things fall out of pods; robots break down.

It’s for this reason the company introduced the Robotic Safety Vest. The bright orange mesh vest includes a belt containing a variety of sensors that add a few pounds to the employee’s getup.

“It complements the technology that’s in the robotic drive system,” says Dresser. “That vest is detectable by the robot. There’s a system that knows the associate is nearby based on that signal.”

An employee demonstrates the vest, flicking a button, opening the fence and walking onto the floor. Robots at a distance slow down, while ones in the immediate vicinity stop altogether. It’s another added level of safety. If the bear mace incident showed anything, it’s how quickly news of robotics-related accidents spreads, whether or not an actual robot was ultimately involved.
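Amazon hasn’t published how that proximity response works under the hood, but the behavior described, distant robots slowing and nearby robots stopping, maps to simple distance thresholds. Here is a purely hypothetical sketch, with invented names and values:

```python
# Hypothetical zone-based speed limiting around a detected safety vest.
# The distances and speeds below are invented, not Amazon's values.
STOP_RADIUS_M = 3.0    # drive units this close to the vest halt entirely
SLOW_RADIUS_M = 10.0   # drive units within this range reduce speed
NORMAL_SPEED = 1.5     # nominal drive speed, m/s
SLOW_SPEED = 0.3       # reduced speed near a person, m/s

def allowed_speed(distance_to_vest_m: float) -> float:
    """Return the maximum speed a drive unit may use, given vest proximity."""
    if distance_to_vest_m <= STOP_RADIUS_M:
        return 0.0
    if distance_to_vest_m <= SLOW_RADIUS_M:
        return SLOW_SPEED
    return NORMAL_SPEED
```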

As more people buy more products online, these armies of robots will no doubt play an increasingly central role in meeting that demand.

Last day to save $100 on TC Sessions: Robotics + AI 2019

Like all good things must, early-bird pricing for tickets to TechCrunch Sessions: Robotics + AI comes to an end today. This is your last chance to save $100 on our day-long gathering of the greatest minds, makers and investors in the robotics and AI communities. Procrastination has a price. Buy your $249 ticket today or pay $349 tomorrow. That’s basic math right there.

TC Sessions: Robotics + AI takes place on April 18 at UC Berkeley’s Zellerbach Hall, and we’re stoked about the agenda of incredible speakers, panel discussions, demos and workshops we have planned. We think you’ll love it, too.

Previously on TechCrunch, we shared just a few of the heavy-hitters you’ll hear from, including Marc Raibert, Melonee Wise, Ken Goldberg, Anca Dragan, Anthony Levandowski and Laura Major. But wait, there’s more!

We’re beyond thrilled to add Colin Angle, iRobot co-founder and CEO, to our list of industry headliners gracing our stage. If anyone knows mainstream home robots, it’s this guy. He’s one of the creators of Roomba, the first successful consumer robot and the best-selling vacuum in the U.S. Angle will discuss Terra, iRobot’s new robotic lawnmower, and the 10 years of R&D involved in bringing it to market.

What does it take for someone to invest in a robotics startup? That’s a red-hot topic, and you’ll get to hear Peter Barrett (Playground Global), Hidetaka Aoki (Global Brain), Helen Liang (FoundersX Ventures) and Andy Wheeler (GV) discuss what they’ve learned in a panel called, “Investing in Robotics and AI: Lessons from the Industry’s VCs.”

We expect more than 1,000 members of the robotics and AI community to attend, and that makes it a networking event you won’t want to miss. These are the influencers, the people who can help you make your robotic dreams come to life. Or perhaps you’ll invest in the next robot unicorn. It’s a day full of opportunity and potential. Be sure to use CrunchMatch (powered by Brella), our free business match-making service. It’s curated and automated, and it makes connecting with the right people painless and efficient.

TechCrunch Sessions: Robotics + AI takes place at UC Berkeley’s Zellerbach Hall on April 18, 2019. If you’re part of the robotics/AI startup ecosystem, you do not want to miss this event. And you certainly don’t want to pay more than necessary. Today’s the last day to save $100 on admission. Buy your early-bird ticket now, and we’ll see you in April.