Trump’s views about ‘crazy’ self-driving cars are at odds with his DOT

President Donald Trump is an automated-vehicle skeptic, a point of view that lies in stark contrast with agencies within his own administration, including the U.S. Department of Transportation.

According to a recent scoop by Axios, Trump has privately said he thinks the autonomous-vehicle revolution is “crazy.” Trump’s point of view isn’t exactly surprising. His recent tweets about airplanes becoming too complex illustrate his Luddite leanings.

The interesting bit — beyond a recounting of Trump pantomiming self-driving cars veering out of control — is how his personal views compare to the DOT.

Just last week during SXSW in Austin, Secretary of Transportation Elaine Chao announced the creation of the Non-Traditional and Emerging Transportation Technology (NETT) Council, an internal organization designed to resolve jurisdictional and regulatory gaps that may impede the deployment of new technology, such as tunneling, hyperloop, autonomous vehicles and other innovations.

“New technologies increasingly straddle more than one mode of transportation, so I’ve signed an order creating a new internal Department council to better coordinate the review of innovation that have multi-modal applications,” Chao said in a prepared statement at the time.

Meanwhile, other AV-related policies and legislation are in various stages of review.

The DOT’s National Highway Traffic Safety Administration (NHTSA) announced Friday that automated-vehicle petitions from Nuro and General Motors are advancing to the Federal Register for public review and comment.

The parallel viewpoints have yet to collide. There’s no evidence that Trump’s personal views on autonomous-vehicle technology have been inserted into DOT policy. Of course, that doesn’t mean they won’t be.

AV companies are hip to this eventuality and are taking steps now to educate the masses — and Trump. Take the Partners for Automated Vehicle Education (PAVE) coalition, as one example. PAVE launched in January to spread the word about advanced vehicle technologies and self-driving vehicles. Its founding group included a number of major automakers, technology companies and organizations with a stake in autonomous vehicles, among them Audi, Aurora, Cruise, GM, Mobileye, Nvidia, Toyota, Waymo and Zoox. Their message: This tech can transform transportation and make it safer and more sustainable.

Waymo has also teamed up with AAA on a public education campaign to spread the word about autonomous-vehicle technology and how it could impact safety and help people get around. The partnership, announced recently, is with AAA Northern California, Nevada & Utah (AAA NCNU), a regional organization that oversees operations in seven markets, including well-known hubs of autonomous vehicle development such as Arizona and California.

These are the robots that help you get your Amazon packages on time

Months before the hard-fought battle for its second global headquarters in Queens, Amazon planted a massive, 855,000-square-foot flag in Staten Island. The $100 million JFK8 fulfillment center opened last fall, after an “on the spot” hiring spree, aimed at employing an eventual 2,250 people.

The new factory smell still permeated the air when we visited the space in February. Things were shiny and new, but still humming. It’s a 24-hour process designed to meet the standards of rapid package delivery the company has — for better and worse — set for the entire e-commerce world.

JFK8 stands as a kind of church to the world of early 21st century capitalism, and wherever you happen to land on the politics of Amazon, it’s an impressive sight to behold, as packages zip by on complex highway systems of conveyor belts, en route to next-day deliveries.

The space also offers a compelling glimpse into the factories of the future, where humans and robots work hand in hand, so to speak. The company currently has around 100,000 robotic systems deployed across more than 25 fulfillment centers, a fleet it says has helped it store an additional 40 percent of inventory in those facilities.

Those on display at the Staten Island facility run the gamut, from the ship sorters that whiz across conveyor belts booting packages into their proper chutes to giant palletizer robotic arms, developed in conjunction with Japanese automation giant Fanuc.

All are working to the same end, with Amazon’s own in-house robots forming the system’s core. Up a few flights, robots zoom around the floor in a tightly controlled space, like giant-sized Roombas in closely choreographed movements.

The mobile robots were the heart of the company’s $775 million 2012 acquisition of Kiva, a Massachusetts-based startup it would rename Amazon Robotics three years later. The Kiva name still appears in some of the legacy signage, including labels that appear around the perimeter of the robot’s enclosed space, but Amazon was quick to incorporate what was at the time its second-largest acquisition.

“I think by the time Amazon looked at us, they were quite interested in the technology that we had developed and acquired us because they were interested in taking them into the fulfillment centers as we do today,” Scott Dresser, Amazon Robotics’ director of Software, Systems and Solutions, tells TechCrunch. “It is the core storage system in a fulfillment center. It houses all of the inventory.”

The army of robots is walled in by fences that bring to mind nothing more than indoor batting cages. Around the edges, Amazon employees staff pick-and-stow stations, working alongside the robots to determine how best to store inventory on the shelving pods and how items will be shipped together.

Dresser quickly rejects the premise that robots will outright replace their human co-workers in the short term, noting that each has a separate, but complementary skill set.

“Associates and people in our builds are really good at finding where to put products in storage shelving,” he says. “Systems are not good at doing that. We’re able to leverage the things that people are good for and leverage the things that systems are good for. We see that pattern playing out in a lot of different applications. We certainly see that being an augmentation to what people are doing and helping them be as efficient as possible with what they’re doing.”

Floor safety is an increasing concern. A recent story about an exploding bear mace canister that sent 24 employees to the hospital at a Robbinsville Township, New Jersey warehouse has once again brought the issue to the top of mind. While the incident was initially reported to involve an Amazon robot, the company denies that any were involved.

The fencing around the robots is designed specifically to keep associates out of harm’s way — an increasingly important concern as large machinery becomes an everyday part of life at these sorts of factories. Human employees are generally not allowed in this enclosed space for reasons of safety and efficiency, but an imperfect system requires the occasional interaction. Things fall out of pods, robots break down.

It’s for this reason the company introduced the Robotic Safety Vest. The bright orange mesh vest includes a belt containing a variety of sensors that add a few pounds to the employee’s getup.

“It complements the technology that’s in the robotic drive system,” says Dresser. “That vest is detectable by the robot. There’s a system that knows the associate is nearby and based on that signal.”

An employee demonstrates the vest, flicking a button, opening the fence and walking onto the floor. Robots at a distance slow down, while ones in the immediate vicinity stop altogether. It’s another added level of safety. If the bear mace incident showed anything, it’s how quickly news of robotics-related accidents spread, whether or not an actual robot was ultimately involved.
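The tiered behavior described above — distant robots slow down, nearby robots stop — can be sketched as a simple distance-threshold policy. Note this is purely an illustration; the radii, names and logic here are assumptions, not details of Amazon's actual system.

```python
# Hypothetical sketch of the tiered safety behavior described above:
# robots far from a detected vest slow down, while ones in the
# immediate vicinity stop altogether. All thresholds are invented.

STOP_RADIUS_M = 3.0    # inside this radius, robots halt entirely
SLOW_RADIUS_M = 10.0   # inside this radius, robots reduce speed

def speed_factor(distance_to_vest_m: float) -> float:
    """Return a multiplier applied to a robot's normal travel speed."""
    if distance_to_vest_m <= STOP_RADIUS_M:
        return 0.0   # immediate vicinity: full stop
    if distance_to_vest_m <= SLOW_RADIUS_M:
        return 0.5   # within range: slow down
    return 1.0       # far away: unaffected

print(speed_factor(2.0))   # 0.0
print(speed_factor(5.0))   # 0.5
print(speed_factor(20.0))  # 1.0
```

The appeal of a scheme like this is that it degrades gracefully: a robot that merely detects the vest's signal can act conservatively without needing to coordinate with a central controller.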

As more people buy more products online, these armies of robotics will no doubt play an increasingly central role in meeting that demand.

Last day to save $100 on TC Sessions: Robotics + AI 2019

Like all good things must, early-bird pricing for tickets to TechCrunch Sessions: Robotics + AI comes to an end today. This is your last chance to save $100 on our day-long gathering of the greatest minds, makers and investors in the robotics and AI communities. Procrastination has a price. Buy your $249 ticket today or pay $349 tomorrow. That’s basic math right there.

TC Sessions: Robotics + AI takes place on April 18 at UC Berkeley’s Zellerbach Hall, and we’re stoked about the agenda of incredible speakers, panel discussions, demos and workshops we have planned. We think you’ll love it, too.

Previously on TechCrunch, we shared just a few of the heavy-hitters you’ll hear from, including Marc Raibert, Melonee Wise, Ken Goldberg, Anca Dragan, Anthony Levandowski and Laura Major. But wait, there’s more!

We’re beyond thrilled to add Colin Angle, iRobot co-founder and CEO, to our list of industry headliners gracing our stage. If anyone knows mainstream home robots, it’s this guy. He’s one of the creators of Roomba, the first successful consumer robot and the best-selling vacuum in the U.S. Angle will discuss Terra, iRobot’s new robotic lawnmower, and the 10 years of R&D involved in bringing it to market.

What does it take for someone to invest in a robotics startup? That’s a red-hot topic, and you’ll get to hear Peter Barrett (Playground Global), Hidetaka Aoki (Global Brain), Helen Liang (FoundersX Ventures) and Andy Wheeler (GV) discuss what they’ve learned in a panel called, “Investing in Robotics and AI: Lessons from the Industry’s VCs.”

We expect more than 1,000 members of the robotics and AI community to attend, and that makes it a networking event you won’t want to miss. These are the influencers, the people who can help you make your robotic dreams come to life. Or perhaps you’ll invest in the next robot unicorn. It’s a day full of opportunity and potential. Be sure to use CrunchMatch (powered by Brella), our free business match-making service. It’s curated and automated, and it makes connecting with the right people painless and efficient.

TechCrunch Sessions: Robotics + AI takes place at UC Berkeley’s Zellerbach Hall on April 18, 2019. If you’re part of the robotics/AI startup ecosystem, you do not want to miss this event. And you certainly don’t want to pay more than necessary. Today’s the last day to save $100 on admission. Buy your early-bird ticket now, and we’ll see you in April.

Toyota and Panasonic will showcase assistive robotics during the Tokyo Summer Olympics

Next year, the world’s top athletes will compete in the Tokyo Summer Games. Some of Japan’s biggest companies will also happily be using the opportunity to show off their latest technologies — namely a fleet of robots aimed at helping human spectators navigate the event.

At a press conference this week, officials behind the games showcased some of the technologies that will be on display. The event is set to show the world Japan’s close working relationship with robots designed to help an aging populace.

“Robots should not overwhelm people,” Tokyo Olympics Vice General Director Masaaki Komiya said at the event. “Robots are something that have an amicable relationship with human beings and can work together. That’s the kind of robots we envision.”

Toyota, a longtime developer of assistive robotics, will take center stage here. The company is a key sponsor of the games and is bringing 16 support robots, along with delivery robots. Fellow sponsor Panasonic will also participate, bringing 20 robotic exoskeletons to help transport luggage and lift other heavy objects.

The Olympics provide the perfect spotlight for the technologies, as a curious world tunes into the games. But the technologies on display here also fittingly highlight a less flashy side of robotics than we’re accustomed to seeing on international television. Instead, they’ll provide a showcase for the role robotics will play in the short-term future of many of our day-to-day lives: providing assistive technologies for those who need them.

UC Berkeley’s Ken Goldberg and Michael I. Jordan will discuss AI at TC Sessions: Robotics + AI April 18

We’re just over a month out from our TC Sessions: Robotics + AI event at UC Berkeley on April 18. We’ve already announced a number of marquee guests for the event, including Marc Raibert, Colin Angle, Melonee Wise and Anthony Levandowski. Today we’ve got another exciting panel to unveil and as an FYI our early bird sale ends Friday!

This is our third robotics event, but it’s the first time artificial intelligence has shared the spotlight. Today we’re revealing that two of UC Berkeley’s top names in the space will be sharing a stage to discuss the role of AI in society in a panel titled “Artificial Intelligence: Minds, Economies and Systems that Learn.”

The pair of professors will be discussing how AI grew to become one of modern society’s most ubiquitous and wide-ranging technologies. The panel will also explore where the tech will go from here.

Ken Goldberg is a Professor of Industrial Engineering and Operations Research at U.C. Berkeley. He has co-authored more than 200 peer-reviewed papers on automation, robotics and social information. He is the Editor-in-Chief of IEEE Transactions on Automation Science and Engineering and cofounder of the Berkeley Center for New Media.

Michael I. Jordan is the Pehong Chen Distinguished Professor in the Department of Electrical Engineering and Computer Science and the Department of Statistics at U.C. Berkeley. His work touches on a wide range of topics, including computer science, AI and computational biology. He is a member of the National Academy of Engineering, the American Academy of Arts and Sciences and a Fellow of the American Association for the Advancement of Science.

Early bird ticket sales end tomorrow, Friday. Book your tickets today and save $100 before prices increase.

Students, grab your discounted $45 tickets here.

Startups, make sure to check out our demo table packages, which include 3 tickets for just $1500.

MIT’s deflated balloon robot hand can pick up objects 100x its own weight

Soft, biologically inspired robots have become one of the field’s most exciting offshoots, with machines that are capable of squeezing between obstacles and conforming to the world around them. A joint project between MIT CSAIL and Harvard’s Wyss converts those learnings into a simple, soft robotic gripper capable of handling delicate objects and picking up things up to 100x its own weight.

The gripper itself is made of an origami-inspired skeletal structure, covered in either fabric or a deflated balloon. It’s a principle the team recently employed on another project designed to create low-cost artificial muscles. A connector attaches the gripper to the arm and also sports a vacuum tube that sucks air out from the gripper, collapsing it around an object.

Like Soft Robotics’ commercial gripper, the malleable nature of the device means it can grab hold of a wide range of different objects with less need for a complex vision system. It also means that it can grab hold of delicate items without damaging them in the process.

“Previous approaches to the packing problem could only handle very limited classes of objects — objects that are very light or objects that conform to shapes such as boxes and cylinders, but with the Magic Ball gripper system we’ve shown that we can do pick-and-place tasks for a large variety of items ranging from wine bottles to broccoli, grapes and eggs,” MIT professor Daniela Rus says in a release tied to the news. “In other words, objects that are heavy and objects that are light. Objects that are delicate, or sturdy, or that have regular or free form shapes.”

Tiny claws let drones perch like birds and bats

Drones are useful in countless ways, but that usefulness is often limited by the time they can stay in the air. Shouldn’t drones be able to take a load off too? With these special claws attached, they can perch or hang with ease, conserving battery power and vastly extending their flight time.

The claws, created by a highly multinational team of researchers I’ll list at the end, are inspired by birds and bats. The team noted that many flying animals have specially adapted feet or claws suited to attaching the creature to its favored surface. Sometimes they sit, sometimes they hang, sometimes they just kind of lean on it and don’t have to flap as hard.

As the researchers write:

In all of these cases, some suitably shaped part of the animal’s foot interacts with a structure in the environment and facilitates that less lift needs to be generated or that power flight can be completely suspended. Our goal is to use the same concept, which is commonly referred to as “perching,” for UAVs [unmanned aerial vehicles].

“Perching,” you say? Go on…

We designed a modularized and actuated landing gear framework for rotary-wing UAVs consisting of an actuated gripper module and a set of contact modules that are mounted on the gripper’s fingers.

This modularization substantially increased the range of possible structures that can be exploited for perching and resting as compared with avian-inspired grippers.

Instead of trying to build one complex mechanism, like a pair of articulating feet, the team gave the drones a set of specially shaped 3D-printed static modules and one big gripper.

The drone surveys its surroundings using lidar or some other depth-aware sensor. This lets it characterize surfaces nearby and match those to a library of examples that it knows it can rest on.

As the paper’s figures show, squared-off edges can be rested on from above, while a pole can be grasped and balanced on.

If the drone sees a pole it needs to rest on, it can grab it from above. If it’s a horizontal bar, it can grip it and hang below, flipping up again when necessary. If it’s a ledge, it can use a little cutout to steady itself against the corner, letting it shut off some or all of its motors. These modules can easily be swapped out or modified depending on the mission.
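The perch-selection loop described above — classify a nearby surface from depth data, match it against a library of known perchable types, then pick the corresponding resting strategy — can be sketched roughly like this. Everything here (class names, surface categories, width limits) is an illustrative assumption, not the paper's actual implementation.

```python
# Illustrative sketch of perch selection: match a sensed surface against
# a small library of perchable types, each tied to a resting strategy.
# Surface kinds and width limits are invented for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SurfaceReading:
    kind: str        # classified from lidar/depth data: "pole", "bar", "ledge"
    width_m: float   # characteristic dimension of the surface

# Library of surfaces the drone "knows" it can rest on, with the strategy
# and the maximum width the gripper can close around for each.
PERCH_LIBRARY = {
    "pole":  {"strategy": "grab from above",       "max_width_m": 0.05},
    "bar":   {"strategy": "grip and hang below",   "max_width_m": 0.05},
    "ledge": {"strategy": "steady against corner", "max_width_m": float("inf")},
}

def choose_perch(reading: SurfaceReading) -> Optional[str]:
    """Return a resting strategy, or None if the surface isn't usable."""
    entry = PERCH_LIBRARY.get(reading.kind)
    if entry and reading.width_m <= entry["max_width_m"]:
        return entry["strategy"]
    return None  # keep flying and keep looking

print(choose_perch(SurfaceReading("bar", 0.03)))   # grip and hang below
print(choose_perch(SurfaceReading("pole", 0.10)))  # None: too wide to grasp
```

The hard part the article identifies — recognizing useful surfaces and positioning precisely — lives upstream of this matching step, in the perception system that produces the `SurfaceReading` in the first place.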

I have to say the whole thing actually seems to work remarkably well for a prototype. The hard part appears to be the recognition of useful surfaces and the precise positioning required to land on them properly. But it’s useful enough — in professional and military applications especially, one suspects — that it seems likely to be a common feature in a few years.

The paper describing this system was published in the journal Science Robotics. I don’t want to leave anyone out, so it’s by: Kaiyu Hang, Ximin Lyu, Haoran Song, Johannes A. Stork, Aaron M. Dollar, Danica Kragic and Fu Zhang, from Yale, the Hong Kong University of Science and Technology, the University of Hong Kong, and the KTH Royal Institute of Technology.

Take NVIDIA’s new Deep Learning Robotics Workshop at TC Sessions: Robotics + AI

As part of TechCrunch Sessions: Robotics + AI, we are happy to announce a partnership to deliver a brand new course from NVIDIA’s Deep Learning Institute (DLI). Called the NVIDIA Deep Learning for Robotics Workshop, this never-before-offered course will provide training in a new 3D-accelerated remote desktop environment on April 17 in Berkeley, the day before the main show.

Under the supervision of an NVIDIA DLI instructor, the 60 participants will explore how to create robotics solutions on a Jetson for embedded applications and how to implement and deploy an end-to-end project through hands-on training. This eight-hour workshop will also teach how to:

  • Apply computer vision models to perform detection
  • Prune and optimize the model for embedded application
  • Train a robot to actuate the correct output based on the visual input
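To give a flavor of the second bullet, here is a toy sketch of magnitude pruning, the standard way of zeroing out a model's smallest weights so it fits on an embedded device. This is a from-scratch illustration on a plain list of numbers, not the workshop's material; real pipelines prune tensors inside a deep-learning framework.

```python
# Toy illustration of magnitude pruning: zero out the fraction of
# weights with the smallest absolute values, producing a sparse model
# that is cheaper to store and run on embedded hardware.

def prune_weights(weights, sparsity):
    """Zero the `sparsity` fraction of weights with smallest magnitude."""
    n_prune = int(len(weights) * sparsity)
    # indices of the n_prune smallest-magnitude weights
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    to_zero = set(order[:n_prune])
    return [0.0 if i in to_zero else w for i, w in enumerate(weights)]

w = [0.9, -0.02, 0.4, 0.01, -0.7, 0.05]
print(prune_weights(w, 0.5))  # [0.9, 0.0, 0.4, 0.0, -0.7, 0.0]
```

After pruning, the surviving weights are typically fine-tuned for a few more epochs to recover any lost accuracy before the model ships to the device.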

Upon completion, participants will know how to deploy high-performance deep learning applications for robotics, and they can earn an NVIDIA certificate by passing a code-based assessment.

What are the prerequisites? All that’s required is a basic familiarity with deep neural networks and basic coding experience in Python or a similar language.

Registration is $299* for workshop tickets. All supplies will be provided except for a laptop. (We’re providing the super-fat bandwidth the course requires.) There are only 60 spots available, so grab tickets while they last.

Check out the website for more information.

*Tickets to the April 17 NVIDIA workshop do not grant access to TC Sessions: Robotics + AI on April 18. Tickets are sold separately.

2 days left to save $100 on TC Sessions: Robotics + AI

For the love of robots, don’t miss your chance to save $100 on admission to TechCrunch Sessions: Robotics + AI. Our annual, day-long event goes deep on all things related to robotics and artificial intelligence, and you’ll hear from the greatest tech and investment minds and makers. Danger, Will Robinson, danger! The early-bird price — and the $100 savings — disappears in only two days, so buy your ticket right now.

Last year, more than 1,000 people attended our robo-fest, which makes it a prime opportunity for networking. This year, we’re adding CrunchMatch — TechCrunch’s free business match-making service — into the mix. This handy tool simplifies the networking process by helping you find and connect with the right people based on specific mutual criteria, goals and interests.

We’ve prepared an outstanding lineup of interviews, panel discussions, demos and workshops, and we still have a few more surprises to add over the next few weeks. Check out the event agenda and keep checking back for updates. In the meantime, here’s a taste to whet your proverbial whistle.

Founders might wish they could read investors’ minds, but we have the next best thing. A TechCrunch editor will moderate an all-star robotics and AI investor panel with Peter Barrett (Playground Global), Helen Liang (FoundersX Ventures), Andy Wheeler (GV) and Hidetaka Aoki (Global Brain). Even better, the panel will take audience questions, too.

Human-robot interaction has come a long way from Isaac Asimov’s short story, “Robbie.” HRI incorporates just about every aspect of AI and robotics, and we’re thrilled to host a panel discussion on this challenging topic featuring Anca Dragan (UC Berkeley’s Interact Lab), Rana el Kaliouby (Affectiva) and Matt Willis (SoftBank Robotics). Learn where HRI stands today and where it will lead tomorrow.

Building a tech startup ain’t easy, as you well know. But building a robotics startup, well, let’s just say it’s not for the faint-hearted. We’ve tapped several brave hearts to share their lessons learned. You’ll hear from Melonee Wise, CEO of Fetch Robotics (funds raised: $48 million), Manish Kothari, president of SRI Ventures, and Nima Keivan, CTO and co-founder of Canvas Technology (funds raised: $15 million).

TechCrunch Sessions: Robotics + AI takes place April 18, at UC Berkeley’s Zellerbach Hall. If you love saving money as much as you love robots, buy your early-bird ticket now. That bird disappears — along with your $100 — in just two days.

This is the next-gen lionfish vacuuming robot

Roboticists have the strangest pet projects. For Colin Angle, it’s vacuuming up lionfish. The iRobot CEO is also the cofounder of RSE (that’s Robots in Service of the Environment), a volunteer-based organization designed to create what it says in the name.

The organization’s first project, announced back in 2017, is a robot designed to capture the invasive species, which are capable of decimating reef fish.

Following a successful $29,000 Kickstarter campaign, RSE just announced the launch of the intensely named Guardian LF1 Mark 3. The fish-vacuuming robot works at depths of up to 400 feet, where the fish tend to hang out and breed. It can be remotely operated via a laptop or mobile device for up to an hour at a go.

“The Lionfish are destroying the coral reef and decimating fish populations in the Atlantic,” Angle said in a release tied to the news. “The latest innovations incorporated into the RSE Guardian LF1 enable the undersea robotic solution to go deeper, fish longer and pull in a larger haul. With each technical milestone we cross we get one step closer to saving our greatest natural resource by empowering fishermen with new tools.”

The device stuns the fish with a zap before collecting up to ten in a single go. It’s currently a functioning prototype, which has been deployed during various testing missions in Florida.