Drones are making a difference in the world and regulatory agencies are helping

About two months ago, in the middle of the night, a small, specially designed unmanned aircraft system – a drone – carried a precious cargo at 300 feet altitude and 22 miles per hour from West Baltimore to the University of Maryland Medical Center downtown, a trip of about 5 minutes. They called it “One small hop for a drone; one major leap for medicine.”

The cargo was a human kidney, and waiting for that kidney at the hospital was a patient whose life would be changed for the better.

“This whole thing is amazing,” the 44-year-old recipient later told the University of Maryland engineering and medical teams that designed the drone and the smart container. The angel flight followed more than two years of research, development and testing by the Maryland aerospace and medical teams and close coordination with the Federal Aviation Administration (FAA).

There were many other ways the kidney could have been delivered to the hospital, but proving that it could be done by drone sets the stage for longer and longer flights that will ultimately lower the cost and speed up the time it takes to deliver an organ. And speed is life in this case – the experts say the length of time it takes to move an organ by traditional means is a major issue today.

This is one example of how small drones are already changing the landscape of our economy and society. Our job at the Department of Transportation (DOT), through the FAA, is to safely integrate these vehicles into the National Airspace System.

Time is of the essence. The Department has been registering drones for less than four years, and already there are four times as many drones (1.5 million) on the books as manned aircraft. This week in Baltimore, more than 1,000 members of the drone community are coming together to discuss the latest issues in this fast-growing sector as part of the fourth annual FAA UAS Symposium, which the Department co-hosts with the Association for Unmanned Vehicle Systems International.

Along with public outreach, the Department is also involved in demonstration projects, including the Integration Pilot Program, or IPP. Created by this Administration in 2017, the IPP allows the FAA to work with state, local and tribal governments across the U.S. to get the experience needed to develop the regulations, policy and guidance for safely integrating drones, including tackling tough topics like security and privacy. The experience gained and the data collected will help ensure the United States remains the global leader in safe UAS integration and fully realizes the economic and societal benefits of this technology.

A couple of IPP examples show the ingenuity of the drone community.

In San Diego, the Chula Vista police department and CAPE, a private UAS teleoperations company, are using drones as first responders to potentially save the lives of officers and make the department more efficient. Since October, they have launched drone first responders on more than 400 calls in which 59 arrests were made, and for half of those calls, the drone was first on the scene with an average on-scene response time of 100 seconds. Equally important are the 60 times that having the drone there first eliminated the need to send officers at all.

Recently, as the result of an IPP project, the FAA granted the first air carrier certification to Alphabet Inc.’s Wing Aviation, a commercial drone operator that will deliver packages in rural Blacksburg, Virginia.

What happens next is that the FAA will gradually implement new rules to expand when and how those operators can conduct their business safely and securely. To manage all the expected traffic, the FAA is working with NASA and industry on a highly automated UAS Traffic Management, or UTM, concept.

At the end of the day, drones will help communities like Baltimore — and others throughout the country — save lives and deliver new services. DOT and the FAA will help ensure it’s all done safely, and that public concerns about privacy and security are addressed.

MIT develops a better way for robots to predict human movement

Humans and robots working together hold tremendous potential in factory and construction settings, but robots can also be incredibly dangerous to people, especially when they’re large and powerful, as industrial robots typically are.

There are plenty of efforts to make ‘cobotics’ a reality, including production machines like the YuMi from Swiss robotics giant ABB. But a new algorithm created by MIT researchers could help make humans and robots working together even safer.

Researchers working with automaker BMW observed its production workflow and noticed that the robots were overly cautious when it came to watching out for the humans in the plant – they’d lose lots of potentially productive time waiting for people to cross their paths long before there was any real chance of them actually doing so.

They’ve now developed a solution that greatly improves robots’ ability to anticipate the trajectory of humans as they move, allowing robots that typically freeze in the face of anything even vaguely resembling a person walking in their path to continue operating and move around the flow of human foot traffic.

The researchers managed this by eschewing the usual practice of borrowing from music and speech processing for algorithmic prediction, techniques that work well only for predictable paths of travel. Instead, they came up with a ‘partial trajectory’ method that compares real-time trajectory data against a large library of previously gathered reference trajectories.

This is a much better way of anticipating human movement, which is very rarely consistent and involves a lot of stops and starts, even in a factory worker performing the same action repeatedly over thousands of instances.
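The partial-trajectory idea can be sketched as a nearest-neighbor lookup over trajectory prefixes: score each stored trajectory by how closely its opening matches the motion observed so far, then use the remainder of the best match as the prediction. This is a simplified illustration of the general concept, not the MIT team’s actual algorithm; the library, the distance metric and the data below are all invented for the example.

```python
import numpy as np

def predict_remainder(partial, library):
    """Predict the rest of a trajectory from a partial observation.

    partial: (k, 2) array of positions observed so far.
    library: list of (n, 2) reference trajectories (n > k).
    Returns the remainder of the closest-matching reference,
    or None if no reference is long enough.
    """
    k = len(partial)
    best, best_cost = None, np.inf
    for ref in library:
        if len(ref) <= k:
            continue  # too short to offer a prediction
        # Mean squared distance between the observed prefix
        # and the first k points of this reference.
        cost = np.mean(np.sum((ref[:k] - partial) ** 2, axis=1))
        if cost < best_cost:
            best, best_cost = ref[k:], cost
    return best

# Toy library: a straight walk, and a walk that curves away early.
straight = np.column_stack([np.linspace(0, 9, 10), np.zeros(10)])
turning = np.array([[0, 0], [1, 0], [2, 0.5],
                    [2.5, 1.5], [2.5, 3.0], [2.5, 4.5]])
library = [straight, turning]

# An observed prefix moving straight along the x-axis.
observed = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
future = predict_remainder(observed, library)
```

In this toy example the observed prefix tracks the straight walk exactly, so the predicted remainder is the rest of that reference trajectory.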

This could have consumer applications too – the researchers note that human movement in the home would also be better predicted using this method, which could have benefits for robotic long-term in-home care of the elderly, for instance.

RealityEngines.AI raises $5.25M seed round to make ML easier for enterprises

RealityEngines.AI, a research startup that wants to help enterprises make better use of AI, even when they only have incomplete data, today announced that it has raised a $5.25 million seed funding round. The round was led by former Google CEO and Chairman Eric Schmidt and Google founding board member Ram Shriram. Khosla Ventures, Paul Buchheit, Deepchand Nishar, Elad Gil, Keval Desai, Don Burnette and others also participated in this round.

The fact that the service was able to raise from this rather prominent group of investors clearly shows that its overall thesis resonates. The company, which doesn’t have a product yet, tells me that it specifically wants to help enterprises make better use of the smaller and noisier datasets they have and provide them with state-of-the-art machine learning and AI systems that they can quickly take into production. It also aims to provide its customers with systems that can explain their predictions and are free of various forms of bias, something that’s hard to do when the system is essentially a black box.

As RealityEngines CEO Bindu Reddy, who was previously the head of products for Google Apps, told me, the company plans to use the funding to build out its research and development team. The company, after all, is tackling some of the most fundamental and hardest problems in machine learning right now — and that costs money. Some, like working with smaller datasets, already have some available solutions, like generative adversarial networks that can augment existing datasets, and RealityEngines expects to innovate on them.

Reddy is also betting on reinforcement learning as one of the core machine learning techniques for the platform.

Once it has its product in place, the plan is to make it available as a pay-as-you-go managed service that will make machine learning more accessible not only to large enterprises but also to small and medium businesses, which increasingly need access to these tools to remain competitive.

On the road to self-driving trucks, Starsky Robotics built a traditional trucking business

More than three years ago, self-driving trucks startup Starsky Robotics was founded to solve a fundamental issue with freight — a solution that CEO Stefan Seltz-Axmacher believes hinges on getting the human driver out from behind the wheel.

But a funny thing happened along the way. Starsky Robotics started a regular ol’ trucking company. Now, nearly half of the employees at this self-driving truck startup help run a business that uses the traditional model of employing human drivers to haul loads for customers, TechCrunch has learned.

Starsky’s trucking business, which has been operating in secret for nearly two years alongside the company’s more public pursuit of developing autonomous vehicle technology, has hauled 2,200 loads for customers. The company has 36 regular trucks that only use human drivers to haul freight. It has three autonomous trucks that are driven and supported by a handful of test drivers. Starsky also employs a number of office people who, as Seltz-Axmacher notes, “know how to run trucks.”

The CEO and co-founder contends that without the human-driven trucking piece, Starsky won’t ever have an operational, or profitable, self-driving truck business. The trucking business has generated revenue, led to key partnerships with Schneider Logistics, Penske and Transport Enterprise Leasing and, importantly, helped build a company that works in the real world. It has also been a critical tool for recruiting and vetting safety drivers and teleoperators (or remote drivers), according to Seltz-Axmacher.

“The decision to have a trucking business interact with the real trucking world in parallel with developing the robotics piece is a necessary part of building a longstanding business in the space,” said Reilly Brennan, general partner at Trucks VC and the first institutional investor in Starsky.

Starsky, which was co-founded by Seltz-Axmacher and Kartik Tiwari, has raised $21.7 million in equity from investors including Shasta Ventures and Trucks VC.

The evolution over at Starsky illustrates the challenge that awaits the autonomous vehicle industry and the giant companies and startups operating within it. Even after engineers solve the complexity of building an AI-powered driver that’s better than a human, these companies must figure out the equally intricate task of operations. Robotaxis, autonomous delivery robots and self-driving trucks won’t matter if humans don’t use, like or trust the tech.

Figuring out the basics of operations — including the rather pedestrian and obvious ones — will mean the difference between making or losing money. Or, having a business at all.

And the stakes are high. Trucks are the backbone of the U.S. economy and moved more than 70% of all U.S. freight and generated more than $700 billion in 2017, according to the most up-to-date statistics available from the American Trucking Associations (ATA).

Companies pursuing robotaxis and other autonomous vehicle programs are going to eventually wake up — if they haven’t already — to the same realities that Starsky has accepted, Brennan contends.

“The interaction with the market, particularly in logistics, is vital,” Brennan said, adding that companies pursuing robotaxis that haven’t built out and tested a consumer-facing app risk the same problems. “They need to have a business on day one, not on day 720.”

For Starsky, it started with something as basic as having a working vehicle and access to mechanics that could fix it.

Trucks, the hard way

Seltz-Axmacher admits now he underestimated how difficult trucks could be.

“Hey, it’s a truck, how hard can buying one be?” said Seltz-Axmacher, as he described the company’s first major purchase of a truck for about $50,000. “We quickly realized that having a truck and driving a truck are not easy things to do.”

Starsky engineers retrofitted the truck, named Rosebud, with its autonomous driving system and made plans to test it at the Thunderhill Raceway about 150 miles north of San Francisco. It didn’t make it. The truck’s engine was smoking by the time it crossed the Bay Bridge. And then the truck, along with all those engineers, sat for two weeks while Seltz-Axmacher hunted for a diesel mechanic.

Self-driving truck startup Starsky Robotics began with this first, problematic truck

The truck, pictured above, continued to break down. The company ran into more snafus, including a problem with insurance and the title of the vehicle. Starsky was going to miss a key milestone and Seltz-Axmacher was going to have to tell investors that it wasn’t because of bottlenecks in engineering, but because they didn’t know how to manage the truck part of this self-driving truck company.

The founders learned that even “average” trucks needed to go to the shop every 60 days, which is operationally complex when vehicles are traveling throughout the United States.

Starsky ended up making a key hire to organize the enterprise: Paul Schlegel, a veteran of trucking operations. Schlegel, who has 32 years in the transportation industry with companies such as Schneider National and Stevens Transport, developed a trucking business that supports the autonomous trucks but can still operate in their absence. The trucking operations team is based in Dallas.

The driver pinch point

Seltz-Axmacher has said repeatedly that “unless you’re getting the driver out of the truck, you’re not solving anything.”

The problem in trucking is the supply of drivers. The chronic shortage has, in turn, driven up costs. For instance, the median salary for a truckload driver working a national, irregular route was more than $53,000 — a $7,000 increase from ATA’s last survey, which covered annual pay for 2013, or an increase of 15%. It’s even higher for private fleet drivers, who saw their pay rise to more than $86,000 from $73,000, or a gain of nearly 18%.

Starsky soon found that finding the right drivers was just as hard as finding the right trucks. Federal Motor Carrier Safety Administration records show the company has reported three crashes of its manually driven trucks.

Seltz-Axmacher said they’ve had a driver make a wrong turn and have a low-hanging branch rip a hole in the side of a trailer. The most serious incident involved a new driver who took an offramp in Florida too fast and rolled the truck onto its side. No one was injured and the driver was terminated.

These drivers are critical to the autonomous program and the best of them end up becoming teleop controllers, a job that involves sitting in an office, not logging days and weeks in a truck.

Starsky is taking a dual approach to its autonomous trucks. It outfits regular trucks with a combination of sensors like radar and cameras along with software that allows long-haul trucks to drive autonomously on the highway. When the truck is about to exit, a trained remote operator, who is sitting in an office, takes over and navigates the truck to its final destination.
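The handoff described above, with software driving on the highway and a remote operator taking over near the exit, can be sketched as a simple mode selector. Everything here is a hypothetical illustration: the handoff distance and the decision logic are assumptions, not Starsky’s actual system.

```python
from enum import Enum

class Mode(Enum):
    AUTONOMOUS = "autonomous"  # software drives on the highway
    TELEOP = "teleop"          # remote operator drives the first/last mile

def select_mode(on_highway, distance_to_exit_m, handoff_threshold_m=1500.0):
    """Pick who is driving. The 1,500 m handoff threshold is a
    made-up illustration, not a real operating parameter."""
    if on_highway and distance_to_exit_m > handoff_threshold_m:
        return Mode.AUTONOMOUS
    # Off the highway, or approaching the exit: hand control
    # to the remote operator well before the maneuver begins.
    return Mode.TELEOP
```

With these assumptions, a truck cruising mid-route stays autonomous, while one nearing its exit, or already on surface streets, is driven remotely.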

The promise of being able to be promoted to teleoperator is a big part of how Starsky is able to hire drivers effectively. The company contends it wouldn’t be possible to find 25 highly skilled safety and remote drivers without having a broader fleet of regular truck drivers to choose from.

Robotrucks or bust

The ultimate goal of Starsky Robotics hasn’t changed, Seltz-Axmacher said. To get there, the company recently hired Ain McKendrick as vice president of engineering, and former Tesla executive Keith Flynn to head up its hardware manufacturing to support Starsky’s fleet build. McKendrick, who co-founded Podtek and Lyve, also has experience at autonomous vehicle company Cyngn, Highfive, Netflix and Dell.

By early 2020, the company aims to have 25 autonomous trucks — a goal that is only possible if it has 100 regular trucks, he added.

The only way Starsky can scale its operations on the autonomous side is to continue to scale its regular trucking operations six months in advance. In other words, the regular trucking business is inextricably linked to the success of deploying autonomous trucks.

The company has already found that the 15-plus brokers that are regularly giving it freight to haul are ready for driverless trucks.

“Many times the brokers who have given us loads have been fairly ambivalent to whether or not we’re hauling that freight with a self-driving truck,” Seltz-Axmacher said. “The concern that this is a technology-averse industry that might not be willing to accept self-driving trucks has proven not to be true.”

Zoox co-founder Jesse Levinson is coming to TC Sessions: Mobility

Autonomous vehicle startup Zoox has a history of keeping its progress and plans to itself. But that’s starting to change.

The venture-backed company that is creating ground-up fully autonomous electric vehicles is ready to share a bit more about its tech, strategy and plans. And who better to talk to than co-founder and CTO Jesse Levinson, the person who oversees the company’s software, artificial intelligence, computing and sensing platforms.

We’re excited to announce that Levinson will join us onstage at TC Sessions: Mobility on July 10 in San Jose. TechCrunch will discuss with Levinson the tech that is driving the company’s autonomous vehicles, recent changes at Zoox, including its new CEO Aicha Evans, challenges facing the company and its deployment plans.

Levinson is among a group of insiders who participated in early government-backed competitions aimed at pushing the development of autonomous vehicles. While completing a computer science Ph.D. and postdoc under Sebastian Thrun at Stanford University, Levinson developed algorithms for the school’s winning entry in the 2007 DARPA Urban Challenge. He went on to lead the self-driving car team’s research efforts before joining Zoox.

Levinson also co-created a popular mobile photography app, Pro HDR, that has been purchased by more than a million people.

Levinson is just one of the many leaders in autonomous vehicles, scooters and electric mobility who will participate in TC Sessions: Mobility.

The agenda is packed with some of the biggest names and most exciting startups in the transportation industry, including Mobileye co-founder and CEO Amnon Shashua, Alisyn Malek with May Mobility, Dmitri Dolgov at Waymo, Karl Iagnemma of Aptiv, Seleta Reynolds of the Los Angeles Department of Transportation and Ford Motor CTO Ken Washington. Others include Katie DeWitt of Scoot, Argo AI’s chief safety officer Summer Fowler, Uber’s engineering director for Elevate Mark Moore and Stonly Baptiste, co-founder of early-stage venture capital fund Urban Us. With early-bird ticket sales ending soon, you’ll want to be sure to grab your tickets.

We have a few surprises too, including demos showcasing some cool tech and startups coming out of stealth.

The event will feature startup founders and industry experts taking part in discussions about the future of transportation, the promise and problems of autonomous vehicles, the potential for bikes and scooters, investing in early-stage startups and more.

Early-bird tickets are now on sale — save $100 on tickets before prices go up after June 14.

Students, you can grab your tickets for just $45.

Apple reportedly exploring acqui-hire of self-driving startup Drive.ai

Apple is potentially seeking to acquire Silicon Valley autonomous driving startup Drive.ai, according to a new report from The Information. The report describes the acquisition as in process and says it will be an ‘acqui-hire,’ meaning its primary goal is to bring in the talent of Drive.ai – presumably with a special focus on the engineering talent of the self-driving tech company.

Drive.ai got its start in 2016, founded by a crack team of graduates from Stanford’s AI lab. It focused originally on building out not only the functional autonomy of driving systems, but also intelligent communications systems that would help self-driving vehicles better integrate with existing human drivers and pedestrians.

The company later raised more money with a business model shift focused on retrofitting existing fleets of commercial vehicles, and last year began testing its own self-driving pick-up and drop-off service in Frisco, Texas.

The Information reported earlier this year that Drive.ai started seeking potential buyers for the company after finding fewer options in terms of continued funding and independent operation. Apple, for its part, has had a spotty history with its own efforts around autonomous driving, with some high-profile leadership shifts on its so-called ‘Titan’ car project. It’s still actively testing vehicles on roads, but the scope and shape of its approach aren’t entirely clear.

We’ve reached out to both Apple and Drive.ai, both of which declined to comment to The Information on the original report, and will update if we hear back.

A first look at Amazon’s new delivery drone

For the first time, Amazon today showed off its newest fully electric delivery drone at its first re:Mars conference in Las Vegas. Chances are, it neither looks nor flies like what you’d expect from a drone. It’s an ingenious hexagonal hybrid design, though, that has very few moving parts and uses the shroud that protects its blades as its wings when it transitions from vertical, helicopter-like flight at takeoff to its airplane-like mode.

These drones, Amazon says, will start making deliveries in the coming months, though it’s not yet clear where exactly that will happen.

What’s maybe even more important, though, is that the drone is chock-full of sensors and a suite of compute modules that run a variety of machine learning models to keep the drone safe. Today’s announcement marks the first time Amazon is publicly talking about those visual, thermal and ultrasonic sensors, which it designed in-house, and how the drone’s autonomous flight systems maneuver it to its landing spot. The focus here was on building a drone that is as safe as possible and independently safe: even when it’s not connected to a network and encounters a new situation, it’ll be able to react appropriately and safely.

When you see it fly in airplane mode, it looks a little bit like a TIE fighter, where the core holds all the sensors and navigation technology, as well as the package. The new drone can fly up to 15 miles and carry packages that weigh up to five pounds.

This new design is quite a departure from earlier models. I got a chance to see it ahead of today’s announcement and I admit that I expected a far more conventional design — more like a refined version of the last, almost sled-like, design.

Amazon’s last generation of drones looked very different.

Besides the cool factor of the drone itself, though (it’s probably a bit larger than you may expect), what Amazon is really emphasizing this week is the sensor suite and safety features it developed for it.

Ahead of today’s announcement, I sat down with Gur Kimchi, Amazon’s VP for its Prime Air program, to talk about the progress the company has made in recent years and what makes this new drone special.

“Our sense and avoid technology is what makes the drone independently safe,” he told me. “I say independently safe because that’s in contrast to other approaches where some of the safety features are off the aircraft. In our case, they are on the aircraft.”

Kimchi also stressed that Amazon designed virtually all of the drone’s software and hardware stack in-house. “We control the aircraft technologies from the raw materials to the hardware, to software, to the structures, to the factory to the supply chain and eventually to the delivery,” he said. “And finally the aircraft itself has controls and capabilities to react to the world that are unique.”


What’s clear is that the team tried to keep the actual flight surfaces as simple as possible. There are four traditional airplane control surfaces and six rotors. That’s it. The autopilot, which evaluates all of the sensor data and which Amazon also developed in-house, gives the drone six degrees of freedom to maneuver to its destination. The angled box at the center of the drone, which houses most of the drone’s smarts and the package it delivers, doesn’t pivot. It sits rigidly within the aircraft.

It’s unclear how loud the drone will be. Kimchi would only say that it’s well within established safety standards and that the profile of the noise also matters. He likened it to the difference between hearing a dentist’s drill and classical music. Either way, though, the drone is likely loud enough that it’s hard to miss when it approaches your backyard.

To see what’s happening around it, the new drone uses a number of sensors and machine learning models — all running independently — that constantly monitor the drone’s flight envelope (which, thanks to its unique shape and controls, is far more flexible than that of a regular drone) and environment. These include regular camera images and infrared cameras to get a view of its surroundings. There are multiple sensors on all sides of the aircraft so that it can spot things that are far away, like an oncoming aircraft, as well as objects that are close, when the drone is landing, for example.

The drone also uses various machine learning models to, for example, detect other air traffic around it and react accordingly, or to detect people in the landing zone or to see a line over it (which is a really hard problem to solve, given that lines tend to be rather hard to detect). To do this, the team uses photogrammetrical models, segmentation models and neural networks. “We probably have the state of the art algorithms in all of these domains,” Kimchi argued.

Whenever the drone detects an object or a person in the landing zone, it obviously aborts — or at least delays — the delivery attempt.

“The most important thing the aircraft can do is make the correct safe decision when it’s exposed to an event that isn’t in the planning — that it has never been programmed for,” Kimchi said.

The team also uses a technique known as Visual Simultaneous Localization and Mapping (VSLAM), which helps the drone build a map of its current environment, even when it doesn’t have any other previous information about a location or any GPS information.
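To make the mapping idea concrete, here is a deliberately tiny, translation-only sketch of the SLAM principle: if landmarks are fixed, the frame-to-frame shift in where they appear reveals the robot’s own motion, and accumulated poses plus relative observations yield a map. Real VSLAM also estimates rotation, extracts and associates visual features between camera frames and handles noise; none of that is shown here, and this is in no way Amazon’s implementation. All names and data are invented for the example.

```python
import numpy as np

def toy_vslam(observations):
    """Translation-only 2D SLAM sketch.

    observations: list of (n, 2) arrays; frame t gives each fixed
    landmark's position relative to the robot. Returns the estimated
    robot trajectory and landmark map, with the first pose pinned at
    the origin (SLAM only recovers relative geometry).
    """
    pose = np.zeros(2)
    trajectory = [pose.copy()]
    for prev, curr in zip(observations, observations[1:]):
        # For fixed landmarks, the change in where they appear is
        # exactly the robot's own motion, negated. Average over
        # landmarks to suppress per-landmark measurement noise.
        motion = np.mean(prev - curr, axis=0)
        pose = pose + motion
        trajectory.append(pose.copy())
    # Map: robot pose + relative observation, averaged over frames.
    landmark_map = np.mean(
        [obs + p for obs, p in zip(observations, trajectory)], axis=0)
    return np.array(trajectory), landmark_map

# Simulate two fixed landmarks and a robot stepping +1 in x per frame.
true_landmarks = np.array([[5.0, 0.0], [0.0, 5.0]])
true_poses = np.array([[t, 0.0] for t in range(4)])
obs = [true_landmarks - p for p in true_poses]

est_traj, est_map = toy_vslam(obs)
```

With noiseless observations the recovered trajectory and map match the simulated ground truth exactly, which is the sanity check the real system’s far richer pipeline must also pass.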

“That combination of perception and algorithmic diversity is what we think makes our system uniquely safe,” said Kimchi. As the drone makes its way to the delivery location or back to the warehouse, all of the sensors and algorithms always have to be in agreement. When one fails or detects an issue, the drone will abort the mission. “Every part of the system has to agree that it’s okay to proceed,” Kimchi said.
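That unanimous-agreement gate can be sketched in a few lines. The check names are hypothetical and this only illustrates the pattern, not Amazon’s code: a single dissenting or unavailable verdict aborts the action.

```python
def clear_to_proceed(checks):
    """Gate an action on unanimous agreement of independent checks.

    checks: dict mapping a check name to True (OK) or to False/None
    (failed or unavailable). Anything other than an explicit True
    counts as a veto, so a sensor dropout aborts rather than guesses.
    """
    return all(result is True for result in checks.values())

# Hypothetical sensor/model verdicts before a landing attempt:
verdicts = {
    "camera_clear": True,
    "thermal_clear": True,
    "ultrasonic_clear": True,
}
proceed = clear_to_proceed(verdicts)       # all agree: go ahead

verdicts["thermal_clear"] = None           # one sensor drops out
proceed_after_dropout = clear_to_proceed(verdicts)  # abort
```

Treating “no answer” the same as “no” is the conservative choice the quote describes: every part of the system has to agree before the drone proceeds.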

What Kimchi stressed throughout our conversation is that Amazon’s approach goes beyond redundancy, which is a pretty obvious concept in aviation and involves having multiple instances of the same hardware on board. Kimchi argues that having a diversity of sensors that are completely independent of each other is also important. The drone only has one angle of attack sensor, for example, but it also has a number of other ways to measure the same value.

Amazon isn’t quite ready to delve into all the details of what the actual on-board hardware looks like, though. Kimchi did tell me that the system uses more than one operating system and CPU architecture.

It’s the integration of all of those sensors, AI smarts and the actual design of the drone that makes the whole unit work. At some point, though, things will go wrong. The drone can easily handle a rotor that stops working, which is pretty standard these days. In some circumstances, it can even handle two failed units. And unlike most other drones, it can glide if necessary, just like any other airplane. But when it needs to find a place to land, its AI smarts kick in and the drone will try to find a safe place to land, away from people and objects — and it has to do so without having any prior knowledge of its surroundings.

Amazon Prime Air drone

To get to this point, the team actually used an AI system to evaluate more than 50,000 different configurations. Just the computational fluid dynamics simulations took up 30 million hours of AWS compute time (it’s good to own a large cloud when you want to build a novel, highly optimized drone, it seems). The team also ran millions of simulations, of course, with all of the sensors, and looked at all of the possible positions and sensor ranges — and even different lenses for the cameras — to find an optimal solution. “The optimization is what is the right, diverse set of sensors and how they are configured on the aircraft,” Kimchi noted. “You always have both redundancy and diversity, both from the physical domain — sonar versus photons — and the algorithmic domain.”

The team also ran thousands of hardware-in-the-loop simulations in which all the flight surfaces are actuating and all the sensors are perceiving the simulated environment. Here, too, Kimchi wasn’t quite ready to give away the secret sauce the team uses to make that work.

And the team obviously tested the drones in the real world to validate its models. “The analytical models, the computational models are very rich and are very deep, but they are not calibrated against the real world. The real world is the ultimate random event generator,” he said.

It remains to be seen where the new drone will make its first deliveries. That’s a secret Amazon also isn’t quite ready to reveal yet. That will happen within the next few months, though. Amazon started drone deliveries in England a while back, so that’s an obvious choice, but there’s no reason the company couldn’t opt for another country as well. The U.S. seems like an unlikely candidate, given that the regulations there are still in flux, but maybe that’s a problem that will be solved by then, too. Either way, what once looked like a bit of a Black Friday stunt may just land in your backyard sooner than you think.

Aurora’s head of product is coming to TC Sessions: Mobility

Self-driving car startup Aurora might be as buzzy as they come. The company raised more than $530 million just a few months ago in a round led by Sequoia Capital and with “significant investment” from Amazon and T. Rowe Price Associates. We’ve learned Aurora isn’t shy about using that capital with two acquisitions in the books.

But how, when and where will Aurora’s autonomous vehicle technology end up? Lia Theodosiou-Pisanelli, who leads product development and program management for all of Aurora’s partnerships, will join us on stage at TC Sessions: Mobility to give us the scoop.

Theodosiou-Pisanelli has had an up close view of the autonomous vehicle industry. Prior to joining Aurora this spring, she was director of business development for Lyft’s self-driving technology business Level 5.

Her background in global government relations at Square, as well as her time with the U.S. Trade Representative at the White House, gives Theodosiou-Pisanelli insight into how policy and regulations are shaped and might eventually affect the autonomous vehicle industry.

TC Sessions: Mobility will be held July 10 in San Jose. The agenda is packed with some of the biggest names and most exciting startups in the transportation industry, including Mobileye co-founder and CEO Amnon Shashua, Alisyn Malek with May Mobility, Dmitri Dolgov at Waymo, Karl Iagnemma of Aptiv, Seleta Reynolds of the Los Angeles Department of Transportation, and Ford Motor CTO Ken Washington. With early-bird ticket sales ending soon, you’ll want to be sure to grab your tickets.

Throughout the day, you can expect to hear from startup founders and industry experts and take part in discussions about the future of transportation, the promise and problems of autonomous vehicles, the potential for bikes and scooters, investing in early-stage startups and more.

Early-bird tickets are now on sale — save $100 on tickets before prices go up after June 14.

Students, you can grab your tickets for just $45.

iOS 13 will let you limit app location access to ‘just once’

Apple will soon let you grant apps access to your iPhone’s location just once.

Until now, there were three options — “always,” “never,” or “while using,” meaning an app could be collecting your real-time location as you’re using it.

Apple said the “just once” location access is a small change — granted — but one that’s likely to appeal to the more privacy-minded folk.

“For the first time, you can share your location to an app — just once — and then require it to ask you again next time it wants it,” said Apple software engineering chief Craig Federighi at its annual developer conference on Monday.

That’s going to be helpful for those who download an app that requires your immediate location, but you don’t want to give it persistent or ongoing access to your whereabouts.

On top of that, Apple said that the apps that you do grant location access to will also have that information recorded on your iPhone in a report style, “so you’ll know what they are up to,” said Federighi.

Apps don’t always use your GPS to figure out where you are. All too often, apps use your Wi-Fi network information, IP address, or even Bluetooth beacon data to figure out where you physically are in the world so they can better target you with ads. Federighi said it will be “shutting the door on that abuse” as well.

The new, more granular location-access option will arrive in iOS 13, expected out later this year.

China says it will ‘soon grant’ 5G licenses for commercial use

There’s a widely accepted method to interpret China’s official announcements: the shorter the news, the heavier it is. Today, in one concise sentence, the Ministry of Industry and Information Technology, China’s telecom regulator, announced that it will “soon grant 5G licenses for commercial use.”

That’s according to the Communist Party newspaper People’s Daily. TechCrunch reported four months ago that China planned to “fast-track” the commercial use of the next-gen networking technology at a time when Huawei, the champion of the country’s 5G development, faces mounting pressure in the west.

Huawei has, however, managed to find allies in other parts of the world. Just last week, the Shenzhen-based telecom giant launched a 5G lab in South Korea but decided to keep the event “low-key,” Reuters reported, notably because the Asian country is a security ally of the U.S.

The acceleration of 5G licensing in China will also be a potential boost to the domestic economy, as it will “drum up demand with upgraded technology experiences across devices, automotive and manufacturing leveraging 5G technology,” Neil Shah, research director at Counterpoint Research, told TechCrunch in a previous interview.

5G technologies are expected to generate 6.3 trillion yuan ($947 billion) worth of economic output and 8 million jobs for China by 2030, according to a white paper released by the China Academy of Information and Communications Technology.