Startups Weekly: In a crowded field, ClassPass becomes the latest unicorn

I hope you’ve all had a good week. Normally I’m behind the scenes (where I’m most comfortable), but I’ll be managing the Startups Weekly newsletter until I assign it to someone else. More on that in a few weeks. Want it in your inbox? Sign up here for this and other great newsletters we have to offer, including ones on space and transportation. For now, let’s get on with it, shall we?

A unicorn workout

Working out never did a body so good as it did for ClassPass this week. The popular startup that created a way to help people exercise more easily just became a unicorn with an influx of Series E cash. 

The latest round, $285 million, was led by L Catterton and Apax Digital, with participation from existing investor Temasek. It brings ClassPass’s total known funding to about $550 million.

We reported a couple of weeks ago that ClassPass, then at a $536.4 million valuation, was sniffing around for the round, which would promote it to the unicorn club.

“We are motivated by the impact we’ve had on members and partners, including 100 million hours of workouts that have already been booked,” said ClassPass founder and chairman Payal Kadakia in a statement about the raise. “This investment is a significant milestone that will further our mission to help people stay active and spend their time meaningfully.”

Photo: ClassPass

Funding real estate

A couple of real estate-ish startups got some attention this week. Los Angeles-based Luxury Presence raised $5.4 million to help it help agents round out their digital marketing arsenals.

In other real estate funding this week, Orchard, previously known as Perch, announced that it has raised $36 million. The company solves the problem that so-called “dual-trackers” face: selling their home while trying to buy one. It’s stressful and costs a lot of money.

As Jordan Crook wrote in her story on the raise: “Orchard solves this by making an offer on buyers’ old houses that is guaranteed for 90 days. Orchard co-founder Court Cunningham says that more than 85% of those homes sell at a market price before the 90-day period.”


Image via Getty Images / Feng Yu

Brian Heater had a chat with Lora DiCarlo CEO Lora Haddock about the sex tech company’s return to CES. But the interview wasn’t conducted at a table in a crowded press room in some random hotel. It was in a truck with a big glass trailer. It’s Vegas, obviously, so why not?

As Brian put it:

Driving down the Las Vegas Strip in a transparent box is a curious, extremely Vegas experience: puzzled tourists and confused CES attendees gawk from the sidewalks. Four of us are sitting in a makeshift living room with fuzzy white carpet: CEO Lora Haddock, Enzo Ferrari Drift DiCarlo (her fuzzy black-and-white Pomeranian), and a colleague, who holds Enzo in their lap. A four-foot-tall faux sex toy sits in a corner, swaying occasionally.

Last year, you might recall, the consumer tech show gave Lora DiCarlo an innovation award, then took it back. CES also banned the company from the show floor, stating its product didn’t fit into a product category. Months later, Lora DiCarlo scored some funding and got an apology from the show’s organizers.

Read the interview on Extra Crunch.


SAN FRANCISCO, CALIFORNIA – OCTOBER 03: Lora DiCarlo Founder & CEO Lora Haddock speaks onstage during TechCrunch Disrupt San Francisco 2019 at Moscone Convention Center on October 03, 2019 in San Francisco, California. (Photo by Kimberly White/Getty Images for TechCrunch)

Around the horn

Extra Crunch

Over on Extra Crunch we published a bunch of great stuff this week, including stories about Ring and its evolving stance on security and privacy, how gig economy companies are trying to keep workers classified as independent contractors, and whether online privacy will make a comeback this year.

Here are a few more:

If you aren’t a subscriber yet, head here for a super-discounted first month.


Equity

Alex Wilhelm was back on the mic this week with Danny Crichton, TechCrunch’s managing editor. Their docket included news of Lily AI’s $12.5 million Series A, Insight’s $1.1 billion acquisition of Armis Security, a round for a self-driving forklift startup called Vecna and SoftBank’s Vision Fund.

Listen to the episode here, and if you haven’t subscribed yet, you can do that here.

But that’s not the only Equity news I have for you. Alex wants to help you all get started each week with Equity Mondays. In his own words:

The Equity crew will put together a short, zero-bullshit episode designed to get your week started. What news did you miss over the weekend? What recent venture rounds do you need to know about? What’s ahead in the coming week? And what’s on our minds? That’s what Equity Monday will bring you each and every morning in about seven minutes.

The good news is it’ll show up in the Equity feed you already know and love. Have a listen to the first Monday edition here.

TechCrunch Include yearly report

Welcome to the third annual TechCrunch Include Progress Report. Our editorial and events teams work hard throughout the year to bring the most dynamic and diverse group of speakers and judges to our event stages. And finally, at the tail end of 2019, we bring you … 2018 data. (You can see 2017 data here.)

In 2018, TechCrunch produced Disrupts in San Francisco and Berlin, as well as regional Battlefield events in Zug, Switzerland; Lagos, Nigeria; São Paulo, Brazil and Berlin, Germany. We also produced a number of Sessions events, including the increasingly popular Robotics edition, as well as Blockchain and AR/VR.

It is important to us that we foster an environment that reflects the increasingly diverse tech industry. We are pleased to report that we saw increases across the board with regard to inclusion, while acknowledging that we weren’t yet where we needed to be when it comes to women and people of color across our stages. Happily, 2019 has been even better, and we’ll bring you those numbers soon.

Below we have compiled data from our 2018 events about the makeup of people who appeared as panelists, judges and founders of the Battlefield competitors. 


Disrupt

Our flagship conference attracts speakers, judges and Battlefield contestants from all over the world. It serves as a global arena for startups at all stages of development, as well as for investors interested in finding their next big investment.

At Disrupt SF in 2018, of the 153 total speakers and judges, 33% were women and 27% were people of color. On the Battlefield stage, of the 22 teams, 36% had female founders. This is up from 29% the year before.

At Disrupt Berlin, of the 56 speakers and judges, 39% were women and 18% were people of color. Of the 12 teams that competed on the Battlefield stage, half the founders were women.

Regional Battlefield 

Our Battlefield competition isn’t limited to Disrupt. We take it on the road in order to give as many startups as possible an opportunity to compete. In addition, these events include panels designed around region-specific topics. In 2018, we hosted Battlefield competitions in the Middle East and North Africa, Latin America and Africa.

Battlefield MENA showcased 15 teams; of those, 53% were founded by women. Of the 28 speakers and judges, 35% were women and 75% were people of color.

Fifteen teams competed in Battlefield LatAm, 20% of which were led by women. Out of the 28 speakers and judges, 32% were women and 68% were people of color.

And finally, in Battlefield Africa, a total of 15 teams competed. Of those, 33% were founded by women. Of the 28 speakers and judges, 14% were women and 75% were people of color.


Sessions

Our daylong Sessions events are targeted at specific topics. In 2018, we held events about blockchain, robotics and AR/VR. TechCrunch Sessions events bring industry specialists to the stage to speak to rapt audiences.

Of the 28 speakers who appeared onstage in Berkeley for Sessions: Robotics, 25% were women and 21% were people of color. In Zug, Switzerland for Sessions: Blockchain, of the 29 speakers, 17% were women and 21% were people of color. And in Los Angeles at Sessions: AR/VR, 34% of the 29 speakers were women and 24% were people of color.


Tel Aviv

Our event in Tel Aviv leaned heavily toward mobility, and served as a preview of what would become Sessions: Mobility in 2019. Of the 38 speakers in our programming, 21% were women and 63% were people of color.


VivaTech

In 2018, TechCrunch also hosted a hackathon at VivaTech in Paris and presented editorial programming there. Of the 20 speakers, 45% were women and 30% were people of color.

The inevitable takedown of the female CEO

Four years ago when I founded Winnie, I set out to build a different kind of startup. Above and beyond any success our business achieved, it was most important to me that we create a culture where people would want to work. As a new mom at the time, I intentionally decided to build a company where employees would not work on nights or weekends, where there was flexibility for employees to manage their lives outside of the office, where motherhood would no longer be a penalty but a bonus and where underrepresented groups would be valued and promoted. If we failed because we did those things, so be it.

Four years later, I’m proud of the culture my co-founder Anne Halsall and I have built. As it turns out, treating employees well, valuing their families and personal time and diversifying our team are not only the right things to do, but also competitive advantages.

Even so, I worry that being a woman and taking on the role of co-founder and CEO places a target on my back.

Aggressive. Blunt. Furious. These are words that have been used to criticize the behavior of female CEOs of prominent companies like Thinx, Cleo, Rent the Runway and ThirdLove, to name a few. Away is the latest female-led company to come under fire, in an article in The Verge on Thursday.

First, let me be clear: A toxic work culture is never acceptable. Regardless of who started a company or what kind of stress the company is under, it’s never okay to mistreat employees. Some of the things that came to light in these pieces are particularly abhorrent: sexual harassment, lying about one’s credentials, creating an unsafe space for underrepresented groups, overworking employees. These are dynamics that need to be called out and eliminated at all companies, whether female or male-led. The Away example is no exception.

But as a female founder and CEO of a growing company, I have to ask: Why does it seem like so many of the toxic companies in the news are founded and led by women? Fewer than 5% of major public corporations are led by female CEOs, and of the 134 U.S.-based unicorns, only 14 even have a woman with a co-founder title.

For such a small fraction of female-led companies, the amount of negative press female CEOs receive is glaringly disproportionate. I have a couple of ideas why.

First, while much of what is revealed in these reports is disgusting, what also comes through is the stereotype of women leaders as “bitches.” Articles often highlight when female CEOs curse, yell and show anger or bawdiness, because the shock value is higher than when male CEOs demonstrate these behaviors. We ask women leaders not only to be successful, but also to be ladylike and likable. I have lost count of the number of times I’ve been criticized for not being warm and friendly enough, or saying things that were too blunt.

Second, studies show that when it comes to ethical failures, women are “judged more harshly than men.” The ThirdLove article calls out that, at a “by women, for women” company, ThirdLove’s practice of discouraging salary negotiation was particularly disappointing. Cleo’s last-minute setup of a mother’s room using hanging curtains and a TaskRabbit was described by employees as one of the “more outrageous” behaviors of the founder. As a breastfeeding mom myself, I hate when mother’s rooms are inadequate, but male-led companies have poor lactation accommodations all the time.

The way we are targeting female founders and CEOs does nothing to encourage gender equality. It only ensures that the number of female CEOs dwindles under the pressure of having to live up to stricter standards than men do. So what can each of us do to create a fairer and more accurate picture?

Reporters should continue to hold companies accountable, but they should also seek out stories of male CEOs in proportion to the number of male-led companies out there. Those stories exist; only a few of the very worst examples have been exposed. Let’s make it take much less time to expose the next Travis Kalanick or Adam Neumann.

As readers, it is also worth being aware of our own biases. We can ask ourselves if we’re more outraged at a behavior because it comes from a woman, and if there are men we’re allowing to go unscrutinized. We can ask ourselves if maybe we enjoy seeing successful women taken down a notch (I certainly hope the answer is no).

I will continue to implement a healthy work environment at Winnie, grow a company where my employees can thrive and hold myself to the highest standards of conduct. But as we continue to take down the already few female CEOs one by one, I can only hope that what I do will be enough.

Apple Watch Series 5’s banner feature needs to be turned up to 11

Reviewing the Apple Watch Series 5 is not hard, because it is largely similar to last year’s Series 4. It carries over all the things that made its predecessor great — the large display, haptics-enhanced Digital Crown and fall detection — and marches forward with one defining feature: the always-on display. Back-to-back years of seminal moments for the Watch is an impressive feat.

From an accessibility perspective, everything that was (and remains) great about Series 4 is there in Series 5. It is the best Apple Watch to date, and it is certainly the most accessible smartwatch on the market, period. But there are a few caveats.


The longer I wear Apple Watch Series 5, a 44mm space-gray aluminum review unit from Apple, the more torn I feel about the device’s always-on display.

On one hand, I readily acknowledge the significance of the new display as it relates to the watch as a whole. On the other hand, however, I find the always-on display to be somewhat of a letdown in practice. It isn’t that the always-on display is bad; it’s not. It’s that the current implementation isn’t that conducive to my visual needs.

The issue is brightness. The always-on display right now isn’t bright enough for me to quickly glance down at my wrist and see the time. As someone who requires maximum brightness on all my devices in order to see well, this is problematic. Other reviewers have mentioned how nice it is to casually look down at the watch to see the time, as you would with a mechanical watch. My peers must have substantially better eyesight than I do, because I literally cannot do that. In my usage, I have found I’m still flicking my wrist, just as I have with every previous Apple Watch, to see the time. When I do so, the screen fully illuminates (to max brightness, per my display settings), and that’s how I tell time.


It makes no difference whether I’m using an analog or digital watch face. The exception is the new Numerals Duo face with the “filled” styling: the digits are so large that I have no trouble seeing the time. If it supported complications, this face would be a great workaround for the always-on display’s lack of light.

At a technical level, I understand why watchOS dims the display. Nonetheless, it’s unfortunate there is no way to adjust the brightness while in “always on” mode. Perhaps Apple will add such a feature in the future; it would make sense as an accessibility setting. As it stands today, as good as the always-on display is in general, I can’t say it makes much sense for me. I’m effectively using Series 5 the same way I use my Series 4. Because of this, Series 5 loses some of its appeal. The whole point of buying Series 5 is for the always-on display. I could turn it off, but that defeats the purpose, and I may as well stick with last year’s model.

On the flip side, if and when the always-on display improves for me, it will offer another benefit: saving me from having to raise my arm so often. I wear my watch on my right wrist, which is notable because the right side of my body is partially paralyzed due to cerebral palsy. As such, raising my wrist to tell time or check a notification can sometimes be painful and fatiguing. The always-on display mitigates this because, by virtue of its persistence, you don’t necessarily have to contort your arm to look at your watch — thereby alleviating pain and fatigue for me and others.


Problematic packaging

From the original Apple Watch (colloquially known as “Series 0”) through Series 3, Apple packaged the watch as an “all-in-one” product. Which is to say, the band was fastened to the watch. You could grab it and go — take the watch out of the box and immediately see how it looks on you, even before pairing it with your iPhone.

With last year’s Series 4, Apple changed how it packages the Apple Watch: the band and the watch are now separate pieces. In order to wear the watch, you first need to attach the band. In my review, I called out this change as regressive despite recognizing why it made sense operationally. The revised packaging continues with Series 5, which is disappointing.


The issues this setup raises are the same ones I expounded upon last year. To wit, it’s easy to see how some people could get flustered by the watch and band arriving in separate pieces; assembly can be challenging in terms of cognitive load and fine-motor skills. Even as a seasoned product reviewer, I freely admit to again feeling a tad flummoxed as I was piecing together my review unit.

Like the always-on display’s dimmed state, I totally get why Apple chose to overhaul how it packages the Apple Watch. It makes complete sense in the context of the new Apple Watch Studio, where you can mix and match finishes and bands. But this is a prime example of why reporting on accessibility and assistive technology matters so much: esoteric details like how a product is packaged can really matter to a person with disabilities. Part of the reason Apple products are so revered is precisely the elegant simplicity of their packaging, and the unboxing is supposed to be one of the best parts of a new Apple Watch or iPhone or iMac. For disabled people especially, the initial experience leaves a lasting impression, and having to fiddle with the pieces as if they were a jigsaw puzzle sours it. I can manage, but many cannot, and that’s important to bear in mind. Everything should be as accessible as possible.

The bottom line

There is no doubt Apple Watch Series 5 is great. It retains the title of Best, Most Accessible Apple Watch Yet, but with an asterisk. I don’t have a burning desire to upgrade — although admittedly, the titanium’s siren song has been calling me ever since last month’s event. The problem I have with the display can be easily remedied with a software update; if Apple shipped a brightness slider tomorrow, I’d order one pronto. Today, though, always-on isn’t always bright — and that sucks.

In the end, I still heartily recommend the Apple Watch Series 5 to everyone. My low vision makes the always-on display difficult to see as-is, and I surely can’t be the only one. But that doesn’t take away from the fact that the watch is still the best, most accessible smartwatch by a country mile. I’m confident the always-on display will be iterated on and refined over time. In the meantime, Series 4 and watchOS 6 make a pretty badass combination for me.

Apple puts accessibility features front and center

Although the meat of Apple’s accessibility news from WWDC has been covered, other announcements during the week have relevance to accessibility as well. Here, then, are some thoughts on Apple’s less-headlining announcements that I find most interesting from a disability point of view.

Accessibility goes above the fold

One of the tidbits I reported during the week was that Apple moved the Accessibility menu (on iOS 13 and iPadOS) to the top level of the Settings hierarchy. Instead of drilling down to Settings > General > Accessibility, the accessibility settings are now a “top-level domain,” in the same list view as Notifications, Screen Time and so on. Apple told me this move applies to watchOS 6 as well.

Similarly, Apple said they’ve added accessibility to the first-run “setup buddy” experience. When someone sets up a new iPhone or other device for the first time, the system will prompt them to configure any desired accessibility features such as VoiceOver.

Both changes are long overdue and especially important symbolically. While the move may not affect the average user much, if at all, the fact that Apple is making it speaks volumes about how much the company cares for the accessibility community. Moving Accessibility to the front page of Settings gives disabled users (and by extension, accessibility itself) just a bit more visibility.

As a disabled person myself, this is not insignificant. This change reinforces Apple’s position as the leader in the industry when it comes to making accessibility a first-class citizen; by elevating it to the top level, Apple is sending the message that accessibility is a critical aspect of the operating system, and a critical part of the user experience for so many, myself included.

Handoff for HomePod

I enjoy my HomePod for listening to music, podcasts, and controlling our HomeKit devices. Until now, however, one of the biggest annoyances with HomePod has been the inability to pick up where I left off. If I come home from the supermarket listening to music or a podcast and want to keep going, I have to stop and change the output source to my office’s HomePod. It’s not difficult to do, but from an accessibility perspective it’s a lot of extra taps. I definitely feel that bit of friction, and curse the dance every time I have to go through the rigamarole.

With iOS 13, that friction goes away. All I need to do is place my iPhone XR close to the HomePod (as if I were setting it up) and the iPhone will “hand off” whatever audio is playing to the speaker. Again, changing the source is not a huge deal in the grand scheme of things, but as a disabled person I’m attuned to even the slightest inconveniences. As with the ability to have incoming iMessages read aloud on AirPods, these little refinements go a long way toward not only a more enjoyable, more seamless experience, but a more accessible one, too. In this sense, the technology is magical in more ways than one.

The victory of Voice Control

The addition of Voice Control is definitely a headliner, but the backstory to it certainly isn’t.

Everyone I spoke to during the week, whether fellow reporters, developers or Apple employees, shared the same sentiment: Voice Control is great. The segment of John Gruber’s live episode of his podcast, The Talk Show, in which he and special guests Craig Federighi and Greg Joswiak discussed the feature, is a perfect example; it totally meshed with what I was told. Federighi explained how he had “friggin’ tears in my eyes” after watching an internal demo from somebody on Apple’s accessibility team.

Similarly, it was a hot topic of conversation at the accessibility get-together at the conference. So many of the engineers and other members of Apple’s accessibility group shared with me how proud they are that Voice Control exists. I’ve heard that its development was a considerable undertaking, and for everyone involved to see it released to the world—in beta for now, at least—is thrilling and affirming of the hard road the team took to get here.

At a high level, Voice Control strikes me as emblematic of Apple’s work in accessibility. Just watch the video:

It feels impossible, magical—but it’s entirely real. And the best part is this is a game-changing feature that will enhance the experience of so many, so immensely. Federighi was not wrong to cry; it’s amazing stuff.

Apple’s global accessibility head on the company’s new features for iOS 13 and macOS Catalina

From dark mode in iOS 13 to a redesigned user interface in tvOS to the dismantling of iTunes to the coming of iPadOS, Apple made a slew of announcements at its Worldwide Developers Conference keynote on Monday in San Jose. And accessibility was there in full force.

Accessibility, as it always does, plays a significant role in not only the conference itself — the sessions, labs and get-togethers all are mainstays of the week — but also in the software Apple shows off. Of particular interest this year is Apple’s Voice Control feature, available for macOS Catalina and iOS 13 devices, which allows users to control their Macs and iPhones using only the sound of their voices. In fact, it’s so compelling Apple decided to make it a banner feature worthy of precious slide space during Craig Federighi’s onstage presentation.

After the keynote concluded, I had an opportunity to sit down with Sarah Herrlinger, director of Global Accessibility Policy & Initiatives at Apple, to talk more in-depth about Voice Control, as well as some other notable accessibility features coming to Apple’s platforms in 2019.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,“ Herrlinger said. “There’s something in each operating system and things for a lot of different types of use cases.”

Hello, computer

Although much of the conversation around what Apple announced revolves around iPadOS and Project Catalyst, based on what I’m hearing on podcasts and seeing in my Twitter timeline, Voice Control definitely is a crown jewel too. Nearly everyone has praised not only the engineering that went into developing it, but also the fact that Apple continues to lead the industry in making accessibility a first-class citizen. Myke Hurley put it best following the event on Upgrade, the weekly podcast he co-hosts with Jason Snell: Voice Control is something Apple doesn’t have to do. The company does it, he said, because it’s the right thing to do for every user. Put another way, Apple works so tirelessly on accessibility not for “the bloody ROI,” to paraphrase Tim Cook.

Sarah Herrlinger, director of Global Accessibility Policy & Initiatives

Herrlinger demoed Voice Control for me, and it works as advertised even though our setting contained a lot of ambient noise. The gist of it is simple enough: You give your MacBook or iMac commands, such as “Open Mail” or “Tell Mary ‘Happy Birthday’ ” in Messages. Beyond the basic syntax, however, there are elements of Voice Control that make dictating to your Mac (or iOS device) easier. For example, Herrlinger explained how you can say “show numbers” in Safari’s Favorites view and little numbers, corresponding to your favorites, show up beside each website’s favicon. Say TechCrunch is No. 2 in your list of favorites. If the glyph is hard to make out visually, saying “open 2” will prompt Voice Control to launch TechCrunch’s page. Likewise, you can say “show grid” and a grid will appear so you can perform actions such as clicking, tapping or pinching and zooming.


Herrlinger told me that Voice Control, while conceptually straightforward, is designed to be deep and customizable. Furthermore, she added that Apple has put in a ton of work to improve the speech detection system so that it can more adeptly parse users with different types of speech, such as those who stutter. Over time, Voice Control should improve at this.

Of course, the reason for all the excitement over Voice Control is the way it makes computing more accessible. Which is to say, Apple has reached an inflection point with its assistive technologies where someone who can’t physically interact with their computers now has an outlet. To use only your voice to do this used to be the stuff of science fiction, but now it’s more or less reality. There are other tools, like Apple’s own Switch Control, that are in the ballpark, but Voice Control takes it to a whole other level. Apple is putting a stake in the ground — if you can’t touch your computer, just talk to it. For many disabled people, the floodgates just opened. It’s a big deal.

Hover Text is Dynamic Type reimagined

I’ve made my affection for iOS’s Dynamic Type feature known countless times. By the same token, I’ve made my displeasure of its absence on macOS known just as often. Apple heard me.

Another feature Herrlinger was keen to show me was something Apple is calling Hover Text, on macOS. A subset of the already present Zoom functionality, Hover Text sort of reminds me of tooltips in Windows. The “Hover” name stems from the function: place your mouse pointer over a selection of text and you get a bubble with said text enlarged.

Herrlinger told me the feature works system-wide, even in places like the menu bar. And yes, Hover Text is indeed customizable; users have access to a wide variety of fonts and colors to make Hover Text’s “bubbles” their own. Text size can be enlarged up to 128pt, Herrlinger said. What this means is users can play with different permutations of the feature to find which label(s) work best — say, a yellow background with dark blue text set in Helvetica for the highest contrast. The possibilities are virtually endless, a testament to how rich the feature is despite its simplicity.

At a high level, Hover Text strikes me as very much the spirit animal of my beloved Dynamic Type. They’re clearly different features, with clearly defined purposes, but both strive to achieve the same goal in their own ways. Herrlinger told me Apple strives to create software solutions that make sense for the respective platform and the company’s accessibility group believes Hover Text is a shining example. They could’ve, she told me, ported Dynamic Type to the Mac, but found Hover Text accomplished the same goal (enlarging text) in a manner that felt uniquely suited to the operating system.

iOS gains Pointer Support, sort of

As first spotted by the ever-intrepid, master spelunker Steve Troughton-Smith, iOS 13 includes pointer support — as an accessibility feature.

Mouse support lives in the AssistiveTouch menu, the suite of options designed for users with physical motor delays who can’t easily interact with the touchscreen itself. Apple says it works with both USB and Bluetooth mice, although the company doesn’t yet have an official compatibility list. It’s telling how mouse functionality is purposely included as an accessibility feature — meaning, Apple obviously sees its primary value as a discrete assistive tool. Of course, accessibility features have far greater relevance than simply bespoke tools for disabled people. Just look at Troughton-Smith’s tweet for proof.

Still, in my conversation with Herrlinger, she emphasized that Apple built pointer support into AssistiveTouch as a feature designed and developed with accessibility in mind. In other words, support for mice and external pointing devices is intended expressly for accessibility’s sake. As usual with Apple products, Herrlinger told me the foundational parts of pointer support date back “a couple years.” This is something the team has been working on for some time.

Accessibility features can benefit more than the original community they were designed to support.

To Apple, Herrlinger said, pointer support — which is supported on both iOS 13 and iPadOS — is a feature they felt needed to exist because the accessibility team recognized the need for it. There’s a whole class of users, she told me, who literally cannot access their devices without some other device, like a mouse or joystick. Hence, the team embarked on their mission to accommodate those users. When I asked why build pointer support into a touch-based operating system, Herrlinger was unequivocal in her answer: it serves a need in the accessibility community. “This is not your old desktop cursor as the primary input method,” she said.

The reality is, it’s not your secondary choice, either. The bottom line is that, while Apple loves the idea of accessibility features being adopted by the mainstream, pointer support in iOS 13 and iPadOS really isn’t the conventional PC input mechanism at all. In this case, it’s a niche feature that should suit a niche use case; it’s not supposed to represent the milestone of iPad’s productivity growth that many think it could be. Maybe that changes over time, but for now, it’s the new Mac Pro of software: not for everyone, not even for most people.

That said, a crucial point should be made here: people without disabilities will use this feature, regardless of its actual intended utility, and Apple recognizes that. No one will stop you from plugging a mouse into your iPad Pro. It’s no different from someone using Magnifier to get up close on a finely printed restaurant menu or using Type to Siri in order to quietly give commands in a Messages-like environment.

“Accessibility features can benefit more than the original community they were designed to support,” Herrlinger said. “For example, many people find value in closed captions. Our goal is to engineer for specific use cases so that we continue to bring the power of our devices to more people.”

It’s important, though, to take this feature in context. Users should understand that this implementation of pointer support is not meant to drastically alter iPad’s primary input landscape in any way. That is the broader point Apple is trying to make here, and it’s a good one.

Equity transcribed: How to avoid an IPO

This week, the Equity duo of Kate Clark and Alex Wilhelm convened to get some quick hits in about Slack’s WORK, Luckin Coffee and Sam Altman’s departure from Y Combinator.

They then dug a bit deeper into the money around food: DoorDash and Sun Basket both raised this week. And what is a discussion about venture in food without mentioning Blue Apron?

And finally, TransferWise illustrates how not to IPO.

In all of this, they considered a world without the word “unicorn” as it relates to billion-dollar valuations — before admitting they are probably responsible for a good amount of its use.

Alex: So I think that the real unicorns now are companies that are growing, and are profitable, while also being worth over a billion dollars. Because we’ve seen very few of these. Zoom, famously, was a profitable company in its S-1, and it appears TransferWise also is. I can’t name more than two. And that makes them actually as rare as unicorns should be, in my view.

Kate: Yeah, I’m thinking maybe we should just actually stop using the term unicorn unless they’re profitable.

Alex: The problem with that is, it would be a two-person crusade against a wave of usage. I don’t think you and I have that clout. No offense to us.

Kate: I do think you and I are responsible for using that term, at least like 20% of the times that it’s used.

Alex: If that’s true, I’m going to retire. But I hear your point, we should actually get rid of the word unicorn; it’s now effectively meaningless, it means nothing. And profitable, growing and worth a billion would be a great constellation of things to actually meet some threshold to be called special, because that is.

This also marks the last time the show was recorded from the only home it’s ever known — because TC SF is moving — so Kate trolling Alex at the tail end is quite fitting. Come back next week for the same great podcast from a different great location.

Want more Extra Crunch? Need to read this entire transcript? Then become a member. You can learn more and try it for free. 

Kate Clark: Hello and welcome back to Equity. I’m TechCrunch’s Kate Clark. This week, I am with Alex Wilhelm of Crunchbase News. How’s it going today, Alex?

Alex Wilhelm: Very, very good. Excited about all this and enjoying the sun out here on the East Coast. And I am missing you in the studio today because it is, I think, the very last one out of 410 Townsend, is that right?

Kate: Yeah, so last week we said it was our last one recording all together, Chris, Alex and I. But this week is my last recording in the studio, as Alex is in Providence. So next week we’ll be with you from a brand new spot. I have not seen the new podcast studio, I have no idea what to expect, but hoping it’s nicer than this little cave.

Alex: It is cave-ish. It’s kind of nice. There are chairs and a table.

Kate: There is exposed brick which I really like. So I’ll miss the exposed brick.

Alex: It’s very SOMA rustic, if you will.

Kate: It is, indeed. All right, well, let’s get going, we have a few really good topics to get into today that I’m excited to talk about but first, let’s just do a quick little IPO update. Alex, why don’t you lead the way?

With new Fit technology, Nike calls itself a tech company

In 1927, Charles Brannock, the son of a local shoe company owner in Syracuse, N.Y., invented the Brannock Device. The steel measurement tool with five scales has since been the most effective way in the U.S. to find an accurate shoe size.

Industry-wide, 60% of consumers are wearing the wrong-sized shoes. Not only is there a discrepancy among different styles of shoes (high heels to leather boots), but sizing can often differ from brand to brand within one type of shoe (like adidas sneakers to Nike sneakers) and even silhouette to silhouette within a single brand.

For instance, I’ve owned Nike React Epic sneakers with Flyknit technology in a women’s size 10. I have men’s suede Nike Air Max 95s in a 9.5. All of my men’s Air Jordan 1s are comfortably a men’s size 8.5, but I have a women’s pair in an 11, and my Air Jordan 4s are an 8. Meanwhile, my Nike Air Max 720s feel decidedly too small at a men’s 8.5. And this is all within one brand.

In the 92 years since its introduction, a span that saw the birth of the internet and other society-altering technological advances, the Brannock Device has somehow remained uncontested. Until now.

This summer, Nike will introduce Nike Fit, a foot-scanning solution designed to find every person’s best fit. Conceptually, Nike Fit falls somewhere between “why would we reinvent the wheel” and “we don’t even need that wheel.”

Nike Fit uses a proprietary combination of computer vision, data science, machine learning, artificial intelligence and recommendation algorithms to find your right fit. With sub-two-millimeter accuracy across dozens of data points, measurements are fed into a machine learning model that accommodates every detail of every Nike silhouette, down to the materials used, the lacing systems and other critical aspects of fit. This is then paired with AI capabilities to learn a wearer’s personal fit preferences and how they relate to the population as a whole.
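Nike hasn’t published its model, but the general idea of turning foot measurements and past purchase data into a per-silhouette size recommendation can be sketched as a toy nearest-neighbor lookup. Everything below — the silhouette name, measurements and sizes — is fabricated for illustration; it is not Nike’s actual system.

```python
# Toy sketch: recommend a size by finding, among past wearers of the same
# silhouette, those with the closest foot measurements, then taking the
# most common size they purchased.
from collections import Counter

# (length_mm, width_mm) -> purchased size, per silhouette; fabricated data
FIT_HISTORY = {
    "air-max-95": [
        ((265.0, 98.0), 9.5),
        ((262.0, 97.0), 9.5),
        ((270.0, 101.0), 10.0),
        ((258.0, 95.0), 9.0),
    ],
}

def recommend_size(silhouette, length_mm, width_mm, k=3):
    history = FIT_HISTORY[silhouette]
    # Rank past wearers by squared Euclidean distance in measurement space
    nearest = sorted(
        history,
        key=lambda rec: (rec[0][0] - length_mm) ** 2 + (rec[0][1] - width_mm) ** 2,
    )[:k]
    # Majority vote among the k closest wearers
    sizes = Counter(size for _, size in nearest)
    return sizes.most_common(1)[0][0]

print(recommend_size("air-max-95", 264.0, 97.5))  # 9.5 with this toy data
```

A real system, as described, would also fold in material, lacing and per-wearer preference signals rather than raw measurements alone.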

Users can either find their size with the augmented reality feature in the Nike app or, soon, visit participating stores to use the technology. I recently had the opportunity to do both.

Within the Nike app, I used my phone’s camera to capture an empty space where the floor meets the wall as a point of reference, with the app’s guidance ensuring a level plane. I stood with my heels against the wall I captured as my reference point and pointed the camera down at my feet as if to take a photo. Once my feet were properly aligned with the outline guide within the app, I simply touched the button that looks just like I’m taking a photo.

In seconds, this action scans the feet and collects 13 data points, the best of the 32 points Nike is capable of capturing. Despite all of the data being collected, users will only be offered the length and width measurements, down to the millimeter, of each foot individually.

“Augmented reality is a new type of experience for a lot of consumers and sets a lot of challenges for them,” says Josh Moore, Nike’s vice president of design and user experience. “We’ve been doing a lot of experiments and creating new features in our SNKRS app over the last few years where we really learned a lot about how to use augmented reality successfully. Specifically, we know we have to guide our users through the journey at their own pace so they can comprehend as they go.”

“We’re talking about phones with cameras measuring your feet,” Moore continues. “It’s a new type of experience where you’re using your device, the device’s camera, the 3D space around you, and you’re using your body. There’s no common UX pattern for this.”

The in-store experience differs in a few ways. It wasn’t enough to simply have great technology; it also had to reduce friction in the in-store buying process. The idea is to reduce the amount of time associates spend going back and forth grabbing sizes from the stock room, ensuring the time spent with customers is higher quality and more efficient.

At the Retail Lab on Nike’s campus, I stood on a mat while a Nike sales associate scanned my feet with a handheld iTouch device. With the measurements taken (my right foot is 1 millimeter longer than my left, while my left is 1 millimeter wider than my right), the associate can provide a range of sizes for me, which includes where my best fit could fall in any shoe in Nike’s catalog. Once they look up the shoe I’m interested in, the app will offer the best fit size for my measurements and that shoe. If it’s available, they’ll bring out that size, and if there is any disbelief, they’ll bring out the size you’d like to try, as well.

Trying the Nike Fit experience at the Retail Lab on Nike’s campus

Whether using the app to find the right fit and make a purchase or going into the store, associates and customers can record which size is purchased, as well as other personal preferences around fit.

“Before a shoe arrives onto the market, it will already be trained into the solution. But since the solution encompasses both machine learning and AI, its accuracy out of the gate is astonishing and just gets even better,” says Michael Martin, vice president of Nike direct products, growth and innovation.

With more data, Nike will not only continually improve an individual’s fit recommendations, it will also learn the greater population’s preferences around each specific model, offering insight into creating better-fitting shoes.

In development for just over 12 months, Nike Fit was being tested in three stores — one each in Seattle, Dallas and Pasadena, Calif. — only six months after Nike acquired Israeli startup Invertex, whose entire mission was to create scans of the human body for better fit customization.

“Fundamentally, at this stage, Nike is a technology company. It’s a technology company that builds upon its historical strengths in footwear design, storytelling and inspiration, and it’s able to use those in combination to solve problems that no one else can solve,” says Martin. “We think this is arguably our biggest solution to date.”

Despite being for footwear right now, the technology created for Nike Fit has the potential to change retail in a lot of ways. One can imagine women being able to use the tech to find the right bra size. It could also make buying denim easier. As individualism and inclusivity have become marketing tools, custom fit seems like a natural next step, but until now, there hasn’t been a clear-cut solution.

Nike Fit will be introduced in select stores in the U.S. and within the Nike app in early July 2019, with Europe to follow later in the summer.

Nike has always had a place in the conversation alongside the likes of Apple when upper echelon branding and storytelling is discussed. With the introduction of Nike Fit, Nike just does it — again.

Equity transcribed: New a16z funds, a $200M round and the latest from WeWork and Slack

Welcome back to this week’s transcribed edition of Equity, TechCrunch’s venture capital-focused podcast that unpacks the numbers behind the headlines.

This week, Crunchbase News’s Alex Wilhelm and Extra Crunch’s Danny Crichton connected from their respective sides of the States to run through a rash of news about Divvy, Cheddar, SoftBank’s Vision Fund and Andreessen Horowitz. Plus, they got into the WeWork IPO:

Alex: We should move on to a business that we’ve never talked about on the show before: WeWork.

Danny: To be clear, it’s not WeWork. It is the We Company.

Alex: But you have to put it in quotes because no one knows what that is.

Danny: Sounds like a rollercoaster manufacturing company. So give us the top-line numbers ’cause I never get tired of hearing them.

Alex: No, no, no. First we have to tell them the news, Danny. What is the news? Then I will do the numbers.

Danny: Okay, so the news was, they originally had filed privately with the SEC to do their S-1 in December, and they didn’t pull it, right? Or they sort of delayed it, but you know these filings don’t just disappear from the SEC. But they refiled their S-1 earlier, according to, I believe, the Wall Street Journal, on Monday, and that means that they’re sort of ready to go public and probably updated it with their Q1 figures.

For access to the full transcription, become a member of Extra Crunch. Learn more and try it for free. 

Why women are indefinitely sharing their locations

New York-based DJ and creative consultant Amrit and I are sitting at a women’s empowerment dinner waiting for her manager, Ramya Velury. Another friend of ours asks where Ramya is. “She said she was getting an Uber 15 minutes ago,” Amrit says as she unlocks her phone to check Ramya’s location.

“She’s still at home!” Ramya and Amrit share their locations with each other indefinitely through Apple’s Find My Friends app, which allows you to see a contact’s location at all times. Most of us have our locations shared with a friend.

One can easily wonder why anyone would want to allow someone complete 24-hour access to their location, especially the type who texts “On my way!” before they’ve even set foot in the shower. However, women are forgoing privacy among their most trusted friends to offer full access to their location (more specifically, the location of their phone) at all times.

Conveniences by way of technological advances are normalizing a culture of being alone with strangers. Uber launched 10 years ago and multiple ridesharing apps followed. Tinder changed the world of online dating (and dating as a whole) with its millennial-friendly, instantly gratifying match-making. You can connect with someone nearby and be on the way to meet them as soon as you can get out the door.

We talk to strangers online, pay them to get into their cars and meet up with them alone. These developments go against every rule about strangers that our parents embedded in our childhood brains.

Danueal Drayton, known as the “dating app murderer,” confessed to killing seven women, all of whom he met on dating apps. His criminal trial has been put on hold pending further psychiatric treatment and evaluation after a Los Angeles County judge deemed him incompetent for trial. And 24-year-old Sydney Loofe was murdered after a 2018 Tinder date.

“We utilize a network of industry-leading automated and manual moderation and review tools, systems and processes — and spend millions of dollars annually — to prevent, monitor and remove bad actors who have violated our Community Guidelines and Terms of Use from our app,” a Tinder spokesperson tells me, regarding the measures it takes to keep users safe. “These tools include automatic scans of profiles for red-flag language and images, manual reviews of suspicious profiles, activity and user-generated reports, as well as blocking email addresses, phone numbers and other identifiers.”

While these aren’t necessarily common occurrences, they are real-life horror stories nonetheless.

Sexual assault and sexual misconduct have gotten bad enough within Ubers that the company can no longer ignore them. In 2018, the company released a list of 21 types (categories, not 21 incidents) of sexual misconduct reported by drivers and riders, ranging from explicit gestures to rape.

Uber offers an option where you can share with a friend the status of your ride. The company did not respond to a request for comment about what they’re doing to combat the sexual misconduct within Ubers.

But, that’s just for cars that are actually employed by rideshare apps. Los Angeles resident and self-proclaimed introvert Erika Ramirez pointed to a crime of opportunity when a young woman got into a car that wasn’t her Uber.

“Recently, a 21-year-old woman [Samantha Josephson] was kidnapped and murdered by a man who pretended to be her Uber driver. Unfortunately, it feels like not a day goes by that you don’t hear of a case where a man kills a woman.” (That prompted Uber and Lyft to implement safety features in their apps.)

Conveniences by way of technological advances are normalizing a culture of being alone with strangers.

Ramirez is a freelance journalist and runs her indie publication ILY Magazine mainly from her one-bedroom apartment. “My schedule isn’t too set in stone. I wander to run errands, do laundry, grab food, meet with friends and go on dates at random times of the day or at night,” she says. “To be safe, I share my location with a close girlfriend, in case anything ever goes wrong during any of those instances. I let her know when I’m going on a late snack run or when I’m going on a date with someone.”

Naturally, there are concerns about sharing locations. In 2018, The New York Times reported there were 75 companies that track your location and use, sell or store it. They even illustrated how they were able to get the data and match the anonymous traveling dot with the human it belonged to, based on a distinct daily routine.

“When my siblings first asked to share my location with them, I thought they were weird. It’s not like I was doing anything sketchy, but why do you need to know where I am all the time?” Dr. Brittanny Keeler laments. She was living in Buffalo, N.Y., where she completed her residency and lived for six years. “If someone didn’t see me for 24 hours, the police would be notified. I have a bigger social circle there.”

Now she is an OBGYN in Norwalk, Conn., and newborns don’t adhere to a 9-to-5 work week. “If I deliver babies in the middle of the night, I’m getting out of work at all hours. Here, no one would know I was missing unless I didn’t show up for work.”

It wasn’t an incident or a friend or family member that caused her to reconsider sharing her location. It was one of those horror stories. “I listened to this podcast called Up And Vanished. I think it’s from 2016. It’s about a 30-year-old woman that left a party and was supposedly going home and was never seen again. I thought to myself, I leave places alone all of the time and hopefully get home. That actual podcast is what prompted me to start sharing my location,” Keeler recalls.

While not a result of Ubers, Tinders and other beneficial disruptive tech, there is a significant social shift in traditional gender norms coinciding with, and ultimately utilizing, all of these advancements. The percentage of unpartnered adults living alone has risen from 56% in 2007 to 61% in 2017, and women are more likely to live alone than men. Sons are also more likely to live with their parents later in life than daughters, and in 2018, the median age for Americans’ first marriages was the oldest it has ever been, at 30 for men and 28 for women.

Dr. Keeler, Ramirez and Ramya are all unmarried and live alone. Amrit’s boyfriend just moved in after she lived on her own for the majority of her seven years in New York. She’s from Perth, Australia, and her family still lives there.

“Because my family is so far, Ramya is probably the closest to my family and would act responsibly in case of an emergency,” Amrit says. While Ramya is Amrit’s manager, she’s also one of her best friends, and Amrit regularly checks on her location, too. “She always stays out later. If it’s the morning, I’ll check where she is and that she’s made it home.”

It’s not just the number of women living alone that has increased; more are also traveling alone. Amrit is recording artist Tommy Genesis’ tour DJ, so it’s not unusual for her to spend as many days traveling as she does at home in New York. “I’m usually home for two to three weeks and gone for about the same,” she says. Ramirez is nearly bi-coastal, traveling to her former home of New York City once a month and sometimes spending weeks at a time there.

The New York Times just released a discouraging story connecting the dots among the dangers facing the growing number of solo women travelers. In it, they highlighted a 2018 study conducted by online hostel booking site Hostelworld that showed a 45% increase in solo women travelers from 2015 to 2017. The bottom line of their findings: “Most countries do not comprehensively track violence against female travelers.”

This isn’t to say that women believe sharing their locations with each other will prevent violence against them. However, regardless of whether Apple is utilizing or sharing their Find My Friends data, women are in favor of letting someone they trust track their every move in case something happens.

It actually may have saved Jaila Gladden’s life. After Jaila’s attacker kidnapped her from outside Atlanta and raped her in her own car, he tasked her with finding a gas station for him to rob, as he planned to take her to Michigan. She convinced him to let her use her phone to do so. While “looking” for a gas station, she sent her location to her boyfriend and alerted him to what was happening. Ultimately, police were able to find her, the car and her attacker.

While plenty of users run hot and cold on location services, there is undeniable value and security in knowing someone can find you in case of an emergency.

Since 2018, Apple iOS 12 securely and automatically shares location with first responders when U.S. users call 911. Now, iPhone 8s and later have the Emergency SOS feature that requires some setup but ultimately allows for an emergency call to trigger a text to a preselected group of contacts and a location alert to emergency services.

Google also has the iPhone- and Android-friendly Trusted Contacts app, which lets users designate trusted contacts and request their locations.

“Not only did I think it was weird that my family wanted to know where I am all the time, but our phones tracking everything in general is creepy to me,” says Dr. Keeler. “I don’t know what data collection I’m contributing to, but I do think it’s necessary for someone to have my location now.”

And it’s because of what Ramirez knows to be true: “Women have been killed by ex-boyfriends, men who’ve forced themselves on them on dates, men whose catcalling was ignored or rejected. Women have to be keenly aware of their surroundings, and sadly have a backup plan in case we are placed in harm’s way.”