Apple Watch Series 5’s banner feature needs to be turned up to 11

Reviewing Apple Watch Series 5 is not hard, because it is largely similar to last year’s Series 4. It carries with it all the things that made its predecessor great — the large display, haptics-enhanced Digital Crown and fall detection — and marches forward with one defining feature: the always-on display. Back-to-back years of seminal moments for the Watch is an impressive feat.

From an accessibility perspective, everything that was (and remains) great about Series 4 is there in Series 5. It is the best Apple Watch to date, and it is certainly the most accessible smartwatch on the market, period. But there are a few caveats.

Always-on

The longer I wear Apple Watch Series 5, a 44mm space-gray aluminum review unit from Apple, the more torn I feel about the device’s always-on display.

On one hand, I readily acknowledge the significance of the new display as it relates to the watch as a whole. On the other hand, however, I find the always-on display to be somewhat of a letdown in practice. It isn’t that the always-on display is bad; it’s not. It’s that the current implementation isn’t that conducive to my visual needs.

The issue is brightness. The always-on display right now isn’t bright enough for me to quickly glance down at my wrist and see the time. As someone who requires maximum brightness on all my devices in order to see well, this is problematic. Other reviewers have mentioned how nice it is to casually look down at the watch to see the time, as you would on a mechanical watch. My peers must have substantially better eyesight than I do, because I literally cannot do that. In my usage, I have found I’m still flicking my wrist, as I would with any previous Apple Watch, to see the time. When you do so, the Apple Watch’s screen fully illuminates (to max brightness, per my display settings), and that’s how I can tell time.

The whole point of buying Series 5 is for the always-on display. I could turn it off, but that defeats the purpose.

It makes no difference whether I’m using an analog or digital watch face. The exception is the new Numerals Duo face with the “filled” styling: the digits are so large that I have no trouble seeing the time. Numerals Duo would be a great workaround for the always-on display’s lack of light if not for the fact it doesn’t support complications.

At a technical level, I understand why watchOS dims the display. Nonetheless, it’s unfortunate there is no way to adjust the brightness while in “always on” mode. Perhaps Apple will add such a feature in the future; it would make sense as an accessibility setting. As it stands today, as good as the always-on display is in general, I can’t say it makes much sense for me. I’m effectively using Series 5 the same way I use my Series 4. Because of this, Series 5 loses some of its appeal. The whole point of buying Series 5 is for the always-on display. I could turn it off, but that defeats the purpose, and I may as well stick with last year’s model.

On the flip side, if and when the always-on display improves for me, another benefit is it will save me from having to raise my arm so often. I wear my watch on my right wrist, which is notable because the right side of my body is partially paralyzed due to cerebral palsy. As such, raising my wrist to tell time or check a notification can sometimes be painful and fatiguing. The always-on display mitigates this because, by virtue of its persistence, you don’t necessarily have to contort your arm to look at your watch — thereby alleviating pain and fatigue for me and others.


Problematic packaging

From the original Apple Watch (colloquially known as “Series 0”) through Series 3, Apple packaged the watch as an “all-in-one” product. Which is to say, the band was fastened to the watch. You could grab it and go — take the watch out of the box and immediately see how it looks on you, even before pairing it with your iPhone.

With last year’s Series 4, Apple changed how it packages Apple Watch: the band and the watch are now separate pieces. In order to wear the watch, you first need to attach the band. In my review, I called out this change as regressive despite recognizing why it made sense operationally. The revised layout continues in Series 5, which is disappointing.

Everything should be as accessible as possible.

The issues this setup raises are the same ones I expounded upon last year. To wit, it’s easy to see how some people could get flustered with the watch and band being piecemeal; it can be challenging in terms of cognitive load and fine-motor skills. Even as a seasoned product reviewer, I freely admit to again feeling a tad disjointed as I was piecing together my review unit.

Like the always-on display’s dimmed state, I totally get why Apple chose to overhaul how it packages Apple Watch. It makes complete sense in the context of the new Apple Watch Studio, where you can mix and match finishes and bands. But this is a prime example of why reporting on accessibility and assistive technology matters so much: esoteric details like how a product is packaged can really matter to a person with disabilities. Part of the reason Apple products are so revered is precisely the elegant simplicity of their packaging. The unboxing is supposed to be one of the best parts of a new Apple Watch or iPhone or iMac; especially for disabled people, the initial experience leaves a lasting impression, and having to fiddle with the pieces as if they were a jigsaw puzzle sours it. I can manage, but many cannot. That’s important to bear in mind. Everything should be as accessible as possible.

The bottom line

There is no doubt Apple Watch Series 5 is great. It retains the title of Best, Most Accessible Apple Watch Yet, but with an asterisk. I don’t have a burning desire to upgrade — although admittedly, the titanium’s siren song has been calling me ever since last month’s event. The problem I have with the display can be easily remedied with a software update; if Apple shipped a brightness slider tomorrow, I’d order one pronto. Today, though, always-on isn’t always bright — and that sucks.

In the end, I still heartily recommend Apple Watch Series 5 to everyone. My low vision makes the always-on display difficult to see as-is, and I surely can’t be the only one. But that doesn’t take away from the fact that the watch is still the best, most accessible smartwatch by a country mile. I’m confident the always-on display will be iterated on and refined over time. In the meantime, Series 4 and watchOS 6 are a pretty bad-ass combination for me.

Apple puts accessibility features front and center

Although the meat of Apple’s accessibility news from WWDC has been covered, there were other announcements with relevance to accessibility as well. Here, then, are some thoughts on the less-headlining announcements I believe are most interesting from a disability point of view.

Accessibility goes above the fold

One of the tidbits I reported during the week was that Apple moved the Accessibility menu (on iOS 13 and iPadOS) to the top level of the Settings hierarchy. Instead of drilling down to Settings > General > Accessibility, the accessibility settings are now a “top level domain,” in the same list view as Notifications, Screen Time, and so on. Apple also told me this move applies to watchOS 6 as well.

Similarly, Apple said they’ve added accessibility to the first-run “setup buddy” experience. When someone sets up a new iPhone or other device for the first time, the system will prompt them to configure any desired accessibility features such as VoiceOver.

Both changes are long overdue and especially important symbolically. While the move may not affect the average user much, if at all, the fact that Apple is making it speaks volumes about how much the company cares for the accessibility community. Moving Accessibility to the front page of Settings gives disabled users (and, by extension, accessibility itself) just a bit more visibility.

As a disabled person myself, this is not insignificant. This change reinforces Apple’s position as the leader in the industry when it comes to making accessibility a first-class citizen; by elevating it to the top level, Apple is sending the message that accessibility is a critical aspect of the operating system, and a critical part of the user experience for so many, myself included.

Handoff for HomePod

I enjoy my HomePod for listening to music, podcasts, and controlling our HomeKit devices. Until now, however, one of the biggest annoyances with HomePod has been the inability to pick up where I left off. If I come home from the supermarket listening to music or a podcast and want to keep going, I have to stop and change the output source to my office’s HomePod. It’s not difficult to do, but from an accessibility perspective it’s a lot of extra taps. I definitely feel that bit of friction, and curse the dance every time I have to go through the rigamarole.

With iOS 13, that friction goes away. All I need to do is place my iPhone XR close to the HomePod (as if I were setting it up) and the iPhone will “hand off” whatever audio is playing to the speaker. Again, changing the source is not a huge deal in the grand scheme of things, but as a disabled person I’m attuned to even the slightest inconveniences. Like the ability to have incoming iMessages read aloud on AirPods, these little refinements go a long way toward not only a more enjoyable, more seamless experience, but a more accessible one, too. In that sense, this technology is magical in more ways than one.

The victory of Voice Control

The addition of Voice Control is definitely a headliner, but the backstory to it certainly isn’t.

Everyone I’ve spoken to during the week, whether it be fellow reporters, developers or Apple employees, shared the same sentiment: Voice Control is so great. In fact, the segment of John Gruber’s live episode of his podcast, The Talk Show, where he and special guests Craig Federighi and Greg Joswiak discussed the feature is a perfect example. It totally meshes with what I was told. Federighi explained how he had “friggin’ tears in my eyes” after watching an internal demo from somebody on Apple’s accessibility team.

Similarly, it was a hot topic of conversation at the accessibility get-together at the conference. So many of the engineers and other members of Apple’s accessibility group shared with me how proud they are that Voice Control exists. I’ve heard that its development was a considerable undertaking, and for everyone involved to see it released to the world—in beta for now, at least—is thrilling and affirming of the hard road the team took to get here.

At a high level, Voice Control strikes me as emblematic of Apple’s work in accessibility. Just watch Apple’s demo video of the feature.

It feels impossible, magical—but it’s entirely real. And the best part is this is a game-changing feature that will enhance the experience of so many, so immensely. Federighi was not wrong to cry; it’s amazing stuff.

Apple’s global accessibility head on the company’s new features for iOS 13 and macOS Catalina

From dark mode in iOS 13 to a redesigned user interface in tvOS to the dismantling of iTunes to the coming of iPadOS, Apple made a slew of announcements at its Worldwide Developers Conference keynote on Monday in San Jose. And accessibility was there in full force.

Accessibility, as it always does, plays a significant role in not only the conference itself — the sessions, labs and get-togethers all are mainstays of the week — but also in the software Apple shows off. Of particular interest this year is Apple’s Voice Control feature, available for macOS Catalina and iOS 13 devices, which allows users to control their Macs and iPhones using only the sound of their voices. In fact, it’s so compelling Apple decided to make it a banner feature worthy of precious slide space during Craig Federighi’s onstage presentation.

After the keynote concluded, I had an opportunity to sit down with Sarah Herrlinger, director of Global Accessibility Policy & Initiatives at Apple, to talk more in-depth about Voice Control, as well as some other notable accessibility features coming to Apple’s platforms in 2019.

“One of the things that’s been really cool this year is the [accessibility] team has been firing on [all] cylinders across the board,” Herrlinger said. “There’s something in each operating system and things for a lot of different types of use cases.”

Hello, computer

Although much of the conversation around what Apple announced revolves around iPadOS and Project Catalyst, based on what I’m hearing on podcasts and seeing in my Twitter timeline, Voice Control definitely is a crown jewel too. Nearly everyone has praised not only the engineering that went into developing it, but also the fact that Apple continues to lead the industry in making accessibility a first-class citizen. Myke Hurley put it best on Upgrade, the weekly podcast he co-hosts with Jason Snell: Voice Control is something Apple doesn’t have to do. The company does it, he said, because it’s the right thing to do for every user. Put another way, Apple works so tirelessly on accessibility not for “the bloody ROI,” to paraphrase Tim Cook.


Herrlinger demoed Voice Control for me, and it works as advertised despite our setting containing a lot of ambient noise. The gist is simple enough: You give your MacBook or iMac commands, such as “Open Mail” or “Tell Mary ‘Happy Birthday’ ” in Messages. Beyond the basic syntax, however, there are elements of Voice Control that make dictating to your Mac (or iOS device) easier. For example, Herrlinger explained how you can say “show numbers” in Safari’s Favorites view and little numbers, one for each of your favorites, show up beside each website’s favicon. Say TechCrunch is No. 2 in your list of favorites. If the glyph is hard to make out visually, saying “open 2” will prompt Voice Control to load TechCrunch’s page. Likewise, you can say “show grid” and a grid will appear so you can perform actions such as clicking, tapping or pinching-and-zooming.

For many disabled people, the floodgates just opened. It’s a big deal.

Herrlinger told me Voice Control, while conceptually fairly straightforward, is designed in such a way to be deep and customizable. Furthermore, Herrlinger added that Apple has put in a ton of work to improve the speech detection system so that it can more adeptly parse users with different types of speech, such as those who stutter. Over time, Voice Control should improve at this.

Of course, the reason for all the excitement over Voice Control is the way it makes computing more accessible. Which is to say, Apple has reached an inflection point with its assistive technologies where someone who can’t physically interact with their computers now has an outlet. To use only your voice to do this used to be the stuff of science fiction, but now it’s more or less reality. There are other tools, like Apple’s own Switch Control, that are in the ballpark, but Voice Control takes it to a whole other level. Apple is putting a stake in the ground — if you can’t touch your computer, just talk to it. For many disabled people, the floodgates just opened. It’s a big deal.

Hover Text is Dynamic Type reimagined

I’ve made my affection for iOS’s Dynamic Type feature known countless times. By the same token, I’ve made my displeasure of its absence on macOS known just as often. Apple heard me.

Another feature Herrlinger was keen to show me was something Apple is calling Hover Text, on macOS. A subset of the already present Zoom functionality, Hover Text sort of reminds me of tooltips in Windows. The “Hover” name stems from the function: place your mouse pointer over a selection of text and you get a bubble with said text enlarged.

Herrlinger told me the feature works system-wide, even in places like the menu bar. And yes, Hover Text is indeed customizable; users have access to a wide variety of fonts and colors to make Hover Text’s “bubbles” their own. Text size can be enlarged up to 128pt, Herrlinger said. What this means is users can play with different permutations of the feature to find which combination works best — say, a yellow background with dark blue text set in Helvetica for the highest contrast. The possibilities are virtually endless, a testament to how rich the feature is despite its simplicity.

At a high level, Hover Text strikes me as very much the spirit animal of my beloved Dynamic Type. They’re clearly different features, with clearly defined purposes, but both strive to achieve the same goal in their own ways. Herrlinger told me Apple strives to create software solutions that make sense for the respective platform and the company’s accessibility group believes Hover Text is a shining example. They could’ve, she told me, ported Dynamic Type to the Mac, but found Hover Text accomplished the same goal (enlarging text) in a manner that felt uniquely suited to the operating system.

iOS gains Pointer Support, sort of

As first spotted by the ever-intrepid, master spelunker Steve Troughton-Smith, iOS 13 includes pointer support — as an accessibility feature.

Mouse support lives in the AssistiveTouch menu, the suite of options designed for users with physical motor delays who can’t easily interact with the touchscreen itself. Apple says it works with both USB and Bluetooth mice, although the company doesn’t yet have an official compatibility list. It’s telling how mouse functionality is purposely included as an accessibility feature — meaning, Apple obviously sees its primary value as a discrete assistive tool. Of course, accessibility features have far greater relevance than simply bespoke tools for disabled people. Just look at Troughton-Smith’s tweet for proof.

Still, in my conversation with Herrlinger, she emphasized that Apple built pointer support into AssistiveTouch as a feature designed and developed with accessibility in mind. In other words, support for mice and external pointing devices is intended expressly for accessibility’s sake. As usual with Apple products, Herrlinger told me the foundational parts of pointer support date back “a couple years.” This is something the team has been working on for some time.

Accessibility features can benefit more than the original community they were designed to support.

To Apple, Herrlinger said, pointer support — which is supported on both iOS 13 and iPadOS — is a feature they felt needed to exist because the accessibility team recognized the need for it. There’s a whole class of users, she told me, who literally cannot access their devices without some other device, like a mouse or joystick. Hence, the team embarked on their mission to accommodate those users. When I asked why build pointer support into a touch-based operating system, Herrlinger was unequivocal in her answer: it serves a need in the accessibility community. “This is not your old desktop cursor as the primary input method,” she said.

The reality is, it’s not your secondary choice, either. The bottom line is that, while Apple loves the idea of accessibility features being adopted by the mainstream, pointer support in iOS 13 and iPadOS really isn’t the conventional PC input mechanism at all. In this case, it’s a niche feature that should suit a niche use case; it’s not supposed to represent the milestone of iPad’s productivity growth that many think it could be. Maybe that changes over time, but for now, it’s the new Mac Pro of software: not for everyone, not even for most people.

That said, a crucial point should be made here: people without disabilities will use this feature, regardless of its actual intended utility, and Apple recognizes that. No one will stop you from plugging a mouse into your iPad Pro. It’s no different from someone using Magnifier to get up close on a finely printed restaurant menu or using Type to Siri in order to quietly give commands in a Messages-like environment.

“Accessibility features can benefit more than the original community they were designed to support,” Herrlinger said. “For example, many people find value in closed captions. Our goal is to engineer for specific use cases so that we continue to bring the power of our devices to more people.”

It’s important, though, to take this feature in context. Users should be cognizant of the fact this implementation of pointer support is not meant to drastically alter the primary user input landscape of iPad in any way. That is the broader point Apple is trying to make here, and it’s a good one.

Equity transcribed: How to avoid an IPO

This week, the Equity duo of Kate Clark and Alex Wilhelm convened to get some quick hits in about Slack’s WORK, Luckin Coffee and Sam Altman’s departure from Y Combinator.

They then dug a bit deeper into the money around food: DoorDash and Sun Basket both raised this week. And what is a discussion about venture in food without mentioning Blue Apron?

And finally, TransferWise illustrates how not to IPO.

In all of this, they considered a world without the word “unicorn” as it relates to billion-dollar valuations — before admitting they are probably responsible for a good amount of its use.

Alex: So I think that the real unicorns now are companies that are growing and are profitable while also being worth over a billion dollars. We’ve seen very few of these. Zoom, famously, was a profitable company in its S-1, and it appears TransferWise also is; I can’t name more than two. And that makes them actually as rare as unicorns should be, in my view.

Kate: Yeah, I’m thinking maybe we should just actually stop using the term unicorn unless they’re profitable.

Alex: The problem with that is, it would be a two-person crusade against a wave of usage. I don’t think you and I have that clout. No offense to us.

Kate: I do think you and I are responsible for using that term, at least like 20% of the times that it’s used.

Alex: If that’s true, I’m going to retire. But I hear your point, we should actually get rid of the word unicorn; it’s now effectively meaningless, it means nothing. And profitable, growing and worth a billion would be a great constellation of things to actually meet some threshold to be called special, because that is special.

This also marks the last time the show was recorded from the only home it’s ever known — because TC SF is moving — so Kate trolling Alex at the tail end is quite fitting. Come back next week for the same great podcast from a different great location.

Want more Extra Crunch? Need to read this entire transcript? Then become a member. You can learn more and try it for free. 


Kate Clark: Hello and welcome back to Equity. I’m TechCrunch’s Kate Clark. This week, I am with Alex Wilhelm of Crunchbase News. How’s it going today, Alex?

Alex Wilhelm: Very, very good. Excited about all this and enjoying the sun out here on the East Coast. And I am missing you in the studio today because it is, I think, the very last one out of 410 Townsend, is that right?

Kate: Yeah, so last week we said it was our last one recording all together, Chris, Alex and I. But this week is my last recording in the studio, as Alex is in Providence. So next week we’ll be with you from a brand new spot. I have not seen the new podcast studio, I have no idea what to expect, but I’m hoping it’s nicer than this little cave.

Alex: It is cave-ish. It’s kind of nice. There are chairs and a table.

Kate: There is exposed brick which I really like. So I’ll miss the exposed brick.

Alex: It’s very SOMA rustic, if you will.

Kate: It is, indeed. All right, well, let’s get going, we have a few really good topics to get into today that I’m excited to talk about but first, let’s just do a quick little IPO update. Alex, why don’t you lead the way?

With new Fit technology, Nike calls itself a tech company

In 1927, Charles Brannock, the son of a local shoe company owner in Syracuse, N.Y., invented the Brannock Device. The steel measurement tool with five scales has been the most effective way in the U.S. to find an accurate shoe size.

Industry-wide, 60% of consumers are wearing the wrong-sized shoes. Not only is there a discrepancy among different styles of shoes (from high heels to leather boots), but sizing can often differ from brand to brand within one type of shoe (like adidas sneakers to Nike sneakers) and even silhouette to silhouette within a single brand.

For instance, I’ve owned Nike React Epic sneakers with Flyknit technology in a women’s size 10. I have men’s suede Nike Air Max 95s in a 9.5. All of my men’s Air Jordan 1s are comfortably a men’s size 8.5, but I have a women’s pair in an 11, and my Air Jordan 4s are an 8. Meanwhile, my Nike Air Max 720s feel decidedly too small at a men’s 8.5. And this is all within one brand.

In the 92 years since its introduction, through the birth of the internet and other society-altering technological advances, the Brannock Device has somehow remained uncontested. Until now.

This summer, Nike will introduce Nike Fit, a foot-scanning solution designed to find every person’s best fit. Conceptually, Nike Fit falls somewhere between “why would we reinvent the wheel” and “we don’t even need that wheel.”

Nike Fit uses a proprietary combination of computer vision, data science, machine learning, artificial intelligence and recommendation algorithms to find your right fit. With sub-two-millimeter accuracy across dozens of data points, measurements are fed into a machine learning model that accommodates every detail of every Nike silhouette, down to the materials used, the lacing systems and other critical aspects of fit. This is then paired with AI capabilities to learn a wearer’s personal fit preferences and how they relate to the population as a whole.

Users can either find their size with the augmented reality feature in the Nike app or, soon, visit participating stores to use the technology. I recently had the opportunity to do both.

Within the Nike app, I used my phone’s camera to capture an empty space where the floor meets the wall as a point of reference, with the app’s guidance ensuring a level plane. I stood with my heels against the wall I captured as my reference point and pointed the camera down at my feet as if to take a photo. Once my feet were properly aligned with the outline guide within the app, I simply touched the button that looks just like I’m taking a photo.

In seconds, this action scans the feet and collects 13 data points, the best of the 32 points Nike is capable of capturing. Despite all of the data being collected, users will only be offered the length and width measurements, down to the millimeter, of each foot individually.

“Augmented reality is a new type of experience for a lot of consumers and sets a lot of challenges for them,” says Josh Moore, Nike’s vice president of design and user experience. “We’ve been doing a lot of experiments and creating new features in our SNKRS app over the last few years where we really learned a lot about how to use augmented reality successfully. Specifically, we know we have to guide our users through the journey at their own pace so they can comprehend as they go.”

“We’re talking about phones with cameras measuring your feet,” Moore continues. “It’s a new type of experience where you’re using your device, the device’s camera, the 3D space around you, and you’re using your body. There’s no common UX pattern for this.”

The in-store experience differs in a few ways. It wasn’t enough to simply have great technology; it also had to reduce friction in the in-store buying process. The idea is to reduce the amount of time associates spend going back and forth grabbing sizes from the stock room, ensuring the time spent with customers is higher quality and more efficient.

At the Retail Lab on Nike’s campus, I stood on a mat while a Nike sales associate scanned my feet with a handheld iTouch device. With the measurements taken (my right foot is 1 millimeter longer than my left, while my left is 1 millimeter wider than my right), the associate can provide a range of sizes for me, which includes where my best fit could fall in any shoe in Nike’s catalog. Once they look up the shoe I’m interested in, the app will offer the best fit size for my measurements and that shoe. If it’s available, they’ll bring out that size, and if there is any disbelief, they’ll bring out the size you’d like to try, as well.


Whether using the app to find the right fit and make a purchase or going into the store, associates and customers can record which size is purchased, as well as other personal preferences around fit.

“Before a shoe arrives onto the market, it will already be trained into the solution. But since the solution encompasses both machine learning and AI, its accuracy out of the gate is astonishing and just gets even better,” says Michael Martin, vice president of Nike direct products, growth and innovation.

With more data, Nike will not only have continual improvements of an individual’s fit preferences, it will also learn the greater population’s preferences around each specific model, offering insight on creating better-fitting shoes. 

In development for just over 12 months, Nike Fit was being tested in three stores — one each in Seattle, Dallas and Pasadena, Calif. — only six months after Nike acquired Israeli startup Invertex, whose entire mission was to create scans of the human body for better fit customization.

“Fundamentally, at this stage, Nike is a technology company. It’s a technology company that builds upon its historical strengths in footwear design, storytelling and inspiration, and it’s able to use those in combination to solve problems that no one else can solve,” says Martin. “We think this is arguably our biggest solution to date.”

Despite being for footwear right now, the technology created for Nike Fit has the potential to change retail in a lot of ways. One can imagine women being able to use the tech to find the right bra size. It could also make buying denim easier. As individualism and inclusivity have become marketing tools, custom fit seems like a natural next step, but until now, there hasn’t been a clear-cut solution.

Nike Fit will be introduced in select stores in the U.S. and within the Nike app in early July 2019, with Europe to follow later in the summer.

Nike has always had a place in the conversation alongside the likes of Apple when upper echelon branding and storytelling is discussed. With the introduction of Nike Fit, Nike just does it — again.

Equity transcribed: New a16z funds, a $200M round and the latest from WeWork and Slack

Welcome back to this week’s transcribed edition of Equity, TechCrunch’s venture capital-focused podcast that unpacks the numbers behind the headlines.

This week, Crunchbase News’s Alex Wilhelm and Extra Crunch’s Danny Crichton connected from their respective sides of the States to run through a rash of news about Divvy, Cheddar, SoftBank’s Vision Fund and Andreessen Horowitz. Plus, they got into the WeWork IPO:

Alex: We should move on to a business that we’ve never talked about on the show before: WeWork.

Danny: To be clear, it’s not WeWork. It is the WeCompany.

Alex: But you have to put in quotes because no one knows what that is.

Danny: Sounds like a rollercoaster manufacturing company. So give us the top line numbers cause I never get tired of hearing them.

Alex: No, no, no. First we have to tell them the news Danny, what is the news then I will do the numbers.

Danny: Okay, so the news was, they originally had filed privately with the SEC to do their S-1 in December, and they didn’t pull it, right? Or they sort of delayed it, but you know these filings don’t just disappear from the SEC. They refiled their S-1, according to, I believe, the Wall Street Journal, on Monday, and that means that they’re sort of ready to go public and probably updated it with their Q1 figures.

For access to the full transcription, become a member of Extra Crunch. Learn more and try it for free. 

Why women are indefinitely sharing their locations

New York-based DJ and creative consultant Amrit and I are sitting at a women’s empowerment dinner waiting for her manager, Ramya Velury. Another friend of ours asks where Ramya is. “She said she was getting an Uber 15 minutes ago,” Amrit says as she unlocks her phone to check Ramya’s location.

“She’s still at home!” Ramya and Amrit share their locations with each other indefinitely through Apple’s Find My Friends app, which allows you to see a contact’s location at all times. Most of us have our locations shared with a friend.

One can easily wonder why anyone would want to give someone complete 24-hour access to their location, especially the type of person who texts “On my way!” before they’ve even stepped foot in the shower. Yet women are forgoing privacy with their most trusted friends, offering full access to their location (more specifically, the location of their phone) at all times.

Conveniences by way of technological advances are normalizing a culture of being alone with strangers. Uber launched 10 years ago and multiple ridesharing apps followed. Tinder changed the world of online dating (and dating as a whole) with its millennial-friendly, instantly gratifying match-making. You can connect with someone nearby and be on the way to meet them as soon as you can get out the door.

We talk to strangers online, pay them to get into their cars and meet up with them alone. These developments go against every rule about strangers that our parents embedded in our childhood brains.

Danueal Drayton, known as the “dating app murderer,” confessed to killing seven women, all of whom he met on dating apps. His criminal trial has been put on hold pending further psychiatric treatment and evaluation after a Los Angeles County judge deemed him incompetent for trial. And 24-year-old Sydney Loofe was murdered after a 2018 Tinder date.

“We utilize a network of industry-leading automated and manual moderation and review tools, systems and processes — and spend millions of dollars annually — to prevent, monitor and remove bad actors who have violated our Community Guidelines and Terms of Use from our app,” a Tinder spokesperson tells me, regarding the measures it takes to keep users safe. “These tools include automatic scans of profiles for red-flag language and images, manual reviews of suspicious profiles, activity and user-generated reports, as well as blocking email addresses, phone numbers and other identifiers.”

While these aren’t necessarily common occurrences, they are real-life horror stories nonetheless.

Sexual assault and sexual misconduct have gotten bad enough within Ubers that the company can no longer ignore them. In 2018, the company released a list of 21 types (categories, not 21 incidents) of sexual misconduct reported by drivers and riders, ranging from explicit gestures to rape.

Uber offers an option to share the status of your ride with a friend. The company did not respond to a request for comment about what it’s doing to combat the sexual misconduct within Ubers.

But that’s just for cars that are actually employed by rideshare apps. Los Angeles resident and self-proclaimed introvert Erika Ramirez pointed to a crime of opportunity in which a young woman got into a car that wasn’t her Uber.

“Recently, a 21-year-old woman [Samantha Josephson] was kidnapped and murdered by a man who pretended to be her Uber driver. Unfortunately, it feels like not a day goes by that you don’t hear of a case where a man kills a woman.” (That prompted Uber and Lyft to implement safety features in their apps.)

Ramirez is a freelance journalist and runs her indie publication ILY Magazine mainly from her one-bedroom apartment. “My schedule isn’t too set in stone. I wander to run errands, do laundry, grab food, meet with friends and go on dates at random times of the day or at night,” she says. “To be safe, I share my location with a close girlfriend, in case anything ever goes wrong during any of those instances. I let her know when I’m going on a late snack run or when I’m going on a date with someone.”

Naturally, there are concerns about sharing locations. In 2018, The New York Times reported that there were 75 companies that track your location and use, sell or store that data. The paper even illustrated how it was able to obtain the data and match an anonymous traveling dot to the human it belonged to based on a distinct daily routine.

“When my siblings first asked to share my location with them, I thought they were weird. It’s not like I was doing anything sketchy, but why do you need to know where I am all the time?” Dr. Brittanny Keeler laments. She was living in Buffalo, N.Y., where she completed her residency and lived for six years. “If someone didn’t see me for 24 hours, the police would be notified. I have a bigger social circle there.”

Now she is an OBGYN in Norwalk, Conn., and newborns don’t adhere to a 9-to-5 work week. “If I deliver babies in the middle of the night, I’m getting out of work at all hours. Here, no one would know I was missing unless I didn’t show up for work.”

It wasn’t an incident or a friend or family member that caused her to reconsider sharing her location. It was one of those horror stories. “I listened to this podcast called Up and Vanished. I think it’s from 2016. It’s about a 30-year-old woman who left a party and was supposedly going home and was never seen again. I thought to myself, I leave places alone all of the time and hopefully get home. That podcast is what prompted me to start sharing my location,” Keeler recalls.

Beyond Ubers, Tinders and other beneficial disruptive tech, there has also been a significant social shift in traditional gender norms, one that coincides with, and ultimately utilizes, all of these advancements. The percentage of unpartnered adults living alone rose from 56% in 2007 to 61% in 2017, and women are more likely to live alone than men. Sons are also more likely than daughters to live with their parents later in life, and in 2018 the median age for Americans’ first marriages was the oldest it has ever been: 30 for men and 28 for women.

Dr. Keeler, Ramirez and Ramya are all unmarried and live alone. Amrit’s boyfriend just moved in after she lived on her own for the majority of her seven years in New York. She’s from Perth, Australia, and her family still lives there.

“Because my family is so far, Ramya is probably the closest to my family and would act responsibly in case of an emergency,” Amrit says. While Ramya is Amrit’s manager, she’s also one of her best friends, and Amrit regularly checks on her location, too. “She always stays out later. If it’s the morning, I’ll check where she is and that she’s made it home.”

It’s not just the number of women living alone that has increased; more are also traveling alone. As recording artist Tommy Genesis’ tour DJ, it’s not unusual for Amrit to spend as many days traveling as she spends at home in New York. “I’m usually home for two to three weeks and gone for about the same,” she says. Ramirez is nearly bi-coastal, traveling to her former home of New York City once a month and sometimes spending weeks at a time there.

The New York Times recently published a discouraging story connecting the dots among the dangers that the growing number of solo women travelers experience. In it, the paper highlighted a 2018 study by online hostel-booking site Hostelworld showing a 45 percent increase in solo women travelers from 2015 to 2017. The bottom line of its findings: “Most countries do not comprehensively track violence against female travelers.”

This isn’t to say that women believe sharing their locations with each other will prevent violence against them. However, aware that Apple is not utilizing or sharing their Find My Friends data, women are in favor of letting someone they trust track their every move in case something happens.

It actually may have saved Jaila Gladden’s life. After Jaila’s attacker kidnapped her from outside Atlanta and raped her in her own car, he tasked her with finding a gas station for him to rob, as he planned to take her to Michigan. She convinced him to let her use her phone to do so. She sent her location and alerted her boyfriend to what was happening while “looking” for a gas station. Ultimately, police were able to find her, the car and her attacker.

While plenty of users run hot and cold on location services, there is undeniable value and security in knowing someone can find you in case of emergency.

Since 2018, iOS 12 has securely and automatically shared a user’s location with first responders when U.S. users call 911. And iPhone 8 and later models have the Emergency SOS feature, which requires some setup but ultimately allows an emergency call to trigger a text to a preselected group of contacts and a location alert to emergency services.

Google also offers the iPhone- and Android-friendly Trusted Contacts app, which lets users designate trusted contacts and request their locations.

“Not only did I think it was weird that my family wanted to know where I am all the time, but our phones tracking everything in general is creepy to me,” says Dr. Keeler. “I don’t know what data collection I’m contributing to, but I do think it’s necessary for someone to have my location now.”

And it’s because of what Ramirez knows to be true: “Women have been killed by ex-boyfriends, men who’ve forced themselves on them on dates, men whose catcalling were ignored or rejected. Women have to be keenly aware of their surroundings, and sadly have a backup plan in case we are placed in harm’s way.”

Equity transcribed: Digging into the Uber S-1

Welcome back to this week’s transcribed edition of Equity, TechCrunch’s venture capital-focused podcast that unpacks the numbers behind the headlines.

And because it’s another week, why not another emergency episode? This time, Kate Clark and Alex Wilhelm popped into the studio an hour before they were due to record the regular episode in order to dig into the Uber S-1. Not only did they dig into it, but they did so in real time. That’s what happens when you only have 10 minutes to get through almost 300 pages of numbers. And if it’s numbers you like, this is the episode for you.

The duo talks Uber’s profits and losses and provides context for it all. And to prove just how juicy this ep is: Equity Shots tend to be about 15 minutes long. Not this one. There was a lot to get to. And who better to lead the conversation than Kate and Alex? So join them as they walk you through what the Uber S-1 holds.

For access to the full transcription, become a member of Extra Crunch. Learn more and try it for free. 


Kate Clark: Hello and welcome to Equity Shot. This is TechCrunch’s Kate Clark, and I’m joined today by Alex Wilhelm of Crunchbase News.

Alex Wilhelm: Hello.

Kate: We are going to tackle some breaking news. But, a warning from Alex first.

Alex: Yeah, so it’s 2:09pm here on the West Coast on Thursday, which means that the S-1 dropped, I don’t know, about 45 minutes ago, maybe an hour. And there was a lot to do before the show, but we wanna get this out as soon as we can, so we did our notes doc by hand, and we got the S-1 pulled up, and we have a lot to go through. But there may be an awkward pause in this, because we don’t have every single number pulled out ahead of time.

Kate: We are literally scrolling through the document live. We have a piece of paper taped to the wall in the studio with a very rough outline of what we’re gonna talk about. And we agreed that we’re going to try to take it slow and carry you guys through these important numbers as best we can.

Alex: Yes, and we are gonna start with yearly numbers to stay at the highest possible level, and we’re gonna talk about revenue first.

Alex: Now, keep in mind that we’re not talking about bookings, which is the total spend on Uber’s platform; we’re gonna talk about revenue, which is Uber’s portion of that overall platform spend. So, in 2014 (because the S-1 goes back all the way to 2014), Uber had revenue of 495 million. That nearly quadrupled in 2015 to 1.99 billion … call it 2 billion flat. In 2016 that grew to 3.85 billion. It expanded to 7.9 billion in 2017, and 11.3 billion in 2018. So, basically from half a billion to 11.3 billion from 2014 to 2018.

Kate: Yeah, quick reminder: a lot of these numbers we’ve seen. I know there have been plenty of reports highlighting Uber’s 2018 revenues of around 11 billion, but this is the first time we’re getting a full glimpse into its financial history all the way back to 2014, and then also losses, which were interesting.

Alex: Very, very interesting.

Kate: I’ll quickly run through losses beginning in 2014. So, Uber lost 670 million that year, they were not profitable. The next year they lost 2.7 billion, again, not profitable. The next year they lost 370 million, guessing there was a big … oh, no, that was the year of the divestiture of … we just talked about this.

Alex: Uber China.

An Equity deep dive on Patreon

The popular TechCrunch podcast Equity this week launched a new series called Equity Dive, wherein a host interviews the writer of the latest edition of the Extra Crunch EC-1.

If you’ve ever wanted to know everything there is to know about Patreon, the platform that connects creators with fans and their wallets, then this is the show for you. TechCrunch Silicon Valley editor Connie Loizos speaks with Eric Peckham, who spent hours upon hours meeting with the Patreon team to learn its origin story and the ins and outs of the business practices that got the company to where it is today.

Read a deep dive of Patreon on Extra Crunch

As Eric says:

The way to think about how Patreon has evolved is, I see it in kind of three stages. It began as this initial crowdfunding platform, and then evolved beyond that to try and be a destination platform for consumers, where there would be great content that you just go to Patreon to find and you go to discover creators, kind of a marketplace model. They moved away from that. That was somewhat of a gradual shift, and essentially the decision was that it’s not good to be stuck in this game of trying to be yet another destination platform for consumers, competing with YouTube and Instagram and every single media site out there. Really, the opportunity and mission that underlies their work is about helping creators and enabling all these independent creators to sustain themselves and to build thriving businesses.

They shifted; they now describe themselves as a SaaS company, actually, which is very different from framing yourself as kind of a consumer destination. The long and short of it is they see this opportunity: a growing market of independent creators around the world who are building fan bases, and for that particular type of SMB they want to provide essentially the full suite of tools and services that they need to run their businesses.

For access to the full transcription, become a member of Extra Crunch. Learn more and try it for free. 


Connie Loizos: Hi, I’m Connie Loizos and I’d like to welcome you to our first Equity Dive. Once a month we’re going to be dedicating an entire episode to a deep dive into the life of one company. This month I’m joined by Eric Peckham, who has reported extensively on the crowdfunding membership platform Patreon. Hi, Eric.

Eric Peckham: Hey Connie, excited to be here for the first Equity Dive.

Connie Loizos: Same. So, Eric, you and I first ran into each other in Berlin, but we don’t know each other very well. I’d love to hear more about you. You’re based in LA, and from what I understand you are a media industry analyst. Is that correct?

Eric Peckham: Yes, so through my own newsletter, Monetizing Media, I cover the happenings of the global media and entertainment industry. It’s kind of a very business-minded lens on media and entertainment.

Connie Loizos: Well, I read your extensive coverage of Patreon and it was really impressive, and I wondered, considering how much you wrote, is this company sort of a long-running interest of yours, or how did you decide to settle on it for your first deep dive for TechCrunch?

Eric Peckham: Yes, it was an exciting process digging into this. We made a short list of exciting companies, a lot of unicorn companies or late-stage startups we thought were about to become unicorns, and Patreon jumped out for a number of reasons. One is, as someone who runs his own newsletter, I have had subscribers to that newsletter suggest creating a Patreon. I’ve looked into it before, so I had a little bit of a creator perspective of just wanting to better understand Patreon and other options in the market. I think from a bigger-picture, more of a Silicon Valley perspective, Patreon’s a really fascinating company. They’ve raised over $100 million from top VC firms like Index and CRV; they’re the dominant player in the space they’re targeting, but it’s kind of them versus just the big social media platforms. There isn’t a startup that’s comparable in size to it, and it’s really trying to own this whole territory of independent content creators, serving them with different business tools and services.

Connie Loizos: It is really interesting to think that the David and Goliath story involves a $100 million venture-backed startup versus, as you say, big players like Facebook and YouTube. Let’s start at the beginning. So you decided on Patreon for reasons that I can certainly understand now. How did you set about pitching them on this idea? Because obviously you were going to need a lot of access to them, a lot of their time.

Equity transcribed: Funding news round-up, a16z’s future, an upcoming IPO and more Lyft

Welcome back to this week’s transcribed edition of Equity, TechCrunch’s venture capital-focused podcast that unpacks the numbers behind the headlines. Here we put the words of our wildly popular venture capital podcast, Equity, into Extra Crunch members’ eyes instead of their ears. This week, TechCrunch’s Kate Clark and Crunchbase News’s Alex Wilhelm get rapid-fire on funding news from around the way. And because they both talk so fast, they got a lot in.

In case you hadn’t heard, Andreessen Horowitz relinquished its status as a venture capital firm and registered all 150 of its employees as financial advisors. Kate and Alex dug in a bit more about that story. And who is sick of hearing about Lyft’s IPO? Nobody? Great. Because they talked about that and the implications of Uber’s imminent journey into the land of the public.

And finally, the Midas List. Does it matter? Why are we talking about it? Why do lists exist? Who’s on top? Who’s not? Who’s sad? Who cares? And more questions left unanswered.

For access to the full transcription, become a member of Extra Crunch. Learn more and try it for free. 


Kate Clark: Hello, welcome back to Equity. I’m TechCrunch’s Kate Clark and I’m joined again this week by Crunchbase News’s Alex Wilhelm.

Alex Wilhelm: Hello Kate. Good to be back in the studio.

Kate Clark: Welcome back. It’s just Alex and I this week and we have a lot of news to get through. So we’re just gonna start off by doing a rapid-fire overview of some of the stuff we’re not going to go as deep on. Alex, start us off with Affirm.