Biology as technology will reinvent trillion-dollar industries

We face two major threats today: one to the health of our planet and the other to our own. The U.N. projects the global population will hit 9.7 billion by 2050, meaning more people consuming more natural resources than at any point in human history. Consumption is already doubling every 10-12 years. Add to that the challenges of a warming planet. On the human health front, some 30% of young people under age 20 are obese, 31% of deaths are from cardiovascular disease and cancer cases are growing twice as fast as the population.

Fortunately, biology and technology are creating fixes for the planet as well as for the human body. As they do so, they are poised to reinvent countless industries, giving rise to what I believe is a golden age for biology as technology. As Arvind Gupta, the founder of health-science accelerator IndieBio, argued in one recent Medium post, “the twin catastrophes of planetary and human health” will create a $100 trillion opportunity.

Before I tell you how, here is an extremely brief history of the field. Biology, of course, is the original technology. Humans have been tinkering with life’s building blocks for millennia: our ancestors used plants and herbs as medicines, chewed neem branches to clean their teeth and cultivated crops like corn. But it wasn’t until the 1970s and 1980s that we saw the first flowering of today’s modern biotech industry.

In 1976, Robert A. Swanson co-founded Genentech, which became a pioneer in the field of recombinant DNA technology and helped launch the modern biotech industry. By creating novel DNA sequences in the lab, Genentech was able to synthesize human insulin for diabetics (approved in 1982) and create growth hormone for kids who suffered from a hormone deficiency (1985).

Among the other early leaders in the field was Applied Molecular Genetics (today known as Amgen). In 1989, it won approval for the first recombinant human erythropoietin drug to treat anemia in people with chronic kidney failure, and later to treat anemia in HIV patients. Last year, the company, which booked $23.75 billion in revenue, counted as its best-selling drugs Neulasta, used to prevent infections in cancer patients undergoing chemotherapy, and Enbrel, which treats some autoimmune diseases.

Today, innovative researchers are building on those early technologies. Among the most promising advances is the discovery of the CRISPR-Cas9 gene-editing technique. Using what they refer to as molecular scissors, scientists can use CRISPR to edit a living person’s DNA, deleting or repairing damaged sections. And because edits made to the germline are written into the genome itself, those fixes can be passed down to future generations, unlike previous therapies that affect only the individual patient. The technique promises to slow if not eradicate cancer. It could also prevent sickle cell disease, cystic fibrosis, hemophilia and heart disease.

Notwithstanding the concern over creating designer babies (and the recent controversial creation of the first gene-edited babies in China), the technique promises to fortify our own bodies, those of our kids and those of all succeeding generations. Co-founded by Jennifer Doudna, a leader in the CRISPR field, Mammoth Biosciences is on a mission to leverage the power of CRISPR to democratize disease detection by bringing accurate and affordable testing out of the laboratory and to the point of care.

Other technologies, like DNA sequencing, cell engineering and bioprinting, have led to the creation of animal-free protein products, biofuels for jet engines, lightweight materials stronger than steel and even DNA-based memory for computer storage. As a result, startups working in these fields are creating entirely new industries, disrupting others and bringing us into what I believe is a golden era of biology as technology.

One successful company is Beyond Meat, which bills itself as the future of protein. With its plant-based meat product, it is trying to address our global population’s need for protein while also tackling the cow problem (cattle consume vast amounts of land and water and belch methane, a potent greenhouse gas, not to mention that some people think eating them is wrong). The company’s work promises to disrupt the $270 billion U.S. meat industry.

The entrepreneurs at New Culture are also tackling the cow issue. They are using an engineered version of baker’s yeast to make cheese without milk. Unlike other vegan cheeses, made from soy or nuts, this one has been praised as tasting like the real thing.

Another area ripe for disruption is our homes. The startup Lingrove is trying to lessen our reliance on trees, and the deforestation that comes with it, by creating wood-like products from flax fiber and bio-epoxy resin. With its Ekoa TP product, Lingrove is targeting the $80 billion interiors market, with an eye toward the construction industry. Another player in this field is bioMASON. Making concrete releases massive amounts of carbon into the air, but this company has shown it can “grow” bricks and masonry from sand without traditional kiln firing, by infusing the sand with microorganisms that initiate a process similar to the one that forms coral.

And then there’s transportation, one of the largest contributors of greenhouse gases worldwide. Companies like Amyris are trying to do away with fossil fuels by using genetically engineered yeast to ferment sugar into environmentally friendly gasoline and jet fuel.

And that’s not all. There are many more biology-as-technology stories, with innovative companies doing things like turning mushrooms into leather (MycoWorks), molecules into whiskey (Endless West) and bacteria into silk (Bolt Threads). Biology might even reinvent information technology. Scientists have shown how a few grams of DNA can store as much information as an entire data center (Microsoft is working on this). Another company is building computers from neurons (Airbus is a partner).

There’s no telling where this golden age of biology as technology will lead, how many products it will yield and how many industries it will end up disrupting, or creating. But it seems destined to reinvent trillion-dollar industries and create a healthier planet where we can live longer, better lives.

Disclosure: Genentech and Amgen are Mayfield investments from the 1970s and 1980s. Mammoth Biosciences is a current investment.

How to stop vanity marketing from killing your startup

In 2014, I got in on the ground floor of what I thought was a rocket ship. Fling was one of the fastest-growing apps that year, and I was brought in as its chief growth officer, with a handsome compensation package, one that in retrospect should have given me pause. Within 24 hours of arriving in London, I was greeted at the door by a Fling-branded Humvee, which wouldn’t even turn out to be the worst use of the company’s money.

Fling’s marketing team consisted of 20 people, about 30% of the company. Skeptical that any startup needed a marketing team remotely close to that size, I sat down with each of them to learn about their expertise, their role and the value they added. Each focused on a particular area: online user acquisition, brand, partnerships, metrics, to name a few. Surprisingly, I found myself impressed with their skill sets, at least on paper.

Nevertheless, my spidey sense was tingling. It’s not that they were lazy or shirked responsibilities; in fact, each seemed to be trying to create value in earnest. But everyone on the team lacked a sense of urgency, the kind that drives truly great startups to be thoughtful and careful about understanding why and how things work.

When resources are seemingly infinite, any expenditure, whether of time, money or both, seems like a good idea, so long as the return is net positive. And in isolation, perhaps many (or all) of them yielded a positive ROI. The result was constant cash burn despite “doing everything right”: everything was working, but not in a sustainable, manageable way, and I’d ended up buying into the hype. I’d been concerned about marketing bloat, and I was right. The company would eventually go under after burning $21 million.
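The trap is easier to see with numbers. Here’s a minimal simulation with entirely made-up figures: every cohort of marketing spend eventually returns 1.3x its cost, but the payback trickles in over 18 months while spend grows 10% a month because “everything is working”:

```python
# A minimal sketch with hypothetical numbers: per-channel ROI is positive,
# yet the company still runs out of cash before the returns arrive.

RUNWAY = 21_000_000   # starting cash, echoing the $21 million Fling burned
ROI_MULTIPLE = 1.3    # lifetime return per dollar of spend
PAYBACK_MONTHS = 18   # months over which that return trickles in
SPEND_GROWTH = 1.10   # monthly growth in marketing spend

cash, spend, cohorts, month = RUNWAY, 1_000_000, [], 0
while cash > 0:
    month += 1
    cash -= spend  # the cash leaves the bank today
    # each live cohort pays back 1/PAYBACK_MONTHS of its lifetime return
    cash += sum(c * ROI_MULTIPLE / PAYBACK_MONTHS for c in cohorts)
    cohorts = (cohorts + [spend])[-PAYBACK_MONTHS:]  # retire old cohorts
    spend *= SPEND_GROWTH

print(f"Out of cash in month {month}, with every channel 'ROI positive'")
```

The exact numbers don’t matter; what matters is that positive per-channel ROI says nothing about whether the cash lasts long enough to collect it.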

I knew I was part of it. I could have stopped the bleeding. But everything I was doing had a positive outcome — not one I could necessarily quantify or describe, but I knew it was there. We had all the money and time in the world — right up until we didn’t.

Before Fling, I’d been scrappy, constantly brushing up against death while running one of the most popular fitness apps in the world, perpetually at the end of our runway. After Fling, I’d developed bad habits — by my own hand — and had to force myself through a series of less glamorous but more fulfilling jobs wherein all that really mattered was results.

For me — and I’d say for any marketer — developing resourcefulness required spending significant time in an environment of scarcity, not abundance. That environment determines whether a marketer ends up cutting their teeth and growing in their abilities or forever sucking on the teat provided by your friendly neighborhood VC.

The word “resourcefulness” contains a beautiful irony: it’s a trait that only develops when resources are running on empty.

It’s time for you to understand a new term — vanity marketing.

Vanity marketing is a tempting investment for a company. It’s got some vague, ephemeral yet satisfying results — you’ve got a big party, you’ve got a wrapped Humvee, you’ve got something cool to point at, and perhaps you’ll achieve the mythical “virality” that gets a particular thing 10,000 shares or retweets.

You’re popular — a non-specific yet incredibly sexy thing that theoretically means investors will talk to you, reporters will write about you, that you’ve “made it.” It’s a result of the fact that marketing rarely faces the level of scrutiny applied to, say, a sales team — marketing’s this big, powerful juggernaut where many people survive just by not getting fired.

If premature scaling is the leading killer of startups, vanity marketing is the symptomless cancer that hastens their demise. Marketers with abundance ingrained into their mindset will spend until those resources are no longer there. It’s easy to succeed in marketing by burning capital to grow.

You know how there are some people who are entrepreneurs just so they can say they’re entrepreneurs? I’ve noticed a similar pattern in marketing. Everyone wants to call themselves a “growth hacker,” but no one wants to learn to write SQL or Python.

Why? Because it’s not sexy. Neither is obsessing over metrics like CPM, average order value and cost per unique “add to cart.” What is sexy is spending (other people’s) money to reach new audiences and pointing at ever-bigger numbers. The problem is that unless you get your hands dirty, you won’t actually be able to tell whether your marketing efforts command a return. I’ve seen marketers waste hundreds of thousands of dollars with no repercussions. Could you imagine a salesperson expensing that same amount in sales trips without landing a single client?
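For the record, this is the level of unglamorous arithmetic I mean. A minimal Python sketch, with hypothetical event data and field names, of three of those metrics:

```python
# Toy campaign data; real numbers would come from your ad platform and
# event pipeline via SQL, which is exactly the point.
ad_spend = 12_500.00     # dollars spent on the campaign
impressions = 4_200_000  # ads served
events = [               # raw event log: (user_id, event, order_value)
    ("u1", "add_to_cart", None), ("u1", "purchase", 68.00),
    ("u2", "add_to_cart", None), ("u2", "add_to_cart", None),
    ("u3", "add_to_cart", None), ("u3", "purchase", 112.50),
]

cpm = ad_spend / impressions * 1000  # cost per 1,000 impressions
orders = [v for _, e, v in events if e == "purchase"]
aov = sum(orders) / len(orders)      # average order value
unique_carts = {u for u, e, _ in events if e == "add_to_cart"}
cost_per_cart = ad_spend / len(unique_carts)

print(f"CPM ${cpm:.2f} | AOV ${aov:.2f} | cost/unique cart ${cost_per_cart:.2f}")
```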

Almost every single major startup flameout you’ve seen has had some form of major vanity marketing spend, totally divorced from, say, the cost of acquiring a single user. If you’re reading this and saying you’re not one of these marketers, then I’m proud, yet suspicious, of you. It’s fine if you’ve dabbled — a happy hour here, a CES party there — so long as you understood those were brief attempts to get something unquantifiable. It’s far worse if you’ve spent this money “just because everybody else is doing it.”

But the dark truth is that many, many marketing expenditures are totally unquantifiable — they have little to no grounding in reality beyond telling people you’ve spent money.

The boring, consistent marketing you can do — that you can analyze, that you can truly understand the effect of — is so much less interesting than the big, shiny objects. It might not look as impressive, but it’ll work. And it’ll teach you to succeed anywhere.

Where have all the seed deals gone?

When it comes to big business, the numbers rarely lie, and the ones PitchBook and other sources have pulled together on the state of seed investing aren’t pretty. The number of seed deals, the number of seed funds raised and the dollars invested in seed rounds were all down over the 2015-2018 time frame, a period too long to be considered a correctable glitch.

The number of seed deals, defined as U.S.-based deals under $1 million, fell to 882 in Q4 2018 from roughly 1,500 three years earlier, a drop of about 40%. The number of seed funds raised and the total dollars invested in seed rounds were both down roughly 30% over the same period. And the trend isn’t limited to the U.S.: venture capital investment volume outside the U.S. dropped by more than 50% between 2014 and 2017.

The rise before the fall

Discovering the reason behind the precipitous drop in seed deals requires a trip back in time to 2006, the start of a seed boom that saw investing rise 600% over the nine years through 2014. If you’re an internet historian, 2006 should ring a bell. It’s the year Amazon unveiled Elastic Compute Cloud, or EC2, its revolutionary on-demand cloud computing platform that gave everyone from the government to your next-door neighbor a pay-as-you-go option for servers and storage.

Gone were the days of investing millions of dollars in tech infrastructure before writing the first line of code. At the same time, the proliferation of increasingly sophisticated and freely available open-source software provided many of the building blocks upon which to build a startup. And we can’t forget the launch of the iPhone in 2007 and, more importantly for startups, the App Store in 2008.

With the financial barrier to starting a business obliterated and an exciting new mobile platform to build on, Silicon Valley and other innovation hubs were suddenly booming with new businesses. Angel investors and dedicated seed funds quickly followed, providing capital to support the burgeoning ecosystem. As more capital became available, more companies were formed, creating a virtuous cycle.

Enter stagnation

But this cycle began to slow in 2015. Had investor optimism waned, or was the supply of founders dwindling? Had innovation simply stopped? To find the answer, it’s helpful to understand a key role of the traditional venture capitalist. Once the Series A round of financing closes, the lead investor will join the company’s board of directors to provide support and guidance as the company grows. This differs from the seed round of financing when investors typically do not join the board, if one exists at all. But even the most zealous and hardworking of VCs can only sit on so many boards and be fully engaged with each portfolio company.

An old-fashioned logjam

If you’ve ever ridden Splash Mountain at Disneyland, you’ve likely experienced a moment when the boats stack up due to a hiccup in the flow somewhere farther down the route. This is what happened with seed companies looking to raise a Series A round of financing in 2015.

With venture investors limited by the number of board seats they could responsibly hold, a huge percentage of seed-stage companies failed to raise additional capital. Inevitably, many seed funds felt this pain as their portfolios started to underperform. That tightened the availability of capital, which in turn made fundraising even tougher for seed-stage companies. Series A investors could not absorb the giant wave of seed opportunities — the virtuous cycle had turned vicious.

The scaling of venture capital

In its simplest form, venture investing has three distinct phases: seed, venture and growth.

Because seed investors are not weighed down by the constraints of active board roles, they have the ability to build large portfolios of companies. In this sense, seed funds are more scalable than traditional early-stage venture funds.

At the other end of the spectrum, growth funds are able to scale the volume of dollars they invest. With the average age of a company at IPO now 12 years, companies are staying private longer than ever, which affords growth funds an opportunity to invest enormous amounts of capital and raise ever-larger funds.

It’s in the middle — traditional venture — where achieving scalability, by quantity of deals or dollars, is the most challenging. It was this inability to scale that led to the great winnowing of seed companies hoping to raise their Series A.

It’s a situation that is unlikely to change. Venture capital remains a hands-on business. The tight working relationship between investors and founders makes venture capital a unique asset class. This alchemy doesn’t scale.

The irony for traditional Series A venture investors is that the trait they find most desirable in a startup — scalability — is the one thing they themselves are unlikely to achieve.

The new marketplaces connecting school and work

Evidence continues to roll in that American workers are out of position for the high-value jobs of today and tomorrow. Start with the fact that there are 7.3 million unfilled jobs, millions of which are high-skill positions in IT, professional services and healthcare. Then add that employment growth in IT is stagnant — a phenomenon that is entirely a supply-side problem: there aren’t enough qualified workers to fill the open roles.

What are America’s colleges and universities doing to solve the problem? Until recently, they’ve been a big part of the problem. Academic programs at colleges and universities are controlled by faculty members who typically aren’t incentivized to align curricula to employer needs. Few are interested in what employers are seeking, particularly for entry-level positions. Many have never worked in the private sector or have only outdated or tenuous connections to non-academic employers.

Most educators simply resist the idea that instruction should be aligned to employment opportunities. Colleges have always positioned themselves to help students gain the skills they need to get a good fifth job, not necessarily a first job. Unfortunately, the labor market has changed: if you don’t get a good first job, you’re unlikely to get a good fifth job. And currently, around 45% of new college graduates are not getting good first jobs and find themselves underemployed.

In early August, EMSI, a provider of labor market analytics that is part of the Strada Education Network, released a study showing that our current system of post-secondary education is not providing linear paths to good first jobs, but rather a “crazy flow” or “swirl.” The report analyzed millions of graduates from six very different majors and found that graduates of all six are effectively going after the same jobs in sales, marketing, management, business and financial analysis.

Commenting on the study in Inside Higher Ed, experts concluded that straightening the swirl might require integrating actual work into academic programs. “This really makes a strong case for work-based learning,” said Jane Oates, a former official in the U.S. Department of Labor during the Obama administration and now president of WorkingNation. “Colleges and universities need to provide students with practice in the context of the workplace,” agreed Lynn Pasquerella, president of the Association of American Colleges and Universities.

Creating clearer pathways to good first jobs by connecting school and work becomes even more critical considering that a recent survey found 61% of full-time job postings seeking entry-level employees ask for at least three years of experience, and that summer employment for students remains near an all-time low. Against this backdrop, perhaps 45% underemployment for new graduates is as good as we can do.

New models are emerging to better connect school and work. New career services management platforms like Handshake offer much more functionality than legacy systems to connect students with employers recruiting on campus. Portfolium — a division of Instructure — allows students to create ePortfolios of their work and show their skills to employers.

Many colleges and universities have invested in experiential learning and work-study programs. Some schools do this better than others; Northeastern University offers the most comprehensive co-op program of any American institution. But few have been able to do it systematically, for the same reasons academic programs aren’t well-aligned to employer needs. That’s all changing with the rise of new marketplaces connecting students and faculty with real work from real employers.

One such marketplace is Parker Dewey. Named for progressive educator Francis Parker and philosopher John Dewey, Parker Dewey helps employers create “micro-internships”: real projects that employers need done but that can be outsourced to college students. In Parker Dewey’s micro-internship marketplace, the employer defines a project and sets a fixed fee for completing the work. Parker Dewey reaches students through career services postings and attracts applicants for the project; the employer then selects one or more students to do the work. The marketplace makes it easy for employers to try out students who may have no work experience, reducing “hiring friction”: employers’ reluctance to hire candidates who literally haven’t done the same job before, and the reason so many entry-level jobs seem to ask for experience.

Another marketplace that’s gained even more traction is Riipen, a platform that got its start in Canada connecting Canadian colleges and universities with employers, but is now growing rapidly in the U.S. While Riipen works with employers in a manner similar to Parker Dewey, its approach to colleges and universities is very different. Rather than going through career services, Riipen incorporates employer projects directly into college and university courses, thereby connecting employment and employability with the beating heart of colleges and universities: individual faculty.

Riipen’s three-sided marketplace of employers, educators and students appears to provide a more effective vehicle for gathering talent (and employers) on the platform; once faculty incorporate projects into their coursework — e.g. a professor of marketing adding a project reviewing and analyzing Google Ads data — the projects become mandatory and more students complete them. On Riipen, small and mid-size businesses tend to provide real-time projects, while larger companies have begun to re-use the same projects in a bid to test dozens or hundreds of students and recruit top performers. Over the past year, Riipen reports an order of magnitude increase in platform usage by employers, faculty and students.

New marketplaces like Riipen have the potential to be win-win-win-win. First, employers recruit better talent, and more reliably; content-valid simulations are more than twice as accurate as any other talent-screening mechanism or criterion. It’s also more cost-effective than attempting to recruit on campus. Second, universities augment career services and improve the employability of graduates, which should allow them to attract more students. Third, for the first time, faculty can easily incorporate real work projects into their courses — projects students will be energized to complete knowing there’s a real employer on the other end. And last but not least, students gain a way to stand out from the pack by exhibiting their abilities in a meaningful context, hopefully clearing a path to a good first job at the same employer or, if not, gaining valuable relevant work experience.

In a few years, as a result of marketplaces like Riipen, completing real work projects as part of an academic program should be commonplace. So there’s also a fifth winner from marketplaces that connect school and work: the overall economy. Millions of new college graduates will get relevant work experience, many more will find good first jobs and our workforce will be better positioned for the high-value jobs of today and tomorrow.

Will this tech close on never-ending real estate waiting periods?

The most anticipated part of every real estate transaction is being done with it.

Every seller looks forward to the moment of closing: the point at which all involved parties put a bow on the sale and the keys are handed over to the happy new owners. The closing is by nature the most complicated part of the proceedings. The task of tying together every loose end and officially sealing the deal can drag on and on, a pace somewhat at odds with today’s lightning-fast business environment.

There’s also the fact that, for an everyday homebuyer unaccustomed to the ins and outs of real estate purchases, the process shuts them out and leaves them waiting.

It’s a bit ironic that an industry that’s been streamlined through tech in so many ways has left its most complex aspect nearly untouched. Today’s real estate customer has an amazing number of tools at hand to make the process easier for them. They can trawl the web to find the ideal neighborhood for their new dwelling or business, skim through listings on multiple websites (complete with floor plans and detailed photo albums) and deal directly with sellers or landlords to eliminate traditional brokers’ fees. The front end of the process has become nearly painless. It’s the back end, getting to the finish line, where many find that the speed they were previously enjoying has slowed to a crawl.

Many readers may now be thinking: if closings are at once incredibly important and incredibly slow and opaque, isn’t there a better way? As with many other long-standing speed bumps in business and elsewhere, the tech community has come forth with several attempts at a solution. A number of competing startups have produced tools that claim to bring this month-plus process down to a much more manageable time frame. In 2018, two prominent contenders in this niche moved nearer the top: Modus, formed by vets of food-delivery startup Peach, and JetClosing, which closed a $20 million round of funding after two years of existence.

While each company’s founders can likely enumerate their differences better than I could, the services perform essentially the same task: shrinking closing times from the 44-day average down to a more manageable span. They do this thanks to the communicative powers of the internet, cutting down waiting time by enabling several different steps to happen at once.
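To make the intuition concrete, here’s a toy sketch with hypothetical steps and durations (they happen to sum to the 44-day average). Run one after another, the timeline is the sum of the steps; run in parallel, it collapses to the longest dependent chain:

```python
from functools import lru_cache

# Hypothetical closing steps: name -> (days, prerequisite steps).
steps = {
    "title_search":      (10, []),
    "inspection":        (7,  []),
    "appraisal":         (8,  []),
    "loan_underwriting": (15, ["appraisal"]),
    "escrow_signoff":    (4,  ["title_search", "inspection", "loan_underwriting"]),
}

@lru_cache(maxsize=None)
def finish_day(step: str) -> int:
    # A step finishes after its own duration plus its slowest prerequisite.
    days, prereqs = steps[step]
    return days + max((finish_day(p) for p in prereqs), default=0)

sequential = sum(days for days, _ in steps.values())  # 44 days, one at a time
parallel = max(finish_day(s) for s in steps)          # 27 days, in parallel
print(f"One step at a time: {sequential} days; steps in parallel: {parallel} days")
```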

Much of the closing process is done by rote, so there’s room for automation — as long as every step is open and transparent. The potential for legal hiccups is lessened when tech tools can ensure everyone is on the same page. JetClosing even throws in a title scoring system to sweeten the deal.

The big-bucks excitement over these companies’ potential is the clearest signal possible that the industry is on the precipice of a transformation. When it comes to title, escrow and the other financial details that snag real estate closings, the involved parties are often long-standing institutions with little interest in making their work more transparent. Thanks to the expected rise of closing-quickening tech, some hoary old standbys may be in for a rude awakening. And real estate may not be the only beneficiary of these new software rollouts.

It stands to reason that if these startups can fix the logjam in real estate closings, businesses of all types could take advantage of their process-management tools. The mind wanders: perhaps even onerous courtroom procedures could use an injection of smart technology to bring down waiting times for trial lawyers and defendants. If the legal hurdles involved in keeping a big-money real estate transaction both fast and transparent can be cleared, why not apply the same tactics to the achingly slow process of appeals courts? With cloud-based tech inhabiting more and more of public life, it’s not too far-fetched an idea.

In the world of real estate, we may finally be entering an era when the once-meandering part of the process is as easy as turning the keys for the first time. There’s sure to be near-universal interest in speeding up closings; after all, buyers and sellers alike are champing at the bit to either move into their new space or get paid for their former property. With so much competition to be the go-to closing service, it seems clear that no matter who wins that battle, both buyers and sellers will end up feeling victorious.

2019 tech IPOs: Some thoughts from the public company roller coaster

2019 has already been an active year for U.S. tech IPOs. Some highly anticipated unicorns, such as Uber and Lyft, have disappointed investors with their IPO debuts and their first results as public companies. Others, such as Fiverr, Zoom and CrowdStrike, have soared. And food-tech brand Beyond Meat (two words you don’t normally see together) hit a high of $239 from its $25 IPO price.

The first of these 2019 tech IPO companies will soon face a new challenge as the early investor and employee lockups expire — often 180 days after the IPO — allowing insiders to sell and increasing the number of shares available to trade. Lyft, at the front of the 2019 pack, will be the first to see more of its stock come into play on the public market. Regardless of what happens next, it’s amazing to see the trajectory of companies that have built such impressive businesses in such a remarkably short period of time.

I was recently at the New York Stock Exchange (NYSE) to ring the opening bell and celebrate our three-millionth borrower on the platform. It brought back great memories from when our company, LendingClub, entered the public fray in 2014. LendingClub was the largest U.S. tech IPO that year, and it remains one of the biggest U.S. tech IPOs of all time. We listed at a $5.4 billion valuation, and our shares surged 67% on the first day of trading. We were thrilled to celebrate the validation of our hard work and excited about the next stage of our growth. By the time our lockups expired, however, we had fallen back to around our IPO price of $15 a share.

Since then, despite being the market leader in the fastest-growing sector of consumer credit in the country with double-digit annual growth, the company today is worth less than a fifth of what it was in 2014. Our story is thankfully unique, and I’ll spare you the details here, but suffice to say… we had a rough period. We are back on track now, delivering growth and margin expansion while executing against our vision.

However bespoke our story, there are some observations I’ll share that might be useful for others as they think about life post-IPO. I’m not going to cover the issues around short-termism and the tyranny of quarterly targets (which have been well-documented elsewhere), but rather a few of the implications that sure would have been useful for me to know going in…

Things will be different — really

I’d compare the period leading up to the IPO to the period when you are expecting a baby. Intellectually, you know things will be different when you bring home a newborn. But knowing it and living it are two different things. Going public is a transformational event that permanently changes your company and how the CEO, CFO and board spend their time (with obvious trickle-down effects). From the moment we rang the NYSE bell on December 11, 2014, everything changed.

Making money matters

Investors buying your stock are essentially valuing your future cash flow. At some point, you have to have your “show them the money” moment and become profitable. Amazon famously lost a total of $2.8 billion over 17 straight quarters after its IPO and was the subject of plenty of skepticism and criticism throughout. The company stuck to its strategy, delivering top-line growth and investing in its future and, suffice to say, investor patience paid off!

At LendingClub, we have invested millions of dollars to develop products that delight our 3 million+ customers (and, at 78, our NPS is at its highest level in the history of the company) and expand our competitive moat. We are now driving toward adjusted net income profitability.

Like it or not, there is a scoreboard

Once you go public, some people stop thinking of you as a business, and start thinking about you as a stock price. And that stock price is always broadcasting. It broadcasts to your equity investors, your employees, your partners, your board — to everyone who is listening.

When the stock is up, everyone feels great. But in a volatile market or a downturn, a lot of people will need to hear your view on what’s happening. Communicating with your stakeholders doesn’t get in the way of doing your job; it is a critical part of your job, and one that just got A LOT bigger. You need to stay ahead of it and deliberately carve out the time to make it a priority.

There are others sharing the microphone

When you are starting out, the world is divided into two types of people: those who love you, and those who don’t know or care. When you are a public company, a lot of voices join the conversation. You’ll pick up a new beat of reporters focused on your financials. You’ll have analysts who are paid to research and think about your company, your strategy, your prospects and your value. These analysts may have never covered a company quite like yours (after all, you are breaking new ground), and you’ll need to spend time together to understand what matters.

You can also attract a whole new kind of investor: the “short,” who has a vested interest in your stock going down. All of these voices are speaking to your stakeholders, and you need to understand what they are saying and how it should affect your own communications.

Be careful, the microphone is on

Remember those days when everyone attended the “all hands” and you could share the details of your product road map, your corporate strategy, what’s working and what isn’t? Yeah, those are over. The risk of material nonpublic information leaking means you need to find a new balance in transparency with your employees (and your friends and partners for that matter).

It’s a change to behavior and to culture that doesn’t come naturally (at least it didn’t to me). It’s a change that can be frustrating to employees, because the necessary opacity can erode trust when people feel out of the loop. At LendingClub, we still communicate as much as we can and trust our employees, but there are places where you have to draw the line.

Your competitors are listening

Ironically enough, while your ability to share key details with employees is limited, you are sharing a lot with your competition. Shareholders and money managers want to know your battle plans and expect a detailed update at your earnings call every quarter. You can expect that your competitors are taking notice and taking notes.

Your scarcest resource

As the above would indicate, being public means that you are inevitably going to be spending less time running the business, and more time focused externally. Not a bad thing, but something you need to plan for so that you have the resources in place underneath you to maintain business momentum. If your management team isn’t materially different as you head to the market than it was a few years ago, I’d be surprised if you have what you need.

Your culture will change, focus on your values

I once asked a senior Google executive for advice on how to preserve culture through massive periods of transition. She told me that you can’t preserve your culture, but you can and must maintain the values your company holds dear. Her advice, which I have followed and am passing on to you, is to write them down, hire against them and assess performance against them.

We started this practice years ago and it is remarkable how consistent our values have remained even as the company has evolved and matured. We codified six core values that put the customer at the center of everything we do. We are guided by our No. 1 value — Do What’s Right. You know a LendingClubber when you meet them, and it is part of what makes us great.

Being a public company is not for the faint of heart, but it’s part of growing up. Being public legitimizes the company, unlocks liquidity to fuel growth and enables you to attract the next generation of talent. We always said that going public would allow us to deliver more value to a greater number of consumers and would lend legitimacy to our growing industry. We have facilitated more than $50 billion in loans and are still at a small percentage of our immediately addressable market. Although challenging at times, we’re seeing our dream of truly helping everyday Americans come to life.

We’ve worked hard since our IPO to change the face people associate with finance. We’ve built a diverse team, established strong core values and nurtured a culture that has resulted in the kind of company we want to represent fintech and the tech industry as a whole — both inside and outside Silicon Valley.

So, to the new joiners in the public sphere — life in the spotlight is a wild ride. Congratulations on this step in your journey, and on to the next!

Why ‘one app to rule them all’ is not the future of digital health

“One app to rule them all” is a compelling idea if you’re a healthcare giant.

In this imagined model, patients flock to one comprehensive user experience for all their healthcare needs, from insurance and scheduling to lab results and disease management. And the healthcare giant, which has developed or acquired its way to market dominance, now has the ability to guide patients to their preferred providers, treatments and services.

But “one app to rule them all” is flawed. Not only does it ignore the way people use technology, it puts the patient experience second. Forcing patients to use one app for every healthcare interaction disregards the complexity and specificity of individual diseases and patient profiles.

Until now, we haven’t had to reckon with the “one app” problem, because most healthcare experiences have been relatively disparate. But as payers, providers, pharmacies, pharma and digital health all race to create digital experiences and solidify their offerings, we will soon have to confront this question: Will we end up with separate disease management apps that work together in a best-of-breed ecosystem model, or will several large players dominate with one app to rule them all?

How we got here

Up until now, the question of who would control the user experience wasn’t a material issue in the market. Most apps didn’t have a large enough user base to encroach on their competitors, and organizations created “single function” experiences for a specific problem space.

Payers created tools to let you look up coverage and find providers; health systems allowed you to book appointments and see your EMR and lab data; pharmacies allowed you to refill prescriptions; pharmaceutical companies made apps to support their specific medicines; and most successful digital health companies focused on individual diseases and specific use cases.

But now that digital health has taken root and patient adoption is expanding, these “single function” experiences are starting to bump into each other. And organizations are reacting to this change in different ways.

Many health systems and payers recognize this as an opportunity to create an ecosystem model, taking best-of-breed solutions for each core patient need and connecting them through identity and data linking so they work seamlessly together.

But alarmingly, some large organizations see this as an opportunity to assert more control over the patient experience.

They’re attempting to develop “one app to rule them all” solutions, with one user experience covering all patients with all possible diseases.

At Propeller, we recently “broke up” with a customer who couldn’t see past their vision for a single dominant app to rule the marketplace.

It was a tough decision, but one I felt we had to make. Here’s why.

Why “one app to rule them all” results in a worse patient experience

The advantage of “one app to rule them all” is obvious on its face. Patients would have fewer apps to download and engage with — an advantage that seems even more pronounced for complex, comorbid patients. Organizations would also be able to guide patients to their preferred providers, treatments and services via a single app.

But there’s a significant problem with this approach.

When one platform tries to excel in a vast number of areas, it usually ends up doing them all badly. If you’ve used a leading marketing software platform that I won’t name, you know this to be true. And healthcare is even more difficult, because it’s at once more complex and more personal. It turns out it is pretty easy to build a complex and complicated product, but it is very hard to build a simple one, especially with a multitude of inputs and use cases.

All my experience in digital health has told me this: It’s very difficult to build an engaging and useful user experience in one disease state, let alone across multiple disease states within one experience.

Enrolling users is hard. Keeping them engaged is hard. Improving specific clinical outcomes — and proving it continuously — is especially hard. Making a great product requires an obsessive focus on a specific user and problem space, as well as relentless experimentation and iteration. When you don’t have that singular focus, the needs of the patient are deprioritized compared to the needs of the organization, and the user suffers.

To make this approach worthwhile, you’d have to believe that the convenience of one app would make up for a worse user experience by driving higher enrollment or retention rates. You’d have to believe that user experience simply doesn’t matter as much as the convenience of an all-in-one platform.

I don’t believe that. Healthcare isn’t fast food. People’s humanity, dignity and lives are at stake, and they deserve our obsessive focus on an experience built specifically for them.

Why a “best of breed” ecosystem approach is best

We’ve learned a lot in the last 20 years about how people prefer to use technology. If you want evidence that “best of breed” is the future, you only need to look to the other industries experiencing digital transformation.

For example, look at Software-as-a-Service (SaaS). It’s dominated by a large number of specific solutions that work together through identity (OAuth) and data integrations (APIs).

It’s a similar story in consumer apps. There is no single fitness app, travel app or communication app. In entertainment, many tried to become the “one app,” but instead we have witnessed a proliferation of vertical content providers with individual subscriptions, such as Netflix, HBO, Disney and ESPN, all of which work together seamlessly on your Apple TV. Even when a major company acquires or develops new solutions, it often keeps them separate in terms of user experience. Facebook, for example, keeps Messenger, Instagram and WhatsApp unbundled instead of folding them all into Facebook.

We as consumers are comfortable using specific solutions to solve specific problems, and we want them all to work together with ease. Often, we find that when a major player branches into more and more solutions because it wants our total business, each solution becomes more shoddily made, less intuitive and more poorly supported.

Now, this situation is not necessarily universal. In China, products like Tencent’s WeChat have expanded across multiple healthcare verticals, backed by very different market dynamics (both in healthcare and in technology). Yet even WeChat looks to third parties with best-of-breed solutions to grow its ecosystem, in addition to building multiple solutions itself.

What the future looks like

The future I envision may not feature a single app, but neither is it complicated.

In this future, patients use a core clinical app, likely provided by their health system or primary care provider, that takes care of clinical interactions like scheduling, clinical data, reminders and follow-ups.

Beyond that, patients use a set of specific apps that specialize in particular health issues — for example, respiratory disease, diabetes, mental health, increasing activity or improving sleep. Those apps will rise to the top because they’re the best on the market at managing those issues. The experience of managing your mental health will feel different than managing your diabetes, just as using Instagram feels different than using Facebook.

In this ecosystem model, the patient’s core clinical app will link out to and connect to the problem-specific solutions. Health systems and physicians will adopt a small number of specialized platforms and products to focus on large clinical domains like cardiovascular, diabetes, respiratory and mental health. Data from these solutions will integrate back to the provider’s organization and will be available in the EMR and for population health management.
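As a sketch of what that integration might look like: many EMRs already expose data over FHIR, the HL7 standard for exchanging clinical data, so a problem-specific app could push readings back to the provider’s system. The endpoint, token and IDs below are hypothetical, and a real deployment would authorize via SMART on FHIR rather than a hard-coded token:

```python
import requests

FHIR_BASE = "https://emr.example-hospital.org/fhir"  # hypothetical endpoint
TOKEN = "hypothetical-oauth-access-token"

# A blood-glucose reading from a hypothetical diabetes app, expressed as a
# FHIR Observation resource (LOINC 2339-0 is glucose, mass/volume in blood).
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org", "code": "2339-0",
                         "display": "Glucose [Mass/volume] in Blood"}]},
    "subject": {"reference": "Patient/example-patient-id"},
    "valueQuantity": {"value": 104, "unit": "mg/dL",
                      "system": "http://unitsofmeasure.org", "code": "mg/dL"},
}

resp = requests.post(
    f"{FHIR_BASE}/Observation",
    json=observation,
    headers={"Authorization": f"Bearer {TOKEN}",
             "Content-Type": "application/fhir+json"},
)
resp.raise_for_status()
print("EMR accepted the reading, resource id:", resp.json().get("id"))
```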

We’ll end up with a diverse ecosystem of solutions, each the best in their vertical, delivering a tailored user experience based on the needs of the specific patient and provider type.

And patients will be better off for it.

Who gets to own your digital identity?

“On the internet, nobody knows you’re a dog,” read the caption of the legendary New Yorker cartoon that captured the spirit of privacy and anonymity in the early days of the internet. Anonymity is still a hot topic and still sought after online, but times have changed. With the rise of online banking, social media, e-commerce and peer-to-peer services, a verified digital identity is a crucial ingredient in making any digital platform succeed.

Banking is one of the areas where the ability to verify one’s identity in a secure and compliant manner is a prerequisite for accessing basic services. Looking at the unbanked population of the world today, it is estimated that as many as 1.5 billion people lack access to everyday banking services because they cannot prove their identity through a valid birth certificate, passport, proof of residency such as a utility bill or some other means of satisfying traditional KYC procedures.

In addition to accessing digital banking, most of us have also verified our identity through a plethora of services (Google, Facebook, Blizzard, the list goes on) via various means of identity verification that make up an interlinked web of interdependencies, where one of your identities vouches for your eligibility to access another service. Two-factor authentication and biometric identification often rely on your mobile phone, and when you choose to log in with Facebook, you authorize Facebook to represent you online. While this is convenient for quick access to the latest mobile app you want to try out, you are paying a price: you are allowing Facebook to share and sell not only your data but also your digital identity.

However, your digital identity is more than your login credentials; those are merely the authentication that connects you with the digital you. Your digital identity consists of thousands of data points that make up a profile of who you are and your preferences. Today, that identity is scattered all over the internet: Facebook owns our social identity, retailers own our shopping patterns, credit agencies hold our creditworthiness, Google knows what we have been curious about since the dawn of the internet and our banks own our payment history. As a result, we are all analyzed in detail to predict our future behavior and monetize our digital identities.

Not only do we lack ownership of our own data, but our fragmented digital identities, with various third parties owning bits and pieces, give only part of the picture, and they create vulnerabilities for those third parties. As an example, fraudsters have started to take advantage of this in countries with no national identifier by creating synthetic digital identities: signing up for digital services and applying for credit. Even though the initial credit application is rejected, a credit file is automatically created, leaving a digital paper trail for a nonexistent person. With approximately 10 million new consumer credit files generated in the U.S. each year, synthetic identities can be very difficult to detect. Over time, these synthetic identities gain access to credit, and bank losses due to synthetic fraud are estimated to amount to somewhere between $1 billion and $2 billion each year.

In the wake of numerous exposures of how our data is exploited, with Cambridge Analytica as the most notable example, privacy is becoming an increasing concern for the public as well. Apple seeks to leverage this attention to digital privacy by taking a radically different approach than its counterparts with “Sign in with Apple,” where privacy is the main selling point for using its service instead of Google’s or Facebook’s.

Blockchain is often proposed as the silver bullet for all our digital identity needs, an idea that has caught the attention of Mark Zuckerberg, who has publicly weighed what he sees as the pros and cons of a decentralized approach to digital identity. As Facebook represents a quintessential man in the middle, losing ownership of all our identities is most likely the biggest con of a decentralized approach in the eyes of Zuckerberg.

With the upcoming launch of Facebook’s cryptocurrency, Libra, the company has the potential to further strengthen its position as a leading provider of a global digital identity solution. While most attention is directed toward the cryptocurrency itself, many point to the decentralized identity effort associated with Libra as the most interesting aspect of Facebook’s plans. A passage hidden away near the bottom of the documentation states: “An additional goal of the association is to develop and promote an open identity standard. We believe that decentralized and portable digital identity is a prerequisite to financial inclusion and competition.”

A consolidated and verified digital identity would be beneficial to both users and providers of digital services. However, allowing Facebook or The Libra Association to be the custodian of our consolidated digital identity is a sinister trail for the future of both privacy and democracy.

On the other hand, the Holy Grail of decentralized identity, often called self-sovereign identity, has its weaknesses, namely ourselves as human beings. We tend to be forgetful, and sometimes downright unreliable. Letting users keep the only key to their digital identities is a recipe for disaster the moment someone forgets their password or passes away. There is nobody to call and no “Forgot password” button to reclaim ownership of the identity.
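A toy example makes the trade-off concrete. In a self-sovereign scheme the identity effectively is a key pair: anyone can verify a claim the holder signs, with no platform in the middle, but nothing can restore a lost private key. (This illustrates the mechanics only, not any particular identity standard; it uses Python’s cryptography package.)

```python
# pip install cryptography
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The user generates a key pair. The private key never leaves their device.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # shared with the world: "this is me"

# The user signs a claim; any service can verify it against the public key,
# with no Facebook or Google in the middle vouching for them.
claim = b"I am the holder of this identity and I consent to share my email."
signature = private_key.sign(claim)

try:
    public_key.verify(signature, claim)  # raises if the claim was forged
    print("Claim verified: signed by the identity holder.")
except InvalidSignature:
    print("Forged claim rejected.")

# The catch described above: lose the private key and there is nobody to
# call and no "Forgot password" button. The identity is unrecoverable.
del private_key
```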

It is difficult to envision a future of digital identity without some kind of identity custodian that maintains a verified connection between your physical and digital self, ensures that no data is used without consent, monitors malicious behavior and provides user support in case of a lost key. This is far from an easy solution, and it should be provided by a regulated entity. One thing is for sure: such a solution relies on trust and must give the end user full ownership of their own data, similar to data portability under GDPR.

There is too much at stake when it comes to our digital identities to remain complacent about what is going on, as shown numerous times through both data breaches that compromise our personal data and the manipulation of public opinion through social media.

No matter which technology or appointed custodian we deploy to solve this, our identities should belong to we the people rather than one corporation or consortium of corporations that seek to exploit our data for profit.

The American AI Initiative: A good first step, of many

The path to general AI — and possibly superintelligence — is being paved before our eyes. And as AI permeates society, the social and economic value of the technology keeps rising. In turn, harnessing and leveraging it needs to extend beyond the interests of venture capitalists, investment groups and entrepreneurs and become a priority on a geopolitical scale.

When the global economy starts to feel the shift ushered in with mass-adoption of AI, the United States needs to be leading the charge as opposed to chasing the pack.

If the U.S. is to compete on a global level, it will face an arms race of sorts from a litany of nations that are already doubling down on the massive advantages that come with national AI proficiency. In fact, 18 different countries have launched national AI strategies, with government funding ranging from $20 million to almost $2 billion.

A first step in the right direction was taken by the Trump administration recently when the president signed an executive order launching the American AI Initiative. This policy will funnel federal funding and resources toward AI-specific research while also implementing U.S.-led international AI standards. Additionally, the program will call for new research into increasing AI literacy in American workers.

Unfortunately, there are no specifics around what the new program actually looks like in practice, and no additional funding is being dedicated to AI development. There are no timelines for implementing these initiatives, only a vague goal of roughly six months before a detailed plan is rolled out. Jason Furman, a Harvard professor who helped draft the Obama administration’s report on AI, said that the plan had “all the right elements” but was also “aspirational with no details and is not self-executing.”

Yet the importance of government involvement in AI R&D cannot be overstated. If we remain on the current path, with large technology companies and VC firms funding the bulk of AI research, the country will see pockets of growth around the largest technology companies while other regions continue to stagnate. We would not be able to work on major moonshot projects or collectively pool our resources for the greater good across all regions of the U.S. Innovations would be tightly controlled by technology companies, and adoption would never rise enough to actually make a difference in the way we use AI. The result would be a marginal talent pool, and new developments would be those of technology innovators, not problem-solvers. Everything would be driven by its contribution to business rather than its contribution to society at large.

So, government involvement matters, yet the administration can’t be solely responsible for catalyzing the change needed by the American workforce — it falls on us as well. So that begs the question…

How can the private sector build on what the federal government has put in place?

The program focuses on five key pillars: research and development, infrastructure, governance, workforce and international engagement. As Furman said, those concepts are all well and good, but they remain vague and undefined. Still, even if the administration’s program isn’t hitting the ground running, that doesn’t mean you and I can’t push the ball in the right direction. So how can we as a workforce help execute on the program? What do we need to do to enact the ideals the federal government is focused on in AI?

Focus on building AI literacy in American workers

Until the American workforce itself is concerned with being AI-first, we will see challenges in implementation, adoption and deployment, and AI literacy will be largely confined to the areas in which it’s already being heavily used (automation, customer service, insights, engagement, etc.).

Additionally, these industries aren’t even using AI to solve problems or improve society; they are largely using it as an autopilot. And if AI is used simply to automate processes for tech companies, then we’re missing the opportunity to use it to its full advantage to solve real societal issues around hunger, poverty and healthcare.

And the focus needs to extend beyond the workforce and into the classroom. All STEM programs in American schools need AI-based coursework. Universities need AI-focused programs and intelligence labs, as at MIT, where roughly 25% of faculty conduct research on intelligence in labs like the MIT-IBM Watson AI Lab, the Robust Robotics Group and the Laboratory for Information and Decision Systems (LIDS).

Our academic institutions and research centers would then continue to thrive as centers of excellence around the world, attracting the best and brightest minds and keeping our talent pool stocked. Our universities would see enrollment grow in AI and digital programs, as those roles become the gold standard.

Startups need to swarm and work closely with federal AI strategy

While I hate to use clichés, this is a “teamwork makes the dream work” situation. Aligning the startup community with government strategy would allow innovation and social good to walk hand-in-hand when it comes to AI development.

That alignment would help the U.S. lead in new space technologies, spur innovation for society in food, energy and health, and create a lifestyle that balances efficiency and leisure. It would also allow American corporations to go after disruptive, breakthrough innovation. From a government perspective, this means continuing to provide open and structured data sets for the public to use while still protecting the sensitive information that keeps our citizens safe. Providing these data sets is the first step, but making others aware of them through education campaigns is just as important.

Make AI all-inclusive

Much the way IT experts, coders and web/app developers had to learn to work side-by-side with business owners, marketers and production-level employees over the last two and a half decades, we must bridge the gap between AI experts, technologists and leading technology companies on one side and solutions owners, general SMBs and corporate America on the other to develop an inclusive and universally understandable AI strategy.

The advancement of machine learning models, specifically deep learning, relies on the ingestion of data — structured or unstructured. The sharing of this data, from people involved in day-to-day problems and solutions to technologists who are concerned with the big picture, is the key to developing innovative and inclusive AI solutions. A better AI future built on diverse data sets requires both parties to work collaboratively.
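
To make that concrete, here is a toy sketch of what ingesting structured and unstructured data into a single model can look like. The data, column meanings and maintenance-prediction task are invented purely for illustration:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Structured data: numbers a line-of-business team already tracks (invented).
device_ages = np.array([[34], [51], [29], [62]])

# Unstructured data: free-text notes from the same day-to-day work (invented).
notes = ["device overheated twice", "routine checkup, no issues",
         "battery drained quickly", "repeated sensor faults"]
labels = [1, 0, 1, 1]  # 1 = needs service

# Turn the text into numeric features, then combine with the structured column.
text_features = TfidfVectorizer().fit_transform(notes).toarray()
features = np.hstack([device_ages, text_features])

# One model now learns from both kinds of data at once.
model = LogisticRegression().fit(features, labels)
print(model.predict(features))
```

The point of the sketch is the hand-off: the structured column and the free-text notes come from people close to the day-to-day problem, while the feature engineering and model belong to the technologist, and neither side can build the system alone.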

Data is arguably the most valuable commodity on Earth, and the countries that win the race to harness it to its maximum value and efficiency will position themselves favorably around the globe. If America is to win that race, it will take buy-in from the public, private and government entities in our country alike. If we are to move past improving our viewing patterns on Netflix and start solving the brass-tacks issues in our society, it will come as a result of the convergence of government, society and business.

The renaissance of silicon will create industry giants

Every time we binge on Netflix or install a new internet-connected doorbell in our home, we’re adding to a tidal wave of data. In just 10 years, bandwidth consumption has increased 100-fold, and it will only grow as we layer on the demands of artificial intelligence, virtual reality, robotics and self-driving cars. According to Intel, a single robo car will generate 4 terabytes of data in 90 minutes of driving, roughly the amount that 3,000 typical internet users generate in an entire day of chatting, watching videos and engaging in other online pastimes.
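
A quick back-of-the-envelope check of that comparison, assuming Intel’s oft-cited estimate of roughly 1.5 GB of data per internet user per day (an assumption, not a figure from this article):

```python
# Back-of-the-envelope check of the robo-car data claim.
CAR_DATA_GB = 4_000       # 4 TB generated in 90 minutes of driving (from Intel)
PERSON_GB_PER_DAY = 1.5   # assumed daily data footprint of one internet user

equivalent_people = CAR_DATA_GB / PERSON_GB_PER_DAY
print(f"One 90-minute drive ~= the daily data of {equivalent_people:,.0f} people")
# -> One 90-minute drive ~= the daily data of 2,667 people
```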

Tech companies have responded by building massive data centers full of servers. But growth in data consumption is outpacing even the most ambitious infrastructure build-outs. The bottom line: we’re not going to meet the increasing demand for data processing by relying on the same technology that got us here.

The key to data processing is, of course, semiconductors, the transistor-filled chips that power today’s computing industry. For the last several decades, engineers have been able to squeeze more and more transistors onto smaller and smaller pieces of silicon; a single Intel chip today packs billions of transistors onto a die the size of a fingernail.

This trend is commonly known as Moore’s Law, for the Intel co-founder Gordon Moore and his famous 1965 observation that the number of transistors on a chip doubles every year (later revised to every two years), thereby doubling the speed and capability of computers.
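
To see what that doubling implies, here is a minimal sketch. The 1971 Intel 4004, with roughly 2,300 transistors, is the standard starting point; the strict two-year doubling is an idealization of Moore’s observation:

```python
# Idealized Moore's Law: transistor count doubles every two years.
# N(t) = N0 * 2 ** ((t - t0) / 2)

N0, T0 = 2_300, 1971  # Intel 4004: ~2,300 transistors, released in 1971

def transistors(year: int) -> float:
    """Projected transistor count under a strict two-year doubling."""
    return N0 * 2 ** ((year - T0) / 2)

for year in (1971, 1991, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
# The projection reaches the billions by the 2010s, roughly in line
# with real chips, which is why the "law" held up for so long.
```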

This exponential growth of power on ever-smaller chips has reliably driven our technology for the past 50 years or so. But Moore’s Law is coming to an end, thanks to a more immutable law: material physics. Transistors have shrunk so close to atomic scale that it is no longer practical to squeeze more of them onto the tiny silicon dies that make up today’s processors.

Compounding matters, the general-purpose x86 chip architecture that brought us to this point isn’t optimized for the computing applications now becoming popular.

That means we need a new computing architecture. Or, more likely, multiple new computing architectures. In fact, I predict that over the next few years we will see a flowering of new silicon architectures and designs built and optimized for specialized functions: data-intensive workloads, the performance needs of artificial intelligence and machine learning, and the low-power needs of so-called edge computing devices.

The new architects

We’re already seeing the roots of these specialized architectures on several fronts. These include Graphics Processing Units from Nvidia, Field Programmable Gate Arrays from Xilinx and Altera (acquired by Intel), smart network interface cards from Mellanox (acquired by Nvidia) and a new category of programmable processor called the Data Processing Unit (DPU) from Fungible, a startup Mayfield invested in. DPUs are purpose-built to run all data-intensive workloads (networking, security, storage), and Fungible combines them with a full-stack platform for cloud data centers that works alongside the old workhorse, the CPU.

These and other purpose-designed chips will become the engines for one or more workload-specific applications — everything from security to smart doorbells to driverless cars to data centers. And there will be new players in the market to drive these innovations and their adoption. In fact, over the next five years, I believe we’ll see entirely new semiconductor leaders emerge as these services grow and their performance becomes more critical.

Let’s start with the computing powerhouses of our increasingly connected age: data centers.

More and more, storage and computing are being done at the edge, meaning closer to where our devices need them. Think of the facial recognition software in our doorbells, or cloud gaming rendered for our VR goggles. Edge computing allows these and other processes to happen within 10 milliseconds or less, which makes them workable for end users.
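
To see why proximity matters, consider propagation delay alone: even before routing and processing overhead, distance eats into a 10-millisecond budget. A rough sketch, where the distances and the two-thirds-of-light-speed figure for fiber are illustrative assumptions:

```python
# Rough round-trip propagation delay over optical fiber. This ignores
# routing, queuing and processing time, which add substantially in practice.

SPEED_IN_FIBER_KM_S = 200_000  # light travels at roughly 2/3 of c in fiber

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1_000

for label, km in [("edge node, 50 km away", 50),
                  ("regional data center, 1,000 km away", 1_000),
                  ("cross-country data center, 4,000 km away", 4_000)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
# 50 km -> ~0.5 ms; 4,000 km -> ~40 ms, already well past a 10 ms budget.
```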

The x86 CPU architecture was built around general-purpose arithmetic, so deploying data services at scale, or at larger volumes, can be a challenge. Driverless cars need massive, data-center-level agility and speed. You don’t want a car buffering when a pedestrian is in the crosswalk. As our workload infrastructure — and the needs of things like driverless cars — becomes ever more data-centric (storing, retrieving and moving large data sets across machines), it requires a new kind of microprocessor.

Another area that requires new processing architectures is artificial intelligence, both in training AI and in running inference (the process AI uses to draw conclusions from data, like a smart doorbell recognizing the difference between an in-law and an intruder). Graphics Processing Units (GPUs), originally developed to handle gaming, have proven faster and more efficient than traditional CPUs at both AI training and inference.

But to process AI workloads (both training and inference) for image classification, object detection, facial recognition and driverless cars, we will need specialized AI processors. The math behind these algorithms requires vector processing and floating-point computations at dramatically higher performance than general-purpose CPUs provide.
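
To make “vector processing” concrete, here is a minimal sketch contrasting a scalar Python loop with a vectorized matrix multiply, the core operation of neural networks. The matrix size is arbitrary, and this illustrates the programming model rather than any particular chip:

```python
import time
import numpy as np

# A neural-network layer is, at its core, a big floating-point matrix multiply.
n = 128
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

def matmul_scalar(a, b):
    """One multiply-add at a time, as a simple general-purpose core would."""
    out = np.zeros((n, n), dtype=np.float32)
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += a[i, k] * b[k, j]
            out[i, j] = s
    return out

t0 = time.perf_counter()
matmul_scalar(a, b)
scalar_s = time.perf_counter() - t0

t0 = time.perf_counter()
_ = a @ b  # vectorized: whole batches of multiply-adds at once
vector_s = time.perf_counter() - t0

print(f"scalar loop: {scalar_s:.3f} s, vectorized: {vector_s:.5f} s")
# The vectorized version is typically thousands of times faster; GPUs and
# AI-specific chips push the same idea much further in dedicated hardware.
```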

Several startups are working on AI-specific chips for machine intelligence, including SambaNova, Graphcore and Habana Labs. Their chips lower the cost of accelerating AI applications and dramatically increase performance, and, conveniently, they also come with a software platform for use with the hardware. Of course, the big AI players, like Google (with its custom Tensor Processing Unit chips) and Amazon (which has created an AI chip for its Echo smart speaker), are also creating their own architectures.

Finally, we have the proliferation of connected gadgets, also known as the Internet of Things (IoT). Many of our personal and home tools (such as thermostats, smoke detectors, toothbrushes and toasters) operate on ultra-low power.

ARM processors, a family of CPUs, will be tasked with these roles, because such gadgets don’t require computing complexity or much power. The ARM architecture is well suited to them: it handles a smaller set of computing instructions, can operate at high speeds (churning through many millions of instructions per second) and does so at a fraction of the power required for performing complex instructions. I even predict that ARM-based server microprocessors will finally become a reality in cloud data centers.

So, with all the new work being done in silicon, we seem to be getting back to our roots. I commend the entrepreneurs who are putting the silicon back into Silicon Valley. And I predict they will create new semiconductor giants.