Oracle could be feeling cloud transition growing pains

Oracle is learning that it’s hard for enterprise companies born in the data center to make the transition to the cloud, an entirely new way of doing business. Yesterday it reported earnings, and the results were a mixed bag, made harder to parse by a change in the way the company counts cloud revenue.

In its earnings press release yesterday, the company put it this way: “Q4 Cloud Services and License Support revenues were up 8% to $6.8 billion. Q4 Cloud License and On-Premise License revenues were down 5% to $2.5 billion.”

Let’s compare that with the language from its Q3 report in March: “Cloud Software as a Service (SaaS) revenues were up 33% to $1.2 billion. Cloud Platform as a Service (PaaS) plus Infrastructure as a Service (IaaS) revenues were up 28% to $415 million. Total Cloud Revenues were up 32% to $1.6 billion.”

See how the company broke out cloud revenue loudly and proudly in March, yet chose to combine it with license revenue in June?

In the post-reporting earnings call, Safra Catz, Oracle Co-CEO, responding to a question from analyst John DiFucci, took exception to the idea that the company was somehow obfuscating cloud revenue by reporting it in this way. “So first of all, there is no hiding. I told you the Cloud number, $1.7 billion. You can do the math. You see we are right where we said we’d be.”

She says the new reporting method is due to new combined licensing products that let customers use their licenses on premises or in the cloud. Fair enough, but if your business is booming, you probably want to let investors know about that. They seem uneasy about this approach, with the stock down more than 7 percent today as of publication.

Oracle Stock Chart: Google

Oracle could, of course, settle all of this by spelling out its cloud revenue, but it chose a different path. John Dinsdale, an analyst with Synergy Research, a firm that watches the cloud market, was dubious about Oracle’s reasoning.

“Generally speaking, when a company chooses to reduce the amount of financial detail it shares on its key strategic initiatives, that is not a good sign. I think one of the justifications put forward is that [it] is becoming difficult to differentiate between cloud and non-cloud revenues. If that is indeed what Oracle is claiming, I have a hard time buying into that argument. Its competitors are all moving in the opposite direction,” he said.

Indeed most are. While it’s often hard to tell the exact nature of cloud revenue, the bigger players have been more open about it. For instance, in its most recent earnings report, Microsoft reported that its Azure cloud revenue grew 93 percent. Amazon reported that its cloud revenue from AWS was up 49 percent to $5.4 billion, getting very specific about the number.

Further, as you can see from Synergy’s most recent cloud market share and growth numbers from the fourth quarter of last year, Oracle was lumped in with “the Next 10,” not large enough to register on its own.

That Oracle chose not to break out cloud revenue this quarter can’t be seen as a good sign. To be fair, we haven’t really seen Google break out its cloud revenue either, with one exception in February. But when the companies at the top of the market shout about their growth, and the ones further down don’t, you can draw your own conclusions.

Salesforce deepens data sharing partnership with Google

Last fall at Dreamforce, Salesforce announced a deepening friendship with Google. That began to take shape in January with integration between Salesforce CRM data and Google Analytics 360 and Google BigQuery. Today, the two cloud giants announced the next step: the companies will share data between Google Analytics 360 and the Salesforce Marketing Cloud.

This particular data sharing partnership makes even more sense as the companies can share web analytics data with marketing personnel to deliver ever more customized experiences for users (or so the argument goes, right?).

That connection certainly didn’t escape Salesforce’s VP of product marketing, Bobby Jania. “Now, marketers are able to deliver meaningful consumer experiences powered by the world’s number one marketing platform and the most widely adopted web analytics suite,” Jania told TechCrunch.

Brent Leary, owner of the consulting firm CRM Essentials says the partnership is going to be meaningful for marketers. “The tighter integration is a big deal because a large portion of Marketing Cloud customers are Google Analytics/GA 360 customers, and this paves the way to more seamlessly see what activities are driving successful outcomes,” he explained.

The partnership involves four integrations that effectively allow marketers to round-trip data between the two platforms. For starters, consumer insights from both Marketing Cloud and Google Analytics 360 will be brought together into a single analytics dashboard inside Marketing Cloud. Conversely, Marketing Cloud data will be viewable inside Google Analytics 360, both for attribution analysis and to use the Marketing Cloud information to deliver more customized web experiences. All three of these integrations will be generally available starting today.

A fourth element of the partnership being announced today won’t be available in beta until the third quarter of this year. “For the first time ever audiences created inside the Google Analytics 360 platform can be activated outside of Google. So in this case, I’m able to create an audience inside of Google Analytics 360 and then I’m able to activate that audience in Marketing Cloud,” Jania explained.

An audience is like a segment, so if you have a group of like-minded individuals in Google Analytics, you can simply transfer it to Salesforce Marketing Cloud and send more relevant emails to that group.
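To make that concrete, here is a minimal, purely hypothetical Python sketch of what an audience-to-segment handoff could look like. The class and function names are illustrative inventions, not the actual Google Analytics 360 or Marketing Cloud APIs, and, in keeping with what Salesforce says below, only the audience definition and an aggregate count move between systems, never individual profiles.

```python
from dataclasses import dataclass


@dataclass
class Audience:
    """A named group of users defined by shared behavior (hypothetical model)."""
    name: str
    criteria: dict        # e.g. {"page": "/pricing", "visits_last_30d": ">=3"}
    member_count: int = 0  # aggregate size only -- no individual identifiers


def activate_in_marketing_cloud(audience: Audience) -> dict:
    """Illustrative handoff: turn an analytics audience into a marketing segment.

    Only metadata (name, criteria, estimated size) is passed along, mirroring
    the aggregated / metadata-only flow described in the article.
    """
    segment = {
        "segment_name": audience.name,
        "source": "analytics",
        "definition": audience.criteria,
        "estimated_size": audience.member_count,
    }
    # In a real integration this would be an API call; here we just return it.
    return segment


if __name__ == "__main__":
    aud = Audience(
        name="pricing-page-repeat-visitors",
        criteria={"page": "/pricing", "visits_last_30d": ">=3"},
        member_count=12_450,
    )
    print(activate_in_marketing_cloud(aud))
```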

This data sharing capability removes a lot of the labor involved in trying to monitor data stored in two places, but of course it also raises questions about data privacy. Jania was careful to point out that the two platforms are not sharing specific information about individual consumers, which could be in violation of the new GDPR data privacy rules that went into effect in Europe at the end of last month.

“What [we’re sharing] is either metadata or aggregated reporting results. Just to be clear, there’s no personally identifiable data that is flowing between the systems, so everything here is 100% GDPR-compliant,” Jania said.

But Leary says it might not be so simple, especially in light of recent data sharing abuses. “With Facebook having to open up about how they’re sharing consumer data with other organizations, companies like Salesforce and Google will have to be more careful than ever before about how the consumer data they make available to their corporate customers will be used by them. It’s a whole new level of scrutiny that has to be a part of the data sharing equation,” Leary said.

The announcements were made today at the Salesforce Connections conference taking place in Chicago this week.

Workday acquires Rallyteam to fuel machine learning efforts

Sometimes you acquire a company for the assets and sometimes you do it for the talent. Today Workday announced it was buying Rallyteam, a San Francisco startup that helps companies keep talented employees by matching them with more challenging opportunities in-house.

The companies did not share the purchase price or the number of Rallyteam employees who would be joining Workday.

In this case, Workday appears to be acquiring the talent. It wants to take the Rallyteam team and incorporate it into the company’s engineering unit to beef up its machine learning efforts, while taking advantage of the expertise it has built up over the years connecting employees with interesting internal projects.

“With Rallyteam, we gain incredible team members who created a talent mobility platform that uses machine learning to help companies better understand and optimize their workforces by matching a worker’s interests, skills and connections with relevant jobs, projects, tasks and people,” Workday’s Cristina Goldt wrote in a blog post announcing the acquisition.

Rallyteam, which was founded in 2013, and launched at TechCrunch Disrupt San Francisco in September 2014, helps employees find interesting internal projects that might otherwise get outsourced. “I knew there were opportunities that existed [internally] because as a manager, I was constantly outsourcing projects even though I knew there had to be people in the company that could solve this problem,” Rallyteam’s Huan Ho told TechCrunch’s Frederic Lardinois at the launch. Rallyteam was a service designed to solve this issue.


Last fall the company raised $8.6 million led by Norwest Ventures with participation from Storm Ventures, Cornerstone OnDemand and Wilson Sonsini.

Workday provides a SaaS platform for human resources and finance, so the Rallyteam approach fits nicely within the scope of the Workday business. This is the 10th acquisition for Workday and the second this year.

Chart: Crunchbase

Workday raised over $230 million before going public in 2012.

Devo scores $25 million and cool new name

Logtrust is now known as Devo in one of the cooler name changes I’ve seen in a long time. Whether the company intended to pay homage to the late-’70s band is not clear, but investors probably didn’t care, as they gave the data operations startup a bushel of money today.

The company now known as Devo announced a $25 million Series C round led by Insight Venture Partners with participation from Kibo Ventures. Today’s investment brings the total raised to $71 million.

The company changed its name because it was about much more than logs, according to CEO Walter Scott. It offers a cloud service that allows customers to stream massive amounts of data — think terabytes or even petabytes — relieving them of the need to worry about all the scaling and hardware requirements that processing this much data would entail. That data could come from web server logs, security data from firewalls or transactions taking place on backend systems, to name a few examples.

The data can live on prem if required, but the processing always gets done in the cloud to meet the scaling needs. Scott says this is about giving companies the ability to process and understand massive amounts of data that was previously only within reach of web-scale companies like Google, Facebook or Amazon.

But it involves more than simply collecting the data. “It’s the combination of us being able to collect all of that data together with running analytics on top of it all in a unified platform, then allowing a very broad spectrum of the business [to make use of it],” Scott explained.

Devo dashboard. Photo: Devo

Devo sees Sumo Logic, Elastic and Splunk as its primary competitors in this space, but like many startups, it often battles companies trying to build their own systems as well, a difficult approach for any company to take when dealing with this amount of data.

The company, which was founded in Spain, is now based in Cambridge, Massachusetts, and has close to 100 employees. Scott says he has the budget to double that number by the end of the year, although he’s not sure they will be able to hire that many people that rapidly.

SAP gives CRM another shot with new cloud-based suite

Customer Relationship Management (CRM) is a mature market with a clear market leader in Salesforce. It also has a bunch of other enterprise players like Microsoft, Oracle and SAP vying for position. SAP decided to take another shot today when it released a new suite of business products called SAP C/4HANA. (Yeah, catchy, I know.)

SAP C/4HANA pulls together several acquisitions from the last several years. It started in 2013 when SAP bought Hybris for around a billion dollars, which gave it a logistics tracking piece. Then last year it got Gigya for $350 million, giving it a way to track customer identity. This year it added the final piece when it paid $2.4 billion for CallidusCloud, which brought configure, price, quote (CPQ) capability.

SAP has taken these three pieces and packaged them into a customer relationship management suite. The company sees this term much more broadly than simply tracking a database of names and vital information on customers. With these products, it hopes to give customers a way to handle consumer data protection, marketing, commerce, sales and customer service.

SAP sees this approach as different, but it’s really much the same as what the other players are doing: packaging sales, service and marketing into a single platform. “The legacy CRM systems are all about sales; SAP C/4HANA is all about the consumer. We recognize that every part of a business needs to be focused on a single view of the consumer. When you connect all SAP applications together in an intelligent cloud suite, the demand chain directly fuels the behaviors of the supply chain,” CEO Bill McDermott said in a statement.

It’s interesting that McDermott goes after legacy CRM tools because his company has offered its share of them over the years, but its market share has been headed in the wrong direction. This new cloud-based package is designed to change that. If you can’t build it, you can buy it, and that’s what SAP has done here.

Brent Leary, owner at CRM Essentials, who has been watching this market for many years says that while SAP has a big back-office customer base in ERP, it’s going to be tough to pull customers back to SAP as a CRM provider. “I think their huge base of ERP customers provides them with an opportunity to begin making inroads, but it will be tough as mindshare for CRM/Customer Engagement has moved away from SAP,” he told TechCrunch.

He says that it will be important for this new product to find its niche in a defined market. “It will be imperative going forward for SAP [to] find spots to ‘own’ in the minds of corporate buyers in order to optimize their chances of success against their main competitors,” he said.

It’s obviously not going to be easy, but SAP has used its cash to buy some companies and give it another shot. Time will tell if it was money well spent.

How Yelp (mostly) shut down its own data centers and moved to AWS

Back in 2013, Yelp was a 9-year-old company built on a set of internal systems. It was coming to the realization that running its own data centers might not be the most efficient way to run a business that was continuing to scale rapidly. At the same time, the company understood that the tech world had changed dramatically from 2004, when it launched, and that it needed to transform the underlying technology to a more modern approach.

That’s a lot to take on in one bite, but it wasn’t something that happened willy-nilly or overnight, says Jason Yellen, SVP of engineering at Yelp. The vast majority of the company’s data was being processed in a massive Python repository that was getting bigger all the time. The conversation about shifting to a microservices architecture began in 2012.

The company was also running the massive Yelp application inside its own data centers, and as it grew it was increasingly limited by the long lead times required to procure new hardware and get it online. Yelp saw this as an unsustainable situation over the long term and began a process of transforming from running a huge monolithic application on-premises to one built on microservices running in the cloud. It was quite a journey.

The data center conundrum

Yellen described the classic scenario of a company that could benefit from a shift to the cloud. Yelp had a small operations team dedicated to setting up new machines. When engineering anticipated a new resource requirement, they had to give the operations team sufficient lead time to order new servers and get them up and running, certainly not the most efficient way to deal with a resource problem, and one that would have been easily solved by the cloud.

“We kept running into a bottleneck, I was running a chunk of the search team [at the time] and I had to project capacity out to 6-9 months. Then it would take a few months to order machines and another few months to set them up,” Yellen explained. He emphasized that the team charged with getting these machines going was working hard, but there were too few people and too many demands and something had to give.

“We were on this cusp. We could have scaled up that team dramatically and gotten [better] at building data centers and buying servers and doing that really fast, but we were hearing a lot [about] AWS and the advantages there,” Yellen explained.

To the cloud!

They looked at the cloud market landscape in 2013 and AWS was the clear leader technologically. That meant moving some part of their operations to EC2. Unfortunately, that exposed a new problem: how to manage this new infrastructure in the cloud. This was before the notion of cloud-native computing even existed. There was no Kubernetes. Sure, Google was operating in a cloud-native fashion in-house, but it was not really an option for most companies without a huge team of engineers.

Yelp needed to explore new ways of managing operations in a hybrid cloud environment where some of the applications and data lived in the cloud and some lived in their data center. It was not an easy problem to solve in 2013 and Yelp had to be creative to make it work.

That meant remaining with one foot in the public cloud and the other in a private data center. One tool that helped ease the transition was AWS Direct Connect, which had been released the prior year and enabled Yelp to connect directly from its data center to the cloud.

Laying the groundwork

Around this time, as Yelp was figuring out how AWS worked, another revolutionary technological change was occurring: Docker emerged and began mainstreaming the notion of containerization. “That’s another thing that’s been revolutionary. We could suddenly decouple the context of the running program from the machine it’s running on. Docker gives you this container, and is much lighter weight than virtualization and running full operating systems on a machine,” Yellen explained.

Another thing that was happening was the emergence of Mesos, an open source data center operating system that offered a way to treat the data center as a single pool of resources. Yelp could apply this notion wherever the data and applications lived. Mesos also offered a container orchestration tool called Marathon, in the days before Kubernetes emerged as a popular way of dealing with this same issue.

“We liked Mesos as a resource allocation framework. It abstracted away the fleet of machines. Mesos abstracts many machines and controls programs across them. Marathon holds guarantees about what containers are running where. We could stitch it all together into this clear opinionated interface,” he said.
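For readers who never touched Marathon, that “clear opinionated interface” boils down to handing the scheduler a declarative app definition and letting it keep that many containers running somewhere in the cluster. Here is a rough Python sketch of submitting such a definition to Marathon’s REST API; the cluster URL, service name and Docker image are placeholders, and this shows generic Marathon usage rather than Yelp’s PaaSTA tooling.

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder Marathon endpoint -- replace with your own cluster address.
MARATHON_URL = "http://marathon.example.com:8080/v2/apps"

# A minimal Marathon app definition: run 3 copies of a Dockerized web service,
# each with 0.25 CPU and 256 MB of memory, and let Mesos decide which machines
# in the pool actually run them.
app_definition = {
    "id": "/example/web-service",            # placeholder service name
    "instances": 3,
    "cpus": 0.25,
    "mem": 256,
    "container": {
        "type": "DOCKER",
        "docker": {
            "image": "example/web-service:1.0",  # placeholder image
            "network": "BRIDGE",
            "portMappings": [{"containerPort": 8080, "hostPort": 0}],
        },
    },
}

if __name__ == "__main__":
    # Marathon accepts the definition and from then on guarantees the
    # requested number of containers are running somewhere in the cluster.
    resp = requests.post(MARATHON_URL, json=app_definition)
    print(resp.status_code, resp.text)
```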

Pulling it all together

While all this was happening, Yelp began exploring how to move to the cloud and use a Platform as a Service approach for the software layer. The problem was that, at the time they started, there wasn’t really any viable way to do this. In the buy-versus-build decision-making that goes on in large transformations like this one, they felt they had little choice but to build that platform layer themselves.

In late 2013 they began to pull together the idea of building this platform on top of Mesos and Docker, giving it the name PaaSTA, an internal joke that stood for Platform as a Service, Totally Awesome. It became simply known as Pasta.

Photo: David Silverman/Getty Images

The project had the ambitious goal of making Yelp’s infrastructure work as a single fabric, in a cloud-native fashion, before most anyone outside of Google was using that term. Pasta developed slowly, with the first developer piece coming online in August 2014 and the first production service following that December. The company open sourced the technology the following year.

“Pasta gave us the interface between the applications and development teams. Operations had to make sure Pasta is up and running, while Development was responsible for implementing containers that implemented the interface,” Yellen said.

Moving deeper into the public cloud

While Yelp was busy building these internal systems, AWS wasn’t sitting still. It was also improving its offerings with new instance types, new functionality and better APIs and tooling. Yellen reports this helped immensely as Yelp began a more complete move to the cloud.

He says there were a couple of tipping points as they moved more and more of the application to AWS — including eventually, the master database. This all happened in more recent years as they understood better how to use Pasta to control the processes wherever they lived. What’s more, he said that adoption of other AWS services was now possible due to tighter integration between the in-house data centers and AWS.

Photo: erhui1979/Getty Images

The first tipping point came around 2016 as all new services were configured for the cloud. He said they began to get much better at managing applications and infrastructure in AWS and their thinking shifted from how to migrate to AWS to how to operate and manage it.

Perhaps the biggest step in this years-long transformation came last summer when Yelp moved its master database from its own data center to AWS. “This was the last thing we needed to move over. Otherwise it’s clean up. As of 2018, we are serving zero production traffic through physical data centers,” he said. While they still have two data centers, they are getting to the point where they have only the minimum hardware required to run the network backbone.

Yellen said that before all of this was in place, it took anywhere from two weeks to a month to get a service up and running; now it takes just a couple of minutes. He says any loss of control from moving to the cloud has been easily offset by the convenience of using cloud infrastructure. “We get to focus on the things where we add value,” he said — and that’s the goal of every company.

Box expands Zones to manage content in multiple regions

When Box announced Zones a couple of years ago, it was providing a way for customers to store data outside the U.S., but there were some limits. Each customer could choose the U.S. and one additional zone. Customers wanted more flexibility, and today the company announced it was allowing them to choose multiple zones.

The new feature gives a company the ability to store content in any of the seven zones (plus the U.S.) that Box currently supports around the world. A zone is essentially a Box co-location data center partner in a given location. The customer can now choose a default zone and then manage multiple zones from a single customer ID in the Box admin console, according to Jeetu Patel, chief product officer at Box.

Initially customers wanted the choice to store data in a region outside the U.S., but over time they began asking not just to pick one additional zone, but to have access to multiple zones.

Current Box Zones. Photo: Box

Content will go to a defined default zone unless the admin creates rules specifying another location. In terms of data sovereignty, the file will always live in the country of record, even if an employee outside that country has access to it. End users won’t know where the content lives, as long as administrators have given them access to it.
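As a rough illustration of that routing logic (and not Box’s actual implementation or admin API), a multi-zone policy amounts to a default zone plus a set of admin-defined rules that override it, something like this hypothetical Python sketch:

```python
# Hypothetical sketch of default-zone-plus-rules routing; the names here are
# illustrative and this is not Box's actual implementation or admin API.

DEFAULT_ZONE = "US"

# Admin-defined rules: content owned by users in these groups is pinned
# to a specific zone for data-residency reasons.
ZONE_RULES = {
    "emea-legal": "Germany",
    "apac-sales": "Japan",
}


def zone_for_upload(owner_group: str) -> str:
    """Return the zone where a new file should be stored.

    Content goes to the default zone unless a rule maps the owner's group
    to another location; once stored, the file stays in that country of
    record regardless of who is later granted access.
    """
    return ZONE_RULES.get(owner_group, DEFAULT_ZONE)


if __name__ == "__main__":
    print(zone_for_upload("emea-legal"))    # -> Germany
    print(zone_for_upload("us-marketing"))  # -> US (default)
```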

This may not seem like a huge deal on its face, but from a content management standpoint, it presented some challenges. Patel says the company designed the product with this ability in mind from the start, but it took some development time to get there.

“When we launched Zones we knew we would [eventually require] multi-zone capability, and we had to make sure the architecture could handle that,” Patel explained. They did this by abstracting the architecture to separate the storage and business logic tiers. Creating this modular approach allowed them to increase the capabilities as they built out Zones.

It doesn’t hurt that this feature is being made available just days before the EU’s GDPR data privacy rules are going into effect. “Zones is not just for GDPR, but it does help customers meet their GDPR obligations,” Patel said.

Overall, Zones is part of Box’s strategy to provide content management services in the cloud and give customers, even regulated industries, the ability to control how that content is used. This expansion is one more step on that journey.

Adobe CTO leads company’s broad AI bet

There isn’t a software company out there worth its salt that doesn’t have some kind of artificial intelligence initiative in progress right now. These organizations understand that AI is going to be a game-changer, even if they might not have a full understanding of how that’s going to work just yet.

In March at the Adobe Summit, I sat down with Adobe executive vice president and CTO Abhay Parasnis, and talked about a range of subjects with him including the company’s goal to build a cloud platform for the next decade — and how AI is a big part of that.

Parasnis told me that he has a broad set of responsibilities starting with the typical CTO role of setting the tone for the company’s technology strategy, but it doesn’t stop there by any means. He also is in charge of operational execution for the core cloud platform and all the engineering building out the platform — including AI and Sensei. That includes managing a multi-thousand person engineering team. Finally, he’s in charge of all the digital infrastructure and the IT organization — just a bit on his plate.

Ten years down the road

The company’s transition from selling boxed software to a subscription-based cloud company began in 2013, long before Parasnis came on board. It has been a highly successful one, but Adobe knew it would take more than simply shedding boxed software to survive long-term. When Parasnis arrived, the next step was to rearchitect the base platform in a way that was flexible enough to last for at least a decade — yes, a decade.

“When we first started thinking about the next generation platform, we had to think about what do we want to build for. It’s a massive lift and we have to architect to last a decade,” he said. There’s a huge challenge because so much can change over time, especially right now when technology is shifting so rapidly.

That meant that they had to build in flexibility to allow for these kinds of changes over time, maybe even ones they can’t anticipate just yet. The company certainly sees immersive technology like AR and VR, as well as voice, as something it needs to start thinking about as a future bet — and its base platform had to be adaptable enough to support that.

Making Sensei of it all

But Adobe also needed to get its ducks in a row around AI. That’s why, around 18 months ago, the company made another strategic decision to develop AI as a core part of the new platform. They saw a lot of companies looking at a more general AI for developers, but they had a different vision, one tightly focused on Adobe’s core functionality. Parasnis sees this as the key part of the company’s cloud platform strategy. “AI will be the single most transformational force in technology,” he said, adding that Sensei is by far the thing he is spending the most time on.

Photo: Ron Miller

The company began thinking about the new cloud platform with the larger artificial intelligence goal in mind, building AI-fueled algorithms to handle core platform functionality. Once they refined them for use in-house, the next step was to open up these algorithms to third-party developers to build their own applications using Adobe’s AI tools.

It’s actually a classic software platform play, whether the service involves AI or not. Every cloud company from Box to Salesforce has been exposing their services for years, letting developers take advantage of their expertise so they can concentrate on their core knowledge. They don’t have to worry about building something like storage or security from scratch because they can grab those features from a platform that has built-in expertise and provides a way to easily incorporate it into applications.

The difference here is that it involves Adobe’s core functions, so it may be intelligent auto cropping and smart tagging in Adobe Experience Manager or AI-fueled visual stock search in Creative Cloud. These are features that are essential to the Adobe software experience, which the company is packaging as an API and delivering to developers to use in their own software.
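To illustrate the pattern being described (purely hypothetically — the endpoint, parameters and response shape below are invented, not Adobe’s actual Sensei API), exposing a capability like smart tagging to developers typically ends up looking like a simple HTTP call that sends an asset and gets structured predictions back:

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical endpoint and key -- not Adobe's real Sensei API.
TAGGING_ENDPOINT = "https://api.example.com/v1/smart-tags"
API_KEY = "your-api-key-here"


def smart_tag(image_path: str, max_tags: int = 5) -> list:
    """Send an image to a hosted tagging model and return predicted tags.

    This is the point of the platform play: the developer never trains or
    hosts the model, they just call the capability and use the results.
    """
    with open(image_path, "rb") as f:
        resp = requests.post(
            TAGGING_ENDPOINT,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": f},
            data={"max_tags": max_tags},
        )
    resp.raise_for_status()
    # e.g. [{"label": "beach", "confidence": 0.97}, ...]
    return resp.json().get("tags", [])


if __name__ == "__main__":
    print(smart_tag("vacation.jpg"))
```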

Whether or not Sensei can be the technology that drives the Adobe cloud platform for the next 10 years, Parasnis and the company at large are very much committed to that vision. We should see more announcements from Adobe in the coming months and years as they build more AI-powered algorithms into the platform and expose them to developers for use in their own software.

Parasnis certainly recognizes this as an ongoing process. “We still have a lot of work to do, but we are off in an extremely good architectural direction, and AI will be a crucial part,” he said.

Google to acquire cloud migration startup Velostrata

Google announced today it was going to acquire Israeli cloud migration startup Velostrata. The companies did not share the purchase price.

Velostrata helps companies migrate from on-premises data centers to the cloud, a common requirement today as companies try to shift more workloads to the cloud. It’s not always a simple matter, though, to transfer those legacy applications, and that’s where Velostrata could help Google Cloud customers.

As I wrote about its debut in 2014, the startup figured out a way to decouple storage and compute, an approach with wide usage and appeal. “The company has a sophisticated hybrid cloud solution that decouples storage from compute resources, leaving the storage in place on-premises while running a virtual machine in the cloud,” I wrote at the time.

But more than that, in a hybrid world where customer applications and data can live in the public cloud, on prem or a combination of the two, Velostrata gives customers the control to move and adapt workloads as needed and prepare them for delivery on cloud virtual machines.

“This means [customers] can easily and quickly migrate virtual machine-based workloads like large databases, enterprise applications, DevOps, and large batch processing to and from the cloud,” Eyal Manor, VP of engineering at Google Cloud, wrote in the blog post announcing the acquisition.

This of course takes Velostrata from being a general-purpose cloud migration tool to one that will presumably be tuned specifically for Google Cloud, but it gives Google a valuable asset in its battle to gain cloud market share.

In the past, Google Cloud head Diane Greene has talked about the business opportunities they have seen in simply “lifting and shifting” data loads to the cloud. This acquisition gives them a key service to help customers who want to do that with the Google Cloud.

Velostrata was founded in 2014. It has raised over $31 million from investors including Intel Capital and Norwest Venture Partners.

Pivotal CEO talks IPO and balancing life in Dell family of companies

Pivotal has kind of a strange role for a company. On one hand, it’s part of the EMC federation of companies that Dell acquired in 2016 for a cool $67 billion, but it’s also an independently operated entity within that broader Dell family of companies — and that has to be a fine line to walk.

Whatever the challenges, the company went public yesterday and joined VMware as a separately traded company within Dell. CEO Rob Mee says the company took the step of IPOing because it wanted additional capital.

“I think we can definitely use the capital to invest in marketing and R&D. The wider technology ecosystem is moving quickly. It does take additional investment to keep up,” Mee told TechCrunch just a few hours after his company rang the bell at the New York Stock Exchange.

As for that relationship of being a Dell company, he said that Michael Dell let him know early on after the EMC acquisition that he understood the company’s position. “From the time Dell acquired EMC, Michael was clear with me: You run the company. I’m just here to help. Dell is our largest shareholder, but we run independently. There have been opportunities to test that [since the acquisition] and it has held true,” Mee said.

Mee says that independence is essential because Pivotal has to remain technology-agnostic and it can’t favor Dell products and services over that mission. “It’s necessary because our core product is a cloud-agnostic platform. Our core value proposition is independence from any provider — and Dell and VMware are infrastructure providers,” he said.

That said, Mee also can play both sides because he can build products and services that do align with Dell and VMware offerings. “Certainly the companies inside the Dell family are customers of ours. Michael Dell has encouraged the IT group to adopt our methods and they are doing so,” he said. They have also started working more closely with VMware, announcing a container partnership last year.

Photo: Ron Miller

Overall though he sees his company’s mission in much broader terms, doing nothing less than helping the world’s largest companies transform their organizations. “Our mission is to transform how the world builds software. We are focused on the largest organizations in the world. What is a tailwind for us is that the reality is these large companies are at a tipping point of adopting how they digitize and develop software for strategic advantage,” Mee said.

The stock closed up 5 percent last night, but Mee says this isn’t about a single day. “We do very much focus on the long term. We have been executing to a quarterly cadence and have behaved like a public company inside Pivotal [even before the IPO]. We know how to do that while keeping an eye on the long term,” he said.