Making sense of a multi-cloud, hybrid world at KubeCon

More than 12,000 attendees gathered this week in San Diego to discuss all things containers, Kubernetes and cloud-native at KubeCon.

Kubernetes, the container orchestration tool, turned five this year, and the technology appears to be reaching a maturity phase where it accelerates beyond early adopters to reach a more mainstream group of larger business users.

That’s not to say that there isn’t plenty of work to be done, or that most enterprise companies have completely bought in, but it’s clearly reached a point where containerization is on the table. If you think about it, the whole cloud-native ethos makes sense for the current state of computing and how large companies tend to operate.

If this week’s conference showed us anything, it’s an acknowledgment that it’s a multi-cloud, hybrid world. That means most companies are working with multiple public cloud vendors, while managing a hybrid environment that includes those vendors — as well as existing legacy tools that are probably still on-premises — and they want a single way to manage all of this.

The promise of Kubernetes and cloud-native technologies in general is that they give these companies a way to thread this particular needle, or at least that’s the theory.

Kubernetes to the rescue

Photo: Ron Miller/TechCrunch

If you were to look at the Kubernetes hype cycle, we are probably right about at the peak where many think Kubernetes can solve every computing problem they might have. That’s probably asking too much, but cloud-native approaches have a lot of promise.

Craig McLuckie, VP of R&D for cloud-native apps at VMware, was one of the original developers of Kubernetes at Google in 2014. VMware considered cloud-native technologies important enough that it bought his former company, Heptio, for $550 million last year.

As we head into this phase of pushing Kubernetes and related tech into larger companies, McLuckie acknowledges it creates a set of new challenges. “We are at this crossing the chasm moment where you look at the way the world is — and you look at the opportunity of what the world might become — and a big part of what motivated me to join VMware is that it’s successfully proven its ability to help enterprise organizations navigate their way through these disruptive changes,” McLuckie told TechCrunch.

He says that Kubernetes does actually solve this fundamental management problem companies face in this multi-cloud, hybrid world. “At the end of the day, Kubernetes is an abstraction. It’s just a way of organizing your infrastructure and making it accessible to the people that need to consume it.

“And I think it’s a fundamentally better abstraction than we have access to today. It has some very nice properties. It is pretty consistent in every environment that you might want to operate, so it really makes your on-prem software feel like it’s operating in the public cloud,” he explained.
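McLuckie’s “consistent in every environment” point is concrete: in Kubernetes, the same declarative manifest describes a workload whether the cluster runs on-premises or in any public cloud. Here’s a minimal sketch (the names and image are illustrative placeholders, not from any company mentioned in this story):

```yaml
# A minimal Kubernetes Deployment: the same file works on any
# conformant cluster, on-prem or in a public cloud.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web                 # hypothetical workload name
spec:
  replicas: 3               # Kubernetes keeps three copies running
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25   # any container image
          ports:
            - containerPort: 80
```

Applied with `kubectl apply -f deployment.yaml`, this file is the abstraction McLuckie describes: the operator says what should run, and the cluster, wherever it lives, decides where the replicas land.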

Simplifying a complex world

One of the reasons Kubernetes and cloud-native technologies are gaining in popularity is because the technology allows companies to think about hardware differently. There is a big difference between virtual machines and containers, says Joe Fernandes, VP of product for Red Hat cloud platform.

“Sometimes people conflate containers as another form of virtualization, but with virtualization, you’re virtualizing hardware, and the virtual machines that you’re creating are like an actual machine with its own operating system. With containers, you’re virtualizing the process,” he said.

He said this means a container isn’t coupled to the hardware. The only thing it needs to worry about is making sure it can run Linux, and Linux runs everywhere, which explains why containers are easier to manage across different types of infrastructure. “It’s more efficient, more affordable, and ultimately, cloud-native allows folks to drive more automation,” he said.
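Fernandes’ “virtualizing the process” point shows up in how a container image is defined: it packages one process and its dependencies, with no guest operating system at all. A hypothetical sketch (the file names are illustrative):

```dockerfile
# A container packages a single process and its dependencies;
# unlike a VM image, there is no guest operating system inside.
FROM python:3.11-slim      # a minimal Linux userland, not a full OS install
WORKDIR /app
COPY app.py .              # hypothetical application file
CMD ["python", "app.py"]   # the one process this container runs
```

Because the image assumes nothing but a Linux kernel underneath, the same artifact runs on a laptop, an on-prem server or any public cloud, which is the portability Fernandes describes.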

Bringing it into the enterprise

Photo: Ron Miller/TechCrunch

It’s one thing to convince early adopters to change the way they work, but it’s quite another as this technology enters the mainstream. Gabe Monroy, partner program manager at Microsoft, says that to carry this technology to the next level, we have to change the way we talk about it.

AWS, Salesforce join forces with Linux Foundation on Cloud Information Model

Last year, Adobe, SAP and Microsoft came together and formed the Open Data Initiative. Not to be outdone, this week, AWS, Salesforce and Genesys, in partnership with The Linux Foundation, announced the Cloud Information Model.

The two competing data models have a lot in common. They are both about bringing together data and applying a common open model to it. The idea is to allow for data interoperability across products in the partnership without a lot of heavy lifting, a common problem for users of these big companies’ software.

Jim Zemlin, executive director at The Linux Foundation, says this project provides a neutral home for the Cloud Information Model, where a community can work on the problem. “This allows for anyone across the community to collaborate and provide contributions under a central governance model. It paves the way for full community-wide engagement in data interoperability efforts and standards development, while rapidly increasing adoption rate of the community,” Zemlin explained in a statement.

Each of the companies in the initial partnership is using the model in different ways. AWS will use it in conjunction with its AWS Lake Formation tool to help customers move, catalog, store and clean data from a variety of data sources, while Genesys customers can use its cloud and AI products to communicate across a variety of channels.

Patrick Stokes from Salesforce says his company is using the Cloud Information Model as the underlying data model for his company’s Customer 360 platform of products. “We’re super excited to announce that we’ve joined together with a few partners — AWS, Genesys and The Linux Foundation — to actually open-source that data model,” Stokes told TechCrunch.

Of course, now we have two competing “open” data models, and that’s going to create some friction until the two projects find a way to come together. The fact is that many companies use tools from each of these vendors, and if these competing approaches persist, it will defeat the purpose of creating the initiatives in the first place.

As Satya Nadella said in 2015, “It is incumbent upon us, especially those of us who are platform vendors to partner broadly to solve real pain points our customers have.” If that’s the case, having competing models is not really achieving that.

After selling enterprise biz, Docker lands $35M investment and new CEO

In what’s proving to be an interesting day for Docker, the company announced it has received a $35 million investment from existing investors Benchmark Capital and Insight Partners.

It also announced that it has named long-time chief product officer Scott Johnston as CEO. Johnston is the third CEO at Docker this year, replacing Rob Bearden, who replaced Steve Singh after he stepped down in May.

The news came shortly after Mirantis announced it had purchased Docker’s enterprise business. The moves are curious to say the least, but Johnston says he still sees an opportunity for the company in helping developers use Docker, the popular containerization engine that has struggled to find a business model.

“Specifically, we are investing in expanding our cloud services to enable developers to quickly discover technologies for use when building applications, to easily share these apps with teammates and the community, and to run apps frictionlessly on any Kubernetes endpoint, whether locally or in the cloud,” Johnston said in a statement.

Bearden said that the company decided to go in this direction after carefully studying its existing business models. “After conducting thorough analysis with the management team and the Board of Directors, we determined that Docker had two very distinct and different businesses: one an active developer business, and the other a growing enterprise business. We also found that the product and the financial models were vastly different. This led to the decision to restructure the company and separate the two businesses, which is the best thing for customers and to enable Docker’s industry-leading technology to thrive,” he said in a statement.

Prior to today’s announcement, the company had raised over $272 million, according to Crunchbase data. Now Benchmark and Insight are throwing it a $35 million lifeline to try one more time to build a successful business on top of the open source Docker project.

Datameer announces $40M investment as it pivots away from Hadoop roots

Datameer, the company that was born as a data prep startup on top of the open source Hadoop project, announced a $40 million investment and a big pivot away from Hadoop, while staying true to its big data roots.

The investment was led by existing investor ST Telemedia. Other existing investors, including Redpoint Ventures, Kleiner Perkins, Nextworld Capital, Citi Ventures and Top Tier Capital Partners, also participated. Today’s investment brings the total raised to almost $140 million, according to Crunchbase data.

Company CEO Christian Rodatus says the company’s original mission was about making Hadoop easier to use for data scientists, business analysts and engineers. In the last year, the three biggest commercial Hadoop vendors — Cloudera, Hortonworks and MapR — fell on hard times. Cloudera and Hortonworks merged and MapR was sold to HPE in a fire sale.

Starting almost two years ago, Datameer recognized that against this backdrop, it was time for a change, and it began developing a couple of new products. It didn’t want to abandon its existing customer base entirely, of course, so it began rebuilding its Hadoop product, now called Datameer X: a modern cloud-native product built to run on Kubernetes, the popular open source container orchestration tool, and based on Spark instead of Hadoop. Rodatus reports the company is about two-thirds done with this pivot, and the product is already in the hands of customers.

The company also announced Neebo, an entirely new SaaS tool to give data scientists the ability to process data in whatever form it takes. Rodatus sees a world coming where data will take many forms, from traditional data to Python code from data analysts or data scientists to SaaS vendor dashboards. He sees Neebo bringing all of this together in a managed service with the hope that it will free data scientists to concentrate on getting insight from the data. It will work with data visualization tools like Tableau and Looker, and should be generally available in the coming weeks.

The money should help them get through this pivot, hire more engineers to continue the process and build a go-to-market team for the new products. It’s never easy pivoting like this, but the investors are likely hoping that the company can build on its existing customer base, while taking advantage of the market need for data science processing tools. Time will tell if it works.

Streamlit launches open source machine learning application development framework

Streamlit, a new machine learning startup from industry veterans who worked at Google X and Zoox, launched today with a $6 million seed investment and a flexible new open source tool that makes it easier for machine learning engineers to create custom applications for interacting with the data in their models.

The seed round was led by Gradient Ventures with participation from Bloomberg Beta. A who’s who of solo investors also participated including Color Genomics co-founder Elad Gil, #Angels founder Jana Messerschmidt, Y Combinator partner Daniel Gross, Docker co-founder Solomon Hykes and Insight Data Science CEO Jake Klamka.

As for the product, Streamlit co-founder Adrien Treuille says that as machine learning engineers, he and his co-founders were in a unique position to understand the needs of engineers and build a tool to meet their requirements. Rather than building a one-size-fits-all tool, the key was developing a solution flexible enough to serve multiple requirements, depending on the nature of the data the person is working with.

“I think that Streamlit actually has, I would say, a unique position in this market. While most companies are basically trying to systemize some part of the machine learning workflow, we’re giving engineers these sort of Lego blocks to build whatever they want,” Treuille explained.


Customized self-driving car data application built with Streamlit that enables machine learning engineers to interact with the data.

Treuille says that highly trained machine learning engineers, who have a unique set of skills, actually end up spending an inordinate amount of their time building tools to understand the vast amounts of data they have. Streamlit is trying to help them build these tools faster using the kind of programming tools they are used to working with.

He says that with a few lines of code, a machine learning engineer can very quickly begin building tools to understand the data and help them interact with it in whatever way makes sense based on the type of data. That may mean building a set of sliders with different variables to interact with the data, or simply creating tables with subsets of data that make sense to the engineer.

Treuille says that this toolset has the potential to dramatically transform the way machine learning engineers work with the data in their models. “As people who are machine learning engineers and have seen this and know what it’s like to go through these challenges, it was really exciting for us to say, there’s a better way of doing this, and not just a little bit better, but something that will turn a project that would have taken four weeks and 15,000 lines of code into something that you can do in an afternoon.”

The toolkit is available on GitHub for download starting today.

Confluent adds free tier to Kafka real-time streaming data cloud service

When Confluent launched a cloud service in 2017, it was trying to reduce some of the complexity related to running a Kafka streaming data application. Today, it introduced a free tier to that cloud service. The company hopes to expand its market beyond large technology company customers, and the free tier should make it easier for smaller companies to get started.

The new tier provides up to $50 of service a month for up to three months. Company CEO Jay Kreps says that while $50 might not sound like much, it’s actually hundreds of gigabytes of throughput and makes it easy to get started with the tool.

“We felt like we can make this technology really accessible. We can make it as easy as we can. We want to make it something where you can just get going in seconds, and not have to pay anything to start building an application that uses real time streams of data,” Kreps said.

Kafka has been available as an open source product since 2011, so it’s been free to download, install and build applications, but still required a ton of compute and engineering resources to pull off. The cloud service was designed to simplify that, and the free tier lets developers get comfortable building a small application without making a large financial investment.

Once they get used to working with Kafka on the free version, users can then buy in whatever increments make sense for them, and only pay for what they use. It can be pennies’ worth of Kafka or hundreds of dollars, depending on a customer’s individual requirements. “After free, you can buy 11 cents worth of Kafka or you can buy $10 worth, all the way up to these massive users like Lyft that use Kafka Cloud at huge scale as part of their ride sharing service,” he said.

While a free SaaS trial might feel like a common kind of marketing approach, Kreps says for a service like Kafka, it’s actually much more difficult to pull off. “With something like a distributed system where you get a whole chunk of infrastructure, it’s actually technically an extraordinarily difficult thing to provide zero to elastic scale up capabilities. And a huge amount of engineering goes into making that possible,” Kreps explained.

Kafka processes massive streams of data in real time. It was originally developed inside LinkedIn and open sourced in 2011. Confluent launched as a commercial entity on top of the open source project in 2014. In January the company raised $125 million on a $2.5 billion valuation. It has raised over $205 million, according to Crunchbase data.

How founder and CTO Dries Buytaert sold Acquia for $1B

Acquia announced yesterday that Vista Equity Partners was going to buy a majority stake in the company worth $1 billion. That would seem to be reason enough to sell the company. It’s a good amount of dough, but as co-founder and CTO Dries Buytaert told Extra Crunch, he’s also happy to be taking care of his early investors and his long-time, loyal employees who stuck by him all these years.

Vista is actually buying out early investors as part of the deal, while providing some liquidity for employee equity holders. “I feel proud that we are able to reward our employees, especially those that have been so loyal to the company and worked so hard for so many years. It makes me feel good that we can do that for our employees,” he said.

Image via TechCrunch

Chef CEO does an about-face, says company will not renew ICE contract

After stating clearly on Friday that he would honor a $95,000 contract with ICE, CEO Barry Crist must have had a change of heart over the weekend. In a blog post this morning, he wrote that the company would not be renewing the contract with ICE after all.

“After deep introspection and dialog within Chef, we will not renew our current contracts with ICE and CBP when they expire over the next year. Chef will fulfill our full obligations under the current contracts,” Crist wrote in the blog post.

He also backed off the seemingly firm position he took on Friday, when he told TechCrunch, “It’s something that we spent a lot of time on, and I want to represent that there are portions of [our company] that do not agree with this, but I as a leader of the company, along with the executive team, made a decision that we would honor the contracts and those relationships that were formed and work with them over time.”

Today, he acknowledged that intense feelings inside the company against the contract led to his decision. The contract began in 2015 under the Obama administration and was aimed at modernizing programming approaches at DHS, but over time, as ICE family separation and deportation policies have come under fire, there were calls internally (and later externally) to end the contract. “Policies such as family separation and detention did not yet exist [when we started this contract]. While I and others privately opposed this and various other related policies, we did not take a position despite the recommendation of many of our employees. I apologize for this,” he wrote.

Crist also indicated that the company would donate the revenue from the contracts to organizations that work with people affected by these policies. It’s similar to the approach Salesforce took when 618 of its employees protested a contract the company has with Customs and Border Protection (CBP). In response to the protests, Salesforce pledged $1 million to organizations helping affected families.

After a tweet last week exposed the contract, protests began on social media and culminated in programmer Seth Vargo removing pieces of open source code from the repository in protest. The company had sounded firmly committed to fulfilling the contract despite the calls for action and the widespread backlash it was facing both inside and outside the company.

Vargo told TechCrunch in an interview that he saw this issue in moral terms, “Contrary to Chef’s CEO’s publicly posted response, I do think it is the responsibility of businesses to evaluate how and for what purposes their software is being used, and to follow their moral compass,” he said. Apparently Crist has come around to this point of view. Vargo chose not to comment on the latest development.

Programmer who took down open source pieces over Chef ICE contract responds

On Friday afternoon Chef CEO Barry Crist and CTO Corey Scobie sat down with TechCrunch to defend their contract with ICE after a firestorm on social media called for them to cut ties with the controversial agency. On Sunday, programmer Seth Vargo, the man who removed his open source components, which contributed to a partial shutdown of Chef’s commercial business for a time last week, responded.

While the Chef executives stated that the company was in fact the owner, Vargo made it clear he owned those pieces and he had every right to remove them from the repository. “Chef (the company) was including a third party software package that I owned. It was on my personal repository on GitHub and personal namespace on RubyGems,” he said. He believes that gave him the right to remove it.

Chef CTO Corey Scobie did not agree. “Part of the challenge was that [Vargo] actually didn’t have authorization to remove those assets. And the assets were not his to begin with. They were actually created under a time when that particular individual [Vargo] was an employee of Chef. And so therefore, the assets were Chef’s assets, and not his assets to remove,” he said.

Vargo says that simply isn’t true and Chef misunderstands the licensing. “No OSI license or employment agreement requires me to continue to maintain code of my personal account(s). They are conflating code ownership (which they can argue they have) over code stewardship,” Vargo told TechCrunch.

As further proof, Vargo added that he has even included detailed instructions in his will on how to deal with the code he owns when he dies. “I want to make it absolutely clear that I didn’t ‘hack’ into Chef or perform any kind of privilege escalation. The code lived in my personal accounts. Had I died on Thursday, the exact same thing would have happened. My will requests all my social media and code accounts be deleted. If I had deleted my GitHub account, the same thing would have happened,” he explained.

Vargo said that Chef actually was in violation of the open source license when they restored those open source pieces without putting his name on it. “Chef actually violated the Apache license by removing my name, which they later restored in response to public pressure,” he said.

Scobie admitted that the company did forget to include Vargo’s name on the code, but added it back as soon as it heard about the problem. “In our haste to restore one of the objects, we inadvertently removed a piece of metadata that identified him as the author. We didn’t do that knowingly. It was absolutely a mistake in the process of trying to restore our customers and our global customer base service. And as soon as we were notified of it, we reverted that change on this specific object in question,” he said.

Vargo says, as for why he took the open source components down, he was taking a moral stand against the contract, which dates back to the Obama administration. He also explained that he attempted to contact Chef via multiple channels before taking action. “First, I didn’t know about the history of the contract. I found out via a tweet from @shanley and subsequently verified via the USA spending website. I sent a letter and asked Chef publicly via Twitter to respond multiple times, and I was met with silence. I wanted to know how and why code in my personal repositories was being used with ICE. After no reply for 72 hours, I decided to take action,” he said.

Since then, Chef’s CEO Barry Crist has made it clear he was honoring the contract, which Vargo felt further justified his actions. “Contrary to Chef’s CEO’s publicly posted response, I do think it is the responsibility of businesses to evaluate how and for what purposes their software is being used, and to follow their moral compass,” he said.

Vargo has a long career helping build development tools and contributing to open source. He currently works for Google Cloud. Previous positions include HashiCorp and Chef.

Chef CEO says he’ll continue to work with ICE in spite of protests

Yesterday, software development tool maker Chef found itself in the middle of a firestorm after a tweet called it out for doing business with DHS/ICE. Eventually it led to an influential open source developer removing a couple of key pieces of software from the project, bringing down some parts of Chef’s commercial business.

Chef intends to fulfill its contract with ICE, in spite of calls to cancel it. In a blog post published this morning, Chef CEO Barry Crist defended the decision. “I do not believe that it is appropriate, practical, or within our mission to examine specific government projects with the purpose of selecting which U.S. agencies we should or should not do business.”

He stood by the company’s decision this afternoon in an interview with TechCrunch, while acknowledging that it was a difficult and emotional decision for everyone involved. “For some portion of the community, and some portion of our company, this is a super, super-charged lightning rod, and this has been very difficult. It’s something that we spent a lot of time on, and I want to represent that there are portions of [our company] that do not agree with this, but I as a leader of the company, along with the executive team, made a decision that we would honor the contracts and those relationships that were formed and work with them over time,” he said.

He added, “I think our challenge as leadership right now is how do we collectively navigate through times like this, and through emotionally-charged issues like the ICE contract.”

The deal with ICE, a $95,000-a-year contract for software development tools, dates back to the Obama administration, when the then-DHS CIO wanted to move the department toward more modern agile/DevOps development workflows, according to Crist.

He said for people who might think it’s a purely economic decision, the money represents a fraction of the company’s more than $50 million in annual revenue (according to Crunchbase data), but he says it’s about a long-term business arrangement with the government that transcends individual administration policies. “It’s not about the $100,000, it’s about decisions we’ve made to engage the government. And I appreciate that not everyone in our world feels the same way or would make that same decision, but that’s the decision that we made as a leadership team,” Crist said.

Shortly after word of Chef’s ICE contract appeared on Twitter, according to a report in The Register, former Chef employee Seth Vargo removed a couple of key pieces of open source software from the repository, telling The Register that “software engineers have to operate by some kind of moral compass.” This move brought down part of Chef’s commercial software and it took them 24 hours to get those services fully restored, according to Chef CTO Corey Scobie.

Crist says he wants to be clear that his decision does not mean he supports current ICE policies. “I certainly don’t want to be viewed as I’m taking a strong stand in support of ICE. What we’re taking a strong stand on is our consistency with working with our customers, and again, our work with DHS started in the previous administration on things that we feel very good about,” he said.