
Google Cloud goes all-in on hybrid with its new Cloud Services Platform

The cloud isn’t right for every business, whether because of latency constraints at the edge, regulatory requirements or because it’s simply cheaper for a company to own and operate its own data centers for its specific workloads. Given this, it’s maybe no surprise that the vast majority of enterprises today use both public and private clouds in parallel. That’s something Microsoft has long been betting on as part of its strategy for its Azure cloud, and Google, too, is now taking a number of steps in this direction.

With the open-source Kubernetes project, Google launched one of the fundamental building blocks that make running and managing applications in hybrid environments easier for large enterprises. What Google hadn’t done until today, though, is launch a comprehensive solution that includes all of the necessary parts for this kind of deployment. With its new Cloud Services Platform, the company is now offering businesses an integrated set of cloud services that can be deployed on both the Google Cloud Platform and in on-premise environments.

As Google Cloud engineering director Chen Goldberg noted in a press briefing ahead of today’s announcement, many businesses also simply want to be able to manage their own workloads on-premise but still be able to access new machine learning tools in the cloud, for example. “Today, to achieve this, use cases involve a compromise between cost, consistency, control and flexibility,” she said. “And this all negatively impacts the desired result.”

Goldberg stressed that the idea behind the Cloud Services Platform is to meet businesses where they are and then allow them to modernize their stack at their own pace. But she also noted that businesses want more than just the ability to move workloads between environments. “Portability isn’t enough,” she said. “Users want consistent experiences so that they can train their team once and run anywhere — and have a single playbook for all environments.”

The two services at the core of this new offering are the Kubernetes container orchestration tool and Istio, a relatively new but quickly growing tool for connecting, managing and securing microservices. Istio is about to hit its 1.0 release.

We’re not simply talking about a collection of open-source tools here. The core of the Cloud Services Platform, Goldberg noted, is “custom configured and battle-tested for enterprises by Google.” In addition, it is deeply integrated with other services in the Google Cloud, including the company’s machine learning tools.

GKE On-Prem

Among these custom-configured tools are a number of new offerings, all part of the larger platform. Maybe the most interesting of these is GKE On-Prem. GKE, the Google Kubernetes Engine, is the core Google Cloud service for managing containers in the cloud. And now Google is essentially bringing this service to the enterprise data center, too.

The service includes access to all of the usual features of GKE in the cloud, including the ability to register and manage clusters and monitor them with Stackdriver, as well as identity and access management. It also includes a direct line to the GCP Marketplace, which recently launched support for Kubernetes-based applications.

Using the GCP Console, enterprises can manage both their on-premise and GKE clusters without having to switch between different environments. GKE On-Prem connects seamlessly to a Google Cloud Platform environment and looks and behaves exactly like the cloud version.
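
Because both environments expose the same Kubernetes API, the “train once, run anywhere” promise is easy to illustrate: the same client code can point at a cloud cluster or an on-prem one just by switching kubeconfig contexts. Here is a minimal sketch using the official Kubernetes Python client; the context names are hypothetical placeholders, not anything Google has published.

    # A minimal sketch of the "single playbook" idea: identical client code
    # runs against a cloud GKE cluster and an on-prem cluster, with only the
    # kubeconfig context changing. Context names here are hypothetical.
    from kubernetes import client, config

    for context in ("gke_my-project_us-central1-a_prod", "onprem-datacenter"):
        config.load_kube_config(context=context)
        v1 = client.CoreV1Api()
        pods = v1.list_pod_for_all_namespaces(watch=False)
        print(f"{context}: {len(pods.items)} pods running")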

Enterprise users can also get access to professional services and enterprise-grade support for help with managing the service.

“Google Cloud is the first and only major cloud vendor to deliver managed Kubernetes on-prem,” Goldberg argued.

GKE Policy Management

Related to this, Google also today announced GKE Policy Management, which is meant to provide Kubernetes administrators with a single tool for managing all of their security policies across clusters. It’s agnostic as to where the Kubernetes cluster is running, but you can use it to port your existing Google Cloud identity-based policies to these clusters. This new feature will soon launch in alpha.

Managed Istio

The other major new service Google is launching is Managed Istio (together with Apigee API Management for Istio) to help businesses manage and secure their microservices. The open source Istio service mesh gives admins and operators the tools to manage these services and, with this new managed offering, Google is taking the core of Istio and making it available as a managed service for GKE users.

With this, users get access to Istio’s service discovery mechanisms and its traffic management tools for load balancing and routing traffic to containers and VMs, as well as its tools for getting telemetry back from the workloads that run on these clusters.
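
Istio expresses those routing rules as declarative Kubernetes objects. As a rough sketch of what weighted traffic splitting looks like in practice, the snippet below applies a VirtualService that sends 90 percent of traffic to one version of a service and 10 percent to another, using the standard Kubernetes Python client; the “reviews” service and its subsets are hypothetical placeholders, not part of Google’s announcement.

    # A sketch of Istio-style weighted routing (a canary rollout), applied
    # with the Kubernetes Python client. Service and subset names are
    # hypothetical; Istio 1.0 used the networking.istio.io/v1alpha3 API.
    from kubernetes import client, config

    virtual_service = {
        "apiVersion": "networking.istio.io/v1alpha3",
        "kind": "VirtualService",
        "metadata": {"name": "reviews"},
        "spec": {
            "hosts": ["reviews"],
            "http": [{
                "route": [
                    {"destination": {"host": "reviews", "subset": "v1"}, "weight": 90},
                    {"destination": {"host": "reviews", "subset": "v2"}, "weight": 10},
                ],
            }],
        },
    }

    config.load_kube_config()
    client.CustomObjectsApi().create_namespaced_custom_object(
        group="networking.istio.io", version="v1alpha3",
        namespace="default", plural="virtualservices", body=virtual_service)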

In addition to these three main new services, Google is also launching a couple of auxiliary tools around GKE and the serverless computing paradigm today. The first of these is the GKE serverless add-on, which makes it easy to run serverless workloads on GKE with a single-step deploy process. This, Google says, will allow developers to go from source code to container “instantaneously.” This tool is currently available as a preview, and Google is making parts of this technology available under the umbrella of its new Knative open-source components; these are the same components that make the serverless add-on possible.

And to wrap it all up, Google also today mentioned a new fully managed continuous integration and delivery service, Google Cloud Build, though the details around this service remain under wraps.

So there you have it. By themselves, all of those announcements may seem a bit esoteric. As a whole, though, they show how Google’s bet on Kubernetes is starting to pay off. As businesses opt for containers to deploy and run their new workloads (and maybe even bring older applications into the cloud), GKE has put Google Cloud on the map as a place to run them in a hosted environment. Now it makes sense for Google to extend this to its users’ data centers, too. With managed Kubernetes offerings from large and small companies like SUSE, Platform 9 and Containership, this is starting to become a big business. It’s no surprise the company that started it all wants to get a piece of this pie, too.


InVision CEO Clark Valberg to talk design at Disrupt SF

To Clark Valberg, the screen is the most important place in the world. And he’s not the only one who thinks so. It isn’t just tech companies spending their money on design. The biggest brands in the world are pouring money into their digital presence, and for many, the first step is InVision.

InVision launched back in 2011 with a simple premise: What if, instead of the back-and-forth between designers and engineers and executives, there was a program that let these interested parties collaborate on a prototype?

The first iteration simply let designers build out prototypes, complete with animations and transitions, so that engineers didn’t spend time building things that would only change later.

As that tool grew, InVision realized that it was in conversation with designers across the industry, and that it hadn’t yet fixed one of their biggest pain points. That’s why, in 2017, InVision launched Studio, a design platform that was built specifically for designers building products.

Alongside Studio, InVision also launched its own app store for design programs to loop into the larger InVision platform. And the company also launched a fund to invest in early-stage design companies.

The idea here is to become the Salesforce of the design world, with the entire industry centering around this company and its various offerings.

InVision has raised more than $200 million, and serves 4 million users, including 80 percent of the Fortune 500. We’re absolutely thrilled to have Clark Valberg, InVision co-founder and CEO, join us at Disrupt SF in September.

The full agenda is here. Passes for the show are available at the Early-Bird rate until July 25 here.


Watch the Google Cloud Next day one keynote live right here

Google is hosting its big cloud conference, Google Cloud Next, this morning over at the Moscone Center in San Francisco. Obviously it’s not quite as large as its flagship I/O event earlier this year, but Google’s cloud efforts have become one of its brightest points over the past several quarters.

With heavy investments in Google Cloud’s infrastructure, its enterprise services, as well as a suite of machine learning tools layered on top of all that, Google is clearly trying to make Google Cloud a core piece of its business going forward. Traditionally an advertising juggernaut, Google is now figuring out what comes next after that, even as that advertising business continues to grow at a very healthy clip.

The keynote starts at 9 a.m. Pacific time, and the TechCrunch team is on the ground here covering all the newsiest and best bits. Be sure to check out our full coverage on TechCrunch as the keynote moves forward.


YC-backed Send Reality makes 3D virtual walkthroughs for residential listings

The fields of computer vision and VR are difficult. But a new company, Send Reality, is entering the race. The Y Combinator-backed company is looking to offer full 3D-modeling for virtual walkthroughs of real estate listings.

Founder and CEO Andrew Chen said he was the kid back in middle school and high school who spent hours walking around the streets of Paris, NYC and SF on Google Street View.

“The thing I always wanted was to walk through the inside of all the interesting places of the world,” said Chen. “90 percent of the world’s most interesting physical content is inside, but I couldn’t do that.”

Chen explained that the field of computer vision has been able to make substantial technical breakthroughs, now allowing companies like Send Reality to create a videogame-style replica of the world.

For now, however, Send Reality is focused on luxury residential real estate.

Here’s how it works:

Send Reality sends photographers out to the listing with an iPad, a $250 commodity depth sensor and a specialized Send Reality app. These photographers take hundreds of thousands of photos, and the Send Reality technology stitches those photos together to create a complete 3D model.

Chen says that what makes Send Reality tech special is how efficiently it’s able to stitch together these photos, explaining that the company can put together over 100K photos in the same time it takes for top academic labs in the world to put together 5,000.

“What this means is that the 3D models we create are so much more realistic than anything else anyone else has made,” said Chen.
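
Send Reality hasn’t published how its stitching works, but the general technique, fusing registered RGB-D frames into a single volumetric model and extracting a mesh, is well established. Below is a toy sketch with the open-source Open3D library; the file names and camera poses are made-up placeholders, and a real pipeline would estimate the poses with SLAM or structure-from-motion.

    # NOT Send Reality's code: a generic sketch of fusing RGB-D captures
    # into one 3D model with the open-source Open3D library. File names
    # and the identity camera pose are placeholder assumptions.
    import numpy as np
    import open3d as o3d

    intrinsic = o3d.camera.PinholeCameraIntrinsic(
        o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)
    volume = o3d.pipelines.integration.ScalableTSDFVolume(
        voxel_length=0.01, sdf_trunc=0.04,
        color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

    # A real capture has thousands of frames with poses from a tracking
    # step; one frame with an identity pose keeps the sketch self-contained.
    camera_poses = [np.eye(4)]
    for i, pose in enumerate(camera_poses):
        color = o3d.io.read_image(f"capture/color_{i:05d}.jpg")
        depth = o3d.io.read_image(f"capture/depth_{i:05d}.png")
        rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
            color, depth, convert_rgb_to_intensity=False)
        volume.integrate(rgbd, intrinsic, pose)  # accumulate into the TSDF

    mesh = volume.extract_triangle_mesh()
    o3d.io.write_triangle_mesh("walkthrough_model.ply", mesh)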

For the luxury residential market that Send Reality is currently targeting, most listings are put up on their own websites. Given that the product is still in beta, the numbers on Send Reality demos are still rough. But Chen says that listing websites that include the Send Reality product see a 5x to 10x increase in the amount of time people spend on the website, with 75 percent to 80 percent of that extra time spent directly in the Send Reality viewer.

Send Reality sells directly to realtors, offering the product for $500 to $800 depending on the size and complexity of the home. In the future, the company can bring down that price point by allowing realtors to scan the home themselves from their own smartphone.

Send Reality has received funding from Y Combinator.


Rescale reels in $32 million Series B to bring high performance computing to the cloud

Rescale, the startup that wants to bring high performance computing to the cloud, announced a $32 million Series B investment today led by Initialized Capital, Keen Venture Partners and SineWave Ventures.

They join a list of well-known early investors that included Sam Altman, Jeff Bezos, Richard Branson, Paul Graham, Ron Conway, Chris Dixon, Peter Thiel and others. Today’s investment brings the total amount raised to $52 million, according to the company.

Rescale works with engineering, aerospace, scientific and other verticals and helps them move their legacy high performance computing applications to the cloud. The idea is to provide a set of high performance computing resources, whether that’s on prem or in the cloud, and help customers tune their applications to get the maximum performance.

Traditionally, HPC has taken place on prem in a company’s data center. These companies often have key legacy applications they want to move to the cloud, and Rescale can help them do that in the most efficient manner, whether that involves bare metal, a virtual machine or a container.

“We help take a portfolio of [legacy] applications running on prem and help enable them in the cloud or in a hybrid environment. We tune and optimize the applications on our platform and take advantage of capital assets on prem, then we help extend that environment to different cloud vendors or tune to best practices for the specific application,” company CEO and co-founder Joris Poort explained.


Ben Verwaayen, who is a partner at one of the lead investors, Keen Venture Partners, sees a company going after a large legacy market with a new approach. “The market is currently 95% on-premise, and Rescale supports customers as they move to hybrid and eventually to a fully cloud native solution. Rescale helps CIOs enable the digital transformation journey within their enterprise, to optimize IT resources and enable meaningful productivity and cost improvements,” Verwaayen said in a statement.

The new influx of cash should help Rescale, well, scale, and that will involve hiring more developers, solutions architects and the like. The company wants to also use the money to expand its presence in Asia and Europe and establish relationships with systems integrators, who would be a good fit for a product like this and help expand their market beyond what they can do as a young startup.

The company, which is based in San Francisco, was founded in 2011 and has 80 employees. They currently have 150 customers including Sikorsky Innovation, Boom Aerospace and Trek Bikes.


Google Cloud CEO Diane Greene: ‘We’re playing the long game here’

Google is hosting its annual Cloud Next conference in San Francisco this week. With 25,000 developers in attendance, Cloud Next has become the cloud-centric counterpart to Google I/O. A few years ago, when the event only had about 2,000 attendees and Google still hosted it on a rickety pier, Diane Greene had just taken over as the CEO of Google’s cloud businesses and Google had fallen a bit behind in this space, just as Amazon and Microsoft were charging forward. Since then, Google has squarely focused on bringing business users to its cloud, both to its cloud computing services and to G Suite.

Ahead of this year’s Cloud Next, I sat down with Diane Greene to talk about the current state of Google Cloud and what to expect in the near future. As Greene noted, a lot of businesses first approached cloud computing as an infrastructure play — as a way to get some cost savings and access to elastic resources. “Now, it’s just becoming so much more. People realize it’s a more secure place to be, but really, I feel like in its essence it’s all about super-charging your information to make your company much more successful.” It’s the cloud, after all, where enterprises get access to globally distributed databases like Cloud Spanner and machine learning tools like AutoML (and their equivalent tools from other vendors).

When she moved to Google Cloud, Greene argued, Google was missing many of the table stakes that large enterprises needed. “We didn’t have all the audit logs. We didn’t have all the fine-grained security controls. We didn’t have the peer-to-peer networking. We didn’t have all the compliance and certification,” she told me.

People told her it would take Google ten years to be ready for enterprise customers. “That’s how long it took Microsoft. And I was like, no, it’s not 10 years.” The team took that as a challenge and now, two years later, Greene argues that Google Cloud is definitely ready for the enterprise (and she’s tired of people calling it a ‘distant third’ to AWS and Azure).

Today, when she thinks about her organization’s mission, she sees it as a variation on Google’s own motto. “Google’s mission is to organize the world’s information,” she said. “Google Cloud’s mission then is to supercharge our customers’ information.”

When it comes to convincing large enterprises to bet on a given vendor, though, technology is one thing, but a few years ago, Google also didn’t have the sales teams in place to sell to these companies. That had to change, too, and Greene argues that the company’s new approach is working as well. And Google needed the right partners, too, which it has now found with companies like SAP, which has certified Google’s Cloud for its Hana in-memory database, and the likes of Cisco.

A few months ago, Greene told CNBC she thought that people were underestimating the scale of Google’s cloud businesses. And she thinks that’s still the case today, too. “They definitely are underestimating us. And to some extent, maybe that hurt us. But we love our pipeline and all our engagements that we have going on,” she told me.

Getting large businesses on board is one thing, but Greene also argued that today is probably the best time ever to be an enterprise developer. “I’ve never seen companies so aggressively pursuing the latest technology and willing to adopt this disruptive technology because they see the advantage that can give them and they see that they won’t be competitive if the people they compete with adopt it first,” Greene told me. “And because of this, I think innovation in the enterprise is happening right now, even faster than it is in consumer, which is somewhat of a reversal.”

As for the companies that are choosing Google Cloud today, Greene sees three distinct categories. There are those that were born in the cloud. Think Twitter, Spotify and Snap, which are all placing significant bets on Google Cloud. Not shy to compare Google’s technology prowess to its competitors, Greene noted that “they are with Google Cloud because they know that we’re the best cloud from a technology standpoint.”

But these days, a lot of large companies that preceded the internet but were still pretty data-centric are also moving to the cloud. Examples there, as far as Google Cloud customers go, include Schlumberger, HSBC and Disney. And it’s those companies that Google is really going after at this year’s Next with the launch of the Cloud Services Platform for businesses that want or need to take a hybrid approach to their cloud adoption plans. “They see that the future is in the cloud. They see that’s where the best technology is going to be. They see that through using the technology of the cloud they can redeploy their people to be more focused on their business needs,” Greene explained.

Throughout our conversation, Greene stressed that a lot of these companies are coming to Google because of its machine learning tools and its support for Kubernetes. “We’re bringing the cloud to them,” Greene said about these companies that want to go hybrid. “We are taking Kubernetes and Istio, the monitoring and securing of the container workflows and we’re making it work on-prem and within all the different clouds and supporting it across all that. And that way, you can stay in your data center and have this Kubernetes environment and then you can spill over into the cloud and there’s no lock-in.”

But there’s also a third category, the old brick-and-mortar businesses like Home Depot that often don’t have any existing large centralized systems but that now have to go through their own digital transformation, too, to remain competitive.

While it’s fun to talk about up-and-coming technologies like Kubernetes and containers, though, Greene noted the vast majority of users still come to Google Cloud because of its compute services and data management and analytics tools like BigQuery. Of course, there’s a lot of momentum behind the Google Kubernetes Engine, too, as well as the company’s machine learning tools, but enterprises are only now starting to think about these tools.

But Greene also stressed that a lot of customers are looking for security, not just in the cloud computing side of Google Cloud but also when it comes to choosing the G Suite set of productivity tools.

“Companies are getting hacked and Google, knock on wood, is not getting hacked,” she noted. “We are so much more secure than any company could ever contemplate.”

But while that’s definitely true, Google has also faced an interesting challenge here because of its consumer businesses. Greene noted that it sometimes takes people a while to understand that what Google does with consumer data is vastly different from what it does with data that sits in Google Cloud. Google, after all, does mine a good amount of its free users’ data to serve them more relevant ads.

“We’ve been keeping billions of people’s data private for almost 20 years and that’s a lot of hard work, but a cloud customer’s data is completely private to them and we do have to continually educate people about that.”

So while Google got a bit of a late start in getting enterprises to adopt its Cloud, Greene now believes that it’s on the right track. “And the other thing is, we’re playing the long game,” she noted. “This thing is early. Some people estimate that only 10 percent of workloads are in the big public clouds. And if it’s not in a public cloud, it is going to be in a public cloud.”


Xiaomi goes after global markets with two new Android One phones

Xiaomi gave Google’s well-intentioned but somewhat-stalled Android One project a major boost last year when it unveiled its first device under the program, Mi A1. That’s now joined by not one but two sequel devices, after the Chinese phone maker unveiled the Mi A2 and Mi A2 Lite at an event in Spain today.

Xiaomi in Spain? Yes, that’s right. International growth is a major part of the Xiaomi story now that it is a listed business, and Spain is one of a handful of countries in Europe where Xiaomi is aiming to make its mark. These two new A2 handsets are an early push and they’ll be available in over 40 countries, including Spain, France, Italy and 11 other European markets.

Both phones run on Android One — so none of Xiaomi’s iOS-inspired MIUI Android fork — and charge via USB type-C. The 5.99-inch A2 is the more premium option, sporting a Snapdragon 660 processor and 4GB or 6GB of RAM with 32GB, 64GB or 128GB of storage. There’s a 20-megapixel front camera and dual 20-megapixel and 16-megapixel cameras on the rear.

The Mi A2 Lite is the more budget option, powered by a lesser Snapdragon 625 processor with 3GB or 4GB of RAM and 32GB or 64GB storage options. It comes with a smaller 5.84-inch display, a 12- and 5-megapixel camera array on the rear and a front-facing 5-megapixel camera.

The A2 is priced from €249 to €279 ($291-$327) based on specs. The A2 Lite will sell for €179 or €229 ($210 or $268), again based on RAM and storage selection.

The 40-market availability mirrors the A1 launch last year, but on this occasion Xiaomi has been busy preparing the ground in a number of countries, particularly in Europe. It has been in Spain for the past year, but it also launched local operations in France and Italy in May and tied up with CK Hutchison to sell phones in other parts of the continent via its 3 telecom business. While it isn’t operational in the U.S., Xiaomi has expanded into Mexico and it has set up partnerships with local retailers in dozens of other countries.

Xiaomi has been successful with its move into India, where it is one of the top smartphone sellers, but it has not yet replicated that success elsewhere outside of China.

China is, as you’d expect, the primary revenue market, but Xiaomi is increasingly less dependent on its homeland. China represented 72 percent of 2017 sales, down from 94 percent in 2015 and 87 percent in 2016.


Microsoft is building low-cost, streaming-only Xbox, says report

It was revealed at E3 last month that Microsoft was building a cloud gaming system. A report today calls that system Scarlett Cloud and it’s only part of Microsoft’s next-gen Xbox strategy. And it makes a lot of sense, too.

According to Thurrott.com, noted site for all things Microsoft, the next Xbox will come in two flavors. One will be a traditional gaming console where games are processed locally. You know, like how it works on game systems right now. The other system will be a lower-powered system that will stream games from the cloud — most likely, Microsoft’s Azure cloud.

This streaming system will still have some processing power, in part to counter the latency traditionally associated with streaming games. Apparently part of the game will run locally while the rest is streamed to the system.

The streaming Xbox will likely be available at a much lower cost than the traditional Xbox. And why not? Microsoft has sold Xbox systems with slim profit margins, relying on sales of games and online services to make up the difference. The streaming service Thurrott describes would further take advantage of this model while tapping into Microsoft’s deep understanding of cloud computing.

A few companies have tried streaming full video games. OnLive was one of the first; while successful for a time, it eventually went through a dramatic round of layoffs before a surprise sale for $4.8 million in 2012. Sony offers an extensive library of PS2, PS3 and PS4 games for streaming through its PlayStation Now service. Nvidia got into the streaming game this year and offers a small selection of streaming through GeForce Now. But these are all side projects for those companies.

Sony and Nintendo do not have the global cloud computing platform of Microsoft, and if Microsoft’s streaming service hits, it could change the landscape and force competitors to reevaluate everything.


Pokémon GO gets ‘Lucky’ Pokémon obtainable only by trading

Pokémon GO just got a little surprise update, complete with a curious new feature: “Lucky” Pokémon.

Most things in Pokémon GO are adapted from things that already exist in the Pokémon universe. Items like incense, lucky eggs and the like all exist in the main Pokémon series (though what these items actually do tends to be a bit different in GO).

Lucky Pokémon, as far as I know, are a new concept altogether.

So what are they? And how are they different from existing Shiny Pokémon?

Shiny Pokémon are rare variations of existing Pokémon with colors that differ from the standard. You might tap on your 398th Dratini, for example, only to find that it’s bright pink instead of the standard blue. You might randomly tap a Minun to find that it has green ears instead of blue, or an Aron with red eyes instead of blue. It’s a fun way to keep players tapping on Pokémon even after their Pokédex is technically complete. The differences are only skin deep, though; beyond the visual shift, Shiny Pokémon are generally functionally the same as their non-shiny version.

The new “Lucky” Pokémon, meanwhile, don’t look much different (save for a sparkly background when you look at them in your collection). They do, however, have a little functional advantage: powering them up requires less stardust. In other words, you’ll be able to make them stronger faster and with less work.

How do you get ’em? By trading. While folks are still working out the exact mechanics, it looks like non-Lucky Pokémon have a chance to become Lucky Pokémon when traded from one player to another. According to Niantic, the odds of a Pokémon becoming “lucky” after a trade increase based on how long ago it was originally caught.
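
Niantic hasn’t published the actual odds, so treat the numbers below as pure illustration; the toy model just captures the reported shape of the mechanic, where older catches are more likely to come out Lucky:

    # Toy model of the reported mechanic: the longer ago a Pokémon was
    # caught, the better its chance of becoming Lucky when traded. The
    # base rate, slope and cap here are invented for illustration only.
    import random
    from datetime import date

    def lucky_chance(caught_on: date, traded_on: date) -> float:
        years_old = (traded_on - caught_on).days / 365.25
        return min(0.05 + 0.10 * years_old, 0.50)  # hypothetical curve

    def trade_is_lucky(caught_on: date, traded_on: date) -> bool:
        return random.random() < lucky_chance(caught_on, traded_on)

    # A launch-week catch traded two years later has roughly a 25 percent
    # chance under this made-up curve.
    print(trade_is_lucky(date(2016, 7, 10), date(2018, 7, 25)))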

And for the collectors out there: yes, for better or worse, “Lucky” Pokémon are now a category in the Pokédex. Niantic just added trading to Pokémon GO a month ago, and this is a clever way to get players to care about trading even after they’ve already caught everything there is to catch.

This update also brings a few other small changes, mostly just polishing up the way the friend/trading system works:

  • You can now give friends nicknames. That’s super useful for remembering who is who, or for remembering that you added PikaFan87 because they promised to trade you a Kangaskhan
  • You now get a bit of XP for sending gifts
  • Gifts can now contain stardust
  • You can now delete gifts from your inventory


Doughbies’ cookie crumbles in a cautionary tale of venture scale

Doughbies should have been a bakery, not a venture-backed startup. Founded in the frothy days of 2013 and funded with $670,000 by investors, including 500 Startups, Doughbies built a same-day cookie delivery service. But it was never destined to deliver the returns required by the VC model, which depends on massive successes to cover the majority of bets that fail. The startup became the butt of jokes about how anything could get funding.

This weekend, Doughbies announced it was shutting down immediately. Surprisingly, it didn’t run out of money. Doughbies was profitable, with 36 percent gross margins and 12 percent net profit, co-founder and CEO Daniel Conway told TechCrunch. “The reason we were able to succeed, at this level and thus far, is because we focused on unit economics and customer feedback (NPS scoring). That’s it.”

Many other startups in the on-demand space missed that memo and vaporized. Shyp mailed stuff for you and Washio dry cleaned your clothes, until they both died sudden deaths. Food delivery has become a particularly crowded cemetery, with Sprig, Maple, Juicero and more biting the dust. Asked his advice for others in the space, Conway said to “Make sure your business makes sense — that you’re making money, and make sure your customers are happy.”

Doughbies certainly did the latter. It made one of the most consistently delicious chocolate chip cookies in the Bay Area. I had them cater our engagement party. At roughly $3 per cookie plus $5 for delivery, it was pricey compared to baking at home, but not outrageous given SF restaurant rates. From its launch at 500 Startups Demo Day, complete with an “Oprah” moment where investors looked beneath their seats to find Doughbies waiting for them, the company cared a lot about the experience.
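
For a sense of what those margins mean per order, here is a back-of-the-envelope sketch using the reported figures; the order size is a hypothetical example, not company data:

    # Back-of-the-envelope unit economics from the reported numbers
    # (36 percent gross margin, 12 percent net profit). The dozen-cookie
    # order is a hypothetical illustration.
    order_total = 12 * 3.00 + 5.00      # a dozen cookies plus delivery: $41
    gross_profit = order_total * 0.36   # ~$14.76 left after direct costs
    net_profit = order_total * 0.12     # ~$4.92 left after all costs
    print(f"order ${order_total:.2f}: gross ${gross_profit:.2f}, "
          f"net ${net_profit:.2f}")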

But did it make sense for a bakery to have an app and deliver on-demand? Probably not. There was just no way to maintain a healthy Doughbies habit. You were either gunning for the graveyard yourself by ordering every week, or like most people you just bought a few for special occasions. Startups like Uber succeed by getting people to routinely drop $30 per day, not twice a year. And with the push for nutritious and efficient offices, it was surely hard for enterprise customers to justify keeping cookies stocked.

Flanked by Instacart and Uber Eats, there weren’t many ripe adjacent markets for Doughbies to conquer. It was stuck delivering baked goods to customers who were deterred from growing their cart size by a sense of gluttony.

Without stellar growth or massive sales volumes, there aren’t a lot of exciting challenges to face for people like Conway and his co-founder Mariam Khan. “Ultimately we shut down because our team is ready to move on to something new,” Conway says.

The startup just emailed customers explaining that “We’re currently working on finding a new home for Doughbies, but we can’t make any promises at this time.” Perhaps a grocery store or broader food company will want its logistics technology or customer base. But delivery is a brutal market to break into, dominated by those like Uber who’ve built economies of scale through massive fleets of drivers to maximize routing efficiency. 

In the end, Doughbies was a lifestyle business. That’s not a dirty word. A few co-founders with a dream can earn a respectable living doing what they care about. But they have to do it lean, without the advantage of deep-pocketed investors.

As soon as a company takes venture funding, it’s under pressure to deliver adequate returns. Not 2X or 5X, but 10X, 100X, even 1,000X what it raises. That can lead to investors breathing down founders’ necks, encouraging big risks that could tank the business just for a shot at those outcomes. Two years ago we saw a correction hit the ecosystem, writing down the value of many startups, and we continue to see the ripple effects as companies funded before the correction hit the end of their runway.

Desperate for cash, founders can accept dirty funding terms that screw over not just themselves, but their early employees and investors. FanDuel raised more than $416 million at a peak valuation of $1.3 billion. But when it sold for $465 million, the founders and employees received zero as the returns all flowed to the late-stage investors who’d secured non-standard liquidation preferences. After nearly 10 years of hard work, the original team got nothing.
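
The mechanics are simple to sketch: in a liquidation preference waterfall, investors holding preferences are paid back before common stockholders (founders and employees) see anything. The preference amounts below are invented for illustration and are not FanDuel’s actual cap table, which is private:

    # Simplified liquidation-preference waterfall. Preference amounts are
    # invented for illustration; FanDuel's real cap table is private.
    def common_payout(sale_price: float, preferences: list[float]) -> float:
        """Pay preferences senior-first; common gets whatever remains."""
        remaining = sale_price
        for pref in preferences:
            remaining -= min(pref, remaining)
        return remaining

    # If preferred holders are owed $465M+ ahead of common stock,
    # a $465M sale leaves founders and employees with zero.
    print(common_payout(465e6, [300e6, 200e6]))  # -> 0.0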

Not every business is a startup. Not every startup is a rocket ship. It takes more than just building a great product to succeed. It can require suddenly cutting costs to become profitable before you run out of funding. Or cutting ambitions and taking less cash at a lower valuation so you can realistically hit milestones. Or accepting a low-ball acquisition offer because it’s better than nothing. Or not raising in the first place, and building up revenues the old-fashioned way so even modest growth is an accomplishment.

Investors are often rightfully blamed for inflating the bubble, pushing up raises and valuations to lure startups to take their money instead of someone else’s. But when it comes to deciding what could be a fast-growing business, sometimes it’s the founders who need the adjustment.
