At its Build developer conference, Microsoft today announced Azure Synapse Link, a new enterprise service that allows businesses to analyze their data faster and more efficiently, using an approach that’s generally called “hybrid transaction/analytical processing” (HTAP). That’s a mouthful; it essentially means enterprises can run both analytical and transactional workloads on a single database system. Traditionally, enterprises had to make a trade-off: either build a single system for both, which was often highly over-provisioned, or maintain separate systems for transactional and analytics workloads.
Last year, at its Ignite conference, Microsoft announced Azure Synapse Analytics, an analytics service that combines analytics and data warehousing to create what the company calls “the next evolution of Azure SQL Data Warehouse.” Synapse Analytics brings together data from Microsoft’s services and those from its partners and makes it easier to analyze.
“One of the key things, as we work with our customers on their digital transformation journey, there is an aspect of being data-driven, of being insights-driven as a culture, and a key part of that really is that once you decide there is some amount of information or insights that you need, how quickly are you able to get to that? For us, time to insight and a secondary element, which is the cost it takes, the effort it takes to build these pipelines and maintain them with an end-to-end analytics solution, was a key metric we have been observing for multiple years from our largest enterprise customers,” said Rohan Kumar, Microsoft’s corporate VP for Azure Data.
Synapse Link takes the work Microsoft did on Synapse Analytics a step further by removing the barriers between Azure’s operational databases and Synapse Analytics, so enterprises can immediately get value from the data in those databases without going through a data warehouse first.
“What we are announcing with Synapse Link is the next major step in the same vision that we had around reducing the time to insight,” explained Kumar. “And in this particular case, a long-standing barrier that exists today between operational databases and analytics systems is these complex ETL (extract, transform, load) pipelines that need to be set up just so you can do basic operational reporting or where, in a very transactionally consistent way, you need to move data from your operational system to the analytics system, because you don’t want to impact the performance of the operational system in any way because that’s typically dealing with, depending on the system, millions of transactions per second.”
ETL pipelines, Kumar argued, are typically expensive and hard to build and maintain, yet enterprises are now building new apps (and maybe even line-of-business mobile apps) where any action a consumer takes that is registered in the operational database is immediately available for predictive analytics, for example.
From the user’s perspective, linking the two takes a single click, and doing so removes the need to manage additional data pipelines or database resources. That, Kumar said, was always the main goal for Synapse Link. “With a single click, you should be able to enable real-time analytics on your operational data in ways that don’t have any impact on your operational systems, so you’re not using the compute part of your operational system to do the query. You actually have to transform the data into a columnar format, which is more adaptable for analytics, and that’s really what we achieved with Synapse Link.”
Because traditional HTAP systems on-premises typically share their compute resources with the operational database, those systems never quite took off, Kumar argued. In the cloud, with Synapse Link, though, that impact doesn’t exist because you’re dealing with two separate systems. Now, once a transaction gets committed to the operational database, the Synapse Link system transforms the data into a columnar format that is more optimized for the analytics system — and it does so in real time.
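The row-to-column transform Kumar describes is the heart of the performance story. Here is a minimal Python sketch of why the columnar copy helps; it is purely illustrative (the data and names are invented, and this is not Microsoft's implementation):

```python
# Illustrative sketch of the row-to-columnar transform behind HTAP syncing.
# Not Microsoft's implementation; data and names are made up.

rows = [  # operational store: one record per committed transaction
    {"id": 1, "region": "EU", "amount": 120.0},
    {"id": 2, "region": "US", "amount": 80.0},
    {"id": 3, "region": "EU", "amount": 45.5},
]

# A Synapse Link-style sync rewrites committed rows into column arrays,
# so the analytics side never queries the operational tables directly.
columns = {key: [row[key] for row in rows] for key in rows[0]}

# An aggregate now scans one contiguous array instead of every full record.
total = sum(columns["amount"])
print(total)  # 245.5
```

The transactional side keeps its row layout for fast single-record writes; the analytics copy pays the conversion cost once, at sync time, rather than on every query.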
For now, Synapse Link is only available in conjunction with Microsoft’s Cosmos DB database. As Kumar told me, that’s because that’s where the company saw the highest demand for this kind of service, but you can expect the company to add support for Azure SQL, Azure Database for PostgreSQL and Azure Database for MySQL in the future.
Powered by WPeMatico
Earlier today, DigitalOcean announced that it raised $50 million more from prior investors Access Industries and Andreessen Horowitz. The capital comes after the SMB- and developer-focused cloud infrastructure company raised nine figures’ worth of debt back in February.
DigitalOcean is a large private company that generated revenue at a run rate of around $250 million towards the end of 2019. The company announced today that it has reached $300 million in annual recurring revenue, or ARR. (We recently added the company to our ARR club here.) That’s growth of around 20% in less than half a year, though we don’t know precisely when the company reached the $250 million mark, making it hard to calculate its true growth pace.
Critically, DigitalOcean is walking toward profitability while expanding.
DigitalOcean’s CEO Yancey Spruill told TechCrunch earlier this year that his firm would reach free cash flow positivity in the next few years, a timeline that appears to have moved up (more on that shortly). Provided that the cloud company can keep its growth pace up over the same time period, it could be well positioned for an IPO.
The new $50 million values the company at $1.15 billion, meaning it was worth $1.1 billion pre-money. DigitalOcean is not being valued like a SaaS startup today in revenue multiple terms, then, though its new valuation is still nearly double its old Series B valuation (a company spokesperson confirmed the numbers on that page).
TechCrunch wanted to know why the company raised equity capital so quickly after it had added debt to its books. The capital was surely welcome given the world’s economic condition, but the timing was worth digging into.
DigitalOcean was not “seeking additional funding,” according to Spruill, but after “reviewing our business performance and outlook with our investors at Access and a16z, they were interested in investing for our next phase of growth.” The company accepted, Spruill said.
Presumably, DigitalOcean’s quick revenue growth from a $250 million run rate to $300 million ARR played a part in the investment decision. For DigitalOcean, receiving a new, higher valuation and a monetary top-off from well-known investors may even provide a brand boost (see this article, especially in light of recent coverage the firm has attracted).
Regarding its plans for the new capital, Spruill told TechCrunch that DigitalOcean can now “better support the increase in demand we’ve seen from entrepreneurs and SMBs around the world as more businesses are transitioning to the cloud, particularly as a result of COVID.” Mark DigitalOcean down as one of the world’s companies that is seeing an uptick from the pandemic; most aren’t, but the firms that are appear to be using the moment to put more capital onto their balance sheets.
TechCrunch also wanted to know if the new capital opened new ground for the firm, or if its priorities for the new capital were similar to its preceding goals. The CEO told TechCrunch that his firm’s focus is the same, namely expanding its business.
“We remain committed to reaching $1 billion in revenue, achieving free cash flow profitability in the second half of this year and, ultimately, position DigitalOcean to be a public company,” Spruill said in an email.
That’s clear enough.
By that measure we can expect to see a DigitalOcean S-1 in the first half of 2021, if markets recover. So a16z and Access Industries (longtime investors in the company) could see a quick return for their most recent checks if current plans hold up.
The company’s release made note of “accelerating growth,” which TechCrunch wanted to know more about. How quickly is the company growing? Spruill didn’t share numbers to confirm or deny our rough math based on his firm’s public revenue milestones, but did tell TechCrunch that the company is “actively working on a number of initiatives to accelerate our revenue growth rate,” adding that these are internally dubbed “Grow Faster” initiatives.
Finally, TechCrunch was curious about the impact that COVID-19 is having on DigitalOcean. The company told us that it has “seen a modest increase in churn as a result of COVID-19,” but nothing too bad, saying that the change was “not significant” when “compared with recent trends immediately prior to the pandemic.”
On the positive side of the ledger, DigitalOcean said that its “sign up of new customers has been accelerating” and that it is seeing “increased business from some existing customers.” Adding that up for the SaaS kids: A little bit more churn, good new logo addition, and some upsell tailwinds. Overall that adds up to growth.
More when we have it, but now we’re at least set up to understand what the company does next.
It seems that we are in the middle of a mini acquisition spree for Kubernetes startups, specifically those that can help with Kubernetes security. In the latest development, Venafi, a vendor of certificate and key management for machine-to-machine connections, is acquiring Jetstack, a U.K. startup that helps enterprises migrate and work within Kubernetes and cloud-based ecosystems, which has also been behind the development of cert-manager, a popular, open-source native Kubernetes certificate management controller.
Financial terms of the deal, which is expected to close in June of this year, have not been disclosed, but Jetstack has been working with Venafi to integrate its services and had a strategic investment from Venafi’s Machine Identity Protection Development Fund.
Venafi is part of the so-called “Silicon Slopes” cluster of startups in Utah. It has raised about $190 million from investors that include TCV, Silver Lake and Intel Capital and was last valued at $600 million. That was in 2018, when it raised $100 million, so now it’s likely Venafi is worth more, especially considering its customers include the top five U.S. health insurers, the top five U.S. airlines, the top four credit card issuers, three out of the top four accounting and consulting firms, four of the top five U.S., U.K., Australian and South African banks and four of the top five U.S. retailers.
For the time being, the two organizations will continue to operate separately, and cert-manager — which has hundreds of contributors and millions of downloads — will continue on as before, with a public release of version 1 expected in the June-July time frame.
The deal underscores not just how Kubernetes-based containers have quickly gained momentum and critical mass in the enterprise IT landscape, in particular around digital transformation, but specifically the need to provide better security services around that at speed and at scale. The deal comes just one day after VMware announced that it was acquiring Octarine, another Kubernetes security startup, to fold into Carbon Black (an acquisition it made last year).
“Nowadays, business success depends on how quickly you can respond to the market,” said Matt Barker, CEO and co-founder of Jetstack. “This reality led us to re-think how software is built and Kubernetes has given us the ideal platform to work from. However, putting speed before security is risky. By joining Venafi, Jetstack will give our customers a chance to build fast while acting securely.”
To be clear, Venafi had been offering Kubernetes integrations prior to this — and Venafi and Jetstack have worked together for two years. But acquiring Jetstack will give it direct, in-house expertise to speed up development and deployment of better tools to meet the challenges of a rapidly expanding landscape of machines and applications, all of which require unique certificates to connect securely.
“In the race to virtualize everything, businesses need faster application innovation and better security; both are mandatory,” said Jeff Hudson, CEO of Venafi, in a statement. “Most people see these requirements as opposing forces, but we don’t. We see a massive opportunity for innovation. This acquisition brings together two leaders who are already working together to accelerate the development process while simultaneously securing applications against attack, and there’s a lot more to do. Our mutual customers are urgently asking for more help to solve this problem because they know that speed wins, as long as you don’t crash.”
The crux of the issue is the sheer volume of machines that are being used in computing environments, thanks to the growth of Kubernetes clusters, cloud instances, microservices and more, with each machine requiring a unique identity to connect, communicate and execute securely, Venafi notes, with disruptions or misfires in the system leaving holes for security breaches.
Jetstack’s approach to information security came by way of its expertise in Kubernetes, developing cert-manager specifically so that its developer customers could easily create and maintain certificates for their networks.
“At Jetstack we help customers realize the benefits of Kubernetes and cloud native infrastructure, and we see transformative results to businesses firsthand,” said Matt Bates, CTO and co-founder of Jetstack, in a statement. “We developed cert-manager to make it easy for developers to scale Kubernetes with consistent, secure, and declared-as-code machine identity protection. The project has been a huge hit with the community and has been adopted far beyond our expectations. Our team is thrilled to join Venafi so we can accelerate our plans to bring machine identity protection to the cloud native stack, grow the community and contribute to a wider range of projects across the ecosystem.” Both Bates and Barker will report to Venafi’s Hudson and join the bigger company’s executive team.
Google Cloud today announced the next step in its partnership with VMware: the Google Cloud VMware Engine. This fully managed service provides businesses with a full VMware Cloud Foundation stack on Google Cloud to help businesses easily migrate their existing VMware-based environments to Google’s infrastructure. Cloud Foundation is VMware’s stack for hybrid and private cloud deployments.
Given Google Cloud’s focus on enterprise customers, it’s no surprise that the company continues to bet on partnerships with the likes of VMware to attract more of these companies’ workloads. Less than a year ago, Google announced that VMware Cloud Foundation would come to Google Cloud and that it would start supporting VMware workloads. Then, last November, Google Cloud acquired CloudSimple, a company that specialized in running VMware environments and that Google had already partnered with for its original VMware deployments. The company describes today’s announcement as the third step in this journey.
VMware Engine provides users with all of the standard Cloud Foundation components: vSphere, vCenter, vSAN, NSX-T and HCX. With this, Google Cloud General Manager June Yang notes in today’s announcement, businesses can quickly stand up their own software-defined data center in the Google Cloud.
“Google Cloud VMware Engine is designed to minimize your operational burden, so you can focus on your business,” she notes. “We take care of the lifecycle of the VMware software stack and manage all related infrastructure and upgrades. Customers can continue to leverage IT management tools and third-party services consistent with their on-premises environment.”
Google is also working with third-party providers like NetApp, Veeam, Zerto, Cohesity and Dell Technologies to ensure that their solutions work on Google’s platform, too.
“As customers look to simplify their cloud migration journey, we’re committed to build cloud services to help customers benefit from the increased agility and efficiency of running VMware workloads on Google Cloud,” said Bob Black, Dell Technologies Global Lead Alliance Principal at Deloitte Consulting. “By combining Google Cloud’s technology and Deloitte’s business transformation experience, we can enable our joint customers to accelerate their cloud migration, unify operations, and benefit from innovative Google Cloud services as they look to modernize applications.”
Key Pixel team members Marc Levoy and Mario Queiroz are out at Google. The departures, first reported by The Information, have been confirmed on the pages of the former Distinguished Engineer and Pixel General Manager, respectively.
Both members were key players on Google’s smartphone hardware team before exiting earlier this year. Levoy was a key member of the Pixel imaging team, with an expertise in computational photography that helped make the smartphone’s camera among the best in class. Queiroz was the number two on the Pixel team.
The exits come as the software giant has struggled to distinguish itself in a crowded smartphone field. The products have been generally well-received (with the exception of the Pixel 4’s dismal battery life), but the Android-maker has thus far been unable to rob much market share from the likes of Samsung and Huawei.
The Information report sheds some additional light on disquiet among the Pixel leadership. Hardware head Rick Osterloh reportedly voiced some harsh criticism during an all-hands late last year. It certainly seems possible the company saw fit to shake things up a bit, though Google declined TechCrunch’s request for comment.
Breaking into the smartphone market has been a white whale for the company for some time. Google has explored the space through its Nexus partnerships, along with its short-lived Motorola Mobility acquisition (2012-2014). The Pixel is possibly the most successful of these projects, but Google’s struggles have coincided with an overall flattening of the market.
The company did find some success with last year’s budget Pixel 3A. The followup Pixel 4A was rumored for a late May launch, though the device has reportedly been delayed.
After eight years of Unreal Engine 4, Epic Games is finally ready to talk about Unreal Engine 5, which they’re announcing will launch in preview early next year with a wider launch by the year’s end.
Unreal Engine 5 is all about harnessing the performance of next-generation consoles like the PlayStation 5 and Xbox Series X. The consoles support wild resolutions and frame rates, but Epic Games CEO Tim Sweeney was most excited about how the new hardware handles data storage, something he says will lead to “state of the art performance” better than any gaming PC.
For Unreal Engine 5, the big evolution appears to be dynamic rendering, allowing developers to drop massively complex objects with millions of polygons into their games and lean on the engine to determine how intricately the object can be rendered onscreen. In the case of the PlayStation 5, that’s pretty damn intricate. Epic Games showcased the new engine running on the PS5 in a truly stunning demo.
“We’re turning scalability from a developer’s problem into our problem,” Sweeney says.
Sweeney says the demo is the representation of what happens when the polygons being rendered shrink to the size of individual pixels. “This is all the detail that you can get until you get a higher-resolution monitor, or until 8K or 16K come along,” he says.
Unreal Engine 5’s major advances are centered around a pair of new products called Nanite and Lumen. Nanite handles the dynamic rendering described above, allowing for massively detailed scenery, while Lumen is a new pipeline for dynamic scene illumination, allowing for more lifelike lighting of digital assets.
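The scalability idea behind Nanite can be sketched in a few lines of Python. This is a drastic simplification of Epic's actual technology (the function and thresholds here are invented for illustration): keep reducing mesh detail until the number of rendered triangles approaches the number of pixels on screen.

```python
def choose_lod(triangle_count: int, screen_pixels: int) -> int:
    """Halve the triangle budget (one LOD step) until triangles ~ pixels.

    Toy model only: real systems select detail per mesh cluster using
    screen-space error metrics, not a single whole-mesh halving loop.
    """
    lod = 0
    while triangle_count > screen_pixels:
        triangle_count //= 2
        lod += 1
    return lod

# A 33-million-triangle statue in a 1080p viewport needs only a few
# halvings before each triangle is roughly pixel-sized.
print(choose_lod(33_000_000, 1920 * 1080))  # 4
```

The point Sweeney makes is that this decision moves from the artist (hand-authoring LOD meshes) into the engine, which picks the detail level automatically every frame.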
The new update will also push connectivity further, bringing Epic Online Services into the fold with toolsets that can help developers make their online gameplay leverage multiple platforms, connecting mobile, console and PC, just as Fortnite has.
Alongside news of the big update’s release, Epic Games has shared that Fortnite, which will unsurprisingly be leveraging Unreal Engine 5, will be a launch title on the PlayStation 5 and Xbox Series X. While the game’s cartoonish art style won’t be pushing boundaries quite as much as hyperrealistic titles like the one above, adding the next-gen consoles means more platforms on which to reign supreme.
VMware announced today that it intends to buy early-stage Kubernetes security startup Octarine and fold it into Carbon Black, a security company it bought last year for $2.1 billion. The company did not reveal the price of today’s acquisition.
According to a blog post announcing the deal, from Patrick Morley, general manager and senior vice president at VMware’s Security Business Unit, Octarine should fit in with what Carbon Black calls its “intrinsic security strategy” — that is, protecting content and applications wherever they live. In the case of Octarine, that is cloud native containers in Kubernetes environments.
“Acquiring Octarine enables us to advance intrinsic security for containers (and Kubernetes environments), by embedding the Octarine technology into the VMware Carbon Black Cloud, and via deep hooks and integrations with the VMware Tanzu platform,” Morley wrote in a blog post.
This also fits in with VMware’s Kubernetes strategy; the company previously purchased Heptio, an early Kubernetes company started by Craig McLuckie and Joe Beda, two folks who helped develop Kubernetes while at Google before starting their own company.
We covered Octarine last year when it released a couple of open-source tools to help companies define the Kubernetes security parameters. As we quoted head of product Julien Sobrier at the time:
Kubernetes gives a lot of flexibility and a lot of power to developers. There are over 30 security settings, and understanding how they interact with each other, which settings make security worse, which make it better, and the impact of each selection is not something that’s easy to measure or explain.
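The combinatorial burden Sobrier describes is easy to see in a toy audit. The sketch below checks a pod spec (represented as a Python dict) against a few risky securityContext settings; the rules and names are hypothetical, not Octarine's actual checks:

```python
# Toy Kubernetes security audit. The pod spec and the rules below are
# illustrative only, not Octarine's tooling or a complete rule set.

pod_spec = {
    "containers": [
        {
            "name": "web",
            "securityContext": {
                "privileged": True,
                "runAsNonRoot": False,
                "allowPrivilegeEscalation": True,
            },
        }
    ]
}

def audit(spec):
    """Return (container, issue) pairs for a few well-known risky settings."""
    findings = []
    for c in spec.get("containers", []):
        sc = c.get("securityContext", {})
        if sc.get("privileged"):
            findings.append((c["name"], "privileged container"))
        if not sc.get("runAsNonRoot"):
            findings.append((c["name"], "may run as root"))
        if sc.get("allowPrivilegeEscalation", True):
            findings.append((c["name"], "privilege escalation allowed"))
    return findings

for name, issue in audit(pod_spec):
    print(f"{name}: {issue}")
```

Each rule here is trivial in isolation; the difficulty Sobrier points to is that real clusters combine dozens of such settings whose interactions are much harder to reason about than any single flag.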
As for the startup, it now gets folded into VMware’s security business. While the CEO tried to put a happy face on the acquisition in a blog post, it seems its days as an independent entity are over. “VMware’s commitment to cloud native computing and intrinsic security, which have been demonstrated by its product announcements and by recent acquisitions, makes it an ideal home for Octarine,” the company CEO Shemer Schwarz wrote in the post.
Octarine was founded in 2017 and has raised $9 million, according to PitchBook data.
As the world looks to reopen after weeks of lockdown, governments are turning to contact tracing to understand the spread of the deadly coronavirus.
Most nations are leaning toward privacy-focused apps that use Bluetooth signals to build an anonymous record of which devices a person has been near and when. Some, like Israel, are bucking the trend and using location and cell phone data to track the spread, prompting privacy concerns.
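The privacy trick in the Bluetooth approach is that phones broadcast short-lived, unlinkable identifiers rather than anything tied to a person. A simplified Python sketch of the idea (the real Apple/Google protocol uses HKDF and AES rather than a bare hash, and the names here are invented):

```python
import hashlib
import os

def rolling_id(day_key: bytes, interval: int) -> bytes:
    # A fresh identifier for each ~15-minute interval. Only someone holding
    # the day key (published if the user tests positive) can regenerate
    # these and check them against identifiers their phone overheard.
    return hashlib.sha256(day_key + interval.to_bytes(4, "big")).digest()[:16]

day_key = os.urandom(16)          # secret key, rotated daily on the phone
broadcasts = [rolling_id(day_key, i) for i in range(3)]

# To an observer the broadcasts look random and unlinkable...
assert len(set(broadcasts)) == 3
# ...but match deterministically once the day key is published.
assert rolling_id(day_key, 1) == broadcasts[1]
```

Because matching happens on each handset against published keys, no central server ever learns who met whom, which is exactly the property the location-based approaches give up.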
Some of the biggest European economies — Germany, Italy, Switzerland and Ireland — are building apps that work with Apple and Google’s contact-tracing API. But the U.K., one of the worst-hit nations in Europe, is going it alone.
Unsurprisingly, critics have both security and privacy concerns, so much so that the U.K. may end up switching over to Apple and Google’s system anyway. Given that one of Israel’s contact-tracing systems was found on a passwordless server this week, and India denied a privacy issue in its contact-tracing app, there’s not much wiggle room to get these things wrong.
Turns out that even during a pandemic, people still care about their privacy.
Here’s more from the week.
When Zoom announced it acquired online encryption key startup Keybase, for many, the reaction was closer to mild than wild. Even Keybase, a service that lets users store and manage their encryption keys, acknowledged its uncertain future. “Keybase’s future is in Zoom’s hands, and we’ll see where that takes us,” the company wrote in a blog post. Terms of the deal were not disclosed.
Zoom has faced security snafu after snafu. But after dancing around the problems, it promised to call in the cavalry and double down on fixing its encryption. So far, so good. But where does Keybase, largely a consumer product, fit into the fray? It doesn’t sound like even Zoom knows yet, per enterprise reporter Ron Miller. What’s clear is that Zoom needs encryption help, and few have the technical chops to pull that off.
Keybase’s team might — might — just help Zoom make good on its security promises.
When it comes to corporate venture capital, semiconductor giant Intel has shaped up to be one of the most prolific and prescient investors in the tech world, with investments in 1,582 companies worldwide, and a tally of some 692 portfolio companies going public or otherwise exiting in the wake of Intel’s backing.
Today, the company announced its latest tranche of deals: $132 million invested in 11 startups. The deals speak to some of the company’s most strategic priorities currently and in the future, covering artificial intelligence, autonomous computing and chip design.
Many corporate VCs have been clear in drawing a separation between their activities and those of their parents, and the same has held for Intel. But at the same time, the company has made a number of key moves that point to how it uses its VC muscle to expand its strategic relationships and also ultimately expand through M&A. Just earlier this month, it acquired Moovit, an Intel Capital portfolio company, for $900 million (a deal that was knocked down to $840 million when accounting for its previous investment).
“Intel Capital identifies and invests in disruptive startups that are working to improve the way we work and live. Each of our recent investments is pushing the boundaries in areas such as AI, data analytics, autonomous systems and semiconductor innovation. Intel Capital is excited to work with these companies as we jointly navigate the current world challenges and as we together drive sustainable, long-term growth,” said Wendell Brooks, Intel senior vice president and president of Intel Capital, in a statement.
The tranche of deals comes at a critical time in the worlds of startups and venture investing. Many are worried that the slowdown in the economy, precipitated by the COVID-19 pandemic, will mean a subsequent slowdown in tech finance. Intel says that it plans to invest between $300 million and $500 million in total this year, so this would go some way to refuting that idea, along with some of the other monster deals and big funds that we’ve written about in the last couple of months.
The list announced today doesn’t include specific investment numbers, but in some cases the startups have also announced the fundings themselves and given more detail on round sizes. These still, however, do not reveal Intel’s specific financial stakes.
Here’s the full list:
For a few years now, Microsoft has offered Azure Cache for Redis, a fully managed caching solution built on top of the open-source Redis project. Today, it is expanding this service by adding Redis Enterprise, Redis Labs’ commercial offering, to its platform. It’s doing so in partnership with Redis Labs, and while Microsoft will offer some basic support for the service, Redis Labs will handle most of the software support itself.
Julia Liuson, Microsoft’s corporate VP of its developer tools division, told me that the company wants to be seen as a partner to open-source companies like Redis Labs, which was among the first companies to change its license to prevent cloud vendors from commercializing and repackaging their free code without contributing back to the community. Last year, Redis Labs partnered with Google Cloud to bring its own fully managed service to its platform and so maybe it’s no surprise that we are now seeing Microsoft make a similar move.
Liuson tells me that with this new tier for Azure Cache for Redis, users will get a single bill and native Azure management, as well as the option to deploy natively on SSD flash storage. The native Azure integration should also make it easier for developers on Azure to integrate Redis Enterprise into their applications.
It’s also worth noting that Microsoft will support Redis Labs’ own Redis modules, including RediSearch, a Redis-powered search engine, as well as RedisBloom and RedisTimeSeries, which provide support for new data types in Redis.
“For years, developers have utilized the speed and throughput of Redis to produce unbeatable responsiveness and scale in their applications,” says Liuson. “We’ve seen tremendous adoption of Azure Cache for Redis, our managed solution built on open source Redis, as Azure customers have leveraged Redis performance as a distributed cache, session store, and message broker. The incorporation of the Redis Labs Redis Enterprise technology extends the range of use cases in which developers can utilize Redis, while providing enhanced operational resiliency and security.”
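The "distributed cache" use Liuson mentions is typically the cache-aside pattern. A minimal Python sketch, with a plain dict standing in for the Redis client (a real application would use a Redis client library against the managed Azure endpoint; the helper names and data here are illustrative):

```python
# Cache-aside pattern sketch. A dict stands in for Redis so the example is
# self-contained; function names and user data are invented for illustration.
import time

cache = {}   # stand-in for Redis: key -> (value, expires_at)
TTL = 60.0   # seconds to keep an entry before re-reading the database

def slow_db_lookup(user_id):
    # Pretend this is an expensive query against the system of record.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    key = f"user:{user_id}"
    hit = cache.get(key)
    if hit and hit[1] > time.time():          # cache hit, still fresh
        return hit[0]
    value = slow_db_lookup(user_id)           # miss: fall through to the DB
    cache[key] = (value, time.time() + TTL)   # populate for later readers
    return value

get_user(7)  # first call misses and fills the cache; later calls are served from it
```

The session-store and message-broker uses follow the same shape: Redis sits in front of (or beside) the system of record and absorbs the high-frequency reads and writes.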