Pokémon GO is getting a big new ‘Special Research’ quest next week

Just a few months back, Niantic added its first “Special Research” to Pokémon GO. Sort of like an in-game quest, the research had players complete a series of tasks (often over a number of days) to unlock an otherwise unobtainable Pokémon.

Now they’re back with another one.

The company will be adding a second Special Research quest to the game on August 20th. Whereas the last set unlocked Mew from the first generation of Pokémon games, this one brings out Gen II’s Celebi.

This technically isn’t the first time Celebi has appeared in GO — attendees of GO Fest back in July got an early crack at a Special Research quest specifically tailored to the event, with the final reward being the opportunity to catch Celebi a solid month before anyone else.

Though a bummer to anyone who couldn’t make it to Chicago, it was a fitting way to debut Celebi. Celebi has almost always been an “event” Pokémon in the original series, meaning you had to do something special to encounter one. Depending on the game, sometimes that meant going to a physical, real-world event; sometimes it just meant having the right pre-order disc.

Those who already did the GO Fest research will also be able to do this public run of the Special Research, earning a bit more candy for the Celebi they’ve already caught.

And if you haven’t finished the first (Mew) Special Research yet? That’s okay — they can run in parallel.

These Special Research quests are a clever way for Niantic to keep things interesting. They turn the process of catching one particularly worthwhile Pokémon from something that might take 10 seconds into something that can stretch across hours or days (depending on how intense you get about it). I just wish there were more of them, even if they were only for big lumps of XP. Though it’s smart for Niantic to keep them rare and special, these multi-stage tasks are a bit more rewarding than the one-off quick tasks you get anytime you spin a Pokéstop.


Autonomous retail startup Inokyo’s first store feels like stealing

Inokyo wants to be the indie Amazon Go. It’s just launched its prototype cashierless autonomous retail store. Cameras track what you grab from shelves, and with a single QR scan of its app on your way in and out of the store, you’re charged for what you got.

Inokyo’s first store is now open on Mountain View’s Castro Street, selling an array of bougie kombuchas, snacks, protein powders and bath products. It’s sparse and a bit confusing, but it offers a glimpse of what might be a commonplace shopping experience five years from now. You can see for yourself in our demo video below:

“Cashierless stores will have the same level of impact on retail as self-driving cars will have on transportation,” Inokyo co-founder Tony Francis tells me. “This is the future of retail. It’s inevitable that stores will become increasingly autonomous.”

Inokyo (rhymes with Tokyo) is now accepting signups for beta customers who want early access to its Mountain View store. The goal is to collect enough data to dictate the future product array and business model. Inokyo is deciding whether it wants to sell its technology as a service to other retail stores, run its own stores or work with brands to improve their products’ positioning based on in-store sensor data on customer behavior.

“We knew that building this technology in a lab somewhere wouldn’t yield a successful product,” says Francis. “Our hypothesis here is that whoever ships first, learns in the real world and iterates the fastest on this technology will be the ones to make these stores ubiquitous.” Inokyo might never rise into a retail giant ready to compete with Amazon and Whole Foods. But its tech could level the playing field, equipping smaller businesses with the tools to keep tech giants from having a monopoly on autonomous shopping experiences.

It’s about what cashiers do instead

“Amazon isn’t as ahead as we assumed,” Francis remarks. He and his co-founder Rameez Remsudeen took a trip to Seattle to see the Amazon Go store that first traded cashiers for cameras in the U.S. Still, they realized, “This experience can be magical.” The two met at Carnegie Mellon through machine learning classes before they went on to apply that knowledge at Instagram and Uber. They decided that if they jumped into autonomous retail soon enough, they could still have a say in shaping its direction.

Next week, Inokyo will graduate from Y Combinator’s accelerator that provided its initial seed funding. In six weeks during the program, they found a retail space on Mountain View’s main drag, studied customer behaviors in traditional stores, built an initial product line and developed the technology to track what users are taking off the shelves.

Here’s how the Inokyo store works. You download its app and connect a payment method, and you get a QR code that you wave in front of a little sensor as you stroll into the shop. Overhead cameras will scan your body shape and clothing without facial recognition in order to track you as you move around the store. Meanwhile, on-shelf cameras track when products are picked up or put back. Combined, knowing who’s where and what’s grabbed lets it assign the items to your cart. You scan again on your way out, and later you get a receipt detailing the charges.
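To make the mechanics concrete, here is a minimal sketch of what that last fusion step could look like, assuming a simple nearest-shopper rule; the event types, coordinates and matching logic are invented for illustration and are not Inokyo’s actual pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class Shopper:
    shopper_id: str                      # anonymous track ID tied to the QR scan at the door
    position: tuple                      # (x, y) floor position from the overhead cameras
    cart: list = field(default_factory=list)

@dataclass
class ShelfEvent:
    product: str
    shelf_position: tuple                # (x, y) of the shelf camera that saw the event
    action: str                          # "picked_up" or "put_back"

def assign_event(shoppers, event):
    """Attribute a shelf event to the tracked shopper closest to that shelf."""
    nearest = min(
        shoppers,
        key=lambda s: (s.position[0] - event.shelf_position[0]) ** 2
                      + (s.position[1] - event.shelf_position[1]) ** 2,
    )
    if event.action == "picked_up":
        nearest.cart.append(event.product)
    elif event.action == "put_back" and event.product in nearest.cart:
        nearest.cart.remove(event.product)
    return nearest

# Toy run: the shopper standing by the kombucha shelf grabs a bottle.
shoppers = [Shopper("qr-1234", (2.0, 5.0)), Shopper("qr-5678", (10.0, 1.0))]
assign_event(shoppers, ShelfEvent("kombucha", (2.5, 5.5), "picked_up"))
print(shoppers[0].cart)                  # ['kombucha'] -- charged at scan-out
```

In practice the matching would have to weigh camera confidence, timing and re-identification across frames, which is exactly where a crowded store gets hard.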

Originally, Inokyo didn’t make you scan on the way out, but it got feedback that customers were scared they were actually stealing. The scan-out is more about peace of mind than engineering necessity. There is a subversive pleasure to feeling like, “well, if Inokyo didn’t catch all the stuff I chose, that’s not my problem.” And if you’re overcharged, there’s an in-app support button for getting a refund.

Inokyo co-founders (from left): Tony Francis and Rameez Remsudeen

Inokyo was accurate in what it charged me despite a few switcheroos I pulled with the products I nabbed. But there were only about three people in the room at the time. The real test for these kinds of systems is when a rush of customers floods in and the cameras have to differentiate between multiple similar-looking people. Inokyo will likely need to be more than 99 percent accurate to be more of a help than a headache. An autonomous store that constantly over- or undercharges would be more trouble than it’s worth, and patrons would just go to the nearest classic shop.

Just because autonomous retail stores will be cashierless doesn’t mean they’ll have no staff. To maximize cost-cutting, they could simply trust that people won’t loot them. However, Inokyo plans to have someone minding the shop to make sure people scan in the first place and to answer questions about the process. But there’s also an opportunity in reassigning labor from cashiers to concierges who can recommend the best products or find the right fit for the customer. These stores will be judged by the convenience of the holistic experience, not just the tech. At the very least, a single employee might be able to handle restocking, customer support and store maintenance once freed from cashier duties.

The Amazon Go autonomous retail store in Seattle is equipped with tons of overhead cameras

While Amazon Go uses cameras in a similar way to Inokyo, it also relies on weight sensors to track items. There are plenty of other companies chasing the cashierless dream. China’s BingoBox has nearly $100 million in funding and has more than 300 stores, though they use less sophisticated RFID tags. Fellow Y Combinator startup Standard Cognition has raised $5 million to equip old-school stores with autonomous camera-tech. AiFi does the same, but touts that its cameras can detect abnormal behavior that might signal someone is a shoplifter.

The store of the future seems like more and more of a sure thing. The race’s winner will be determined by who builds the most accurate tracking software, easy-to-install hardware and pleasant overall shopping flow. If this modular technology can cut costs and lines without alienating customers, we could see our local brick-and-mortars adapt quickly. The bigger question than if or even when this future arrives is what it will mean for the millions of workers who make their living running the checkout lane.


Work-Bench enterprise report predicts end of SaaS could be coming

Work-Bench, a New York City venture capital firm that spends a lot of time around Fortune 1000 companies, has put together The Work-Bench Enterprise Almanac: 2018 Edition, which you could think of as a State of the Enterprise report. It’s somewhat like Mary Meeker’s Internet Trends report, but with a focus on the tools and technologies that will be having a major impact on the enterprise in the coming year.

Perhaps the biggest takeaway from the report is that the end of SaaS as we’ve known it could be coming if modern tools make it easier for companies to build software themselves. More on this later.

While the report writers state that their findings are based at least partly on anecdotal evidence, it is clearly an educated set of observations and predictions related to the company’s work with enterprise startups and the large companies they tend to target.

As they wrote in their Medium post launching the report, “Our primary aim is to help founders see the forest from the trees. For Fortune 1000 executives and other players in the ecosystem, it will help cut through the noise and marketing hype to see what really matters.” Whether that’s the case will be in the eye of the reader, but it’s a comprehensive attempt to document the state of the enterprise as they see it, and there are not too many who have done that.

The big picture

The report points out the broader landscape in which enterprise companies — startups and established players alike — are operating today. You have traditional tech companies like Cisco and HP, the mega cloud companies like Amazon, Microsoft and Google, the Growth Guard with companies like Snowflake, DataDog and Sumo Logic, and the New Guard, those early-stage enterprise companies gunning for the more established players.


As the report states, the mega cloud players are having a huge impact on the industry by providing the infrastructure services for startups to launch and grow without worrying about building their own data centers or scaling to meet increasing demand as a company develops.

The mega clouders also scoop up a fair number of startups. Yet they don’t devote quite as much to M&A as you might think, given how acquisitive the likes of Salesforce, Microsoft and Oracle have tended to be over the years. In fact, in spite of all the action and multi-billion-dollar deals we’ve seen, Work-Bench sees room for even more.

It’s worth pointing out that Work-Bench predicts Salesforce itself could become a target for mega cloud M&A action, with either Amazon or Microsoft buying the CRM giant. We saw such speculation several years ago, and it turned out that Salesforce was too rich for even these companies’ blood. While they may have more cash to spend now, the price has probably only gone up as Salesforce acquires more and more companies and its revenue has surpassed $10 billion.

About those mega trends

The report dives into 4 main areas of coverage, none of which are likely to surprise you if you read about the enterprise regularly in this or other publications:

  • Machine Learning
  • Cloud
  • Security
  • SaaS

All of these are really interconnected: SaaS is part of the cloud, all of them need security, and all will be (if they aren’t already) taking advantage of machine learning. Work-Bench is not seeing it in such simple terms, of course, diving into each area in detail.

The biggest takeaway is perhaps that infrastructure could end up devouring SaaS in the long run. Software as a Service grew out of a couple of earlier trends, the first being the rise of the web as a way to deliver software, then the rise of mobile to move it beyond the desktop. The cloud-mobile connection is well documented and allowed companies like Uber and Airbnb, to take just two examples, to flourish by providing scalable infrastructure and a computer in our pockets to access their services whenever we needed them. These companies could never have existed without the combination of cloud-based infrastructure and mobile devices.

End of SaaS dominance?

But today, Work-Bench is saying that we are seeing other trends that could be tipping the scales back toward infrastructure: containers and microservices, serverless, Database as a Service and React for building front ends. Work-Bench argues that if every company is truly a software company, these tools could make it easier for companies to build such services cheaply and easily, and possibly bypass the SaaS vendors.

What’s more, they suggest that if companies are doing mass customization of these services anyway, then it might make more sense to build instead of buy, at least on one level. In the past, we have seen what happens when companies try to take these kinds of massive software projects on themselves, and it has hardly ever ended well. The results were usually bulky, difficult to update and left the companies behind the curve competitively. Whether simplifying the entire developer tool kit would change that remains to be seen.

They don’t necessarily see companies running wholesale away from SaaS just yet to do this, but they do wonder if developers could push this trend inside of organizations as more tools appear on the landscape to make it easier to build your own.

The remainder of the report goes in depth into each of these trends, and this article has just scratched the surface of the information you’ll find there. The entire report is embedded below.


Facebook cracks down on opioid dealers after years of neglect

Facebook’s role in the opioid crisis could become another scandal following yesterday’s release of harrowing new statistics from the Centers for Disease Control and Prevention. It estimated there were nearly 30,000 synthetic opioid overdose deaths in the U.S. in 2017, up from roughly 20,000 the year before. When recreational drugs like Xanax and OxyContin are adulterated with the more powerful synthetic opioid fentanyl, the misdosage can prove fatal. Xanax, OxyContin and other painkillers are often bought online, with dealers promoting themselves on social media, including Facebook.

Hours after the new stats were reported by The New York Times and others, a source spotted that Facebook’s internal search engine stopped returning posts, Pages and Groups for searches of “OxyContin,” “Xanax,” “Fentanyl” and other opioids, as well as other drugs like “LSD.” Only videos, often news reports deploring opiate abuse, and user profiles whose names match the searches, are now returned. This makes it significantly harder for potential buyers or addicts to connect with dealers through Facebook.

However, some dealers have taken to putting drug titles into their Facebook profile names, allowing accounts like “Fentanyl Kingpin Kilo” to continue showing up in search results. It’s not exactly clear when the search changes occurred.

On some search result pages for queries like “buy xanax,” Facebook is now showing a “Can we help?” box that says “If you or someone you know struggles with opioid misuse, we would like to help you find ways to get free and confidential treatment referrals, as well as information about substance use, prevention and recovery.” A “Get support” button opens the site of the Substance Abuse and Mental Health Services Administration, a branch of the U.S. Department of Health and Human Services that provides addiction resources. Facebook had promised back in June that this feature was coming.

Facebook search results for many drug names now only surface people and video news reports, and no longer show posts, Pages or Groups, which often offered access to dealers

When asked, Facebook confirmed that it’s recently made it harder to find content that facilitates the sale of opioids on the social network. Facebook tells me it’s constantly updating its approach to thwart bad actors who look for new ways to bypass its safeguards. The company confirms it’s now removing content violating its drug policies, and it’s blocked hundreds of terms associated with drug sales from showing results other than links to news about drug abuse awareness. It’s also removed thousands of terms from being suggested as searches in its typeahead.
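Conceptually, what Facebook describes amounts to a blocklist applied to both search results and typeahead suggestions. The toy sketch below illustrates that idea only; the term lists, result types and function names are invented and are not Facebook’s implementation.

```python
# Illustrative only: invented term lists and result types, not Facebook's code.
BLOCKED_TERMS = {"oxycontin", "xanax", "fentanyl", "buy xanax"}
ALLOWED_RESULT_TYPES = {"video", "profile"}      # posts, Pages and Groups get suppressed

def filter_results(query, results):
    """For blocked queries, return only videos and profiles; otherwise return everything."""
    if query.lower() in BLOCKED_TERMS:
        return [r for r in results if r["type"] in ALLOWED_RESULT_TYPES]
    return results

def suggest(query, typeahead_terms):
    """Drop blocked terms from typeahead suggestions."""
    return [t for t in typeahead_terms
            if query.lower() in t.lower() and t.lower() not in BLOCKED_TERMS]

results = [
    {"type": "post", "title": "pills for sale, DM me"},
    {"type": "video", "title": "News report on the opioid epidemic"},
    {"type": "profile", "title": "Fentanyl Kingpin Kilo"},
]
print(filter_results("Fentanyl", results))       # only the video and the profile survive
```

Note that the profile named after a drug still gets through, which is exactly the loophole accounts like “Fentanyl Kingpin Kilo” exploit.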

Prior to recent changes, buyers could easily search for drugs and find posts from dealers with phone numbers to contact

Regarding the “Can we help?” box, Facebook tells me this resource will be available on Instagram in the coming weeks, and it provided this statement:

We recently launched the “Get Help Feature” in our Facebook search function that directs people looking for help or attempting to purchase illegal substances to the SAMHSA national helpline. When people search for help with opioid misuse or attempt to buy opioids, they will be prompted with content at the top of the search results page that will ask them if they would like help finding free and confidential treatment referrals. This will then direct them to the SAMHSA National Helpline. We’ve partnered with the Substance Abuse & Mental Health Services Administration to identify these search terms and will continue to review and update to ensure we are showing this information at the most relevant times.

Facebook’s new drug abuse resource feature

The new actions follow Facebook shutting down some hashtags like “#Fentanyl” on Instagram back in April that could let buyers connect with dealers. That only came after activists like Glassbreakers’ Eileen Carey aggressively criticized the company, demanding change. In some cases, when users would report Facebook Groups’ or Pages’ posts as violating its policy prohibiting the sale of regulated goods like drugs, the posts would be removed, but Facebook would leave up the Pages. This mirrors some of the problems it’s had with Infowars around determining the threshold of posts inciting violence or harassing other users necessary to trigger a Page or profile suspension or deletion.

Facebook in some cases deleted posts selling drugs, but not the Pages or Groups carrying them

Before all these changes, users could find tons of vendors illegally selling opioids through posts, photos and Pages on Facebook and Instagram. Facebook also introduced a new ads policy last week requiring addiction treatment centers that want to market to potential patients be certified first to ensure they’re not actually dealers preying on addicts.

Much of the recent criticism facing Facebook has focused on it failing to prevent election interference, privacy scandals and the spread of fake news, plus how hours of browsing its feeds can impact well-being. But its negligence regarding illegal opioid sales has likely contributed to some of the 72,000 drug overdose deaths in America last year. It serves as another example of how Facebook’s fixation on the positive benefits of social networking blinded it to the harsh realities of how its service can be misused.

Last November, Facebook CEO Mark Zuckerberg said that learning of the depths of the opioid crisis was the “biggest surprise” from his listening tour visiting states across the U.S., and that it was “really saddening to see.”

Zuckerberg meets with opioid crisis caregivers and the families of victims in Ohio in April 2017

Five months later, Representative David B. McKinley (R-W.Va.) grilled Zuckerberg about Facebook’s responsibility surrounding the crisis. “Your platform is still being used to circumvent the law and allow people to buy highly addictive drugs without a prescription,” McKinley said during Zuckerberg’s congressional hearings in April. “With all due respect, Facebook is actually enabling an illegal activity, and in so doing, you are hurting people. Would you agree with that statement?” The CEO admitted “there are a number of areas of content that we need to do a better job policing on our service.”

Yet the fact that he called the crisis a “surprise” but failed to take stronger action when some of the drugs causing the epidemic were changing hands via his website is something Facebook hasn’t fully atoned for, nor done enough to stop. The new changes should be the start of a long road to recovery for Facebook itself.


Twitter company email addresses why it’s #BreakingMyTwitter

It’s hard to be a fan of Twitter right now. The company is sticking up for conspiracy theorist Alex Jones when nearly all other platforms have given him the boot, it’s overrun with bots, and now it’s breaking users’ favorite third-party Twitter clients like Tweetbot and Twitterrific by shutting off the APIs those apps relied on. Worse still is that Twitter isn’t taking full responsibility for its decisions.

In a company email it shared today, Twitter cited “technical and business constraints” that it can no longer ignore as being the reason behind the APIs’ shutdown.

It said the clients relied on “legacy technology” that was still in a “beta state” after more than 9 years, and had to be killed “out of operational necessity.”

This reads like passing the buck. Big time.

It’s not as if there’s some other mysterious force that maintains Twitter’s API platform, and now poor ol’ Twitter is forced to shut down old technology because there’s simply no other recourse. No.

Twitter, in fact, is the one responsible for its User Streams and Site Streams APIs – the APIs that serve the core functions of these now deprecated third-party Twitter clients. Twitter is the reason these APIs have been stuck in a beta state for nearly a decade. Twitter is the one that decided not to invest in supporting those legacy APIs, or shift them over to its new API platform.

And Twitter is the one that decided to give up on some of its oldest and most avid fans – the power users and the developer community that met their needs – in hopes of shifting everyone over to its own first-party clients instead.

The problem isn’t that the API is old and buggy (which it was), the problem is that the replacement API is unavailable.

— Paul Haddad (@tapbot_paul) August 16, 2018

The company even dismissed how important these users and developers have been to its community over the years, by citing the fact that the APIs it’s terminating – the ones that power Tweetbot, Twitterrific, Tweetings and Talon – are only used by “less than 1%” of Twitter developers. Burn! 

Way to kick a guy when he’s already down, Twitter.

But just because a community is small in numbers, does not mean its voice is not powerful or its influence is not felt.

Hence, the #BreakingMyTwitter hashtag, which Twitter claims to be watching “quite often.”

The one where users are reminding Twitter CEO Jack Dorsey about that time he apologized to Twitter developers for not listening to them, and acknowledged the fact they made Twitter what it is today. The time when he promised to do better.

This is…not better:

When I built our push notification server, I added the ability to send a message to every device in case of emergency. Today is the first time I’ve used it. pic.twitter.com/edgkver2Nh

— Craig Hockenberry (@chockenberry) August 15, 2018

The company’s email also says it hopes to eventually learn “why people hire 3rd party clients over our own apps.”

Its own apps?

Oh, you mean like TweetDeck, the app Twitter acquired then shut down on Android, iPhone and Windows? The one it generally acted like it forgot it owned? Or maybe you mean Twitter for Mac (previously Tweetie, before its acquisition), the app it shut down this year, telling Mac users to just use the web instead? Or maybe you mean the nearly full slate of TV apps that Twitter decided no longer needed to exist?

And Twitter wonders why users don’t want to use its own clients?

Perhaps users want a consistent experience — one that doesn’t involve a million inconsequential product changes like turning stars to hearts or changing the character counter to a circle. Maybe they appreciate the fact that the third parties seem to understand what Twitter is better than Twitter itself does: Twitter has always been about a real-time stream of information. It’s not meant to be another Facebook-style algorithmic News Feed. The third-party clients respect that. Twitter does not.

Yesterday, the makers of Twitterrific addressed the API changes, noting that their app would no longer be able to stream tweets, send native push notifications or update its Today view, and that new tweets and DMs will be delayed.

It recommended users download Twitter’s official mobile app for notifications going forward.

In other words, while Twitterific will hang around in its broken state, its customers will now have to run two Twitter apps on their device – the official one to get their notifications, and the other because they prefer the experience.

A guide to using Twitter’s app for notifications, from Iconfactory

“We understand why Twitter feels the need to update its API endpoints,” explains Iconfactory co-founder Ged Maheux, whose company makes Twitterrific. “The spread of bots, spam and trolls by bad actors that exploit their systems is bad for the entire Twitterverse, we just wish they had offered an affordable way forward for the developers of smaller, third party apps like ours.”

“Apps like the Iconfactory’s Twitterrific helped build Twitter’s brand, feature sets and even its terminology into what it is today. Our contributions were small to be sure, but real nonetheless. To be priced out of the future of Twitter after all of our history together is a tough pill to swallow for all of us,” he added.

The question many users are now facing is: what to do next?

Continue to use now broken third-party apps? Move to an open platform like Mastodon? Switch to Twitter’s own clients, as it wants, where it plans to “experiment with showing alternative viewpoints” to pop people’s echo chambers…on a service that refuses to kick out people like Alex Jones?

Or maybe it’s time to admit the open forum for everything that Twitter – and social media, really – has promised is failing? Maybe it’s time to close the apps – third-party and otherwise. Maybe it’s time to go dark. Get off the feeds. Take a break. Move on.

The full email from Twitter is below:

Hi team,

Today, we’re publishing a blog post about our priorities for where we’re investing today in Twitter client experiences. I wanted to share some more with you about how we reached these decisions, and how we’re thinking about 3rd party clients specifically.

First, some history:

3rd party clients have had a notable impact on the Twitter service and the products we build. Independent developers built the first Twitter client for Mac and the first native app for iPhone. These clients pioneered product features we all know and love about Twitter, like mute, the pull-to-refresh gesture, and more.

We love that developers build experiences on our APIs to push our service, technology, and the public conversation forward. We deeply respect the time, energy, and passion they’ve put into building amazing things using Twitter.

But we haven’t always done a good job of being straightforward with developers about the decisions we make regarding 3rd party clients. In 2011, we told developers (in an email) not to build apps that mimic the core Twitter experience. In 2012, we announced changes to our developer policies intended to make these limitations clearer by capping the number of users allowed for a 3rd party client. And, in the years following those announcements, we’ve told developers repeatedly that our roadmap for our APIs does not prioritize client use cases — even as we’ve continued to maintain a couple specific APIs used heavily by these clients and quietly granted user cap exceptions to the clients that needed them.

It is now time to make the hard decision to end support for these legacy APIs — acknowledging that some aspects of these apps would be degraded as a result. Today, we are facing technical and business constraints we can’t ignore. The User Streams and Site Streams APIs that serve core functions of many of these clients have been in a “beta” state for more than 9 years, and are built on a technology stack we no longer support. We’re not changing our rules, or setting out to “kill” 3rd party clients; but we are killing, out of operational necessity, some of the legacy APIs that power some features of those clients. And it has not been a realistic option for us today to invest in building a totally new service to replace these APIs, which are used by less than 1% of Twitter developers.

We’ve heard the feedback from our customers about the pain this causes. We check out #BreakingMyTwitter quite often and have spoken with many of the developers of major 3rd party clients to understand their needs and concerns. We’re committed to understanding why people hire 3rd party clients over our own apps. And we’re going to try to do better with communicating these changes honestly and clearly to developers. We have a lot of work to do. This change is a hard, but important step, towards doing it. Thank you for working with us to get there.

Thanks,

Rob


Cisco’s $2.35 billion Duo acquisition front and center at earnings call

When Cisco bought the Ann Arbor, Michigan-based security company Duo for a whopping $2.35 billion earlier this month, it showed the growing value of security and security startups in the eyes of traditional tech companies like Cisco.

In yesterday’s earnings report, even before the ink had dried on the Duo acquisition contract, Cisco was reporting that its security business grew 12 percent year over year to $627 million. Given those numbers, the acquisition was top of mind in CEO Chuck Robbins’ comments to analysts.

“We recently announced our intent to acquire Duo Security to extend our intent-based networking portfolio into multi-cloud environments. Duo’s SaaS-delivered solution will expand our cloud security capabilities to help enable any user on any device to securely connect to any application on any network,” he told analysts.

Indeed, security is going to continue to take center stage moving forward. “Security continues to be our customers number one concern and it is a top priority for us. Our strategy is to simplify and increase security efficacy through an architectural approach with products that work together and share analytics and actionable threat intelligence,” Robbins said.

That fits neatly with the Duo acquisition, whose guiding philosophy has been to simplify security. It is perhaps best known for its two-factor authentication tool. Often companies send a text with a code to your phone after you enter your password to prove it’s you, but even that method has proven vulnerable to attack.

What Duo does is send a message through its app to your phone asking if you are trying to sign on. You can approve if it’s you or deny if it’s not, and if you can’t get the message for some reason you can call instead to get approval. It can also verify the health of the app before granting access to a user. It’s a fairly painless and secure way to implement two-factor authentication, while making sure employees keep their software up-to-date.
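As a rough illustration of that approve/deny pattern, here is a minimal push-based second-factor sketch; the function names, in-memory store and timeout are invented, and the real Duo service additionally checks device health, supports phone-call fallback and secures the push channel.

```python
import secrets
import time

pending = {}                                     # challenge_id -> login attempt state

def start_login(user, password_ok):
    """After the password check, push an approve/deny prompt to the user's phone."""
    if not password_ok:
        return None
    challenge_id = secrets.token_urlsafe(16)
    pending[challenge_id] = {"user": user, "approved": None, "created": time.time()}
    send_push(user, challenge_id)                # delivered through the authenticator app
    return challenge_id

def send_push(user, challenge_id):
    print(f"[push to {user}'s phone] Are you trying to sign in? ({challenge_id})")

def respond(challenge_id, approved):
    """Called when the user taps Approve or Deny in the app."""
    pending[challenge_id]["approved"] = approved

def finish_login(challenge_id, timeout=60):
    """Grant access only if the prompt was approved and hasn't expired."""
    entry = pending.get(challenge_id)
    if not entry or time.time() - entry["created"] > timeout:
        return False
    return entry["approved"] is True

cid = start_login("alice", password_ok=True)
respond(cid, approved=True)                      # the user taps Approve
print(finish_login(cid))                         # True -> the session is created
```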

Duo’s Approve/Deny tool in action on a smartphone.

While Cisco’s security revenue accounted for a fraction of the company’s overall $12.8 billion for the quarter, the company clearly sees security as an area that could continue to grow.

Cisco hasn’t been shy about using its substantial cash holdings to expand in areas like security beyond pure networking hardware to provide a more diverse recurring revenue stream. The company currently has over $54 billion in cash on hand, according to Y Charts.

Cisco spent a fair amount of money on Duo, which according to reports has $100 million in annual recurring revenue, a number that is expected to continue to grow substantially. It had raised over $121 million in venture investment since inception. In its last funding round in September 2017, the company raised $70 million at a valuation of $1.19 billion.

The acquisition price ended up more than doubling that valuation. That could be because it’s a security company with recurring revenue, and Cisco clearly wanted it badly as another piece in its security solutions portfolio, one it hopes can help keep pushing that security revenue needle ever higher.


Powered by $25 million, Arcadia Power looks to expand its distributed renewable energy services

As renewable energy use surges in the U.S. and the effects of global climate change become more visible, companies like Arcadia Power are pitching a nationwide service to make renewable energy available to residential customers.

While states like New York, California and regions across the upper Midwest have access to renewable energy through their utilities and competitive marketplaces, not all states in the country have utilities that are building renewable power generation to offset coal and natural gas energy production.

Enter Arcadia Power and its new $25 million in financing, which will be used to redouble its marketing efforts and expand its array of services in the U.S.

Right now, renewable energy is the fastest growing component of the U.S. energy mix. It’s grown from 15 percent to 18 percent of all power generation in the country, according to a 2018 report from Business Council for Sustainable Energy and Bloomberg New Energy Finance.

And while Arcadia Power is only accounting for 120 megawatts of the 2.9 gigawatts of new renewable energy projects initiated since 2017, its new $25 million in financing will help power new projects.

When we first wrote about the company in 2016, it was just developing solar projects that would generate power for the grid to offset electricity usage from its customers.

Now the company is expanding its array of services. All customers are automatically enrolled in a 50 percent wind energy offset program, in which half of their monthly usage is matched with investments in wind farms — and they can upgrade to fully offset their energy usage with wind power. Meanwhile, community solar projects are also available for free, or customers can purchase a panel and receive guaranteed solar savings on each monthly power bill.

Reduced prices are given to customers through the consolidation of their buying power across multiple competitive energy markets.

Finally, Arcadia is offering new home efficiency upgrades like LED lighting and smart thermostats, along with smart metering and tracking services to improve customers’ payment options, the company said.

“The electricity industry hasn’t changed much in the last hundred years, and we believe that homeowners and renters want a new approach that puts them first. Our platform places clean energy, home efficiency and data insights front and center for residential energy customers in all 50 states,” said chief executive Kiran Bhatraju.

Kiran Bhatraju, chief executive officer of Arcadia Power

The new Arcadia Power financing was led by G2VP, the investment firm that spun out of Kleiner Perkins’ cleantech investing practice, with participation from ValueAct Spring Fund, McKnight Foundation, Energy Impact Partners, Cendana Capital, Wonder Ventures, BoxGroup and existing investors, according to the company. As a result of the investment, Alex Laskey, Opower’s founder and president; Ben Kortlang, a partner at G2VP; and Dan Leff, a longtime investor in energy technology companies, will all join the Arcadia board of directors.

“We’re taking a piece of the savings that is a part of the power purchase agreement,” says Bhatraju. “Customers get a 5 percent guaranteed savings against the utility rate. In competitive markets like Ohio or Maryland, it’s a shared savings model.”
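To make that arithmetic concrete, here is a toy bill calculation; the 50 percent wind match and 5 percent guaranteed savings come from the article, while the usage figure and rate are invented.

```python
def monthly_summary(kwh_used, utility_rate, wind_offset=0.5, guaranteed_savings=0.05):
    """Illustrative bill math: half of usage matched with wind, 5% off the utility rate."""
    base_bill = kwh_used * utility_rate
    arcadia_bill = base_bill * (1 - guaranteed_savings)
    return {
        "kwh_matched_with_wind": kwh_used * wind_offset,
        "utility_bill": round(base_bill, 2),
        "arcadia_bill": round(arcadia_bill, 2),
        "savings": round(base_bill - arcadia_bill, 2),
    }

# Hypothetical household: 900 kWh in a month at $0.13/kWh.
print(monthly_summary(900, 0.13))
# {'kwh_matched_with_wind': 450.0, 'utility_bill': 117.0, 'arcadia_bill': 111.15, 'savings': 5.85}
```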

Beyond the savings, the offsets can do something to reduce the carbon emissions that are exacerbating the problems of global climate change.

“When you build community solar projects, you are displacing former fossil fuel plants from being used because of these customers,” Bhatraju said. But the entrepreneur recognizes that they have a long way to go to make a difference. “120 MW is not nearly enough,” Bhatraju said. “We’ve got a long way to go.”


Shelf Engine uses machine learning to stop food waste from eating into store margins

Shelf Engine’s team

While running Molly’s, the Seattle-based ready-meal wholesaler he founded, Stefan Kalb was upset about its 28 percent food wastage rate. Feeling that the amount was “astronomical,” he began researching how to lower it — and was shocked to discover that Molly’s was actually outperforming the industry average. Confronted by the sheer amount of food wasted by American retailers, Kalb and Bede Jordan, then a Microsoft engineer, began working on an order prediction engine.

The project quickly brought Molly’s percentage of wasted food down to the mid-teens. “It was one of the most fulfilling things I’ve ever done in my career,” Kalb told TechCrunch in an interview. Driven by its success, Kalb and Jordan launched Shelf Engine in 2016 to make the technology available to other companies. Currently participating in Y Combinator, the startup has already raised $800,000 in seed funding from Initialized Capital, the venture capital firm founded by Alexis Ohanian and Garry Tan, and is now used at more than 180 retail points by clients including WeWork, Bartell Drugs, Natural Grocers and StockBox.

Shelf Engine’s order prediction engine analyzes historical order and sales data and makes recommendations about how much retailers should order to minimize waste and increase margins. The more retailers use Shelf Engine, the more accurate its machine learning model becomes. The system also helps suppliers, because many operate on guaranteed sales, or scan-based trading, which means they agree to take back and refund the purchase price of any products that don’t sell by their expiration date. While running Molly’s, Kalb learned what a huge pain point this is for suppliers. To alleviate that, Shelf Engine itself buys back unsold inventory from the retailers it works with, taking the risk away from their suppliers.

Kalb, Shelf Engine’s CEO, claims the startup’s customers are able to increase their gross margins by 25 percent and reduce food waste from an industry average of 30 percent to about 16-18 percent for items that expire within one to five days. (For items with a shelf life of up to 45 days, the longest that Shelf Engine manages, it can reduce waste to as little as 3-4 percent).

The food industry operates on notoriously tight margins, and Shelf Engine wants to relieve some of the pressure. Running Molly’s, which supplies corporate campuses including Microsoft, Boeing and Amazon, gave Kalb a firsthand look at the paradox faced by retail managers. Even though a lot of food is wasted, items are also frequently out of stock at stores, annoying customers. Then there is the social and environmental impact of food waste — not only does it raise prices, but food rotting in landfills is also a major contributor to methane emissions.

A store manager may need to make ordering decisions about thousands of products, leaving little time for analysis. Though there are enterprise resource planning software products for food retail, Kalb says that during store visits he realized a surprisingly high number of stores still rely on Excel spreadsheets or pen and paper to manage recurring orders. The process is also highly subjective, with managers ordering products based on their personal preferences, a customer’s suggestion or what they’ve noticed does well at other stores. Sometimes retailers get stuck in a cycle of overcorrecting: if customers complain about missing out on something, managers order more inventory, only to end up with wastage, then scale back their next order, and so on.

“Americans want selection at all times, we get furious when a product is sold out, but it’s a really hard decision to make about how much challah bread to stock on a Monday,” says Kalb. “Yet we are doing that ad hoc.”

When retailers use Shelf Engine’s prediction engine, it decides how many units they need and then submits those orders to their suppliers. After products reach their sell-by dates, the retailer reports back to Shelf Engine, which only charges them for units they sold, but still pays suppliers for the full order. As time passes, Shelf Engine can make more granular predictions (for example, how precipitation correlates with the sale of specific items like juice or bread).
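A stripped-down sketch of that order-and-settle loop might look like the following; the demand estimate here is a naive mean-plus-one-standard-deviation rule over invented sales numbers, whereas Shelf Engine’s real model is proprietary and folds in signals like weather.

```python
import statistics

def recommend_order(daily_sales_history, shelf_life_days, safety_factor=1.0):
    """Naive demand estimate: mean daily sales plus one standard deviation, over the shelf life."""
    mean = statistics.mean(daily_sales_history)
    stdev = statistics.pstdev(daily_sales_history)
    return round((mean + safety_factor * stdev) * shelf_life_days)

def settle(order_qty, units_sold, wholesale_cost):
    """Retailer pays only for what sold; Shelf Engine still pays the supplier for the full order."""
    return {
        "retailer_pays": units_sold * wholesale_cost,
        "supplier_receives": order_qty * wholesale_cost,
        "shelf_engine_absorbs": (order_qty - units_sold) * wholesale_cost,
    }

history = [14, 18, 11, 16, 20, 13, 15]           # loaves sold per day over the past week (invented)
qty = recommend_order(history, shelf_life_days=2)
print(qty, settle(qty, units_sold=28, wholesale_cost=2.50))
```

The interesting business twist is in the settle step: the unsold-unit risk moves off the retailer and the supplier and onto Shelf Engine, which is what gives it a direct financial incentive to keep improving the forecast.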

In addition to providing the impetus for the creation of Shelf Engine, Molly’s also helped Kalb and Jordan, its CTO, build the startup’s distribution network. Kalb says Shelf Engine has benefited from the network effect, because when a retailer signs up, their suppliers will often mention it to other retailers that they serve. Kalb says the startup is currently hiring more engineers and salespeople to help Shelf Engine leverage that and spread through the food retail industry.

“It’s a world I got to know and I came into the world fascinated with healthy food and making delicious grab-and-go meals,” says Kalb. “It turned into a fascination with this crazy market, which is so massive and still has so many opportunities to be maximized.”


Coinbase acquires Distributed Systems to build ‘Login with Coinbase’

Coinbase wants to be Facebook Connect for crypto. The blockchain giant plans to develop “Login with Coinbase” or a similar identity platform for decentralized app developers to make it much easier for users to sign up and connect their crypto wallets. To fuel that platform, today Coinbase announced it has acquired Distributed Systems, a startup founded in 2015 that was building an identity standard for dApps called the Clear Protocol.

The five-person Distributed Systems team and its technology will join Coinbase. Three of the team members will work with Coinbase’s Toshi decentralized mobile browser team, while CEO Nikhil Srinivasan and his co-founder Alex Kern are forming the new decentralized identity team that will work on the Login with Coinbase product. They’ll be building it atop the “know your customer” anti-money laundering data Coinbase has on its 20 million customers. Srinivasan tells me the goal is to figure out “How can we allow that really rich identity data to enable a new class of applications?”

Distributed Systems had raised a $1.7 million seed round last year led by Floodgate and was considering raising a $4 million to $8 million round this summer. But Srinivasan says, “No one really understood what we’re building,” and it wanted a partner with KYC data. It began talking to Coinbase Ventures about an investment, but after they saw Distributed Systems’ progress and vision, “they quickly tried to move to find a way to acquire us.”

Distributed Systems began to hold acquisition talks with multiple major players in the blockchain space, and the CEO tells me it was deciding between going to “Facebook, or Robinhood, or Binance, or Coinbase,” having been in formal talks with at least one of the first three. Of Coinbase the CEO said, they “were able to convince us they were making big bets, weaving identity across their products.” The financial terms of the deal weren’t disclosed.

Coinbase’s plan for rolling out the Login with Coinbase-style platform is an SDK that other apps could integrate, though that won’t necessarily be the feature’s name. That mimics the way Facebook colonized the web with its SDK and login buttons that splashed its brand in front of tons of new and existing users. This turned Facebook into a fundamental identity utility beyond its social network.

Developers eager to improve conversions on their signup flow could turn to Coinbase instead of requiring users to set up whole new accounts and deal with crypto-specific headaches of complicated keys and procedures for connecting their wallet to make payments. One prominent dApp developer told me yesterday that forcing users to set up the MetaMask browser extension for identity was the part of their signup flow where they’re losing the most people.
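Since nothing has shipped yet, the sketch below is purely hypothetical: an OAuth-style redirect flow with invented endpoints, scopes and helper names, meant only to illustrate the kind of integration a “Login with Coinbase” SDK could wrap for a dApp in place of the browser-extension dance described above.

```python
# Hypothetical sketch only: the endpoint, scopes and field names are invented, and no real
# Coinbase API is implied. It shows the shape of an OAuth-style dApp login, nothing more.
import secrets
from urllib.parse import urlencode

AUTH_URL = "https://identity.example.test/oauth/authorize"    # placeholder, not a real endpoint

def build_login_url(client_id, redirect_uri, scopes=("identity", "wallet_address")):
    """Step 1: send the user to the identity provider instead of making them manage keys."""
    state = secrets.token_urlsafe(16)            # anti-CSRF token the dApp verifies on return
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
        "state": state,
    }
    return f"{AUTH_URL}?{urlencode(params)}", state

def handle_callback(returned_state, expected_state, code, exchange_code):
    """Step 2: verify state, then trade the one-time code for a verified identity and wallet."""
    if returned_state != expected_state:
        raise ValueError("state mismatch; possible CSRF")
    return exchange_code(code)                   # e.g. {"user_id": ..., "wallet_address": ...}

url, state = build_login_url("my-dapp", "https://mydapp.example/callback")
print(url)
```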

This morning Coinbase CEO Brian Armstrong confirmed these plans to work on an identity SDK. When Coinbase investor Garry Tan of Initialized Capital wrote that “The main issue preventing dApp adoption is lack of native SDK so you can just download a mobile app and a clean fiat to crypto in one clean UX. Still have to download a browser plugin and transfer Eth to Metamask for now Too much friction,” Armstrong replied “On it :)”

On it 🙂

— Brian Armstrong (@brian_armstrong) August 15, 2018

In effect, Coinbase and Distributed Systems could build a safer version of identity than we get offline. As soon as you give your Social Security number to someone or it gets stolen, it can be used anywhere without your consent, and that leads to identity theft. Coinbase wants to build a vision of identity where you can connect to decentralized apps while retaining control. “Decentralized identity will let you prove that you own an identity, or that you have a relationship with the Social Security Administration, without making a copy of that identity,” writes Coinbase’s PM for identity B. Byrne, who’ll oversee Srinivasan’s new decentralized identity team. “If you stretch your imagination a little further, you can imagine this applying to your photos, social media posts, and maybe one day your passport too.”

Considering Distributed Systems and Coinbase are following the Facebook playbook, they may soon have competition from the social network itself. Facebook has spun up its own blockchain team, and an identity and single sign-on platform for dApps is one of the products I think it is most likely to build. But given Coinbase’s strong reputation in the blockchain industry and its massive head start in registered crypto users, today’s acquisition positions it well to be how we connect our offline identities with the rising decentralized economy.


Square can now process chip cards in two seconds

If you’ve made any payments with a chip card, you’ve probably had awkward moments — those long seconds after you’ve inserted the card and everyone behind you is (literally or metaphorically) tapping their foot, waiting for the card to be processed.

Well, Square has been working on this problem for a while now. Last fall, for example, CEO Jack Dorsey said the company had gotten the processing time down to under three seconds.

Today, the company is announcing that it’s shaved even more time off, and that Square contactless and chip Readers and Registers can now process chip cards in two seconds. To achieve this, it says it’s worked closely with payment partners — and it’s also streamlined the process so that you can remove your card as soon as it’s read, without waiting for the response from the card issuer.
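Schematically, the perceived speed-up comes from letting the card come out as soon as the chip read finishes, while the issuer’s authorization completes afterward. The sketch below uses invented timings and is an illustration of that ordering, not Square’s firmware.

```python
import asyncio

async def read_chip():
    await asyncio.sleep(0.8)                     # invented: time to read the chip and build the cryptogram
    print("Card read complete -- customer can remove the card")
    return "cryptogram"

async def authorize_with_issuer(cryptogram):
    await asyncio.sleep(1.2)                     # invented: network round trip to the card issuer
    return "approved"

async def transaction():
    cryptogram = await read_chip()               # the card only has to stay inserted for this part
    result = await authorize_with_issuer(cryptogram)   # finishes after the card is already out
    print(f"Issuer response: {result}")

asyncio.run(transaction())
```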

In contrast, when the Wall Street Journal timed chip cards in over 50 transactions a couple years ago, it found that the average processing time was 13 seconds. Those extra seconds might not sound like much in theory, but again, if you’re in a hurry or you’ve got a line of people behind you, the wait can be painful.

Plus, it sounds like this can make a real difference for businesses. In the announcement, Regan Long, co-founder and brewmaster at Local Brewing Co., said that with his brewery’s location near the Giants’ AT&T Park in San Francisco, there’s usually “a rush of customers all ready to close out their open beer tabs at the same time.”

“With Square’s chip card reader update, we’ve cut processing time in half — helping us keep customers happy and on their way to catch the first pitch,” he added.

In addition to faster chip card processing, Square is making another speed-related announcement: With the latest update, Square’s free point-of-sale app will allow sellers to skip collecting signatures if they choose.
