Kurvv Blog

How to charge for a POC when free is the norm?

Aug 23, 2021 5:08:29 PM / by Ryan Lee posted in AI, ideas, startup

TL;DR - To be able to charge for POCs in markets that typically do not pay, structure your POCs so that the customer gets value even if things don't work out. And don't call it a POC.

Read More

Why is Enterprise AI adoption taking so long?

Jun 29, 2021 4:50:33 PM / by Jeff Croft posted in Machine Learning, AI, Kurvv, ideas, leadership, educational

Digital Transformation!! AI ML! 5G! IOT!! Intelligent Edge! Insert buzzwords.

The promise of the transformative power of connected devices, mountains of company and customer data, and using "AI" to optimize everything has been a selling point of all technology companies for the last 20+ years. So why hasn't it happened yet? Where is the transformation?

The answer to that question is nuanced, and not uniform, but there are a lot of themes I have seen in the last 11 years working in this industry, and even more so in the last year leading these conversations at Kurvv.ai. Here are a few of the prevailing themes I think are plaguing the mass adoption of AI.

Data Issue:
Data integrity issues are well known and widely covered within the technology community, but one of the first things you realize when you step outside of that bubble is that a very large number of enterprise organizations do not have a common or structured practice for collecting, storing, and utilizing data across their organizations. One team is using one system, others are using something else, and they aren't tied together in any way. You want to cross-reference data from multiple teams and business units? Great: everyone export their data to .CSV, let's combine it in a massive spreadsheet, run a very clunky pivot table and some light analysis, and see what stands out. This is the picture at some of the most profitable companies in the world. Why does it look like this? I thought we were all #DigitallyTransforming?
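For the curious, that export-and-pivot workflow is easy to sketch in a few lines of pandas. The table shapes and column names (region, quarter, revenue, cost) are hypothetical placeholders; in practice each team would export its own CSV, but tiny samples are inlined here so the sketch runs as-is.

```python
# A sketch of the ad-hoc cross-team analysis described above.
# Column names and numbers are made up for illustration.
import pandas as pd

# Two teams, two systems, two exports.
sales = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [120, 150, 90, 110],
})
ops = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "cost": [80, 95, 70, 75],
})

# Combine the exports into one "massive spreadsheet"...
combined = sales.merge(ops, on=["region", "quarter"], how="outer")

# ...then run the clunky pivot table to see what stands out.
pivot = combined.pivot_table(index="region", columns="quarter",
                             values=["revenue", "cost"], aggfunc="sum")
print(pivot)
```

That this one-off script is state-of-the-art analytics at many large enterprises is exactly the point.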

Sure, they have IT teams keeping the lights on, and may have even migrated some or all of their workloads to the "Cloud", but other than a shift in IT cost from CapEx to OpEx, how they operate their business really hasn't changed. (Side bar: I could write a whole article on how this extremely slow transformation is actually great for the big tech companies, but that's for another day.) They still rely on vast amounts of employee expertise and intuition to make decisions. When data IS used to make decisions, it's snapshot data from outcome metrics. This has worked for the last 100 years and continues to work; a lot of these companies have had their best years ever recently. So many industries, and leaders, don't feel the need to ruthlessly measure every aspect of their business, to quantify, log, and analyze at a deep, deep level. Why go through the hassle? Measuring just some of what matters is still delivering in a big way.

Prospect Theory:
Borrowing an idea from the field of Behavioral Economics, this was developed by Daniel Kahneman and Amos Tversky in 1979. It may help shed some light on why leaders are NOT adopting Enterprise Machine Learning and AI use cases as fast as we might expect, and instead opting for slow incremental progress. To grossly oversimplify the theory, it says that when faced with a risky choice leading to gains, individuals are risk-averse, preferring solutions that lead to a lower expected utility but with a higher certainty. A bit counterintuitive, but humans don't make completely rational decisions and don't choose options with maximal utility, despite what they would have you believe.
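To make the risk-aversion-in-gains claim concrete, here is a toy calculation using the value function from Tversky and Kahneman's 1992 follow-up work (with their estimated parameters, roughly alpha = 0.88 and lambda = 2.25). The specific $50-versus-gamble choice is an invented example, and probability weighting is deliberately ignored to keep the sketch minimal.

```python
# Toy illustration of risk aversion for gains under prospect theory.
# v(x) = x**alpha for gains, -lam * (-x)**alpha for losses
# (parameters roughly per Tversky & Kahneman, 1992).
ALPHA, LAM = 0.88, 2.25

def value(x: float) -> float:
    """Subjective value of a gain or loss of x dollars."""
    return x ** ALPHA if x >= 0 else -LAM * (-x) ** ALPHA

# Choice A: a sure $50. Choice B: 50% chance of $100, 50% chance of $0.
# Both have the same expected monetary value of $50...
sure_thing = value(50)
gamble = 0.5 * value(100) + 0.5 * value(0)

# ...but the certain option carries higher subjective value, so the
# modeled decision-maker picks it: risk aversion in the domain of gains.
print(sure_thing > gamble)
```

The lambda parameter also encodes loss aversion, the part of the theory that explains why leaders risk far more to avoid losses than to achieve gains.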

This came to me, as it relates to AI adoption, not while reading Kahneman & Tversky, but while reading On Grand Strategy, a book by John Lewis Gaddis. In it, he suggests that leaders of empires risk much more to avoid losses than to achieve gains. (In the specific example, he uses King Philip of Spain.) Immediately, this stuck out as exactly what I was hearing from leaders in certain industries, as well as from technology leaders trying to build solutions for those industries.

When given the choice between maintaining the status quo with guaranteed steady growth, versus significantly investing in the promise of double-digit efficiency, revenue, cost savings... (insert KPI of your choice), leaders are choosing the former. Why? They are still winning without the need to push the technological envelope. As one CEO told me:

"I believe AI will reshape our industry, but we have just had our best quarter ever... 4 quarters in a row. I guess I don't see the rush. I can always add it later as the business requires it."
They are not wrong.

Every company and industry is different, but the traditional industries that have yet to harness the power of their data sure do have a lot in common. Some are moving faster than others; manufacturing and oil & gas seem to be exploring the possibilities of how AI can accelerate and optimize their companies.

COVID, supply chain issues, and unpredictable consumer demand have been the driving forces putting pressure on some of these industries of late. Innovation and adoption are bred more out of necessity than desire.

Executive Understanding:
CEOs come in a kaleidoscope of abilities and experiences, from cutting-edge visionaries to game managers whose sole mission is "Don't mess it up." What very few have is a true understanding of what AI means or how it could apply to their business or industry. Maybe it's because most people think of AI in terms of Westworld, a chess computer, or HAL 9000. How do you articulate the vastness of all the components that fall under the "AI" umbrella? Well, for starters, perhaps we can stop using "AI" as a catch-all. Be more specific: machine learning, predictive maintenance, operational intelligence, etc.

If you were asked to explain AI or any of its components to a 5-year-old, how would you do it? Would you use mostly jargon or basic concepts? People who work in technology tend to over-explain and over-complicate concepts when relaying them to people without a tech background. We lose them. (Myself included.)

Here is my current explanation for machine learning. (It assumes you have seen the Marvel movies, but it's understandable even if you haven't.)

"Have you seen the Marvel movie, Avengers EndGame? Great. Do you remember when Dr. Strange had to view all 14 Million possible outcomes to find the one way to win? That is kind of how machine learning uses your company data to learn and gain the ability to make highly accurate predictions. The computer has run millions of tests and scenarios to learn and recommend the optimal path."

Is it perfect? No, but it's simple. It gets the point across. It demystifies the black box of AI in just a few minutes, in a way most people can understand.

CEOs and Industry leaders certainly need to be educated on the basics, but we can do a much better job in how we talk about it and explain the concepts behind the benefits that we so easily throw around.

When will it happen?
When will the broad adoption of Enterprise AI take place? Slowly, then all at once.

As with most technological advancements, adoption follows a curve. The time frames differ, but the cycle is the same.

[Image: the technology adoption lifecycle curve]
Currently, I believe we are barely entering the Early Adopters stage; I think fewer than 10% of enterprises have a meaningful AI practice or system in place. A few years ago, when I was at Microsoft, the number I heard most was that 5% of enterprises were actively using meaningful AI to help make business decisions.

If we are still very early in the Early Adopters section of the graph, then we are still a few years away.
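The "slowly, then all at once" shape is essentially a logistic S-curve. Here is a minimal sketch; the midpoint year and steepness are invented numbers chosen so the 2021 value roughly matches the sub-10% estimate above, not a forecast.

```python
# Minimal sketch of an S-shaped adoption curve as a logistic function.
# Midpoint and steepness are illustrative assumptions, not predictions.
import math

def adoption(year: float, midpoint: float = 2030.0,
             steepness: float = 0.3) -> float:
    """Modeled fraction of enterprises that have adopted by a given year."""
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

# Flat for years, then the curve turns sharply upward around the midpoint.
for y in (2021, 2025, 2030, 2035):
    print(y, round(adoption(y), 3))
```

The flat left tail is where the argument above places us today; the steep middle is the "all at once" part.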

How long could it be?

Mobile phones were introduced in the 1940s and reached broad adoption in the 1990s.
Smartphones were introduced in the 1990s and reached broad adoption in the 2010s.
Personal computers were introduced in the 1970s and reached broad adoption in the early 2000s.
Adoption is usually event-driven. For mobile phones, it was coverage and cost coming down low enough to reach everyday people. For smartphones, it was Steve Jobs making them a consumer product and creating a whole new category. For personal computers, it was the internet rollout and falling cost.

So what will it be for Enterprise AI? My best guess is that it will happen when AI becomes hands-off: simple to buy, simple to use, and delivering clear, actionable results. No need for in-house data science teams; just plug and play.

There are, of course, a lot of challenges to overcome before that can happen, but that is what it will take, especially in the technology-laggard industries.

We are still very early and will continue to build, chip away, and launch new applications and use cases. It will happen when the market creates the right conditions and incentives, not before.

We have a lot of work to do, but it does feel like the tide is starting to turn. Leaders are open to learning more, getting their data in order, trying a POC. That's how it starts. We need to operate on a proof-of-work standard: no more overpromising the magic of AI only to fall short in the real world. Less hype, more results. Be undeniably better, and so smooth to implement that it becomes a no-brainer.

The future is just getting started.

Read More

COVID-19 Progression by Country & Attributes

May 15, 2020 1:26:48 PM / by Soo Lee posted in COVID-19, Machine Learning, AI, Kurvv

The COVID-19 virus, first reported in the Chinese city of Wuhan in December, has spread to 180 countries, according to data compiled by Johns Hopkins University. As of 4/30, more than 3,308,233 infections have been reported, with over 234,105 deaths.

Read More