December 11, 2018

Consumer AI vs Industrial AI diffusion rates - 3 differences and why the party is just getting started



All of us have been steadily exposed to a media diet that advocates both the advantages and pitfalls of AI. It is the new reality. AI has been silently and steadily penetrating multiple facets of every industry, and this steady penetration is having tangible and intangible impact on outcomes, changing behavior, shifting business models and disrupting marketplaces. In this blog we want to share Flutura's practical insights on Industrial AI adoption and why it matters for the future of the economy at large, which is powered to a great extent by energy and engineering companies.


At a macro level, AI applications can broadly be divided into 2 buckets
Consumer AI applications : Cross-sell recommenders, sentiment analyzers, market mix modelers, diabetic retinopathy predictors etc.
Industrial AI applications : Process chemical yield predictors, down-hole drilling inefficiency predictors, motor downtime and failure predictors while fracking etc.
We at Flutura have been blessed with a ringside view of many practical Industrial AI applications over the last 5 years, applications which have scaled well beyond "innovation POCs" in upstream drilling, process chemical manufacturing and heavy industrial equipment manufacturing across Houston, Tokyo, Dusseldorf and a host of other industrial hubs. We noticed a difference in rhythm across the consumer and industrial sectors and asked ourselves 2 simple questions:

Question-1 : What discriminates Industrial AI adoption rates from Consumer AI adoption rates, and why do those differences matter ?
Question-2 : What can be done about it ?
Flutura has found, from its "from the trenches" experience, that Consumer AI has had a head start for 3 reasons:
  • Reason-1 : Difference in labelled data availability for Industrial & Consumer AI models
  • Reason-2 : Difference in perception of unlocked $ in Industrial & Consumer contexts
  • Reason-3 : Difference in mindsets between Industrial & Consumer executives

Reason-1 : Difference in labelled data available for Industrial and Consumer AI models
Let's face it ... machine learning algorithms need to "hog" a lot of labelled data before a model tunes into real-world behavior - be it modeling consumer behavior or machine behavior :)
If one takes image recognition as a problem to solve using deep neural nets, there is a ton of labelled data to learn from in public sources like MNIST.
If we take a similar problem, such as detecting product quality image anomalies in a diaper manufacturing plant or crack detection and progression on sub-sea structures, the foundational job of creating labelled data sets still needs to be initiated. If the company has a "postponement of gratification" mindset these projects take off, whereas if its executives want a "here and now pain balm" these projects get stalled.
What can be done about it ?
Industrial executives must be made aware that access to labelled data will be a source of competitive advantage.
Labelled industrial process and equipment data will be a tool for survival in a hyper-competitive marketplace where access to algorithms becomes democratized and access to labelled data becomes the "moat".
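
To make "creating labelled data sets" concrete, here is a minimal sketch of the foundational artefact this reason is about; the plant, file paths and label names are hypothetical, not from any real deployment:

```python
# Minimal sketch of a labelled data set for visual quality inspection.
# Every historical inspection image is paired with a human-verified label.
import csv

labelled_examples = [
    # (image captured on the line,        label assigned by a QA inspector)
    ("line3/2018-11-02/frame_00041.png", "ok"),
    ("line3/2018-11-02/frame_00057.png", "crease_defect"),
    ("line3/2018-11-03/frame_00102.png", "ok"),
]

with open("diaper_quality_labels.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["image_path", "label"])
    writer.writerows(labelled_examples)

# Only after thousands of such rows exist can a deep network be trained to flag
# quality anomalies, the way MNIST-style corpora enable digit recognition.
```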


Reason-2 : Difference in perception of $ value unlocked in Consumer and Industrial AI
In the consumer industry, when one executes, for example, the market mix models that change promotion $ allocation in retail, executives can perceive the $ value unlocked by measuring ROMI (Return on Marketing Investment).
Industrial mindsets, by contrast, are used to perceiving value along electro-mechanical dimensions ("I can see what horizontal drilling does, I can see how adding vibration and shock sensors reduces warranty liability"), which are more tangible than the digital dimension ("I can't see what I can't perceive"). As a result they are unable to answer the question "Show me the money" with enough confidence.
"Forget the fancy terms, show me the money" is constant feedback we heard in Houston, Tokyo and Dusseldorf.
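
For readers unfamiliar with the metric, one common formulation of ROMI can be sketched as follows; all the figures are hypothetical:

```python
# Hypothetical ROMI (Return on Marketing Investment) calculation.
incremental_revenue = 2_400_000   # extra revenue attributed to the re-allocated promotion $
gross_margin = 0.35               # contribution margin on that revenue
marketing_spend = 500_000         # promotion dollars re-allocated by the market mix model

romi = (incremental_revenue * gross_margin - marketing_spend) / marketing_spend
print(f"ROMI = {romi:.2f}x")      # ~0.68 dollars of incremental profit per promotion dollar
```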


What can be done about it ?
Flutura has found "engineering curves" to be a good tool for making value tangible to skeptical industrial mindsets.
For equipment manufacturers, P-F curves can be a tool for making buyers perceive value. For example, engine anomaly detectors can move the warning-to-failure detection window from 60 seconds to 60 minutes across thousands of commercial/naval ship engines in operational deployment.


For drilling contractors, Depth vs Time curves at each rig state can be the construct for making them perceive the value of AI in reducing dwell time at each drilling state, unlocking millions of dollars of efficiency across hundreds of rigs.
Every industry segment must have an engineering efficiency curve onto which the $ value of Industrial AI can be mapped. Find it and you have nailed it :)
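
As a back-of-envelope illustration of mapping a point on such a curve to dollars - say, the wider warning-to-failure window mentioned above - here is a sketch in which every number is invented:

```python
# Hypothetical mapping from an engineering curve to unlocked $ value:
# a wider warning-to-failure window lets more failures be handled as planned work.
rigs = 200                            # fleet size, assumed
avoidable_events_per_rig_per_year = 3
hours_saved_per_event = 6             # planned vs unplanned intervention, assumed
cost_per_nonproductive_hour = 15_000  # $ per hour of rig non-productive time, assumed

value_unlocked = (rigs * avoidable_events_per_rig_per_year
                  * hours_saved_per_event * cost_per_nonproductive_hour)
print(f"${value_unlocked:,.0f} per year")   # $54,000,000 per year in this illustration
```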


Reason-3 : Difference in Industrial & Consumer mindsets
Industrial processes are complex, with massive investments in highly reliable electro-mechanical moving parts that have been in operation for many long years. Retail, insurance and banking processes are relatively "asset light" compared to the industrial sector.
Because of this difference, RELIABILITY becomes the operative word, whether it is guaranteeing prediction rates for complex fracking equipment, top drives or downhole stick-slip failure.
Industrial executives carry a lot on their shoulders. A small mistake could, in some cases, mean life or death for the humans in close contact with those risky industrial operations. Having experienced many successes and failures, they need to be empathized with and gently guided into looking at operations through new "Industrial AI" eyes.


Why do these 3 differences matter on the S curve for measuring Industrial AI diffusion rates ?
The proverbial S curve has always been a reliable barometer for measuring disruptive technology adoption rates in marketplaces, be it cell phone adoption, bitcoin adoption or AI adoption. If we plot the S curve for Industrial AI vs Consumer AI, 2 differences in impact points will distinctly emerge.
Difference-1 : The take-off point in Consumer AI > Industrial AI
It takes longer to implement an Industrial AI application, but the potential impact of surgical Industrial AI apps can eclipse the collective impact of many Consumer AI applications combined.
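
To make the two curves concrete, here is a minimal sketch of the standard logistic S-curve with purely illustrative parameters (nothing here is fitted to real adoption data); the later midpoint stands in for Industrial AI's delayed take-off:

```python
# Illustrative S-curves: Consumer AI takes off earlier (smaller midpoint t0),
# Industrial AI takes off later. All parameters are assumptions.
import math

def adoption(t, ceiling, t0, steepness):
    """Logistic adoption curve: share of the market captured at time t."""
    return ceiling / (1.0 + math.exp(-steepness * (t - t0)))

for year in range(0, 21, 5):
    consumer = adoption(year, ceiling=1.0, t0=5, steepness=0.6)     # earlier take-off
    industrial = adoption(year, ceiling=1.0, t0=12, steepness=0.6)  # later take-off
    # (a higher ceiling for Industrial AI would capture Difference-2 below)
    print(f"year {year:2d}: consumer={consumer:.2f}  industrial={industrial:.2f}")
```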



Difference-2 : The peaking point in Industrial AI > Consumer AI

What this means is that the Industrial AI adoption party, and the massive economic value it will unlock, is just getting started: a long horizontal stretch in the adoption curve followed by a massive spike in adoption.


Closing thoughts

We need to be empathetic to the needs of industrial executives as they manage the risk vs reward equation in a world which is accelerating and changing at a rate never before seen in human history. To conclude, we at Flutura leave you with this parting thought from Jay Asher (Thirteen Reasons Why):



“You can't stop the future
You can't rewind the past
The only way to learn the secret
...is to press play.” 





November 29, 2018

17 Points to remember while architecting an IIoT solution which scales



Architecting an IIoT solution which scales? Here are the key aspects and points to remember.


Technical
- SOA design - When the number of things is large, scalability becomes problematic at different levels, including data transfer and networking, data processing and management, and service provisioning
- Network considerations - It is challenging to develop networking technologies and standards that allow data gathered by a large number of devices to move efficiently within IoT networks. Managing connected things, that is, facilitating collaboration between different entities and administering device addressing, identification and optimization at the architectural and protocol levels, remains a research challenge
- Integration - of IT and OT systems
- Data maturity & data landscape - quantity, quality and granularity
- Ability to handle large volumes of data and make it reusable (see the ingestion sketch below)
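
As one concrete pattern for the data-transfer and data-volume points above (MQTT is a common choice, though the list does not prescribe any protocol), a lightweight publish/subscribe hop decouples thousands of devices from downstream processing. The broker address, topic and payload fields below are placeholders:

```python
# Minimal sketch: a device publishing telemetry over MQTT. Pub/sub keeps the
# device decoupled from downstream processing, which helps the scalability
# concerns listed above. Host, topic and field names are illustrative.
import json
import time
import paho.mqtt.client as mqtt   # pip install paho-mqtt

client = mqtt.Client(client_id="pump-station-42")  # paho-mqtt 1.x constructor; v2 also needs a CallbackAPIVersion argument
client.connect("broker.example.com", 1883)

reading = {"asset_id": "pump-42", "ts": time.time(), "vibration_rms": 0.81, "temp_c": 67.4}
client.publish("plant/line3/telemetry", json.dumps(reading), qos=1)
client.disconnect()
```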

Standardization

Various standards used in IoT (e.g., security standards, communication standards, and identification standards) might be the key enablers for the spread of IoT technologies and need to be designed to embrace emerging technologies. Specific issues in IoT standardization include interoperability issues, radio access level issues, semantic interoperability, and security and privacy issues.


Hidden technical debt

- Developing & deploying is still relatively fast and cheap compared to maintenance
- Debt exists at both the system level and the code level
- Abstraction boundaries are not strict, as behavior is data dependent
- Changing Anything Changes Everything (CACE) applies not only to input signals, but also to hyper-parameters, learning settings, sampling methods, convergence thresholds, data selection, and essentially every other possible tweak (a small mitigation sketch follows below)
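
As referenced in the last bullet, one small, platform-agnostic mitigation is to pin every tweakable setting in a single versioned config object so that "changing anything" is at least explicit and auditable. The field names below are invented for illustration:

```python
# Minimal sketch: pin hyper-parameters, thresholds and sampling choices in one
# frozen, versioned config so every change to "anything" is explicit and traceable.
from dataclasses import dataclass, asdict
import hashlib, json

@dataclass(frozen=True)
class TrainingConfig:
    learning_rate: float = 0.01
    sample_window_secs: int = 300
    convergence_threshold: float = 1e-4
    feature_set_version: str = "v12"

def config_fingerprint(cfg: TrainingConfig) -> str:
    # A stable hash ties every deployed model back to the exact settings used.
    payload = json.dumps(asdict(cfg), sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()[:12]

print(config_fingerprint(TrainingConfig()))
```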

Info security and privacy protection
- Definition of security & privacy from the social, legal and cultural viewpoints
- Trust and reputation mechanisms
- Communication security, such as end-to-end encryption (see the TLS sketch below)
- Privacy of communication and user data
- Security of services and applications
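
For the communication security point, here is a minimal sketch of securing the device-to-broker hop with TLS, reusing the earlier MQTT example (true end-to-end encryption would additionally protect payloads beyond the broker); certificate paths, host and port are placeholders:

```python
# Minimal sketch: encrypting device-to-broker traffic with TLS.
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="pump-station-42")  # paho-mqtt 1.x constructor
client.tls_set(ca_certs="/etc/iiot/ca.pem",
               certfile="/etc/iiot/device.crt",
               keyfile="/etc/iiot/device.key")
client.connect("broker.example.com", 8883)          # 8883 is the conventional MQTT-over-TLS port
client.publish("plant/line3/telemetry", '{"asset_id": "pump-42"}', qos=1)
client.disconnect()
```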


Would you like to add to the list? Comment below.



References:
1. Xu et al., "Internet of Things in Industries: A Survey"
2. C. Flügel and V. Gehrmann, "Scientific workshop 4: Intelligent objects for the Internet of Things: Internet of Things - application of sensor networks in logistics," Commun. Comput. Inf. Sci., vol. 32, pp. 16-26, 2009
4. M. Fowler, Refactoring: Improving the Design of Existing Code. Pearson Education India, 1999.

August 16, 2018

Artificial Intelligence for Oilfields and Pipelines [Digital Transformation]


Vibration analysis can help companies manage assets for measured success, plan service calls, track failure modes and increase responsiveness to faults across mining, oil and gas, petrochemical, refining and original equipment manufacturers. The magic begins, however, when you couple vibration analysis with an artificial intelligence platform that can ingest large amounts of data from systems like OSI PI, Maximo and SAP, combine it with historical machine failure data, maintenance records and technician data on who is best qualified to make the repair, and then algorithmically check the spare parts and refurbishment inventory. That is when artificial intelligence begins to transform how companies move from old, inefficient time-based maintenance to condition-based maintenance.
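
As a hedged illustration of the kind of condition signal this unlocks (a generic off-the-shelf anomaly detector on made-up vibration features, not a description of any specific production model):

```python
# Generic sketch: flag abnormal vibration behaviour from historian data.
# Feature choices, thresholds and the model are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Stand-in for features pulled from a historian such as OSI PI:
# [vibration RMS, bearing temperature] per 10-minute window.
healthy = rng.normal(loc=[0.6, 65.0], scale=[0.05, 2.0], size=(500, 2))
latest = np.array([[0.95, 78.0]])          # a suspicious recent window

detector = IsolationForest(contamination=0.01, random_state=0).fit(healthy)
if detector.predict(latest)[0] == -1:      # -1 means "anomalous"
    print("Raise a condition-based maintenance work order for this asset")
```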

Safety & risk mitigation is a big focus for industrial companies as well, especially pipeline companies whose assets span hundreds of thousands of miles throughout the U.S. and globally. Artificial intelligence can process caustic and corrosion values for the various types of liquids and materials running through the pipes, asking questions like: at what rates do corrosive materials start to break down the pipelines and affect structural integrity?

This is key because artificial intelligence can then give key stakeholders insights into where to inspect next. It can give senior management insight into where to write the next check or allocate financial resources to mitigate risk and raise shareholder value, by keeping as much material as possible pumping through the pipelines, which is how these companies make money at the end of the day.
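
A deliberately simplified version of that "where do we inspect next" prioritization might look like the sketch below; real pipeline-integrity programs use far richer models and code requirements, and every number here is invented:

```python
# Rank pipeline segments by a crude remaining-life estimate as a stand-in for
# inspection prioritization. All segment data is illustrative.
segments = [
    # (segment id, current wall thickness mm, minimum allowable mm, corrosion rate mm/yr)
    ("SEG-017", 7.9, 5.0, 0.45),
    ("SEG-102", 9.1, 5.0, 0.10),
    ("SEG-233", 6.4, 5.0, 0.30),
]

def remaining_life_years(thickness, minimum, rate):
    return (thickness - minimum) / rate if rate > 0 else float("inf")

ranked = sorted(segments, key=lambda s: remaining_life_years(s[1], s[2], s[3]))
for seg_id, t, t_min, rate in ranked:
    print(f"{seg_id}: ~{remaining_life_years(t, t_min, rate):.1f} years of margin left")
```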

In the digital oilfield there is still a big gap between having insights that lead up to failure and merely having production data. There is a huge opportunity to help companies move from time-based maintenance to condition-based maintenance. Thousands of well sites are still driven to on a daily basis by personnel like gaugers, who literally drive from well site to well site writing down analog readings of production data from the meters. Often it is only when these gaugers go out to the well site that they discover a broken sucker rod or a failed pump shaft. This old way of doing business is inefficient, costly and, at the end of the day, how the old world of oil and gas works.


So how do we understand the events leading up to the failure of a sucker rod? First, we need to understand its anatomy. A sucker rod barrel is a single-piece hollow tube with threads on both ends. The barrel's materials can be divided into two groups: base materials and the coating or surface-treatment layer. The most common base materials are steel, brass, stainless steel, nickel alloy and low-alloy steel. The abrasion and corrosion resistance of these base materials is enhanced by plating and other surface treatment processes.

The most common coatings and treatment processes are chrome plating, electroless nickel carbide composite plating, carbonitriding, carburizing and induction case hardening. Coated or plated barrels have the largest market share because barrels experience the most wear and operate in severe, abrasive and corrosive environments. The most commonly sold barrel types are stainless steel chrome plated, plain steel chrome plated, brass chrome plated, brass nickel carbide coated, and plain steel nickel carbide coated.

As you can see, with these various types of sucker rods and materials there is inherent variation in their strength and tolerances when performing in the field. With artificial intelligence, the platform can ingest these values and provide insights into failure behavior and the signals leading up to failure. This can be a valuable tool for pump operators, designers and manufacturers looking to reduce total and complete failure of well sites.
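
One hedged sketch of how those material and coating attributes can enter a failure model is shown below; the records, field names and model choice are invented for illustration only:

```python
# Illustrative sketch: use barrel material/coating as categorical features in a
# simple failure classifier. Records and field names are made up.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.DataFrame({
    "base_material": ["plain_steel", "brass", "stainless", "plain_steel", "brass", "stainless"],
    "coating":       ["chrome", "chrome", "chrome", "nickel_carbide", "nickel_carbide", "chrome"],
    "avg_strokes_per_min": [7.5, 6.0, 8.2, 7.9, 6.4, 8.0],
    "failed_within_12_months": [1, 0, 0, 1, 0, 0],
})

X = pd.get_dummies(history.drop(columns="failed_within_12_months"))
y = history["failed_within_12_months"]

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba(X)[:, 1])   # per-barrel probability of failing within 12 months
```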

For upstream operations there is still a big opportunity in the market to help companies understand the signals leading up to failures that can greatly affect their production. If something like a top drive, catwalk or casing running tool (CRT) fails, in some cases there might be a spare waiting on the sidelines, but what happens if that fails too? Being able to understand the signals leading up to failure for these components of an upstream operation is essential not only to the operators but also to the OEMs supplying these critical pieces of oilfield equipment. This is why more and more companies are turning to artificial intelligence to help them build better products, better understand the performance of their machines, gain insight into the signals leading up to failure, and enable their workforces to Take Action and Not Just Have Insights.

March 12, 2018

Why we love GE Digital- The “Industrial Michelangelo” (3 Reasons)



Last week Krish, Sri and I got a link to an article in Inc magazine which screamed "Why GE didn't make it big". We felt it was insensitive towards a player who had the guts to re-imagine the industrial future when no one else could conceive it. (Agreed, execution and economics are important too, and in the next iteration they will evolve :)
So we wanted to frame the discussion differently, more positively, and highlight what GE has done right instead of focusing on the negative (understandably so, since negative news garners more eyeballs for monetization in the publishing industry :)
So here are 3 bold things which GE Digital did right

1. GE Digital unchained the industrial world from the shackles of its electro-mechanical past

Digital twins, remote diagnostics, data-driven prognostic models detecting early warning signals (before physics-driven models could). These were game-changing ways of running operations. This catalyzed industrial mindsets to think differently and be liberated from the chains of traditional mental models which celebrated
  • Asset ownership over asset access
  • Physics over Statistics 
  • Capex over Opex
Thanks to GE, a host of other industrial companies have woken up to the digital possibilities - Hitachi with Vantara, Siemens with its MindSphere and Honeywell with Uniformance. A visionary peer taking a leap of faith always propels others into action.

2. GE Digital and the power of 1%

By framing how a 1% improvement in operational efficiency can unlock millions of dollars, they were able to find new levers for cost optimization which were inconceivable before.
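
A back-of-envelope illustration of that framing, with a purely hypothetical spend figure:

```python
# Hypothetical "power of 1%" arithmetic: small efficiency gains on a large
# operating-cost base are material.
annual_fuel_and_maintenance_spend = 500_000_000   # $, illustrative only
efficiency_gain = 0.01                            # the famous 1%
print(f"${efficiency_gain * annual_fuel_and_maintenance_spend:,.0f} unlocked per year")  # $5,000,000
```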

3. GE Digital tweaked not just AI, but next generation business models

GE did not just innovate on technology; it also innovated on the business models used to deliver its digital services - Asset-as-a-Service (in opex mode) and pay-per-outcome were truly game changers.
Closing thoughts …
Pathfinders in uncharted waters are bound to make mistakes and to discover things they never envisioned when they started the voyage. It happened to Columbus, who set out across uncharted waters to find India and discovered America in the process :). Had Edison heeded journalistic and public pressure at the 500th iteration of the light bulb, it would have hampered the invention of the electric bulb. (Incidentally, Edison said, “I didn’t fail 1,000 times. The light bulb was an invention with 1,000 steps.” :)
So to all the folks at GE Digital - go where no one has gone before and keep iterating and reinventing. As Michelangelo said,
“In every block of marble I see a statue as plain as though it stood before me, shaped and perfect. I have only to chip away the rough walls that imprison the lovely apparition to reveal it to the other eyes as mine see it.”

GE Digital is truly the Michelangelo for the Industrial World ! 

Others will see what GE Digital saw in time :)

The show must go on !!!

Small Side Note : 
Flutura also plays in the same marketplace with our platform Cerebra and has an intimate understanding of what is happening. We compete with the GE Predix platform but have also been “brothers”, as we both aspire to help industrial companies experience massively transformative outcomes using AI. The “Industrial AI pie” is so large that it is not a zero-sum game :)

March 11, 2018

Flutura's 6th Anniversary - Cerebra: AI platform for Industrial IoT


Flutura Cerebra AI for IIoT


Flutura completed 6 years!

2017 was a defining year for Flutura since we started our journey in 2012, with lots of achievements to reflect on and be proud of.
  1.  We now have one of the Best Ever Customer Bases in Industrial IoT - each of them having the potential to become a multi-million dollar account in the years to come. 
  2. Flutura continues to be seen as a Thought Leader in What We Do globally - Flutura shared its vision on AI in Industrial IoT at Stanford, Flutura will receive the Connected Plants Game Changer Award next week, Investments by Vertex Ventures, Lumis Partners, Hive, and Hitachi who believe in us. 
  3. Cerebra Powering Transformative Outcomes - on its journey to become the Most Reliable Industrial Intelligence Platform in the market
But I think the best way to remember 2017 would be to feel privileged to be in the midst of a Potent Team Shaping the Future of Industrial IoT - Pioneering Engineering + AI. 

I am really excited and cannot wait to see the things we will Experience and Learn, and the new opportunities our customers and partners will Entrust us with, as we move forward through uncharted territories together!  
Thanks to our customers and partners who have believed in us and are shaping our journey.