March 15, 2019

CIOs as a Growth Driver

The days of CIOs focusing exclusively on keeping technology systems working and up to date are over. As technology continues to evolve at a rapid pace, the CIO role is no longer restricted to merely cutting costs and improving efficiency. Today, CIOs are also increasingly expected to innovate and contribute to a business's revenue stream.

Facing disruption on numerous fronts, companies are under mounting pressure to leverage innovative solutions that generate top-line growth. Who better to build these capabilities than CIOs, who have an array of emerging technologies at their disposal?

According to IDG’s recent State of the CIO survey, CIOs are focusing more and more on strategic business tasks, such as identifying new revenue opportunities and driving operational innovation. Nearly two-thirds (62%) of CIOs say that the creation of new revenue-generating initiatives is among their job responsibilities. To support the creation of growth-driving initiatives, CIOs are learning about customer needs, building teams focused on innovation, and creating business case scenarios with defined costs and benefits.

Welcome to the age of the CIO as a growth driver.

When It Comes to AI, Focus on Business Outcomes, Not Just the Technology
For CIOs aiming to drive growth for their companies, artificial intelligence (AI) platforms are an excellent starting point. A recent survey by Deloitte of “aggressive adopters” of cognitive and AI technologies found that 76% believe that they will “substantially transform” their companies within the next three years.

The possibilities would seem to justify the hype. AI isn’t just one technology, but a wide array of tools, including a number of different algorithmic approaches, an abundance of new data sources, and advancements in hardware. A recent McKinsey study pegs the potential economic value of AI tools at between $3.5 trillion and $5.8 trillion.

Nevertheless, there remains a large gap between aspiration and reality. Gartner estimates that 85% of big data and AI projects fail.

There are manifold reasons for these failures, but a chief culprit is CIOs taking a technology-centric approach, wasting precious time experimenting with an AI platform in the futile hope that it will solve every problem for everyone across the enterprise, instead of finding an AI platform that delivers on actual business outcomes. Rather than focusing exclusively on an AI platform’s technology, forward-thinking CIOs should strive for an AI platform that drives specific business objectives and key performance indicators (KPIs) that are valuable to their organization.

In lieu of a technology-centric approach, progressive CIOs are opting for AI platforms that take an asset-and-process-centric approach. Put simply, they are selecting an AI platform that enables their companies to achieve competitive advantage and generate new revenue streams by maximizing asset performance/reliability and process efficiency/optimization.

Reduce Time to Impact, Guarantee Reliability
In addition to ensuring their company’s AI platform drives growth by delivering on critical business outcomes, CIOs must also make certain that the AI solution reduces time to impact and guarantees reliability.

Rapid deployment of the AI platform is vital to keeping up with the accelerated needs of the modern business world and safeguarding the organization from being left behind. Moreover, the platform must possess the ability to scale quickly across the enterprise, so that its benefits can be distributed throughout the business efficiently and effectively.

Finally, the AI platform must demonstrate a proven track record of success for all of the above: imparting value by empowering companies to consistently solve problems and meet business goals, reducing time to impact by standardizing the solution and empowering self-service for users, and assuring reliability by providing rigorously tested scientific solutions and deploying at scale.

Case Study: Leading Specialty Chemicals Company
By way of example, a leading multi-national specialty chemicals company headquartered in Germany faced numerous problems: 1) their quality check analysis procedures were conducted post-production rather than pre-production, 2) they lacked real-time intervention and quality control during the manufacturing process, 3) their standard operating procedures failed to consider the dynamics of the manufacturing process, and 4) they were consistently 2-3% off-spec annually.

By leveraging the Cerebra AI platform tuned for the Industrial Internet of Things (IIoT), this company achieved 95% accuracy in quality prediction for its finished goods and reduced its time taken for Root Cause Analysis by 90%. Additionally, the company experienced a 14% drop in off-spec products and lowered its customer complaints by 37%.

Closing Thoughts
Once viewed as stewards of a cost center, CIOs are today growth drivers who will play an increasingly important role in growing the top line. The CIO is uniquely positioned to play a major part in the overall success, and profitability, of their organization.
Forward-thinking, growth-driving CIOs will continuously search for opportunities to improve their companies through the implementation of digital technology, and that search can begin by selecting an AI platform that attains desired business outcomes.

By: Greg Slater, Flutura

January 15, 2019

Here’s to 2018 – A Great Year for Flutura!

Before we get too far into 2019, we wanted to take a proud look back at the past year. Flutura reached many significant milestones and achieved numerous accomplishments over the last twelve months.

Here are some of our highlights from 2018:

  • Continued Growth: Our brand maintained its strong growth trajectory, as 2018 saw a 150% increase in the number of assets managed by Cerebra. We also added more than 120 new product features to Cerebra.
  • Prestigious Projects: We expanded our footprint across high-profile, mission-critical deployments, such as powering a Drilling Operations Command Center, predicting remaining useful life for mining operations, powering next-generation ultrasonic flow meters, and analyzing non-destructive testing for quality optimization, among many others.
  • Added Crucial Talent: Flutura continued to scale in 2018, as we expanded our team with key hires across various critical functions in engineering operations, presales, and sales, among other areas.
  • Hitachi Investment: Flutura signed a strategic partnership and investment agreement with Hitachi High-Tech Solutions, which paves the way for us to provide our market-leading IIoT solutions to Hitachi’s customers through rapid and cost-efficient implementations, primarily among heavy machinery OEMs.
  • Siemens Partnership: In September, we announced a partnership with Siemens which will enable customers to benefit from Cerebra’s pre-built industry-specific applications on MindSphere, Siemens’ cloud-based, open IoT operating system, particularly for the specialty chemicals and oil/gas industries.
  • Gartner IIoT Magic Quadrant: Flutura was one of the vendors recognized by Gartner in its inaugural edition of the Magic Quadrant for Industrial IoT Platforms.
  • Frost & Sullivan Excellence In Best Practices Award: Frost & Sullivan recognized Flutura as the 2018 North American Artificial Intelligence in Energy Entrepreneurial Company of the Year at its Excellence in Best Practices Awards Gala.
  • Recognized as Top Innovator: Flutura was acknowledged as one of the most innovative startups by Inc42, which evaluated thousands of startups and narrowed the field to its top 42, including Flutura.
  • Airbus Innovation Hub: Flutura was selected by Airbus as one of the startups for its prominent BizLab global aerospace business accelerator, where startups and Airbus intrapreneurs speed up the transformation of innovative ideas into valuable businesses.

We take a lot of pride as we reflect on 2018, and we are confident that 2019 will be another big year for Flutura.

But don’t take our word for it. Read the insights from CRN, which just named Flutura as one of the “10 Hot IoT Companies to Watch in 2019.”


January 10, 2019

Henkel Maintains Golden Batch with Leading AI Platform

Henkel Harnesses the Power of AI and Saves $300M Annually

One of the most important challenges currently faced by process chemical manufacturing companies is creating products of golden batch quality. Ideally, optimal product quality is achieved “first time right” by streamlining all processes for the best product outcomes.

Leading manufacturing plants are accomplishing this today by leveraging Cerebra – an Artificial Intelligence (AI) platform tuned for the Industrial Internet of Things (IIoT) in the specialty chemical industry.

Data ingested into Cerebra provides actionable insights, delivering visibility into all stages of the manufacturing process. Cerebra gathers data from the processes and identifies the key parameters influencing product quality — empowering manufacturing plants to consistently get product outcomes right the first time (i.e., without needing to rework the batch due to poor quality).
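To make the idea of identifying key quality parameters concrete, here is a minimal sketch of one way such a ranking could look. The parameter names, data, and simple correlation heuristic are all illustrative assumptions, not Cerebra's actual method.

```python
# Hypothetical sketch: rank process parameters by correlation with batch quality.
# Parameter names, data, and the correlation heuristic are illustrative; this is
# not Cerebra's actual algorithm.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_quality_drivers(params, quality):
    """Return parameter names sorted by |correlation| with quality, strongest first."""
    scores = {name: abs(pearson(series, quality)) for name, series in params.items()}
    return sorted(scores, key=scores.get, reverse=True)

batches = {
    "reactor_temp_C": [180, 185, 178, 190, 176, 188],
    "mix_speed_rpm":  [120, 119, 121, 120, 122, 118],
    "feed_rate_kgph": [50, 55, 48, 60, 45, 58],
}
quality = [0.91, 0.87, 0.93, 0.82, 0.95, 0.85]  # fraction of batch in-spec
print(rank_quality_drivers(batches, quality))
```

A real platform would use far richer models than a pairwise correlation, but the output, an ordered list of the parameters most tied to quality, is the kind of insight described above.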

Watch this video to learn how Cerebra enabled Henkel, the world’s leading adhesive manufacturer, to reduce its time taken for root cause analysis by 90%, achieve 95% accuracy in quality prediction, and win the 2018 Connected Plant Game Changer Award.

Since its inception, Cerebra has impacted over 15 countries — improving asset uptime and increasing operational efficiency for over 20 manufacturing plants and more than 100 process lines.

Ready to learn more? Let’s start a conversation. Send an email to, or visit our website and click on the “Request a Demo” button in the top right corner.

January 9, 2019

Industrial AI Trends and Predictions for 2019

Trend-1: Cambrian explosion of vertical AI solvers

In 2019, more AI companies are going to solve ultra-specific industry problems that are narrow and deep. For example, Flutura has created specialized drilling efficiency AI apps that reduce invisible loss time in the upstream drilling process. These are not generic AI algorithms; they are deeply specialized to solve a high-impact problem.

Why is this important? 

As deep learning algorithms become democratized, novel AI applications that solve a narrow and deep problem become more important than a horizontal AI platform which needs immense tuning for the industry context.

Trend-2: AI powered business models that create new revenue pools

In 2019, the adoption of the “as-a-service” business model will accelerate to the point where customers are not obsessed with owning an asset but instead with consuming a service. For example, Flutura is powering remote AI-powered prognostics as a service, where the end customer pays the OEM per asset, per app, per month.

Why is this important?

As AI accelerates and the cost of sensors plummets, creating new revenue pools using AI is going to be an even more attractive value proposition. This convergence of trends is going to put the “AI-as-a-Service” business model on an exponential growth curve.

Trend-3: AI fades into the background and autonomous operations rise to the foreground

In 2019, we are going to see AI fade into the background while autonomous equipment that self-learns with its cognitive abilities rises to the foreground. For example, Flutura is working with a large upstream OEM provider to make 60% of its upstream operations autonomous by 2025.

Why is this important?

This is very important because AI is not just generating insights to augment operations; it is also playing a very active role in recommending a front-line option and executing it through control systems. This means higher efficiencies, lower costs, and improved safety conditions in mission-critical onshore and offshore operations.


Visit our website to start a conversation with us.

December 11, 2018

Consumer AI vs Industrial AI Diffusion Rates - 3 Differences & Why the Party Is Just Getting Started

All of us have been steadily exposed to a media diet advocating both the advantages and pitfalls of AI. It's the new reality. AI has been silently and steadily penetrating multiple facets of every industry, and this steady penetration is having tangible and intangible impact on outcomes: changing behavior, shifting business models, and disrupting marketplaces. In this blog we share Flutura's practical insights on Industrial AI adoption and why it matters for the future of an economy powered to a great extent by energy and engineering companies.

At a macro level, AI applications can broadly be divided into 2 buckets:
  • Consumer AI applications: cross-sell recommenders, sentiment analyzers, market mix modelers, diabetic retinopathy predictors, etc.
  • Industrial AI applications: process chemical yield predictors, down-hole drilling inefficiency predictors, motor downtime failure predictors while fracking, etc.
Question-1: What discriminates Industrial AI adoption rates from Consumer AI adoption rates, and why do those differences matter?
Flutura has found from its "from the trenches" experience that Consumer AI had a head start for 3 reasons:
  • Reason-1: Difference in labelled data availability for Industrial & Consumer AI models
  • Reason-2: Difference in perception of unlocked $ in Industrial & Consumer contexts
  • Reason-3: Difference in mindsets between Industrial & Consumer executives

Reason-1: Difference in labelled data availability for Industrial & Consumer AI models
If one takes facial recognition as a problem to solve using deep-learning neural nets, there is a ton of labelled data to learn from in public sources such as MNIST and ImageNet.
If we take a comparable industrial problem, such as detecting product-quality image anomalies in a diaper manufacturing plant or crack detection and progression on subsea structures, the foundational job of creating labelled data sets has yet to be initiated. If the company has a "postponement of gratification" mindset, these projects take off; if its executives want a "here-and-now pain balm," they stall.
What can be done about it?
Industrial executives must be made aware that access to labelled data will be a source of competitive advantage.
Labelled industrial process and equipment data will be a tool for survival in a hyper-competitive marketplace where access to algorithms becomes democratized and access to labelled data becomes the "moat."
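As a sketch of what building that labelled-data moat can look like in practice, the toy example below joins timestamped sensor readings with a maintenance log of failure events to produce labelled training rows. The field names and the 24-hour labelling horizon are our own assumptions for illustration.

```python
# Illustrative sketch: create labelled training data by joining sensor readings
# with a maintenance log of failure timestamps. Field names and the 24-hour
# horizon are hypothetical.
from datetime import datetime, timedelta

def label_windows(readings, failures, horizon_hours=24):
    """Label each (timestamp, value) reading 1 if a failure follows within the horizon."""
    labelled = []
    for ts, value in readings:
        positive = any(
            timedelta(0) <= f - ts <= timedelta(hours=horizon_hours)
            for f in failures
        )
        labelled.append((ts, value, 1 if positive else 0))
    return labelled

readings = [
    (datetime(2018, 6, 1, 8), 0.42),   # 31h before failure -> label 0
    (datetime(2018, 6, 1, 20), 0.55),  # 19h before failure -> label 1
    (datetime(2018, 6, 2, 8), 0.71),   # 7h before failure  -> label 1
]
failures = [datetime(2018, 6, 2, 15)]
for row in label_windows(readings, failures):
    print(row)
```

The output of such a join is exactly the labelled process and equipment data described above: the raw material a supervised model needs before it can predict failures.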

Reason-2: Difference in perception of $ value unlocked in Consumer and Industrial AI
In the consumer industry, when one executes, for example, the market mix models that change promotion $ allocation in retail, executives can perceive the $ value unlocked by measuring ROMI (return on marketing investment).
Industrial mindsets, by contrast, are used to perceiving value along electro-mechanical dimensions ("I can see what horizontal drilling does; I can see how adding vibration and shock sensors reduces warranty liability"), which feel more tangible than the digital dimension ("I can't see what I can't perceive"). As a result, they are unable to answer the question "Show me the money" with enough confidence.

What can be done about it?
Flutura has found "engineering curves" to be a good tool for demonstrating tangible value to skeptical industrial mindsets.
For equipment manufacturers, P-F curves can make buyers perceive value. For example, engine anomaly detectors move the detection window from warning to failure from 60 seconds out to 60 minutes across thousands of consumer and naval ship engines in operational deployment.

For drilling contractors, Depth vs. Time at each rig state can be the construct that makes them perceive the value of AI in reducing the dwell time at each drilling state, unlocking millions of dollars of efficiency across hundreds of rigs.
Every industry segment has an engineering efficiency curve on which the $ value of Industrial AI can be mapped. Find it and you have nailed it :)
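As a rough illustration of the dwell-time idea, the snippet below totals hours spent in each rig state from a simple activity log, the raw input for a Depth vs. Time efficiency curve. The state names and times are invented for the example.

```python
# Hedged sketch: total dwell time per rig state from a sorted activity log of
# (state, start_hour) entries. States and times are illustrative.
def dwell_by_state(log, end_hour):
    """Sum hours spent in each state; each entry lasts until the next one starts."""
    totals = {}
    for i, (state, start) in enumerate(log):
        stop = log[i + 1][1] if i + 1 < len(log) else end_hour
        totals[state] = totals.get(state, 0.0) + (stop - start)
    return totals

log = [("drilling", 0.0), ("tripping", 6.5), ("drilling", 9.0), ("circulating", 14.0)]
print(dwell_by_state(log, end_hour=16.0))
# {'drilling': 11.5, 'tripping': 2.5, 'circulating': 2.0}
```

Once dwell times are on a per-state curve like this, any reduction an AI app achieves in a given state translates directly into hours, and dollars, saved.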

Reason-3: Difference in Industrial & Consumer mindsets
Industrial executives carry a lot on their shoulders. A small mistake could in some cases mean life or death for the humans in close contact with risky industrial operations. Because of this difference, RELIABILITY becomes the operative word, whether it is guaranteed prediction rates for complex fracking equipment, top drives, or downhole stick-slip failure. Having experienced many successes and failures, these executives need to be empathized with and gently guided into looking at operations through new "Industrial AI" eyes.

Why do these 3 differences matter on the S-curve measuring Industrial AI diffusion rates?
The proverbial S-curve has always been a reliable barometer of disruptive technology adoption in marketplaces, be it cell phone adoption, Bitcoin adoption, or AI adoption. If we plot the S-curve for Industrial AI against Consumer AI, 2 differences will distinctly emerge.

Difference-1: The take-off point in Consumer AI > Industrial AI
Let's face it: machine learning algorithms need to "hog" a lot of labelled data before a model tunes into real-world behavior, be it consumer behavior or machine behavior :) And industrial processes are complex, with massive investments in highly reliable electro-mechanical moving parts that have been in operation for many long years, whereas retail, insurance, and banking processes are relatively "asset light." "Forget the fancy terms, show me the money" is constant feedback we heard from Houston, Tokyo, and Dusseldorf. It therefore takes longer to implement an Industrial AI application, but the potential impact of surgical AI apps can eclipse the collective impact of many consumer AI applications combined.

Difference-2: The peaking point in Industrial AI > Consumer AI
The Industrial AI adoption curve shows a long horizontal stretch followed by a massive spike, which means the Industrial AI adoption party to unlock massive economic value is just getting started. We need to be empathetic to the needs of industrial executives as they manage the risk-vs-reward equation in a world accelerating and changing at a rate never before seen in human history.

Closing thoughts

We at Flutura have been blessed with a ringside view of many practical Industrial AI applications over the last 5 years, applications that have massively scaled beyond "innovation POCs" in upstream drilling, process chemical manufacturing, and industrial heavy equipment manufacturing across Houston, Tokyo, Dusseldorf, and a host of other industrial hubs. To conclude, we leave you with this parting thought from Jay Asher (Thirteen Reasons Why):

“You can't stop the future
You can't rewind the past
The only way to learn the secret is to press play.”

November 29, 2018

17 Points to remember while architecting an IIoT solution which scales

Architecting an IIoT solution which scales? Here are the key aspects and points to remember.

- SOA design - When the number of things is large, scalability becomes problematic at multiple levels, including data transfer and networking, data processing and management, and service provisioning
- Network considerations - It is challenging to develop networking technologies and standards that allow data gathered by a large number of devices to move efficiently within IoT networks. Managing connected things, facilitating collaboration between different entities, and administering device addressing, identification, and optimization at the architectural and protocol levels remain research challenges
- Integration – of IT and OT systems
- Data maturity & Data landscape – quantity, quality and granularity
- Ability to handle large volumes of data and making it reusable
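As one small illustration of the last point, handling large data volumes and making them reusable, the sketch below rolls high-frequency sensor readings into fixed time-bucket aggregates that many downstream services can share. The bucket size and record shape are assumptions for the example, not a prescribed IIoT standard.

```python
# Illustrative sketch: reduce high-frequency sensor readings into fixed
# time-bucket aggregates (min, max, mean) so downstream consumers reuse one
# compact series. Bucket size and record shape are hypothetical.
def aggregate(readings, bucket_seconds=60):
    """readings: [(epoch_seconds, value)] -> {bucket_start: (min, max, mean)}."""
    buckets = {}
    for ts, value in readings:
        key = ts - (ts % bucket_seconds)        # floor timestamp to bucket start
        buckets.setdefault(key, []).append(value)
    return {
        k: (min(v), max(v), sum(v) / len(v)) for k, v in sorted(buckets.items())
    }

readings = [(0, 1.0), (10, 3.0), (59, 2.0), (61, 10.0)]
print(aggregate(readings))
# {0: (1.0, 3.0, 2.0), 60: (10.0, 10.0, 10.0)}
```

In a real deployment this roll-up would run close to the edge, so only the compact aggregates, not every raw sample, cross the network.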


Various standards used in IoT (e.g., security standards, communication standards, and identification standards) may be the key enablers for the spread of IoT technologies and need to be designed to embrace emerging technologies. Specific issues in IoT standardization include interoperability, radio-access-level issues, semantic interoperability, and security and privacy.

Hidden technical debt

- Developing & deploying is still relatively fast and cheap compared to maintenance
- Debt at system level and code level
- Abstraction boundaries are not strict – as it is data dependent
- Changing Anything Changes Everything (CACE) applies not only to input signals, but also to hyper-parameters, learning settings, sampling methods, convergence thresholds, data selection, and essentially every other possible tweak

Info security and privacy protection
- Definition of security & privacy from social, legal, and cultural viewpoints
- Trust and reputation mechanisms
- Communication security, such as end-to-end encryption
- Privacy of communication and user data
- Security of services and applications

Would you like to add to the list? Comment below.


August 16, 2018

Artificial Intelligence for Oilfields and Pipelines [Digital Transformation]

Vibration analysis can help companies manage assets for measured success, plan service calls, track failure modes, and respond faster to faults across mining, oil and gas, petrochemical, and refinery operations, as well as for original equipment manufacturers. The magic begins when you couple vibration analysis with an artificial intelligence platform that can ingest large amounts of data from systems like OSI PI, Maximo, and SAP; combine that data with historical machine failure data, maintenance records, and technician data on who is best qualified to make the repair; and then algorithmically check the spare parts refurbishment inventory. That is how artificial intelligence begins to transform how companies move from old, inefficient time-based maintenance to condition-based maintenance.
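A minimal sketch of what a condition-based trigger can look like, assuming a simple RMS-vs-baseline rule; real platforms use far richer models, and all thresholds and readings here are illustrative:

```python
# Hedged sketch: flag a service call when vibration RMS drifts beyond a
# baseline band. Thresholds and sample data are illustrative only.
import math

def vibration_rms(samples):
    """Root-mean-square amplitude of a window of vibration samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def needs_service(samples, baseline_rms, tolerance=1.5):
    """Trigger maintenance when RMS exceeds the baseline by the tolerance factor."""
    return vibration_rms(samples) > tolerance * baseline_rms

healthy = [0.9, -1.1, 1.0, -0.8]
worn = [2.2, -2.5, 2.4, -2.1]
print(needs_service(healthy, baseline_rms=1.0))  # False
print(needs_service(worn, baseline_rms=1.0))     # True
```

The point of condition-based maintenance is exactly this inversion: the asset's own signal, not a calendar, decides when the service call happens.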

Safety and risk mitigation is a big focus for industrial companies as well, especially pipeline companies whose assets span hundreds of thousands of miles throughout the U.S. and globally. Artificial intelligence can process the caustic and corrosion values of the various liquids and materials running through the pipes, asking questions like: At what rate do corrosive materials start to break down the pipelines and affect their structural integrity?
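To make that corrosion question concrete, here is a back-of-the-envelope sketch that turns measured wall loss into a remaining-life estimate and an inspection order. All segment names, thicknesses, and corrosion rates are illustrative.

```python
# Illustrative sketch: estimate remaining life of pipe segments from wall
# thickness and corrosion rate, then rank them for inspection. All values
# are hypothetical.
def remaining_life_years(wall_mm, min_wall_mm, corrosion_mm_per_year):
    """Years until the wall corrodes down to its minimum allowable thickness."""
    return (wall_mm - min_wall_mm) / corrosion_mm_per_year

segments = {
    "seg_A": remaining_life_years(9.0, 6.0, 0.30),   # about 10 years
    "seg_B": remaining_life_years(8.0, 6.0, 0.50),   # about 4 years
    "seg_C": remaining_life_years(10.0, 6.0, 0.20),  # about 20 years
}
# Inspect the shortest-remaining-life segment first.
inspect_order = sorted(segments, key=segments.get)
print(inspect_order)  # ['seg_B', 'seg_A', 'seg_C']
```

This is the simplest possible linear model; the value of an AI platform is in estimating those corrosion rates from live sensor and inspection data rather than assuming them.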

This is key because artificial intelligence can then give key stakeholders insights into where to inspect next. It can give senior management insights into where to write the next check or allocate financial resources to mitigate risk and raise shareholder value by keeping as much material as possible pumping through the pipelines, which is how these companies make money at the end of the day.

In the digital oilfield there is still a big gap between having insights that lead up to failure and merely having production data. There is a huge opportunity to help companies move from time-based maintenance to condition-based maintenance. Thousands of well sites are still driven to daily by personnel such as gaugers, who literally drive from well site to well site writing down analog production readings from the meters. Often it is only when these gaugers arrive at a well site that they discover a broken sucker rod or a failed pump shaft. This old way of doing business is inefficient, costly, and, at the end of the day, simply how the old world of oil and gas works.

So how do we understand the events leading up to the failure of a sucker rod? First, we need to understand the anatomy of a sucker rod. A sucker rod barrel is a single-piece hollow tube with threads on both ends. The barrel's materials can be divided into two groups: base materials and the coating or surface-treatment layer. The most common base materials are steel, brass, stainless steel, nickel alloy, and low-alloy steel. These base materials' abrasion and corrosion resistance is enhanced by plating and other surface-treatment processes.

The most common coatings and treatment processes are chrome plating, electroless nickel carbide composite plating, carbonitriding, carburizing, and induction case hardening. Coated or plated barrels have the largest market share because barrels experience the most wear and operate in severe, abrasive, and corrosive environments. The most commonly sold barrel types are stainless steel (chrome plated), plain steel (chrome plated), brass (chrome plated), brass (nickel carbide coated), and plain steel (nickel carbide coated).

As you can see, across these various types of sucker rods and materials there is inherent variation in strength and tolerances when performing in the field. An artificial intelligence platform can ingest these values and provide insights into failure behavior and the signals leading up to failure. This can be a valuable tool for pump operators, designers, and manufacturers seeking to reduce the total and complete failure of a well site.

For upstream operations there is still a big market opportunity to help companies understand the signals leading up to failures that can greatly affect production. If something like a top drive, catwalk, or casing running tool (CRT) fails, in some cases there might be a spare waiting on the sideline, but what happens if that fails too? Understanding the signals leading up to failure for these components of an upstream operation is essential not only to the operators but also to the OEMs supplying these critical pieces of oilfield equipment. This is why more and more companies are turning to artificial intelligence to help them build a better product, better understand the performance of their machines, gain insights into the signals leading up to failure, and enable their workforces to Take Action and Not Just Have Insights.