October 16, 2013

Upstream Sensor Data + Big Data Analytics = Game Changer in the Oil and Gas Industry


Each blowout costs the Oil and Gas industry a billion dollars.

Can we avoid it? Can we see it coming and take action? 

We also know that each operating rig carries thousands of sensors. This sensor data can be analysed to considerably reduce HSE (Health, Safety and Environment) risk and save dollars.


To show in detail how this is done, Flutura will explore breakthroughs achieved with upstream oil and gas data pools and Big Data analytics. This would transform the way HSE (Health, Safety and Environment) risk is detected by unlocking patterns previously not seen. Let's begin by asking ourselves some key leading questions that may help us diagnose the problem better and transform the way we have been dealing with it so far:

1. How can disruptive innovations in big data processing help mitigate blowout risks in the Oil and Gas industry to save lives and secure our environment?

2. What are the key data blind spots upstream in the rig?

3. What are some of the powerful unanswered questions from MWD (Measurement While Drilling) / LWD (Logging While Drilling) data?

4. How can we triangulate across data streams to see key early warning signs upfront?

5. What if we could wear new lenses to see risk patterns previously undetected?

The Oil and Gas industry is ripe for innovative solutions, as two factors are converging to create the perfect storm.

· Disruption enabler-1: An audacious problem related to Health, Safety and Environment (HSE) issues

· Disruption enabler-2: Large headroom to juice untapped data assets in the upstream part of the Oil and Gas value chain


Flutura calls this next generation approach Oil and Gas 'Last Mile' Safety Risk Analytics 2.0. It sits at the intersection of sensor data + Big Data platform engineering + deep machine learning. This is a definite turning point from the traditional ways of assessing risk with historians and/or traditional database systems.

Most risks in the Oil and Gas industry emanate from events which happen in the last mile. If one can triangulate sensor and human-generated event streams to detect early warning signs which have not been seen before, one can act on signals early. Flutura believes the best way to mitigate risk is by reacting strongly to weak signals in the last mile, and this involves continuous triangulation of streaming MWD/LWD and SCADA data streams over a long period of time using a combination of platforms + people.
So what does the last mile in the oil industry look like? It is a sensor jungle consisting of a variety of MWD (Measurement While Drilling) and LWD (Logging While Drilling) sensors and SCADA devices. Thousands of these sensors measure everything from drill rpm, mud flow rates and CO2 gas emissions to valve positions and pump states, and in the process emit billions of events. By using innovations in Big Data like MapReduce and horizontal scaling, the Oil and Gas industry has enormous potential for unlocking previously unseen patterns buried in this data which help reduce risk in the last mile. Consider some of the powerful unanswered questions:

1. Is there a correlation between out-of-bound drill rpm rhythms from MWD sensor data and operator experience?

2. What frequent sequences of events (mud flow, drill pressure, temperature, CO2) show up as rhythm disturbances prior to a near-miss event, and could they trigger shutdown actions before an adverse event occurs?

3. Which parameters were out of preset bounds (2-sigma events) before an incident happened - pressure, temperature or rpm - so that recalibration checklists can be fine-tuned?

4. Are there keyword frequencies in maintenance inspection notes, like "vibration" or "leakage", which can be mined via text mining to give early warning signals of an impending incident?

5. Were there signals in identity management logs which provide a clue to forensically investigate safety incidents?

6. What is the NBI ("Next Best Intervention") for a drill, valve or pump based on its historical behaviour?
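Question 3 above hinges on spotting 2-sigma events. As a minimal sketch in Python (with made-up rpm readings, not real rig data), flagging out-of-bound values in a single sensor window might look like this:

```python
from statistics import mean, stdev

def two_sigma_breaches(readings):
    """Flag readings outside mu +/- 2*sigma (a 2-sigma event).

    `readings` is a list of floats from one sensor channel,
    e.g. drill rpm sampled over a window.
    """
    mu = mean(readings)
    sigma = stdev(readings)
    lower, upper = mu - 2 * sigma, mu + 2 * sigma
    return [(i, r) for i, r in enumerate(readings) if r < lower or r > upper]

# Hypothetical rpm window: mostly steady around 120, one spike
rpm_window = [118, 121, 119, 120, 122, 117, 121, 119, 180, 120]
breaches = two_sigma_breaches(rpm_window)  # the spike at index 8 is flagged
```

In production this check would run per-channel over sliding windows of the streaming MWD/LWD data, with bounds recalibrated from long-run history rather than the window itself.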

The data to answer the above questions is available in the last mile sensor data captured as illustrated below:


Throwing data-blind-spots into the spotlight

Devices like pumps, valves and sensors attached to a drill emit either change-of-value events or alarm events. These events can be stored in a central event repository instantiated on a Hadoop cluster with massively parallel processing capabilities which can scale economically. Storm can be used for event stream processing, Pig for computations in the data pipeline, and Hadoop/Hive for batch computation.
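As a minimal Python sketch of the kind of batch rollup Pig or Hive would run at scale over such a repository - the event schema and device names here are assumptions for illustration, not a real rig's:

```python
from collections import defaultdict

# Hypothetical event records as they might land in the central repository:
# (device_id, event_type, value), where event_type is "COV"
# (change of value) or "ALARM".
events = [
    ("pump-07", "COV", 41.2),
    ("pump-07", "ALARM", None),
    ("valve-03", "COV", 0.8),
    ("pump-07", "ALARM", None),
    ("valve-03", "COV", 0.9),
]

# Batch-style aggregation: alarm counts per device.
alarm_counts = defaultdict(int)
for device, etype, _ in events:
    if etype == "ALARM":
        alarm_counts[device] += 1
```

The same group-by shape expressed in HiveQL or Pig Latin is what turns billions of raw events into per-device risk summaries.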



3 key best practices to mitigate risk in Oil and Gas (HSE) using big data

So what are the key best practices which can be leveraged to mitigate HSE risks?

1. TRIANGULATE:
Look for patterns which reinforce each other across SCADA/MWD/LWD/surveillance video event streams - structured, unstructured, internal and external.

- What are the top 3 cross event stream triangulations which are important signals to watch out for?

2. REACT STRONGLY TO WEAK SIGNALS:
Consider organisational processes which can pick up the weakest signal, and amplification processes for field intervention. An alert prioritisation framework which rank-orders signals would help risk specialists respond in a surgical fashion.

- How can a weak signal be picked up and transmitted in real time to a point of action?
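A minimal sketch of what such an alert prioritisation framework could look like - the severity weights and the idea of counting corroborating streams are illustrative assumptions, not a prescribed scheme:

```python
def score(alert):
    """Rank-order alerts: severity weighted up by how many independent
    event streams (MWD / LWD / SCADA) corroborate the signal."""
    severity_weight = {"info": 1, "warning": 3, "critical": 10}
    return severity_weight[alert["severity"]] * (1 + alert["corroborating_streams"])

alerts = [
    {"id": "a1", "severity": "warning", "corroborating_streams": 0},
    {"id": "a2", "severity": "info", "corroborating_streams": 3},
    {"id": "a3", "severity": "critical", "corroborating_streams": 1},
]
ranked = sorted(alerts, key=score, reverse=True)  # strongest signal first
```

The point of the design is that a weak signal corroborated by several streams can outrank a lone louder one, which is exactly the "react strongly to weak signals" posture.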

3. LAST MILE DATA BLIND SPOTS:
Be conscious of last mile data blind spots, be it contractor data, incident notes or device logs.

- How can we build an integrated 360-degree data picture of the last mile?

We at Flutura believe that answering these powerful unanswered questions and executing the 3 key best practices would dramatically heighten situational awareness and mitigate operational risks.

New machine learning and big data innovations in processing billions of events and a variety of messy data sources using MapReduce paradigms have helped many companies built on human-generated data to disrupt their markets. The question is: how can Oil and Gas companies pull the same data innovation levers on machine data to disrupt the way last mile risks are managed?

As Marcel Proust said, "The real voyage of discovery consists not in seeking new landscapes, but in having new eyes." We at Flutura likewise believe that discovering HSE (Health, Safety and Environment) risk consists not in looking for more systems but in seeing previously collected data with 'new analytical eyes'.

Srikanth Muralidhara
Co-Founder
Flutura Decision Sciences & Analytics






July 20, 2013

M2M PREDICTION ECONOMICS IN 3 INDUSTRIES




As the possibilities at the intersection of M2M + Big Data get unlocked, it is very important to examine the economics underlying the use cases.

So the core question becomes:

"What is the economic value of a correct prediction in the M2M Big Data world?"


1. How can we monetize device predictions?
2. Is the effort taken disproportionately large compared to the benefits unlocked in predicting an outcome?
3. Is the cost of instrumenting additional sensor points to collect data worth the returns?




In this blog Flutura shares 3 real-world examples of the economic power of prediction in the M2M Big Data world.

· M2M Example-1: Predicting oil blowouts in a Digital Oil Field


A month back, Flutura's Oil and Gas M2M data scientists were engaged with a leading Houston-based provider of valves for the oil industry. The problem they were solving was that of oil blowouts. A blowout is the uncontrolled release of crude oil and/or natural gas from an oil or gas well after pressure control systems have failed. While executing the engagement we understood the economic equation: a pipeline blowout costs 5 billion USD on average. Modeling blowouts can help save billions of dollars in cost.

· M2M Example-2: Predicting energy leakage in a Power Grid


Similarly, last week Flutura's Energy M2M data scientists kicked off an engagement with a utility firm. The problem they were trying to solve was the millions of dollars lost when energy leaks during transmission in a power grid because equipment is inefficient at transmitting (step-down transformers) and/or electricity pilferage happens. A business case from a utility in Canada, BC Hydro, reveals electricity theft costs them at least 850 GWh, or approximately $100 million US dollars, per year. We understood that millions of dollars can be saved by predicting energy pilferage on specific high-risk corridors.


· M2M Example-3: Predicting network attacks from "low and slow" attacks in Telecom

Flutura's Telecom M2M data scientists were engaged with a leading telecom provider who was experiencing a surge in attacks on their network infrastructure. These attacks were getting increasingly sophisticated, going beyond traditional denial-of-service attacks. These were experienced hackers lying low and "poking" vulnerable points in the network to solicit a response before launching a full-blown attack. Identifying the signatures of these low and slow attacks can save millions of dollars in downtime on the telecom infrastructure.

The above real-life end user use cases give a glimpse of the economic impact of M2M prediction models.

To conclude, while M2M and Big Data are very interesting, Flutura is sharply focused on the economic value which gets unleashed at the intersection. We shall share more of Flutura's experiences in the coming weeks, specifically on the economic value of executing M2M Big Data solutions.

July 12, 2013

8 M2M Use cases ... Analytics + Big Data + Building Management Systems


As the internet of things explodes, the building management industry is ripe for disruption from sensor data. Flutura has been obsessed with end user stories in the M2M big data space. Here are 8 specific big data use cases at the intersection of M2M Analytics + Big Data + Building Management Systems.




Triangulating false alarms
Triangulating alerts from multiple big data event streams is key in M2M situations. For example, a control room receiving a fire alarm event needs to know whether this is a smoker accidentally causing an alert, or whether multiple alarms are going off, which could mean a serious event is unfolding in the building.
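A sketch of this triangulation logic in Python - the 60-second window, the device names and the escalate/verify outcomes are all assumptions for illustration:

```python
def triangulate(alarm_events, window=60):
    """Escalate only when alarms from more than one device fire within
    `window` seconds of the latest alarm; a lone alarm gets verified first.

    `alarm_events` is a list of (timestamp_seconds, device_id) tuples.
    """
    latest = max(t for t, _ in alarm_events)
    devices_in_window = {device for t, device in alarm_events
                         if latest - t <= window}
    return "ESCALATE" if len(devices_in_window) > 1 else "VERIFY"

single = [(100, "smoke-12")]                                   # one smoker?
multiple = [(100, "smoke-12"), (130, "smoke-14"), (150, "heat-03")]  # spreading
```

Triangulating across heat, smoke and CO detectors in the same window is what separates a nuisance alert from a building-level emergency.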

Geofencing assets
In a building management situation certain assets can have constraints on their possible location. If an asset crosses certain geo coordinates, an alarm event can be triggered by examining the sensor and geospatial event streams.
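A minimal sketch of such a geofence check, assuming a simple rectangular fence (real fences could be arbitrary polygons; the coordinates are made up):

```python
# Assumed rectangular geofence for an asset, in decimal degrees.
FENCE = {"lat_min": 12.90, "lat_max": 12.98,
         "lon_min": 77.55, "lon_max": 77.65}

def breach(lat, lon, fence=FENCE):
    """Return True when a reported position falls outside the fence."""
    inside = (fence["lat_min"] <= lat <= fence["lat_max"]
              and fence["lon_min"] <= lon <= fence["lon_max"])
    return not inside

inside_event = breach(12.95, 77.60)   # asset within its fence
outside_event = breach(13.05, 77.60)  # asset has crossed the fence
```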

Signals in technicians' notes
Building management technicians create inspection reports which contain a lot of unstructured text around keywords like "leakage", "noise" and "vibration", which could serve as an early warning system for impending failures.

Predictive modelling of crucial asset failures
Chillers and boilers are 2 crucial components in a building management system. If enough historical data exists, one can build a predictive model which proactively nudges maintenance staff to respond when it sees an increased probability of breakdown.
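As a toy sketch of the idea (a real model would use many features and a proper learner), estimating breakdown probability per usage bucket from assumed historical records:

```python
from collections import defaultdict

# Hypothetical history: (runtime_hours_bucket, broke_down) per service period.
history = [
    ("0-1000", False), ("0-1000", False), ("0-1000", True),
    ("1000-2000", False), ("1000-2000", True), ("1000-2000", True),
]

totals, failures = defaultdict(int), defaultdict(int)
for bucket, broke in history:
    totals[bucket] += 1
    failures[bucket] += broke

# Empirical breakdown probability per bucket - the signal that would
# trigger a proactive nudge to maintenance staff.
p_breakdown = {b: failures[b] / totals[b] for b in totals}
```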

Real time condition monitoring
As a swarm of sensors engulfs a building, it is very important to ensure that alarm events are able to "swim" faster to the central command centre and are not impeded by the avalanche of state messages waiting to be processed there in real time.
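One simple way to let alarms "swim" past routine state messages is a priority queue at the command centre; a sketch with assumed message payloads:

```python
import heapq

# Alarms get a higher priority (lower number) than routine state messages,
# so they are processed first regardless of arrival order.
PRIORITY = {"ALARM": 0, "STATE": 1}

queue = []
for seq, (kind, payload) in enumerate([
    ("STATE", "room 4 temp 22.1C"),
    ("STATE", "boiler pressure 2.1 bar"),
    ("ALARM", "smoke detected floor 3"),
    ("STATE", "chiller rpm 1500"),
]):
    # seq keeps FIFO order among messages of equal priority
    heapq.heappush(queue, (PRIORITY[kind], seq, payload))

first = heapq.heappop(queue)[2]  # the alarm jumps the queue
```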

Causal hypothesis testing
Building management technicians and maintenance staff have a lot of experience regarding the potential causes of failure of, say, a lift. These hunches can be periodically harvested, and statistical hypothesis tests can be run to either confirm or reject an "experiential hunch".
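A sketch of testing such a hunch ("lifts serviced by vendor X fail more often") with a 2x2 chi-square test; the failure counts are made up, and the 3.84 cutoff is the standard 5% critical value for 1 degree of freedom:

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 table.

    Rows: vendor X / other vendors; columns: failed / ok.
    """
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: vendor X lifts: 30 failed, 70 ok; others: 10 failed, 90 ok
stat = chi_square_2x2(30, 70, 10, 90)
hunch_confirmed = stat > 3.84  # 1-dof critical value at 5% significance
```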

Adverse event message forensics
When an adverse event happens - say the crash of a lift or a fire gutting a floor of a building - investigative agencies may need to process the exchange of messages between the sensors and the command centre in order to understand the sequence of events leading to the failure.

Alarm hot spot analysis

A central command centre team can also do hot spot analysis of alarm frequencies to see if there are clusters of devices or geo locations which need intervention because of the abnormal number of alerts experienced. It could also trigger replacement of a device on a proactive basis.
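A minimal hot spot sketch - the "2x median" threshold is an assumed rule of thumb, and the alarm log is made up:

```python
from collections import Counter

# Hypothetical alarm log: one entry per alarm, tagged by location.
alarm_log = [
    "floor-1", "floor-2", "floor-3", "floor-3", "floor-2",
    "floor-3", "floor-3", "floor-3", "floor-1", "floor-3",
]

freq = Counter(alarm_log)
counts = sorted(freq.values())
median = counts[len(counts) // 2]

# Flag locations whose alarm frequency is abnormally high vs the fleet.
hot_spots = [loc for loc, c in freq.items() if c > 2 * median]
```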

Flutura would be posting more M2M big data use cases ... Stay tuned ...

July 2, 2013

What are the real-life end user stories that bring out the power of M2M and Big Data Analytics?


The best way to understand the disruptive potential of Big Data is through actual real-life end user stories of adoption on the ground.

Here are 3 we found interesting

End user story-1 : GE Trip Optimizer
It is a solution which leverages M2M and Big Data analytics to optimize fuel consumption in locomotive engines. Even a 1% saving in fuel translates into enormous dollar savings per year across a locomotive fleet.

End user story-2: Samsung T9000
It's a futuristic refrigerator from Samsung which connects to the internet over Wi-Fi and runs a number of apps.

End user story-3 : Trxcare
Pill dispensers which contain a sensor. Each time a medicine is consumed from the dispenser, it triggers an SMS to the Trxcare command centre through the Vodafone network.


We at Flutura will be sharing more such end user stories as the adoption of M2M goes mainstream and a sensor network engulfs the world :)

What are the 3 Cs of M2M?


July 1, 2013

Intelligent Building Big Data Use cases (M2M )


Intelligent buildings are the future. An intelligent building consists of devices like lifts, chillers, boilers, fire alarms, smoke detectors and carbon monoxide detectors, all of which emit data. There are multiple types of data:

1. State data. For example, the temperature of a room or the pressure of water in a boiler.
2. Alarm data. For example, when certain events of interest are observed, such as a fire alarm going off.

Using these 2 inputs, the context of a device in an intelligent building can be modeled.

Let's take the simple case of understanding the history of a fire alarm installed in a crucial building in the city.



This alarm has been installed for almost 3 years. The technician wants to understand the history behind the device:

1. What is the rate at which alarms are going off? (Velocity)
2. When was the last time it went off? (Recency)
3. How many times did it go off? (Frequency)


A big data analytical solution can stream events from this device and help the technician answer previously unanswerable questions regarding the health of a building or a device within it. So what other possibilities do M2M and Big Data Analytics unlock for Building Management Systems?
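The technician's three questions can be computed directly from the alarm's event timestamps; a sketch with made-up numbers (timestamps in hours since install):

```python
# Hypothetical alarm history for one fire alarm, in hours since install.
alarm_times = [100.0, 900.0, 2400.0, 2500.0, 2600.0]
now = 2700.0  # current time, hours since install

frequency = len(alarm_times)             # how many times it went off
recency = now - alarm_times[-1]          # hours since the last alarm
years_in_service = now / 24 / 365
velocity = frequency / years_in_service  # alarms per year of service
```

The same velocity/recency/frequency triple computed per device across a whole building is a compact health signature the command centre can monitor.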


When 2 worlds collide - M2M + Big Data ...


When human beings got connected, it unlocked a whole new set of possibilities, and companies like Facebook, LinkedIn etc. came up with solutions which had never existed before.

The ability of machines to interact with each other promises to unlock a whole new set of opportunities unprecedented in our history.

Flutura has been working on some game-changing M2M big data analytical solutions with multiple industry players for the last 12 months.






Some of the most interesting M2M use cases have emerged from Oil and Gas, Energy/Utility industries, intelligent building management systems, security devices and the telecom sector.

But before we get to the use cases, let's step back and walk through some of the most frequently asked questions in plain simple language.

What is M2M?
It is machines having the ability to communicate and "talk" (hmmm, machines can talk?)

What are some real life examples of M2M ?

1. Connected car (think GM's OnStar program)
2. Clinical remote monitoring
3. Security (think listening to firewalls and identity management devices)
4. Pay-as-you-drive car insurance (Insurance)
5. Smart meters (Energy)
6. Traffic management (Smart City)
7. Building automation systems

What are the forces driving M2M to the tipping point?
1. Miniaturisation of sensors
2. Plummeting cost of instrumenting an asset with sensors
3. Regulatory needs (European regulations on sensors in vehicles and energy smart meters)
4. Emergence of ecosystems
5. Innovations in bandwidth + cloud + taming big data


Who are the players and what roles do they play in the M2M ecosystem?
The 3 major players in the M2M ecosystem are:
1. Device manufacturers (Honeywell, GE, Philips etc.)
2. Network carriers (Vodafone, AT&T, Deutsche Telekom etc.)
3. Central monitoring players (Pacific Data Systems)


We trust you enjoyed the first in a series of blogs on the intersection between M2M and Big Data Analytics. Stay tuned as Flutura goes deeper and deeper into the M2M Big Data ocean :)




June 20, 2013



An infographic representing the mental model of a data scientist @ Flutura


May 28, 2013

April 6, 2013

Lean UX - A review



I review for the O'Reilly Blogger Review Program
One of the best books in a long time ... 

Here are 10 nuggets of wisdom we learnt from Jeff and Josh's book - Lean UX
Click here to see the book
1. Insane focus on business outcomes
2. 14 core principles
   a. Small, dedicated, colocated teams
   b. Progress = outcomes, not output
   c. Problem-focused teams
   d. Removing waste
   e. Small batch size
   f. Continuous discovery
   g. GOOB: "getting out of the building"
   h. Shared understanding
   i. Anti-patterns
   j. Externalising work
   k. Making over analysis
   l. Learning over growth
   m. Permission to fail
   n. Getting out of the deliverables business

3. Product predicated on 5 pillars - Assumptions + Hypotheses + Outcomes + Personas + Features

4. A 12-point business assumption worksheet which makes all the assumptions explicit

5. A 6-point user worksheet

6. Breaking down the macro product hypothesis (untestable) into micro testable hypotheses

7. Bringing the end user 'alive' - a 4-quadrant worksheet to map proto-personas (our best guess of the user)
   a. Sketch and name
   b. Behavioural demographic
   c. Pain points and needs
   d. Potential solutions

8. Feature map template
   a. We will create ___________ feature
   b. For ____________ persona
   c. To achieve ___________________ outcome

9. Low fidelity prototypes to increase interaction, collaboration and crystallisation

10. Jeff and Josh emphasize "human conversations" as the most powerful tool for eliciting latent needs

The best statement = "stakeholder conversation becomes less about what artifact is being created and more about which outcome is being achieved."

One of the most actionable books for product development. 150 pages of distilled wisdom.
Click here to see the book

March 1, 2013

33 "Gotchas" from Strata 2013 @ Santa Clara

Flutura participated in the mother of all big data conferences - Strata - right at the heart of Silicon Valley, the hub of disruption.

Business takeaways (Strata 2013)

1. All attendees had a feeling of a 'Data Renaissance' sweeping around them
2. Telefonica presented wonderful use cases of telecom log files being used to make retail decisions
3. Data brokering is a new monetizable product and revenue stream for them
4. Retailers are going to salivate at location data more and more
5. Telephone log files were used to decode where love is blooming and fading (long late night calls x times in y time frame)
6. Multi-disciplinary exploration of data would maximize insight yield
7. Map pathways to specific business outcomes from big data
8. Think 'Data Products' ... think 'minimum viable features' for data products
9. Story telling can breathe life into boring numbers
10. Data is increasingly going to be the lifeblood of an organisation
11. The process of translating marketplace intuitions into testable hypotheses is extremely key!
12. There are multiple ways of seeing ... spotting new patterns
13. Lots of cognitive biases exist while reading data
14. Awareness of these biases and blind spots in what the data is telling us is key
15. Every social event would be instrumented and recorded on social platforms, forming a rich data stream
16. Visual story telling skills are going to be in demand
17. Privacy frameworks are key
18. Privacy framework = Anonymization + Aggregation
19. Think 'data oceans' and 'data tributaries'
20. Important metrics to watch - 'time to impact' + 'return on data' (not ROI)

Technical 'gotchas'

21. Human knowledge precedes machine knowledge ... respect the human :)
22. Graphs = hot!
23. Graphs + Geospatial = hotter!
24. Graphs + Geospatial + Text = hottest!
25. WebGL / Panthera / Project Rhino / REST services / Raphael = very very cool :)
26. The folks from Google showed some amazing stuff on world shipping lanes gathered from satellite data
27. Modeling individual behavior + geospatial = hot!
28. Mutable data models are key!
29. Data Scientist is a very broad term ... there is immense variability in data science
30. A new taxonomy is needed to describe the nuances
31. Don't just interview data scientists ...
32. Ask them to read real analytical outputs - try before you buy models
33. Look for Curiosity + Passion + Communication as mandatory traits of a data scientist