June 29, 2014

IoT and Big Data - Changing the World from Reactive to Preventive

This guest post is from Francisco Maroto, the CEO and founder of OIES Consulting. He has deep knowledge of the Telecom, M2M, Internet of Things and Big Data businesses and a solid technical background in the technology sector. He writes about how IoT and Big Data can be used to improve our lives in all aspects.

The world needs IoT and Big Data to be transformed from Reactive to Preventive
We are living in a world of services driven by induced consumption. The technological advances that receive the most media attention are those aimed at consumers’ leisure. Big Data and the Internet of Things (IoT) get more attention when their value is justified on financial or marketing criteria, such as user experience or increased sales, than when they are used to predict situations that have not happened yet.
At The Connected Conference, the first international conference dedicated to the Internet of Things, Gary Atkinson, Director of Emerging Technologies at ARM, discussed how IoT can help address the big challenges humanity faces.

Although they are well known, it is worth repeating the main challenges the planet is heading towards, and that IoT and Big Data can contribute to solving:
·         Food is running out of room.
·         Water is our rarest commodity.
·         Energy needs to be cheaper to be efficient.
·         Healthcare is a growing problem entailed by an ageing population.
·         Transport: everyone will be able to afford cars, but won’t be able to afford to park or fuel them.

There are many other applications where IoT and Big Data technologies are badly needed, but where it is unfortunately harder to justify the cost of implementation. The benefits are as obvious as they are necessary, but they require the commitment and willingness of companies and governments to think more about preventing the risks of accidents and disasters than about reacting after they have, unfortunately, occurred.

After the Fukushima nuclear disaster, some hackers designed customized Geiger counters that automatically updated radioactivity levels on an online map and people could see real-time radiation levels. A similar example emerged in earthquake-prone Chile, where a student designed a seismometer to tweet its readings. It quickly amassed more than 300,000 followers, who were grateful for the early alerts.

At a recent Fujitsu event, the company explained how it is building a predictive alarm system that uses IoT and Big Data to predict offshore accidents, tsunamis and other natural disasters using marine sensors and data analytics.

Canada came up with another initiative. The Smart Oceans BC project will use sensors and data analytics to enhance environmental stewardship and public and marine safety. It will monitor vessel traffic, waves, currents and water quality in major shipping arteries, and will include a system to predict the impact of offshore earthquakes, tsunamis, storm surges and underwater landslides.

We have to transform organizations and governments from reactive to preventive. With the aid of asset intelligence and predictive maintenance analytics solutions, and a more predictive and proactive service model, organizations and governments will not only be able to reduce their service costs, but also deliver additional value to customers and citizens.

When I launched OIES Consulting with the mission “To become one of the world's leading independent consulting companies, delivering top-class M2M/Big Data services supporting innovations that improve the way people work and live”, I started my search for companies like Flutura with a vision to transform operational outcomes by monetizing machine data, mostly in Oil & Gas and Utilities companies across the world.

For instance, in industries like Oil & Gas, ensuring safe and smooth operations is a massive exercise in collaboration, teamwork, project management and oversight. Given that the work is complicated and executed in highly hazardous conditions, there is a need for better systems of risk assessment and safety management. And although comprehensive policies, procedures and regulations exist to keep things safe, the industry continues to experience fatalities, serious injuries, and other forms of damage to people, property, environment and reputation. There is significant headroom to adopt and benefit from Machine-to-Machine technology and advanced analytics of raw data from the critical systems that affect safety, rather than relying on safety statistics alone.

The benefits of implementing asset management solutions that use cutting-edge Big Data, M2M and predictive analytics technologies are enormous: they determine optimal asset availability and provide detailed predictions that avoid human and material losses.


Conclusion

Solving the big challenges that humanity faces requires IoT, Big Data and Predictive Analytics.
For the first time in our history, we will have the possibility of connecting billions of sensors placed anywhere (in the atmosphere, in the middle of the oceans, in cities, in vehicles, ...) to collect and analyze data in real time, providing information precise and accurate enough to let us predict natural disasters and save human lives.


For additional information please contact: francisco.maroto@oies.es

June 23, 2014

Flutura’s 5 Analytical Constructs for Modelling Energy Prices


As IOT data proliferates, one of the game-changing use cases it enables is dynamic pricing. As assets get instrumented, one can have usage-based pricing of assets on lease. We are already seeing disruptions in pricing models in the automotive industry, where sensor data, a proxy for driving habits, is fuelling usage-based insurance premiums.

Flutura has been working with utility companies, one of the industries in the Industrial Internet/IOT category undergoing fundamental shifts because of deregulation and the increased instrumentation of the grid. Pricing is a crucial lever in the utility industry, and there is a lot of headroom for utility companies to innovate on their pricing levers using data science. In this blog, Flutura outlines a 9-component pricing framework, out of which 5 analytical constructs can be used to dig deeper into pricing models. Digging deeper into the science of pricing using big data analytics can impact multiple utility outcomes such as profitability, customer churn (now that utilities are getting deregulated) and peak grid stress (influencing customer behaviour at peak time using pricing).

Analytical Construct-1: ENERGY PRICING RECOMMENDER

Based on the maturity of the market, the intended business outcome and the degree of deregulation in the distribution of energy, a number of pricing models can be adopted. One could adopt TOU (time-of-use) pricing, which charges a premium for energy consumed at peak time. One could adopt a UBP (usage-based pricing) model where, based on the deviance from the median usage, a customer is charged a premium for the differential energy consumed. The construct to operationalize has to be carefully chosen based on the factors outlined above. Some of the specific signals governing the choice of model are the sensitivity of the neighbourhood to price changes, the sentiment of the end consumer towards marginal price changes, the segment the customer belongs to, and a knowledge bank of what stimuli worked for that consumer segment in the past. Each of these is elaborated below.
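Before diving into each signal, here is a hedged, rule-based sketch (purely illustrative, not Flutura’s actual recommender) of how such signals might combine into a pricing-model recommendation. The function name, thresholds and segment labels are all assumptions.

```python
# Illustrative only: a naive rule-based pricing-model recommender.
# Thresholds and segment labels are assumptions, not Flutura's logic.
def recommend_pricing_model(elasticity: float, sentiment: float, segment: str) -> str:
    """Return 'TOU' (time-of-use) or 'UBP' (usage-based) pricing."""
    if abs(elasticity) > 1.0 and sentiment >= 0:
        # Price-sensitive neighbourhood that tolerates change: shift peak usage.
        return "TOU"
    if segment == "heavy_user":
        # Charge a premium on consumption beyond the median, whenever it occurs.
        return "UBP"
    return "TOU"  # default to time-of-use pricing

# Hypothetical neighbourhood: elastic demand, mildly positive sentiment
print(recommend_pricing_model(elasticity=-1.4, sentiment=0.2, segment="average"))
```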


Analytical Construct-2: PRICE ELASTICITY MODELLING

Price elasticity modelling essentially answers a simple question: is the neighbourhood responsive to price changes? Is Houston more sensitive to a $0.25 increase in unit energy pricing than Dallas? What is the % change in peak energy consumed in Palo Alto and its neighbourhood when there is a change in peak price? Do households in Austin change their energy consumption pattern in response to time-of-use (TOU) pricing, usage-based pricing or a location-based pricing framework? Depending on the observed behaviour, we can tag Austin as a price-sensitive neighbourhood using markers. Also, if two neighbourhoods have similar characteristics, one can do what-if scenario analysis to understand the potential reduction in energy consumed in a neighbourhood as a function of changed pricing.
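As a minimal illustration of the elasticity question (not Flutura’s production model), the midpoint (arc) formula below divides the % change in quantity demanded by the % change in price. All figures are hypothetical, and a real model would control for weather, season and other confounders.

```python
# Hedged sketch: arc (midpoint) price elasticity of demand for a neighbourhood.
# All numbers are hypothetical.
def price_elasticity(q_before: float, q_after: float,
                     p_before: float, p_after: float) -> float:
    """% change in quantity demanded divided by % change in price (midpoint method)."""
    pct_q = (q_after - q_before) / ((q_after + q_before) / 2)
    pct_p = (p_after - p_before) / ((p_after + p_before) / 2)
    return pct_q / pct_p

# Peak-hour kWh and $/kWh for a neighbourhood before and after a price change
e = price_elasticity(q_before=1200.0, q_after=1050.0, p_before=0.15, p_after=0.25)
print(f"elasticity = {e:.2f}")  # |e| > 1 would mark a price-sensitive neighbourhood
```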

Analytical Construct-3: PRICING MODEL A/B TESTING


Let’s assume there are two variants of an energy pricing model and the utility wants to test which of them has an impact on the intended energy consumption habits of the user. One is time-of-use pricing, where energy consumed at peak time, say 10-3, is charged at a premium; the other is usage-based pricing, where consumers who cross a certain threshold of energy are charged differentially, irrespective of the time at which they consume. The utility wants to figure out which version of the pricing model is better. In order to do that, the utility subjects both versions to experimentation simultaneously. In the end, it measures which pricing version was more successful and selects that version for real-world use. An A/B test is a perfect construct to quantify the impact pricing has on energy behaviour.
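A minimal sketch of one way to analyse such an A/B test follows, comparing the two pricing arms with Welch’s two-sample t-test. The data is synthetic and the effect sizes are assumptions, not results from any real experiment.

```python
# Hedged sketch: A/B test of two pricing variants using Welch's t-test.
# The per-household peak-demand reductions (kWh) are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
tou_arm = rng.normal(loc=2.5, scale=1.0, size=500)  # time-of-use pricing group
ubp_arm = rng.normal(loc=2.1, scale=1.0, size=500)  # usage-based pricing group

t_stat, p_value = stats.ttest_ind(tou_arm, ubp_arm, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The difference in peak-demand reduction is statistically significant.")
```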

Analytical Construct-4: UTILITY CONSUMER SEGMENTATION


Consumers can be segmented based on their behavioural profiles using clustering algorithms like K-means or nearest-neighbour methods. Depending upon the segments which emerge, we can map a pricing model to each emergent behavioural segment and measure its response to that pricing stimulus.
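A minimal sketch of this segmentation with scikit-learn’s K-means follows. The behavioural features and numbers are assumed for illustration; a real pipeline would derive features from smart-meter data.

```python
# Hedged sketch: segmenting utility consumers with K-means.
# Features per household (hypothetical): [avg daily kWh, peak-hour share, weekend share]
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

X = np.array([
    [30.1, 0.45, 0.20],
    [12.4, 0.10, 0.55],
    [28.7, 0.50, 0.18],
    [11.9, 0.12, 0.60],
    [45.3, 0.70, 0.10],
    [44.8, 0.68, 0.12],
])

X_scaled = StandardScaler().fit_transform(X)  # scale so no single feature dominates
labels = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X_scaled)
print(labels)  # one behavioural segment per household; map a pricing model to each
```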

Analytical Construct-5: SENTIMENTS TO PRICE CHANGES

It’s very important to factor in consumer sentiment as a signal into the pricing model, so that one is able to find the graceful balance between intended consumer behaviour at peak time and churn (specifically as markets get deregulated and consumers have a choice). Two key places where one can sense customers’ pulse on a pricing change are Twitter feeds, for real-time expressions, and call centre channels, where representatives can dig deeper into their responses.
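As an illustration only, the toy lexicon-based scorer below hints at how a Twitter feed could be turned into a sentiment signal. The word lists and tweets are hypothetical; a production system would use a trained sentiment model rather than keyword counts.

```python
# Illustrative only: naive lexicon-based sentiment scoring of tweets about a price change.
import re

NEGATIVE = {"ripoff", "outrage", "expensive", "unfair", "switching"}
POSITIVE = {"fair", "savings", "happy", "reasonable"}

def polarity(tweet: str) -> int:
    """Positive-word count minus negative-word count."""
    words = set(re.findall(r"[a-z']+", tweet.lower()))
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [  # hypothetical feed mentioning the utility's new peak pricing
    "new peak rates are a ripoff, seriously considering switching providers",
    "happy with the off-peak savings on my bill this month",
]
avg_polarity = sum(polarity(t) for t in tweets) / len(tweets)
print(f"average polarity = {avg_polarity:+.2f}")  # feed this as a signal into the pricing model
```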

Flutura strongly feels that as the world around us gets increasingly digitized using sensors, IOT industries such as utilities, automotive and asset engineering firms will use real-time data combined with advanced math as a strategic weapon to compete on pricing. As Warren Buffett rightly said, "Price is what you pay. Value is what you get."



Managing scale – a CIO’s dilemma

Flutura is proud to present a series of guest blog posts by Industry stalwarts. 

This first one is by Clif Triplett. Clif served as the first CIO of Baker Hughes, the 100-year-old, $20 billion revenue, global oil field services business. Clif also held several executive positions at Motorola, General Motors, and other Fortune 200 companies, and served in the US Army between 1980 and 1990. He writes about managing scale – a CIO’s dilemma.


One of the challenges most CIOs face today is the cyclic nature of business. We as IT leaders must devise ways to be nimble and respond to the financial demands and pressures of meeting the expectations of the quarter. To meet these demands we must plan and prepare. A top priority is to drive more of our cost to a variable cost structure, so that we have the levers we require to respond. Next, we need to understand just how far we may need to move our cost position to meet the financial valleys sometimes demanded by business conditions. If we want to be considered a valuable element of the business and a true business partner, we must show we can respond to these market pressures and make the sacrifices or, better yet, take the actions necessary to achieve financial targets.

What are some of the things we can do to meet this challenge? First, we need to instill in our project managers the practice of managing our projects to hit quarterly financial targets, and quickly migrate to managing to the month. Virtually all of our project and program managers manage to a budget, but perhaps not with the concept of being able to stop the plan at any month end and have delivered the associated value. Key to this is the design of the projects. This is not to say we create a process to just stop a project on demand; it means we design our projects to deliver value each quarter, so that we can stop additional expenditures at that point if required without stranding past expenditures. As IT leaders we can generally get at least a one-month outlook on whether the business will need us to pull back or can stay on plan, and we need the insight and processes to shift our expenditure rate dramatically when required.


Part of a project design must be the planning of staffing and infrastructure requirements. We need to establish contracts that allow us to take a portion of our staff and draw back or even halt expenses on very short notice, generally less than 30 days. One of the easiest areas to pull back cost is development. Projects should be designed in one-month increments of capability that would be of value to the business. If cost pressures necessitate a slowdown, we halt the next unit of deployment. Deployment is the next area where cost controls can be implemented. If cost pressures demand it, we can delay deployment and all the associated expenses, provided we designed and contracted in a manner that allows us to respond to such circumstances.

Let’s review how this might be possible. Contract the software license costs such that they are not activated until the system is moved into a production environment. Second, the infrastructure you require to run the application can be purchased in a similar manner, such that it is not paid for unless it is being used. This is possible with strategies like Platform as a Service (PaaS), Infrastructure as a Service (IaaS) or even Software as a Service (SaaS). Testing and deployment services can significantly leverage outsourced services, and can therefore also be pulled back on those portions of the system that have been placed under financial suspension.
The IT market has shifted, and it is now possible to buy most IT resources as services, in a variable cost model. We must learn how to take advantage of this and begin to migrate to these new financial service models so we can meet the financial challenges placed upon the business.

Now, the previous scenario is what most of us have faced, but another scenario faces us as well – success. What if our project is highly successful or popular, and the business wants or needs us to go faster; and what if cost and budget are not an issue? Business leadership generally does not understand the concept that money can’t fix most things. I believe it is in our best interest to be able to accept money and accelerate to meet business demand. When IT says it can’t deliver regardless of cost, that is a major missed opportunity. This scenario is similar to the challenge of being asked to scale back, but it is one of scaling up to meet demand. Once again, to meet this challenge one must prepare. We must plan for success, and for the demand for our services exceeding our current plan and available resources. How do we meet this challenge? We need to attempt to put in place contracts and processes that allow us to double our capacity in two weeks. This seems perhaps a bit odd to many IT teams, but whether you exercise this capability or not, it will have a secondary impact of improving service delivery. To double your capacity, you must have well-defined processes, process discipline, and a very engaged and open communication process with your key IT suppliers.

The challenge can be met. You can design your organization to scale back or scale up rapidly, but it does take planning. The IT services market has now put new tools at our disposal to meet these demands. If you have not begun to prepare, it is now time. Failure to be nimble can cause great harm to the reputation of the IT organization and to the level of partnership you will be able to earn and develop with the senior management of the business.

June 19, 2014

What can IOT industries learn from the Jungle? – “Smelling” signals that matter


Today’s IOT (Internet of Things) world is a complex web of sensors - connected buildings, digital oil fields, smart meters, connected cars, smart cities, and wearables like Fitbit.

This is also a fast-growing market: according to IDC, by 2020 the market for IOT will be $8.9 trillion with 212 billion connected things. This is an astronomical phenomenon taking shape, and the “sensor jungle” around us is exploding.

This explosion is also triggering an avalanche of new data streams, unprecedented in history, from micro sensors that are constantly tweeting their state (very similar to the jungle, where birds are constantly chirping :)

So, what opportunities are unfolding as this irreversible and massive digitisation, on a scale we have never seen before, plays itself out?

Flutura’s data scientists looked to nature for inspiration… The biological world is a complex web of micro interactions between the various species that maintains overall ecosystem harmony. Our data scientists framed one powerful question:

·        How can IOT industries use the principles of biomimicry (the science of mimicking nature for engineering solutions)?

The answer to this question has inspired several design principles we encoded in our Cerebra Signal Studio platform.

In the biological world of the jungle, a species is able to survive because it is able to sense and respond to a few vital signals around it. It is very important to know which “noise” patterns to ignore and which “signals” to process.

Let’s deep dive into a concrete example… In the forest, the ability to see patterns in the way birds chirp is crucial. Bird-calling patterns are a proxy signal for the presence of wild animals in the vicinity. If one is unable to detect these signals, one loses situational awareness and becomes extremely vulnerable - and in many cases just dies! The ability to detect patterns is key to survival in the biological world.

Key Learning
There are a vital few signals key to situational awareness and survival.

So what implications does this have for industries impacted by IOT – Oil & Gas, autonomous vehicles, utilities, healthcare wearables etc.? Flutura strongly feels that they increasingly need to focus their attention on making sense of the world around them. The IOT industry will soon start developing capabilities to industrialize the detection of real-time patterns in sensor data.

We feel that survival in the IOT marketplace will be predicated on capabilities which enable companies to triangulate the vital few patterns and detect them in real time, before what they signal becomes an existential threat. These could be asset behaviour patterns, operational patterns, breakdown patterns... all of which are buried deep within petabytes of sensor data.
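As a hedged sketch of what such real-time pattern detection could look like at its very simplest, the rolling z-score detector below flags readings that deviate sharply from recent history. The window size, threshold and sensor stream are all illustrative assumptions, not Cerebra’s actual logic.

```python
# Illustrative only: flag "signal" readings in a sensor stream with a rolling z-score.
from collections import deque
import math

class RollingZScoreDetector:
    """Separates 'noise' from 'signal' by comparing each reading to a recent window."""

    def __init__(self, window: int = 60, threshold: float = 3.0, warmup: int = 10):
        self.buf = deque(maxlen=window)
        self.threshold = threshold
        self.warmup = warmup

    def observe(self, value: float) -> bool:
        is_signal = False
        if len(self.buf) >= self.warmup:
            mean = sum(self.buf) / len(self.buf)
            var = sum((x - mean) ** 2 for x in self.buf) / len(self.buf)
            std = math.sqrt(var) or 1e-9  # guard against a zero-variance window
            is_signal = abs(value - mean) / std > self.threshold
        self.buf.append(value)
        return is_signal

# Hypothetical vibration stream: steady readings, then a sharp deviation
detector = RollingZScoreDetector()
for reading in [50.1, 50.3, 49.9, 50.2, 50.0] * 3 + [78.4]:
    if detector.observe(reading):
        print(f"signal detected: {reading}")
```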

For example, one of the Cerebra use cases Flutura’s data scientists have developed for the next-generation digital oil field consists of triangulating last-mile signals to proactively trigger an asset intervention, which in turn affects HSE risk outcomes.

Another Cerebra use case Flutura instantiated involved solving the problem of demand response in electric grids by “smelling” detailed bottom-up neighbourhood demand signals in granular sensor data.

We at Flutura strongly believe that, going forward, for the IOT industries data is the new oil which is going to fuel success in a hyper-competitive world. What discriminates the winners in the IOT marketplace from the rest of the crowd is the ability to “smell” signals and respond to them at scale in real time! In the coming years, as the ecosystem matures, IOT-driven industries – specifically Oil & Gas, utilities, automobiles, healthcare, smart cities etc. – will start mimicking the biological nervous system (which is the most complex pattern detection engine).

So if you are in any of the IOT-driven industries, here is a question to ask: “Which are the vital few signals we should ‘smell’ and respond to at scale in real time in order to remain competitive in the IOT world?”


As the Guns N’ Roses song goes, “Welcome to the (sensor) jungle”!!! :)

June 17, 2014

Flutura’s 22 Non-Statistical Questions for a Statistician!

Flutura has always believed that when the world of business collides with the world of math, magic unfolds. As these two worlds collide, they also present a unique challenge - bridging the semantic language gap between business and math. Modelling complex business outcomes using math requires an interdisciplinary team consisting of business folks, data folks and math folks. While doing so, business folks are often at a loss because a language chasm exists. Math folks love their “geek speak” (Tanimoto coefficient, chi square, odds ratio) and business folks are focussed on impactful outcomes (mean time between failure, next best action etc.).
Having been caught in between, we at Flutura have been obsessed with the question - how do we bridge the world of business to the world of math?


Our data scientists have come up with an ultra-specific checklist of 22 questions to lubricate the friction which exists, and would like to share it with the world. So here they come, in no particular order…

1.       Which business outcome are we attempting to model and predict?
·         For ex : Mean time between failure of asset - MTBF, Next best action- NBA
2.       What surgical actions can we drive once we are able to predict the outcome?
·         For ex : preventive replacement of asset, stock up on spares
3.       What is the impact of these actions?
·         For ex:  Reduced down time, minimized risk
4.       What is the economic impact of a correct prediction?
·          For ex : cost of reduced downtime translated into $
5.       What is the economic impact of a wrong prediction (false positives)?
·         For example : $ spent which goes into replacing a healthy asset
6.       What is the non-economic impact of a wrong prediction?
·         For ex:  Customer experience gets compromised
7.       Is not predicting a bearable business option?
·         For example : In some industries gut feel accuracy is still a workable solution ( but these are very rare )
8.       Is the business phenomenon we are trying to predict modellable using the data we have?
·         For example : Ambient temperature may not be instrumented to model downtime
9.       Is there enough breadth available in data to explain the predicted behaviour?
·         For example : Certain asset attributes like tenure may not be available
10.   Is there enough depth available in the data we are using?
·         For example : 2 years, 3 years , 4 years
11.   Are there blind spots in causal data?
·         For example – ambient context, human context, machine context
12.   Is the past representative of the future?
·         For example – A new design may invalidate historical data of an asset
13.   Are we modelling black swan events?
·         For example – Black swan events are rare, difficult-to-predict events
14.   Do rhythms and patterns exist in historical data which are correlated to the outcome?
·         For example – are there frequent sequences of events prior to an asset breaking down?
15.   Was the modelling an armchair exercise or did the modeller soak in the biz process?
·         For example – Modellers who are in intimate contact with the field can model asset behaviour better
16.   Are we focussing only on signals which reinforce our world view?
·         For example – Typically people have Cognitive bias
17.   Which real world behaviours are encoded in vectors?
·         For example - volatility, velocity, dispersions, ranks
18.   Does the statistical model articulate a range of possible business outcomes?
·         For example – best case scenario MTBF = 768 days, Worst case = 628 days
19.   Does the statistical model articulate the realistic outcome?
·         For example – realistic MTBF = 680 days
20.   Are there weak signals which, if triangulated, could become a strong signal?
·         For example – combining vibrations + experience of maintenance engineer + asset age
21.   Are we mistaking correlation for causality?
·         For example – Vibration frequency may be merely correlated, whereas the maintenance engineer’s intervention could be the actual cause
22.   Have we polled multiple models to see if two models reinforce the same outcome? (A minimal sketch follows this list.)
·         For example – does logistic regression reinforce the outcome received from a decision tree?
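The minimal sketch below (synthetic data, assumed setup) shows one way to “poll” two models as in question 22: train a logistic regression and a decision tree on the same data and measure how often their predictions agree.

```python
# Hedged sketch for question 22: do two models reinforce the same outcome?
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for, say, asset-failure data
X, y = make_classification(n_samples=500, n_features=8, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=7)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
dt = DecisionTreeClassifier(max_depth=4, random_state=7).fit(X_tr, y_tr)

agreement = (lr.predict(X_te) == dt.predict(X_te)).mean()
print(f"models agree on {agreement:.0%} of test cases")  # high agreement reinforces confidence
```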


So go ahead, ask these 22 questions of the models you have in place. We hope this minimizes the geek-vs-business chasm and fosters a shared meaning of the world! Please do share your experiences with us – there’s a beer on Flutura for the most insightful feedback :)

June 7, 2014

Is the juice worth the squeeze? Big Data ROI


In Flutura’s experience, before one starts massively squeezing big data using MapReduce and other new-age big data processing frameworks, there are 3 important questions which many organisations fail to answer.

  1.  “What are the right problems to solve using big data?”

  2.  “How much is the specific monetary outcome expected from solving the problem?”

  3.   In layman’s terms – IS THE JUICE WORTH THE SQUEEZE?


Flutura has created a unique methodology which takes this to the next level by codifying specific steps to zero in on the $-denting use cases.

Let’s deep dive into one specific real-life example where Flutura is working in the upstream segment of the Oil & Gas industry in the US. On the upstream side there are a lot of heavy engineering assets increasingly going digital as a result of sensor instrumentation. This results in a lot of bottom-up sensor data collected from the digital oil field – mud flow rates, CO emissions, vibration sensors – all of which are “seen” using historians or condition monitoring systems. So what can one do with the sensor data ocean captured over months and years from the various oil fields and rigs? What are some of the problems which can be solved using this sensor data?

-          Upstream Option-1: SOLVING THE MAINTENANCE PROBLEM - Can we predict mean time between failures for crucial upstream assets by triangulating crucial patterns in sensor and human-generated data? (A minimal sketch follows this list.)

-          Upstream Option-2: SOLVING THE HSE PROBLEM - Can we reduce the probability of a blowout by detecting new last-mile signals not seen before? (Health, Safety and Environment dimension)

-          Upstream Option-3: SOLVING THE COMMAND CENTER ENGAGEMENT PROBLEM - How do I increase the engagement of command center folks looking at the mundane alert screens which impact the last mile?
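For Upstream Option-1, here is a minimal sketch (hypothetical failure log, not Flutura’s actual method) of estimating mean time between failures by fitting a Weibull distribution to the intervals between recorded failures.

```python
# Illustrative only: estimate MTBF from a hypothetical asset failure log.
import numpy as np
from scipy import stats
from scipy.special import gamma

failure_hours = np.array([120, 410, 760, 1190, 1630, 2290, 2750])  # cumulative hours at each failure
intervals = np.diff(failure_hours)                                 # hours between failures

shape, _, scale = stats.weibull_min.fit(intervals, floc=0)  # fix location at zero
mtbf = scale * gamma(1 + 1 / shape)                         # mean of a Weibull distribution
print(f"estimated MTBF = {mtbf:.0f} hours")
```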
Depending upon the market and the maturity of the organisation, the flavour of the immediate problem to solve and its priority may change. So here they come – 5 penetrative “kick in the gut” questions for getting enough “juice from the squeeze”:
-          Kick in the gut question-1 – “Making it real”: What is our situation – do we have a problem looking for a solution, or a solution looking for a problem?

-          Kick in the gut question-2 – “Painkiller or Vitamin”: Is the problem a “painkiller” or a “vitamin”? Painkillers are problems where the intensity of pain is explicit and experientially felt, whereas vitamins are problems which are implicit and need education in acknowledging the pain.

-          Kick in the gut question-3 – “Budget line item”: Is the problem acknowledged, and are there line items in budgets allocated to solving the pain?

-          Kick in the gut question-4 – “Modellable data”: Is the problem “modellable” using the data which is available, or are there blind spots in the data which make modelling infeasible?

-          Kick in the gut question-5 – “Is the juice worth the squeeze?”: Is it REALLY worth it?

Good luck “squeezing your big data” from all of us at Flutura… and make sure you solve the right problem :)

June 1, 2014

Data Maturity: Is your Organization Data Savvy?




Organizations are getting more data driven day by day, but not all of them are at the same level of maturity. Data is like a natural resource for the organisation, and it has to be refined to derive more value out of it.