January 14, 2014

Role of Big Data in harvesting Utility Intelligence


Periodically, technology shifts occur that change the rules of the game. Machine-to-Machine (M2M) communication and Big Data analytics are two fundamental forces that are profoundly disrupting business models globally. Together, M2M and Big Data analytics offer fantastic opportunities: surfacing behavioural patterns that were previously invisible and answering powerful questions that previously went unanswered.

The Utility sector is ripe for unlocking energy efficiencies by reducing technical and commercial losses along the complete grid value chain. It also offers the opportunity to understand energy consumption patterns at a level of granularity that was previously not possible.

Business Challenge-1: Neighborhood Outages

When a neighborhood experiences outages, the pain has multiple dimensions. If the neighborhood has a large concentration of industrial or corporate customers, the frequency and duration of outages have a direct economic impact, affecting industrial productivity and costs. These outages are classified into various types: blackouts, brownouts and transient outages. If the neighborhood has a heavy concentration of residential customers, outages hurt the customer satisfaction index. On top of this, the time taken to respond to an outage today is long, because there is significant latency between an outage occurring and the utility learning about it. The utility therefore wanted to dig deep into outages, minimize the turnaround time (TAT) to restore power, and reduce both the frequency and duration of outages.
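The grid event stream already carries the raw signals needed to measure this: pairing a meter's "last gasp" event with its subsequent power-restore event yields the outage duration, which can then be bucketed by type. The sketch below illustrates that idea; the event format (dicts with meter_id, event_type and timestamp) and the 5-minute transient threshold are illustrative assumptions, not the utility's actual classification rules.

```python
from datetime import timedelta

# Illustrative threshold (assumption): outages shorter than 5 minutes
# are treated as transient/momentary interruptions.
TRANSIENT_THRESHOLD = timedelta(minutes=5)

def pair_outages(events):
    """Pair 'last gasp' events with the next 'power restore' event per meter.

    `events` is assumed to be a time-ordered list of dicts like
    {"meter_id": "M-101", "event_type": "last_gasp", "timestamp": <datetime>}.
    """
    open_outages = {}   # meter_id -> timestamp of the last gasp
    outages = []        # completed outage records
    for ev in events:
        meter, ts = ev["meter_id"], ev["timestamp"]
        if ev["event_type"] == "last_gasp":
            open_outages[meter] = ts
        elif ev["event_type"] == "power_restore" and meter in open_outages:
            start = open_outages.pop(meter)
            outages.append({"meter_id": meter, "start": start, "duration": ts - start})
    return outages

def classify(outage):
    # Brownouts (voltage reduction) would be flagged from voltage-sag events
    # instead; this only separates momentary blips from sustained blackouts.
    return "transient" if outage["duration"] < TRANSIENT_THRESHOLD else "blackout"
```

Aggregating these classified records by feeder or neighborhood gives the frequency and duration view the utility is after, and the timestamp of the first last-gasp event is itself a near real-time outage notification, shrinking the latency and the TAT.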

Business Challenge-2: Last mile energy blind spots


The last mile of the grid value chain is a blind spot for most utility companies. In many neighborhoods there have been instances of various kinds of tampering on the distribution side, leading to loss of revenue for the utility. The utility company wanted to identify revenue-leakage hot spots and minimize last-mile losses.
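One common way to localise such leakage is an energy-balance check: compare the energy a feeder (or distribution transformer) pushes out against the sum of the billed consumption of the meters beneath it. The sketch below assumes two simple inputs (energy delivered per feeder and per-meter consumption mapped to feeders, both in kWh for the same period) and a hypothetical 10% loss threshold; it is an illustration of the idea rather than a production tamper-detection algorithm.

```python
def leakage_hotspots(feeder_energy_kwh, meter_energy_kwh, meter_to_feeder,
                     loss_threshold=0.10):
    """Flag feeders whose delivered energy far exceeds metered consumption.

    feeder_energy_kwh: {feeder_id: kWh delivered over the billing period}
    meter_energy_kwh:  {meter_id: kWh billed over the same period}
    meter_to_feeder:   {meter_id: feeder_id}
    loss_threshold:    fractional loss above which a feeder is flagged
                       (10% is an arbitrary illustrative value).
    """
    billed_per_feeder = {}
    for meter_id, kwh in meter_energy_kwh.items():
        feeder_id = meter_to_feeder[meter_id]
        billed_per_feeder[feeder_id] = billed_per_feeder.get(feeder_id, 0.0) + kwh

    hotspots = []
    for feeder_id, delivered in feeder_energy_kwh.items():
        billed = billed_per_feeder.get(feeder_id, 0.0)
        loss_fraction = (delivered - billed) / delivered if delivered else 0.0
        if loss_fraction > loss_threshold:
            hotspots.append((feeder_id, round(loss_fraction, 3)))
    return sorted(hotspots, key=lambda x: -x[1])
```

Cross-referencing the flagged feeders with tamper (reverse energy flow) events from the grid event pool narrows the hot spots down to individual service points.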

What kind of data is typically captured in the Utility industry?


There are typically 3 classes of Utility data captured across the power grid:
Meter data streams
- Current
- Voltage
- Power (across various phases, at 15/30/60-minute intervals)

Grid events data pool (status data + exception events + derived events)
- Outage events
- Voltage surge events
- Tamper events (reverse energy flow)
- Voltage sag events
- Weak emission signal events
- "Last gasp" events
- Power restore events
- Volatility events
- Low battery alarm events

Grid master data
- Consumer data
- Smart meter location data
- Feeder station data
- Substation data
- Field force data
- Organisational hierarchy data

Advanced visualisation and machine learning techniques can help surface patterns across the 3 classes of data outlined above.
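As one concrete example of surfacing such patterns, 15-minute interval reads can be reshaped into daily load profiles and clustered to reveal distinct consumption behaviours. The sketch below uses scikit-learn's KMeans on a matrix of per-meter daily profiles; the shape of the input array and the choice of four clusters are illustrative assumptions, and the data here is simulated.

```python
import numpy as np
from sklearn.cluster import KMeans

# Assumed input: one row per meter-day, 96 columns of kWh readings
# (24 hours x four 15-minute intervals). Simulated here for illustration.
rng = np.random.default_rng(0)
daily_profiles = rng.random((500, 96))

# Normalise each profile by its daily total so clusters capture the
# *shape* of consumption (morning peak, evening peak, flat industrial
# load, ...) rather than its absolute magnitude.
totals = daily_profiles.sum(axis=1, keepdims=True)
shapes = daily_profiles / np.where(totals == 0, 1, totals)

# Cluster into a handful of archetypal load shapes (k=4 is arbitrary).
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(shapes)

# Each cluster centroid is an archetypal 24-hour consumption pattern that
# can be visualised or joined back to grid master data (consumer type,
# feeder, substation) for segmentation.
print(np.bincount(labels))
```

Joining the resulting segments with the grid events data pool and grid master data is what turns raw interval reads into the neighborhood-level intelligence described in the two business challenges above.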

Unlock new business models to monetize machine data


As more devices get connected to the internet, more machine data gets generated across smart meters, vehicle sensors, flow meters, boilers, chillers and so on. This provides an unparalleled opportunity to create new business models that unlock value from machine data. For example, dynamic pricing based on the usage of an asset is one monetization vehicle. One could also think of syndicating anonymised, aggregated device benchmarking data with the intent of becoming a "Nielsen for machine data", providing trustworthy machine performance characteristics across manufacturers and event streams. Flutura intends to curate new machine data products and experiment rapidly with pricing models with a view to intercepting this opportunity.

Take, for example, the Utility industry. The last 3 years have seen a paradigm shift in instrumenting more data points, resulting in a sudden data avalanche. It has been driven by 2 waves that are unleashing a lot of data for Utility companies to make sense of. In the first wave, as smart meters proliferate, utilities have to process data at 15-minute intervals, which is roughly a 3,000-fold increase in daily data processing for a utility. In the second wave, as the number of SCADA devices metering energy flow throughout the grid increases (across substations, transformers and other elements of the distribution system), there will be the next level of data explosion. This massive release of data from Utility grids has profound implications for the industry, as it opens up a huge set of possibilities to monetize grid data.
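A rough back-of-the-envelope check of the 3,000-fold figure, assuming the baseline is a single manual meter read per month (an assumption; the baseline is not stated above):

```python
reads_per_day_smart = 24 * 60 / 15   # 96 interval reads per meter per day
reads_per_day_manual = 1 / 30        # roughly one manual read per month

increase = reads_per_day_smart / reads_per_day_manual
print(increase)                      # 2880.0, i.e. roughly a 3,000x increase
```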

January 8, 2014

Flutura demystifies Machine Learning:

Machine Learning is one term that gets thrown at you every day if you are even somewhere remotely close to working with huge amounts of data, or trying to make sense of it.