March 29, 2016

Role of Big Data in harvesting Utility Intelligence


Technology shifts periodically occur that change the rules of the game. Machine-to-Machine (M2M) communication and Big Data analytics are two fundamental forces profoundly disrupting business models globally. Together, M2M and Big Data analytics offer fantastic opportunities: they surface behavioural patterns that were previously invisible and answer powerful, previously unanswerable questions.

The Utility sector is ripe for unlocking energy efficiencies by reducing technical and commercial losses along the complete grid value chain. It also makes it possible to understand energy consumption patterns at a level of granularity that was previously impossible.

Business Challenge-1: Neighborhood Outages


When a neighborhood experiences outages, there are multiple dimensions to the pain. If the outage hits a neighborhood with many industrial or corporate customers, the frequency and duration of outages has a direct economic impact on industrial productivity and costs. These outages are further classified into various types: blackouts, brownouts, and transient outages. If the outage hits a neighborhood with a heavy concentration of residential customers, it hurts the customer satisfaction index. Today, the time taken to respond to an outage is also long, because the latency between the outage occurring and the utility learning about it is long. So the utility really wanted to dig deep into outages, minimize the turnaround time (TAT) for each outage, and minimize outage occurrence and duration.

Business Challenge-2: Last mile energy blind spots


The last mile in the power transmission value chain is a blind spot for most utility companies. In many neighborhoods there have been instances of various kinds of tampering on the distribution side, leading to loss of revenue for the utility company. The utility company wanted to identify revenue-leakage hot spots and minimize last-mile leakages.

What kind of data is typically captured in the Utility industry?


There are typically 3 classes of Utility data captured across the power grid:

Meter data streams
- Current
- Voltage
- Power (across various phases at 15, 30, or 60 minute intervals)

Grid events data pool (status data + exception events + derived events)
- Outage events
- Voltage surge events
- Tamper events (reverse energy flow)
- Voltage sag events
- Weak emission signal events
- "Last gasp" events
- Power restore events
- Volatility events
- Low battery alarm events

Grid Master data
- Consumer data
- Smart meter location data
- Feeder station data
- Substation data
- Field force data
- Organisational hierarchy data

Powerful advanced visualisation and machine learning techniques can help surface patterns in the 3 classes of data outlined above.
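As a minimal sketch of the kind of cross-class pattern hunt this enables (with a hypothetical schema and synthetic data; all column and event names are illustrative), consider joining the meter data stream with tamper events from the grid event pool to quantify consumption drops around suspected tampering:

```python
# Join 15-minute meter reads with logged tamper events and compare
# consumption before vs. after each event: a sharp, persistent drop is a
# classic last-mile revenue-leakage signal.
import pandas as pd

reads = pd.DataFrame({
    "meter_id": ["M1"] * 8,
    "ts": pd.date_range("2016-03-01 00:00", periods=8, freq="15min"),
    "kwh": [1.2, 1.1, 1.3, 1.2, 0.2, 0.1, 0.2, 0.1],
})

events = pd.DataFrame({
    "meter_id": ["M1"],
    "event_ts": [pd.Timestamp("2016-03-01 01:00")],
    "event_type": ["tamper_reverse_flow"],
})

def consumption_drop(reads, events, window="1h"):
    """Mean kWh in the window before vs. after each tamper event."""
    rows = []
    for _, ev in events.iterrows():
        m = reads[reads["meter_id"] == ev["meter_id"]]
        w = pd.Timedelta(window)
        before = m[(m["ts"] >= ev["event_ts"] - w) & (m["ts"] < ev["event_ts"])]["kwh"].mean()
        after = m[(m["ts"] >= ev["event_ts"]) & (m["ts"] < ev["event_ts"] + w)]["kwh"].mean()
        rows.append({"meter_id": ev["meter_id"], "kwh_before": before,
                     "kwh_after": after, "drop_pct": 100 * (before - after) / before})
    return pd.DataFrame(rows)

print(consumption_drop(reads, events))  # ~87% drop right after the event
```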

March 28, 2016

How fast can a company access its big data and turn it into meaningful & actionable insights?

Something to consider when mapping out your IOT strategy and putting initiatives into place...  This info was presented to me by Tu Nguyen, a trusted advisor at Violin Memory.  It made sense so I thought I would share it.
The objective is twofold: (1) enable optimization of business operations by analyzing large amounts of data as close to real time as possible, gaining a market-based competitive advantage that no company can afford to be without in a digital age; and (2) continuously improve productivity across all business applications: each click, report, and process gets faster, and that adds up across many employees. Employees also start doing more when they see the different functions working faster, leading to better decisions. It's about latency, and about new and bigger workloads. With the freed-up resources (time or hardware), the business can take on new business or further refine its current business.
Too often, IT groups see their function as "maintaining" the medium for companies rather than "leveraging" technologies to gain competitive advantage, improve productivity, or align with business objectives. These challenges present organizations with hard choices every day. When businesses can't run reports as frequently as they would like, they must act on potentially stale information, increasing the risk of a misstep. Traditional storage and SSDs have struggled to deliver the performance required by mission-critical applications and consolidation initiatives.
Oil & gas (O&G) companies have invested heavily in SAP HANA. The objective is to optimize business operations by analyzing large amounts of data in real time, and HANA can achieve very high performance without requiring any tuning. HANA puts the entire database in memory, so it requires dedicated RAM-intensive servers. But calls for data still have to exit the database server(s) to an external device, so the constraint becomes how quickly, and at what yield (2X, 5X …10X), the backend storage (disk, or IBM/EMC SSD arrays) can return the data.
The impact of poor database performance - One of the biggest bottlenecks in a database is I/O. Why? Because disk technology hasn't really advanced in the last 10-15 years, whereas servers and networking have been getting faster and more powerful. This means that databases now spend a lot of their time waiting on I/O calls, which translates into extra CPU cycles burned and wasted application time. Yet year after year, most organizations over-provision RAM, CPU, and data center storage in the hope of drastically improving the latency of their mission-critical applications.
Likewise, solid-state drives (SSDs) deployed in storage architectures designed for sequential read and write operations struggle under the load, often delivering high latency and slow application performance, alienating customers and users, delaying business operations, and slowing time to business intelligence. Organizations try techniques like wide striping, short stroking, over-provisioning (a.k.a. risk mitigation), and caching to improve performance, but each has a business cost. When more IOPS are required, organizations must invest capital in additional storage infrastructure, which results in excess, unutilized capacity. Maintenance, tuning, and ongoing optimization then demand additional operational expenditure to sustain the added performance.
Why are disk- and SSD-based storage poor on latency? Disks are mechanical devices with moving parts, which means long seek times for random I/O workloads in particular. When a block is required, the mechanical arm has to move to where the block is stored and read it. In random I/O workloads, blocks are scattered all over the platter, so seek times grow and latency increases as more calls queue up. Over the years storage engineers have used methods such as short stroking (using only the outer sectors so the head doesn't have to move far) to achieve the lowest possible latencies, but these techniques increase data center environmental costs such as floor space, cooling, and power, making them less cost-effective per gigabyte. Many storage vendors also include caching in their SANs, which gives an application the illusion of low latency when in fact the data hasn't even hit the disk yet.
Why is latency important? One of the keys to success in any application environment is reading and writing data as fast as possible so that productivity stays high. While a data block is being read and placed in the buffer cache, the session that wants the data sits waiting for the operation to complete, consuming resources such as CPU. In other words, 80% of the time the CPU is waiting for data to process, while the data in your data centers spends 80% of its time in a wait state. In plain English, organizations are overpaying for 80% of their infrastructure.
But what is the cascading effect of poor database performance for the business?
Example 1 - The impact of a slow database is not just an IT problem but a big business problem. When a database runs slow, for example due to an inefficient storage subsystem, it creates unacceptable response times and excessive waiting for application users. These users may be other applications or end users, for example in a call centre or a sports betting engine. If they have to wait a long time for their actions to be processed, their productivity drops. Productivity is directly linked to a business making money: the more productive the business, the more revenue it makes. So if poor application performance causes lost productivity, the business ultimately makes less revenue. Let's translate this into a realistic scenario: an online sports betting company (Navajo-owned, in this example) which uses disk- or SSD-based storage for its database.
They rely on their online betting system to make money. Simply put, the more people that bet, the more money they make. As you can imagine, the business relies heavily on the database to store the events, odds, bets, accounts, etc., and on the application on top of that to process the data. Here's the flow of a bet from the punter's point of view:
1. Log in
2. Check the events & odds
3. Place the bet
Each of these three basic steps involves the application interacting with the database and, during busy periods, can be I/O intensive. If the database performs poorly because the disk-based SAN can't keep up, it takes longer for the application to log the punter in, read the events & odds, and place the bet, and the betting engine loses productivity. Customers become dissatisfied and may use a competitor's site instead, or simply place fewer bets in a given time frame. If fewer bets are placed then, you guessed it, the business makes less money. If the average customer places 10 bets in 10 minutes on a quiet day but only 5 bets in 10 minutes on a busy day, the business loses half its revenue in that window. If each bet is, say, $10, a good day yields $100 per customer per 10 minutes, but the bad day reduces that to $50. Now if 100 customers are doing the same thing, the business loses $5,000 every 10 minutes, all because of poor database performance on a slow disk-based storage subsystem.
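The arithmetic above generalizes easily. A back-of-the-envelope model, using the same numbers as the example:

```python
# Revenue per time window as a function of bet throughput.
def revenue_per_window(punters, bets_per_punter, stake):
    return punters * bets_per_punter * stake

good = revenue_per_window(punters=100, bets_per_punter=10, stake=10)  # $10,000
slow = revenue_per_window(punters=100, bets_per_punter=5, stake=10)   # $5,000

print(f"Lost per 10-minute window under I/O contention: ${good - slow:,}")  # $5,000
```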
The impact of fast database performance - So how can we fix this problem and turn the waterfall into a success story? In this case the problem is an I/O bottleneck, so that is what has to be fixed. Violin's all-flash architecture provides the sustained microsecond latency, high IOPS, and high throughput required for high database performance. Data can be read and written much more quickly, and consistently so during peak loads. This produces a new cascading effect, "the impact of fast database performance": a faster database increases application performance, which increases productivity. Increase the productivity of any business and you now have increased revenue.
So, using the same betting example once the I/O bottleneck has been removed: during busy periods the punter can now quickly log on, check the events & odds, and place the bet. Punters are back to placing their average 10 bets in 10 minutes even at peak, so the business is making its $100 per punter again, recovering the lost 50%; 100 punters now equals an extra $5,000 every 10 minutes. In fact, because I/O is now so much faster and there is no storage contention, the company can push its betting engine's productivity to 15 bets every 10 minutes per punter ;-) That's an extra $50 per 10 minutes for each punter, and across 100 punters that's another $5,000 in the same time frame.
So by using Violin's Flash Fabric Architecture (FFA) flash technology, businesses can change the waterfall of "the impact of poor database performance" into a success story and increase their revenue.
Furthermore, latency has an enormous impact on an organization's competitive edge. (1) In a world where real-time content and analytics are becoming popular, the need for lower-latency storage is growing. Why? The basic idea behind the phrase "Big Data" is that everything your clients' customers or your own customers do increasingly leaves a digital trace (data), which you and your customers can use and analyze. Big Data refers to that data being collected and to your, or your clients', ability to make use of it. This datafication increases your ability to analyze data in a way that was never possible before. The benefits of big data are very real and truly remarkable. Data has the power to change behaviors more than education or prohibition ever could. This emerging trend in Big Data will be driven by you or your clients, who stand to profit from it. And (2) for a business, as latency grows with the workload, the impact on SLAs can be dramatic, potentially losing customers or revenue.

March 27, 2016

Industrial IOT in 2016


I have to tell you I am excited about 2016! This year I have had the opportunity to work on numerous IIOT projects and have had many conversations with customers and partners to lay out the architecture that will enable new business models in the coming year. I am proud to say I have had the opportunity to help shape what is coming in 2016 in industrial IOT applications.

In fact, some of these technologies have been a direct result of the interactions we have had with customers that are trying to drive new business models within their organizations. Much like the movie "Field of Dreams", the customers have said "if you build it, they will come". Customers have told us how IIOT can help them unlock new revenue streams if we do "X", or that if we can do "X" they can finally get a better grasp on their processes, controls, and even people. Yes, the human factor…

Anyone in the industry knows IIOT enables companies to use hardware, sensors, software, machine learning algorithms, and other technologies to collect and analyze data from machines, processes, and "things" that produce large data streams. However, there have been some game-changing advancements, with even higher-speed processors and DSPs. Memory cost is at an all-time low. Broadband wireless connectivity is more abundant than ever, even in remote areas, thanks to massive build-outs by telecom partners. In addition, an abundance of new high-resolution sensors can now capture data that wasn't available before, and a deep, rich set of standards is enabling the number of monitored or managed data points to explode.

With these advances, people and the companies they work for will begin to see data at a granular level of detail never seen before. By combining live streaming data with historian data, we will be able to continuously optimize processes and enable autonomous control without second-guessing the process or the controls, vastly improving efficiencies. With that said, IIOT is expected to grow by over 6B devices over the next four years. That is roughly four million new devices, sensors, etc. per day. In just one area like utilities, there will be an explosion of over 1M new connected devices in power generation plants alone, 2M in transmission and distribution substations, and 16M in smart meters! In oil and gas there will be over 4M new connected devices, including about eight new sensors per well site and about twenty new sensors per separation and storage site.

Like I said, I am excited about 2016, but reflecting back, 2015 has been a great year. Shaping the future is exciting and fun, and I can't wait to make a positive impact on society with IIOT in 2016.


Author: Rick Harlow

March 24, 2016

Why Platform Business Models are revolutionizing Asset Business Models in the Industrial world

Context Setting
There is a disruptive trend reshaping industrial and consumer marketplaces.
Platforms are powering game changing business models where emphasis is shifting from a one time revenue generator to creating new recurring revenue streams while also adding immense value.
The change in mindset is driving the orchestration of ecosystems via plug-and-play APIs, which in many cases matters more than owning assets.
For example, the B2C marketplace is tangibly experiencing this massive upheaval, where platforms power new business models for companies like Airbnb, whose platform is disrupting traditional asset-based real estate models by letting consumers rent property without the company owning any property.
Uber's consumer platform model is likewise disrupting the traditional car-as-an-asset model, letting the company effectively run a taxi or limousine business without owning a single vehicle.
A similar movement is firmly taking root in the Industrial B2B world.  
Flutura has been participating in this important transition that is taking place.
Manufacturing, OEM, Energy, and Oil & Gas are all undergoing a complete makeover right now.
Pioneering companies like Ford, Hitachi, Schneider, ABB, PTC and yes, Flutura are seeding these Industrial platforms.
These platforms are sparking radically new business models which are filling previously unseen gaps in the marketplace. A new credo is being created where orchestrating an ecosystem platform trumps asset ownership!

This article looks at concrete, seed-stage examples of Industrial platforms impacting business models.

 

Three seed platforms powering Industrial disruptions

Let’s look at the platform movements which have begun in three Industrial organizations and which will rapidly blossom as layers of dense sensor fabric, intelligent autonomous edge, and central algorithms get added over time.

Industrial Platforms at Ford

The car industry is a classic case where the shift is evident: every car is accessible via API. Ford collects exhaustive information from the Mustang, Explorer, Fusion, etc. via the OBD-II protocol on the CAN bus. The specific state events collected include steering wheel angle, engine speed, accelerator pedal position, fuel level, latitude, longitude, transmission torque, gear lever position, and more. This rich data pool is available to partners via the OpenXC framework of APIs. For example, the Brake Distance Tracking app leverages this data pool to warn drivers when they follow other vehicles too closely.
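To give a flavor of consuming this data pool, here is a minimal sketch that parses an OpenXC-style trace (newline-delimited JSON records with "name"/"value" pairs, following the published OpenXC message format). The warning rule is our own illustration, not the actual Brake Distance Tracking logic:

```python
# Replay an OpenXC-style trace and apply a toy safety rule over the
# latest known value of each signal.
import json

trace = """\
{"name": "vehicle_speed", "value": 72.4, "timestamp": 1332432239.1}
{"name": "accelerator_pedal_position", "value": 14.0, "timestamp": 1332432239.2}
{"name": "vehicle_speed", "value": 75.1, "timestamp": 1332432240.1}
"""

latest = {}
for line in trace.splitlines():
    msg = json.loads(line)
    latest[msg["name"]] = msg["value"]
    # Illustrative rule: warn when speed is high while the driver is
    # largely off the accelerator (e.g., closing on the car ahead).
    if latest.get("vehicle_speed", 0) > 70 and latest.get("accelerator_pedal_position", 100) < 15:
        print(f"warning at t={msg['timestamp']}: high speed, low pedal input")
```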

Industrial Platforms at Bombardier

The other place where the platform revolution manifests itself is the aircraft industry. A typical engine has about 250 sensors, but some new-age aircraft engines come with 5,000 sensors emitting up to 10 GB of data per second, which means a single twin-engine aircraft on an average 12-hour flight can produce roughly 844 TB of data. That data is moved over next-generation aircraft data networks (AFDX) running at 12.5 MB/s (100 Mbit/s). Engine health monitoring apps can proactively look for engine health signals and intervene early, instead of grounding aircraft for long, expensive periods after an adverse event has occurred.
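A quick back-of-the-envelope check of those figures, using the rates quoted above:

```python
# Sanity check: 10 GB/s per engine, two engines, 12 hours of flight.
gb_per_sec_per_engine = 10
engines = 2
seconds = 12 * 3600

total_tb = gb_per_sec_per_engine * engines * seconds / 1000
print(f"~{total_tb:,.0f} TB per flight")  # ~864 TB, in line with the ~844 TB cited
```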

Industrial Platforms at GE

GE has already articulated its vision of paying per thrust instead of paying for an engine. This is a truly differentiated offering where customers are guaranteed an outcome, because industrial IOT platforms let you measure each micro event in an asset.
All three examples conclusively prove that the movement has begun and is irreversible.

How Flutura is powering Platform based Business Models

Flutura has been powering revenue-generating business models using asset-as-a-service approaches in the Hydro, Energy, and Engineering sectors with its Cerebra IOT platform, across two core industrial markets: the United States and Japan/APAC.
The Cerebra IOT platform which powers these next generation business models has three layers.
  1. Cerebra Signal IOT Data Exchange: This component of Cerebra essentially tames the torrent of sensor data by connecting to machines using data adapters to historians, SCADA and PLC systems.
  2. Cerebra Signal Studio: Cerebra Signal Studio helps engineers surface signals using a 16-step guided process, minimizing the need for expensive, hard-to-find industrial data scientists. Its unique functionality is the ability to see the most important patterns with "One click EDA" (a generic sketch of such a pass appears after this list).
  3. Cerebra Nano Apps: Cerebra Nano Apps are verticalized solvers which target an ultra-specific asset or process problem.
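Cerebra's internals are proprietary, so purely as a generic illustration, here is the kind of exploratory pass a "One click EDA" might run over a sensor frame: summary statistics plus the strongest pairwise correlations (all channel names and data are synthetic):

```python
# A generic one-click EDA pass: describe every channel, then surface the
# strongest pairwise correlations in the frame.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "shaft_rpm": rng.normal(1500, 25, 500),
    "bearing_temp_c": rng.normal(60, 3, 500),
    "vibration_mm_s": rng.normal(2.0, 0.4, 500),
})
df["bearing_temp_c"] += 0.02 * (df["shaft_rpm"] - 1500)  # induce a relationship

def one_click_eda(df, top_n=3):
    print(df.describe().T[["mean", "std", "min", "max"]])
    corr = df.corr().abs()
    # Keep the upper triangle only, so each pair appears once.
    pairs = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1)).stack()
    print("\nstrongest correlations:")
    print(pairs.sort_values(ascending=False).head(top_n))

one_click_eda(df)
```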

Closing thoughts

Platforms for the Industrial sector are no longer "nice to have" but a "must have" for an industrial company to survive in a changing world. Industrial companies without a platform roadmap risk being left behind as other Industrial companies leapfrog them.
Industrial organizations like Siemens, ABB, Schneider, Eaton, and Honeywell are increasingly focused on digital IOT platforms. Flutura has been working with several pioneering industrial companies in Tokyo and in Houston, which is being called the "Silicon Valley" of the industrial sector, giving us a ringside view of this transition.
For us as a company the future is very clear: Platforms will eat Assets. We are not in the business of building software, but of powering disruptive business models in the Industrial world. Click here to read more about Flutura

3 Reasons why Industrial companies need an Industrial IOT alternative


The last couple of months have been interesting in the Industrial world, as the pace of disruption accelerated along multiple converging dimensions: sensor fabrics, digital twins, augmented reality, drones, and 3D printing, which together promise to change forever the way business is conducted in the Industrial world. As the power shift from electro-mechanical to digital happens, there are 2 important questions running through the minds of Engineering CXOs:
  1. Digital Business Models: What new revenue streams can be powered by the digital revolution of sensors and digital twins?
  2. Foundational Industrial Platforms: How do I build a strategic, future proof platform which powers the future of my business?
GE, with its Predix platform, has done a lot of visionary work in opening the world's eyes to the possibilities at the intersection of the Industrial and Digital worlds. This has inspired many organizations to look at their own roadmaps for intersecting the digital world that is encompassing their engineering world.
Srikanth, Derick, Krish and I have been traversing the Industrial power centers of bourbon-drinking Houston, sake-drinking Tokyo, and Napa-wine-drinking NorCal, talking to industrial customers who have embarked on a journey to intercept the new industrial-digital reality.
Here are 3 insights that have surfaced.
Industrial insight-1: Vendor Neutral Process/Asset repository
The reality of the Industrial landscape is that it consists of a plethora of complicated assets, many of them non-GE, with proprietary message emission formats. For example, in the Oil & Gas world there are a variety of swivels, submersible pumps, top drives, etc. to be digitally stitched together. The mindset prevalent within Industrial companies is to create a vendor-neutral platform which can connect to a variety of assets from heterogeneous manufacturers.
Industrial insight-2: Flexibility to expand Digital Apps
Industrial companies want the ability to compose new apps and extend the platform in directions unique to their process, asset, and business model needs. For example, on the transmission and distribution side of a digitized utility grid there are capacitor banks and distribution transformers, and there are always specific apps to be built at the intersection of these data pools; a vendor-neutral platform like Flutura's is well equipped to support them.
Industrial insight-3: Edge is fragmented
GE's ability to decode edge intelligence and close the action loop on GE-owned assets is relatively good, because GE has intimate access to proprietary machine log data and digital mechanisms that sense and respond to edge conditions with low latency. The fact that the closed-loop process is baked into hardware, triggering electro-mechanical actions based on sensor data signals, makes it quite a compelling solution. But with the advent of fog computing, vendor-neutral platforms will before long mature to meet the SLAs for responding to edge events.
In the world of business processes, SAP BW had trouble expanding to non-SAP business processes because of its perceived bias towards SAP processes. Similarly, in the industrial operational world, the focus will shift to vendor-neutral platforms which are flexible and open.
The future belongs to vendor-neutral platforms like the Flutura Cerebra platform, which is liberating machines by giving them a voice.

Actions! Not just insights.

What are real world examples of IOT Big Data Use Cases impacting Biz Outcomes?
Flutura keenly reads the market signals from the Houston, Tokyo, and European markets, specifically in the energy and engineering industries we focus on.
After analyzing the data accrued over the last 24 months, we have distilled it down to six practical IOT big data use case families.
We have heard a lot of marketing hype around IOT and Big Data. At the end of the day it boils down to ONE POWERFUL UNANSWERED QUESTION:
What are the measurable outcomes impacted by Industrial IOT use cases?
IOT USE CASE FAMILY-1: NEW BUSINESS MODELS
This is the holy grail of IOT and Big Data. While most IOT big data use cases focus on cost optimization, we encountered a couple of interesting use cases which activated new revenue streams by tweaking the business model in the marketplace. For example, in the Oil and Gas industry, Flutura encountered organizations creating new revenue streams by offering real-time situational awareness of digitized wells as a paid value-added service. Flutura also helped a provider of smart city solutions monitor 11,000 government-owned buildings in real time, watching boiler, chiller, fire alarm, and other ambient data, with a per-device pricing model.
In deregulated retail energy provider (REP) markets, Flutura has encountered REPs offering new value-added services like energy audits and asset refinancing for commercial and industrial customers, powered by smart-meter-based big data products.
IOT USE CASE FAMILY-2: ENERGY OPTIMIZATION
Energy consumption is a very important lever driving profitability in an industrial setup. Surgically targeting this business outcome can really make the business case for IOT-based big data solutions, as we now have the ability to mine granular state information and correlate it to energy outcomes.
For example, we recently executed an engagement with a Houston-based fleet provider where we reduced average fuel consumption by 2%, unlocking 65 million dollars in savings per year. We did this by digging deep into signals buried in operating sensor event streams and location data, which we then correlated with fuel consumption data.
Another project, in the utility industry, involved reducing the average peak power consumed by residential customers by analyzing smart meter data across millions of households gathered at 15-minute intervals. This pattern repeats across industry contexts: a single-digit change in energy or fuel efficiency has an outsized impact on savings.
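As a minimal sketch of that smart-meter pattern (hypothetical schema, synthetic single-household data), here is how 15-minute interval reads roll up into a household's daily peak and the evening window's share of total consumption, the kind of quantities a peak-reduction program targets:

```python
# Roll 15-minute interval reads up to the daily peak per household and
# compute what share of total consumption falls in the 17:00-20:59 window.
import pandas as pd

reads = pd.DataFrame({
    "household": ["H1"] * 96,
    "ts": pd.date_range("2016-03-01", periods=96, freq="15min"),
    "kw": ([0.4] * 68) + ([2.1] * 12) + ([0.5] * 16),  # evening peak 17:00-20:00
})

reads["hour"] = reads["ts"].dt.hour
daily_peak = reads.groupby("household")["kw"].max()
evening_share = (
    reads[reads["hour"].between(17, 20)].groupby("household")["kw"].sum()
    / reads.groupby("household")["kw"].sum()
)
print(daily_peak, evening_share, sep="\n")
```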
IOT USE CASE FAMILY-3: ASSET INTEGRITY
Industrial businesses are asset intensive. For every minute a critical asset is down, millions of dollars can be lost. Reducing the downtime of such nodal assets is a use case with a clear business case. Let's take two examples we executed in this area to illustrate the theme.
We recently worked with a leading Spanish wind power generation company who gave us second-level data on the performance characteristics of 2 turbines: main shaft rpm, grid voltage, slide ring temperature, reactive power, rotor characteristics, and more. We unleashed machine learning algorithms on terabytes of time series data on a Hadoop cluster and successfully extracted signals emitted prior to a turbine breakdown. A US engineering giant gave us electrical trip unit data (48 samples/cycle, 4 cycles pre- and post-event, along with power data: voltage, current, frequency), which we analyzed to spot significant differences in trip unit calibration that compromised asset integrity. "Smelling" asset signals is again a recurring pattern encountered across industries.
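As a simplified, hypothetical sketch of the signal-extraction idea (the real engagement ran machine learning over terabytes on a Hadoop cluster; channel names, units, and thresholds here are illustrative), a baseline z-score pass can flag a slow pre-failure drift in a single turbine channel:

```python
# Smooth one turbine channel, score it against a healthy baseline window,
# and flag sustained upward drift ahead of a breakdown.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
rpm = pd.Series(rng.normal(15.0, 0.2, 600))  # main shaft rpm, 1 Hz samples
rpm.iloc[500:] += np.linspace(0, 1.5, 100)   # slow pre-failure drift

smoothed = rpm.rolling(20).mean()            # suppress sample-level noise
base_mean = smoothed[:300].mean()            # "known healthy" reference window
base_std = smoothed[:300].std()
z = (smoothed - base_mean) / base_std

alerts = z[z > 3]                            # one-sided: drift is upward
print(f"first alert at sample {alerts.index[0]}" if not alerts.empty else "no alerts")
```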
IOT USE CASE FAMILY-4: DEFECT DENSITY + SENSOR FORENSICS
In process and discrete manufacturing industries, it's very important to keep defects below a certain threshold. A new set of possibilities is enabled by the granular data collected from digital factories: the ability to dig deep into second-level sensor data to understand the specific process states which increase defect density. Let's take an example of a project we executed for a US-based electrical product manufacturer.
The product underwent a variety of operations, and at each step sensors monitored viscosity, humidity, and temperature at second-level granularity, streaming the data to SCADA-based historians. Using advanced machine learning techniques on a Spark-based architecture, we were able to spot patterns which resulted in an 8% reduction in defect density for the product.
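The same pattern hunt in miniature (production ran on a Spark-based architecture; this is a toy pandas version with invented names and synthetic data): bin a process variable and compare defect rates across bins:

```python
# Bin humidity and compare the defect rate across bins; a monotone climb
# points at a process state worth controlling.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 5000
batch = pd.DataFrame({
    "viscosity": rng.normal(50, 5, n),
    "humidity_pct": rng.uniform(30, 70, n),
})
# In this synthetic data, defects become likelier at high humidity.
batch["defect"] = rng.random(n) < (0.02 + 0.002 * (batch["humidity_pct"] - 30))

rates = batch.groupby(pd.cut(batch["humidity_pct"], bins=4))["defect"].mean()
print(rates)  # defect rate climbs with humidity band
```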
IOT USE CASE FAMILY-5: GRID SECURITY
As the grid gets increasingly intertwined with all of its assets, security becomes a very important consideration. Security forensics using granular event data can help investigators analyze the sequence patterns exhibited prior to an adverse event. These digital signatures can be codified into a knowledge bank and watched for in real time.
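As a small sketch of the "codified digital signature" idea, here is an in-order subsequence scan of a grid event stream for a known precursor sequence. The event names echo the grid events listed earlier in this post; the signature itself is hypothetical:

```python
# Scan an event stream for a codified precursor signature: the signature
# events must appear in order, possibly interleaved with other events.
signature = ["voltage_sag", "tamper_reverse_flow", "last_gasp"]

stream = ["power_restore", "voltage_sag", "tamper_reverse_flow",
          "low_battery_alarm", "voltage_sag", "tamper_reverse_flow", "last_gasp"]

def matches(stream, signature):
    """Yield stream positions where the signature completes as an
    in-order subsequence."""
    it = 0
    for pos, event in enumerate(stream):
        if event == signature[it]:
            it += 1
            if it == len(signature):
                yield pos
                it = 0  # arm for the next occurrence

print(list(matches(stream, signature)))  # -> [6]
```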
IOT USE CASE FAMILY-6: PRICING
Pricing innovation using analytics is another area where we see powerful use cases blossom. For example, in the energy value chain, trading happens on energy exchanges, generating a lot of trade and competitive pricing data which needs to be mined to understand volatility patterns and to help time the market. Flutura's energy data scientists are helping a major energy trader see those invisible pricing patterns, optimizing millions of dollars by timing the market well through analysis of past correlations. Another use case gaining momentum is dynamic asset pricing. Assets leased to customers (for example, golf carts) are equipped with sensors which record location, topple events, average cart speed, and start/stop events. By analyzing these granular usage statistics, asset leasing companies can price assets dynamically.
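As a toy illustration of usage-based dynamic pricing for leased assets (the golf cart example above), with the base rate and all weights invented purely for illustration:

```python
# Scale a base lease rate by utilization and rough-handling signals
# derived from the cart's sensor feed. Weights are illustrative only.
def dynamic_lease_price(base_rate, hours_used, avg_speed_kmh, topple_events):
    utilization_factor = 1 + 0.05 * max(hours_used - 20, 0) / 10
    risk_factor = 1 + 0.10 * topple_events + 0.01 * max(avg_speed_kmh - 15, 0)
    return round(base_rate * utilization_factor * risk_factor, 2)

# A heavily used, roughly handled cart prices above the $100 base rate.
print(dynamic_lease_price(base_rate=100, hours_used=35, avg_speed_kmh=18, topple_events=2))
```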
CONCLUDING THOUGHTS
We do agree that the marketing machines have been on steroids, advocating IOT and Big Data as the panacea for everything. Industrial mindsets focus on tangible outcomes, and some of the frequent questions we hear are: Are these just fads, or is it real? Where is the business case for IOT big data in engineering industries? Which use cases are real for the Oil and Gas industry? How can retail energy providers and energy trading firms benefit from IOT and big data? While a lot has been done on buzzword introduction, we felt the best way to make the case was to highlight tangible business outcomes and "show them the money". We hope you found the use case taxonomy useful and would love to hear your experiences from the trenches. As we say in Flutura: may the big data renaissance awaken every industrial organization.

ABOUT US

Flutura is a niche Big Data analytics solutions company based out of Palo Alto (with a development centre in Bangalore) with a vision to transform operational outcomes by monetizing machine data. It does so by triangulating economically impactful signals from fragmented data pools. The name Flutura means butterfly, inspired by nature's greatest transformation, from caterpillar to butterfly. We are obsessed with Trust and Transformation and align our daily lives to these core principles. Flutura has been identified as one of the top 20 most promising Big Data companies globally by the leading analyst magazine CIO Review, has been featured in Gigaom reports on Big Data and M2M in the energy sector, and was the winner at Tech Sparks, where 800 innovative startups were evaluated.
Flutura is funded by Silicon Valley's leading VC fund The Hive (based in Palo Alto), which primarily invests in big data companies worldwide.