Perhaps 95%, maybe even 99%, of the data streaming from machine sensors will have no actionable value. For example:

A wind-turbine operating within its constraints day after day after day...

It's the moment when gale-force winds buffet the wind-turbine, the bearings start to overheat and fire threatens, that is important.

Important to feather it or shut it down before the fire breaks out and the blades are destroyed.

Should the analytics/algorithms run at wind-turbine level or at central data centre level to analyse potential problems? As Bernard Marr's excellent article explains, it depends on the volume, the velocity and the bandwidth available.
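To make the trade-off concrete, here is a minimal sketch of that routing decision. The function name, the thresholds and the assumed round-trip latency are all hypothetical, invented purely to illustrate how volume, velocity and bandwidth might drive the edge-versus-data-centre choice; a real deployment would tune these from measured figures.

```python
# Illustrative sketch: decide where to run analytics for one sensor stream.
# All names and numbers are hypothetical, not taken from any cited source.

def choose_analytics_location(volume_mb_per_s: float,
                              bandwidth_mb_per_s: float,
                              max_latency_ms: float) -> str:
    """Route analytics to the edge when the link cannot carry the stream,
    or when a decision (e.g. feathering a turbine) must be made faster
    than a round trip to the data centre allows."""
    ROUND_TRIP_MS = 200  # assumed data-centre round-trip latency
    if volume_mb_per_s > bandwidth_mb_per_s:
        return "edge"         # link is saturated: analyse locally
    if max_latency_ms < ROUND_TRIP_MS:
        return "edge"         # decision needed before the cloud can answer
    return "data centre"      # cheap to ship: analyse centrally

print(choose_analytics_location(5.0, 1.0, 1000))   # saturated link -> edge
print(choose_analytics_location(0.1, 10.0, 50))    # tight latency -> edge
print(choose_analytics_location(0.1, 10.0, 1000))  # -> data centre
```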

A recent IDC FutureScape for IoT report states that 40% of IoT data will be stored, processed, analysed and actioned at the edge of the network, where it is created.

In other words, there will always be a continuum of analytics between the data centre and the edge of the network.

That is not all: the 90% or so of streaming data that indicates no action is needed must be separated from the 10% that is valuable, whether for emergency repair or for planned & preventative maintenance (PPM).
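A triage step like that could look something like the sketch below. The field name and temperature thresholds are invented for illustration only; the point is simply that each reading falls into one of three buckets: act now, schedule maintenance, or discard.

```python
# Minimal sketch of separating actionable readings from the routine
# majority; field names and thresholds are hypothetical.

def triage(reading: dict) -> str:
    """Classify one sensor reading.

    'emergency' -> act now (feather / shut down)
    'ppm'       -> flag for planned & preventative maintenance
    'discard'   -> the ~90% with no actionable value
    """
    temp = reading["bearing_temp_c"]
    if temp >= 120:        # hypothetical fire-risk threshold
        return "emergency"
    if temp >= 90:         # hypothetical early-warning band
        return "ppm"
    return "discard"

stream = [
    {"turbine": "T1", "bearing_temp_c": 55},
    {"turbine": "T2", "bearing_temp_c": 95},
    {"turbine": "T3", "bearing_temp_c": 130},
]
print([triage(r) for r in stream])  # ['discard', 'ppm', 'emergency']
```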

Algorithms, machine learning and AI will all play their part. Rather than just having the system feather the wind-turbine to avoid damage, it might be that planned maintenance should be brought forward. It might not be the gale that is the issue but the state of the turbine, or both.

Knowing when to alert and bring human intelligence to bear is vital. There is nothing like the human brain for knowing when an outlier (one turbine overheating) might be the new norm. Adding Location Intelligence to the analytics platform should also be considered: similar turbines, experiencing only gentle breezes at the moment, could harbour the same potential problems.
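That Location Intelligence idea can be sketched very simply: when one turbine raises an alert, flag its near neighbours for inspection even though their own readings look normal. The coordinates, the 5 km radius and the function names below are all hypothetical, chosen only to show the shape of such a check.

```python
# Hedged sketch: when one turbine overheats, use location to flag
# similar turbines nearby for inspection, even in calm conditions.
# Coordinates, radius and helper names are all hypothetical.

import math

def km_between(a, b):
    """Approximate great-circle (haversine) distance in km
    between two (lat, lon) points given in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def neighbours_of(alerting_id, turbines, radius_km=5.0):
    """Return ids of turbines within radius_km of the alerting one."""
    centre = turbines[alerting_id]
    return sorted(t for t, pos in turbines.items()
                  if t != alerting_id and km_between(centre, pos) <= radius_km)

farm = {"T1": (57.50, -3.30), "T2": (57.51, -3.31), "T3": (58.20, -4.00)}
print(neighbours_of("T1", farm))  # T2 is close by; T3 is far away
```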

Their time will come, and it is better to anticipate it now than to find out too late.

These are all aspects of the benefits that purpose-built analytics and MIS solutions offer.