Predicting the future is getting easier. While it’s still not possible to accurately forecast tomorrow’s winning lottery numbers, the ability to anticipate various types of damaging network issues — and nip them in the bud — is now available to any network manager.
Predictive analytics tools draw their power from a variety of technologies and methodologies, including big data, data mining and statistical modeling. A predictive analytics tool can be trained, for instance, to use pattern recognition — the automated recognition of patterns and regularities in data — to identify issues before they become significant problems or result in partial or total network failures.
“Relying on multiple sources of clean data, along with built-in redundancies to deliver good, accurate information, visibility in the network can prevent issues rather than simply reacting to them,” says Richard Piasentin, chief strategy officer at network performance specialist Accedian. He notes that analytics can even be integrated into closed-loop orchestration systems to provide network self-correction for many common problems. “Ultimately, predictive analytics … helps companies save on operational costs and prevents issues from going unnoticed — issues that usually culminate in complete outages,” he says.
Analyzing network behavior, infrastructure thresholds
When properly designed and deployed, predictive analytics can deliver deep insights into an array of commonplace and unique network issues, helping operators handle everything from policy setting and network control to security, says Rahim Rasool, an associate data scientist with Data Science Dojo, a data science training organization. To tackle security issues, for instance, predictive analytics can use anomaly detection algorithms to sniff out suspicious activities and identify possible data breaches. “These algorithms scan the behavior of networks working in the transfer of data and distinguish legitimate activity from others,” Rasool explains. “With predictive analytics systems, the vulnerabilities in a network can be detected before a hacker group does and, subsequently, a defense mechanism can be drawn out.”
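The anomaly-detection idea Rasool describes can be illustrated with a minimal sketch — not any vendor’s actual implementation. Here, a simple z-score detector learns a baseline from known-good traffic and flags samples that deviate sharply from it; the traffic figures and the three-sigma cutoff are invented for illustration:

```python
# Minimal z-score anomaly detector for per-minute network traffic volumes.
# Flags samples that deviate more than `z_cutoff` standard deviations
# from the mean of a baseline of "normal" traffic.
from statistics import mean, stdev

def fit_baseline(samples):
    """Learn mean and standard deviation from known-good traffic."""
    return mean(samples), stdev(samples)

def is_anomalous(value, baseline, z_cutoff=3.0):
    """Return True if value lies outside z_cutoff std devs of the baseline."""
    mu, sigma = baseline
    return abs(value - mu) > z_cutoff * sigma

# Baseline: typical outbound bytes-per-minute (illustrative numbers).
normal_traffic = [980, 1020, 1005, 995, 1010, 990, 1000, 1015]
baseline = fit_baseline(normal_traffic)

print(is_anomalous(1008, baseline))  # ordinary sample -> False
print(is_anomalous(5000, baseline))  # sudden spike worth investigating -> True
```

Production systems use far richer features and models, but the principle is the same: characterize legitimate behavior statistically, then surface whatever falls outside it.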
Another way predictive analytics can help organizations is by comparing trends to infrastructure capabilities and alert thresholds. “Almost all signals have an upper bound and a lower bound that are a result of the infrastructure’s capabilities,” says Gadi Oren, vice president of technology evangelism at LogicMonitor, which operates a cloud-based performance monitoring platform. “For example, a certain device interface can only transfer so much capacity per unit of time before it is saturated,” he says. Additionally, some signals are linked to alert thresholds. “Using the exposed trend and its variance, we can predict when a certain physical system will max its capacity or when the trend is expected to reach a threshold and cause an alert.”
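The threshold forecasting Oren describes can be sketched with basic least-squares extrapolation — a simplified stand-in for what a monitoring platform does, with illustrative utilization numbers:

```python
# Fit a linear trend to recent utilization readings and estimate how many
# future sampling periods remain before the trend crosses an alert threshold.

def linear_trend(values):
    """Least-squares slope and intercept for evenly spaced samples."""
    n = len(values)
    x_mean = (n - 1) / 2
    y_mean = sum(values) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in enumerate(values))
             / sum((x - x_mean) ** 2 for x in range(n)))
    return slope, y_mean - slope * x_mean

def periods_until(values, threshold):
    """Periods from the last sample until the trend reaches the threshold
    (None if the trend is flat or moving away from it)."""
    slope, intercept = linear_trend(values)
    if slope <= 0:
        return None
    crossing = (threshold - intercept) / slope  # x where trend hits threshold
    return max(0.0, crossing - (len(values) - 1))

# Interface utilization (%) sampled daily; 80% is the alert threshold.
utilization = [52, 54, 55, 57, 58, 60, 61, 63]
print(periods_until(utilization, 80))  # roughly 11 more days at this pace
```

A real system would also account for the variance Oren mentions — confidence bands around the trend line — rather than a single point estimate.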
Predictive analytics at work
Although just about any type of enterprise network can benefit from predictive analytics-generated insights, networks transporting crucial data in life-or-death sectors such as health care, emergency response and aviation traffic management stand to gain the most from the technology.
Power utilities are particularly concerned with ensuring network reliability, since even a minor outage can result in significant human and financial harm. “We leverage machine learning models to predict future outages that impact a customer’s network based on an incoming weather event as well as help improve data integrity by detecting and correcting errors in the customer’s network model,” says Farnaz Amin, principal digital product manager for grid analytics at GE Power.
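GE Power’s actual models are proprietary, but the shape of the problem — classify whether an incoming weather event is likely to cause an outage, based on labeled historical events — can be sketched with a toy nearest-neighbor classifier. All features and records below are invented for illustration:

```python
# Toy k-nearest-neighbour classifier: predict outage risk for a circuit from
# forecast weather, using labelled historical storm records.
# Features: (peak wind gust in mph, rainfall in inches). Data is invented.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict_outage(forecast, history, k=3):
    """Majority vote among the k most similar historical weather events."""
    nearest = sorted(history, key=lambda rec: euclidean(forecast, rec[0]))[:k]
    votes = sum(label for _, label in nearest)
    return votes > k // 2  # True -> outage expected

history = [
    ((15, 0.1), 0), ((20, 0.3), 0), ((25, 0.2), 0),  # mild weather, no outage
    ((55, 1.5), 1), ((60, 2.0), 1), ((70, 1.2), 1),  # storms that caused outages
]

print(predict_outage((58, 1.8), history))  # storm-like forecast -> True
print(predict_outage((18, 0.2), history))  # mild forecast -> False
```

In practice a utility would feed in dozens of features (vegetation, asset age, soil moisture) and use far more capable models, but the supervised-learning framing is the same.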
Operating in more than 180 countries, GE Power says it produces one-third of the world’s electricity, equips 90% of power transmission utilities worldwide, and develops software that manages more than 40% of the world’s energy. Yet as power transmission networks grow in complexity, largely attributable to the arrival of renewable energy technologies, generation has become less resilient due to unforeseen decreases in wind or sunlight. Ineffective management of a transmission system could result in blackouts and major financial and reputational penalties for utilities. To address this concern, GE Power has turned to AI to facilitate the measurement and forecasting of generation resiliency, enabling a more stable power grid.
The company is also using analytics to tackle another persistent and potentially service-crippling problem: data errors, which frequently arrive from the manual entry of information at the service provider level. Even seemingly simple errors can hinder emergency and outage response teams, leading to overall poor customer service experiences and lower satisfaction levels. GE Power is addressing this issue by developing algorithms incorporating geographic information system (GIS) and other forms of operational system data to detect, recommend and correct pervasive errors. With better quality data, utilities can more efficiently dispatch crews, reduce outage restoration time and avoid incorrect outage notifications to customers.
Utilities are generating a wealth of data from their assets every day, but it requires very specific expertise in both hardware and software to unlock its potential and drive actionable insights, Amin notes. “Current approaches to solve these problems, like network model data integrity, outage prediction and labor optimization, are manual, labor intensive and often inaccurate,” she says. “An advanced analytics approach learns from historical data to be able to make predictions that will provide better visibility in network assets and help organizations make more economical decisions.”
Predictive analytics adoption challenges
Despite growing interest, predictive analytics remains an emerging technology, rife with pitfalls and adoption challenges. “The main drawback is that these [predictive analytics] approaches work well for environments that are scaling up, but not for environments that are changing very rapidly,” Oren warns. Rapidly changing environments create a situation where the signals change too quickly, before the analytical system can detect slow-moving trends. “This, in turn, provides inaccurate results in trying to predict when something is going to happen.”
Obtaining and using high-quality data is also essential for accurate predictions. On average, the energy sector uses only a small fraction of the data it aggregates, Amin observes. “This data includes everything from sensors, recommendations from plant managers, and dynamic information from a multitude of assets and networks,” she explains. “Data is clearly available, but many utilities are drowning in the amount with no clear picture on how to leverage it, meaning they are unable to take full advantage of predictive analytics.”
Beyond data collection, to achieve maximum predictive power, it’s also necessary to establish a system for gathering and recording the various alerts and reports generated by the organization’s network operations team over time. Such details can be used to augment a predictive analytics tool’s ability to detect lurking network anomalies. “Furthermore, the team must be focused on processing data and analyzing the insights,” Rasool says. “This step requires having a team with domain expertise to understand the entire setup.”
An organization’s network team should also be able to supply appropriate positive and negative feedback processes to the predictive analytics system, since such information will aid the model’s learning capabilities. “With proper analysis, this data can create a lot more value by developing a means to deal with abnormal circumstances—sort of like what a network manager does currently,” Rasool says. However, self-learning doesn’t mean that predictive analytics will eliminate the need for human network managers. “In fact, such a system will be able to assist managers in better decision making and responding,” Rasool explains.
Yet another challenge is convincing network teams to embrace and routinely use predictive analytics tools. “IT and data science teams can come up with solutions, but if these are not adopted by the operations groups this investment does not yield a good return,” Amin says. “Therefore, understanding current processes and embedding advanced analytics into those processes are key to success.”
How to get started
For organizations just getting started in network predictive analytics, it’s important to think carefully about exactly what types of data should be captured, as well as the types of network problems that need to be resolved. “Once you have a clear picture of the use cases, that will add value to your organization,” Amin notes.
Remember, too, that feeding too much information into a predictive analytics tool can be almost as bad as supplying too little data. “If an organization doesn’t reduce the dimensionality of what data it’s analyzing, it could send an impractical amount of network telemetry information to the cloud,” warns John Smith, CTO and co-founder of network performance technology provider LiveAction.
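One simple form of the dimensionality reduction Smith alludes to is filtering out metrics that barely change before shipping telemetry off-device. This sketch — with invented metric names and values — drops near-constant columns using a variance threshold:

```python
# Variance-threshold filter: before exporting telemetry, drop metric
# columns that barely change and therefore carry little predictive signal.
from statistics import pvariance

def select_informative(metrics, min_variance=1.0):
    """Keep only metrics whose sample history varies meaningfully."""
    return {name: vals for name, vals in metrics.items()
            if pvariance(vals) >= min_variance}

telemetry = {
    "if_in_octets":    [1200, 1350, 1100, 1900, 1750],  # fluctuating - keep
    "cpu_util_pct":    [40, 55, 62, 48, 71],            # fluctuating - keep
    "link_speed_mbps": [1000, 1000, 1000, 1000, 1000],  # constant - drop
    "fan_status":      [1, 1, 1, 1, 1],                 # constant - drop
}

reduced = select_informative(telemetry)
print(sorted(reduced))  # ['cpu_util_pct', 'if_in_octets']
```

More sophisticated pipelines use techniques such as principal component analysis, but even a crude filter like this can sharply cut the volume of data sent to the cloud.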
It’s also important to think about the practical side of data management: specifically, how and where to store relevant data. “Exploring containerization and its functionality and applicability in various use cases is also worthwhile, as it provides optionality,” Amin says. “Technology is advancing rapidly, so putting off this exploration will leave companies behind.”
This story, “Using predictive analytics to troubleshoot network issues: Fact or fiction?” was originally published by