As 2018 comes to a close, prediction season kicks into full gear. Enterprise network professionals across the IT space are reevaluating this year’s trends and missteps in preparation for next year, and in doing so, customers learn which solutions deserve their focus. We spoke with Gadi Oren, VP of Products at LogicMonitor, to get his monitoring predictions for 2019 and beyond.
LogicMonitor offers a SaaS-based monitoring platform for existing IT environments. Its network monitoring solution automatically discovers and collects data on network infrastructure and provides alerts when needed. We feature LogicMonitor in our free network monitoring buyer’s guide below.
Monitoring Gets Opinionated
As advanced algorithms and machine learning are deployed into the monitoring market, we are seeing a greater need for more specific, and ultimately better, information to drive smarter decisions about the performance and delivery of services and applications.
Prediction: Over the next two years, customers will demand, and vendors will provide, more “opinionated” systems that move away from simple signals and alerts toward compound alerts that explain what happened and what someone can do about it. Through machine learning and SaaS-driven tools, vendors will deliver a strong network effect that leverages patterns newly learned across many customers — essentially learning from one customer’s problems to provide value directly to every user.
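To make the idea concrete, here is a minimal sketch of the difference between a simple threshold alert and an “opinionated” compound alert. All field names, the `correlate` helper, and the example cause/action strings are illustrative assumptions, not any vendor’s actual schema:

```python
# Illustrative sketch only: a compound alert bundles correlated signals
# with an explanation and a suggested action, rather than paging on each
# threshold breach separately. Field names are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class SimpleAlert:
    metric: str
    value: float
    threshold: float

@dataclass
class CompoundAlert:
    summary: str                             # what happened, in plain language
    contributing_signals: List[SimpleAlert]  # raw alerts that were correlated
    probable_cause: str                      # explanation from learned patterns
    recommended_action: str                  # what the operator can do about it

def correlate(alerts: List[SimpleAlert]) -> CompoundAlert:
    """Toy correlation: fold related threshold breaches into one
    explanatory alert. A real system would infer cause and action
    from patterns learned across many customers."""
    return CompoundAlert(
        summary=f"{len(alerts)} related threshold breaches detected",
        contributing_signals=alerts,
        probable_cause="disk saturation slowing database writes",   # placeholder
        recommended_action="check storage latency before scaling the web tier",
    )

alerts = [
    SimpleAlert("disk_io_wait_ms", 180.0, 50.0),
    SimpleAlert("db_write_latency_ms", 950.0, 200.0),
]
compound = correlate(alerts)
print(compound.summary)  # -> 2 related threshold breaches detected
```

The point of the structure is the last two fields: a simple alert says a number crossed a line, while a compound alert carries the “opinion” — what it likely means and what to do next.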
AIOps and the “Right Type” of Monitoring
AI is making huge strides with signals that are natural to humans (images, video, speech). In IT monitoring, by comparison, the killer applications have not yet emerged, because no company yet knows how to prepare the “right type” of signals, and the related feedback, that machine learning needs to produce a strong, meaningful application for IT management.
Prediction: Within the next three years, companies will figure out the right mix of signals and feedback for machine learning, creating a breakthrough in monitoring strategies. The first vendors to leverage this strategy after gathering the right data will hold the key advantage in the market. Ultimately, these tools will increase team efficiency by enabling work that once required experts to be handled by generalists, providing considerable customer value.
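The “signals plus feedback” pairing the prediction describes can be sketched as a simple loop: operators label alerts as actionable or noise, and the system uses those labels to decide what to surface next time. The frequency heuristic below is a naive stand-in for real machine learning, and all names are illustrative assumptions:

```python
# Minimal sketch of a monitoring feedback loop: signals paired with
# operator feedback so the system learns which alerts matter.
# The scoring is a deliberately naive vote count, standing in for ML.

from collections import defaultdict

class FeedbackLoop:
    def __init__(self):
        # per alert signature: [useful votes, noise votes]
        self.votes = defaultdict(lambda: [0, 0])

    def record(self, signature: str, useful: bool) -> None:
        """Operator feedback: was this alert actionable?"""
        self.votes[signature][0 if useful else 1] += 1

    def should_page(self, signature: str) -> bool:
        """Suppress signatures operators have mostly marked as noise."""
        useful, noise = self.votes[signature]
        if useful + noise == 0:
            return True  # no feedback yet: page by default
        return useful >= noise

loop = FeedbackLoop()
loop.record("cpu_spike_on_batch_host", useful=False)
loop.record("cpu_spike_on_batch_host", useful=False)
loop.record("db_replica_lag", useful=True)

print(loop.should_page("cpu_spike_on_batch_host"))  # False
print(loop.should_page("db_replica_lag"))           # True
```

The breakthrough the prediction anticipates is essentially this loop at scale: enough well-prepared signals and enough feedback that the learned behavior generalizes across customers.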
APM Will Remain a Limited Footprint (5%-10%) and a New Type of Monitoring Will Emerge
There are as many types of monitoring tools as there are monitoring strategies. APM, for instance, was originally created for the business case of monitoring clicks and performance, so that people have a good experience when engaging in online transactions. These solutions tend to focus on the “front side” of the house, covering web-to-database paths. Today, however, many organizations have 80%-90% of their code and IP doing data crunching that falls completely outside the APM footprint. Taking mountains of data and crunching it requires a different kind of solution, and currently no one is serving that need well.
Prediction: Within the next one to two years, a new space will emerge, driven by the growth of modern data-crunching workloads. This will happen in all types of environments, including on-premises, but will be most pronounced in the cloud. Because most servers today do their data crunching offline, we expect much of this new space to center on offline workloads. These new solutions will fully address customers’ distinct needs for offline workloads as organizations seek new and innovative ways to provide special capabilities to their customers. We also expect current teams to adapt to this new type of environment.