I have seen many IT terms come and go, and each new buzzword is followed by vendors jumping on the bandwagon: players who pretend to be part of the crowd or try to redefine the space in their own image, marketing jargon, and the like (not that this is always a bad thing!). When I pick up on a new market term that I believe will reshape the way we work and do business, I tend to be very cautious about others who claim to be in this space. After all, my personal reputation is on the line with what I promote and use in each of my customer solutions.
The purpose of this post is to outline how I personally go about validating vendors and technologies within the AIOps space. This post is not about naming valid players, as that list is constantly changing, but about outlining the way in which I vet them.
Reviewing the solution with the vendor is always the best first step. I am notorious for taking the vendor demo a little too far and often need a second demo, normally with a vendor corporate resource or senior engineer. During these demos I drill deep into the technology to identify how it will work for each of my customers. I always choose a use case that involves existing competitive technologies and challenging infrastructure or business needs. I try to picture how the vendor would interact and contribute positively in real-world scenarios that actually happened at my customers. Finally, having spent 20 years on the other side, I do the research. I have a keen sense for features that are overstated or not yet well vetted. I make a note of these features and do some hard research to verify that the vendor gave me the full story.
I always work in very complex environments where no single technology has the capability to address all of a company's needs, so I don't stop at technical fit. The solution must integrate with and work well alongside other common industry technologies, even competitive products. I don't expect the tool to integrate out of the box with everything, but it must be open enough that the customer, or the community in general, can extend the solution to integrate and play nicely with others. The key tenets are:
Open data: While very few tools will actually hand over the schema to their data store, they should always provide a complete, well-documented, and scalable interface to the data collected and generated within the technology. Both event and time-series data should be accessible, as well as any configuration or discovered component metadata.
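As a concrete (and entirely hypothetical) sketch of what I mean by an open data interface: the payload shape, field names, and metric names below are my own illustrative assumptions, not any vendor's actual API, but a tool that exposes its events and time series this cleanly is easy for a customer or the community to build on.

```python
import json

# Hypothetical JSON payload shaped like what an open AIOps events/metrics
# API might return; every field name here is an illustrative assumption.
payload = json.loads("""
{
  "events": [
    {"id": "ev-1", "severity": "critical", "node": "db01", "message": "disk full"},
    {"id": "ev-2", "severity": "warning",  "node": "web02", "message": "high latency"}
  ],
  "timeseries": [
    {"metric": "cpu.util", "node": "db01",
     "points": [[1700000000, 92.5], [1700000060, 95.1]]}
  ]
}
""")

def critical_events(data):
    """Filter the event feed down to critical-severity events."""
    return [e for e in data["events"] if e["severity"] == "critical"]

def latest_value(data, metric, node):
    """Return the most recent sample for a metric on a node, or None."""
    for series in data["timeseries"]:
        if series["metric"] == metric and series["node"] == node:
            # Each point is [timestamp, value]; max by timestamp.
            return max(series["points"])[1]
    return None

print([e["id"] for e in critical_events(payload)])  # ['ev-1']
print(latest_value(payload, "cpu.util", "db01"))    # 95.1
```

The point is not the ten lines of Python; it is that both the event feed and the raw time series are reachable through a documented interface, so this kind of extension takes minutes rather than a professional-services engagement.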
Common industry integrations: Many, if not all, AIOps solutions create and leverage events and metrics. The IT industry has settled on specific standards in this area, fine-tuned and matured over decades; SNMP and REST are two examples.
Maturity: Industry standards and proven, mature technologies are a must as the basis of AIOps tools that will be leveraged in the large enterprise. This includes the algorithms on which the tool bases its membership in the AI market. It should be leveraging mature, well-defined algorithms for prediction, grouping, and classification.
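To make "mature, well-defined" concrete, here is the kind of baseline I expect a vendor to be able to explain without hand-waving: a simple z-score anomaly detector. This is my own minimal sketch (the latency numbers and the 2.5-sigma threshold are assumptions for illustration), not any vendor's model; real products layer far more on top, but they should be able to articulate their algorithms at least this plainly.

```python
from statistics import mean, stdev

def zscore_anomalies(samples, threshold=2.5):
    """Return indices of points more than `threshold` standard
    deviations from the mean -- a deliberately simple, decades-old
    statistical baseline for flagging metric anomalies."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(samples)
            if abs(x - mu) / sigma > threshold]

# Synthetic latency samples in ms, with one obvious spike at index 7.
latencies = [12, 13, 11, 12, 14, 13, 12, 95, 13, 12]
print(zscore_anomalies(latencies))  # [7]
```

When a vendor cannot tell me whether their "AI" is closer to this, to a clustering method, or to a trained classifier, that is usually a sign the marketing is ahead of the engineering.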
Real Testimonials: Research what is available online, including reviews and the customers who the vendor claims love its solution. Follow up on referrals and talk with other users at trade shows and meetups. This part of the research takes some time to execute correctly.
Try to Break It: Kick the tires and see how it actually works. Put it in the lab, test the vendor's claims, break it, and find its limits, including proven scalability. Finding a tool with an awesome set of features but a limited ability to scale in large environments is not a showstopper in my world; that is normally resolved by architecting the solution so it does not hit those limits. The technology's ability to solve a problem and integrate with other technologies is paramount.
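Lab testing does not have to be elaborate to be useful. A sketch of the kind of throwaway harness I mean, where `ingest()` is a stand-in of my own invention for whatever ingest call the tool exposes, pushing synthetic events until the numbers stop matching the datasheet:

```python
import time

def ingest(event):
    """Stand-in for a vendor's event-ingest call (hypothetical);
    in a real lab test this would hit the tool's actual API."""
    return {"id": event["id"], "accepted": True}

def measure_throughput(n_events):
    """Push n synthetic events through ingest() and report events/sec."""
    start = time.perf_counter()
    accepted = sum(
        ingest({"id": i, "msg": "synthetic"})["accepted"]
        for i in range(n_events)
    )
    elapsed = time.perf_counter() - start
    return accepted, accepted / elapsed

accepted, rate = measure_throughput(10_000)
print(f"{accepted} events accepted at {rate:,.0f} events/sec")
```

Ramp the event count until something gives, note where it gives, and compare that against what the sales deck claimed.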
Proof of Value (POV): Implement it in the real world and stay close to it / live with it. I often leverage friendly customers who are willing to try new things. I view this group as my customer advisory board and invest my time and experience alongside their teams to solve their unique business needs. This is where we learn the most about a technology, and it is, again, very hard to execute without a lot of time investment. By the time a vendor gets to this point I have seen considerable value and have fully vetted their technology. The main objective of this stage is to see how the vendor works through customer needs: open tickets, challenge their engineers with edge use cases, and see what the vendor is capable of around customer support and product flexibility. Above all, see how willing the vendor is to work with the customer.
In closing, many of my industry contacts reach out to me about technologies they have come in contact with. They are always challenging me to research the next big thing and identify how the technology fits into the big picture. Finally, I offer up my team for any specific inquiries you may have about a technology. Feel free to click the link below and submit your question.