In the first installment of this three-part blog I discussed the reasons for SMI-S and the good it has brought to the world. SMI-S, though, like my favorite dark chocolate, is often bittersweet. Clint Eastwood in “The Good, the Bad and the Ugly” is only good because Lee Van Cleef’s ruthless villain “Angel Eyes” is there to contrast him with.
While the technology chosen by the SMI program has significant benefits, it also has challenges. To anyone but a seasoned software engineer the technology is essentially inaccessible. To be proficient in leveraging SMI-S takes significant time and energy. So while the initial goal was, at least on the surface, to be inclusive, it was only truly inclusive for those willing to pay the significant price of admission.
On the surface the SMI program may have a common goal of uniting through a common language, but at its root is the desire of businesses to make more money. Like the gold stashed in the cemetery, vendors are only willing to invest in the standard if they believe it will bring them money. The ways in which a business thinks it can leverage the SMI-S standard or organization for its own benefit trump the more altruistic goal of improving the standard.
Like “Angel Eyes,” big organizations can be quite ruthless. They have their hands and eyes on the program and have purchased smaller companies to improve their offerings. In many ways this is like fishing in a fishbowl. This can be good for the small companies too, but it isn’t necessarily the best thing for the consumer.
Like the United Nations Security Council, SMI-S includes big organizations with effective veto power, allowing them to strongly influence how the standard changes and to ensure it never changes in a way that would require them to invest outside what they believe offers the best ROI for their business. Essentially, this is a coalition without teeth. That is not to say SMI-S is any different from other standards projects; it is just part of the reality.
One of the goals of the SMI program was to include a way to determine whether a vendor implemented the standard properly. This is called a conformance test. Those that pass the test are awarded the badge of conformance with SMI-S version such-and-such. But the same organizations that manufacture the hardware strongly influence the test’s success criteria. Like the dubious characters in “The Good, the Bad and the Ugly,” those with the biggest (or fastest) gun make the law. This has led to a fairly low bar for conformance, particularly in the area of performance, where conformance simply means that a few metrics from a single ElementType are supported.
For example, a vendor that reports only at the Storage System and Logical Volume level has the same level of conformance as a vendor that reports on all storage system components: System, Front-end Adapters, Front-end Ports, Cache, Back-end Adapters, Disk Drives, Storage Pools and Logical Volumes. On the surface it looks like an effort to promote self-esteem in everyone who wants to play, even those who aren’t playing nice. Under the covers, though, we need to remember that there are typically two surface-level motivations for withholding information:
1) Fear that the information can or will be used negatively and
2) Fear that vendor tool sales will be placed at risk if other technically competent solutions are offered.
The root cause of both of these surface-level motivations is fear of losing money. Until vendors realize that improved management solutions benefit their customers directly and, indirectly, themselves, there will continue to be shortcomings in SMI-S. In reality, improved storage performance, capacity and configuration solutions lead to higher customer satisfaction through higher availability, improved performance service-level objectives, and faster mean time to root cause.
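To make that low conformance bar concrete, here is a minimal Python sketch. The ElementType names are drawn from my reading of the CIM BlockStorageStatisticalData value map used by the SMI-S performance subprofile (verify them against the SMI-S release you target); the `coverage` function is purely illustrative, not part of any conformance test suite.

```python
# Illustrative only: how little "performance conformance" can mean.
# ElementType values follow the DMTF CIM_BlockStorageStatisticalData
# ValueMap as I understand it -- confirm against your SMI-S version.
ELEMENT_TYPES = {
    2: "Computer System",
    3: "Front-end Computer System",
    5: "Back-end Computer System",
    6: "Front-end Port",
    7: "Back-end Port",
    8: "Volume",
    10: "Disk Drive",
}

def coverage(reported):
    """Fraction of the component types above for which a provider
    reports statistics. Conformance requires only one of them."""
    return len(set(reported) & set(ELEMENT_TYPES)) / len(ELEMENT_TYPES)

# Vendor A reports only system- and volume-level statistics ...
vendor_a = coverage([2, 8])
# ... while Vendor B instruments every component type listed.
vendor_b = coverage([2, 3, 5, 6, 7, 8, 10])
print(f"Vendor A covers {vendor_a:.0%}; Vendor B covers {vendor_b:.0%}")
# Yet both can carry the same SMI-S conformance badge.
```

Both vendors pass; nothing in the badge tells a buyer that one exposes a small fraction of the component-level statistics the other does.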
In summary, from a storage software perspective, the standard is challenged by the technology and the organizational structures formed to perpetuate it. This means that performance reporting capabilities in software products that rely on SMI and the BSP tend to vary quite a bit depending on the choices the hardware vendor has made. And unfortunately, the software vendors don’t necessarily have a Clint Eastwood on their side to fight for anything more comprehensive!
For more information on the standard see: http://snia.org/forums/smi