New interoperability tests will be required for products billing themselves as OpenStack-powered, OpenStack Foundation Executive Director Jonathan Bryce revealed in the opening moments of OpenStack Summit in Vancouver on Monday.
The move follows a change to the Foundation’s bylaws that grants the organization’s board direct control over use of the OpenStack trademark, a step intended to consolidate branding and promote interoperability across the many sub-projects of the fast-evolving open source cloud platform.
The tests will require products carrying the OpenStack brand to include a common subset of code and expose a common set of APIs.
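The article does not detail how the verification works, but the core idea of a required capability subset can be sketched in a few lines. This is an illustrative toy, not the actual DefCore test suite (which runs functional tests against live OpenStack APIs); the capability names and the `check_interoperability` helper below are hypothetical.

```python
# Hypothetical required core capability set a branded product must provide.
# These identifiers are illustrative, not the real DefCore capability list.
REQUIRED_CAPABILITIES = {
    "compute-servers-create",
    "compute-servers-list",
    "identity-tokens-create",
    "object-storage-object-get",
}

def check_interoperability(advertised):
    """Compare a product's advertised capabilities against the required set.

    Returns (passes, missing): passes is True only when every required
    capability is present; missing lists any gaps.
    """
    missing = REQUIRED_CAPABILITIES - set(advertised)
    return (not missing, missing)

# A product that skips object storage would fail the trademark check.
ok, missing = check_interoperability({
    "compute-servers-create",
    "compute-servers-list",
    "identity-tokens-create",
})
```

In the real program, each capability maps to concrete API tests rather than a string lookup, but the pass/fail logic reduces to the same set comparison.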
“It can actually be pretty intimidating, this proliferation of tools,” Bryce told the packed house at OpenStack Summit on Monday. “The vision we had when we kicked this off…is a global footprint of interoperability that could bring together compute, storage, and network resources.”
With the new verification testing, Bryce added, “you know what capabilities you’re getting, you can count on this to build your foundation on.”
In January, OpenStack’s member base approved a significant overhaul of the foundation’s bylaws, removing a detailed trademark approval process and allowing the board to determine trademark policy without input from members.
The ultimate goal of that shift was to allow the group’s powerful DefCore committee to create validation tests determining eligibility for use of the OpenStack trademark, sources told SDxCentral at the time.
The move toward requirement testing could also have internal implications for OpenStack’s monolithic release cycle, which attempts to herd a growing ecosystem of component projects into coordinated releases every six months.
“All of these projects should be running on their own timeline,” OpenStack board of directors member Randy Bias told us in January. “If we have this testing framework for what is OpenStack, it starts to open the door to that.”