ODPi provides cross-compatibility between different distributions of Hadoop and big data technologies.
ODPi Core specifies how Apache components should be installed and configured, and provides a set of validation tests, making it easier to create big data solutions and data-driven applications.
ODPi Core is not a distribution; it is an industry-standard deployment model over which the industry can build enterprise-class big data solutions.
① Reinforces the role of the Apache Software Foundation (ASF) in the development and governance of upstream projects.
② Accelerates the delivery of Big Data solutions by providing a well-defined core platform to target.
③ Defines, integrates, tests, and certifies a standard “ODPi Core” of compatible versions of select Big Data open source projects.
④ Provides a stable base against which Big Data solution providers can qualify solutions.
⑤ Produces a set of tools and methods that enable members to create and test differentiated offerings based on the ODPi Core.
⑥ Contributes to ASF projects in accordance with ASF processes and Intellectual Property guidelines.
⑦ Supports community development and outreach activities that accelerate the rollout of modern data architectures that leverage Apache Hadoop®.
⑧ Helps minimize fragmentation and duplication of effort within the industry.