Infor and Snowflake announced a partnership that will use business intelligence (BI) tools from Birst, a cloud-based BI vendor, to allow users to build automated data warehouses on Snowflake.
Brad Stillwell, VP of analytics product strategy and solution consulting at Infor, told SDxCentral that this partnership will enable users to perform data warehouse automation within the Snowflake database. “This is key because it will help current and future customers move beyond a legacy approach to analytics and data integration, which inhibits their ability to innovate, scale, and grow their businesses,” he added.
New York-based enterprise resource planning (ERP) vendor Infor acquired Birst in 2017 – the same year Birst received one of the four highest scores in four of the five use cases assessed in the "2017 Gartner Critical Capabilities for Business Intelligence and Analytics Platforms" report. The report lauded the platform’s ability to deliver insights, discovery, correlations, and predictive analytics in a single interface.
It is through this relationship that joint customers of Infor and Snowflake will be able to use Birst's integrated end-to-end platform for building automated data warehouses natively on Snowflake.
Birst With Snowflake

Birst eliminates the need for separate extract, load, and transform (ELT) or extract, transform, and load (ETL) tools, as well as separate data modeling, data preparation, and analytics tools. Stillwell explained that it provides enterprise data governance and fine-grained control and security at the row and column level within Snowflake, along with auditing and built-in usage tracking.
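Birst generates and manages these controls itself, but for a rough sense of what fine-grained, row- and column-level security looks like when expressed directly in Snowflake, the sketch below applies a row access policy and a masking policy through the snowflake-connector-python package. The table, policy, role, and connection names are hypothetical placeholders, not details from either vendor's announcement.

# Rough sketch: row- and column-level controls expressed as native Snowflake
# policies via the Python connector. All names (sales, region_access_map,
# email_mask, connection settings) are hypothetical; Birst would normally
# generate and manage equivalents of these for you.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # placeholder connection parameters
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)

statements = [
    # Row-level security: a role only sees rows for regions it is mapped to.
    """
    CREATE OR REPLACE ROW ACCESS POLICY sales_region_policy
      AS (region STRING) RETURNS BOOLEAN ->
        CURRENT_ROLE() = 'GLOBAL_ANALYST'
        OR EXISTS (
          SELECT 1 FROM region_access_map m
          WHERE m.role_name = CURRENT_ROLE() AND m.allowed_region = region
        )
    """,
    "ALTER TABLE sales ADD ROW ACCESS POLICY sales_region_policy ON (region)",
    # Column-level security: mask email addresses for roles without PII access.
    """
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val ELSE '***MASKED***' END
    """,
    "ALTER TABLE sales MODIFY COLUMN customer_email SET MASKING POLICY email_mask",
]

with conn.cursor() as cur:
    for stmt in statements:
        cur.execute(stmt)
conn.close()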
As cloud becomes the first choice for enterprise data transformation, initiatives that capture data from across the enterprise – whether generated by disparate applications, people, or IoT infrastructure – offer tremendous potential. Because Birst and Snowflake both run natively on Amazon Web Services (AWS), customers can eliminate the round trip of extracting data out of Snowflake, loading it into Birst to transform and analyze, and then extracting it from Birst and moving it back into Snowflake.
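To make the contrast concrete, here is a minimal sketch of the in-place (ELT) pattern this enables, with the transformation running as SQL inside Snowflake so the data never leaves the warehouse. The raw_orders source, the daily_revenue target, and the connection settings are hypothetical.

# Illustrative only: transforming data in place with Snowflake's own compute
# (ELT) instead of extracting rows to an external tool and reloading them.
# Table names and connection settings are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS_DB",
    schema="PUBLIC",
)

with conn.cursor() as cur:
    # The aggregation runs where the data already lives; only the SQL text
    # and a small status result travel over the network.
    cur.execute("""
        CREATE OR REPLACE TABLE daily_revenue AS
        SELECT order_date,
               region,
               SUM(amount) AS revenue
        FROM raw_orders
        GROUP BY order_date, region
    """)
conn.close()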
The partnership creates a “cloud-first approach to data management,” separating compute and storage, which Stillwell called “vital” for joint customers looking for a scalable and cloud-first data management platform to inform their broader digital transformation initiatives.
“It’s this value that Snowflake has unlocked – the automatic creation of an analytic-ready data model or data warehouse on Snowflake,” Stillwell said.
AI-Powered BI Systems In Data Lakes

Today, more data crosses the internet every second than was stored in the entire internet just 20 years ago, and how that data is collected, accessed, and analyzed can determine whether a business sinks or swims.
“Companies that are currently not harnessing the power of artificial intelligence (AI) are putting themselves at a huge disadvantage,” Stillwell said.
AI-enabled BI capabilities offer organizations the potential to move beyond traditional reports and dashboards, using machine learning (ML) algorithms to surface insights that were previously unavailable to business users. The result: companies can better analyze their data and start asking questions they didn’t even know they should be asking.
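Neither company has detailed which algorithms Birst applies, but the general idea can be sketched with a generic example: rather than waiting for someone to build a dashboard around a metric, an anomaly detector scans daily revenue and flags the days worth asking about. The data and model below are placeholders, not Birst's actual approach.

# Generic illustration of ML-driven insight (not Birst's actual algorithms):
# an anomaly detector flags unusual daily revenue instead of waiting for a
# human to spot it on a dashboard.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=7)
daily_revenue = rng.normal(loc=100_000, scale=5_000, size=90)  # 90 typical days
daily_revenue[[20, 55]] = [40_000, 180_000]                    # two unusual days

model = IsolationForest(contamination=0.03, random_state=7)
labels = model.fit_predict(daily_revenue.reshape(-1, 1))       # -1 marks anomalies

for day, (value, label) in enumerate(zip(daily_revenue, labels)):
    if label == -1:
        print(f"Day {day}: revenue {value:,.0f} looks unusual -- worth a question")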
Stillwell said Infor is seeing enterprises rapidly move from static data warehouses to data lakes, in which structured, semi-structured, and unstructured data from across the business feed into a secure, encrypted data repository.
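As a generic illustration of that mix (not a documented Infor or Snowflake workflow), the sketch below lands semi-structured IoT events as JSON next to typed columns in Snowflake, where they can be queried together; the table and connection names are hypothetical.

# Generic illustration: storing semi-structured IoT events (JSON) alongside
# structured columns in one Snowflake table. Names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="LAKE_WH",
    database="DATA_LAKE",
    schema="RAW",
)

with conn.cursor() as cur:
    # A VARIANT column holds arbitrary JSON next to typed columns.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS iot_events (
            received_at TIMESTAMP_NTZ,
            device_id   STRING,
            payload     VARIANT
        )
    """)
    cur.execute("""
        INSERT INTO iot_events
        SELECT CURRENT_TIMESTAMP(), 'sensor-42',
               PARSE_JSON('{"temp_c": 71.5, "status": "overheat"}')
    """)
    # Semi-structured fields are queryable with path notation.
    cur.execute("""
        SELECT device_id, payload:temp_c::FLOAT AS temp_c
        FROM iot_events
        WHERE payload:status::STRING = 'overheat'
    """)
    print(cur.fetchall())
conn.close()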
The use of data lakes has grown alongside the increased use of cloud platforms as storage media. Gartner released a report last year predicting that 75% of all databases will be deployed on or migrated to the cloud by 2022. Organizations are developing and deploying new applications in the cloud and moving existing assets there at an increasing rate. This trend, according to the report, is expected to accelerate in response to the need to extract in-depth insights from growing volumes of data and to access data held in departmental silos, mainframes, and legacy systems.
This is further underscored by additional market research projecting that the global data lake market will grow from $7.9 billion in 2019 to $20.1 billion by 2024.
“They [data lakes] will be key in helping enterprises grow their insights and investments, ingest more content for better informed decisions, improve their analytics profiles, and provide rich data sets to build more powerful machine learning processes,” Stillwell added.