Logical Data Fabric to the Rescue: Integrating Data Warehouses, Data Lakes, and Data Hubs

Rick van der Lans, Independent Industry Analyst

Data warehouses were introduced to offer one integrated view of all the enterprise data spread across numerous isolated transactional systems. Now, organizations are struggling with a myriad of new data architectures, such as data lakes, data hubs, and data lakehouses, that also try to offer an integrated view of the data to business users and data scientists.

For business users, this is far from ideal. If they need data, where should they get it from, and how should they integrate it? Especially now that organizations want to become more data-driven, frictionless access to data is crucial.

A popular new architecture that supports such frictionless access is the data fabric. With a data fabric, existing transactional and data delivery systems are wrapped (encapsulated) so that all the independent systems look like one integrated system.
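As a rough illustration of this wrapping idea, the Python sketch below shows two independent systems exposed through one common interface, so a consumer sees a single integrated customer dataset. All class and method names are hypothetical, not a real product API.

# Minimal, hypothetical sketch: two independent systems are wrapped behind
# one common interface, so a consumer sees one integrated customer dataset.
from abc import ABC, abstractmethod

class WrappedSource(ABC):
    """Encapsulates one underlying system behind a uniform interface."""
    @abstractmethod
    def fetch_customers(self) -> list[dict]:
        ...

class CrmSystem(WrappedSource):
    def fetch_customers(self) -> list[dict]:
        # In a real fabric this would call the CRM's API or database.
        return [{"id": 1, "name": "Acme", "source": "crm"}]

class DataLake(WrappedSource):
    def fetch_customers(self) -> list[dict]:
        # In a real fabric this would read files or tables from the lake.
        return [{"id": 2, "name": "Globex", "source": "lake"}]

class DataFabric:
    """Presents all wrapped systems as one integrated view."""
    def __init__(self, sources: list[WrappedSource]):
        self.sources = sources

    def customers(self) -> list[dict]:
        rows: list[dict] = []
        for source in self.sources:
            rows.extend(source.fetch_customers())
        return rows

fabric = DataFabric([CrmSystem(), DataLake()])
print(fabric.customers())  # one integrated result from two independent systems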

Data fabrics cannot be bought; they must be developed, and they can be developed in many ways and with many different technologies. This whitepaper discusses the benefits of using data virtualization to create data fabrics: on-demand data integration, high productivity, easy maintenance, harnessing the power of underlying technologies through query pushdown, centralized data security, embedded metadata support, built-in support for record-oriented and set-oriented data access, and AI support.
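To illustrate one of these benefits, the sketch below contrasts filtering in the integration layer with query pushdown, where the predicate is delegated to the underlying engine. SQLite stands in here purely as a placeholder for any source system, and the function names are illustrative, not taken from any product.

# Hypothetical sketch of query pushdown; SQLite stands in for any underlying
# source system, and the function names are illustrative.
import sqlite3

def filter_in_fabric(conn: sqlite3.Connection, country: str) -> list[tuple]:
    # Without pushdown: fetch everything, then filter in the integration layer.
    rows = conn.execute("SELECT id, name, country FROM customers").fetchall()
    return [row for row in rows if row[2] == country]

def filter_with_pushdown(conn: sqlite3.Connection, country: str) -> list[tuple]:
    # With pushdown: the predicate runs inside the source's own engine,
    # so only matching rows leave the source.
    return conn.execute(
        "SELECT id, name, country FROM customers WHERE country = ?",
        (country,),
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Acme", "NL"), (2, "Globex", "US")],
)
assert filter_in_fabric(conn, "NL") == filter_with_pushdown(conn, "NL")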

What's Next?

Gain real-time insights from your data and begin your digital transformation today!