Pages tagged with: Fast Data Strategy Virtual Summit 2019
The experts debate the merits and demerits of using physical vs. virtual data models for data science. In this session, you will learn:
- When should you use physical or virtual data models, or a combination of both, for data science?
- Can automation of models help business users become citizen data scientists?
- How will machine learning and AI influence data science?
Visit the Fast Data Strategy Virtual Summit 2019 page for details on all the sessions.
The experts field tough questions on the best approaches for companies to migrate their data warehouses and data lakes to the cloud.
- What pitfalls should you avoid while undertaking the cloud journey?
- Should you go with a single cloud provider or multi-cloud, and what criteria should you use to decide?
- How do you set up data virtualization in a hybrid-cloud scenario?
Efficiently integrating data from multiple data sources is the linchpin of successful data science projects. Data virtualization provides an environment that leverages business analysts’ domain knowledge and SQL skills and offloads the data prep and integration work from data scientists. Further, the data virtualization environment makes integrated data reusable and provides higher-performance SQL data access. In this session, you will learn:
A virtual layer can help the data scientist speed up some of the most tedious tasks, such as data exploration and analysis. At the same time, it integrates well with the data scientist’s ecosystem; there is no need to change tools or learn new languages. In this session, we will see:
- How the data catalog simplifies the search for useful data
- How to use Denodo's SQL engine to combine, transform, and analyze data from any source
- How Denodo integrates with tools like Zeppelin and Spark to work with large data volumes
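To make the idea of one SQL engine spanning several sources concrete, here is a minimal, self-contained sketch. It is not Denodo code: two in-memory SQLite databases stand in for heterogeneous sources, and SQLite's ATTACH lets a single engine join across them, much as a virtual layer federates a query. All table and column names are hypothetical.

```python
import sqlite3

# Two separate "sources": the main connection plays the sales system,
# and an attached in-memory database plays the CRM system.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")

# Source 1: orders (hypothetical schema)
conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 100, 250.0), (2, 101, 75.5), (3, 100, 40.0)])

# Source 2: customers (hypothetical schema)
conn.execute("CREATE TABLE crm.customers (customer_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(100, "Acme"), (101, "Globex")])

# One SQL statement spanning both "sources", as a virtual layer would allow:
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM orders o JOIN crm.customers c ON o.customer_id = c.customer_id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Acme', 290.0), ('Globex', 75.5)]
```

The analyst writes ordinary SQL and never needs to know that the joined tables live in different systems; that separation of concerns is the point of the virtual layer.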
Autodesk is a global software corporation that develops software for the engineering, manufacturing, and media industries. The company decided to transform its revenue model from perpetual licensing to subscription-based licensing to propel growth. Autodesk designed a modern data architecture around a logical data warehouse that relies heavily on data virtualization to integrate new transaction sources, including big data systems like Spark, while retaining the old systems, which are still actively used for reporting and regulatory compliance. In this session, you will learn:
Denodo customer presentation by Nicolas Brisoux, Director of Product Management at Tableau Software.
According to a leading analyst firm, total spend on data and analytics is expected to reach $104 billion in 2019! Companies are investing in data warehouse modernization and data lake projects for descriptive and advanced analytics; however, for the analysis to be holistic, today’s architects must weave together disparate data streams, not only from these analytical sources but also from operational, third-party, and streaming data sources. The logical data warehouse is a modern architectural methodology that virtually combines all the data across the enterprise and makes it available to...
While big data initiatives have become necessary for any business to generate actionable insights, a big data fabric has become a necessity for any successful big data initiative. A best-of-breed big data fabric should deliver actionable insights to business users with minimal effort, provide end-to-end security for the entire enterprise data platform, and provide real-time data integration, while delivering a self-service data platform to business users. Denodo partner Kadenza will explain why they choose data virtualization for any big data fabric implementation. Attend this session to...
Data catalogs are en vogue, answering critical data governance questions such as “Where does all my data reside?” “What other entities are associated with my data?” “What are the definitions of the data fields?” and “Who accesses the data?” Data catalogs maintain the business metadata necessary to answer these questions and many more. But that’s not enough: to be useful, data catalogs need to deliver these answers to business users right within the applications they use. In this session, you will learn:
Today’s cloud migration strategies need to account for the increased complexity of data governance and of hybrid and multi-cloud architectures, while reducing the inherent risk of disrupting users and applications during the migration. The core benefit of data virtualization technology is the data abstraction required to decouple users and applications from activities such as data migration and consolidation, while adding the semantics and governance necessary in modern data environments. In this session, you will learn:
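The decoupling described above can be sketched in a few lines. This is a conceptual illustration, not a Denodo feature: consumers query a view rather than a physical table, so the underlying data can be migrated to a new location and the view repointed, without the consumer's SQL changing at all. SQLite stands in for both the legacy and the "cloud" source, and all names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# "Legacy" source, exposed to consumers only through a view.
conn.execute("CREATE TABLE customers_legacy (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers_legacy VALUES (1, 'Acme')")
conn.execute("CREATE VIEW customers AS SELECT id, name FROM customers_legacy")

# A consumer's query, written once against the abstraction.
consumer_query = "SELECT name FROM customers ORDER BY id"
before = conn.execute(consumer_query).fetchall()

# Migration: data moves to a new "cloud" table; only the view is repointed.
conn.execute("CREATE TABLE customers_cloud (id INTEGER, name TEXT)")
conn.execute("INSERT INTO customers_cloud SELECT id, name FROM customers_legacy")
conn.execute("DROP VIEW customers")
conn.execute("CREATE VIEW customers AS SELECT id, name FROM customers_cloud")

after = conn.execute(consumer_query).fetchall()
print(before == after)  # True: the consumer query survives the migration unchanged
```

The same pattern is what lets a virtualization layer migrate or consolidate sources behind the scenes while applications keep running against a stable semantic layer.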