Showing 1 - 10 of 13 pages tagged with: Hadoop
Vizient Cuts Costs and Improves Member Services Using Denodo Data Virtualization
Vizient is the largest member-owned healthcare company in the US, delivering industry-leading supply chain management services and clinical improvement services to its members. Each month, Vizient grapples with more than 1 billion detailed line-item spend records, 110 million detailed patient records, 9 million products with 100 million cross-references, and 110,000 non-acute care customers. The Sales Operations team needed various types of reporting, such as opportunity, lead, and member-related reports. Additionally, they needed to analyze opportunity-to-outcome, member-to-product and...
Data Services – Rapid Application Development using Data Virtualization
Drillinginfo uses data virtualization to manage and quickly provision data to its product development team and its customers. Drillinginfo will present how they created a virtual data abstraction layer using data virtualization, reducing the time to build web services for application development from 1–2 weeks to less than a day.
Big Data: Architecture and Performance Considerations in Logical Data Lakes
In this session Alberto Pan, CTO at Denodo, covers the "Architecture and Performance Considerations in the Logical Data Lake." He explains in detail what a Data Lake Architecture looks like, how data virtualization fits into the Logical Data Lake, and goes over some performance tips. This session concludes with an example demonstrating this model's performance.
Hadoop and Data Virtualization - A Case Study by Vizient
Driving the Modern Data Architecture: Learn how Vizient simplified data management and enabled faster data discovery with Hadoop and Data Virtualization.
BNSF Railways
Challenges
- Inefficient manual integration of disparate data spread across legacy mainframes, enterprise applications (CRM, ERP, Excel), and Hadoop sources
- Delayed data for business reporting
- High costs in building data integration workflows
Solution
- Denodo deployed to modernize the transportation scheduling system (a 5–10 year project)
- Platform used to normalize operational sources, mainframe tables, and Hadoop systems
- Data Virtualization used to provide an abstracted view of legacy data while the company migrates to a Linux-based architecture
Benefits
- Increased efficiency in extracting and combining data due to connectors...
Data Virtualization for Big Data Access and Data Services Delivery
Big Data projects can quickly turn into Big Data silos…unless you can work out how to easily share your HDFS files and Map/Reduce job results with the rest of the organization. See how Denodo’s Data Virtualization Platform allows you to turn your Big Data projects into valuable Big Data assets that can be shared with everyone in your organization.
Achieve Value and Insight with IBM Big Data Analytics and Denodo Data Virtualization
Denodo's partnership with IBM has resulted in solutions for integrating data in IBM PureData System for Analytics (Netezza) and IBM BigInsights. After extensive collaborative technical work and optimization, the performance of the Denodo Platform was tested in the IBM Labs, where it exceeded expectations in all three use patterns initially contemplated for co-marketing by IBM. These patterns are described in the IBM solution brief.
DBTA: Elephant Traps … How to Avoid Them with Data Virtualization
Big Data is the latest ‘must have’ on the CIO’s wish list. But how do you take advantage of the benefits that Big Data products, such as Hadoop, can deliver without getting stuck in an ‘elephant trap’?
DBTA Best Practices: Data Virtualization is Vital for Maximizing Your NoSQL and Big Data Investment
In order to derive meaningful ROI from investments in Big Data, Cloud Computing, and NoSQL, organizations should consider adopting Data Virtualization from the get-go.
Data Virtualization for Big Data: How to Choose the Right Integration Model
Denodo’s unique Extended Relational Model makes it easier to leverage your Big Data assets and integrate them with the rest of your enterprise data while avoiding data silos.