Future-Proofing Analytics Modernization in the Federal Government: How to Mitigate Disruption While Modernizing the Analytics Technology Stack

Agencies that have run legacy data warehouses, data marts, and analytics reporting technologies for the last 10-plus years are looking to modernize them, often by moving to cloud equivalents. For example, many are migrating from on-premises data warehouses such as Teradata to cloud data warehouses such as Snowflake. However, such a transition inevitably affects internal and external data consumers in many ways: applications become unavailable during migration downtime, users must be trained on the new analytics technologies, and they must learn new ways to search, find, and use data in the new system. Such disruption can degrade user experience and productivity, and can make it harder to scale to future changes.

Is it possible to minimize or even eliminate this disruption? A migration may affect critical reports that must run weekly, compliance obligations, and data consumer assistance that must be provided in real time. So how can critical agency operations continue unimpeded while the agency switches to new technologies?
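One pattern commonly used to limit this kind of disruption is to place a stable access layer between data consumers and the warehouses, then cut datasets over incrementally behind it; data virtualization tools work on this principle. The Python sketch below is a minimal illustration of that idea and is not drawn from the panel discussion itself. The LegacyWarehouse, CloudWarehouse, and get_warehouse names are illustrative assumptions, standing in for, say, an on-premises Teradata system and a cloud warehouse such as Snowflake.

```python
# Illustrative sketch only: a thin abstraction layer that keeps report
# queries stable while the backing warehouse changes. The backend classes
# below are hypothetical placeholders, not real vendor drivers.

from abc import ABC, abstractmethod


class Warehouse(ABC):
    """Stable interface that report consumers code against."""

    @abstractmethod
    def run_query(self, sql: str) -> list[dict]:
        ...


class LegacyWarehouse(Warehouse):
    def run_query(self, sql: str) -> list[dict]:
        # In practice this would call the legacy on-premises driver.
        print(f"[legacy] {sql}")
        return []


class CloudWarehouse(Warehouse):
    def run_query(self, sql: str) -> list[dict]:
        # In practice this would call the cloud warehouse's driver.
        print(f"[cloud] {sql}")
        return []


def get_warehouse(migrated_datasets: set[str], dataset: str) -> Warehouse:
    """Route each dataset to whichever backend currently owns it, so weekly
    compliance reports keep running during a phased cutover."""
    return CloudWarehouse() if dataset in migrated_datasets else LegacyWarehouse()


if __name__ == "__main__":
    migrated = {"obligations"}  # datasets already moved to the cloud
    for dataset in ("obligations", "grants"):
        wh = get_warehouse(migrated, dataset)
        wh.run_query(f"SELECT COUNT(*) FROM {dataset}")
```

Because consumers only ever see the stable interface, datasets can move to the new platform one at a time without retraining users mid-migration or taking critical reports offline.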

CDO Magazine convened a panel of federal leaders to explore this topic:

  • Mr. David Nelson - Director, Chief Information Officer and Chief Data Officer - United States Nuclear Regulatory Commission
  • Mr. Stephen Keller - Acting Director, Data Strategy at U.S. Treasury - Fiscal Services (for Mr. Justin Marsico)
  • Mr. William Sullivan - GM and SVP, Federal at Denodo
  • Mr. Paul Brubaker - Deputy Chief Information Officer/AMO at U.S. Department of Veterans Affairs
  • Mr. Kenneth Clark - Chief Data Officer/Assistant Director at U.S. Immigration and Customs Enforcement (ICE)
  • Moderator: Mr. Paul Moxon, SVP Data Architecture and Chief Evangelist at Denodo

What's Next?

Gain real-time insights from your data and begin your digital transformation today!