Go Fast Using Data Virtualization

denodo · 2 min read · Jan 18, 2022

During a recent house move I discovered an old notebook with metrics from my time as a Data Warehouse Project Manager, when I used to estimate data delivery projects. For the delivery of a single data mart with supporting reports and dashboards, with 6–8 dimensions and a handful of measures or facts, the cost was approximately $200K AUD per data initiative, and it needed a delivery team of 8–10 resources over roughly 12 weeks.

In this example we used more traditional methods of data delivery: an ETL approach for all data movement (in this case DataStage) and an agile, iterative methodology for development and deployment. The data would move from the source system to a staging area (making it easy to reconcile the data being captured), then to an ODS, or Operational Data Store (to serve the business's operational reporting needs), then, after modelling, to the EDW, or Enterprise Data Warehouse (for historical storage and time-series analysis), then into subject-specific data models (quite often star-schema designs), and finally into multi-dimensional cubes (used for more in-depth analysis and reporting).
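To make those hops concrete, here is a minimal sketch in Python of the layered flow described above. It is purely illustrative: the table names, transforms, and in-memory structures are my own assumptions, and the real pipeline used DataStage jobs rather than Python.

```python
"""Illustrative sketch of the multi-hop ETL flow described above.
All names and transforms are hypothetical stand-ins."""

from datetime import date

# Source system extract: raw operational rows, as landed in staging.
staging_sales = [
    {"order_id": 1, "customer": "Acme", "amount": 120.0, "order_date": "2022-01-10"},
    {"order_id": 2, "customer": "Acme", "amount": 80.0, "order_date": "2022-01-11"},
]

# Hop 1: staging -> ODS. Light cleansing only, so operational reports
# can still be reconciled against the source.
ods_sales = [
    {**row, "order_date": date.fromisoformat(row["order_date"])}
    for row in staging_sales
]

# Hop 2: ODS -> EDW. Add load metadata so history and time-series
# analysis become possible.
edw_sales = [
    {**row, "load_date": date.today(), "is_current": True}
    for row in ods_sales
]

# Hop 3: EDW -> star schema. Split into a dimension table and a fact
# table joined by a surrogate key.
dim_customer = {name: key for key, name in
                enumerate(sorted({r["customer"] for r in edw_sales}), start=1)}
fact_sales = [
    {"customer_key": dim_customer[r["customer"]],
     "order_date": r["order_date"],
     "amount": r["amount"]}
    for r in edw_sales
]

# Hop 4: the multi-dimensional cube would pre-aggregate fact_sales
# along its dimensions; a simple roll-up by customer stands in here.
cube = {}
for f in fact_sales:
    cube[f["customer_key"]] = cube.get(f["customer_key"], 0.0) + f["amount"]

print(fact_sales)
print(cube)
```

Note that every hop produces another physical copy of the data; that repeated movement accounts for much of the cost and duration quoted above, and it is the kind of copying a data virtualization approach aims to avoid.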

Read more at https://www.datavirtualizationblog.com. Originally published on January 14, 2022.

