
Daasity Data Model Overview

This page outlines the general Daasity Data Model, including our Unified Schemas and Data Marts, explains why we designed the data model this way, and describes how our transformation layer works.

Design philosophy

The Daasity Data Model was designed to be future-proof so that a complete rebuild of reporting would not be needed when either a source system changed or a new source system was added.
Our team's experience switching Email Service Providers (ESPs) and dealing with changing APIs led to this design, which minimizes the impact of upstream changes by leveraging a normalized middle layer: when a source system is modified, only one set of changes is needed.
Data undergoes three (3) steps:
  1. Data is replicated from the source system into the Extractor schema.
  2. Data is transformed from the Extractor schema into a Normalized schema, which we call the Unified Data Schemas.
  3. Data is transformed from the Normalized schema into the reporting / Data Mart schemas.
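The three steps above can be sketched as a pipeline of small transformations. This is an illustrative Python sketch, not the actual Daasity implementation; the field names (`id`, `total_price`, `order_total`) and the business rule in step 3 are hypothetical.

```python
# Hypothetical sketch of the three-step flow. All schema fields and the
# reporting rule are illustrative assumptions, not the real Daasity schemas.

def replicate_to_extractor(source_record: dict) -> dict:
    """Step 1: land the raw source record as-is in the extractor layer."""
    return dict(source_record)

def normalize(extractor_record: dict) -> dict:
    """Step 2: map source-specific fields onto a unified schema.
    Only this step changes when the source system changes."""
    return {
        "order_id": extractor_record["id"],
        "order_total": extractor_record["total_price"],
    }

def build_mart_row(unified_record: dict) -> dict:
    """Step 3: apply reporting/business logic on the unified schema only."""
    return {
        "order_id": unified_record["order_id"],
        "is_large_order": unified_record["order_total"] >= 100,
    }

raw = {"id": "1001", "total_price": 129.0}
mart = build_mart_row(normalize(replicate_to_extractor(raw)))
```

Because step 3 reads only the unified shape, swapping the source system would require rewriting `normalize` alone, which is the core of the design philosophy described above.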
Daasity Data Model

Extractor schemas

The extractor schema is the best representation of the source system (SaaS platform, database or other data source) that is possible in a traditional database structure. Thus, nested data sources (e.g., JSON) may be denested into multiple tables.
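As a concrete illustration of denesting, consider a nested order payload flattened into a parent table and a child table. The payload shape and table columns below are hypothetical examples, not the actual extractor schema of any specific platform.

```python
# Illustrative only: a nested source payload (an order with line items)
# denested into two flat tables, as a traditional database requires.

order_payload = {
    "id": "1001",
    "email": "jane@example.com",
    "line_items": [
        {"sku": "TEE-S", "quantity": 1},
        {"sku": "MUG-01", "quantity": 2},
    ],
}

# Parent table row: scalar fields only.
orders_row = {"order_id": order_payload["id"], "email": order_payload["email"]}

# Child table rows: one row per nested element, keyed back to the parent.
line_item_rows = [
    {"order_id": order_payload["id"], "sku": li["sku"], "quantity": li["quantity"]}
    for li in order_payload["line_items"]
]
```

The parent key repeated on each child row is what lets SQL in the later transformation layers join the tables back together.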
This approach enables us to implement an ELT approach and move the transformation logic to a SQL/Python layer where it is easier to access and modify.
Although storage costs may increase because of the data replication, data volumes in the consumer brand industry are relatively small, and storage costs are minimal compared to the cost of maintaining pipelines that transform directly from source to end reporting.

Normalization schemas

The Normalization schemas (Daasity Unified Schemas) are a core component of the Daasity platform. Developing a unified schema that normalizes similar data sources into the same data model has a significant impact on analytics development: it reduces the overall maintenance of the data model and allows you to plan for the future.
For example, our Unified Order Schema (UOS) is built to support a multi-shipment/multi-recipient framework across eCommerce, Marketplace, Retail, and Wholesale, which very few commerce platforms support.
This means that if a commerce platform were to add multi-shipment/multi-recipient functionality, you would only need to change the transformation code from the Extractor Schema to the Unified Schema; none of the downstream data models and reports would be impacted. This greatly reduces maintenance, as there is one single data model to change.
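The multi-shipment idea can be shown with a small sketch: even a platform that supports only one shipment per order gets mapped into a shipments list, so downstream models never need to know the difference. The field names here are hypothetical, not the actual UOS columns.

```python
# Hypothetical mapping into a multi-shipment/multi-recipient structure.
# A platform with one shipment per order still fills a shipments list,
# so the unified shape already accommodates future multi-shipment support.

def to_unified_order(platform_order: dict) -> dict:
    return {
        "order_id": platform_order["id"],
        "shipments": [
            {
                "recipient": platform_order["shipping_address"]["name"],
                "items": platform_order["line_items"],
            }
        ],
    }

unified = to_unified_order({
    "id": "1001",
    "shipping_address": {"name": "Jane Doe"},
    "line_items": [{"sku": "TEE-S", "quantity": 1}],
})
```

If the platform later supported multiple shipments, only this mapping function would grow; everything downstream already iterates over a list.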
More detail on our Unified Schemas is available starting with our Unified Data Schemas page.

Reporting schemas

The data reporting schema (DRP) is the source schema for reporting and is where we connect a visualization tool such as Looker, Tableau, or Sigma Computing. Building the data reporting schema from the normalized schema lets us place the business logic in this transformation layer, so changes are driven by changes to business logic rather than by changes to the source system.
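To illustrate business logic living in this layer, here is a sketch of one such rule: a new-vs-returning customer flag computed purely from unified orders. The rule itself and the field names are hypothetical examples, not actual Daasity reporting logic.

```python
# Sketch: a hypothetical new-vs-returning flag computed in the
# unified -> reporting transformation. Because it reads only the unified
# schema, it is written once, regardless of which platform supplied the orders.

def flag_new_customers(unified_orders: list[dict]) -> list[dict]:
    # Find each customer's earliest order date.
    first_order_dates: dict[str, str] = {}
    for o in sorted(unified_orders, key=lambda o: o["ordered_at"]):
        first_order_dates.setdefault(o["customer_id"], o["ordered_at"])
    # Emit reporting rows with the derived flag attached.
    return [
        {**o, "is_first_order": o["ordered_at"] == first_order_dates[o["customer_id"]]}
        for o in unified_orders
    ]

rows = flag_new_customers([
    {"customer_id": "c1", "ordered_at": "2023-01-05"},
    {"customer_id": "c1", "ordered_at": "2023-03-10"},
])
```

If the business definition of "new customer" changed, only this reporting-layer rule would change; the extractor and unified layers are untouched.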

Data reporting platform (DRP)

DRP is the original reporting data model; it uses the concept of data marts even though the tables are stored in a single schema. This allows us to build the visualization layer for specific user groups, ensuring that users can build reports themselves while reducing the likelihood of incorrect results.

Data marts (DM)

Data Marts are the new design approach that Daasity is using to build new reporting capabilities and update existing ones, leveraging the functionality of our data orchestration engine. This new design separates each reporting area into its own schema, enabling the data models to run more independently.
Similar to our original drp schema, the data mart structure enables us to build the visualization layer on top of each data mart to address specific questions related to that area.
More information on our Reporting and Data Mart Schemas is available starting with our Data Marts Models page.