In this section we explain how the i-refactory orchestrates and executes your data logistics, covering dataset deliveries and OLTP integration.
The i-refactory orchestrates and executes a critical part of your data logistics. We designed our process with the characteristics of physical deliveries in mind. For this reason we identify the following main building blocks to manage your data logistics:
{list}

- A Delivery Agreement is an agreement for data exchange between a Tenant and the i-refactory data platform, targeted to or from one Layer.
- A Tenant is a party for a dataset or data record as agreed in the Delivery Agreement.
- A Delivery is a container that holds the specification of what is actually exchanged to or from the i-refactory platform.
- A Layer holds the logical model relevant for the Delivery (Agreement). In practice the layer specifies whether you deliver datasets (the layer is identified as the Logical Validation Layer), opt for OLTP integration (the layer is identified as the Generic Data Access Model) or consume data in a physical file (the layer is identified as File Export).
The i-refactory orchestrates the data logistics for three different scenarios:
The delivery of datasets is targeted towards i-refactory's Logical Validation Layer. A delivery of datasets is a combination of:

- inclusion of all or some entities related to the Logical Data Model, also known as a Complete or Partial Delivery. A Complete Delivery includes all entities of the corresponding Logical Data Model; a Partial Delivery includes only some of them.
- an indication per entity whether it represents a Full or Delta snapshot.

A Full snapshot of an entity includes all records known at the snapshot datetime. All inserts, updates and logical deletes can be determined automatically by comparing the currently available facts with the content of the Delivery.

A Delta snapshot of an entity includes only the changes since the previously delivered facts. It must therefore include and explicitly indicate deleted records, while new and changed records can be identified automatically.
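To make the distinction concrete, the sketch below shows, in plain Python and purely as an illustration (not i-refactory internals), how changes can be derived from a Full snapshot by comparison, and why a Delta snapshot must mark its own deletes. The key and record shapes are assumptions for the example only.

```python
# Illustrative sketch only, not i-refactory internals: deriving inserts,
# updates and logical deletes from a Full snapshot by comparing it with the
# currently known facts. Keys and record shapes are assumptions.

def diff_full_snapshot(current_facts: dict, snapshot: dict) -> dict:
    """Both arguments map a business key to the record's attributes."""
    inserts = {k: v for k, v in snapshot.items() if k not in current_facts}
    updates = {k: v for k, v in snapshot.items()
               if k in current_facts and current_facts[k] != v}
    # Records missing from a Full snapshot are implicitly (logically) deleted.
    deletes = [k for k in current_facts if k not in snapshot]
    return {"insert": inserts, "update": updates, "delete": deletes}

current = {"C1": {"name": "Acme"}, "C2": {"name": "Globex"}}
full = {"C1": {"name": "Acme Corp"}, "C3": {"name": "Initech"}}
print(diff_full_snapshot(current, full))
# C3 is an insert, C1 an update, C2 a delete.
# A Delta delivery cannot rely on this comparison for deletes and must
# therefore mark deleted records explicitly.
```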
With the options as described above a Tenant can:
Context Based Delivery
Context Based Delivery is a special type of delivery that allows Full datasets within a given context to be processed by i-refactory. The main advantage is that the implicit delete detection (or physical delete process) works within the context.
Example: assume a Logical Data Model that holds the following entities: a Sales Organization related to N Sales Orders, each related to N Sales Order Lines. Five regional Sales Organizations individually provide their sales order details to this Logical Model. If Sales Organization is treated as the context, one Sales Organization can deliver all of its Sales Orders and related Sales Order Lines. Sales Orders and Sales Order Lines are then treated as Full within that context, and implicit delete detection is performed automatically within that context.
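A minimal sketch of how delete detection can be scoped to a context, assuming a simple list-of-records representation; the entity and field names are illustrative only.

```python
# Illustrative sketch: implicit delete detection scoped to one context
# (one Sales Organization). Field names are assumptions for the example.

def implicit_deletes_in_context(current_facts: list[dict],
                                delivered: list[dict],
                                context_value: str) -> list[dict]:
    delivered_keys = {r["order_id"] for r in delivered
                      if r["sales_org"] == context_value}
    # Only facts that belong to the delivered context are candidates for
    # deletion; orders of the other Sales Organizations are left untouched.
    return [r for r in current_facts
            if r["sales_org"] == context_value
            and r["order_id"] not in delivered_keys]

current = [
    {"sales_org": "EMEA", "order_id": 1},
    {"sales_org": "EMEA", "order_id": 2},
    {"sales_org": "APAC", "order_id": 3},
]
delivery = [{"sales_org": "EMEA", "order_id": 1}]
print(implicit_deletes_in_context(current, delivery, "EMEA"))
# Order 2 is deleted within the EMEA context; the APAC order is untouched
# because it lies outside the delivered context.
```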
The architecture layer File Export is used to support file-based data consumption.
For this architecture layer you can define Delivery Agreements and deliveries to load data into parquet files on your local filesystem or on your Azure Storage Account.
The event that triggers the outbound delivery is based upon an external process that uses our delivery-API.
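As a non-authoritative illustration of such an external process, the sketch below posts a request to a REST-style endpoint. The host, endpoint path and payload field names are hypothetical; consult the i-refactory delivery-API documentation for the actual contract.

```python
# Hypothetical sketch of an external process triggering an outbound
# (File Export) delivery via a REST call. Endpoint and field names are
# assumptions, not the actual i-refactory delivery-API.
import requests

payload = {
    "deliveryAgreement": "SALES_EXPORT",         # hypothetical agreement code
    "snapshotDatetime": "2024-01-31T00:00:00Z",  # hypothetical parameter name
}
response = requests.post("https://irefactory.example.com/api/deliveries",
                         json=payload, timeout=30)
response.raise_for_status()
print(response.json())
```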
Outbound deliveries can be configured to balance system resources in the same manner as inbound deliveries, and they are monitored on the same overview as inbound deliveries.
You need to specify the following parameters in our API to trigger an outbound delivery:
OLTP integration is targeted towards i-refactory's Generic Data Access Layer. This means an OLTP application can Create, Read, Update and Delete facts through the Generic Data Access Layer. From a delivery perspective this means you activate deliveries to a Data Access Model, which results in Create, Read, Update and Delete actions orchestrated by the i-refactory solution.
The default pattern supported is a record-based process. Applications that deliver sets of records to be processed require additional configuration. Specific Business Rules or Controls, other than primary key and relationship constraints, can be included as part of the execution process as so-called Data Application Programming Interface (DAPI) controls. CRUD transactions operate on the current time perspective of the data in a relational structure.
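As a rough illustration only (this assumes nothing about the actual DAPI implementation), a record-based create could run configured business-rule controls in addition to a primary key constraint; all function and rule names below are made up.

```python
# Rough illustration: a record-based create that runs configured
# business-rule controls (in the spirit of DAPI controls) on top of a
# primary key constraint. All names are hypothetical.

def create_record(record: dict, existing_keys: set, controls: list) -> None:
    if record["id"] in existing_keys:
        raise ValueError("primary key violation")
    for control in controls:          # business-rule controls configured per entity
        control(record)               # raises if the record violates the rule
    existing_keys.add(record["id"])   # record accepted in the current time perspective

def amount_not_negative(record: dict) -> None:
    if record["amount"] < 0:
        raise ValueError("amount must not be negative")

create_record({"id": 1, "amount": 250}, set(), [amount_not_negative])
```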
To guarantee temporal consistency we use a timestamp, part of the Delivery, for the (technical) timelines throughout the relevant layers. The snapshot datetime of the source system, or the datetime chosen by the user, determines the timestamp used. Per entity a 'high water mark' is defined: the latest (most recent) timestamp processed for that entity. The logistical process will not accept a delivery set to a timestamp older than the high water mark.
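A minimal sketch of the high water mark rule, assuming an in-memory map per entity; the actual bookkeeping in i-refactory is of course different.

```python
# Minimal sketch of the high water mark rule: per entity, a delivery with a
# timestamp older than the most recently processed timestamp is rejected.
from datetime import datetime

high_water_mark: dict[str, datetime] = {}   # latest processed timestamp per entity

def accept_delivery(entity: str, snapshot_ts: datetime) -> None:
    mark = high_water_mark.get(entity)
    if mark is not None and snapshot_ts < mark:
        raise ValueError(f"{entity}: {snapshot_ts} is older than high water mark {mark}")
    high_water_mark[entity] = max(mark or snapshot_ts, snapshot_ts)

accept_delivery("Customer", datetime(2024, 1, 31))
accept_delivery("Customer", datetime(2024, 1, 15))   # raises ValueError
```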
The logistic process fully manages the Traveling Now to ensure a GDAL view always delivers a time-consistent view of the registered facts. This also enables processing facts in the CFPL in parallel with time-consistent consumption of data through the GDAL, which avoids the typical pattern of separating loading into night batches from daily data consumption, assuming the database is technically capable of orchestrating the workloads.
The i-refactory solution allows the user to influence the date used as the timestamp by defining the timestamp as part of the Delivery. This provides an option to migrate historical timelines to your i-refactory managed data platform by providing data per timeslice in chronological order. Through this method you can recreate your historical data perspectives with the existing data warehouse as a source system.
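The sketch below illustrates the idea of replaying history per timeslice in chronological order; load_slice and deliver are stand-ins for your own extraction and delivery steps, not i-refactory functions.

```python
# Illustration only: replaying historical timelines by delivering one
# timeslice at a time, in chronological order, with the slice date as the
# user-chosen delivery timestamp. load_slice and deliver are stand-ins.

def load_slice(snapshot_date: str) -> list[dict]:
    # Extract the facts valid at snapshot_date from the existing data warehouse.
    return [{"customer": "C1", "status": f"as of {snapshot_date}"}]

def deliver(records: list[dict], snapshot_datetime: str) -> None:
    print(f"delivering {len(records)} records with timestamp {snapshot_datetime}")

timeslices = ["2019-12-31", "2020-12-31", "2021-12-31"]   # oldest first

for snapshot_date in sorted(timeslices):      # chronological order keeps the
    deliver(load_slice(snapshot_date), snapshot_date)   # high water mark moving forward
```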
When performing a delivery you can opt for:
{info} In the case of physical deletes this means that if you want to remove records because of a GDPR (AVG) request, for example records containing a customer name or customer phone number, a physical delete will make sure that this sensitive information is deleted without compromising the referential integrity of the database.