A Data Transfer Process (DTP) determines how data is transferred between two persistent objects within BI.
As of SAP NetWeaver 7.0, an InfoPackage loads data from a source system only as far as the PSA; it is the DTP that controls all further loading of the data from there.
Use
- Loading data from the PSA to one or more InfoProviders.
- Transferring data from one InfoProvider to another within BI.
- Distributing data to a target outside the BI system, e.g. an Open Hub destination.
When data is transferred within BI, the Transformation defines the mapping and the logic for updating the data targets, whereas the extraction mode and update mode are determined by the DTP.
NOTE:
A DTP loads data within the BI system only, except in Virtual InfoProvider scenarios, where a DTP can be used to fetch data directly from the source system at runtime.
Key Benefits of using a DTP over conventional InfoPackage loading
- A DTP follows a one-to-one mechanism between a source and a target, i.e. one DTP feeds data to exactly one data target, whereas an InfoPackage loads data to all data targets at once. This is one of the major advantages over the InfoPackage method and is what makes many of the other benefits possible.
- Loading data from the source into BI (the PSA) is isolated from loading within the BI system. This allows data loads to InfoProviders to be scheduled at any time after the data has been loaded from the source.
- Better error handling through the temporary storage area, semantic keys and the error stack.
Extraction
There are two extraction modes for a DTP: Full and Delta.
Full:
Extraction mode Full works the same way as in an InfoPackage: it selects all data available in the source that matches the filter conditions defined in the DTP.
When the source of the data is one of the following objects, only extraction mode Full is available:
- InfoObjects
- InfoSets
- DataStore objects for direct update
Delta is not possible when the source is any of the above.
Delta:
Unlike an InfoPackage, delta transfer using a DTP does not require an explicit initialization. When a DTP is executed in extraction mode Delta for the first time, all requests existing in the source up to that point are retrieved and the delta is initialized automatically.
The following three options are available for a DTP with extraction mode Delta:
- Only Get Delta Once
- Get All New Data Request By Request
- Retrieve Until No More New Data
I. Only Get Delta Once:
If this indicator is set, a snapshot scenario is built: the data available in the target is an exact replica of the source data.
Scenario:
Consider a scenario in which data is transferred from a flat file to an InfoCube, and the target should contain only the data from the latest flat file load. Each time a new request is loaded, the previous request has to be deleted from the target: for every new data load, any previous request loaded with the same selection criteria must be removed from the InfoCube automatically. This is necessary whenever the source delivers only the latest status of the key figures, i.e. a snapshot of the source data.
Solution – Only Get Delta Once
A DTP with a full load would satisfy the requirement. However, a full DTP is not recommended, because it loads all requests from the PSA regardless of whether they were loaded previously or not. To avoid duplicating data, the PSA would have to be emptied every time before the full DTP is triggered again.
'Only Get Delta Once' does this job much more efficiently, as it loads only the latest request (the delta) from the PSA to the data target. The loading process then consists of three steps:
- Delete the previous request from the data target.
- Load data up to the PSA using a full InfoPackage.
- Execute the DTP in extraction mode Delta with 'Only Get Delta Once' checked.
These three steps can be incorporated into a process chain, which avoids any manual intervention.
II. Get All New Data Request By Request:
If you set this indicator in combination with 'Retrieve Until No More New Data', the DTP gets the data of one request at a time from the source. When it finishes processing, the DTP checks whether the source contains any further new requests; if it does, a new DTP request is generated and processed automatically.
NOTE:
If 'Retrieve Until No More New Data' is unchecked, the above option automatically changes to 'Get One Request Only', which retrieves only a single request from the source.
Also, once the DTP is activated, the option 'Retrieve Until No More New Data' no longer appears in the DTP maintenance.
Package Size
This setting determines the number of data records contained in one individual data package. The default value is 50,000. For example, with the default setting a selection of 120,000 records is split into three data packages (50,000 + 50,000 + 20,000).
Filter
The following options are available to restrict a single value or a range of values:
- Multiple selections
- OLAP variable
- ABAP routine (see the sketch below)
A tick mark to the right of the Filter button indicates that filter selections exist for the DTP.
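For the ABAP routine option, the system generates a FORM routine per filter field and only the routine body is written manually. The following is a minimal sketch of such a body for an assumed date field 0CALDAY, assuming the classic generated interface with the range table l_t_range and the return code p_subrc (the exact generated FORM signature and field name vary by release and DataSource); it restricts the selection to the current calendar month:

* Sketch of a DTP filter routine body for the field CALDAY (illustrative only;
* l_t_range and p_subrc are part of the FORM interface generated by the system).
  data: l_idx       like sy-tabix,
        l_first_day like sy-datum.

* First day of the current month
  l_first_day      = sy-datum.
  l_first_day+6(2) = '01'.

* Locate an existing range line for the filter field, if there is one
  read table l_t_range with key fieldname = 'CALDAY'.
  l_idx = sy-tabix.

* Restrict the selection to the current month
  l_t_range-fieldname = 'CALDAY'.
  l_t_range-sign      = 'I'.
  l_t_range-option    = 'BT'.
  l_t_range-low       = l_first_day.
  l_t_range-high      = sy-datum.

  if l_idx <> 0.
    modify l_t_range index l_idx.
  else.
    append l_t_range.
  endif.

* 0 = selection built successfully; any other value terminates the DTP request
  p_subrc = 0.

In an actual routine, the surrounding FORM statement and the structure of l_t_range are supplied by the generated template, so only the logic between the generated comment markers needs to be adapted.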
Semantic Groups
Choose Semantic Groups to specify how the data packages read from the source (DataSource or InfoProvider) are built. To do this, define key fields: data records that have the same key are combined into a single data package.
This setting is only relevant for DataStore objects with data fields that are overwritten. It also defines the key fields for the error stack; by defining the key of the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
A tick mark to the right of the Semantic Groups button indicates that semantic keys exist for the DTP.
Error Handling
- Deactivated:
If an error occurs, the error is reported at the package level and not at the data record level.
The incorrect records are not written to the error stack, since the request is terminated and has to be updated again in its entirety.
This results in faster processing.
- No Update, No Reporting:
If errors occur, the system terminates the update of the entire data package. The request is not released for reporting. The incorrect record is highlighted so that the error can be assigned to the data record.
The incorrect records are not written to the error stack, since the request is terminated and has to be updated again in its entirety.
- Valid Records Update, No Reporting (Request Red):
This option allows you to update valid data. This data is only released for reporting after the administrator checks the incorrect records that were not updated and manually releases the request (by a QM action, that is, by setting the overall status on the Status tab page in the monitor).
The incorrect records are written to a separate error stack, where they can be edited and updated manually using an error DTP.
- Valid Records Update, Reporting Possible (Request Green):
Valid records can be reported immediately. Automatic follow-up actions, such as adjusting the aggregates, are also carried out.
The incorrect records are written to a separate error stack, where they can be edited and updated manually using an error DTP.
Error DTP
Erroneous records from a DTP load are written to a stack called the error stack.
The error stack is a request-based table (a PSA table) into which erroneous data records from a data transfer process (DTP) are written. It is based on the source of the DTP (PSA, DSO or InfoCube); that is, records from the source are written to the error stack.
To load the data into the data target, the records in the error stack have to be corrected and the error DTP run manually.
Processing Mode
Serial Extraction, Immediate Parallel Processing:
A request is processed in a background process when the DTP is started in a process chain or manually.
Serial in dialog process (for debugging):
A request is processed in a dialog process when it is started in debug mode from the DTP maintenance.
This mode is ideal for simulating the DTP execution in debugging mode. When this mode is selected, we have the option to activate or deactivate session breakpoints at various stages, such as extraction, data filtering, error handling, transformation and data target update.
You
cannot start requests for real-time data acquisition in debug mode.
Debugging Tip:
When you want to debug a DTP, you cannot set a session breakpoint in the editor in which you write the ABAP code (e.g. the DTP filter routine). Instead, you need to set the session breakpoint(s) in the generated program of the DTP.
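As an alternative (a general ABAP technique, not specific to the steps described here), a hard-coded breakpoint can be placed directly in the routine source; it is compiled into the generated program and stops execution there when the DTP runs in dialog/debug mode:

* Illustrative hard-coded breakpoint inside a DTP routine body.
* BREAK-POINT stops in the ABAP debugger only when the program runs in dialog;
* in background processing it merely writes a log entry.
  break-point.
* User-specific variant (MYUSER is a placeholder), which stops only for that user:
* break myuser.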
No data transfer; delta status in source: fetched:
This processing mode is available only when the DTP is operated in delta mode. It is similar to a delta initialization without data transfer in an InfoPackage.
In this mode, the DTP executes directly in dialog. The generated request marks the data found in the source as fetched, but no data is actually loaded to the target.
This mode can be chosen even if the data has already been transferred previously using the DTP.
Delta DTP on a DSO
- Active Table (with Archive)
The data is read from the active table of the DSO and from the archived data.
- Active Table (Without Archive)
The data is only read from the active table of a DSO. If there is data in the archive or in near-line storage at the time of extraction, this data is not extracted.
- Archive (Full Extraction Only)
The data is only read from the archive data store. Data is not extracted from the active table.
- Change Log
The data is read from the change log and not the active table of the DSO.