TRANSCRIPT
8/10/2019 LO Datasources
LO DataSources
The Logistics Cockpit (LO Cockpit) is a technique for extracting logistics transaction data from R/3.
All the DataSources belonging to logistics can be found in the LO Cockpit (Transaction LBWE)
grouped by their respective application areas.
The DataSources for logistics are delivered by SAP as part of its standard business content in
the SAP ECC 6.0 system and have the following naming convention. A logistics transaction
DataSource is named 2LIS_<Application>_<Event><Suffix>, where:
Every LO DataSource starts with 2LIS.
Application is a two-digit number that specifies the application relating to a set of events in a process, e.g. application 11 refers to SD sales.
Event specifies the transaction that provides the data for the application specified, and is
optional in the naming convention, e.g. event VA refers to creating, changing or deleting sales
orders (Verkaufsauftrag is German for sales order).
Suffix specifies the level of detail that is extracted. For e.g. ITM refers to item
data, HDR refers to header data, and SCL refers to schedule lines.
Upon activation of the business content DataSources, all components such as the extract structure
and the extractor program also get activated in the system.
The extract structure can be customized at a later point in time to meet specific reporting
requirements, and the necessary user exits can also be used for the same purpose.
A generated extract structure has the naming convention MC<Application><Event>0<Suffix>,
where the suffix is optional. Thus, e.g., 2LIS_11_VAITM, sales order item, has the
extract structure MC11VA0ITM.
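As an illustration of the two naming conventions above, here is a small Python sketch (not SAP code; the real names are generated by transaction LBWE). The regular expression simply encodes the 2LIS_<Application>_<Event><Suffix> pattern described above, assuming the optional event part is present:

```python
import re

def extract_structure_name(datasource: str) -> str:
    """Derive the extract structure name MC<App><Event>0<Suffix>
    from an LO DataSource name 2LIS_<App>_<Event><Suffix>.
    Illustrative only; assumes the two-letter event is present."""
    m = re.fullmatch(r"2LIS_(\d{2})_([A-Z]{2})([A-Z]*)", datasource)
    if m is None:
        raise ValueError(f"not an LO DataSource name: {datasource}")
    app, event, suffix = m.groups()
    return f"MC{app}{event}0{suffix}"

print(extract_structure_name("2LIS_11_VAITM"))  # MC11VA0ITM
```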
Delta Initialization: LO DataSources use the concept of setup tables to carry out the initial data extraction
process.
The restructuring/setup tables prevent the BI extractors from directly accessing the
frequently updated, large logistics application tables; they are used only for the initialization of data to
BI. To load data into the BI system for the first time, the setup tables have to be filled.
Delta Extraction: Once the initialization of the logistics transaction data DataSource is successfully carried
out, all subsequent new and changed records are extracted to the BI system using the
delta mechanism supported by the DataSource.
The LO DataSources support ABR delta mechanism which is both DSO and InfoCube
compatible. The ABR delta creates delta with after, before and reverse images that are updated
directly to the delta queue, which gets automatically generated after successful delta initialization.
The after image provides the status after the change, the before image gives the status before the
change with a minus sign, and the reverse image sends the record with a minus sign for deleted records.
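A minimal Python sketch (illustrative, not SAP code) of how these images combine additively. The record-mode letters 'X', ' ' and 'R' follow the 0RECORDMODE convention for before, after and reverse images; the quantities are invented for the example:

```python
def change_images(old_qty, new_qty):
    """ABR images for a changed record: the before image ('X') carries
    the old value with a minus sign, the after image (' ') the new value."""
    return [("X", -old_qty), (" ", new_qty)]

def delete_image(last_qty):
    """Reverse image ('R'): the deleted record with a minus sign."""
    return [("R", -last_qty)]

# An item is posted with quantity 10 (captured by the init load),
# later changed to 15, and finally deleted.
delta = change_images(10, 15) + delete_image(15)
net = sum(qty for _, qty in delta)
print(net)  # -10: together with the initial +10, the cube total is 0
```

Because the images sum to the net change, they can be added straight into an InfoCube, which is why the ABR delta is both DSO and InfoCube compatible.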
The type of delta provided by the LO DataSources is a push delta, i.e. the delta data
records from the respective application are pushed to the delta queue before they are extracted to
BI as part of the delta update. Whether a delta is generated for a document change is
determined by the LO application. This is a very important aspect of the logistics DataSources, as the
very program that updates the application tables for a transaction also triggers/pushes the data for
the information systems, by means of an update type, which can be a V1 or a V2 update.
The delta queue for an LO DataSource is automatically generated after successful
initialization and can be viewed in transaction RSA7, or in transaction SMQ1 under the name
MCEX<Application>.
Update Methods
The following three update methods are available:
1. Synchronous update (V1 update)
2. Asynchronous update (V2 update)
3. Collective update (V3 update)
Synchronous update (V1 update)
The statistics update is carried out at the same time as the document update in the
application table; that is, whenever we create a transaction in R/3, the entries get into the R/3
tables, and this takes place in the V1 update.
Asynchronous update (V2 update)
The document update and the statistics update take place in different tasks. The V2 update starts a few seconds after the V1 update, and in this update the values get into the statistical tables, from which we
do the extraction into BW.
V1 and V2 updates do not require any scheduling activity.
Collective update (V3 update)
The V3 update uses delta queue technology and is similar to the V2 update. The main difference
is that V2 updates are always triggered by the application, while a V3 update may be scheduled
independently.
Update Modes
1. Direct Delta
2. Queued Delta
3. Unserialized V3 Update
Direct Delta
With this update mode, extraction data is transferred directly to the BW delta queues with each document posting.
Each document posted with delta extraction is converted to exactly one LUW in the
related BW delta queues.
In this update mode there is no need to schedule a job at regular intervals (through LBWE job
control) in order to transfer the data to the BW delta queues. Thus, additional monitoring of
update data or of the extraction queue is not required.
This update method is recommended only for customers with a low occurrence of
documents (a maximum of 10,000 document changes - creating, changing or deleting - between
two delta extractions) for the relevant application.
Queued Delta
With the queued delta update mode, the extraction data is written to an extraction queue and
then transferred to the BW delta queues by an extraction collective run.
If we use this method, it is necessary to schedule a job to regularly transfer the data
to the BW delta queues, i.e. the extraction collective run.
SAP recommends scheduling this job hourly during normal operation after successful
delta initialization, but there is no fixed rule: it depends on the peculiarities of each specific
situation (business volume, reporting needs and so on).
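The queued delta flow described above can be sketched as a toy simulation in Python (hypothetical documents and names; the real structures are the MCEX extraction queue and the RSA7 delta queue):

```python
from collections import deque

# Toy simulation of queued delta: document postings write to an
# extraction queue; a scheduled collective run drains it into the
# BW delta queue, from which BW later pulls the delta.

extraction_queue = deque()   # filled synchronously with each posting
delta_queue = deque()        # read by the BW delta request

def post_document(doc):
    """Written to the extraction queue at document posting time."""
    extraction_queue.append(doc)

def collective_run():
    """The job scheduled via LBWE job control, e.g. hourly."""
    while extraction_queue:
        delta_queue.append(extraction_queue.popleft())

post_document({"order": 4711, "qty": 10})
post_document({"order": 4712, "qty": 3})
collective_run()
print(len(delta_queue))  # 2
```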
Unserialized V3 Update
With the unserialized V3 update, the extraction data is written to an update table and
then transferred to the BW delta queues by the V3 collective run.
Setup Table
A setup table is a cluster table that is used to extract data from the R/3 tables of the same
application.
The purpose of the setup tables is to store the data before it is updated to the target system.
Once the setup tables are filled with data, there is no need to go to the application tables again and
again, which in turn improves system performance.
The LO extractor takes data from the setup tables during initialization and full upload.
As the setup tables are required only for full and init loads, we can delete their data after
loading in order to avoid duplicate data.
We fill the setup tables in LO by using OLI*BW, or by going to SBIW → Settings for Application-Specific DataSources → Logistics → Managing Extract Structures → Initialization → Filling in the Setup Tables → Application-Specific Setup of Statistical Data.