Introduction
In this blog I would like to present a new functionality to view and restore data of the queues for the extraction of logistics data to BI. The functionality has been made available via note 1008250, as well as via support packages.
From a high-level point of view you get two new things with the note:
1. A new table which stores the data of the queues for the logistics BI extraction.
2. A new transaction (LBWR) which can be used to either view the data of this table or to rebuild queue data from it.
So why should you care about it? Well, you should if you ever wanted to
- view data sitting in one of the MCEX queues or
- restore data which has been lost for your BI without doing a new init/setup.
For example, you are enhancing your extractor with new fields, but for some unknown reason the fields' contents get lost during the delta process. Now you have one more point in the processing chain where you can easily check the data, without debugging.
Another example would be data loss due to RFC problems in one of your queues: With this functionality you can rebuild the missing LUWs without doing a setup of data (init).
For a detailed technical documentation you might want to refer to the report documentation of report RMBWV3RE. In future releases this documentation will also be available via SAP Reference
IMG.
Guessing what might become frequently asked questions, I am trying to answer some here:
E. Scroll down to the Customizing block of the screen. Here you will see two parameters:
- No.Coll Processing and
- No. of Days with Backup Data
Check the F1 help of those fields for the exact definition. For a first test (preferably in a non-productive system ;-)) you could just enter 0 for the first and 2 for the second
parameter.
F. Now press the Execute button (F8) and acknowledge the info popup.
G. Congratulations! You are done. From now on the backup table will store all the data of the queue for the specified amount of time (with the second parameter set to 2, the data will be kept for two days).
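To illustrate the retention behaviour just described, here is a minimal sketch in Python (a hypothetical simplification: the real logic lives in report RMBWV3RE, and I am assuming here that the two limits act as alternative retention windows; check the F1 help of the fields for the exact semantics):

```python
from datetime import datetime, timedelta

def keep_entry(entry_time, entry_run, now, latest_run,
               keep_runs=0, keep_days=2):
    """Decide whether a backup entry is still retained.

    keep_runs -- 'No.Coll Processing': number of collective runs to keep
    keep_days -- 'No. of Days with Backup Data': number of days to keep
    The entry survives if it falls within either window
    (assumed semantics; see the F1 help of the fields).
    """
    within_runs = (latest_run - entry_run) < keep_runs
    within_days = (now - entry_time) <= timedelta(days=keep_days)
    return within_runs or within_days
```

With the suggested test values (0 and 2), only the day window applies: an entry from yesterday is kept, while one from last week is not.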
http://weblogs.sdn.sap.com/pub/wlg/5316 11/26/2010
SAP Network Blog: A Safety Belt for Logistics Extraction Queues Page 2 of 7
4. Can I disrupt my statistics with rebuilding queue data? What precautions should I take?
Yes, you can absolutely cause harm to your statistics in BI if you are not careful. Specifically, you will cause doubled (tripled, ...) figures if you do not make sure that the data you plan to rebuild does not already exist in any data target subsequent to the extraction queue.
So you should always check RSA7 (the BI delta queue), the PSA and all subsequent data targets in BI for the data you plan to rebuild. If corrupted data already exists in subsequent targets, you first have to delete that data before you can rebuild it.
Another issue can occur if you have a newer change of a document in BI and are trying to rebuild and extract an older one. In this case you likely have a serialization problem in one of the BI data targets. This can be solved by deleting all data related to the document before rebuilding all of its changes.
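The precaution above boils down to a set check: only rebuild LUWs whose documents are absent from every target downstream of the extraction queue. A minimal sketch with hypothetical document IDs (in practice the check is done manually in RSA7, the PSA and the BI data targets):

```python
def safe_to_rebuild(candidates, downstream_targets):
    """Return the document IDs that can be rebuilt without creating
    duplicate figures: those present in none of the downstream
    targets (delta queue, PSA, DSOs, cubes, ...)."""
    already_loaded = set()
    for target in downstream_targets:
        already_loaded |= set(target)
    return set(candidates) - already_loaded
```

Documents that show up in `already_loaded` must first be deleted from the affected targets, exactly as described above.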
You see, rebuilding queue data via LBWR places a great deal of responsibility on the user. This also means the authorizations for this action (authorization object M_QU_RE with activity '01') should be limited to a small group of data extraction experts. Authorizations for viewing backup/queue data can be granted less restrictively, although the importance/secrecy of the specific queue data should be taken into account, too.
5. What is the potential performance/DB size impact of using this in our productive system?
It depends. ;-)
Well, just after creating the ordinary queue entry, the new functionality will basically perform two potentially time-consuming operations:
A. It will draw a stamp (counter) from the enqueue server. In our tests this has always been a non-issue, but if you already have problems with the enqueue server in your productive system this might need some attention.
B. It will write the complete data needed to reconstruct the queue entry to the new cluster table MCEX_DELTA_BACK (via EXPORT TO DATABASE). The performance of this operation
mainly depends on the size of that table. Fortunately you can limit the size via the customizing (see question number 1 above).
Now, to start with the backup table I would recommend keeping just the data of one collective run (parameter "No.Coll Processing") as a first test in your productive system. If that works just fine you can always increase one or both of the two parameters. There is no special procedure required for changing the parameters, which reside in table TMCEXUPD (although make sure you are not unintentionally transporting a change of the update mode along with the parameter changes ;-)).
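The two write-time operations can be pictured with a toy model (a hypothetical Python sketch; the real implementation draws the stamp from the enqueue server and writes via EXPORT TO DATABASE into cluster table MCEX_DELTA_BACK):

```python
import itertools
import pickle

class BackupWriter:
    """Toy model of the backup write path."""
    def __init__(self):
        self._counter = itertools.count(1)  # stand-in for the enqueue stamp
        self._store = {}                    # stand-in for MCEX_DELTA_BACK

    def backup(self, queue_entry):
        stamp = next(self._counter)         # operation A: draw a stamp
        # operation B: serialize and persist the complete queue entry;
        # this is the part whose cost grows with the size of the table
        self._store[stamp] = pickle.dumps(queue_entry)
        return stamp

    def restore(self, stamp):
        """Rebuild a queue entry from its backup (roughly what LBWR does)."""
        return pickle.loads(self._store[stamp])
```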
If you have more questions or comments please feel free to post them here.
Hi,
I tried the backup customizing with applications 02 and 13 with immediate success; that is, after the next collective run and a new posting, the backup table content could be seen.
In application 45, which is delta-queued in the same way and was customized with the same parameters, no effect is seen (table TMCEXUPD was checked, and the delta was updated in BW).
Do you have an idea what problem this could be ?
Another question: is there any possibility to get backups in a similar way in application 03 if the update mode is set to unserialized V3 update?
Hi,
this functionality has been implemented for application 45 as well. I just checked and the required coding is present (function module MCEX_UPDATE_45_V1, search for "mcex_delta_back").
So if it does not work for that application only, we have to consider a bug. Please feel free to create a support message in that case, on component BW-BCT-ISR-AB.
Regarding your second question: no, that is currently neither possible nor intended. Doing this would require several modifications.
If you really want to use this functionality with application 03, you will have to use the update mode "Delta Queued" for it, I am afraid.
Best regards,
Bernd Sieger
- Problem with Application 45
2009-07-30 06:15:34 Udo Koepsell
Hi,
Best Regards
Udo Koepsell
- Problem with Application 45
2009-07-30 06:23:52 Bernd Sieger
That looks just like the reason for the issue. If you create a message with that hint, a correction should come up very quickly.
Hi,
I know we need to clear the delta queue before any relevant DDIC changes hit the system (transports, SPs, etc.). What about this delta backup table: do we need to clear the backup tables as well?
Vijay
Effect of DDIC changes to datasource
2009-01-15 01:44:02 Bernd Sieger
Vijay,
there is not really a technical reason to clear out the backup tables prior to importing relevant DDIC changes. The old data in the backup table will be invalidated (if the change is relevant), but that is no worse than deletion. You might receive dumps when trying to use LBWR to display the old data, but there should be no problem for new data being collected.
In general it would be the "cleaner" approach to delete the old data in the backup tables first, but if it does not fit your schedule for the upgrade/SP implementation, it is acceptable to omit this step, in my opinion.
Best regards,
Bernd
Hi, can we implement this note in our ECC 5.0 system, which has the SAP_APPL 500 component? Will it still bring the new functionality?
Thanks,
Venkat
APPL version for implementing note 1008250
2008-10-07 23:04:13 Bernd Sieger
Hi Venkat,
you will need at least PI 2004_1_500 on top of your SAP_APPL 500. If you have a lower PI version the coding might still work, but I cannot guarantee that (since I did not test it with lower versions).
Best regards,
Bernd Sieger
Hi Bernd,
Based on your clear-cut explanation we tested the LBWR transaction in QA, and it works very well. This development of yours will go a long way in helping us correct the corrupted data. Thanks to the excellent support received from you, we were able to overcome many issues in authorization, translation etc. Thanks a lot for the support you gave, even at very odd hours.
I have some basic questions on LO extraction (for example, why not create a table in the source system and extract the data with a generic extractor); can I contact you further on this blog? I did go through all the blogs of Roberto Negro and other documents, but I am not getting answers.
Excellent support and explanation
2007-08-15 04:50:28 Bernd Sieger
I would prefer to keep this blog very close to the topic. Feel free to send me other questions regarding LO extraction via e-mail, but please be aware that I will only be able to answer them as time permits. ;-)
For everyone: English translations for this new functionality are now available as attachment LANG.zip of the latest version (version 9) of note 1008250. Finally this got resolved!
Best regards,
Bernd Sieger
Thanks for the immediate response. I tried exactly what you suggested, but unfortunately it failed again. As you suggested, I am taking this up with SAP. Meanwhile, if you have any updates on this OSS note, please let me know.
Issues in application of OSS Note 1008250
2007-07-05 07:51:34 Anand Gupta
Hi Veerabhadra and Bernd, I also get exactly the same result. I have checked all corrections and the authorization. I am also raising a message for it, but wondered if you have already got it solved.
As suggested, I attempted to work with the blank value in the "Processing Mode" dropdown box, but after filling in "No.Coll Processing" and "No. of Days with Backup Data" the transaction is still failing. Any suggestions, please?
Issues in application of OSS Note 1008250
2007-06-26 00:32:32 Bernd Sieger
If you tried this and it did not work for you, please create a customer message on BW-BCT-LO-LIS. SAP cannot fully support you via SDN. ;-)
Best regards,
Bernd Sieger
Hi,
The blog is excellent and gave us the remedy for the issue we are facing of missing data in the LO cockpit extraction. With the help of Basis we applied OSS note 1008250 to our sandbox, which has the following specifications:
SAP_BASIS 620 0060 SAPKB62060
SAP_ABA 620 0060 SAPKA62060
SAP_APPL 470 0027 SAPKH47027 Logistics and Accg
SAP_HR 470 0010 SAPKE47010 Human Resources
EA-IPPE 200 0023 SAPKGPIB23 SAP_iPPE
PI 2004_1_470 0013 SAPKIPZI5D R/3 Plug-In (PI) 2004.1
PI_BASIS 2005_1_620 0010 SAPKIPYJ5A PI_BASIS 2005_1_620
ST-PI 2005_1_620 0005 SAPKITLQG5 SAP Solution Tools Plug-In
Hi,
yes, we have recently been made aware that the English translation is currently missing from the transports attached to the note. It seems this happened during the downport of the functionality.
We are currently trying to clarify with our translation team how to provide an English translation. This may take some more time; I will post an update here as soon as it has been fixed.
Regarding the "Processing Mode" box: there should be three possible values, even if you lack the descriptions in your logon language, namely ' ', 'D' and 'X'. Try selecting ' ' (blank) to change the customizing.
HTH,
Bernd Sieger
- WoW!!!!
2007-05-31 22:15:52 Shyamkumar k
Hi
That's great news, and a good blog that befits the wonderful functionality! What is the support package level? Have any performance or table sizing issues been reported?
Shyam
WoW!!!!
2007-05-31 23:21:48 Bernd Sieger
Hi Shyam,
you can find the support package entries in note 1008250: SAPKIPZI5F, SAPKIPZI6G and SAPKH60009 for PI 2004_1_470, 2004_1_500 and SAP_APPL 600 respectively.
Best regards,
Bernd Sieger
- Selective deletion
2007-05-17 06:28:05 Suresh Natarajan
Hi Bernd,
Is there a possibility to selectively delete data from the backup table?
In many instances we might have user exits that populate the enhanced fields based on business logic, and the values might not be updated for all transactional records. Selective deletion would give us the ability to load records from the backup table based on document types etc.
Thanks
Suresh
Selective deletion
2007-05-17 22:56:23 Bernd Sieger
Hi Suresh,
I am afraid that is not an option, since the backup table is a cluster table and does not contain application-specific fields as key fields.
So for a rebuild run you can select either via timestamp (a key field in the backup table) or via the number of the collective run (the timestamps of the collective runs are stored in a separate table).
Selecting via any application-specific field would mean reading all the records into an internal table and evaluating them, which would be very slow.
By design, deletion should not be an option anyway (except for records being deleted automatically when they are old), since a backup table should provide a full backup (for the time specified), not a partial one.
Best regards,
Bernd Sieger
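Bernd's point about key access can be illustrated with a toy model (a hypothetical Python sketch, not the actual layout of MCEX_DELTA_BACK): selecting by the timestamp key touches only the matching clusters, while filtering on an application field forces reading and unpacking every record.

```python
import bisect

class ClusterBackupTable:
    """Toy cluster table keyed by timestamp."""
    def __init__(self):
        self._keys = []   # sorted timestamp keys
        self._rows = {}   # timestamp -> list of opaque records

    def put(self, ts, records):
        if ts not in self._rows:
            bisect.insort(self._keys, ts)
            self._rows[ts] = []
        self._rows[ts].extend(records)

    def select_by_time(self, ts_from, ts_to):
        """Fast: binary search on the key, reads only matching clusters."""
        lo = bisect.bisect_left(self._keys, ts_from)
        hi = bisect.bisect_right(self._keys, ts_to)
        return [r for k in self._keys[lo:hi] for r in self._rows[k]]

    def select_by_field(self, predicate):
        """Slow: must unpack and evaluate every record in the table."""
        return [r for k in self._keys for r in self._rows[k] if predicate(r)]
```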
As a beginner in BW, I wonder how this wonderful technique can benefit BI 7.0. As far as I know, the reconstruction technique is not used in the current BW version, is it?
Just a beginner's question
2007-05-09 23:08:54 Bernd Sieger
Hi,
this functionality does not depend on your BI version, but rather on the SAP_APPL/PI version in your application system. It is available for PI 2004_1_470, 2004_1_500 and ECC 600.
So you can combine it with any BI/BW release which works with those application releases.
Best regards,
Bernd Sieger
- Thanks
2007-05-08 12:05:05 John Kurgan
Thanks Bernd,
This new enhancement should also reinforce the importance of modeling the BI DSO/ODS object fields to update with 'Overwrite'. Coupled with the new transaction, correcting data downstream in BW should be fairly straightforward now.
Hello Bernd
This is great. This will make the LBWQ queues a lot more reliable. Thanks a lot.
Do you have any good documentation that can help us understand qRFC queues and how to manage them? That would be a great help. This is the second delicate piece of technology supporting the integration of ECC and BI.
Pankaj Gupta
This long-awaited enhancement
2007-05-08 06:41:02 Bernd Sieger
I am not aware of any general documentation regarding the handling of qRFC queues. I guess this might be because the handling differs from one application to the next.
Hello Bernd
I would like to ask a totally different question. We used to have release information per plug-in, such as which new DataSources or extractors were coming with each plug-in release. I was able to access all that info via a URL that referenced the plug-in release notes.
Now that the plug-ins are all part of the main software, I have not found the proper place to look for the DataSource release notes. Suppose I want to find the new DataSources that came after 2004.1 by application component; I do not have a clue where to start.
If you can point me there, that would be great. If this is not an appropriate question, feel free to request its removal from the blog.
Pankaj Gupta
- Really useful!
2007-05-08 06:02:46 David Zhuwao
Hi Bernd,
This is a welcome development! A few months ago I had to reconstruct logistics data and was forced to run an init/setup. If this tool had been available then, it would have saved a couple of hours of downtime.
I was left wondering why SAP did not support such a feature. I'm happy that it now does.
Regards,
Joao.
Really useful!
2007-05-17 02:54:48 sridhar gande
Hi Bernd,
Thanks,
Sridhar