
BW - Technical


Copy Queries (RSZC) across InfoProviders which are slightly different

Copying queries across identical InfoProviders is done with transaction RSZC and is a well-discussed topic, covered on help.sap.com. Similarly, copying queries across InfoProviders which are not alike is a widely discussed topic too. Some of the solutions for this are as below:

Method 1: Setting a debug point, as in this wiki.
Method 2: Using ABAP, as in this article.
Method 3: Using a MultiProvider, as in this thread: Re: How to copy query elements without BEx.

I personally have not tried methods 1 and 2; those two are generic solutions as I see them. But if you have a small requirement where you just need to copy between two InfoProviders, then you can use the option below too (Method 4). Let me take an example. Source InfoProvider: YCOPY_SRC. Target InfoProvider: YCOPY_TGT. The only difference between YCOPY_SRC and YCOPY_TGT is one characteristic, say 0MATERIAL (0MATERIAL is present in YCOPY_SRC but not in YCOPY_TGT). Requirement: copy the query YCOPY_SRC_Q0001 (a query on YCOPY_SRC) to YCOPY_TGT. Solution: In your development system:

Summary of BI 7.0 performance improvements

I just want to give you an overview, not go into deep details or exact instructions.

I just want to give you some points from which you can start analyzing and tuning your system. You should try out all the tables, views and transactions yourself.

Topics covered: performance issues in summary; query performance analysis; cache monitor; ST03N; ST13; ST14; statistics; ST02; BW Administration Cockpit; optimizing performance of InfoProviders; ILM (Information Lifecycle Management); BWA; a query analyzing example; how Oracle 11g affects BW performance; general hints.

1. The common reasons for performance issues in summary

Causes for high DB runtimes of queries: no aggregates/BWA; DB statistics are missing; indexes not updated; the read mode of the query is not optimal; PSAPTEMP too small; DB parameters not optimal (memory and buffers); hardware (buffer, I/O, CPU, memory) not sufficient; no usage of the OLAP cache.

Causes for high OLAP runtimes: a high amount of transmitted cells because the read mode is not optimal; user exits in query execution; usage of big hierarchies.
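As a concrete starting point for query analysis: in BI 7.0 the query runtime statistics land in the table RSDDSTAT_OLAP (data manager times in RSDDSTAT_DM). A minimal ABAP sketch to pull recent records for inspection - the field selection is deliberately left generic, check the table definition in SE11 before filtering on specific columns:

* Read a sample of the BI 7.0 query statistics for inspection.
DATA: lt_stat TYPE STANDARD TABLE OF rsddstat_olap.

SELECT * FROM rsddstat_olap
         INTO TABLE lt_stat
         UP TO 50 ROWS.
* Inspect lt_stat in the debugger or write it out; transaction
* ST03N aggregates the same data per query.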

Why Attribute Change Run in SAP BI 7.0

Scenario: In SAP BI 7.0, we often require a subset of data to be accessed from an InfoCube. So, in this case, we need to create smaller sets of data from the larger InfoCube. For example: for FY2009, data is only required for "Material A" out of a set of 10 materials.

Introduction: Since the InfoCube is small, the required disk I/O volumes during query execution will be smaller too, and the query users will have improved query performance.

Attribute Change Run: A change run refers to activating master data changes to attributes and hierarchies, and realigning aggregates that contain navigational attributes or are defined on hierarchy levels. Fig 1.1 describes the changes to the availability of master data once a new record has been added.

Process Chains: Step-by-step analysis of the integration of attribute change runs in process chains. Fig 2.23 is a simple demonstration of the further simplicity introduced by process chains for the ACR.
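Outside of process chains, a change run can also be started manually via transaction RSATTR, or programmatically via the standard change run program. A hedged sketch (the program name is from the SAP standard delivery; verify its selection screen in your release before scheduling it in a job):

* Trigger the attribute change run for all changed InfoObjects.
* Run in the background for large master data volumes.
SUBMIT rsdds_aggregates_maintain AND RETURN.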

Performing the ACR.

How to create a new Factory Calendar

This blog describes the requirement to run process chains only on specific days of the week. For example, if you want to run the finance process chain only 6 days of the week and not on Saturday, you can achieve this by following this blog.

Step-by-Step Procedure:

Step 1) Create a new factory calendar that runs only on Monday, Tuesday, Wednesday, Thursday and Friday, and not on Saturday.

Step 2) Go to transaction SCAL in the R/3 system, select the Factory calendar radio button and click Change.
Step 3) Click the Create icon and name the new factory calendar.
Step 4) Go to the BW system, right-click on the source system and choose Transfer Global Settings.
Step 5) Follow the next step.
Step 6) The new factory calendar arrives in the BW system, and you can see it in the process chain's Start scheduling section.
Step 7) In the Start process type, click on Change Selections.

Why is Oracle writing out so many archivelogs?

Sometimes support messages have to go a rather long way before finally getting a solution.

The following problem came up in a support message recently, and since neither the analysis steps nor the solution were obvious, I decided to make a blog out of it. The problem statement was something like this: "Hello support, we're using SAP BI and recently re-initialized some of our DataSources. From then on, our Oracle database writes out archivelogs at a rate 4 times higher than before. What's causing this and how can we stop it?" Hmm... this does sound a bit odd! So the first step for this message was of course to verify the customer's observation.

Nifty tools at hand: A very nice way to do that is to use the scripts from the RSORASTT support script framework (see note #1299493, "Loadable SAP Support Monitors in DBA Cockpit", for details). With one of those scripts the following matrix was produced (excerpt): table /BIC/B0000356000, the PSA for 2LIS_41_S920 on P11-100. A hot lead - or not?

And so on.

How to enhance a CRM-specific DataSource (0CRM_SALES_ORDER_I)

There are a lot of topics on DataSource enhancement, but nearly all about R/3, generic DataSources, or tables/views. When I had to enhance the CRM order item DataSource, I had to dig for a while. I can explain the theory with the following diagram. We have enhanced the table CRMD_ORDERADM_I. First, enhance the extract structure in RSA6. Then go to BWA1 and add a line in Mapping. Now we need to implement it.

* Define an internal table for the extract structure.
* The type of the structure is visible in transaction RSA6,
* where the enhancement of the extract structure is also done.
* lt_documents_xif holds the XIF documents delivered by the
* callback function (see the BWA1 mapping); declare it with
* the matching XIF table type for the sales order.
DATA: lt_item          TYPE crmxif_bt_sales_item_t,
      ls_documents_xif LIKE LINE OF lt_documents_xif.

* Copy the data from the XIF documents to the internal table:
* collect the sales items of every document.
LOOP AT lt_documents_xif INTO ls_documents_xif.
  APPEND LINES OF ls_documents_xif-item TO lt_item.
ENDLOOP.

SAP BW Indexing Scheme (Oracle)

Definition: Indexes are pointers which speed up the data retrieval process significantly. In their absence, performance can take a severe beating through full table scans.

The Schema: BW uses two fact tables per InfoCube - the "F fact table" (request packages with IDs > 0) and the "E fact table" (consolidated request package with ID = 0).

The corresponding table name prefixes are /BI0/F, /BIC/F, /BI0/E and /BIC/E. InfoCubes (fact tables, dimension tables) are fully indexed and usually do not require additional indexes.

Standard InfoCubes

Primary index: It is pretty interesting that neither fact table has a unique index defined over its primary key. The secondary indexing scheme is similar for both types of fact tables (F and E). Exceptions to this are: a) line-item dimensions, which are indexed via a normal non-unique B-tree; b) real-time InfoCubes, discussed later.

How to restrict F4 help values for hierarchies in the BEx variable screen

I recently read some great articles about the new Enhancement Framework by Thomas Weiss.
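To check the indexing scheme of a concrete fact table yourself, the ABAP Dictionary index headers can be read from table DD12L (the fact table name below is just an example, using the standard content cube 0SD_C03):

* List all secondary indexes defined on an F fact table.
DATA: lt_idx TYPE STANDARD TABLE OF dd12l.

SELECT * FROM dd12l
         INTO TABLE lt_idx
         WHERE sqltab = '/BI0/F0SD_C03'.
* The same information is visible in SE11 (Display table ->
* Indexes) or in the DB02 space/index monitors.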

You can find the series in the SDN wiki here. I got an idea of how easy it is to extend SAP classes or function modules for customers without making modifications! In this article I want to show you how to restrict the hierarchies for profit center when the user presses F4 on a hierarchy variable in the BEx variable screen, using the Enhancement Framework. I created a few very simple hierarchies on profit center, and a very simple query on a cube containing profit center, fiscal period and amount; it simply shows the periodic costs for the profit centers. The corresponding variable is for hierarchy nodes. The query itself looks simple. If you execute the query, the variable screen appears and asks you to choose a hierarchy. This is not very comfortable, especially if the user is restricted to only one or two hierarchies. The first thing you have to do is create an enhancement.

How to Debug InfoPackages in BW
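The body of such an enhancement boils down to filtering the hierarchy list before it reaches the F4 dialog. A minimal sketch of that filtering logic - the table and field names (c_t_hier, hienm) and the hierarchy names are assumptions for illustration; the real names come from the signature of the enhanced F4 function module:

* Keep only the hierarchies this user group is allowed to see
* (c_t_hier / hienm are placeholder names for the hierarchy
* list table and its technical-name field).
DELETE c_t_hier WHERE hienm <> 'ZPC_MAIN'
                  AND hienm <> 'ZPC_REGION'.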

Debugging DTPs is very easy: there is an option in the Execute tab of the DTP that allows you to execute it in a dialog process and debug it (screenshot); you may even set breakpoints. How about InfoPackages? If you have a routine in the InfoPackage, you may insert a "manual" breakpoint in the code. For more general situations, you may proceed with the following steps.

1- Set the InfoPackage scheduler to "Start Later in Background".
2- The Scheduling Options window will open.
3- Now you need to set a debugging user for the system, so that the process will stay "on hold" until you start debugging it in SM50... or go directly to the table RSADMINA in SE16 and change it.
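For the routine case, the manual breakpoint is best guarded so it only fires for the person doing the debugging (the user name below is a placeholder):

* In the InfoPackage routine: stop only for the debugging user.
IF sy-uname = 'DEVELOPER01'.
  BREAK-POINT.
ENDIF.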

Caution: when you set this user, all processes run by it will stop and wait for manual intervention in SM50.

4- Start the InfoPackage.
5- Quickly go to SM50, find the work process and debug it.

A Tour of the DBA Cockpit: Overview

The DBA Cockpit is an SAP tool for managing the underlying databases. Initially developed jointly by IBM and SAP for managing DB2 databases, it has been expanded to encompass all SAP-supported database platforms, and is designed to be a central point for database administration tasks and monitoring. It is available both in the SAP GUI interface (traditional ABAP screens) and in a web browser interface (via ABAP Web Dynpro). I know what you're thinking: "Surely he can't think that SAP could put all of that together and make it available in both SAP GUI and a web browser!"

A Treatise on Logistics Data Extraction for Business Information

BI Master Data Partitioning

Master data can run to 40 million records or even more. One of our utility clients has ~6 million master data records. This master data has 90 attributes, making the table size really huge. We faced an issue while activating the master data: the load was failing every time during activation. We also tried activating the master data in the background, but to no avail.

Then partitioning the master data table (the P table in our case) helped, and we could activate the data. The partitioning was done by Basis using the DB6CONV program. DB6CONV is not an ABAP program; the conversion runs outside the application server. For more details on DB6CONV, please refer to SAP Note 362325 - "DB6: Table conversion using DB6CONV". After a few days we received a request to add 3 more attributes. Since we had partitioned the P table for the ZTDB InfoObject, the new attributes needed a place in the partitioned table but failed to get one, as this is not automatic.

Approach: DataStore Objects for Direct Update

A DataStore Object is used to store consolidated and cleansed transaction data in transparent, flat database tables.

DataStore Object types:
1) Standard DataStore Object
2) Write-Optimized DataStore Object
3) DataStore Object for Direct Update

For detailed information on the DataStore Object for Direct Update, please look into the following link: DataStore Object for Direct Update.

The following APIs exist for writing and deleting data in a DataStore Object for Direct Update:
1) RSDRI_ODSO_INSERT
2) RSDRI_ODSO_INSERT_RFC
3) RSDRI_ODSO_MODIFY
4) RSDRI_ODSO_MODIFY_RFC
5) RSDRI_ODSO_UPDATE
6) RSDRI_ODSO_UPDATE_RFC
7) RSDRI_ODSO_DELETE_RFC

How to insert data into a DataStore Object for Direct Update using the RSDRI_ODSO_INSERT API: the sample code below shows how to transfer a small set of data in one RSDRI_ODSO_INSERT call.

REPORT z_dso_insert_data.

* Sketch of the insert call. The DSO name (ZDSO_DIR), its active
* table (/BIC/AZDSO_DIR00) and the parameter names are examples;
* verify the exact interface of RSDRI_ODSO_INSERT in SE37.
DATA: lt_data TYPE STANDARD TABLE OF /bic/azdso_dir00,
      i_count TYPE i.

* ... fill lt_data with the records to be inserted ...

CALL FUNCTION 'RSDRI_ODSO_INSERT'
  EXPORTING
    i_odsobject = 'ZDSO_DIR'
    i_t_data    = lt_data
  EXCEPTIONS
    OTHERS      = 1.

IF sy-subrc = 0.
  DESCRIBE TABLE lt_data LINES i_count.
  WRITE: / i_count, 'records inserted.'.
ELSE.
  WRITE: / 'Insert failed, sy-subrc =', sy-subrc.
ENDIF.