OM16 to get the internal IDs of your orders (or via RRP3 + GT_IO)
Deletion flag in MAT1
LOC3 deletion flag
You can mass delete transportation lanes using the following reports:
1. /SAPAPO/TR_TRPROD_MASSMAINT –> specific to products
2. /SAPAPO/TR_TRM_MASSMAINT –> vehicles on transportation lanes
3. /SAPAPO/TR_TRMCARR_MASSMAINT –> TSPs for means of transport
But before you can use these reports, you first need to go to transaction WUF and choose object type LOCATION. Choose the location as an attribute, then execute. From the Transportation Lane dependencies, copy all the entries and paste them into the reports above. Choose the delete option in each report, then execute. After the reports confirm that the transportation lanes were deleted, go back to WUF: the Transportation Lane dependency button should no longer be there.
Can be used to delete obsolete pegging areas from APO, for example pegging areas without orders. This can help to improve system performance. It is recommended to execute this report periodically.
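The selection logic behind such a cleanup can be pictured as a simple filter. The sketch below is plain Python for illustration only, with an invented data model — it is not the report's actual code or the APO data structures:

```python
# Generic sketch of the cleanup idea: a pegging area is "obsolete"
# when no order references it any more. (Illustrative data model,
# not the actual APO implementation.)

def find_obsolete_pegging_areas(pegging_areas, orders):
    """Return the pegging areas that no order references."""
    referenced = {order["pegging_area"] for order in orders}
    return sorted(pa for pa in pegging_areas if pa not in referenced)

areas = ["PA_MAT1_LOC1", "PA_MAT2_LOC1", "PA_MAT3_LOC2"]
orders = [
    {"ordid": "0001", "pegging_area": "PA_MAT1_LOC1"},
    {"ordid": "0002", "pegging_area": "PA_MAT1_LOC1"},
]

# Only PA_MAT1_LOC1 is referenced; the other two areas are candidates
# for deletion.
print(find_obsolete_pegging_areas(areas, orders))
```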
Can be used to delete receipts in PP/DS. It is not recommended for use in productive environments; it is more of a tool for cleaning up the system during the implementation phase. In general, for order deletion, it is recommended to use an interactive transaction like the Receipts View or to delete the orders with a BAPI.
Can be used to delete the user settings variant of the Product Planning Table which is currently assigned to a user.
Using this program you can copy user-specific settings for the Product Planning Table, Resource Planning Table, Detailed Scheduling Planning Board, Product View, so that they are available for other users.
This report should be executed periodically in order to ensure that the time streams of the resources are up to date in the liveCache. When the time streams in the liveCache are outdated, this can cause issues during order scheduling (for example, an order being scheduled during a non-working time of the resource).
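The scheduling problem caused by a stale time stream can be illustrated outside of SAP. In the sketch below (plain Python, invented data — not the liveCache time-stream model), a shift was removed from the resource's calendar, but a cached copy of the working-time intervals still contains it, so a check against the cached copy accepts an order start that actually falls in non-working time:

```python
from datetime import datetime

# Illustrative sketch (not SAP code): a "time stream" is modelled here
# as a list of working-time intervals. Scheduling against an outdated
# copy of the stream can place an order in non-working time.

def is_working_time(time_stream, instant):
    """True if the instant falls inside any working interval."""
    return any(start <= instant < end for start, end in time_stream)

# Up-to-date stream: the shift on Jan 2 was cancelled ...
current_stream = [(datetime(2024, 1, 1, 8), datetime(2024, 1, 1, 16))]
# ... but the outdated cached copy still contains it.
outdated_stream = current_stream + [
    (datetime(2024, 1, 2, 8), datetime(2024, 1, 2, 16)),
]

order_start = datetime(2024, 1, 2, 9)
print(is_working_time(outdated_stream, order_start))  # True - looks fine
print(is_working_time(current_stream, order_start))   # False - actually non-working time
```

Regenerating the time streams periodically keeps the cached calendar aligned with the real one, which is exactly what the report ensures.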
In a CTP scenario, if you terminate the Sales Order maintenance (VA01 / VA02) in ECC with /n, this may cause database inconsistencies in the table /SAPAPO/OPR. Report /SAPAPO/DMOPR_REORG_CTP reorganizes the operations table so that such inconsistencies are removed from the system.
This report can be used to search for orders which do not contain any operations in APO. It was provided in note 918648. Note 1426836 enhances the report so it works both for orders that exist only in APO and also for orders that exist in both systems.
Creates pegging areas for all Location-Products in a given Planning Version which do not have a pegging area yet. If a pegging area is missing for a particular Location-Product in APO, it is preferable to execute transaction /SAPAPO/RRP_NETCH with the option to create pegging area instead of executing this report.
Used to regenerate the Setup Matrix in the liveCache. Can solve some setup matrix inconsistencies.
This report was provided in note 1633057. It can be used to view orders graphically. The report is already contained in the newest releases and support packages.
This report is a last resort for the deletion of inconsistent order data from the liveCache. It is recommended not to run it without contacting SAP Support first. The user must already know which order is to be deleted – the OrderID of the order to be deleted has to be provided. It should not be used often or as a planning tool: if the liveCache details of a given order are deleted with this report while there are still entries in non-liveCache APO tables (like /SAPAPO/OPR or /SAPAPO/ORDFLDS), those entries will not be deleted, thus generating inconsistencies in the system! It is not supported by SAP – for further details, check KBA 1867390.
Checks the consistency of Ordkey, Ordmap, and order objects. The report should only be executed at times when no CIF activities are occurring: during order transfer there may be temporary inconsistencies in the system which are resolved as soon as the queue is processed, so otherwise the result of the report would be unreliable.
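The kind of comparison such a check performs can be sketched generically: compare the order IDs present in the liveCache with the order keys persisted in the database tables and report anything that exists on only one side. The sketch below is illustrative Python with invented names, not the real APO check; it also shows why a quiet period matters — an order mid-transfer would legitimately appear on one side only:

```python
# Generic sketch of a cross-store order consistency check
# (illustrative only - names and data model are invented).

def check_order_consistency(livecache_orders, db_ordkeys):
    """Report orders that exist in only one of the two stores."""
    lc, db = set(livecache_orders), set(db_ordkeys)
    return {
        "only_in_livecache": sorted(lc - db),
        "only_in_database": sorted(db - lc),
    }

livecache_orders = ["A1", "A2", "A3"]
db_ordkeys = ["A2", "A3", "A4"]  # A4 was deleted in the liveCache only

result = check_order_consistency(livecache_orders, db_ordkeys)
print(result)
```

During an active CIF transfer, a freshly created order may sit briefly in one store before the queue posts it to the other, which is why the real report is only meaningful when no queues are being processed.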
PP/DS Related Reports – Automotive
PPC: Analysis report for the PRC future change records
Report which checks for components with invalid references to parent materials (phantom assemblies). The inconsistent orders found can be transferred from ECC to APO with report RCIFORDT to fix the issue.
A planning area, as we know, is one of the most important objects in both Demand Planning (DP) and Supply Network Planning (SNP). It defines the area where most of the planning activities take place, grouping characteristics and key figures together under a single domain.
In order to maintain data consistency and avoid loss of data, it is generally good practice to back up the data in the planning area to an InfoCube.
In this document, we will see step by step how data can be backed up from a planning area. For illustration purposes, an SNP planning area has been taken; however, the process remains the same for both DP and SNP. The only major difference is that in SNP, export DataSources can be generated for the SNP aggregates and not for the MPOS as in Demand Planning. We will see this further ahead in the document.
1. Create Planning Area
The planning area TESTSNP is created. Here we have copied the standard planning area 9ASNP02 and created our own custom planning area. The configuration steps for an SNP planning area are discussed in detail in the following document –
2. Create Backup Infocube

Navigate to the planning area created in the above step and, from the menu, select Extras – Data Extraction Tools as shown below –
The same can also be achieved via the below transaction –
Transaction – /SAPAPO/SDP_EXTR
Here, we need to select Data backup and then, on the next screen, Generate Infocube from Planning Area. After that, we need to provide the name of the InfoCube and the InfoArea as shown below –
The above step will create an InfoCube with the name SNPBACKUP. After creation of the backup InfoCube, we can see it in transaction RSA1 – here we can edit/create new dimensions (basically to partition the data) and remove unwanted characteristics/key figures which we do not want to save in the cube.
3. Create Data Source
In order to create the DataSource, we need to follow the same path again as described in step 2.
Transaction – /SAPAPO/SDP_EXTR
Here, we need to select Generate Data Source and then provide the name of the DataSource. The system automatically prefixes 9A to the name we provide.
In the next screen, we need to select the aggregate on which we want to base the DataSource. In the case of a DP planning area, we can base the DataSource on the MPOS, while this is not the case with SNP. This also means that only the characteristics or characteristic combinations contained in these aggregates can be used in the InfoSource, transfer structure and transfer rules. Furthermore, for an SNP aggregate, only the key figures assigned to it can be extracted. So it is an important decision which aggregate to base the DataSource on, as per the business requirement. Here, we have selected the aggregate 9AMALA (9AMATNR, 9ATRNAME).
Also, while creating the DataSource, there is an indicator (highlighted above) which gives one the flexibility to prevent the system from transferring data records to the cube if all key figures in the data record have the value zero. This can drastically reduce the volume of data that the system transfers and, as a result, improve performance.
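The effect of that indicator boils down to a simple filter over the extracted records. The sketch below is plain Python with invented column names (9ADMDFC and 9AFCST are hypothetical key-figure fields), meant only to show the zero-suppression idea, not the extractor's implementation:

```python
# Sketch of the zero-suppression idea: drop a data record when every
# key figure in it is zero. (Column names are invented for illustration.)

KEY_FIGURES = ["9ADMDFC", "9AFCST"]  # hypothetical key-figure columns

def suppress_zero_records(records, key_figures=KEY_FIGURES):
    """Keep only records where at least one key figure is non-zero."""
    return [r for r in records if any(r[kf] != 0 for kf in key_figures)]

records = [
    {"9AMATNR": "P-100", "9ADMDFC": 0, "9AFCST": 0},  # all zero -> dropped
    {"9AMATNR": "P-200", "9ADMDFC": 5, "9AFCST": 0},  # kept
]

print(suppress_zero_records(records))
```

On sparse planning data, where most characteristic combinations carry no values in a given bucket, such a filter can remove the bulk of the records before transfer.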
Once the DataSource is generated, we can replicate it, and it will then appear in transaction RSA1. Then we need to activate the DataSource as shown below.
4. Create Transformation
In the above steps, we have created the backup cube that stores the planning area data, as well as the DataSource for the planning area. Now we need to link the two, and this is achieved by a transformation, which connects the objects in the source to the corresponding objects in the target.
Transaction – RSA1
Select the InfoCube, right-click, and create a transformation. In the next screen, we need to provide the source and target for this transformation to link. After validation of all linkages, we then activate the transformation as shown below –
Within the transformation, one also has the option to define various formulas, rules, etc., basically to define how the data needs to be mapped to the target fields.
5. Create Data Transfer Process (DTP)
Transaction – RSA1
In the transformation, we defined the mapping between source and target objects; however, extraction and update are defined using a DTP.
Within the DTP, there are options for defining the type of extraction (Full/Delta), the package size, filter criteria, update mode, processing mode and error handling mechanism. All these options can be defined as per the business requirement.
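The Full/Delta distinction can be illustrated generically: a full run picks up everything, while a delta run only picks up records changed since the pointer stored by the previous run. The sketch below is plain Python with an invented record layout, not the DTP's delta mechanism:

```python
# Illustrative contrast of Full vs Delta extraction (generic sketch,
# not the DTP implementation): a delta run only selects records whose
# change marker is newer than the stored delta pointer.

def extract(records, mode, delta_pointer=0):
    """Return (selected records, new delta pointer)."""
    if mode == "FULL":
        selected = list(records)
    elif mode == "DELTA":
        selected = [r for r in records if r["changed_at"] > delta_pointer]
    else:
        raise ValueError(mode)
    new_pointer = max((r["changed_at"] for r in records), default=delta_pointer)
    return selected, new_pointer

records = [
    {"key": "P-100", "changed_at": 10},
    {"key": "P-200", "changed_at": 25},
]

full, _ = extract(records, "FULL")
delta, pointer = extract(records, "DELTA", delta_pointer=10)
print(len(full), len(delta))  # 2 1
```

For a periodic backup, a delta-style load keeps the transferred volume proportional to what actually changed since the last run.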
6. Create Infopackage
Transaction – RSA1
The InfoPackage extracts the data from the source and loads it into the PSA. The data is then taken from the PSA to the target by the DTP, as described above.
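The two-stage flow can be sketched as a tiny pipeline. The code below is a generic Python illustration with invented names, showing only the staging idea (source → staging area → target), not the BW load mechanism itself:

```python
# Generic sketch of the staged load: the InfoPackage step lands source
# records in a staging table (PSA), the DTP step moves them from
# staging into the target cube. (Names invented for illustration.)

psa, cube = [], []

def infopackage_load(source_records):
    """Stage 1: extract from the source into the PSA staging table."""
    psa.extend(source_records)

def dtp_load():
    """Stage 2: move staged records into the target InfoCube."""
    cube.extend(psa)

source = [{"9AMATNR": "P-100", "qty": 5}, {"9AMATNR": "P-200", "qty": 3}]
infopackage_load(source)
dtp_load()
print(len(psa), len(cube))  # 2 2
```

Keeping a staging step between source and target is what allows the two stages to be scheduled, filtered and error-handled independently, as described in the DTP step above.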
Once the InfoPackage is created, we have defined all the objects needed to extract the data from the SNP planning area and store it in the backup cube. The final structure in RSA1 looks like below –
In this document, we saw how data from a planning area can be stored in an InfoCube via a DataSource, transformation, DTP and InfoPackage. One can also automate these steps via a process chain and have it scheduled on a daily/weekly basis as per requirement.