Archiving API Gateway Log Tables
This tool allows you to maintain and archive API gateway log tables that may grow in size over time.
In particular, the tables below contain the bulk of the gateway data (a quick way to check their current size is sketched after this list):
- /DFLOW/ZAPIL01 - CMS API Log - Header
- /DFLOW/ZAPIL06 - CMS API Log - MetaData
- /DFLOW/ZAPIL10 - CMS API Log - Return
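Before deciding on retention periods, it can help to get a rough sense of how much log data has already accumulated. The sketch below simply counts the rows in the three tables listed above; it assumes they are ordinary transparent tables readable via Open SQL and is an illustration only, not part of the delivered archiving program.

```abap
" Sketch: count the rows in the gateway log tables listed above to gauge their size.
SELECT COUNT( * ) FROM /dflow/zapil01 INTO @DATA(lv_header_rows).    " CMS API Log - Header
SELECT COUNT( * ) FROM /dflow/zapil06 INTO @DATA(lv_metadata_rows).  " CMS API Log - MetaData
SELECT COUNT( * ) FROM /dflow/zapil10 INTO @DATA(lv_return_rows).    " CMS API Log - Return

WRITE: / 'Header rows:  ', lv_header_rows,
       / 'Metadata rows:', lv_metadata_rows,
       / 'Return rows:  ', lv_return_rows.
```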
It is highly recommended that you decide how long to keep the line item level log vs. the header detail log, and then create and schedule a background job (either nightly or weekly) to delete anything older than that.
You can choose to delete line item data (the highest volume by a hundredfold) and header data (much lower volume), and you can also save the deleted entries to an export file as part of the deletion process.
Line item detail is the data behind any drill-down activity, e.g. viewing metadata, viewing the full return table, etc.
We recommend 1 year for line item detail and 5 years for header detail.
Please observe the difference between foreground and background file locations (SAP application/DB server vs. the user's presentation server); a brief sketch of the distinction follows.
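To make the distinction concrete, the hedged sketch below shows how an ABAP program typically writes a file to the application/DB server when running in the background versus downloading it to the user's presentation server when running online. The report name and file paths are made up for illustration and are not part of the delivered programs.

```abap
REPORT zglink_file_location_demo.  " hypothetical report name, for illustration only

" Made-up example paths -- substitute directories valid in your landscape.
DATA(lv_server_file) = '/usr/sap/trans/tmp/gateway_log_export.txt'. " application/DB server
DATA(lv_pc_file)     = `C:\temp\gateway_log_export.txt`.            " user presentation server
DATA(lt_lines)       = VALUE string_table( ( `sample archive line` ) ).

IF sy-batch = abap_true.
  " Background execution: only the application/DB server file system is reachable.
  OPEN DATASET lv_server_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
  LOOP AT lt_lines INTO DATA(lv_line).
    TRANSFER lv_line TO lv_server_file.
  ENDLOOP.
  CLOSE DATASET lv_server_file.
ELSE.
  " Online (foreground) execution: the presentation server is reachable via SAP GUI.
  cl_gui_frontend_services=>gui_download(
    EXPORTING filename = lv_pc_file
    CHANGING  data_tab = lt_lines ).
ENDIF.
```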
To archive the gateway logs, first create a logical filename as shown below and then set up a background job with a variant via the following steps.
a. Create the logical filename "ZGLINK_API_GATEWAY_SAVED_LOG_FILES" via transaction FILE. Assign it to a logical path of the same name and configure that logical path to point to the directory where the log files should be saved. The physical path must contain the filename placeholder <FILENAME>; how it is resolved at runtime is sketched below.
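At runtime, the logical filename is resolved to a physical path and the value supplied for <FILENAME> is substituted into it. The following hedged sketch uses the standard FILE_GET_NAME function module purely for illustration; the parameter value is a made-up example, and the archiving program itself may pass different parameters.

```abap
" Sketch: resolve the logical filename configured in transaction FILE.
" The parameter_1 value is a made-up example that gets substituted for <FILENAME>.
DATA lv_physical_file TYPE fileextern.

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'ZGLINK_API_GATEWAY_SAVED_LOG_FILES'
    parameter_1      = 'gateway_log_20250104'
  IMPORTING
    file_name        = lv_physical_file
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.

IF sy-subrc = 0.
  WRITE: / 'Resolved physical path:', lv_physical_file.
ENDIF.
```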
b. Create a variant on the "Archive Gateway Data" selection screen to determine what to archive from the Gateway logs:
a. Go to the Gimmal Link Enterprise System Dashboard by utilizing transaction /DFLOW/SYS.
b. Select the “Archive Gateway Data” menu option under the “Execute” menu.
c. Enter the criteria that determine what data to archive. We suggest leaving the last year’s worth of detail information and leaving all header information intact. Initially, only the Date and Max Header Rows filters need to be filled in. Note that advanced date calculations must be configured on the “Save as Variant…” screen (see step d below).
Setting | Description | Sample Value(s) |
---|---|---|
CMS Profile | The content management system profile to limit archiving on. | |
CMS Use Tag | The use tag to limit archiving on. | |
CMS Repository Type | The content management system repository type to limit archiving on. | |
CMS RFC Destination | The remote function call (RFC) destination to limit archiving on. | |
API Source | The source of the API calls to limit archiving on. | |
API Type | The particular API call type to limit archiving on. | |
SAP object ID | The particular SAP ArchiveLink object keys to limit archiving on. | |
Document Type | The SAP ArchiveLink document types to limit archiving on. | |
Unique Entry Key | The unique entry in the gateway log to limit archiving on. | |
Date | The date range to limit archiving to. It is recommended to configure this at a minimum to limit how much data is archived. | |
Time | The time range to limit archiving on. | |
User Name | The SAP user name to limit archiving on. | |
CMS Message Type | The particular message types from the logs to limit archiving on. | |
CMS SAP User | The SAP user name to limit archiving on. | |
CMS Rep./Store | The content management system repository or object store to limit archiving on. | |
CMS ID | The content management system id(s) to limit archiving on. | |
CMS Filename | The content management system filename(s) to limit archiving on. | |
CMS Type | The content management system document class, object type or metadata template to limit archiving on. | |
CMS Property | The content management system property against which a choice list is configured, to limit archiving on. | |
CMS Choice List Prefix | The content management system choice list prefix to limit archiving on. | |
Test Only | If selected, the archiving is only simulated and nothing is actually deleted. | |
Delete Headers – not recommended | If selected, the header information from the gateway log is deleted as well. Typically only the detail information needs to be deleted to save space, so leave the headers intact when possible. | |
Max Header Rows | Determines how many header rows are checked at one time to see whether their data should be archived. This limits the amount of processing per run for performance reasons. We suggest a value representative of the amount of processing in the given system. | 500 |
Save selected entries | If selected, the archived entries are stored in an output file for later analysis. | |
PC file path (online execution only) | When this program is executed manually from the menu path (not via a scheduled job), this determines the output file location on the user's presentation server for the archived data. | |
Dataset file path | When this program is executed in the background (for example by the scheduled job), a dataset file on the application server is used for storing the archived data. | |
d. Save the criteria as a variant that can be used by the scheduled job (a quick way to verify the saved variant is sketched after these sub-steps).
a. Select "Save as Variant..." from the "Goto" menu under the "Variants".
b. Adjust the date range by selecting "X:Dynamic Date Calculation (System DA\ate)" for the selection variable of the "Date" Field name.
c. Configure the "Name of Variable (Input Only Using F4)" to select the particular date range to archive.
a. We suggest selecting I for include (the I in the I/E column) with the LT (less than) option and the "Current date +/- ??? days" selection from the choices.
b. Enter 365 for the ??? value.
c. This archives all non-header data older than a year.
d. Make sure to enter a unique variant name; it will be needed when scheduling the job. Ex: ARCHIVEGATEWAY
e. Also enter a description for the information entered. Ex: Archive the Gateway Logs
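Before scheduling anything, it can be worth verifying the saved variant by running the archiving program with it once in the foreground (ideally with "Test Only" checked). A minimal sketch, assuming the example variant name ARCHIVEGATEWAY from above:

```abap
" Sketch: run the archiving program once with the saved variant to verify it.
" ARCHIVEGATEWAY is the example variant name used above.
SUBMIT /dflow/zcmsapi_gtw_archive
       USING SELECTION-SET 'ARCHIVEGATEWAY'
       AND RETURN.
```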
c. Configure a scheduled job to archive the gateway data periodically.
- Go to transaction SM36 to configure a new background job to archive the data (a programmatic alternative is sketched at the end of this section).
- Enter a unique name for the job that reflects the archiving of Gimmal Link Enterprise data. Ex: zglink_archive_gateway.
- Define a step to call the particular archive program with the variant defined previously.
- Define an ABAP program by pressing the “ABAP program” button.
- Enter “/DFLOW/ZCMSAPI_GTW_ARCHIVE” for the “Name”.
- In the “Variant” field, select the variant name defined previously.
- Save the step information by clicking the save (floppy disk) icon at the bottom.
- Click the green Back arrow to return to the background job; you should see that a job step was successfully defined.
- Define the start condition by clicking on the “Start condition” button along the top.
- Press the “Date/Time” button along the top and define a start date and time for the archiving background job. We suggest starting the job off-hours, at night or on the weekend. The job should not affect any current processing, so running off-hours is a suggestion rather than a requirement.
- Press the “Period value” button at the bottom to make the job run periodically and archive the gateway logs. We suggest running the archiving program weekly so that each run completes quickly and the logs are maintained fairly regularly.
- Finally save the start condition by clicking on the floppy disk at the bottom.
- Confirm that the background job now shows the planned start time along with how frequently it will run.
- Save the background job so that the archiving of gateway logs can occur.
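As an alternative to clicking through SM36, the periodic job can also be created programmatically with the standard JOB_OPEN / JOB_CLOSE pattern. The following is a hedged sketch only: the job name and variant mirror the examples above, and the start date/time and weekly period are illustrative values to adjust for your system.

```abap
" Sketch: schedule the weekly archiving job programmatically instead of via SM36.
DATA lv_jobcount TYPE btcjobcnt.
DATA lv_jobname  TYPE btcjob VALUE 'ZGLINK_ARCHIVE_GATEWAY'.  " example job name from above

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname          = lv_jobname
  IMPORTING
    jobcount         = lv_jobcount
  EXCEPTIONS
    cant_create_job  = 1
    invalid_job_data = 2
    jobname_missing  = 3
    OTHERS           = 4.
CHECK sy-subrc = 0.

" Single job step: the archiving program with the variant created earlier.
SUBMIT /dflow/zcmsapi_gtw_archive
       USING SELECTION-SET 'ARCHIVEGATEWAY'
       VIA JOB lv_jobname NUMBER lv_jobcount
       AND RETURN.

" Release the job: illustrative first run Saturday 02:00, then repeat weekly.
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    sdlstrtdt = '20250104'
    sdlstrttm = '020000'
    prdweeks  = '01'
  EXCEPTIONS
    OTHERS    = 1.
```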