EPM Automate Tips and Tricks to Optimize Your Oracle Cloud Automation Processes

Interested in reducing the time spent on your data loading and export processes? The EPM Automate utility can help you do just that and much more within the context of Oracle Cloud! Learn how this tool works and how it can be used to automate processes and save time in day-to-day business activities. These processes include data and metadata import and export, business rule launches, data movement across cubes, application backup and restore, and much more. Also, learn how EPM Automate can be integrated with scripts to perform a number of different tasks outside of Planning.

Product Overview: EPM Automate

The EPM Automate utility enables Service Administrators to remotely perform tasks within Oracle Enterprise Performance Management Cloud instances. It is a command-line utility implemented on top of the Oracle Cloud REST (Representational State Transfer) APIs. EPM Automate leverages scripts and Windows Task Scheduler to perform and automate common business tasks, including importing and exporting metadata, importing and exporting Planning application data, refreshing the application, running business rules, copying data across databases (e.g., BSO to ASO), uploading files to and downloading files from the PBCS repository (as well as listing and deleting those files), and exporting and importing snapshots.

EPM Automate allows individuals to create scripts and automate their execution via Task Scheduler for Windows (e.g., develop scripts to download and maintain rolling 7-day application backups). Additionally, the utility returns specific exit codes to indicate the status of each operation or task, and it can run on a local machine or in a server environment. EPM Automate is also available for Linux, and it can be downloaded and installed from the Oracle Planning and Budgeting Cloud Service (PBCS) web interface within a matter of minutes.

EPM Automate Commands

There are multiple sets of commands available depending on the type of Oracle Cloud Service; these include Oracle Planning and Budgeting Cloud (PBCS) commands, Oracle Enterprise Planning and Budgeting Cloud commands, Oracle Financial Consolidation and Close Cloud commands, Oracle Account Reconciliation Cloud commands, and Oracle Profitability and Cost Management Cloud commands. Commands require the “Service Administrator” security role within Oracle Cloud for the particular instance in order to run. The table below provides a list of typical EPM Automate commands:

Those specific to Oracle PBCS include:

Scripting and Syntax

After you download and install the EPM Automate utility, you can integrate a variety of scripts and launch commands through the utility. Here are some examples:

  • Leverage Windows batch scripting + Task Scheduler to process a trigger file and kick off an automated routine (say, for importing your daily actuals data) – see the sketch after this list
  • Utilize VBScript to unzip files that have been exported via EPM Automate
  • Use PowerShell scripts to delete old files that were created “x” days prior
  • Use VBScript to send an email containing the batch process log to the FP&A admin after the automated routine is completed
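As an illustration of the first and third items above, here is a minimal batch sketch (written in plain batch rather than PowerShell); the folder paths, trigger filename, and the Daily_Actuals_Load.bat routine name are all hypothetical placeholders:

    @echo off
    REM Hypothetical paths - adjust to your environment
    set TriggerFile=D:\Automation\Inbound\daily_actuals_trigger.txt
    set LogFolder=D:\Automation\Logs

    REM Only kick off the automated routine when the source system has dropped a trigger file
    if exist "%TriggerFile%" (
        call D:\Automation\Scripts\Daily_Actuals_Load.bat
        del "%TriggerFile%"
    )

    REM Purge batch process logs older than 30 days (a native batch alternative to a PowerShell cleanup)
    forfiles /p "%LogFolder%" /m *.log /d -30 /c "cmd /c del @path"

Scheduling a script like this through Windows Task Scheduler (say, every 15 minutes) gives you the trigger-file pattern described above.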

A Walk-through of Automating Metadata Loads

In my opinion, Windows Batch Scripting is the easiest in this context to learn, apply, and get your organization up and running with automation scripts. This paper will walk through how to utilize this kind of scripting. The first walk-through involves an automated metadata load. You might see this kind of scripting in operation at a business that has frequently changing Account hierarchy structures and needs to make sure changes in its source system are reflected in PBCS.

Let’s assume a business case where we would want to import our Account hierarchy on a daily basis into our PBCS application “PLNtstc1”. In this particular example, we’ve added the highlighted accounts to our source:

We need to begin by setting up a job in PBCS that can be launched via automation and initiate the metadata import:

Note that in step 2 above, the filename is static (i.e., no date stamp). The filename delineated here must also be called out in the import scripting to import the appropriate file. Also, please note that in the past few screenshots, we have set up a job in PBCS (“Import_Account_Hierarchy_Job”) which we can launch manually within the Job Console or automatically via EPM Automate.

With the job task created, we can bring the latest hierarchy into our cube with automation scripting and EPM Automate. Now, you can use sub-folders on your local machine or server environment to compartmentalize scripts based on function:

Then, use sub-folders on your local machine or server environment to store import files as needed:

Utilize a system variables script to reduce maintenance time and recycle file paths, credentials, and more throughout your routine:
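As a rough sketch, such a variables script (here called SetVariables.bat – a hypothetical name) might simply be a series of “set” statements; the credentials, URL, paths, and job name below are placeholders:

    @echo off
    REM SetVariables.bat - hypothetical routine variables referenced by the other scripts
    set EPMUser=admin@yourcompany.com
    set EPMPassword=D:\Automation\Scripts\password.epw
    set EPMURL=https://plntstc1-yourdomain.pbcs.us2.oraclecloud.com
    set EPMDomain=yourdomain
    set ImportFilePath=D:\Automation\ImportFiles
    set ImportFileName=Account_Hierarchy_File.csv
    set ImportJobName=Import_Account_Hierarchy_Job

(The password value can point to an encrypted password file created with the EPM Automate “encrypt” command rather than a plain-text password.)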

Utilize a “wrapper” script as a primary script that governs the majority of automated tasks and references other scripts (such as the variables script) to drive the routine:

Step 1: Call the routine variables file that has been created. Subsequent steps will utilize those variables!

Step 2: Login to PBCS via EPM Automate.

Note: Wherever we are using “call epmautomate [command]…” we are calling the EPM Automate utility and running a command against PBCS.
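Assuming the hypothetical variable names from the variables script sketched earlier, Steps 1 and 2 of the wrapper might look like:

    REM Step 1: pull in the routine variables
    call D:\Automation\Scripts\SetVariables.bat

    REM Step 2: log in to the PBCS instance via EPM Automate
    call epmautomate login %EPMUser% %EPMPassword% %EPMURL% %EPMDomain%

(Depending on your EPM Automate version and identity domain setup, the last parameter may or may not be required.)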

Step 3: Delete yesterday’s Account hierarchy file that was uploaded to PBCS.

Note that we are using a static filename in our job setup for this import (“Account_Hierarchy_File.csv”).

Because of that, we must delete the file that was uploaded yesterday (if any) before we can import today’s latest metadata file. You will receive an error if you try to upload today’s file and a file with the same filename as today’s file already exists in the PBCS inbox!
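A sketch of this step, reusing the hypothetical %ImportFileName% variable (note that if no such file exists in the inbox, the command may report an error, which you can choose to ignore in the scripting):

    REM Step 3: remove yesterday's copy of the static-named file from the PBCS inbox
    call epmautomate deletefile %ImportFileName%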

Step 4: Upload today’s file.
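A sketch of this step, again using the hypothetical path and filename variables:

    REM Step 4: upload today's hierarchy file to the PBCS inbox
    call epmautomate uploadfile %ImportFilePath%\%ImportFileName%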

Step 5: Launch the import metadata job created in PBCS.

Note that the job that is called out in the scripting is the same job that was created earlier within the PBCS Job Console.

Step 6: Logout of PBCS to complete the routine.
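Putting the final two steps together, the tail of the wrapper might look like this sketch (the job name is the one created earlier in the Job Console; the variable names come from the hypothetical variables script):

    REM Step 5: launch the metadata import job defined in the PBCS Job Console
    call epmautomate importmetadata %ImportJobName% %ImportFileName%

    REM Step 6: log out of PBCS to complete the routine
    call epmautomate logout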

Now it’s time to launch the routine and review the processing routine outcome(s). There are multiple methods to visualize results! One method is checking in PBCS for the expected outcome. Perhaps you automated a data load – check to see whether the data is there via Smart View. In our case, we brought in new Account members – and here is the expected outcome:

Leveraging Email Notifications

You can integrate email notification functionality into your automated routines to help you monitor processes and troubleshoot, if necessary. There are several methods to do so. Method 1 involves leveraging custom scripting (for example, VBScript) and an email server to send success/failure notifications following completion of a routine. The batch process log (a text file) can be attached to the email sent to the recipient, and success or failure is dictated by the EPM Automate exit codes. The advantages of Method 1 are that you can receive a lot of detail through the batch process log and you have the ability to customize (for example, what you want the email to say). The disadvantages are that it may require a significant amount of programming knowledge to set up, and maintenance could be challenging as a result.
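As a rough illustration of Method 1, the wrapper can branch on the exit code after a step and hand the batch process log to a mail script; SendFailureEmail.vbs below is a hypothetical VBScript that sends the log through your email server, and the log path is a placeholder:

    REM Capture EPM Automate output in the batch process log, then test the exit code
    call epmautomate importmetadata Import_Account_Hierarchy_Job Account_Hierarchy_File.csv > D:\Automation\Logs\batch_process.log 2>&1
    if %ERRORLEVEL% NEQ 0 (
        REM Non-zero exit code = failure; send the log to the FP&A admin
        cscript //nologo D:\Automation\Scripts\SendFailureEmail.vbs "D:\Automation\Logs\batch_process.log"
    )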

Method 2, alternatively, leverages the PBCS system email notification capabilities in your routine. This allows you to receive an email following each Job Console step (for example, an email notification after the “importmetadata” job runs, after the “refreshcube” job runs, etc.). You may also want to set up a dummy Business Rule via Calculation Manager to be launched at the end of your routine, so that its notification confirms the routine finished – in other words, no news is bad news. If you don’t receive that final email notification, you know to go in and investigate what went wrong along the automated path. You can also utilize client email forwarding to send specific notifications to specific recipients; for example, if the aggregation is the last step in a routine, you could have just the email notification pertaining to the aggregation forwarded to multiple recipients, instead of all recipients receiving all job task email notifications. The advantages of Method 2 are that it is very easy to set up and manage, and the system emails provide you with details around job success/failure in PBCS. However, the disadvantage is that you have limited ability to customize. Let’s walk through Method 2 and extend the example from Walk-through 1.

Assume you are a system administrator responsible for maintaining the daily automation routine that imports the latest Account hierarchy into the PBCS app, PLNtstc1. You’d like to be notified via email of the status of the jobs involved in the automation routine. To set up the email notification functionality, we:

  1. Begin by logging in to the PBCS web interface
  2. Go to “Application” -> “Settings”
  3. Go to “Notifications” -> check “Job Console” -> input Email Address for recipient -> click “Save”

That’s it! You should now be all set to receive email notifications. From this point, whenever you launch a job through the Job Console, you will receive an email notification pertaining to the job process outcome. The same goes for when you launch jobs through the automation! Let’s execute the same automation script again by double-clicking it (Method 1 as discussed earlier):

As discussed previously, there is only one part of our script associated with the actual “importmetadata” Job Console job:

This job contains two steps: the import step and the step to refresh the database. For the import step, I receive this email notification after completion:

And right away, I know the process was successful! Note that my username is the same as my email address, and I am using my username to log in via EPM Automate; therefore, the “From” and “To” recipients are the same. Shortly after that, the second step (the refresh step) takes place, after which we receive this email notification:

Again, pretty useful information that allows us to save time! We can consider this Method 2 for processing routine outcomes (as opposed to our “one-stop” check, Method 1).

Let’s say there was an error in my import file that would throw an error upon processing in PBCS:

We would then receive an email notification indicating the process was unsuccessful. Here is an example of a failure email notification for your reference:

System Logs and Error Handling – Logs

EPM Automate produces multiple exit codes (e.g., “1”, “6”, “9”, etc.) depending on the job outcome, for example:

The exit codes reveal insight into the status of a particular Job Console job and/or EPM Automate-related operation. EPM Automate automatically “creates a log file… for each failed execution”:

Alternatively, we can write these error codes to a log file and archive the log file with each routine run. We can also write additional information to the log file using batch scripting – to do so is surprisingly simple and effective!

Walk-through of Integrating a Batch Log

We can add code to our existing scripts to create a log file that details the steps taken throughout the automated routine. Let’s return to our previous walk-throughs concerning the daily import of the Account hierarchy. We’ll need a new folder to store our log files:

We should also leverage a variable for the log folder path so that we don’t have to write the path out with each step in the routine. Let’s return to our variables file and add two variables:

  • %AutomationLogPath% -- defines the folder path for the folder we just made
  • %AutomationLogFile% -- defines our log filename

Now that we have these variables setup pertaining to our log files, let’s return to our wrapper script. We need to tell the routine to write details to the log:

Let’s take a look at the new syntax and logic we have integrated into our routine:

(1) “echo… >>…”. “echo” is a Windows Batch Scripting command. “@echo on” allows you to see each line of script code via the command prompt (it can also be turned “off”) – useful for troubleshooting purposes. “echo… >>” appends the particular line to a file. In the above example, we are writing the statement “Logging into environment” into the log file.

(2) “2>&1” redirects error output into the same stream as standard output, so EPM Automate’s own messages land in the log as well; it also facilitates the next line, which is “IF %ERRORLEVEL% NEQ 0 goto :FAILUREPOINT”.

(3) “%ERRORLEVEL%” holds the EPM Automate exit code! So the logic here is: “If the exit code is not equal to “0” (“0” meaning “no error”), go to “FAILUREPOINT” and take the next steps.” In our example, “FAILUREPOINT” is at the end of the script:

So if a non-zero exit code occurred during a previous step, processing would jump to “FAILUREPOINT” – at which point the line after “echo” above would be printed to the log, and the admin would be notified within the log to investigate the error.
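Pulling these pieces together, a minimal sketch of the logging pattern (using the hypothetical %AutomationLogFile% variable and the login step as the example) might look like:

    @echo on
    REM Write a narrative line to the routine log
    echo Logging into environment >> %AutomationLogFile%

    REM Append EPM Automate's own output (including errors) to the same log, then test the exit code
    call epmautomate login %EPMUser% %EPMPassword% %EPMURL% %EPMDomain% >> %AutomationLogFile% 2>&1
    IF %ERRORLEVEL% NEQ 0 goto :FAILUREPOINT

    REM ...remaining steps of the routine...
    goto :EOF

    :FAILUREPOINT
    echo An error has occurred - investigate the exit code and EPM Automate detail above >> %AutomationLogFile%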

Let’s assume we made the same mistake in our import file as during our earlier walk through example:

But let’s also assume we did not know about this error and we went ahead and ran our automated routine by double-clicking:

At the end of the routine, we could then go in to our logs folder and visualize the log:

You can see that the lines we coded to be written to the log with “echo” were in fact written (e.g., “Logging into environment”). The EPM Automate steps were written to the log as we coded as well (e.g., “delete file completed successfully”). At the end of the log, I am told to investigate exit code “EPMAT-1”, but I also have the following detail from the EPM Automate utility, which is useful for troubleshooting purposes:

Again, pretty powerful information that allows us to save time! We can consider this Method 3 for processing routine outcomes. And this method can be supplemented with our email notifications!

System Logs and Error Handling – Automatic Retry

We can also take steps to minimize errors caused by human activity in the system while the automation is running. For example, let’s assume we had an automated routine that was running multiple times during the business day – say the routine was built to import a dimension hierarchy 4 times/day. But users are working in the system during the day as well. If they were submitting data via an ad-hoc grid or form at the same time the automation step to refresh the cube occurs, an error would be thrown (conveying that there are other users in the system…). We can integrate logic into our scripting to have the refresh step kick off again, multiple times (say, 20 times), if it fails – the likely reason for the failure being human users. To do so, you can use loop logic or “IF… GOTO” logic, as we discussed earlier!
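A sketch of that retry logic, assuming the refresh is performed with the “refreshcube” command and reusing the hypothetical log variable from earlier:

    REM Retry the cube refresh up to 20 times if it fails (users may still be in the system)
    set RetryCount=0

    :REFRESHCUBE
    call epmautomate refreshcube >> %AutomationLogFile% 2>&1
    IF %ERRORLEVEL% EQU 0 goto :REFRESHDONE
    set /a RetryCount+=1
    REM Wait 60 seconds before retrying, then give up after 20 attempts
    timeout /t 60 /nobreak >nul
    IF %RetryCount% LSS 20 goto :REFRESHCUBE
    goto :FAILUREPOINT

    :REFRESHDONE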

Another way to minimize errors due to human activity in the system while the automation is running is to kick human users out of the system prior to routine kickoff! A quick way to do this is to customize a refresh job in PBCS:

And then simply have that refresh job run as the first step in the automation routine, prior to the PBCS environment login; this will kick users out of the system prior to the subsequent automation steps:

Implications for Data Streams

From our previous walk-throughs, you can tell we developed multiple mechanisms for managing a routine, including logs + email notifications, script comments, error handling and reduction processes. But we had a very small routine! Essentially we uploaded a metadata file and imported our latest hierarchy into our cube. What are the implications of continuing the scripting and our management controls?

  • We can do more automation jobs: import metadata, run a Business Rule, import data… export data, back up your app and save app snapshots on a periodic basis…
  • We can do more on the routine management side: archive logs, manipulate date stamps to pass along variable values, send files to various locations on a server to be processed by other systems, leverage other types of scripting…
  • Many, many more possibilities! Our primary constraints are time allowance, technical knowledge, and the commands available to us

Let’s explore this notion further by going into detail on a sub-topic: utilizing timeframe variable values to keep our routine more dynamic. We utilize timeframe variables and values all the time in Planning (Data Management, Substitution Variables, Runtime Prompt variables, etc.). Let’s assume we have an automated daily routine where we not only need to load metadata but also need to clear yesterday’s MTD actuals data, load today’s latest data, and aggregate this latest data. We can process files whose filenames delineate the timeframe to which we are loading the data. For example, if we used a variable for the data filename:

We could process the daily file in a dynamic fashion each day. “%DATE:~4,2%%DATE:~7,2%%DATE:~-2%” represents components of the %DATE% function discussed earlier. For example, for today’s date, this segment of the data filename would be: “040617”. The asterisk (“*”) represents a wildcard. We can use this to represent the month and year for which we are loading data or running Business Rules! For example, say we are processing data for April 2017 – the “*” could be replaced with numbers representing that timeframe (say, “0417” = April ’17).

Considering the above, we can process a new file each day utilizing a different filename!

  • Actuals_0417_040617.txt – scripting would recognize the file with today’s datestamp as defined by the variable and process it for loading into the cube for April-17
  • Actuals_0417_040717.txt – would take similar action tomorrow!
  • Actuals_0517_050117.txt – would take similar action on May 1st!

Considering these filenames, we build a script to extract components of the filename and modify those component values as needed for PBCS. For example, a piece of that script could look like:

And here, say we are processing today’s file: Actuals_0417_040617.txt. The second highlighted line in the screenshot indicates we should capture the value that starts 8 characters in from the left (“~8”) and covers 2 characters (“2”) – this value is “04”. On the last line of the screenshot, we write “**** Data file load month is: 04” to the log. We can go on to manipulate “04” as needed! We can convert it into “Apr” and pass it through an RTP variable, we can concatenate it with “17” for Data Management, etc.
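A sketch of that logic in batch, assuming the hypothetical filename and log variables (and a US-style “Ddd MM/DD/YYYY” regional date format for %DATE%):

    REM Build today's MMDDYY stamp from the %DATE% system variable
    set DateStamp=%DATE:~4,2%%DATE:~7,2%%DATE:~-2%

    REM Hypothetical data filename, e.g. Actuals_0417_040617.txt
    set DataFileName=Actuals_0417_%DateStamp%.txt

    REM Capture the load month: start 8 characters in from the left, take 2 characters
    set LoadMonth=%DataFileName:~8,2%
    echo **** Data file load month is: %LoadMonth% >> %AutomationLogFile%

From here, %LoadMonth% can be translated to “Apr”, concatenated with the year for Data Management, passed to a runtime prompt, and so on.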

Export Data Out of PBCS

Let’s go through an exercise where we have to export data out of PBCS for usage in a larger data stream; this data has to be sent to a different system for business purposes. We can do this using multiple means (Business Rule, ExportData job, Data Management). I am a fan of the Data Management method – “Export Data to Flat File” – because you can manipulate the file output to some extent without the need for a different tool! That said, there are some complexities involved in automating that particular process end-to-end in a dynamic fashion, so let’s take a look at how we would do this via a PBCS Business Rule (“ExportDataRule”):

Now that we have our Business Rule for exporting the data, we can set ourselves up to automate the download and processing of that data. Let’s make folders to facilitate this:

Note that we can save time by simply adding variables to our existing variables file to accommodate this particular routine!


And our “wrapper” script for this process would look like:

Here, we are running two main EPM Automate tasks as shown in the highlighted segments:

  • “runbusinessrule” – to launch our export rule
  • “downloadfile” – to download our file to our local machine or server environment

Note that we can add batch scripting commands to this script to rename our output file, add a datestamp to it, move it, and more!
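For reference, here is a rough sketch of what that wrapper might contain; the rule name matches the Business Rule above, while the export filename, folder paths, and datestamp handling are hypothetical:

    REM Hypothetical export routine, reusing the shared variables file
    call D:\Automation\Scripts\SetVariables.bat
    call epmautomate login %EPMUser% %EPMPassword% %EPMURL% %EPMDomain%

    REM Launch the export Business Rule, then pull the output file down from PBCS
    call epmautomate runbusinessrule ExportDataRule
    call epmautomate downloadfile ExportData.txt

    REM Housekeeping: add a datestamp and move the file where the next system expects it
    ren ExportData.txt ExportData_%DATE:~4,2%%DATE:~7,2%%DATE:~-2%.txt
    move ExportData_*.txt D:\Automation\ExportFiles

    call epmautomate logout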

There are multiple methods for launching the routine.

Method 1: Simply double-click the script that initiates the routine

Method 2: Run from Command Prompt

Let’s try Method 2 to launch our data export routine.

Use “cd” to change directory to the script path as necessary:
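For example, you would type something like the following at the prompt (the folder path and script name here are hypothetical placeholders):

    cd /d D:\Automation\Scripts
    Export_Data_Routine.bat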

Then, simply press “Enter” to run. If “echo” is on you should see script detail in the command prompt such as:

The advantages of Method 2 are that it allows you to see what is going on and it’s much easier to troubleshoot. However, it does take a bit longer to set up and might be a bit overwhelming compared to Method 1.

We can then go in and review the export file:

We can review logs and email notifications associated with the process. We can also go in and look at the result through the PBCS Job Console. We can consider this Method 4 for visualizing the routine results: “Application” button ->“Jobs”

The advantages of this method are that it allows you to see details on the particular PBCS jobs and you do not have to be in the server environment to access it (beneficial if you’re storing batch process logs in a server environment). Additionally, multiple people can access it. However, it does take a while to access (you have to log in, it is somewhat manual, and it goes against the notion of automation).

Minimize Maintenance and Optimize Processes!

Combine the previously mentioned techniques to optimize your automated routines. Multiple means to run automation. Multiple means to review the results. Controls to minimize errors. Combine all of the above to do many tasks through multiple scripts:

  • Login
  • Upload hierarchy + import metadata
  • Clear old MTD numerical data via Data Management
  • Load today’s data
  • Aggregate
  • Receive email notifications along the way and forward to business users

Automate Multiple Jobs + Handle Errors

Manual Steps vs Automation

The amount of time it takes to perform the manual steps for Walk-through 5 and resolve errors is likely a double-digit number of minutes, while the automated steps for Walk-through 5 take far less time than that! Additionally, you can do something else with your time after kicking off the routine. You can really see the power of EPM Automate when you consider automating multiple jobs in one routine on a daily basis. In multiple business cases, you can see how useful these scripting processes can be. For example, imagine setting up the automation to load multiple load files instead of going through all the clicks in Data Management for each load file.

Conclusions and Path Forward

EPM Automate improves efficiency! It makes running PBCS job tasks easier. Of course, effective scripting and process organization is key and will help you save time in troubleshooting. There are a great many benefits of taking measures to optimize your automation processes. Plus, what updates are in our future? Perhaps a streamlined path for downloading the Data Management Process Details log via EPM Automate? Word of advice… Be sure to review the latest updates from Oracle Cloud when they are announced.

Blog post by Noah Neal of Key Performance Ideas.




 
