Note: This blog post explores batches in Data Management (DM) – a relatively new tool used for data integration in the Oracle EPM Cloud suite of products. Specifically, we will look at this in the context of Oracle Enterprise Planning and Budgeting as a Cloud Service (EPBCS). Batches in DM allow for greater efficiency when the user is looking to run multiple Data Load Rules automatically.


Have you ever been in an Oracle EPM Cloud app data situation that required you to use the same data file, but multiple Data Load Rules? Or how about the case where you needed to load two separate files, back-to-back, using different Data Load Rules? Administrators who have been working with the technology for a long time can likely recall numerous instances where they have needed to run multiple Data Load Rules in serial – one after another – for the purpose of importing different data slices in a given time window. But not all users may be aware of Data Management's batch processing option!

If you are looking to save time in your Data Management integration load processes, then this is the blog post for you!

Background: Data Management Data Load Rules

Data Management is a critical component of the Oracle EPM Cloud (here referred to as “Cloud”) software suite, and it is used for data integrations, data import and data export, and data copy. The version in Cloud is derived from the original Oracle on-premises version, which exists as FDM (Financial Data Quality Management) and FDMEE (Financial Data Quality Management, Enterprise Edition).

This tool is particularly helpful for organizations that need to import data that requires mappings or has some difference(s) in its source data files (relative to the target Cloud application). DM utilizes “Data Load Rules” to load data into the application from inbound data file(s), leveraging integration parameters such as file field specifications, filename, and target Period, Year, and Scenario.

Data Load Rules can therefore be summarized as the (oftentimes) recurring aspect of a DM integration – the administrator will likely revisit the Data Load Rule screen often to run the organization’s rule(s) each time data needs to be imported into the system.

But what if the administrator needs to run multiple Data Load Rules in a given timeframe? There are multiple reasons why this might be necessary. We will look at an example context where this type of case might exist, as well as how batch processing of these Rules can support this kind of data load maintenance.

Example Case: Loading to Two Destinations Within a Dimension

Consider an example case where the organization needs to load the data to two separate sets of members in the Account dimension of an EPBCS application, for reporting purposes:

In this conceptual example, we have two separate rollups with separate accounts – essentially two different means of viewing the Income Statement. In real life, perhaps the example application is one which has both a natural account rollup and an industry-specific account rollup. As the screenshot shows, the two rollups both convey the Income Statement but are composed of unique sets of accounts.

As such, two separate sets of data are needed to accommodate these views:

In the example data file above, you can see the two independent sets of accounts (1) and (2) are captured in each data row.

And this logic translates to a need for two separate data integrations (because we then need two separate Import Formats, possibly Mappings, etc.). Ultimately, we end up with two different Data Load Rules to accommodate both sets:

At this point, both rules can be run individually, one after another. Alternatively, however, this is a good opportunity to take advantage of the Data Management batch feature!

Setting Up Batches in DM

To set up a batch in DM, begin by going into DM → “Setup” → “Batch Definition”.

Click the green “+ Add” button to add the new batch:

Add detail as needed:

To summarize the above, we have batch “Demo_DM_Batch” that we are executing for Type “Data” (note that it is possible to have a batch of batches).

For “Execution Mode”, we are having the processes run in “Serial” (one after the other); alternatively, we could have them run in “Parallel” (at the same time). Note that for “Parallel” Execution Mode, you have the option to “Wait” for completion, which returns control only once the batch has finished processing, as opposed to “No Wait”, which runs the batch in the system background and allows you to perform other tasks concurrently.
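The difference between these execution modes can be sketched conceptually in Python. This is purely illustrative (the rule names and the `run_rule` placeholder are not part of DM), but it captures the semantics of “Serial” versus “Parallel” with “Wait”:

```python
from concurrent.futures import ThreadPoolExecutor

def run_rule(rule_name):
    # Placeholder for executing a Data Load Rule; in DM this would be
    # the actual import/export processing for the rule.
    return f"{rule_name} finished"

rules = ["DLRule1", "DLRule2"]

# "Serial" execution mode: each rule runs only after the previous one finishes.
serial_results = [run_rule(r) for r in rules]

# "Parallel" execution mode with "Wait": rules run concurrently, but control
# returns only once every rule has completed processing.
with ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(run_rule, rules))

# "No Wait" would correspond to submitting the work without blocking on the
# results, leaving you free to perform other tasks while the batch runs.
print(serial_results)
```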

In the next tab – “Parameters” – you can define the parameters associated with the batch. You will see these options are similar to the “Data Load Rule” options, including items such as “Import From Source”, “Export to Target”, “Import Mode” and “Export Mode”:

The option “Extract Exchange Rates” allows you to bring in the exchange rate from the source ERP system.

 Next, under “Batch Jobs”, select the Data Load Rules of interest:

Note that “Job Sequence” helps govern the ordering of the component rules being run – in this case we are running “DM_Batch_Demo_DLRule1” first, and then “DM_Batch_Demo_DLRule2”.
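Conceptually, Job Sequence is just a sort key over the component rules. A minimal sketch (the rule names mirror the example above, but the data structure itself is illustrative, not DM's internal representation):

```python
# Each batch job pairs a Data Load Rule with its Job Sequence number.
batch_jobs = [
    {"rule": "DM_Batch_Demo_DLRule2", "job_sequence": 2},
    {"rule": "DM_Batch_Demo_DLRule1", "job_sequence": 1},
]

# In Serial mode, DM runs the component rules in ascending Job Sequence order.
execution_order = [
    job["rule"]
    for job in sorted(batch_jobs, key=lambda j: j["job_sequence"])
]
print(execution_order)
```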

Once done, be sure to click “Save” in the upper right-hand corner.

Executing Batches in DM

Once the batch is defined, go to “Workflow” → “Batch Execution” to run the batch:

Click “Execute” to run the batch on-demand:
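Besides running the batch on-demand in the UI, Oracle also exposes REST and EPM Automate interfaces for executing batches. As a hedged sketch, the helper below only builds the JSON payload for a Data Management REST “jobs” request; the field names follow Oracle's documented batch-execution job shape, but verify the exact endpoint and fields against the EPM Cloud REST API documentation for your pod before using this:

```python
import json

def build_run_batch_payload(batch_name):
    # Payload for submitting a batch-execution job via the Data Management
    # REST API; confirm field names against Oracle's REST API documentation.
    return {"jobType": "BATCH", "jobName": batch_name}

payload = build_run_batch_payload("Demo_DM_Batch")
body = json.dumps(payload)
print(body)
```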

And in “Process Details” under “Workflow”, you can confirm that the batch ran both component rules and troubleshoot errors if needed.

Please note you can also click “Check Status” in the “Batch Execution” screen to see the status detail:
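Programmatically, “Check Status” amounts to polling the job until it reaches a terminal state. A minimal simulation of that loop (the status strings here are illustrative; the real API returns its own status codes):

```python
def poll_until_done(check_status, max_polls=10):
    # Repeatedly check a job's status until it is no longer running,
    # mirroring what "Check Status" shows in the Batch Execution screen.
    for _ in range(max_polls):
        status = check_status()
        if status != "RUNNING":
            return status
    return "TIMED_OUT"

# Simulated status source: two in-flight checks, then success.
statuses = iter(["RUNNING", "RUNNING", "SUCCESS"])
result = poll_until_done(lambda: next(statuses))
print(result)
```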

Lastly, please note that you could “Schedule” your batch to run at a certain cadence (or cancel that schedule if needed):


Oftentimes when building integrations in Oracle EPM Cloud applications, there is a need to run multiple Data Load Rules to process different data sets quickly and efficiently. Batches in Data Management can help. For example, instead of running a dozen Data Load Rules manually, the admin can set up and schedule a batch to execute those Data Load Rules automatically. This functionality saves time in the long run!

For more information, please visit the Oracle EPM Cloud website (and related documentation) at:
