Bug Report: Push Data failed. Error: Exported data size violates permissible amount: 100 MB


Introduction

Data Map Error:

Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

If you are confused, join the club.  The results are inconsistent: some data pushes over the 100 MB limit succeed.  So, why the following error?

Exporting data…
Exported data file(s) size is: 207.1 MB.
Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

Clarification

A point of clarification for those of you who are new to Data Maps and Smart Pushes: if you think they are the same thing, here is Oracle's distinction, in my words.

  • A Data Map is any data map executed from the Data Map area, whether it is through the UI, EPM Automate, or the REST API.
  • A Smart Push is essentially any Data Map executed from a Data Form.
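A quick way to see the distinction is how each is invoked from a Groovy rule.  Here is a minimal sketch, assuming the documented getDataMap/getSmartPush calls; "Fin to Rpt" is a hypothetical name used for both the map and the push, and the boolean on execute is assumed to be the clear-data flag.

// A Data Map: executed against the application, no matter what triggered the rule.
operation.application.getDataMap("Fin to Rpt").execute(false) // false = do not clear the target region first

// A Smart Push: executed against the grid of the form that launched the rule.
// This is the flavor subject to the memory cap described below.
if (operation.grid.hasSmartPush("Fin to Rpt"))
    operation.grid.getSmartPush("Fin to Rpt").execute()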

Although they seem like the same function, they execute in different logical areas.  My understanding is that a Data Map should never hit a cap on memory.  A Smart Push does have a cap.  Not only that, the way it was explained to me, there is a hard cap on how much memory Smart Pushes can consume, and it is a global limit, not a limit per Smart Push.  So, the reason you experience inconsistent results with Smart Pushes is simple: the more Smart Pushes executed in a given window, the more of that memory is consumed.  You may never have a problem in Test, or at night, but during UAT or in Prod, successful execution may be intermittent.  When they run only periodically, the limit may never be reached; run multiple times by multiple people in short succession, the limit gets consumed.

This bug only applies to Data Maps.

The Problem

The same Data Map, executed twice, produces two different outcomes.

Failure

Exporting data…
Exported data file(s) size is: 207.1 MB.
Push Data failed. Error: Exported data size of data map that is being executed from groovy is more than permissible amount: 100 MB.

Success

Exported data file(s) size is: 464.7 MB.
EXPORT elapsed time: 39584
IMPORTING – AppName: AreakFin
TRANSFORM elapsed time: 63634
IMPORTING elapsed time: 21166
TOTAL elapsed time: 124553

So, if there is a cap at 100 MB, what gives?  If you have seen this error and wondered why the same Data Map sometimes runs and sometimes fails, it is related to Bug 27161430.

The Fix

Although support was difficult to navigate, I was lucky enough to be at an Oracle session in Virginia and talk to a developer.  He immediately requested the ticket number and said flat out that this is a problem.  I don’t want to name names, so a huge thank you to an unidentified developer at Oracle for giving me a few minutes and helping, because I don’t believe it would have been escalated to the development team otherwise.

The ticket was updated yesterday, and the fix is slated to be released in February. Although this is an internal bug, here are the details.

Bug 27161430 – PBCS: EXPORTED DATA SIZE OF DATA MAP THAT IS BEING EXECUTED FROM GROOVY IS MORE

2 replies
  1. Nandan says:

Hi Kyle, as always your input and guidance help the EPM community. We encountered the same issue with data maps (exceeding the 100 MB limit), were looking for more details, and came across an Oracle knowledge article (Doc ID 2509693.1), published on 09 OCT 2019, saying the Oracle DEV team has no plans to enhance this. Can you please let us know what workaround/solution exists other than reducing the volume of data (<= 100 MB) by splitting the single large data map into a few smaller ones or using an RTP to limit the data?

    • Kyle Goodfriend says:

I know of two options. The first is to iterate through chunks of data: for example, run the map once per product, or per group of products, to stay under the limit. The second option is to run the data map as an admin. This can be done by using Groovy to run the data map through the REST API, with the connection created under an admin ID. A user can run the rule, which executes the data map, not as a Smart Push, under the admin ID. See the sketches below.
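Here is roughly how those two options could look in a Groovy rule. This is a minimal sketch, not Kyle's exact code: the map names and the "AdminConn" connection are hypothetical, the boolean passed to execute() is assumed to be the clear-data flag, and the REST call assumes the standard Planning jobs endpoint with job type PLAN_TYPE_MAP (the same job EPM Automate's runplantypemap command submits). Adjust the path to match how the Connection's URL was saved.

// Option 1 (sketch): split one large data map into several smaller ones in the
// Data Map UI, then run them in sequence so each export stays under the cap.
["Push Products A-F", "Push Products G-M", "Push Products N-Z"].each { mapName ->
    operation.application.getDataMap(mapName).execute(false) // false = do not clear the target first
}

// Option 2 (sketch): submit the data map as a Planning job over REST, through a
// saved Connection ("AdminConn") created with an admin ID. The job then runs as
// a Data Map under the admin credential, avoiding the Smart Push cap.
def response = operation.application.getConnection("AdminConn")
    .post("/rest/v3/applications/AreakFin/jobs") // AreakFin = app name from the log above
    .header("Content-Type", "application/json")
    .body(json(["jobType"    : "PLAN_TYPE_MAP",
                "jobName"    : "Push Products",  // hypothetical data map name
                "parameters" : ["clearData" : "false"]]))
    .asString()
println("Data map job submitted, HTTP status: ${response.status}")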

