Incremental data loads with FDM/ERPi from an Oracle data source

Hello

I am using ERPi 11.1.2.1. In Workspace, it is possible to set a data load rule to run as a snapshot or as an incremental load. However, I use FDM with ERPi to load data from Oracle GL into Essbase. Is it possible to set up FDM so that the data load rules load data incrementally? Could there be a parameter in the ERPi source adapter?

Thanks for any information you could provide.

Yes, in the ERPi source adapter there is a "Data Load Method" option that lets you define how the data load rule is run. By default it is "FULL REFRESH", but it can be changed.

(A) Connect to the application via the Workbench and open the source system adapters.
(B) Right-click the ERPi source adapter and choose "Options".

You will see a Load Method option set to the Full Refresh value; choose the value you want from the drop-down menu and save.

Tags: Business Intelligence

Similar Questions

  • Create a table with all kinds of oracle data types?

    Hello
Who can give me a small example of creating a table with all the Oracle data types, and an example of inserting into it?

    Thank you
    Roy

    Hello

    Read the fine manual. It contains examples at the end of the chapter.

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14200/statements_7002.htm

I don't know whether you are aware that you can also create your own data types using 'CREATE TYPE'. So look for the examples that interest you rather than for every data type.
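    As a starting point, here is a minimal sketch covering the common built-in Oracle data types plus a user-defined type; the table, column and type names are made up for illustration:

    ```sql
    -- Hypothetical table touching the common built-in Oracle data types
    CREATE TABLE type_demo (
      id          NUMBER(10)      PRIMARY KEY,
      name        VARCHAR2(100),
      code        CHAR(3),
      ratio       BINARY_DOUBLE,
      amount      NUMBER(12,2),
      created_on  DATE,
      updated_at  TIMESTAMP(6),
      valid_for   INTERVAL YEAR TO MONTH,
      notes       CLOB,
      photo       BLOB,
      raw_key     RAW(16)
    );

    INSERT INTO type_demo (id, name, code, ratio, amount, created_on,
                           updated_at, valid_for, notes, raw_key)
    VALUES (1, 'Demo row', 'ABC', 0.75, 1234.56, SYSDATE, SYSTIMESTAMP,
            INTERVAL '1-6' YEAR TO MONTH, 'some text', HEXTORAW('FF00'));

    -- A user-defined type, as mentioned above (CREATE TYPE)
    CREATE TYPE address_t AS OBJECT (
      street VARCHAR2(80),
      city   VARCHAR2(40)
    );
    /
    CREATE TABLE customer_demo (
      id   NUMBER,
      addr address_t
    );

    INSERT INTO customer_demo VALUES (2, address_t('1 Main St', 'Springfield'));
    ```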

Regards

    Published by: skvaish1 on February 16, 2010 15:33

Loading parent accounts with FDM ERPI

    Hello

Is it possible to use parent (rollup) accounts as the source in FDM ERPI?

I don't think so, but I want to be sure.

    I use Oracle eBS R12, FDM ERPI 11.1.2 and Hyperion Planning 11.1.2.

I have a GL account hierarchy:
Level 1: parent account
Level 2: account
for example:
Income1
Account A
Account B
Income2
Account C

    I would like to have a map like:

Source: Income1; Target: Member1
Source: Income2; Target: Member2

    Thank you in advance for your help!

    Fanny

    Fanny,

You cannot configure ERPI to provide the parents of the segment values. Mapping each account individually will be the only way to do it. Wildcards can make this easier, but it will still be a fair amount of work.

    Kind regards
    Matt

Loading several months with FDM ERPI

    Hello

I use FDM ERPI to load data from eBS into Planning (11.1.2.1).

Probably a very basic question, but I can see from the 'LOAD' script in the Essbase adapter that it is possible to load several months at a time.

But how can I do that?

    Thanks a lot for your help

    Fanny

I don't think the ERPI-C adapter can process several periods simultaneously. Instead, you can create a simple batch process that loops over the range of periods required.

  • FDM/ERPi 11.1.2.2 - able to load EBS translated currency?

We are currently running Hyperion 11.1.2.1 in our Production environment and have Hyperion 11.1.2.2 loaded in our DEV environment. Looking at the ERPi 11.1.2.2 data load rules, it appears that we could pull the translated currency from Oracle EBS and not only the functional currency. This would be useful in our environment. I looked through the 11.1.2.2 online documentation and it says that ERPi can only pull the functional currency. Is this correct? If so, what is the purpose of the Currency Type field in the ERPi data load rule source filter menu?

    Thank you.

    Terri T.

    Functional currency only

FDM event scripts fired twice during data loads

Here's an interesting one. I added the following script to three different events (one at a time, making sure only one of them is active at any given time) to clear data before loading to Essbase:


Event script content:
' Declare local variables
Dim objShell
Dim strCMD
' Call MaxL script to perform the data clear calculation.
Set objShell = CreateObject("WScript.Shell")
strCMD = "D:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\startMAXL.cmd D:\Test.mxl"
API.DataWindow.Utilities.mShellAndWait strCMD, 0


MaxL script:
login * identified by * on *;
execute calculation 'FIX("Member1","Member2") CLEARDATA "Member3"; ENDFIX' on *.*;
exit;




However, it seems that the clear is performed twice, both before and after the data is loaded to Essbase. This has been verified at every step by checking the Essbase application log:

No event script:
- No Essbase data clear in the application log.

After adding the script to the "BefExportToDat" event:
- The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
- The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

After adding the script to the "AftExportToDat" event:
- The script is executed once when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed). Entries are visible in the Essbase application log.
- The script is then run a second time when you click the OK button in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

After adding the script to the "BefLoad" event:
- The script does not run when you click Export in the FDM Web Client (before the "Target System Load" modal popup is displayed).
- The script is run AFTER the data is loaded to Essbase, when the OK button is clicked in the "Target System Load" modal popup. Entries are visible in the Essbase application log.

Some notes on the above:
1. "BefExportToDat" and "AftExportToDat" are both executed twice, before and after the "Target System Load" modal popup. :-(
2. "BefLoad" is executed AFTER the data is loaded to Essbase. :-( :-(

Can anyone please suggest how we could run a clear of the Essbase database before the data is loaded, rather than after we have loaded the current data? And perhaps why the event scripts above seem to be fired twice? There doesn't seem to be any logic to this!


BefExportToDat - Essbase application log entries:
[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:19:51 2012]Local/Monthly/Monthly/admin@Native Directory/140095859451648/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:20:12 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


AftExportToDat - Essbase application log entries:
[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:32 2012]Local/Monthly/Monthly/admin@Native Directory/140095933069056/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095930963712/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:21:47 2012]Local/Monthly/Monthly/admin@Native Directory/140095928858368/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]


BefLoad - Essbase application log entries:
[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:43 2012]Local/Monthly/Monthly/admin@Native Directory/140095932016384/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]
...

[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003037)
Data Load Updated [98] cells

[Wed May 16 16:23:44 2012]Local/Monthly/Monthly/admin@Native Directory/140095929911040/Info(1003024)
Data Load Elapsed Time : [0.52] seconds
...

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013091)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1013162)
Received Command [Calculate] from user [admin@Native Directory]

[Wed May 16 16:23:45 2012]Local/Monthly/Monthly/admin@Native Directory/140095860504320/Info(1012555)
Clearing data from [Member3] partition with fixed members [Period(Member1); Scenario(Member2)]

James, the Export and Load event scripts will fire four times, once for each type of file: the .DAT file (the main TB file), -A.DAT (log file), -B.DAT and -C.DAT.

To work around this so the clear only runs during the load of the main TB file, add the following (or something similar) at the beginning of your event scripts. This assumes that strFile is in the subroutine's parameter list:

Select Case LCase(Right(strFile, 6))
    Case "-a.dat", "-b.dat", "-c.dat"
        Exit Sub
End Select
    
Oracle Data Loader On Demand on EHA Pod

Oracle Data Loader does not work correctly.
I downloaded it from Staging (EHA Pod)
and did the following:

1. Go to the "config" folder and update 'OracleDataLoaderOnDemand.config':
hosturl = https://secure-ausomxeha.crmondemand.com
2. Go to the "sample" folder and change the Owner_Full_Name in 'account-insert.csv'.

Then run the batch file at the command prompt.
It runs successfully, but the records are not inserted on the EHA Pod; the records exist on the EGA Pod.
Here is the log.
Does Data Loader only target the EGA Pod? Could you please give me some advice?


[2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Execution started.
[2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Configuration list loaded: {sessionkeepchkinterval=300, maxthreadfailure=1, testmode=production, logintimeoutms=180000, csvblocksize=1000, maxsoapsize=10240, impstatchkinterval=30, numofthreads=1, hosturl=https://secure-ausomxeha.crmondemand.com, maxloginattempts=1, routingurl=https://sso.crmondemand.com, manifestfiledir=.\Manifest\}
[2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Full option list loaded: {datafilepath=sample/account-insert.csv, waitforcompletion=False, clientlogfiledir=, datetimeformat=usa, operation=insert, username=XXXX/XXXX, help=False, disableimportaudit=False, clientloglevel=detailed, mapfilepath=sample/account.map, duplicatecheckoption=externalid, csvdelimiter=,, importloglevel=errors, recordtype=account}
[2012-09-19 14:49:55,296] DEBUG - BulkOpsClientUtil.getPassword() [main]: Entering.
[2012-09-19 14:49:59,828] DEBUG - BulkOpsClientUtil.getPassword() [main]: Exiting.
[2012-09-19 14:49:59,828] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Entering.
[2012-09-19 14:49:59,937] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Host lookup request to be sent: https://sso.crmondemand.com/router/GetTarget
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Lookup returned: <?xml version="1.0" encoding="UTF-8"?>
<HostUrl>https://secure-ausomxega.crmondemand.com</HostUrl>
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Successfully extracted the host URL: https://secure-ausomxega.crmondemand.com
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Exiting.
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Entering.
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL from the routing app = https://secure-ausomxega.crmondemand.com
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL from the config = https://secure-ausomxeha.crmondemand.com
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Updated the config file: .\config\OracleDataLoaderOnDemand.config
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL set to https://secure-ausomxega.crmondemand.com
[2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Exiting.
[2012-09-19 14:50:03,953] INFO - [main] Attempting to log in...
[2012-09-19 14:50:10,171] INFO - [main] Successfully logged in as: XXXX/XXXX
[2012-09-19 14:50:10,171] DEBUG - BulkOpsClient.doImport() [main]: Execution started.
[2012-09-19 14:50:10,171] INFO - [main] Submitting Oracle Data Loader On Demand import validation request...
[2012-09-19 14:50:10,171] DEBUG - FieldMappingManager.parseMappings() [main]: Execution started.
[2012-09-19 14:50:10,171] DEBUG - FieldMappingManager.parseMappings() [main]: Execution completed.
[2012-09-19 14:50:11,328] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Submitting BulkOpImportGetRequestDetail WS call
[2012-09-19 14:50:11,328] INFO - [main] A SOAP request was sent to the server to create the import request.
[2012-09-19 14:50:13,640] DEBUG - SOAPImpRequestManager.sendImportGetRequestDetail() [Thread-3]: SOAP request sent successfully and a response was received
[2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: BulkOpImportGetRequestDetail WS call ended
[2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: SOAP response status code = OK
[2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Going to sleep for 300 seconds.
[2012-09-19 14:50:20,328] INFO - [main] A response to the SOAP request to create the import request on the server has been received.
[2012-09-19 14:50:20,328] DEBUG - SOAPImpRequestManager.sendImportCreateRequest() [main]: SOAP request sent successfully and a response was received
[2012-09-19 14:50:20,328] INFO - [main] Oracle Data Loader On Demand import validation PASSED.
[2012-09-19 14:50:20,328] DEBUG - BulkOpsClient.sendValidationRequest() [main]: Execution completed.
[2012-09-19 14:50:20,343] DEBUG - ManifestManager.initManifest() [main]: Creating manifest directory: .\\Manifest\\
[2012-09-19 14:50:20,343] DEBUG - BulkOpsClient.submitImportRequest() [main]: Execution started.
[2012-09-19 14:50:20,390] DEBUG - BulkOpsClient.submitImportRequest() [main]: Sending CSV data segments.
[2012-09-19 14:50:20,390] DEBUG - CSVDataSender.CSVDataSender() [main]: CSVDataSender will use 1 thread.
[2012-09-19 14:50:20,390] INFO - [main] Submitting Oracle Data Loader On Demand import request with the following request Id: AEGA-FX28VK...
[2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: Creating thread 0
[2012-09-19 14:50:20,390] INFO - [main] Import Request Submission Status: Started
[2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: Starting thread 0
[2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: There are pending requests. Going to sleep.
[2012-09-19 14:50:20,406] DEBUG - CSVDataSenderThread.run() [Thread-5]: Thread 0 submitting CSV data segment: 1 of 1
[2012-09-19 14:50:24,328] INFO - [Thread-5] A response to the import data SOAP request sent to the server has been received.
[2012-09-19 14:50:24,328] DEBUG - SOAPImpRequestManager.sendImportDataRequest() [Thread-5]: SOAP request sent successfully and a response was received
[2012-09-19 14:50:24,328] INFO - [Thread-5] A SOAP request containing the import data was sent to the server: 1 of 1
[2012-09-19 14:50:24,328] DEBUG - CSVDataSenderThread.run() [Thread-5]: There are no more requests waiting to be picked up by Thread 0.
[2012-09-19 14:50:24,328] DEBUG - CSVDataSenderThread.run() [Thread-5]: Thread 0 finishing now.
[2012-09-19 14:50:25,546] INFO - [main] Import Request Submission Status: 100.00%
[2012-09-19 14:50:26,546] INFO - [main] Oracle Data Loader On Demand import submission completed successfully.
[2012-09-19 14:50:26,546] DEBUG - BulkOpsClient.submitImportRequest() [main]: Execution completed.
[2012-09-19 14:50:26,546] DEBUG - BulkOpsClient.doImport() [main]: Execution completed.
[2012-09-19 14:50:26,546] INFO - [main] Attempting to log off...
[2012-09-19 14:50:31,390] INFO - [main] XXXX/XXXX is now logged off.
[2012-09-19 14:50:31,390] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Interrupted.
[2012-09-19 14:50:31,390] DEBUG - BulkOpsClient.main() [main]: Execution completed.

    Hello

The Data Loader points to the production environment by default, regardless of whether you downloaded it from staging or production.
To change the pod, edit the configuration file and set the content below:

    hosturl = https://secure-ausomxeha.crmondemand.com
    routingurl = https://secure-ausomxeha.crmondemand.com
    testmode = debug

Do we need to re-create data load rules if we move from EBS 11 to 12?

    If so, please explain why. Thank you.

If you switch from EBS 11 to EBS 12, you need to create a new source system entry in ERPi.

    Once the new source system is created, you then need to initialize the source system in ERPi.

From there, you need to associate it with an import format and locations in ERPi, and your data load rules are then based on the location.

SSRS outer join failure with an Oracle data source

There seems to be a problem with the Oracle driver used in the Reporting Services query designer.

When using an Oracle data source, if I create an outer join in the graphical designer, it automatically inserts '{OJ' before the join and '}' after it. This is invalid syntax for Oracle, and the query refuses to run. The curly braces and the OJ are editable in the text designer, but if I go back to the graphical designer it immediately reinserts them.

This only started happening a year or two ago; before that it worked, but with the old (+) syntax.

Can it be fixed?  It makes things very difficult.

    -Geoff

    Hi Geoff,

    Thanks for posting in the Microsoft Community.

However, the question you posted would be better suited to the Oracle Support Forums; we recommend that you post your query there to get help:

    https://forums.Oracle.com/forums/main.jspa;JSESSIONID=8d92100c30d8fb401bcbd10b46c38c9ddf1a3242549a.e34SbxmSbNyKai0Lc3mPbhmSc3aNe0? CategoryID = 84

If you have any other questions or need help with Windows, do not hesitate to post your questions and we will be happy to help you.

  • Find the Oracle data source...

I inherited a ColdFusion 10 app with an Oracle 11g backend on Windows Server. I am primarily a DBA and not a ColdFusion expert. In the application code, they have hardcoded the data source. In other words, when I move to Test I have to change the data source, load the modules and test. After a successful test, I need to change the data source again, upload, and promote to production. I seem to remember that in an old application I used to support, the code was generic and the data source depended on which server you were on (development, test or production), so you didn't have to worry about it. I don't remember how it was done. What would be the best way to eliminate this hardcoding and make it more automated? I know I'm probably missing something, but you have all been very helpful to me over the last few weeks, so I hope this is not a stupid question. Thank you.

I do not have CF Admin

Can you clarify that?  Do you mean that you do not have access to the CF Admin?

Without access to the CF Admin, I'm not sure you can do anything differently from what you are doing now, unless you write logic to examine the server host name or the site's domain name and set the data source accordingly.

If you can access the CF Admin, create two data sources: one for production and one for testing.  Then, in Application.cfc, you can write logic to examine the server host name or the domain name and point 'this.datasource' to whichever data source is appropriate.

    -Carl V.

  • Using Oracle Partition Exchange with Oracle Data Integrator (ODI) 11g

    Hello

    I'm trying to follow http://www.ateam-oracle.com/configuring-oracle-data-integrator-odi-with-oracle-partition-exchange/

But I cannot find the same options in 11g.

Can I use Oracle partition exchange with ODI 11g?

    Thank you.

Yes, partition exchange is certainly possible in ODI 11g.
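    For context, the operation ODI drives here is plain Oracle DDL. A minimal sketch of what a partition exchange does (the table and partition names are hypothetical):

    ```sql
    -- Swap the contents of staging table SALES_STAGE with partition P_2014
    -- of the partitioned target table SALES. The exchange is a data
    -- dictionary operation, so no rows are physically moved.
    ALTER TABLE sales
      EXCHANGE PARTITION p_2014
      WITH TABLE sales_stage
      INCLUDING INDEXES
      WITHOUT VALIDATION;
    ```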

Can anyone tell me about Oracle Data Governance Manager? How do I work with it?

Hi all,

We plan to implement Oracle Master Data Management / Oracle Customer Hub. Please read the following link and tell me how to work with Oracle Data Governance Manager.

    Where can I find it?

    What are the steps in the implementation?

    http://www.Oracle.com/us/products/applications/master-data-management/Oracle-customer-hub-439838.PDF

    Hello Nandini,

The DGM software build can be downloaded from My Oracle Support via the "Patches and updates" tab under patch name/number 9329831 (currently password protected; please log a service request and ask for a password).

Installing DGM: details on how to install DGM are documented in the Siebel Maintenance Release Guide Version 8.1.1.x, Rev. O, section: Installing Data Governance Manager (DGM).

Steps to Customize the Data Governance Manager (DGM) Application (Doc ID 1323952.1)
This document provides steps to customize the DGM application, making it easier to change the skin via CSS stylesheet changes in the OOTB DGM application.

Reference: Oracle Master Data Management - Customer Hub (Siebel UCM) Information Center - Installation and Lifecycle (Doc ID 1081980.1).

    Thank you

    Shilpi

Schema export through Oracle Data Pump with Database Vault enabled

    Hello

I installed and configured Database Vault on Oracle 11gR2 (11.2.0.3) to protect a specific schema (SCHEMA_NAME) via a realm. I followed the following doc:
http://www.Oracle.com/technetwork/database/security/TWP-databasevault-DBA-BestPractices-199882.PDF
to ensure that the SYS and SYSTEM users have sufficient rights to complete an Oracle Data Pump schema export operation.

That is, I granted sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user ('system', 'SCHEMA_NAME');

    execute dvsys.dbms_macadm.authorize_datapump_user ('sys', 'SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user ('system', 'SCHEMA_NAME');

I also created a second realm on the same schema (SCHEMA_NAME) to allow SYS and SYSTEM to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition and Indextype, and SYS and SYSTEM were authorized as OWNER of this realm.
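    For reference, the second realm described above can be scripted along these lines. This is a minimal sketch, assuming the documented DBMS_MACADM/DBMS_MACUTL packages; the realm name and option constants shown are illustrative, and it would be run as a Database Vault owner (DV_OWNER) account:

    ```sql
    BEGIN
      -- Hypothetical realm name; options per the DBMS_MACADM documentation
      dvsys.dbms_macadm.create_realm(
        realm_name    => 'SCHEMA_NAME Index Realm',
        description   => 'Allows SYS/SYSTEM to manage indexes on protected tables',
        enabled       => dvsys.dbms_macutl.g_yes,
        audit_options => dvsys.dbms_macutl.g_realm_audit_fail);

      -- Protect every index object type in the schema
      dvsys.dbms_macadm.add_object_to_realm(
        realm_name   => 'SCHEMA_NAME Index Realm',
        object_owner => 'SCHEMA_NAME', object_name => '%', object_type => 'INDEX');
      dvsys.dbms_macadm.add_object_to_realm(
        realm_name   => 'SCHEMA_NAME Index Realm',
        object_owner => 'SCHEMA_NAME', object_name => '%', object_type => 'INDEX PARTITION');
      dvsys.dbms_macadm.add_object_to_realm(
        realm_name   => 'SCHEMA_NAME Index Realm',
        object_owner => 'SCHEMA_NAME', object_name => '%', object_type => 'INDEXTYPE');

      -- Authorize SYS and SYSTEM as realm owners
      dvsys.dbms_macadm.add_auth_to_realm(
        realm_name => 'SCHEMA_NAME Index Realm', grantee => 'SYS',
        rule_set_name => NULL, auth_options => dvsys.dbms_macutl.g_realm_auth_owner);
      dvsys.dbms_macadm.add_auth_to_realm(
        realm_name => 'SCHEMA_NAME Index Realm', grantee => 'SYSTEM',
        rule_set_name => NULL, auth_options => dvsys.dbms_macutl.g_realm_auth_owner);
    END;
    /
    ```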

However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line appears in the export log:

Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081

The export completes, but with these errors.

Any help, pointers, suggestions, etc. would be very welcome at this stage.

    Thank you

I moved the thread to the "Database - Security" forum. If the document does not help, please open an SR with Support.

    HTH
    Srini

  • Oracle Data Masking Pack with Oracle Standard Edition

    Hello
I found the Oracle Data Masking Pack on the Oracle web site:

    https://shop.Oracle.com/pls/ostore/f?p=dstore:product:3439101311204935:no:RP, 6:P6_LPI, P6_PROD_HIER_ID:4509221213031805719914, 114180807059101824910944 & tz = - 5:00

I would like to know if the Oracle Data Masking Pack can work with Oracle Standard Edition, which is the license we have at my office. We don't plan to upgrade to Oracle Enterprise Edition.

I know that Oracle Enterprise Edition has the data masking functionality; Standard Edition does not.

I am asking this question here because I chatted with a sales agent on Oracle's web site, but they did not answer my question.

    Published by: user521219 on November 17, 2011 11:20

Yes, it's an add-on, but only for Enterprise Edition.

Read this document to see which features are standard and which are optional:

    http://www.Oracle.com/us/products/database/039449.PDF

Filling an Excel template (with pre-defined formulas) with Oracle data

    Hello

Does anyone know how I can fill an Excel template (with pre-defined formulas) with Oracle data using PL/SQL? I mean that the user provides the Excel template, so I just transfer the data from Oracle into particular cells in the worksheet. The template can be large and can grow to 50 pages.

    Thank you.
    Andy

So you want an Oracle process or an APEX process to take an existing Excel file and simply plug data into it? What I would suggest instead is that you export your data to a file and write Excel VBA code to take your input file and process it accordingly.

    Thank you

    Tony Miller
    Webster, TX

    You can get more with a kind word and a legacy, you can with just a kind word
