DBMS_DATAPUMP.METADATA_REMAP

What am I missing here? I was following Mr. [Christopher Poole | http://www.chrispoole.co.uk/tips/dbatip4.htm] while trying to use the METADATA_REMAP procedure of DATAPUMP... but I can't make it work. Here is my code... my proc succeeds, however it attempts to import into the schema of origin instead of remapping it, so I get a ton of errors saying "this schema already exists...".

What am I missing?




DECLARE
  /* IMPORT/EXPORT VARIABLES */
  v_dp_job_handle        NUMBER;          -- Data Pump job handle
  v_count                NUMBER;          -- Loop index
  v_percent_done         NUMBER;          -- Percentage of job complete
  v_job_state            VARCHAR2(30);    -- To keep track of job state
  v_message              ku$_LogEntry;    -- For WIP and error messages
  v_job_status           ku$_JobStatus;   -- Job status from get_status
  v_status               ku$_Status;      -- Status object returned by get_status
  t_date                 VARCHAR2(13);
  v_schema               VARCHAR2(25);
  v_new_schema           VARCHAR2(25);
  v_source_database_name VARCHAR2(50);

BEGIN
  v_schema := 'TEST';
  t_date := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI');
  v_source_database_name := 'TEST_DB';
  v_new_schema := 'TEST_NEW';

  /* OPEN THE DATA PUMP JOB */
  v_dp_job_handle := DBMS_DATAPUMP.OPEN(
    operation   => 'IMPORT',
    job_mode    => 'SCHEMA',
    remote_link => v_source_database_name,
    job_name    => v_schema || '_REMAP_' || t_date,
    version     => 'LATEST');

  /* ADD THE LOG FILE NAME TO THE JOB */
  DBMS_DATAPUMP.ADD_FILE(
    handle    => v_dp_job_handle,
    filename  => v_schema || '_REMAP_' || t_date || '.LOG',
    directory => 'DATAPUMP',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  /* RESTRICT THE JOB TO THE SOURCE SCHEMA */
  DBMS_DATAPUMP.METADATA_FILTER(
    handle => v_dp_job_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''' || v_schema || '''');

  /* REMAP THE ORIGINAL SCHEMA TO THE NEW SCHEMA */
  DBMS_DATAPUMP.METADATA_REMAP(
    handle    => v_dp_job_handle,
    name      => 'REMAP_SCHEMA',
    old_value => '= ''' || v_schema || '''',
    value     => '= ''' || v_new_schema || '''');

  /* START THE JOB */
  DBMS_DATAPUMP.START_JOB(v_dp_job_handle);

  /* MONITOR THE JOB */
  v_percent_done := 0;
  v_job_state := 'UNDEFINED';

  WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
  LOOP
    DBMS_DATAPUMP.GET_STATUS(
      v_dp_job_handle,
      DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
      -1,
      v_job_state,
      v_status);

    v_job_status := v_status.job_status;

    /* IF THE PERCENTAGE CHANGED, DISPLAY THE NEW VALUE */
    IF v_job_status.percent_done != v_percent_done THEN
      DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || TO_CHAR(v_job_status.percent_done));
      v_percent_done := v_job_status.percent_done;
    END IF;

    /* IF ANY WORK-IN-PROGRESS (WIP) OR ERROR MESSAGES WERE RECEIVED FOR THE JOB, DISPLAY THEM */
    IF BITAND(v_status.mask, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
      v_message := v_status.wip;
    ELSIF BITAND(v_status.mask, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
      v_message := v_status.error;
    ELSE
      v_message := NULL;
    END IF;

    IF v_message IS NOT NULL THEN
      v_count := v_message.FIRST;
      WHILE v_count IS NOT NULL
      LOOP
        DBMS_OUTPUT.PUT_LINE(v_message(v_count).LogText);
        v_count := v_message.NEXT(v_count);
      END LOOP;
    END IF;
  END LOOP;

  -- Indicate that the job finished and detach.
  DBMS_OUTPUT.PUT_LINE('Job has completed');
  DBMS_OUTPUT.PUT_LINE('Final job state = ' || v_job_state);

  /* DETACH FROM THE JOB */
  DBMS_DATAPUMP.DETACH(v_dp_job_handle);
END;

A simple guess... not tested... why not change it to:

DBMS_DATAPUMP.METADATA_REMAP (
HANDLE => v_dp_job_handle,
NAME => 'REMAP_SCHEMA',
OLD_VALUE => v_schema,
VALUE => v_new_schema) ;
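For anyone who hits the same wall: METADATA_FILTER values are SQL expression fragments, while METADATA_REMAP's old_value and value take bare object names. A quick sketch in Python (illustrative only, not Oracle code) of what each concatenation should produce:

```python
# Illustrative sketch of the string each PL/SQL concatenation evaluates to.
# METADATA_FILTER expects an expression fragment like:  = 'TEST'
# METADATA_REMAP expects the bare schema name like:     TEST

def filter_expr(schema):
    # PL/SQL equivalent: '= ''' || v_schema || ''''
    return "= '" + schema + "'"

def remap_value(schema):
    # METADATA_REMAP wants the name itself: no operator, no quotes
    return schema

print(filter_expr("TEST"))  # value for SCHEMA_EXPR: = 'TEST'
print(remap_value("TEST"))  # old_value for REMAP_SCHEMA: TEST
```

Passing the expression form to METADATA_REMAP means the remap never matches the schema name, which is why the job falls back to importing into the original schema.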

Tags: Database

Similar Questions

  • DBMS_DATAPUMP.metadata_remap - a REMAP_TABLE question.

    I want to export the SCOTT.DEPT table and import it as HR.DEPT_HR with its constraint. I'm using DBMS_DATAPUMP.METADATA_REMAP with REMAP_SCHEMA (to change the schema name) and REMAP_TABLE (to change the table name). I don't know where I am going wrong. It seems that REMAP_SCHEMA changes all the schema names successfully, but REMAP_TABLE does not change the name of the table in the constraint definition, so the constraint is not created. Is there a workaround? Here is a small proof of concept:

    SQL> --My database version.
    SQL> ----------------------
    SQL> SELECT * FROM v$version;
    
    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    CORE    11.2.0.1.0      Production
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    
    SQL> SET SERVEROUT ON
    SQL> ed
    Wrote file afiedt.buf
    
      1  DECLARE
      2  l_data_pump_handle    NUMBER;
      3  l_logfilename    VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.log';
      4  l_expfilename    VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.dmp';
      5  BEGIN
      6  l_data_pump_handle:= DBMS_DATAPUMP.OPEN(operation   => 'EXPORT',
      7                        job_mode    => 'TABLE',
      8                        remote_link => NULL,
      9                        job_name    => 'TEST_REMAP_DP',
    10                        version     => 'LATEST');
    11   DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_handle,
    12                     filename    => l_expfilename,
    13                     directory => 'SAUBHIK',
    14                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    15  DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_handle,
    16                     filename    => l_logfilename,
    17                     directory => 'SAUBHIK',
    18                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    19   DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    20                        name   => 'SCHEMA_EXPR',
    21                        value  =>'= '||''''||'SCOTT'||'''');
    22   DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    23                        name   => 'NAME_EXPR',
    24                        value  =>'= '||''''||'DEPT'||'''');
    25  --We don't need index
    26    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    27                        name   => 'EXCLUDE_PATH_EXPR',
    28                        value  =>'=''INDEX''');
    29  -- We don't copy table statistics!!
    30    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    31                        name   => 'EXCLUDE_PATH_EXPR',
    32                        value  =>'=''STATISTICS''');
    33   -- We don't copy index statistics either!!
    34    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    35                        name   => 'EXCLUDE_PATH_EXPR',
    36                        value  =>'=''INDEX_STATISTICS''');
    37    -- We do not need the data!!
    38    DBMS_DATAPUMP.DATA_FILTER(
    39     handle => l_data_pump_handle,
    40     name => 'INCLUDE_ROWS',
    41     value =>0
    42     );
    43  -- Start the export now.
    44       DBMS_DATAPUMP.start_job(l_data_pump_handle);
    45       dbms_output.put_line('Export started....');
    46   -- Detach, it's finish!
    47      DBMS_DATAPUMP.detach(l_data_pump_handle);
    48      dbms_output.put_line('Export ended....');
    49  EXCEPTION
    50       WHEN OTHERS THEN
    51        dbms_datapump.stop_job(l_data_pump_handle);
    52        RAISE;
    53*  END;
    54  /
    Export started....
    Export ended....
    
    PL/SQL procedure successfully completed.
    
    SQL> SELECT * FROM user_datapump_jobs;
    
    no rows selected
    
    
    
    

    Now, I'm importing that:

    SQL> ed
    Wrote file afiedt.buf
    
      1  --DWABI_28052015143133.dmp
      2  DECLARE
      3  l_data_pump_imp_handle NUMBER;
      4  l_logfilename  VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.log';
      5  ind       NUMBER;        -- loop index
      6   pct_done  NUMBER;        -- percentage complete
      7   job_state VARCHAR2(30);  -- track job state
      8   le        ku$_LogEntry;  -- WIP and error messages
      9   js        ku$_JobStatus; -- job status from get_status
    10   jd        ku$_JobDesc;   -- job description from get_status
    11   sts       ku$_Status;    -- status object returned by get_status
    12  BEGIN
    13  l_data_pump_imp_handle:= DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
    14                        job_mode    => 'FULL',
    15                        remote_link => NULL,
    16                        job_name    => 'TEST',
    17                        version     => 'LATEST');
    18   DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_imp_handle,
    19                     filename    => 'DWABI_28052015143133.dmp',
    20                     directory => 'SAUBHIK',
    21                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    22  DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_imp_handle,
    23                     filename    => l_logfilename,
    24                     directory => 'SAUBHIK',
    25                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    26   --If table is already there then do not import.
    27   dbms_datapump.set_parameter(handle => l_data_pump_imp_handle,
    28                              name => 'TABLE_EXISTS_ACTION',
    29                              value =>'SKIP');
    30    -- We need to remap the schema!!.
    31    dbms_output.put_line('Changing Schema...');
    32    DBMS_DATAPUMP.METADATA_REMAP (
    33     handle => l_data_pump_imp_handle,
    34     name => 'REMAP_SCHEMA',
    35     old_value => 'SCOTT',
    36     value=>'HR'
    37     );
    38    -- We need to remap the table!!. This is not working properly.
    39    dbms_output.put_line('Changing Table...');
    40    DBMS_DATAPUMP.METADATA_REMAP (
    41     handle => l_data_pump_imp_handle,
    42     name => 'REMAP_TABLE',
    43     old_value => 'DEPT',
    44     value=>'DEPT_HR',
    45     object_type => NULL
    46     );
    47   -- Start the import now.
    48       DBMS_DATAPUMP.start_job(l_data_pump_imp_handle);
    49    -- monitor job
    50    pct_done := 0;
    51    job_state := 'UNDEFINED';
    52    WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
    53      dbms_datapump.get_status(l_data_pump_imp_handle, dbms_datapump.ku$_status_job_error +
    54      dbms_datapump.ku$_status_job_status +
    55      dbms_datapump.ku$_status_wip, -1, job_state, sts);
    56      js := sts.job_status;
    57      -- If the percentage done changed, display the new value
    58      IF js.percent_done != pct_done THEN
    59        dbms_output.put_line('*** Job percent done = ' ||
    60        to_char(js.percent_done));
    61        pct_done := js.percent_done;
    62      END IF;
    63      -- If any work-in-progress (WIP) or error messages
    64      -- were received for the job, display them.
    65      IF (BITAND(sts.mask,dbms_datapump.ku$_status_wip) != 0) THEN
    66        le := sts.wip;
    67      ELSE
    68        IF (BITAND(sts.mask,dbms_datapump.ku$_status_job_error) != 0) THEN
    69          le := sts.error;
    70        ELSE
    71          le := NULL;
    72        END IF;
    73      END IF;
    74      IF le IS NOT NULL THEN
    75        ind := le.FIRST;
    76        WHILE ind IS NOT NULL LOOP
    77          dbms_output.put_line(le(ind).LogText);
    78          ind := le.NEXT(ind);
    79        END LOOP;
    80      END IF;
    81      --DBMS_LOCK.sleep (10);
    82    END LOOP;
    83    -- Indicate that the job finished and detach from it.
    84    dbms_output.put_line('Job has completed');
    85   -- Detach, it's finish!
    86       DBMS_DATAPUMP.detach(l_data_pump_imp_handle);
    87  EXCEPTION
    88       WHEN OTHERS THEN
    89        dbms_datapump.stop_job(l_data_pump_imp_handle);
    90        RAISE;
    91* END;
    SQL> /
    Changing Schema...
    Changing Table...
    Master table "SYS"."TEST" successfully loaded/unloaded
    Starting "SYS"."TEST":
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39083: Object type CONSTRAINT failed to create with error:
    ORA-00942: table
    or view does not exist
    Failing sql is:
    ALTER TABLE "HR"."DEPT" ADD CONSTRAINT
    "PK_DEPT" PRIMARY KEY ("DEPTNO") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE
    DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "USERS"  ENABLE
    
    Job "SYS"."TEST" completed with 1 error(s) at 15:04:02
    Job has completed
    
    PL/SQL procedure successfully completed.
    
    SQL>
    
    
    
    

    If you look at the failing sql, it is clear that the table name in the constraint definition has not been changed, but the DEPT_HR table is created in the HR schema without the constraint. What's wrong here?

    Post edited by: Massimiliano. Edited the subject line, because it said DBMS_METADATA; changed to DBMS_DATAPUMP.

    Hello

    This is a bug in 11.2.0.1 I think - please see this ref:

    Oracle Support Document 1609238.1 (REMAP_TABLE on IMPDP FAILS WITH ORA-942) can be found at: https://support.oracle.com/epmos/faces/DocumentDisplay?id=1609238.1

    Cheers,

    Rich

  • dbms_datapump import dblink ORA-39165

    Hi all

    I'm trying to import a table over a dblink from schema A to schema B, but I get ORA-39165: schema ('A') was not found, and ORA-39166: object ('ACCOUNT') was not found. When importing from B to B it works, but oddly it still logs an ORA-39166. User B has imp_full_database permission on both databases:

    declare
      JobHandle number;
      js varchar2(9); -- COMPLETED or STOPPED
      q varchar2(1) := chr(39);

    BEGIN /* open a new table level import job using a default DB link */
      JobHandle := dbms_datapump.open (operation => 'IMPORT',
                                       job_mode => 'TABLE',
                                       remote_link => 'RMT');

      dbms_datapump.add_file (JobHandle,
                              filename => 'mylog',
                              directory => 'DATA_PUMP_DIR',
                              filetype => dbms_datapump.ku$_file_type_log_file);

      dbms_datapump.metadata_remap (handle => JobHandle,
                                    name => 'REMAP_SCHEMA',
                                    old_value => 'A',
                                    value => 'B');

      dbms_datapump.metadata_filter (handle => JobHandle,
                                     name => 'SCHEMA_EXPR',
                                     value => 'IN (''A'')',
                                     object_type => 'TABLE');

      dbms_datapump.metadata_filter (handle => JobHandle,
                                     name => 'NAME_LIST',
                                     value => '(''ACCOUNT'')',
                                     object_type => 'TABLE');

      dbms_datapump.set_parameter (JobHandle,
                                   'TABLE_EXISTS_ACTION',
                                   'REPLACE');

      dbms_datapump.start_job (JobHandle);

      dbms_datapump.wait_for_job (JobHandle, js);

    end;

    For the life of me I can't figure out how to tell it that the source table is in schema A. Any help would be greatly appreciated.

    Thank you very much

    Ok. Then you should import in SCHEMA job_mode and filter the schemas and tables using metadata filters with SCHEMA_EXPR and INCLUDE_PATH_EXPR.

    Using the code you provided, here it is changed:

    declare
      JobHandle number;
      js varchar2(9); -- COMPLETED or STOPPED
      q varchar2(1) := chr(39); 
    
    BEGIN /* open a new schema level import job using a default DB link */
      JobHandle := dbms_datapump.open (operation=>'IMPORT', job_mode=>'SCHEMA', remote_link=>'RMT'); 
    
      dbms_datapump.add_file (JobHandle, filename => 'mylog', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
      ---
      DBMS_DATAPUMP.metadata_filter (handle=> JobHandle, name=> 'SCHEMA_EXPR',VALUE=> 'IN(''A'')');
      dbms_datapump.metadata_filter (handle => JobHandle,name => 'INCLUDE_PATH_EXPR',value => 'IN (''TABLE'')');
      --
      dbms_datapump.metadata_remap ( handle=>JobHandle,name=> 'REMAP_SCHEMA',old_value=> 'A',value=> 'B');
      dbms_datapump.metadata_filter (handle =>JobHandle, name =>'NAME_LIST', value =>'(''ACCOUNT'')',object_type => 'TABLE');
      dbms_datapump.set_parameter ( JobHandle,'TABLE_EXISTS_ACTION','REPLACE' );
    
      dbms_datapump.start_job( JobHandle);
      dbms_datapump.wait_for_job( JobHandle, js);
    end;
    /
    

    I tested it with 10.2.0.3 on both sites, and it worked:

    -- mylog.log content:
    
    Starting "B"."SYS_IMPORT_SCHEMA_03":
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "B"."ACCOUNT"                                    1 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01749: you may not GRANT/REVOKE privileges to/from yourself
    Failing sql is:
    GRANT SELECT ON "B"."ACCOUNT" TO "B"
    ORA-39166: Object ('ACCOUNT') was not found.
    Job "B"."SYS_IMPORT_SCHEMA_03" completed with 2 error(s) at 17:35:40
    

    And the table was created at the destination site:

    B@local> select * from account;
             N
    ----------
             1
    

    If B does not have a direct grant on A.ACCOUNT on the remote site, then you will see this output, but the table is imported anyway:

    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "B"."ACCOUNT"                                    1 rows
    ORA-39166: Object ('ACCOUNT') was not found.
    

    Kind regards.
    Nelson
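A side note on the quoting in the calls above: inside a PL/SQL string literal, a doubled single quote stands for one quote character. A small Python sketch (illustrative only, not Oracle code) of how Oracle reads those literals:

```python
# Illustrative sketch of PL/SQL quote doubling: inside a PL/SQL string
# literal, '' stands for a single quote, so the literal body IN(''A'')
# is read by Oracle as the expression IN('A').

def unescape_plsql(literal_body):
    # simulate how Oracle reads the body of a PL/SQL string literal
    return literal_body.replace("''", "'")

print(unescape_plsql("IN(''A'')"))       # the SCHEMA_EXPR value: IN('A')
print(unescape_plsql("(''ACCOUNT'')"))   # the NAME_LIST value: ('ACCOUNT')
```

Getting this escaping wrong is one of the most common reasons a metadata filter silently matches nothing.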

  • Datapump jobs inside the SQL worksheet...

    Hi all

    Running the following code inside the worksheet just spits out 'anonymous block completed'.

    *****************************

    set scan off
    set serveroutput on
    set escape off
    whenever sqlerror exit

    DECLARE
      h1 NUMBER;
      errorvarchar VARCHAR2(100) := 'ERROR';
      tryGetStatus NUMBER := 0;
    BEGIN
      h1 := dbms_datapump.open (operation => 'IMPORT', job_mode => 'FULL', job_name => 'IMPORT_JOB_SQLDEV_73', version => 'COMPATIBLE');
      tryGetStatus := 1;
      dbms_datapump.set_parallel (handle => h1, degree => 1);
      dbms_datapump.add_file (handle => h1, filename => 'IMPORTtest2.LOG', directory => 'DATA_PUMP_DIR', filetype => 3);
      dbms_datapump.set_parameter (handle => h1, name => 'KEEP_MASTER', value => 0);
      dbms_datapump.add_file (handle => h1, filename => 'ORAEU_COPIED.DMP', directory => 'DATA_PUMP_DIR', filetype => 1);
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\APT_SYS_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/APT_SYS_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\APT_SYS_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/APT_SYS_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_MART_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_MART_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_MART_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_MART_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_STORE_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_STORE_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_STORE_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_STORE_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\RNET2_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/RNET2_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\RNET2_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/RNET2_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_AMA_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_AMA_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATA_AMA_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATA_AMA_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATASERVICES_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATASERVICES_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\DATASERVICES_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/DATASERVICES_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\ARC_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/ARC_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\MCSS_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/MCSS_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\MCSS_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/MCSS_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\IPAS_DATA01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/IPAS_DATA01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\IPAS_IDX01.DBF'), value => UPPER('/rdsdbdata/db/ORAEU_A/file/IPAS_IDX01.DBF'));
      dbms_datapump.metadata_remap (handle => h1, name => 'REMAP_DATAFILE', old_value => UPPER('C:\ORACLE\ORADATA\ORCL\PFIZERCMS_DATA01.DBF'), value =>
      dbms_datapump.set_parameter (handle => h1, name => 'INCLUDE_METADATA', value => 1);
      dbms_datapump.set_parameter (handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
      dbms_datapump.set_parameter (handle => h1, name => 'REUSE_DATAFILES', value => 0);
      dbms_datapump.set_parameter (handle => h1, name => 'TABLE_EXISTS_ACTION', value => 'REPLACE');
      dbms_datapump.set_parameter (handle => h1, name => 'SKIP_UNUSABLE_INDEXES', value => 0);
      dbms_datapump.start_job (handle => h1, skip_current => 0, abort_step => 0);
      dbms_datapump.detach (handle => h1);
      errorvarchar := 'NO_ERROR';
    EXCEPTION
      WHEN OTHERS THEN
        BEGIN
          IF ((errorvarchar = ' ERROR') AND (tryGetStatus = 1)) THEN
            DBMS_DATAPUMP.DETACH(h1);
          END IF;
        EXCEPTION
          WHEN OTHERS THEN
            NULL;
        END;
        RAISE;
    END;

    /

    *****************

    All input is appreciated.

    Thank you

    Please let me know if you see a problem with the code.

    What does your question have to do with SQL Developer?

    Unless it is about connecting, you should mark this thread ANSWERED and repost it in the SQL and PL/SQL forum.

    the Japan Government says:

    Hi Jeff

    I expected it to at least run the import.

    I get this error now on line 66:

    EXCEPTION
      WHEN OTHERS THEN
        BEGIN
          IF ((errorvarchar = ' ERROR') AND (tryGetStatus = 1)) THEN
            DBMS_DATAPUMP.DETACH(h1);
          END IF;
        EXCEPTION   ---> line 66
          WHEN OTHERS THEN
            NULL;
        END;

    When you repost, you should post the EXACT code you are using. The code you posted is NOT valid. You should also REMOVE the WHEN OTHERS handler, or you can expect everyone to accuse you of INTENTIONALLY HIDING any exceptions that occur. Why would you NOT want to know if Oracle finds problems in your code? That makes NO sense at all.

    IF ((errorvarchar = ' ERROR') AND (tryGetStatus = 1)) THEN

    There is a SPACE after the first quote in ' ERROR', which is just garbage as far as Oracle is concerned.

    Either you have posted incorrect code, or your code has syntax errors and will not compile or run anyway.

    I suggest that you correct your syntax errors before reposting in the SQL forum. If you have trouble finding them, delete ALL unnecessary code to reduce the problem to the SIMPLEST possible example, until you find the code causing the errors.
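The point about WHEN OTHERS, sketched in Python for anyone unconvinced (illustrative, not PL/SQL): a catch-all handler that does nothing makes a failing call look like a success.

```python
# A catch-all handler that swallows everything is the Python analogue of
# WHEN OTHERS THEN NULL: the real error disappears and the caller is
# none the wiser.

def risky():
    raise ValueError("the real problem")

def swallowed():
    try:
        risky()
    except Exception:
        pass  # error hidden here, exactly like WHEN OTHERS THEN NULL
    return "completed"

print(swallowed())  # prints "completed" even though risky() failed
```

This is why the anonymous block above reports only 'anonymous block completed' instead of the actual import error.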

  • Importing specified tables in conjunction with a schema remap

    I have a problem that I have been facing for a while now.  I export some tables using DBMS_DATAPUMP from one schema (say, a test environment) and I would like to import just a SINGLE table from this dump file into another schema (say, a dev environment).  At the same time, I'm remapping the source table to a temporary table with the same structure.

    Let me start by saying that I used this script to run the export and import in the SAME schema and it worked fine.  The problem appeared only when I went to import the data into another schema using METADATA_REMAP.  Here's the import code.

    BEGIN
          SELECT TO_CHAR (SYSDATE, 'YYYYMMDDHH24MISS') INTO L_JOB_NUM FROM DUAL;
          SELECT TO_CHAR (SYSDATE, 'YYYYMMDD') INTO L_SHORT_DT FROM DUAL;
          V_JOB_NUM :=
             DBMS_DATAPUMP.OPEN (OPERATION   => 'IMPORT',
                                 JOB_MODE    => 'TABLE',
                                 JOB_NAME    => 'BMF_CASE_IMP_' || L_JOB_NUM,
                                 VERSION     => 'COMPATIBLE');
                                 
          DBMS_DATAPUMP.SET_PARALLEL (HANDLE => V_JOB_NUM, DEGREE => 1);
          DBMS_DATAPUMP.ADD_FILE (
             HANDLE      => V_JOB_NUM,
             FILENAME    => 'BMF_CASE_IMP_BATCH_' || L_SHORT_DT || '.LOG',
             DIRECTORY   => G_DUMP_DIRECTORY,
             FILETYPE    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE); 
          
                                         
          DBMS_DATAPUMP.METADATA_FILTER (HANDLE   => V_JOB_NUM,
                                         NAME     => 'NAME_EXPR',
                                         VALUE    => q'|in ('BATCH')|',
                                         OBJECT_PATH => 'TABLE');
                                         
          DBMS_DATAPUMP.METADATA_REMAP (HANDLE      => V_JOB_NUM,
                                        NAME        => 'REMAP_TABLE',
                                        OLD_VALUE   => 'BATCH',
                                        VALUE       => 'BATCH_TMP');
                                        
                                         
          d('Remapping from schema '|| G_FROM_SCHEMA || ' to ' || G_TO_SCHEMA );
          DBMS_DATAPUMP.METADATA_REMAP (HANDLE      => V_JOB_NUM,
                                        NAME        => 'REMAP_SCHEMA',
                                        OLD_VALUE   => G_FROM_SCHEMA,
                                        VALUE       => G_TO_SCHEMA);
          DBMS_DATAPUMP.ADD_FILE (
             HANDLE      => V_JOB_NUM,
             FILENAME    => 'BMF_CASE_EXP_' || i_case_control_id || '.DMP',
             DIRECTORY   => G_DUMP_DIRECTORY,
             FILETYPE    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);          
          DBMS_DATAPUMP.SET_PARAMETER (HANDLE   => V_JOB_NUM,
                                       NAME     => 'INCLUDE_METADATA',
                                       VALUE    => 0);
          DBMS_DATAPUMP.START_JOB (HANDLE         => V_JOB_NUM,
                                   SKIP_CURRENT   => 0,
                                   ABORT_STEP     => 0);
    

    If I remove the BATCH table metadata filter and run this, it finishes and I get the following output in the log file:

    ...

    . . imported "CMR2_DEV"."NTC_ACTION":"SYS_P1932"    13.84 KB    0 rows

    . . imported "CMR2_DEV"."BATCH_TMP":"SYS_P343"      16.70 KB    1 rows

    (...and so on for all the tables in the dump file)

    However, as soon as I activate the NAME_EXPR or NAME_LIST filter, nothing is imported.  Just the following errors:

    ORA-31627: API call succeeded but more information is available

    ORA-31655: no data or metadata objects selected for job

    It worked when I was not moving between schemas, so is there another way I need to write my table filter expression so that it will identify the BATCH table when a schema remap is used?

    Thanks in advance.

    Adam

    I think my earlier advice was not correct. The NAME_LIST filter only takes a table name.  If you do not have a schema filter, then the table owner defaults to the schema running the job.  I think you need to add a schema filter specifying the table owner.

    If you can't figure it out, I can try to see if I can find the right calls, but it may take me a while.

    Dean
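Dean's suggestion amounts to pairing the table-name filter with a schema filter naming the owner. A small Python sketch (illustrative only; the helper and the schema/table names are made up, not part of the DBMS_DATAPUMP API) of the IN-list strings such filters expect:

```python
# Illustrative helper that builds the IN-list expression strings Data Pump
# metadata filters expect, e.g. IN ('BATCH'). The names passed in below
# are placeholders for the real owner and table.

def in_list(*names):
    return "IN (" + ", ".join("'" + n + "'" for n in names) + ")"

print(in_list("CMR2"))   # value for a SCHEMA_EXPR filter naming the owner
print(in_list("BATCH"))  # value for the NAME_EXPR filter on the TABLE path
```

With both filters in place, the table lookup no longer defaults to the schema running the job, which is what produced the ORA-31655 above.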

  • How to copy all the tables, triggers, etc. from one user's schema to another

    Hello everyone!

    I'm looking for a QUERY or a stored procedure to copy the tables of one user's schema to a different schema.

    It should look something like: copy (select * from object where owner = 'UserIwantToCopyFrom') user = 'UserIwantToCopyTO'.

    I'm sure my example is rubbish, but I tried to explain what I want to do.

    So is there a way to do this in SQL code? I have to build a model of one user's schema with hundreds of tables, triggers, etc. and copy it into several other user schemas.

    Thanks for your advice!

    Jan

    There are many examples available.
    What you generally want to do is:

    For the export, use the job_mode => 'SCHEMA' option.
    Export example:

    http://www.oracle-base.com/articles/10g/OracleDataPump10g.php
    
    DECLARE
      l_dp_handle       NUMBER;
      l_last_job_state  VARCHAR2(30) := 'UNDEFINED';
      l_job_state       VARCHAR2(30) := 'UNDEFINED';
      l_sts             KU$_STATUS;
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(
        operation   => 'EXPORT',
        job_mode    => 'SCHEMA',
        remote_link => NULL,
        job_name    => 'EMP_EXPORT',
        version     => 'LATEST');
    
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'SCOTT.dmp',
        directory => 'TEST_DIR');
    
      DBMS_DATAPUMP.add_file(
        handle    => l_dp_handle,
        filename  => 'SCOTT.log',
        directory => 'TEST_DIR',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    
      DBMS_DATAPUMP.metadata_filter(
        handle => l_dp_handle,
        name   => 'SCHEMA_EXPR',
        value  => '= ''SCOTT''');
    
      DBMS_DATAPUMP.start_job(l_dp_handle);
    
      DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    /
    

    For the import, you can use the REMAP_SCHEMA option with:

    DBMS_DATAPUMP.METADATA_REMAP (
       handle      IN NUMBER,
       name        IN VARCHAR2,
       old_value   IN VARCHAR2,
       value       IN VARCHAR2,
       object_type IN VARCHAR2 DEFAULT NULL);
    

    There are many more details in the documentation Thierry provided.

  • How can I specify a different schema when importing using DBMS_DATAPUMP?

    I use the following procedure to import a schema. Is there a way I can tell the import to load into another schema?

    Basically, what I want to do is take one schema and copy it into another schema.

    DECLARE
      v_dp_job_handle NUMBER;         -- The Data Pump job handle
      v_count NUMBER;                 -- Loop index
      v_percent_done NUMBER;          -- Percentage of job complete
      v_job_state VARCHAR2(30);       -- To keep track of job status
      v_message KU$_LOGENTRY;         -- For WIP and error messages
      v_job_status KU$_JOBSTATUS;     -- The job status from get_status
      v_status KU$_STATUS;            -- The status object returned by get_status
      v_logfile NUMBER;
      v_project VARCHAR2(25);
      T_DATE VARCHAR2(13);
    BEGIN
      v_project := 'TEST';
      T_DATE := '03272009_1048';

      /* IMPORT */
      /* OPEN THE DATAPUMP JOB */
      v_dp_job_handle := DBMS_DATAPUMP.OPEN(
        OPERATION   => 'IMPORT',
        JOB_MODE    => 'SCHEMA',
        -- REMOTE_LINK => v_destination_server_name,
        JOB_NAME    => v_project || '_IMP_' || T_DATE,
        VERSION     => 'LATEST');

      /* ADD THE DUMP FILE NAME TO THE DATAPUMP JOB */
      DBMS_DATAPUMP.ADD_FILE(
        HANDLE    => v_dp_job_handle,
        FILENAME  => v_project || '_EXP_' || T_DATE || '.DMP',
        DIRECTORY => 'DATAPUMP');

      /* ADD THE IMPORT LOG FILE NAME TO THE DATAPUMP JOB */
      DBMS_DATAPUMP.ADD_FILE(
        HANDLE    => v_dp_job_handle,
        FILENAME  => v_project || '_IMP_' || T_DATE || '.LOG',
        DIRECTORY => 'DATAPUMP',
        FILETYPE  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

      /* START THE IMPORT */
      DBMS_DATAPUMP.START_JOB(v_dp_job_handle);

      /* DETACH FROM THE JOB */
      DBMS_DATAPUMP.DETACH(v_dp_job_handle);
    END;

    Use the METADATA_REMAP procedure with the REMAP_SCHEMA option:

    DBMS_DATAPUMP.METADATA_REMAP(v_dp_job_handle, 'REMAP_SCHEMA', 'SOURCE_SCHEMA', 'DESTINATION_SCHEMA');
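
    Putting it together, a hedged sketch of an import with the remap in place (the directory, file, job and schema names are illustrative, not from the original post):

    ```sql
    DECLARE
      l_handle NUMBER;
    BEGIN
      l_handle := DBMS_DATAPUMP.OPEN(
        operation => 'IMPORT',
        job_mode  => 'SCHEMA',
        job_name  => 'IMP_REMAP_JOB');

      DBMS_DATAPUMP.ADD_FILE(
        handle    => l_handle,
        filename  => 'TEST_EXP.DMP',
        directory => 'DATAPUMP');

      -- The remap must be registered before the job starts.
      DBMS_DATAPUMP.METADATA_REMAP(
        handle    => l_handle,
        name      => 'REMAP_SCHEMA',
        old_value => 'SOURCE_SCHEMA',
        value     => 'DESTINATION_SCHEMA');

      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.DETACH(l_handle);
    END;
    /
    ```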

  • dbms_datapump.data_remap issue in Oracle SE edition

    Hello

    Is there any difference in the dbms_datapump.data_remap function between the SE and EE editions of Oracle?

    My code compiled in the EE environment and the data_remap function works fine, remapping every table I need in my database,
    but in the SE edition it gives the Oracle error

    ORA-31623: a job is not attached to this session via the specified handle.

    I am using Oracle Database 11g Release 11.2.0.3.0 - 64bit Production for the SE database,

    and Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production for the EE database.

    I just want to clarify whether dbms_datapump has any issue in SE or not (i.e. whether its functionality is available in SE)?

    Thank you.

    Hello
    I don't think DATA_REMAP behaves differently between the two editions. Are you sure the user is configured correctly? The most common cause of the error you are getting is that the user running the Data Pump job doesn't have the explicit "create table" privilege.

    Try running:

    grant create table to xx;

    where xx is the user, and then try again.

    If this doesn't help, try adding an exception block similar to the one I posted here: http://dbaharrison.blogspot.de/2013/05/dbmsdatapump-finding-out-what-actual.html

    Cheers,
    Harry

  • Exclusion DBMS_DATAPUMP

    Hello

    How do I exclude indexes, synonyms, grants and statistics when importing using DBMS_DATAPUMP?

    Could someone please give me the syntax to exclude these things.

    Thank you
    JP

    Hi JP,
    The problem is the single quotes - try this example (just update your schema name etc. as appropriate):

    DECLARE
      l_dp_handle      NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state      VARCHAR2(30) := 'UNDEFINED';
      l_sts            KU$_STATUS;
      v_job_state      varchar2(4000);
    BEGIN
      l_dp_handle := DBMS_DATAPUMP.open(operation   => 'EXPORT',
                                        job_mode    => 'SCHEMA',
                                        remote_link => NULL,
                                        version     => 'LATEST');
      DBMS_DATAPUMP.add_file(handle    => l_dp_handle,
                             filename  => 'test.dmp',
                             directory => 'DATA_PUMP_DIR',
                             reusefile => 1);
      DBMS_DATAPUMP.add_file(handle    => l_dp_handle,
                             filename  => 'test.log',
                             directory => 'DATA_PUMP_DIR',
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE,
                             reusefile => 1);
      DBMS_DATAPUMP.METADATA_FILTER(l_dp_handle, 'SCHEMA_LIST', '''ALIGNE''');
      DBMS_DATAPUMP.METADATA_FILTER(l_dp_handle,
                                    'EXCLUDE_PATH_EXPR',
                                    'IN (''INDEX'', ''SYNONYMS'',''GRANTS'',''STATISTICS'')');
      DBMS_DATAPUMP.start_job(l_dp_handle);
      DBMS_DATAPUMP.WAIT_FOR_JOB(l_dp_handle, v_job_state);
      DBMS_OUTPUT.PUT_LINE(v_job_state);
    END;
    

    Cheers,
    Harry

  • dbms_datapump - how to ensure the PL/SQL block only ends when the job is done

    Hello

    I am calling dbms_datapump via PL/SQL. Looking at the output directory where the log and export files are created, the job seems to take some time, but the PL/SQL block reports complete much earlier.

    Using 11.2.0.3.

    I want the PL/SQL to report complete only when the job is truly complete.

    It seems to be running in the background.
    declare
      -- Local variables here
      i integer;
      h1 number;                 -- Datapump handle
      dir_name varchar2(30);     -- Directory name
      v_file_name varchar2(100);
      v_log_name  varchar2(100);
      v_job_status ku$_Status;   -- The status object returned by get_status
      v_job_state VARCHAR2(4000);
      v_status ku$_Status1010;
      v_logs ku$_LogEntry1010;
      v_row PLS_INTEGER;
      v_current_sequence_number archive_audit.aa_etl_run_num_seq%type;
      v_jobState user_datapump_jobs.state%TYPE;
    begin
      --execute immediate ('alter tablespace ARCHIVED_PARTITIONS read only');

      -- Get last etl_run_num_seq by querying public synonym ARCHIVE_ETL_RUN_NUM_SEQ
      -- Need to check there is no caching on etl_run_num_seq
      select last_number - 1
      into v_current_sequence_number
      from ALL_SEQUENCES A
      WHERE A.SEQUENCE_NAME = 'ETL_RUN_NUM_SEQ';

      v_file_name := 'archiveexppre.'||v_current_sequence_number;
      v_log_name  := 'archiveexpprelog.'||v_current_sequence_number;

      dbms_output.put_line(v_file_name);
      dbms_output.put_line(v_log_name);

      -- Create a (user-named) Data Pump job to do a schema export.
      dir_name := 'DATA_EXPORTS_DIR';
      h1 := dbms_datapump.open(operation   => 'EXPORT',
                               job_mode    => 'TRANSPORTABLE',
                               remote_link => NULL,
                               job_name    => 'ARCHIVEEXP10'); --||v_current_sequence_number);

      dbms_datapump.add_file(handle    => h1,
                             filename  => v_file_name,
                             directory => dir_name,
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE,
                             reusefile => 1); -- value of 1 instructs to overwrite existing file

      dbms_datapump.add_file(handle    => h1,
                             filename  => v_log_name,
                             directory => dir_name,
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE,
                             reusefile => 1); -- value of 1 instructs to overwrite existing file

      dbms_datapump.metadata_filter(handle => h1,
                                    name   => 'TABLESPACE_EXPR',
                                    value  => 'IN(''ARCHIVED_PARTITIONS'')');

      --dbms_datapump.metadata_filter(handle => h1,
      --                              name   => 'TABLE_FILTER',
      --                              value  => 'BATCH_AUDIT');

      -- Start the datapump job
      -- dbms_datapump.set_parameter(h1, 'TRANSPORTABLE', 'ALWAYS');
      dbms_datapump.start_job(h1);

      -- dbms_datapump.detach(handle => h1);

      dbms_datapump.wait_for_job(h1, v_jobState);

      dbms_output.put_line('Job has completed');

    exception
      when others then
        dbms_datapump.get_status(handle    => h1,
                                 mask      => dbms_datapump.KU$_STATUS_WIP,
                                 timeout   => 0,
                                 job_state => v_job_state,
                                 status    => v_job_status);
        dbms_output.put_line(v_job_state);
        RAISE_APPLICATION_ERROR(-20010, DBMS_UTILITY.format_error_backtrace);
    end;
    This is the behaviour I get.

    How can I make sure that the PL/SQL block ends only when the dbms_datapump job does, i.e. that it runs in the foreground?


    I tried adding dbms_datapump.wait_for_job(h1, v_jobState);

    but got the message "job is not attached to this session" when adding this.

    I deleted dbms_datapump.detach and now it works OK - it seems that dbms_datapump.detach and wait_for_job are mutually exclusive?

    Thank you

    Published by: user5716448 on 28-Sep-2012 06:37


    user5716448 wrote:

    I deleted dbms_datapump.detach and now it works OK - it seems that dbms_datapump.detach and wait_for_job are mutually exclusive?

    If you want your block to WAIT until the datapump finishes, why would you detach? Detach means you are no longer interested once the job has begun. Remove the detach and keep the wait_for_job you've discovered.
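
    A minimal blocking pattern along the lines of this advice (the directory and file names are illustrative):

    ```sql
    DECLARE
      l_handle    NUMBER;
      l_job_state VARCHAR2(4000);
    BEGIN
      l_handle := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'SCHEMA');

      DBMS_DATAPUMP.ADD_FILE(handle    => l_handle,
                             filename  => 'blocking.dmp',
                             directory => 'DATA_PUMP_DIR');

      DBMS_DATAPUMP.START_JOB(l_handle);

      -- WAIT_FOR_JOB blocks until the job finishes; it needs the session to
      -- stay attached, so do NOT call DETACH before it.
      DBMS_DATAPUMP.WAIT_FOR_JOB(l_handle, l_job_state);
      DBMS_OUTPUT.PUT_LINE('Job finished with state: ' || l_job_state);
    END;
    /
    ```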

  • DBMS_DATAPUMP - import a file of DB Link dump

    I have two Unix machines, each containing two databases.

    Below is how they are organized.
    Server1
    ----------
    
    1. DB1 - User1 - DbLink1 -> This points to DB2
    2. DB2 - User1
    
    DB Version 10g.
    
    Server2
    ----------
    
    1. DB1 - User1 - DBlink1 -> This points to DB2
    2. DB2 - User1
    
    DB Version 10g.
    Now the question is:

    I have a program using DBMS_DATAPUMP in Server1.DB1.User1 that exports 10 tables from DB1.User1
    and 2 tables from DB2.User1 (via DbLink1).

    Now I've moved these dump files from Server1 to Server2.

    I am writing a program using DBMS_DATAPUMP which will import the dump files into Server2.DB1.User1
    and Server2.DB2.User1. I have no problem importing the tables into Server2.DB1.User1, but how do I
    import into Server2.DB2.User1 from Server2.DB1.User1 using DbLink1? Any idea?

    Dean Gagne says:
    You cannot import a dumpfile over a network link.

    I think knani just wants to launch the import through the db link.
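
    For that case, a network-mode import pulls the data directly over a database link, so no dump file is involved at all. A hedged sketch (the link name, schema name and job name are placeholders, not from the original post):

    ```sql
    DECLARE
      l_handle NUMBER;
    BEGIN
      -- With REMOTE_LINK set, the import reads straight from the source
      -- database over the link instead of from a dump file, so no dump
      -- file is added to the job.
      l_handle := DBMS_DATAPUMP.OPEN(
        operation   => 'IMPORT',
        job_mode    => 'SCHEMA',
        remote_link => 'DBLINK1',
        job_name    => 'NET_IMP_JOB');

      DBMS_DATAPUMP.METADATA_FILTER(
        handle => l_handle,
        name   => 'SCHEMA_EXPR',
        value  => '= ''USER1''');

      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.DETACH(l_handle);
    END;
    /
    ```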

  • Looking for help with dbms_datapump.data_remap

    I posted this on another forum, but I think it might be a better fit here.

    Hi, I'm running this line but get ORA-39001: invalid argument value:
    dbms_datapump.data_remap (dph, 'EMP_NAME', 'EMP', 'ENAME', 'DATA_MASKER_UTIL.mask_varchar2', NULL);

    I'm on Oracle 11.1.0.6.0 and using the SCOTT schema.

    Thanks for any help.

    Chris wrote:
    dbms_datapump.data_remap (dph, 'EMP_NAME', 'EMP', 'ENAME', 'DATA_MASKER_UTIL.mask_varchar2', NULL);

    Are you sure the values passed are correct? As indicated in the documentation [url http://download.oracle.com/docs/cd/B28359_01/appdev.111/b28419/d_datpmp.htm#CHDHJFCB], the name parameter value must be 'COLUMN_NAME'.

    Also have a read of this thread [url http://forums.oracle.com/forums/thread.jspa?messageID=9372117] and the SYS.DBMS_DATAPUMP package specification.

  • DBMS_DATAPUMP

    dbms_datapump.add_file(handle    => l_dp_handle,
                           filename  => 'CCPDEV' || TO_CHAR(SYSDATE, 'YYYYMMDD HH24MISS') || '.DMP',
                           directory => 'INSA');

    Using this procedure I can create the dump file; here, I also want a log file.
    How can I do this?

    Hello

    It's all in the documentation:
    http://download.Oracle.com/docs/CD/E11882_01/AppDev.112/e16760/d_datpmp.htm#i997139

    You must add a second file to your job and specify that it is the log file with the FILETYPE parameter.

    dbms_datapump.add_file(handle => l_dp_handle,
    filename =>'CCPDEV'||TO_CHAR(SYSDATE,'YYYYMMDD HH24MISS')||'.log',
    directory => 'INSA',
    filetype=>DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    

    Specifying the file type is not mandatory for the dump file, because it is the default value.

    Hope this will help.

    Kind regards
    Sylvie

  • expdp include with dbms_datapump API

    Hello

    I'm trying to translate the following for use with the dbms_datapump API, but I'm having difficulty writing the correct syntax for the INCLUDE=SCHEMA:"='myschema'" section:

    expdp dumpfile=mydirectory:mydumpfile.dp logfile=mydirectory:mylogfile.log include=SCHEMA:"='myschema'" full=y

    Could someone help me with this?

    Thank you

    JP

    Sorry - I forgot the other filter that you need:

    dbms_datapump.metadata_filter(
      handle      => 16,
      name        => 'INCLUDE_PATH_LIST',
      value       => '''SCHEMA''',
      object_path => null,
      object_type => null);

    Dean
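
    A hedged sketch of how that missing filter might look in context (the handle variable is illustrative, not from the original post):

    ```sql
    -- Restrict the export's object paths to schema definitions only
    -- (the API-side counterpart of INCLUDE=SCHEMA on the expdp command line).
    -- l_handle is assumed to come from a prior DBMS_DATAPUMP.OPEN('EXPORT', 'FULL', ...).
    DBMS_DATAPUMP.METADATA_FILTER(
      handle => l_handle,
      name   => 'INCLUDE_PATH_LIST',
      value  => '''SCHEMA''');
    ```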

  • DBMS_DATAPUMP.OPEN

    Hello

    I am a newbie in PL/SQL & DATAPUMP!

    I tried to generate (OPEN) IMPORT-WORK with:
    CREATE OR REPLACE PROCEDURE XX IS
    BEGIN
      BEGIN
        DECLARE
          HANDLE1 NUMBER;
        BEGIN
          HANDLE1 := DBMS_DATAPUMP.OPEN(OPERATION   => 'IMPORT',
                                        JOB_MODE    => 'SCHEMA',
                                        REMOTE_LINK => 'STRM');
          DBMS_OUTPUT.PUT_LINE('HANDLE1 :' || HANDLE1);
        EXCEPTION
          WHEN OTHERS THEN
            DBMS_OUTPUT.PUT_LINE('EX_HANDLE1 :' || HANDLE1);
            DBMS_OUTPUT.PUT_LINE('Import Error 1 :' || SQLERRM(SQLCODE));
        END;
      
      END;
    END XX;
    and always have the following error after running:
    SQL> execute xx;
     
    EX_HANDLE1 :
    Import Error 1 :ORA-31626: Job ist nicht vorhanden
     
    PL/SQL procedure successfully completed
     
    SQL> 
    And the handle variable is empty! (ORA-31626: "Job ist nicht vorhanden" is German for "job does not exist".)

    What did I do wrong?

    Help, please!

    hqt200475

    Published by: hqt200475 on March 29, 2011 03:16

    Try with:

     l_dp_handle := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                                        job_mode    => 'SCHEMA',
                                        remote_link => NULL,
                                        job_name    => 'IMPORT_JOB',
                                        version     => 'COMPATIBLE');
    

    Are you sure about the remote connection? See the documentation for this parameter before using it.
