Does completed_rows work in DBMS_DATAPUMP?

Hello:

I am currently implementing a PL/SQL script to load a dump file (expdp) using DBMS_DATAPUMP, and while it loads the file I can't get the number of rows populated from the completed_rows element of the status variable. I noticed that the completed_rows references in the following code sections are commented out. Is that because this feature does not work? If not, how can I get the number of loaded rows?

I'm on 10gR2, but it also does not work in 12c, for what it's worth.

Thank you.

The worker status types provide information about the rows:

http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_datpmp.htm#BABJJAFC
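For what it's worth, a minimal sketch of reading completed_rows from the worker status list, assuming the attribute names (wkr_status_list, completed_rows) from the linked Worker Status Types page; the job name is hypothetical, and completed_rows only appears to be populated while a worker is actively moving table data:

```sql
DECLARE
  h1          NUMBER;
  v_job_state VARCHAR2(30);
  v_status    ku$_Status;
  v_workers   ku$_WorkerStatusList;
BEGIN
  -- 'MY_IMPORT_JOB' is a hypothetical name of an already-running job
  h1 := DBMS_DATAPUMP.ATTACH(job_name => 'MY_IMPORT_JOB');

  DBMS_DATAPUMP.GET_STATUS(h1,
                           DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS,
                           -1,
                           v_job_state,
                           v_status);

  v_workers := v_status.job_status.wkr_status_list;
  IF v_workers IS NOT NULL THEN
    FOR i IN 1 .. v_workers.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Worker ' || v_workers(i).worker_number ||
                           ' completed_rows = ' || v_workers(i).completed_rows);
    END LOOP;
  END IF;

  DBMS_DATAPUMP.DETACH(h1);
END;
/
```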

Tags: Database

Similar Questions

  • Using DBMS_DATAPUMP with the LONG data type

    I have a procedure below that calls DBMS_DATAPUMP using a REMOTE_LINK to move a schema from one database to another. However, some tables in this schema have columns of the LONG data type, and when I run it I get an error saying that you cannot move data of the LONG data type over a REMOTE LINK. So no data in those specific tables gets moved over.

    Does anyone else have this problem? If so, do you have a workaround? I tried adding a CLOB column to my table and setting the new CLOB equal to the LONG, but I couldn't get that to work either, even when I tried to use TO_LOB. If I could get past that, I could just drop the LONG column, move the schema, and then re-create the LONG column on the other side.

    Here is my procedure...



    DECLARE
    /* IMPORT/EXPORT VARIABLES */
    v_dp_job_handle NUMBER;          -- The data pump job handle
    v_count NUMBER;                  -- Loop index
    v_percent_done NUMBER;           -- Percentage of job complete
    v_job_state VARCHAR2(30);        -- To keep track of job state
    v_message ku$_LogEntry;          -- For WIP and error messages
    v_job_status ku$_JobStatus;      -- The job status from get_status
    v_status ku$_Status;             -- The status object returned by get_status
    v_logfile NUMBER;
    v_project VARCHAR2(30);          -- Project/schema name
    T_DATE VARCHAR2(13);
    v_source_server_name VARCHAR2(50);
    v_destination_server_name VARCHAR2(50);

    BEGIN
    v_project := 'TEST';
    T_DATE := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI');
    v_source_server_name := 'TEST_DB';

    v_dp_job_handle := DBMS_DATAPUMP.OPEN(
    OPERATION => 'IMPORT',
    JOB_MODE => 'SCHEMA',
    REMOTE_LINK => v_source_server_name,
    JOB_NAME => v_project || '_EXP_' || T_DATE,
    VERSION => 'LATEST');

    v_logfile := DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE;

    DBMS_DATAPUMP.ADD_FILE(
    HANDLE => v_dp_job_handle,
    FILENAME => v_project || '_EXP_' || T_DATE || '.LOG',
    DIRECTORY => 'DATAPUMP',
    FILETYPE => v_logfile);

    DBMS_DATAPUMP.METADATA_FILTER(
    HANDLE => v_dp_job_handle,
    NAME => 'SCHEMA_EXPR',
    VALUE => '= ''' || v_project || '''');

    DBMS_DATAPUMP.START_JOB(v_dp_job_handle);

    v_percent_done := 0;
    v_job_state := 'UNDEFINED';

    WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
    LOOP
    DBMS_DATAPUMP.GET_STATUS(
    v_dp_job_handle,
    DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
    -1,
    v_job_state,
    v_status);

    v_job_status := v_status.JOB_STATUS;

    IF v_job_status.PERCENT_DONE != v_percent_done THEN
    DBMS_OUTPUT.PUT_LINE('* Job percent done = ' || TO_CHAR(v_job_status.PERCENT_DONE));
    v_percent_done := v_job_status.PERCENT_DONE;
    END IF;

    IF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
    v_message := v_status.WIP;
    ELSIF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
    v_message := v_status.ERROR;
    ELSE
    v_message := NULL;
    END IF;

    IF v_message IS NOT NULL THEN
    v_count := v_message.FIRST;
    WHILE v_count IS NOT NULL
    LOOP
    DBMS_OUTPUT.PUT_LINE(v_message(v_count).LOGTEXT);
    v_count := v_message.NEXT(v_count);
    END LOOP;
    END IF;
    END LOOP;

    DBMS_OUTPUT.PUT_LINE('Job has completed');
    DBMS_OUTPUT.PUT_LINE('Final job state = ' || v_job_state);

    DBMS_DATAPUMP.DETACH(v_dp_job_handle);
    END;

    TO_LOB can be used in INSERT, CREATE TABLE ... AS SELECT, and UPDATE statements to convert LONG values.

    So: you simply cannot use it in a plain SELECT..., but you can use CREATE TABLE BLAH AS SELECT TO_LOB(LONG_COLUMN) FROM DREADED_TABLE_WITH_LONG_COL;
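    A minimal sketch of that workaround (all table and column names are hypothetical):

    ```sql
    -- TO_LOB is only valid in INSERT ... SELECT, UPDATE, and
    -- CREATE TABLE ... AS SELECT, not in a plain SELECT.
    CREATE TABLE my_table_staged AS
      SELECT id, TO_LOB(long_col) AS long_col_clob
      FROM my_table_with_long;
    -- Move my_table_staged over the REMOTE_LINK with DBMS_DATAPUMP,
    -- then rebuild the original table shape on the destination side.
    ```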

  • dbms_datapump - how to ensure PL/SQL only ends when the job is done

    Hello

    Calling dbms_datapump via PL/SQL - watching the output directory where the log and export files are created, the job seems to take a certain time, but the PL/SQL block returns saying it is all done much earlier.


    Using 11.2.0.3

    Want the PL/SQL to report complete only when the job is truly complete.

    Seems to be running in the background.
    declare 
      -- Local variables here
      i integer;
    
      
    h1 number; -- Datapump handle 
      dir_name varchar2(30); -- Directory Name 
    
    v_file_name varchar2(100);
      v_log_name  varchar2(100);  
      
    v_job_status ku$_Status;          -- The status object returned by get_status
        v_job_state VARCHAR2(4000);
        v_status ku$_Status1010;
        v_logs ku$_LogEntry1010;
        v_row PLS_INTEGER;
        v_current_sequence_number archive_audit.aa_etl_run_num_seq%type;
       v_jobState                user_datapump_jobs.state%TYPE; 
    
    begin
    
    
    --execute immediate ('alter tablespace ARCHIVED_PARTITIONS read only');
    
    -- Get last etl_run_num_seq by querying public synonym ARCHIVE_ETL_RUN_NUM_SEQ
    -- Need check no caching on etl_run_num_seq
    
    select last_number - 1
    into v_current_sequence_number
    from ALL_SEQUENCES A
    WHERE A.SEQUENCE_NAME = 'ETL_RUN_NUM_SEQ';
    
    v_file_name := 'archiveexppre.'||v_current_sequence_number;
    v_log_name  := 'archiveexpprelog.'||v_current_sequence_number;
    
    dbms_output.put_line(v_file_name);
    dbms_output.put_line(v_log_name);
    
    -- Create a (user-named) Data Pump job to do a schema export.
    
      dir_name := 'DATA_EXPORTS_DIR'; 
      h1 := dbms_datapump.open(operation =>'EXPORT', 
      job_mode =>'TRANSPORTABLE', 
      remote_link => NULL, 
      job_name    => 'ARCHIVEEXP10');--||v_current_sequence_number); 
    
      dbms_datapump.add_file(handle =>h1, 
                             filename => v_file_name, 
                             directory => dir_name, 
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE, 
                             reusefile => 1); -- value of 1 instructs to overwrite existing file 
    
      dbms_datapump.add_file(handle =>h1, 
                             filename => v_log_name, 
                             directory => dir_name, 
                             filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE, 
                             reusefile => 1); -- value of 1 instructs to overwrite existing file 
    
      dbms_datapump.metadata_filter(      
          handle => h1,
          name   => 'TABLESPACE_EXPR',
         VALUE    => 'IN(''ARCHIVED_PARTITIONS'')'
          );
    -- 
      
     --dbms_datapump.metadata_filter(handle =>h1, 
      --                       name => 'TABLE_FILTER', 
      --                       value => 'BATCH_AUDIT'); 
    
        
        
     
    -- Start the datapump_job
    
    -- 
     
        
    -- dbms_datapump.set_parameter(h1, 'TRANSPORTABLE', 'ALWAYS'); 
      dbms_datapump.start_job(h1); 
      begin 
        null;
        -- dbms_datapump.detach(handle => h1); 
      end;
    
      
    dbms_datapump.wait_for_job(h1,v_jobState);
    
    dbms_output.put_line('Job has completed');
     
     exception
        when others then
          dbms_datapump.get_status(handle => h1, 
                                 mask => dbms_datapump.KU$_STATUS_WIP, 
                                 timeout=> 0, 
                                job_state => v_job_state, 
                                status => v_job_status); 
        
                   dbms_output.put_line(v_job_state);
      
         RAISE_APPLICATION_ERROR(-20010,DBMS_UTILITY.format_error_backtrace);
     
     
     
    
     
     
      
    end;
    How can I make sure that the PL/SQL ends only when the dbms_datapump job does, i.e. that it runs in the foreground?


    Tried adding dbms_datapump.wait_for_job(h1, v_jobState);

    but I get the message "job is not attached to this session" when adding this.

    Deleted dbms_datapump.detach and now it works OK - it seems that dbms_datapump.detach and wait_for_job are mutually exclusive?

    Thank you

    Published by: user5716448 on 28-Sep-2012 06:37


    user5716448 wrote:

    Deleted dbms_datapump.detach and now it works OK - it seems that dbms_datapump.detach and wait_for_job are mutually exclusive?

    If you want your block to WAIT until the datapump job finishes, why detach? Detach means you are no longer interested once the work has begun. Remove the detach and keep the wait_for_job you've discovered.
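    That pattern, as a minimal sketch (h1 is assumed to be a handle returned by dbms_datapump.open, and v_job_state a VARCHAR2 variable):

    ```sql
    dbms_datapump.start_job(h1);
    -- Do NOT detach here: detaching releases this session's attachment, and a
    -- later wait_for_job fails with "job is not attached to this session".
    dbms_datapump.wait_for_job(h1, v_job_state);  -- blocks until COMPLETED or STOPPED
    dbms_output.put_line('Final state: ' || v_job_state);
    ```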

  • DBMS_DATAPUMP.metadata_remap - a question about REMAP_TABLE

    I want to export the SCOTT.DEPT table and import it as HR.DEPT_HR with its constraint. I'm using DBMS_METADATA.METADATA_REMAP with REMAP_SCHEMA (to change the schema name) and REMAP_TABLE (to change the table name). I don't know where I am going wrong. It seems that REMAP_SCHEMA changes all the schema names successfully, but REMAP_TABLE does not change the table name in the constraint, so the constraint does not get created. Is there another workaround? Here is a small proof of concept:

    SQL> --My database version.
    SQL> ----------------------
    SQL> SELECT * FROM v$version;
    
    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    CORE    11.2.0.1.0      Production
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    
    SQL> SET SERVEROUT ON
    SQL> ed
    Wrote file afiedt.buf
    
      1  DECLARE
      2  l_data_pump_handle    NUMBER;
      3  l_logfilename    VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.log';
      4  l_expfilename    VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.dmp';
      5  BEGIN
      6  l_data_pump_handle:= DBMS_DATAPUMP.OPEN(operation   => 'EXPORT',
      7                        job_mode    => 'TABLE',
      8                        remote_link => NULL,
      9                        job_name    => 'TEST_REMAP_DP',
    10                        version     => 'LATEST');
    11   DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_handle,
    12                     filename    => l_expfilename,
    13                     directory => 'SAUBHIK',
    14                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    15  DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_handle,
    16                     filename    => l_logfilename,
    17                     directory => 'SAUBHIK',
    18                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    19   DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    20                        name   => 'SCHEMA_EXPR',
    21                        value  =>'= '||''''||'SCOTT'||'''');
    22   DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    23                        name   => 'NAME_EXPR',
    24                        value  =>'= '||''''||'DEPT'||'''');
    25  --We don't need index
    26    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    27                        name   => 'EXCLUDE_PATH_EXPR',
    28                        value  =>'=''INDEX''');
    29  -- We don't copy table statistics!!
    30    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    31                        name   => 'EXCLUDE_PATH_EXPR',
    32                        value  =>'=''STATISTICS''');
    33   -- We don't copy index statistics either!!
    34    DBMS_DATAPUMP.metadata_filter(handle => l_data_pump_handle,
    35                        name   => 'EXCLUDE_PATH_EXPR',
    36                        value  =>'=''INDEX_STATISTICS''');
    37    -- We do not need the data!!
    38    DBMS_DATAPUMP.DATA_FILTER(
    39     handle => l_data_pump_handle,
    40     name => 'INCLUDE_ROWS',
    41     value =>0
    42     );
    43  -- Start the export now.
    44       DBMS_DATAPUMP.start_job(l_data_pump_handle);
    45       dbms_output.put_line('Export started....');
    46   -- Detach, it's finish!
    47      DBMS_DATAPUMP.detach(l_data_pump_handle);
    48      dbms_output.put_line('Export ended....');
    49  EXCEPTION
    50       WHEN OTHERS THEN
    51        dbms_datapump.stop_job(l_data_pump_handle);
    52        RAISE;
    53*  END;
    54  /
    Export started....
    Export ended....
    
    PL/SQL procedure successfully completed.
    
    SQL> SELECT * FROM user_datapump_jobs;
    
    no rows selected
    
    
    
    

    Now, I'm importing that:

    SQL> ed
    Wrote file afiedt.buf
    
      1  --DWABI_28052015143133.dmp
      2  DECLARE
      3  l_data_pump_imp_handle NUMBER;
      4  l_logfilename  VARCHAR2(30) := 'DWABI_'||to_char(sysdate, 'DDMMRRRRhh24miss') || '.log';
      5  ind       NUMBER;        -- loop index
      6   pct_done  NUMBER;        -- percentage complete
      7   job_state VARCHAR2(30);  -- track job state
      8   le        ku$_LogEntry;  -- WIP and error messages
      9   js        ku$_JobStatus; -- job status from get_status
    10   jd        ku$_JobDesc;   -- job description from get_status
    11   sts       ku$_Status;    -- status object returned by get_status
    12  BEGIN
    13  l_data_pump_imp_handle:= DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
    14                        job_mode    => 'FULL',
    15                        remote_link => NULL,
    16                        job_name    => 'TEST',
    17                        version     => 'LATEST');
    18   DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_imp_handle,
    19                     filename    => 'DWABI_28052015143133.dmp',
    20                     directory => 'SAUBHIK',
    21                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
    22  DBMS_DATAPUMP.ADD_FILE(handle    => l_data_pump_imp_handle,
    23                     filename    => l_logfilename,
    24                     directory => 'SAUBHIK',
    25                     filetype    => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    26   --If table is already there then do not import.
    27   dbms_datapump.set_parameter(handle => l_data_pump_imp_handle,
    28                              name => 'TABLE_EXISTS_ACTION',
    29                              value =>'SKIP');
    30    -- We need to remap the schema!!.
    31    dbms_output.put_line('Changing Schema...');
    32    DBMS_DATAPUMP.METADATA_REMAP (
    33     handle => l_data_pump_imp_handle,
    34     name => 'REMAP_SCHEMA',
    35     old_value => 'SCOTT',
    36     value=>'HR'
    37     );
    38    -- We need to remap the table!!. This is not working properly.
    39    dbms_output.put_line('Changing Table...');
    40    DBMS_DATAPUMP.METADATA_REMAP (
    41     handle => l_data_pump_imp_handle,
    42     name => 'REMAP_TABLE',
    43     old_value => 'DEPT',
    44     value=>'DEPT_HR',
    45     object_type => NULL
    46     );
    47   -- Start the import now.
    48       DBMS_DATAPUMP.start_job(l_data_pump_imp_handle);
    49    -- monitor job
    50    pct_done := 0;
    51    job_state := 'UNDEFINED';
    52    WHILE (job_state != 'COMPLETED') AND (job_state != 'STOPPED') LOOP
    53      dbms_datapump.get_status(l_data_pump_imp_handle, dbms_datapump.ku$_status_job_error +
    54      dbms_datapump.ku$_status_job_status +
    55      dbms_datapump.ku$_status_wip, -1, job_state, sts);
    56      js := sts.job_status;
    57      -- If the percentage done changed, display the new value
    58      IF js.percent_done != pct_done THEN
    59        dbms_output.put_line('*** Job percent done = ' ||
    60        to_char(js.percent_done));
    61        pct_done := js.percent_done;
    62      END IF;
    63      -- If any work-in-progress (WIP) or error messages
    64      -- were received for the job, display them.
    65      IF (BITAND(sts.mask,dbms_datapump.ku$_status_wip) != 0) THEN
    66        le := sts.wip;
    67      ELSE
    68        IF (BITAND(sts.mask,dbms_datapump.ku$_status_job_error) != 0) THEN
    69          le := sts.error;
    70        ELSE
    71          le := NULL;
    72        END IF;
    73      END IF;
    74      IF le IS NOT NULL THEN
    75        ind := le.FIRST;
    76        WHILE ind IS NOT NULL LOOP
    77          dbms_output.put_line(le(ind).LogText);
    78          ind := le.NEXT(ind);
    79        END LOOP;
    80      END IF;
    81      --DBMS_LOCK.sleep (10);
    82    END LOOP;
    83    -- Indicate that the job finished and detach from it.
    84    dbms_output.put_line('Job has completed');
    85   -- Detach, it's finish!
    86       DBMS_DATAPUMP.detach(l_data_pump_imp_handle);
    87  EXCEPTION
    88       WHEN OTHERS THEN
    89        dbms_datapump.stop_job(l_data_pump_imp_handle);
    90        RAISE;
    91* END;
    SQL> /
    Changing Schema...
    Changing Table...
    Master table "SYS"."TEST" successfully loaded/unloaded
    Starting "SYS"."TEST":
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    ORA-39083: Object type CONSTRAINT failed to create with error:
    ORA-00942: table
    or view does not exist
    Failing sql is:
    ALTER TABLE "HR"."DEPT" ADD CONSTRAINT
    "PK_DEPT" PRIMARY KEY ("DEPTNO") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255
    STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
    PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE
    DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "USERS"  ENABLE
    
    Job "SYS"."TEST" completed with 1 error(s) at 15:04:02
    Job has completed
    
    PL/SQL procedure successfully completed.
    
    SQL>
    
    
    
    

    If you look at the failing SQL, it is clear that the table name in the constraint definition has not changed, yet the DEPT_HR table is created in the HR schema without the constraint. What is going wrong here?

    Post edited by: Massimiliano - edited the subject line, which said DBMS_METADATA; changed to DBMS_DATAPUMP.

    Hello

    This is a bug in 11.2.0.1 I think - please see this reference:

    Oracle Support Document 1609238.1 (REMAP_TABLE on IMPDP FAILS WITH ORA-942) can be found at: https://support.oracle.com/epmos/faces/DocumentDisplay?id=1609238.1

    Cheers,

    Rich

  • dbms_datapump.data_remap issue in Oracle SE edition

    Hello

    Is there any difference in the dbms_datapump.data_remap function between the SE and EE editions of Oracle?

    I have my code compiled in an Oracle EE environment, and the data_remap function works fine, remapping all the table data I need in my database.
    But in the SE edition it gives the Oracle error

    ORA-31623: a job is not attached to this session by the specified handle.

    I use Oracle Database 11g Release 11.2.0.3.0 - 64bit Production for the SE database

    and Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production for the EE database.

    I just want to clarify whether dbms_datapump has any issue in SE or not (i.e. whether its functionality is available in SE)?

    Thank you.

    Hello
    I don't think DATA_REMAP behaves differently between the two editions. Are you sure the user is configured correctly? The most common cause of the error you are getting is that the user running the datapump doesn't have an explicit "create table" privilege.

    Try running:

    grant create table to xx;

    where xx is the user, and then try again.

    If this doesn't help, try adding an exception block similar to the one I posted here: http://dbaharrison.blogspot.de/2013/05/dbmsdatapump-finding-out-what-actual.html
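    A minimal sketch of such an exception block, along the lines of the monitoring code earlier in this thread (h1, v_job_state, and v_job_status of type ku$_Status are assumed to be declared as in that code):

    ```sql
    exception
      when others then
        -- Ask the job itself what went wrong before re-raising.
        dbms_datapump.get_status(handle    => h1,
                                 mask      => dbms_datapump.ku$_status_job_error,
                                 timeout   => 0,
                                 job_state => v_job_state,
                                 status    => v_job_status);
        if v_job_status.error is not null then
          for i in v_job_status.error.first .. v_job_status.error.last loop
            dbms_output.put_line(v_job_status.error(i).logtext);
          end loop;
        end if;
        raise;
    ```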

    Cheers,
    Harry

  • dbms_datapump import dblink ORA-39165

    Hi all

    I am trying to import a table over a dblink from schema A to schema B, but I get ORA-39165: schema ('A') was not found, and ORA-39166: object ('ACCOUNT') was not found. When importing B to B it works, but oddly it still logs an ORA-39166. User B has imp_full_database permission on both databases:

    declare
    JobHandle number;
    js varchar2(9); -- COMPLETED or STOPPED
    q varchar2(1) := chr(39);


    BEGIN /* open a new import job using a default DB link */
    JobHandle := dbms_datapump.open (operation => 'IMPORT',
    job_mode => 'TABLE',
    remote_link => 'MRT');

    dbms_datapump.add_file (JobHandle,
    filename => 'mylog',
    directory => 'DATA_PUMP_DIR',
    filetype => dbms_datapump.ku$_file_type_log_file);

    dbms_datapump.metadata_remap (handle => JobHandle,
    name => 'REMAP_SCHEMA',
    old_value => 'A',
    value => 'B');

    dbms_datapump.metadata_filter (handle => JobHandle,
    name => 'SCHEMA_EXPR',
    value => 'IN (''A'')',
    object_type => 'TABLE');

    dbms_datapump.metadata_filter (handle => JobHandle,
    name => 'NAME_LIST',
    value => '(''ACCOUNT'')',
    object_type => 'TABLE');

    dbms_datapump.set_parameter (JobHandle,
    'TABLE_EXISTS_ACTION',
    'REPLACE');

    dbms_datapump.start_job (JobHandle);

    dbms_datapump.wait_for_job (JobHandle, js);

    end;

    For the life of me I can't figure out how to tell it that the source table is in schema A. Any help would be greatly appreciated.

    Thank you very much

    OK. Then run the job in job_mode SCHEMA, and filter the schemas and tables using metadata filters with SCHEMA_EXPR and INCLUDE_PATH_EXPR.

    Using the code you provided, here it is changed:

    declare
      JobHandle number;
      js varchar2(9); -- COMPLETED or STOPPED
      q varchar2(1) := chr(39); 
    
    BEGIN /* open a new schema level import job using a default DB link */
      JobHandle := dbms_datapump.open (operation=>'IMPORT', job_mode=>'SCHEMA', remote_link=>'RMT'); 
    
      dbms_datapump.add_file (JobHandle, filename => 'mylog', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
      ---
      DBMS_DATAPUMP.metadata_filter (handle=> JobHandle, name=> 'SCHEMA_EXPR',VALUE=> 'IN(''A'')');
      dbms_datapump.metadata_filter (handle => JobHandle,name => 'INCLUDE_PATH_EXPR',value => 'IN (''TABLE'')');
      --
      dbms_datapump.metadata_remap ( handle=>JobHandle,name=> 'REMAP_SCHEMA',old_value=> 'A',value=> 'B');
      dbms_datapump.metadata_filter (handle =>JobHandle, name =>'NAME_LIST', value =>'(''ACCOUNT'')',object_type => 'TABLE');
      dbms_datapump.set_parameter ( JobHandle,'TABLE_EXISTS_ACTION','REPLACE' );
    
      dbms_datapump.start_job( JobHandle);
      dbms_datapump.wait_for_job( JobHandle, js);
    end;
    /
    

    I tested it with 10.2.0.3 on both sites, and it worked:

    -- mylog.log content:
    
    Starting "B"."SYS_IMPORT_SCHEMA_03":
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "B"."ACCOUNT"                                    1 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    ORA-39083: Object type OBJECT_GRANT failed to create with error:
    ORA-01749: you may not GRANT/REVOKE privileges to/from yourself
    Failing sql is:
    GRANT SELECT ON "B"."ACCOUNT" TO "B"
    ORA-39166: Object ('ACCOUNT') was not found.
    Job "B"."SYS_IMPORT_SCHEMA_03" completed with 2 error(s) at 17:35:40
    

    And the table was created to the destination site:

    B@local> select * from account;
             N
    ----------
             1
    

    If B does not have a direct grant on A.ACCOUNT on the remote site, then you will see this output, but the table is imported anyway:

    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    . . imported "B"."ACCOUNT"                                    1 rows
    ORA-39166: Object ('ACCOUNT') was not found.
    

    Kind regards.
    Nelson

  • DBMS_DATAPUMP.OPEN

    Hello

    I am a newbie in PL/SQL & DATAPUMP!

    I tried to create (OPEN) an IMPORT job with:
    CREATE OR REPLACE PROCEDURE XX IS
    BEGIN
      BEGIN
        DECLARE
          HANDLE1 NUMBER;
        BEGIN
          HANDLE1 := DBMS_DATAPUMP.OPEN(OPERATION   => 'IMPORT',
                                        JOB_MODE    => 'SCHEMA',
                                        REMOTE_LINK => 'STRM');
          DBMS_OUTPUT.PUT_LINE('HANDLE1 :' || HANDLE1);
        EXCEPTION
          WHEN OTHERS THEN
            DBMS_OUTPUT.PUT_LINE('EX_HANDLE1 :' || HANDLE1);
            DBMS_OUTPUT.PUT_LINE('Import Error 1 :' || SQLERRM(SQLCODE));
        END;
      
      END;
    END XX;
    and always have the following error after running:
    SQL> execute xx;
     
    EX_HANDLE1 :
    Import Error 1 :ORA-31626: Job ist nicht vorhanden
     
    PL/SQL procedure successfully completed
     
    SQL> 
    And the handle variable is empty!

    What I did wrong?

    Help, please!

    hqt200475

    Published by: hqt200475 on March 29, 2011 03:16

    Try with:

     l_dp_handle := DBMS_DATAPUMP.OPEN(operation   => 'IMPORT',
                                        job_mode    => 'SCHEMA',
                                        remote_link => NULL,
                                        job_name    => 'IMPORT_JOB',
                                        version     => 'COMPATIBLE');
    

    Are you sure about the remote link? See the documentation for this parameter and use it accordingly.

  • DBMS_DATAPUMP parameter problem

    Hello

    I'm trying to use an 11gR1 PL/SQL procedure which does a quick export of a few tables using dbms_datapump, overwriting the same output file each time it runs - script below. The problem is that it will not overwrite the existing file. I found that there is a REUSE_DUMPFILES parameter which works with the expdp command line, but I can't get it recognized in my script!

    I tried the following line using both upper- and lower-case 'Y', but it fails to compile. What am I doing wrong?

    dbms_datapump.set_parameter (handle => h1, name => 'REUSE_DUMPFILES', value => 'Y');


    Main script

    CREATE OR REPLACE PROCEDURE DCH_ERPM.instant_dump
    as
    h1 NUMBER;

    begin

    h1 := dbms_datapump.open (operation => 'EXPORT', job_mode => 'TABLE', job_name => NULL, version => 'COMPATIBLE');
    dbms_datapump.set_parallel (handle => h1, degree => 1);
    dbms_datapump.set_parameter (handle => h1, name => 'KEEP_MASTER', value => 0);
    dbms_datapump.metadata_filter (handle => h1, name => 'SCHEMA_EXPR', value => 'IN (''DCH_ERPM'')');
    dbms_datapump.metadata_filter (handle => h1, name => 'NAME_EXPR', value => 'IN (''PF00_INDICATORS'', ''PF01_ANNUALCRIT'', ''PF02_VALUES'')');
    dbms_datapump.add_file (handle => h1, filename => 'ERPM_quick' || to_char (sysdate, 'YYMMDDHH24MISS') || '.DMP', directory => 'DATA_PUMP_DIR', filetype => 1);
    dbms_datapump.set_parameter (handle => h1, name => 'INCLUDE_METADATA', value => 1);
    dbms_datapump.set_parameter (handle => h1, name => 'DATA_ACCESS_METHOD', value => 'AUTOMATIC');
    dbms_datapump.set_parameter (handle => h1, name => 'ESTIMATE', value => 'BLOCKS');
    dbms_datapump.start_job (handle => h1, skip_current => 0, abort_step => 0);
    dbms_datapump.detach (handle => h1);

    end;

    Hi Steve,

    You may need to call an external shell program from the procedure to remove the existing files. Maybe the following will help.

    http://asktom.Oracle.com/pls/asktom/f?p=100:11:0:P11_QUESTION_ID:16212348050

    Regards
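    Alternatively, a sketch worth testing: the reusefile parameter of dbms_datapump.add_file (used elsewhere in this thread) asks Data Pump to overwrite an existing dump file; REUSE_DUMPFILES itself is an expdp client parameter, not a set_parameter name, which would explain why it is not recognized:

    ```sql
    dbms_datapump.add_file (handle    => h1,
                            filename  => 'ERPM_quick.DMP',
                            directory => 'DATA_PUMP_DIR',
                            filetype  => dbms_datapump.ku$_file_type_dump_file,
                            reusefile => 1);  -- 1 = overwrite an existing file
    ```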

  • DBMS_DATAPUMP procedure to function

    Hello

    I created a procedure that works quite well as follows:
    CREATE OR REPLACE PROCEDURE proc_expdp(p_schema_name VARCHAR2, p_table_name VARCHAR2)
    IS
        v_handle NUMBER;
        v_jobname VARCHAR2 (100);
        v_dirname VARCHAR2 (100);
        v_filename VARCHAR2 (100);
    BEGIN
        v_filename := to_char(sysdate,'YYYYMMDDHH24MISS')||'.DMP';
        v_jobname := v_filename||'_EXPDP';
        v_dirname := 'DMPDIR';
        v_handle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE', job_name => v_jobname);
        dbms_datapump.add_file(handle => v_handle, filename => v_filename, directory => v_dirname, filetype => 1);
        dbms_datapump.add_file(handle => v_handle, filename => v_filename||'_EXPDP'||'.LOG', directory => v_dirname, filetype => 3);
        dbms_datapump.metadata_filter (handle => v_handle, name => 'SCHEMA_EXPR', value => 'IN ('||p_schema_name||')');
        dbms_datapump.metadata_filter (handle => v_handle, name => 'NAME_EXPR', value => 'IN ('||p_table_name||')');
    --    dbms_datapump.data_filter (handle => v_handle, name => 'PARTITION_LIST', value => '''ODS_SLS_ITEM_DETAIL_20090101'', ''ODS_SLS_ITEM_DETAIL_20090102'', more here ''ODS_SLS_ITEM_DETAIL_20090227'', ''ODS_SLS_ITEM_DETAIL_20090228''', table_name => 'ODS_SLS_ITEM_DETAIL', schema_name => 'ODSPROD');
        dbms_datapump.start_job(v_handle);
        dbms_datapump.detach(v_handle);
    END;
    ... and I use the following PL/SQL code to execute it, which also works very well.
    BEGIN
      proc_expdp('''HR''','''DEPARTMENTS'',''EMPLOYEES''');
    END;
    I needed this as a function, so I changed it to the following:
    CREATE OR REPLACE FUNCTION system.func_expdp(p_schema_name VARCHAR2, p_table_name VARCHAR2) RETURN VARCHAR2
    IS
        v_handle NUMBER;
        v_jobname VARCHAR2 (100);
        v_dirname VARCHAR2 (100);
        v_filename VARCHAR2 (100);
    BEGIN
        v_filename := to_char(sysdate,'YYYYMMDDHH24MISS')||'.DMP';
        v_jobname := v_filename||'_EXPDP';
        v_dirname := 'DMPDIR';
        v_handle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'TABLE', job_name => v_jobname);
        dbms_datapump.add_file(handle => v_handle, filename => v_filename, directory => v_dirname, filetype => 1);
        dbms_datapump.add_file(handle => v_handle, filename => v_filename||'_EXPDP'||'.LOG', directory => v_dirname, filetype => 3);
        dbms_datapump.metadata_filter (handle => v_handle, name => 'SCHEMA_EXPR', value => 'IN ('||p_schema_name||')');
        dbms_datapump.metadata_filter (handle => v_handle, name => 'NAME_EXPR', value => 'IN ('||p_table_name||')');
    --    dbms_datapump.data_filter (handle => v_handle, name => 'PARTITION_LIST', value => '''ODS_SLS_ITEM_DETAIL_20090101'', ''ODS_SLS_ITEM_DETAIL_20090102'', more here ''ODS_SLS_ITEM_DETAIL_20090227'', ''ODS_SLS_ITEM_DETAIL_20090228''', table_name => 'ODS_SLS_ITEM_DETAIL', schema_name => 'ODSPROD');
        dbms_datapump.start_job(v_handle);
        dbms_datapump.detach(v_handle);
        return v_filename;
    END func_expdp;
    ... and then I would be able to run it with a select statement. Well, that part is the problem. When I try to run it as a select statement
    select system.func_expdp('''HR''','''DEPARTMENTS'',''EMPLOYEES''') from dual;
    I get the following error:
    An error was encountered performing the requested operation:
    ORA-31626:job does not exist
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 911
    ORA-06512: at "SYSTEM.FUNC_EXPDP", line 4356
    ORA-06512: at "SYSTEM.FUNC_EXPDP", line 11
    31626. 00000 - "job does not exist"
    *Cause: An invalid reference to a job which is no longer executing, is not executing on the instance where the operation was attempted, or does not have a valid Master Table. Refer to any following error messages for clarification.
    *Action: Start a new job, or attach to an existing job that has a valid Master Table.
    What should I do?

    Thanks for your time

    PS: I'm using 10g Express Edition. Lately I have created and deleted several datapump jobs, but I think that's irrelevant since my procedure works fine.

    You cannot use the function inside a SELECT statement because dbms_datapump.open does an implicit commit. However, you can use the function within a PL/SQL block:

    declare
       l_filename varchar2(100);
    begin
       l_filename := system.func_expdp('''HR''','''DEPARTMENTS'',''EMPLOYEES''');
    end;
    /
    

    Also, please avoid creating objects in the SYS and SYSTEM schemas.

  • ORA-31626 running dbms_datapump

    I wrote some PL/SQL to do nightly exports of my database, but I have been unable to get it working. I'm getting an ORA-31626. A search of Metalink has not yielded much except a few possible references to aq_tm_processes and streams_pool_size.

    Has anyone else seen this anywhere? My streams_pool_size is 0, but I use automatic SGA sizing, so from what I understand the streams pool should not need to be set explicitly and Oracle will allocate space from the shared pool as needed. Code pasted below:

    Thank you

    <pre>
    DECLARE
      l_dp_handle      NUMBER;
      l_last_job_state VARCHAR2(30) := 'UNDEFINED';
      l_job_state      VARCHAR2(30) := 'UNDEFINED';
      l_sts            ku$_Status;
      v_file_handle    UTL_FILE.FILE_TYPE;
      v_exists         BOOLEAN;
      v_dbname         VARCHAR2(12);
      v_expfile        VARCHAR2(32);
      v_explog         VARCHAR2(32);
      v_expfile1       VARCHAR2(32);
      v_explog1        VARCHAR2(32);
      v_expfile2       VARCHAR2(32);
      v_explog2        VARCHAR2(32);
      v_length         NUMBER;
      v_blocksize      NUMBER;
      v_jobstate       VARCHAR2(30);
      v_jobstatus      ku$_JobStatus;
      v_status         ku$_Status;
      v_scn            NUMBER;
    BEGIN
      SELECT name
        INTO v_dbname
        FROM v$database;

      v_expfile  := v_dbname || '_exp.dmp';
      v_explog   := v_dbname || '_exp.log';
      v_expfile1 := v_dbname || '1_exp.dmp';
      v_explog1  := v_dbname || '1_exp.log';
      v_expfile2 := v_dbname || '2_exp.dmp';
      v_explog2  := v_dbname || '2_exp.log';

      DBMS_OUTPUT.ENABLE(1000000);
      -- Check if a file exists and if so rename it
      UTL_FILE.FGETATTR('DUMP_FILES', v_expfile1, v_exists, v_length, v_blocksize);
      IF v_exists
      THEN
        dbms_output.put_line('Rename ' || v_expfile1 || ' to ' || v_expfile2);
        UTL_FILE.FRENAME(src_location => 'DUMP_FILES', src_filename => v_expfile1, dest_location => 'DUMP_FILES', dest_filename => v_expfile2, overwrite => TRUE);
        dbms_output.put_line('Rename ' || v_explog1 || ' to ' || v_explog2);
        UTL_FILE.FRENAME(src_location => 'DUMP_FILES', src_filename => v_explog1, dest_location => 'DUMP_FILES', dest_filename => v_explog2, overwrite => TRUE);
        dbms_output.put_line('Renamed files.');
      END IF;
      -- Rename the current file
      UTL_FILE.FGETATTR('DUMP_FILES', v_expfile, v_exists, v_length, v_blocksize);
      IF v_exists
      THEN
        dbms_output.put_line('Rename ' || v_expfile || ' to ' || v_expfile1);
        UTL_FILE.FRENAME(src_location => 'DUMP_FILES', src_filename => v_expfile, dest_location => 'DUMP_FILES', dest_filename => v_expfile1, overwrite => TRUE);
        dbms_output.put_line('Rename ' || v_explog || ' to ' || v_explog1);
        UTL_FILE.FRENAME(src_location => 'DUMP_FILES', src_filename => v_explog, dest_location => 'DUMP_FILES', dest_filename => v_explog1, overwrite => TRUE);
        dbms_output.put_line('Renamed files.');
      END IF;

      -- Begin full export
      dbms_output.put_line('Starting export');
      -- Get the current database SCN

      SELECT current_scn
        INTO v_scn
        FROM v$database;

      l_dp_handle := dbms_datapump.open(operation => 'EXPORT', job_mode => 'FULL', remote_link => NULL, job_name => 'NIGHTLY_EXPORT', version => 'LATEST');
      dbms_datapump.add_file(handle => l_dp_handle, filename => v_expfile, directory => 'DUMP_FILES', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      dbms_datapump.add_file(handle => l_dp_handle, filename => v_explog, directory => 'DUMP_FILES', filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
      dbms_datapump.set_parameter(handle => l_dp_handle, name => 'FLASHBACK_SCN', value => v_scn); -- same as running a consistent traditional export
      dbms_datapump.start_job(l_dp_handle);
      -- Loop to determine whether the job is complete
      v_jobstate := 'UNDEFINED';
      WHILE (v_jobstate != 'COMPLETED') AND (v_jobstate != 'STOPPED')
      LOOP
        DBMS_DATAPUMP.GET_STATUS(
          handle    => l_dp_handle,
          mask      => 15, -- DBMS_DATAPUMP.ku$_status_job_error + DBMS_DATAPUMP.ku$_status_job_status + DBMS_DATAPUMP.ku$_status_wip
          timeout   => NULL,
          job_state => v_jobstate,
          status    => v_status
        );
        v_jobstatus := v_status.job_status;
      END LOOP;

      dbms_output.put_line('Export completed with status: ' || v_jobstate);
      dbms_datapump.detach(l_dp_handle);
    END;
    </pre>

    I'm not saying my pool sizes are right, but this is what I have:

    shared_pool_size = 400 m
    STREAMS_POOL_SIZE = 13 m

    Since this fails on the open call, I wonder if you already have a job open with the same name. While I was trying to get your script to work, I removed job_name => 'NIGHTLY_EXPORT' and set

    job_name => null,

    This just creates a job name that is not already in use.
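
    For reference, here is a minimal sketch of letting Data Pump generate its own job name (directory and file names are illustrative; this needs a live instance and an existing DUMP_FILES directory object to run):

    ```sql
    DECLARE
      l_handle NUMBER;
    BEGIN
      -- job_name => NULL makes Data Pump generate a unique system name
      -- (e.g. SYS_EXPORT_FULL_01), so it cannot collide with the master
      -- table left behind by an earlier run of the same named job.
      l_handle := DBMS_DATAPUMP.OPEN(
        operation => 'EXPORT',
        job_mode  => 'FULL',
        job_name  => NULL);
      DBMS_DATAPUMP.ADD_FILE(
        handle    => l_handle,
        filename  => 'nightly_exp.dmp',   -- illustrative file name
        directory => 'DUMP_FILES',
        filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.DETACH(l_handle);
    END;
    /
    ```

    If a previous named job is the culprit, it leaves a master table with the same name as the job in the job owner's schema; dropping that table frees the name again.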

    Dean

  • Running a Datapump API job on a specific instance of a RAC

    I'm running a DataPump job using the DBMS_DATAPUMP API in a RAC environment. DBMS_DATAPUMP.START_JOB(handle) occasionally fails with ORA-31626: job does not exist

    Metalink Note Doc ID: 758228.1
    says "It is not possible to start or restart Data Pump jobs on one instance of a RAC if there are Data Pump jobs currently running on other instances of the RAC."
    It recommends running these jobs on a specific instance using the DBMS_JOB.INSTANCE procedure.

    I am unable to do this. I tried calling DBMS_JOB.INSTANCE after I did a DBMS_DATAPUMP.OPEN to get a handle to the job, and before I called DBMS_DATAPUMP.START_JOB. The DBMS_JOB.INSTANCE call returns an ORA-23421: job number XXX is not a job in the job queue.

    I wonder if DATAPUMP jobs use a different job mechanism and are therefore not accessible through the DBMS_JOB package. Would appreciate any ideas on the issue.

    Thank you
    EM

    Create a DATAPUMP service that points to one and only one of your nodes.

    Then, connect to the service, not to the database, instance or sid.
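
    One possible way to set that up (all names here are illustrative; srvctl syntax as in 10g/11g) is a service with a single preferred instance, plus a TNS alias that connects by SERVICE_NAME rather than SID:

    ```
    # create and start a service that runs only on instance ORCL1
    srvctl add service -d ORCL -s DATAPUMP -r ORCL1
    srvctl start service -d ORCL -s DATAPUMP

    # tnsnames.ora entry pointing at the service, not a SID
    DATAPUMP =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = node1-vip)(PORT = 1521))
        (CONNECT_DATA = (SERVICE_NAME = DATAPUMP))
      )
    ```

    Because the service is preferred on only one instance, every session that connects through it (and therefore every Data Pump job it starts) lands on that node.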

  • DBMS_DATAPUMP.METADATA_REMAP

    What am I missing here? I referenced [Christopher Poole|http://www.chrispoole.co.uk/tips/dbatip4.htm] while trying to use the METADATA_REMAP function of DATAPUMP... but I can't make it work. Here is my code... my proc succeeds, however it attempts to import into the original schema instead of remapping it. So, I get a ton of errors saying "this schema already exists...".

    What am I missing?




    DECLARE
    /* IMPORT/EXPORT VARIABLES */
    v_dp_job_handle NUMBER;              -- The data pump job handle
    v_count NUMBER;                      -- Loop index
    v_percent_done NUMBER;               -- Percentage of job complete
    v_job_state VARCHAR2(30);            -- To keep track of job state
    v_message ku$_LogEntry;              -- For WIP and error messages
    v_job_status ku$_JobStatus;          -- The job status from get_status
    v_status ku$_Status;                 -- The status object returned by get_status
    t_date VARCHAR2(13);
    v_schema VARCHAR2(25);
    v_new_schema VARCHAR2(25);
    v_source_database_name VARCHAR2(50);

    BEGIN
    v_schema := 'TEST';
    t_date := TO_CHAR(SYSDATE, 'MMDDYYYY_HHMI');
    v_source_database_name := 'TEST_DB';
    v_new_schema := 'TEST_NEW';

    /* OPEN THE DATA PUMP JOB */
    v_dp_job_handle := DBMS_DATAPUMP.OPEN(
    OPERATION => 'IMPORT',
    JOB_MODE => 'SCHEMA',
    REMOTE_LINK => v_source_database_name,
    JOB_NAME => v_schema || '_REMAP_' || t_date,
    VERSION => 'LATEST');

    /* ADD THE EXPORT LOG FILE NAME TO THE DATA PUMP JOB */
    DBMS_DATAPUMP.ADD_FILE(
    HANDLE => v_dp_job_handle,
    FILENAME => v_schema || '_REMAP_' || t_date || '.LOG',
    DIRECTORY => 'DATAPUMP',
    FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

    /* ADD THE SCHEMA NAME TO THE DATA PUMP JOB */
    DBMS_DATAPUMP.METADATA_FILTER(
    HANDLE => v_dp_job_handle,
    NAME => 'SCHEMA_EXPR',
    VALUE => '= "' || v_schema || '"');

    /* REMAP THE ORIGINAL SCHEMA TO THE NEW SCHEMA */
    DBMS_DATAPUMP.METADATA_REMAP(
    HANDLE => v_dp_job_handle,
    NAME => 'REMAP_SCHEMA',
    OLD_VALUE => '= "' || v_schema || '"',
    VALUE => '= "' || v_new_schema || '"');

    /* START THE JOB */
    DBMS_DATAPUMP.START_JOB(v_dp_job_handle);

    /* ERROR HANDLING */
    v_percent_done := 0;
    v_job_state := 'UNDEFINED';

    WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED')
    LOOP
    DBMS_DATAPUMP.GET_STATUS(
    v_dp_job_handle,
    DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR + DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS + DBMS_DATAPUMP.KU$_STATUS_WIP,
    -1,
    v_job_state,
    v_status);

    v_job_status := v_status.JOB_STATUS;

    /* IF THE PERCENTAGE CHANGED, DISPLAY THE NEW VALUE */
    IF v_job_status.PERCENT_DONE != v_percent_done THEN
    DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || TO_CHAR(v_job_status.PERCENT_DONE));
    v_percent_done := v_job_status.PERCENT_DONE;
    END IF;

    /* IF WORK-IN-PROGRESS (WIP) OR ERROR MESSAGES WERE RECEIVED FOR THE JOB, DISPLAY THEM */
    IF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_WIP) != 0 THEN
    v_message := v_status.WIP;
    ELSIF BITAND(v_status.MASK, DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR) != 0 THEN
    v_message := v_status.ERROR;
    ELSE
    v_message := NULL;
    END IF;

    IF v_message IS NOT NULL THEN
    v_count := v_message.FIRST;
    WHILE v_count IS NOT NULL
    LOOP
    DBMS_OUTPUT.PUT_LINE(v_message(v_count).LOGTEXT);
    v_count := v_message.NEXT(v_count);
    END LOOP;
    END IF;
    END LOOP;

    -- Indicate that the job finished and detach.
    DBMS_OUTPUT.PUT_LINE('Job has completed');
    DBMS_OUTPUT.PUT_LINE('Final job state = ' || v_job_state);

    /* END OF THE DATA PUMP JOB */
    DBMS_DATAPUMP.DETACH(v_dp_job_handle);
    END;

    A quick look... not tested... why not change it to...

    DBMS_DATAPUMP.METADATA_REMAP (
    HANDLE => v_dp_job_handle,
    NAME => 'REMAP_SCHEMA',
    OLD_VALUE => v_schema,
    VALUE => v_new_schema) ;
    
  • Question -> DBMS_DATAPUMP - EXPORT, TRANSPORTABLE

    Hi all

    OS: BSG 64
    Database: 10.2.0.4

    For two days I have been fighting with my problem.
    The problem is that I'm trying to do a transportable export using DBMS_DATAPUMP.

    When I use DBMS_DATAPUMP, in my log I see
    #################################################
    Starting "DBA_GRP"."EXP_JOB11":
    Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
    Master table "DBA_GRP"."EXP_JOB11" successfully loaded/unloaded
    #############################################

    When I use
    expdp dba_grp/pasw_grp@TST DUMPFILE=tablespacename200812_alex.dmp DIRECTORY=trans_dir TRANSPORT_TABLESPACES=tablespacename200812
    I see in the log file
    #########################################################
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
    Processing object type TRANSPORTABLE_EXPORT/TABLE
    Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
    ######################################################

    Because the DBMS_DATAPUMP run is missing
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK and
    Processing object type TRANSPORTABLE_EXPORT/TABLE,
    I'm not able to import the transportable tablespace back into the database.

    I have googled, asked Tom, Oracle and many other web sites and forums, but could not solve my problem.
    Could you please have a look at my code?
    I think the issue is in STEP 4 or STEP 4-1, but I'm not sure. Maybe I'm missing another setting or more.

    Thank you very much for your help in advance!


    BEGIN
    v_dp_num := DBMS_DATAPUMP.OPEN('EXPORT', 'TRANSPORTABLE', NULL, 'EXP_JOB11');
    dbms_output.put_line('STEP 1: job handle is ' || v_dp_num);
    DBMS_DATAPUMP.add_file(handle    => v_dp_num,
                           filename  => rec.tablespace_name || '.metadata.log',
                           directory => 'TAB_PART_ARCH',
                           filetype  => DBMS_DATAPUMP.ku$_file_type_log_file
                          );
    dbms_output.put_line('STEP 2 done');
    DBMS_DATAPUMP.add_file(handle    => v_dp_num,
                           filename  => rec.tablespace_name || '.metadata.dmp',
                           directory => 'TAB_PART_ARCH',
                           filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file
                          );
    dbms_output.put_line('STEP 3 done');
    DBMS_DATAPUMP.set_parameter(handle => v_dp_num,
                                name   => 'TTS_FULL_CHECK',
                                value  => 1
                               );
    dbms_output.put_line('STEP 4 done');
    DBMS_DATAPUMP.metadata_filter(handle => v_dp_num,
                                  name   => 'TABLESPACE_EXPR',
                                  value  => 'IN (''%tablespacename200812%'')'
                                 );
    dbms_output.put_line('STEP 4-1 done');
    DBMS_DATAPUMP.metadata_filter(handle => v_dp_num,
                                  name   => 'TABLESPACE_LIST',
                                  value  => '''tablespacename200812'''
                                 );
    dbms_output.put_line('STEP 5 done');
    DBMS_DATAPUMP.START_JOB(v_dp_num);
    EXCEPTION
    WHEN OTHERS
    THEN
    v_sqlerrm := SQLERRM;

    sp_log('METADATA FILE CREATION FAILURE for ' || rec.tablespace_name || ' Reason: ' || v_sqlerrm);

    DBMS_DATAPUMP.stop_job(v_dp_num);
    END;

    Hello

    This line

    Processing object type TRANSPORTABLE_EXPORT/TABLE

    is missing when datapump does not find any tables in the specified tablespace. You can check it using this query:

    select table_name from dba_tables where tablespace_name = 'YOUR_TABLESPACE_NAME';

    I'll guess that you get no rows back.

    Here is an updated .sql script that works fine. At the beginning, I drop and then create a tablespace and a table. If you already have a tablespace or table with these names and you run this script, it will drop those two objects.

    -- test_api.sql

    set echo on

    drop tablespace test including contents;
    create tablespace test datafile 'test.f' size 10m reuse;
    create table scott.test_table (a number) tablespace test;
    insert into scott.test_table values (1);
    commit;
    alter tablespace test read only;

    declare

    v_dp_num number;
    v_sqlerrm VARCHAR2(4000);

    BEGIN
    v_dp_num := DBMS_DATAPUMP.OPEN('EXPORT', 'TRANSPORTABLE', NULL, 'EXP_JOB25');
    DBMS_DATAPUMP.add_file(handle    => v_dp_num,
                           filename  => 'metadata.log',
                           directory => 'DPUMP_DIR',
                           filetype  => DBMS_DATAPUMP.ku$_file_type_log_file
                          );
    DBMS_DATAPUMP.add_file(handle    => v_dp_num,
                           filename  => 'metadata.dmp',
                           directory => 'DPUMP_DIR',
                           filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file
                          );
    DBMS_DATAPUMP.set_parameter(handle => v_dp_num,
                                name   => 'TTS_FULL_CHECK',
                                value  => 1
                               );
    DBMS_DATAPUMP.metadata_filter(handle => v_dp_num,
                                  name   => 'TABLESPACE_EXPR',
                                  value  => 'IN (''TEST'')'
                                 );

    DBMS_DATAPUMP.START_JOB(v_dp_num);
    EXCEPTION
    WHEN OTHERS
    THEN
    v_sqlerrm := SQLERRM;

    DBMS_DATAPUMP.stop_job(v_dp_num);
    END;
    /

    -- metadata.log output

    Starting "SYS"."EXP_JOB25":
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
    Processing object type TRANSPORTABLE_EXPORT/TABLE
    Processing object type TRANSPORTABLE_EXPORT/POST_INSTANCE/PLUGTS_BLK
    Master table "SYS"."EXP_JOB25" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for SYS.EXP_JOB25 is:
    /ADE/dgagne_l6/Oracle/work/metadata.dmp
    ******************************************************************************
    Datafiles required for transportable tablespace TEST:
    /ADE/dgagne_l6/Oracle/DBs/test.f
    Job "SYS"."EXP_JOB25" successfully completed at 16:35:10

    Please let me know if you managed to get your tables into the dumpfile, i.e. whether you now see:

    Processing object type TRANSPORTABLE_EXPORT/TABLE

    Thank you

    Dean

  • How can I specify a different schema when importing using DBMS_DATAPUMP?

    I use the following procedure to import a schema. Is there a way I can tell the import to import into a different schema?

    Basically, what I want to do is take a schema and copy it into another schema

    DECLARE
    v_dp_job_handle NUMBER;        -- The data pump job handle
    v_count NUMBER;                -- Loop index
    v_percent_done NUMBER;         -- Percentage of job complete
    v_job_state VARCHAR2(30);      -- To keep track of job state
    v_message ku$_LogEntry;        -- For WIP and error messages
    v_job_status ku$_JobStatus;    -- The job status from get_status
    v_status ku$_Status;           -- The status object returned by get_status
    v_logfile NUMBER;
    v_project VARCHAR2(25);
    t_date VARCHAR2(13);
    BEGIN
    v_project := 'TEST';
    t_date := '03272009_1048';

    /* IMPORT */
    /* OPEN THE DATAPUMP JOB */
    v_dp_job_handle := DBMS_DATAPUMP.OPEN(
    OPERATION => 'IMPORT',
    JOB_MODE => 'SCHEMA',
    -- REMOTE_LINK => v_desitination_server_name,
    JOB_NAME => v_project || '_IMP_' || t_date,
    VERSION => 'LATEST');

    /* ADD THE DUMP FILE NAME TO THE DATAPUMP JOB */
    DBMS_DATAPUMP.ADD_FILE(
    HANDLE => v_dp_job_handle,
    FILENAME => v_project || '_EXP_' || t_date || '.DMP',
    DIRECTORY => 'DATAPUMP');

    /* ADD THE IMPORT LOG FILE NAME TO THE DATAPUMP JOB */
    DBMS_DATAPUMP.ADD_FILE(
    HANDLE => v_dp_job_handle,
    FILENAME => v_project || '_IMP_' || t_date || '.LOG',
    DIRECTORY => 'DATAPUMP',
    FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

    /* START THE IMPORT */
    DBMS_DATAPUMP.START_JOB(v_dp_job_handle);

    /* END OF THE DATAPUMP JOB */
    DBMS_DATAPUMP.DETACH(v_dp_job_handle);
    END;

    Use the metadata_remap proc with the REMAP_SCHEMA option:

    DBMS_DATAPUMP.METADATA_REMAP(id, 'REMAP_SCHEMA', 'SOURCE_SCHEMA', 'DESTINATION_SCHEMA');
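
    Putting it together, a minimal sketch of a schema-remap network import over a database link (the link, directory and schema names here are illustrative; this needs a live instance to run):

    ```sql
    DECLARE
      l_handle NUMBER;
    BEGIN
      l_handle := DBMS_DATAPUMP.OPEN(
        operation   => 'IMPORT',
        job_mode    => 'SCHEMA',
        remote_link => 'SOURCE_DB_LINK');   -- illustrative db link name
      -- restrict the job to the source schema
      DBMS_DATAPUMP.METADATA_FILTER(
        handle => l_handle,
        name   => 'SCHEMA_EXPR',
        value  => '= ''SOURCE_SCHEMA''');
      -- the remap must be registered before START_JOB; note that
      -- old_value and value are plain schema names, not SQL expressions
      DBMS_DATAPUMP.METADATA_REMAP(
        handle    => l_handle,
        name      => 'REMAP_SCHEMA',
        old_value => 'SOURCE_SCHEMA',
        value     => 'DESTINATION_SCHEMA');
      DBMS_DATAPUMP.START_JOB(l_handle);
      DBMS_DATAPUMP.DETACH(l_handle);
    END;
    /
    ```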
