Import schema data using Oracle Data Pump

Hi all

I want to import a schema onto a single server using IMPDP, but the user does not exist on that server. Is it possible to import all of the objects, including the creation of the schema itself? With the old exp/imp this may have been possible, but with Data Pump I can't do the same: it gives an error that the user does not exist. For your reference, the import command is below.

impdp system directory=DIR_DUMP logfile=expdp_tsadmin_tcgadmin_11182013_2_2.log

dumpfile=expdp_tsadmin_tcgadmin_11182013_2.dmp schemas=TS_ADMIN version=10.1.0.2.0

Pls help and thanks in advance.

Thank you

Piku

Hello

All of the errors are due to the fact that the tablespace TS_DATA does not exist; everything else fails because of this.

create tablespace ts_data datafile 'xxxxxx' size xM;

then try again

Cheers,

Harry


Similar Questions

  • How to change the password of a schema using Oracle SQL Developer

    Hi, I need to change the password of a schema using Oracle SQL Developer. How can I do that?

    or maybe http://www.thatjeffsmith.com/archive/2012/11/resetting-your-oracle-user-password-with-sql-developer/
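    If you prefer plain SQL to the GUI, the equivalent is a one-line ALTER USER. A minimal sketch - scott and new_password are placeholders:

    alter user scott identified by new_password;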

  • Data Pump 11g Network Import

    I need to perform a Data Pump network-mode import from a 10.2.0.4 database on my old server (HP-UX 11.11) to my new 11.2.0.3 database on a new server (HP-UX 11.31). What I would REALLY like to do is import directly from my physical standby database (running in READ_ONLY mode while I do the import) rather than having to quiesce my production database for a couple of hours while I do the import from there.

    What I want to know is whether a network-mode Data Pump import running on the new 11.2.0.3 server creates a Data Pump extraction job in the old database as part of the direct network-link import. If so, I won't be able to use the physical standby as the source of my import, because Data Pump would not be able to create its master table in the old database. I can't find anything in the Oracle documentation about using a physical standby as a source. I know that I can't run a regular Data Pump export against that database, but I would really like to know if anyone has experience doing this.

    Any comments would be greatly appreciated.

    Bad news, Harry - it worked for me on a standard database open in READ ONLY mode. Not sure what is different between your environment and mine, but there must be something... The read-only database is a 10.2.0.4 database running on an HP PA-RISC box under HP-UX 11.11. The target database runs under 11.2.0.3, on an HP Itanium box under HP-UX 11.31. The user I connect to on the target database has IMP_FULL_DATABASE privs, and this is the user id used for the DB_LINK and also the same user id on the source database (which, of course, follows!). That user also has the required privileges there. My parfile looks like this:

    TABLES=AC_LIAB_%
    NETWORK_LINK=ARCH_LINK
    DIRECTORY=DATA_PUMP_DIR
    JOB_NAME=AMIBASE_IMPDP_ARCHDB
    LOGFILE=DATA_PUMP_DIR:base_impdp_archdb.log
    REMAP_TABLESPACE=ARCHIVE_BEFORE2003:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2003:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2004:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2005:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2006:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2007:H_DATA
    REMAP_TABLESPACE=ARCHIVE_2008:H_DATA
    REMAP_TABLESPACE=ARCHIVE_INDEXES:H_INDEXES
    REUSE_DATAFILES=NO
    SKIP_UNUSABLE_INDEXES=Y
    TABLE_EXISTS_ACTION=REPLACE
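    For context, a parameter file like the one above is passed to a single impdp call along these lines (the parfile name here is hypothetical):

    impdp username/password parfile=imp_archdb.par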

  • Data Pump: export/import tables in different schemas

    Hi all

    I use Oracle 11.2 and I would like to use Data Pump to export/import table data between different schemas. The tables already exist in both the source and target schemas. Here are the commands that I use:


    Working export script:

    expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1','TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log

    Script that imports all of the tables (works):

    impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate

    Script that imports only some tables (fails):

    impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only

    The export works, but I get the following error when I try to import the table TB_TEST1 only: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I import all of the exported tables without the INCLUDE clause.

    Is it possible to import only some of the tables from the export file, rather than all of them?

    Thanks for the help!

    942572 wrote:

    Importing all of the tables exported by scott works fine when I do NOT use the INCLUDE clause.

    I get the error only when I try to import some tables with the INCLUDE clause.

    Can I import only some tables from the export dump file? Thank you!

    You are using INCLUDE incorrectly!

    Run the following yourself:

    impdp help=yes

    INCLUDE

    Include specific object types.

    For example, INCLUDE=TABLE_DATA.
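    As an alternative to wrestling with the INCLUDE escaping, a table-mode import with TABLES= can also pull individual tables out of a schema-mode dump. A sketch using the names from the question above (untested):

    impdp scott/tiger@db12 tables=source.TB_TEST1 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only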

  • 10g to 11gR2 upgrade using Data Pump Import

    Hello

    I intend to move a 10g database from one Windows server to another. However, there is also a requirement to upgrade this database to 11gR2, so I was going to combine the two in one move.

    1. Take a full Data Pump export of the source 10g database
    2. Create a new empty 11g database in the target environment
    3. Import the dump file into the target database

    However, I have a couple of questions about this approach:

    Q1. What happens to the SYS, SYSTEM and SYSAUX objects during the import? Given that I have in fact already created a new data dictionary in the empty target database, will importing SYS or SYSTEM objects simply produce error messages that should be ignored?

    Q2. Should I use EXCLUDE on SYS and SYSTEM (and is it better to EXCLUDE on the export side or the import side)?

    Q3. What happens if there are things like scheduled jobs, etc. on the source system? Since these are stored in SYSTEM-owned tables, how would I bring them across to the target 11g database?

    Thank you
    Jim

    This approach is covered in the 11gR2 Upgrade Guide - http://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm

    Please ensure that you do not use SYSDBA privileges to run the expdp and impdp commands - see the first "Note" sections here:

    http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#sthref57
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504

    As mentioned, seeded schemas (for example SYS, etc.) are not exported. See http://docs.oracle.com/cd/B19306_01/server.102/b14215/dp_export.htm#i1006790
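    For reference, a minimal sketch of the full export/import pair described in the question (the directory object and file names are placeholders):

    expdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full10g.dmp logfile=full10g_exp.log
    impdp system/password full=y directory=DATA_PUMP_DIR dumpfile=full10g.dmp logfile=full10g_imp.log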

    HTH
    Srini

  • Schema export through Oracle Data Pump with Database Vault enabled

    Hello

    I have installed and configured Database Vault on an Oracle 11gR2 (11.2.0.3) database to protect a specific schema (SCHEMA_NAME) via a realm. I followed the following document:
    http://www.oracle.com/technetwork/database/security/twp-databasevault-dba-bestpractices-199882.pdf
    to ensure that the SYS and SYSTEM users have sufficient rights to complete a scheduled Oracle Data Pump export operation.

    I.e. I gave sys and system the following:
    execute dvsys.dbms_macadm.authorize_scheduler_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_scheduler_user('system','SCHEMA_NAME');

    execute dvsys.dbms_macadm.authorize_datapump_user('sys','SCHEMA_NAME');
    execute dvsys.dbms_macadm.authorize_datapump_user('system','SCHEMA_NAME');

    I also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition and Indextype, and sys and system were authorized as OWNER of this realm.

    However, when I try to complete an Oracle Data Pump export of the schema, I get two errors directly after the following line appears in the export log:

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
    ORA-01031: insufficient privileges
    ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 9081

    The export does complete, but with these errors.

    Any help, pointers, suggestions, etc. - in fact anything at all - would be very welcome at this stage.

    Thank you

    I have moved this thread to the "Database - Security" forum. If the document does not help, please open an SR with Support.

    HTH
    Srini

  • Selecting tables to import when using the Data Pump API

    Hello
    Sorry for the trivial question: I export data using the Data Pump API in "TABLE" mode,
    so all of the tables are exported into one .dmp file.

    My question is: how can I import only a few tables using the Data Pump API? How do I set the "TABLES" property, as I would on the command-line interface?
    Can I use the DATA_FILTER procedures? If so, how?

    Really thanks in advance

    Kind regards

    Kahlil

    Hello

    You should use the METADATA_FILTER procedure for this.
    for example:

    dbms_datapump.metadata_filter
                (handle1
                ,'NAME_EXPR'
                ,'IN (''TABLE1'', ''TABLE2'')'
                );
    
    Regards
    Anurag                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        
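    Putting the pieces together, a minimal sketch of a table-mode import through the DBMS_DATAPUMP API; the dump file name and directory object are assumptions:

    SET SERVEROUTPUT ON
    DECLARE
      h  NUMBER;
      js VARCHAR2(30);
    BEGIN
      -- open a TABLE-mode import job
      h := dbms_datapump.open(operation => 'IMPORT', job_mode => 'TABLE');
      -- point the job at the dump file in an existing directory object
      dbms_datapump.add_file(handle => h, filename => 'test.dmp', directory => 'DATA_PUMP_DIR');
      -- restrict the job to the named tables (the API equivalent of TABLES= on the command line)
      dbms_datapump.metadata_filter(handle => h, name => 'NAME_EXPR', value => 'IN (''TABLE1'', ''TABLE2'')');
      dbms_datapump.start_job(h);
      -- block until the job finishes and report its final state
      dbms_datapump.wait_for_job(h, js);
      dbms_output.put_line('Job state: ' || js);
    END;
    /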
    
  • Migration using Data Pump from Oracle 10g to Oracle 11g

    Hi all

    1)
    At the moment I am using Oracle 11g. I plan to import data from Oracle 10g. I would like to know whether it is possible to import data that was exported with Data Pump on Oracle 10g.

    Can I somehow convert the expdp output from Oracle 10g to Oracle 11g format?





    2)
    The next question: if I use expdp to create a dump of the complete database, can I use that *.dmp file to import selected users only? Or can only the complete database be restored?

    Yes, you can import a 10g dump into an 11g database.

    Maybe you should take the time to read the section on Data Pump in the Oracle® [Database Utilities|http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#i1007324] manual.
    : p
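    To illustrate the answer to question 2: the SCHEMAS parameter on import pulls just the named users out of a full dump. A sketch with placeholder names:

    impdp system/password directory=DATA_PUMP_DIR dumpfile=fulldb.dmp schemas=scott,hr logfile=imp_selected.log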

  • Differences between Data Pump and the legacy import and export utilities

    Hi all

    I work as a junior DBA in my organization, and I have seen that we still use the legacy import and export utilities instead of Oracle Data Pump.

    I want to convince my manager to replace the existing process with Oracle Data Pump. I have a meeting with them to present my points and convince them to adopt the Data Pump utility.

    My powers of persuasion are rather weak, and I don't want to be caught short on my own. I really need a strong list of differences versus import and export, so it would be much appreciated if someone could give me strong points for Oracle Data Pump over the legacy import and export.

    Thank you

    Cabbage

    Hello

    As other people have already said, the main advantage of Data Pump is performance - it is not just a little faster than exp/imp, it is massively faster (especially when combined with PARALLEL).

    It is also more flexible (much more, in fact) - it will even create users with schema-level exports, which imp could never do for you (and that was always very annoying).

    It is restartable - jobs can be stopped and resumed.

    It has a PL/SQL API.

    It supports all object types and new features (exp does not - and that alone is probably reason enough to switch).

    There is even a "legacy" mode at 11.2 where most of your old exp parameter files will still work with it - just change exp to expdp and imp to impdp.

    The main obstacles to the transition to Data Pump seem to be along the lines of "what do you mean I have to create a directory object for it to work?" and "where is my dumpfile - why can't it be on my local machine?". These are minor things to get past.

    I suggest you do some sort of demo with real data from one of your large databases - do a full exp and a full expdp with PARALLEL and show them the runtimes so they can compare, something like the sketch below...
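    A sketch of such a side-by-side demo, assuming a directory object DATA_PUMP_DIR exists (file names are placeholders); the %U wildcard lets expdp write multiple dump files in parallel:

    exp system/password full=y file=full_legacy.dmp log=full_legacy.log
    expdp system/password full=y parallel=4 directory=DATA_PUMP_DIR dumpfile=full_dp_%U.dmp logfile=full_dp.log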

    Cheers,

    Rich

  • Data pump import

    I cannot get the Data Pump import tool to work. I am on Oracle 10.2.0.4 on Windows Server 2008. I am trying to import a large amount of data (several schemas, thousands of tables, hundreds of GB of data) from the same version of Oracle on Linux. Since there is so much data and the import will take so long, I am trying to make sure I can make it work for one table or schema before attempting the whole thing. So I am trying to import the TEST_USER.TABLE1 table using the following command:

    impdp.exe dumpfile=big_file.dmp logfile=big_file_import.log tables=test_user.table1 remap_datafile=\'/oradata1/data1.dbf\':\'D:\oracle\product\10.2.0\oradata\orcl\data1.dbf\'

    I provide sys as sysdba to connect when it prompts me (not sure how to pass this in the original command). However, I get the following error:

    the "TEST_USER" user does not exist

    My understanding was that the Data Pump utility would create all of the necessary schemas for me. Is that not the case? The target database is a fresh installation, so the source schemas do not exist there.

    Even if I create the test_user schema by hand, I then get an error message indicating that the tablespace does not exist:

    ORA-00959: tablespace "TS_1" does not exist

    So even the thing I didn't want to have to do first doesn't work. It then complains that the user has no privileges on the tablespace.

    Isn't the Data Pump utility supposed to do this sort of thing automatically? Do I really need to create all of the schemas and tablespaces by hand? That will take a long time. Am I doing something wrong here?

    Thank you
    Dave

    tables=test_user.table1

    The "TABLES" mode does NOT create database accounts.

    FULL mode creates the tablespaces and database accounts before importing the data.

    SCHEMAS mode creates the database accounts before importing data - but it expects the tablespaces to already exist, so that tables and indexes can be created in the correct tablespaces.
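    So for the case in the question, a sketch would be to pre-create the tablespace and then run a SCHEMAS-mode import (this assumes the dump contains the schema definition; the datafile path and sizes are placeholders):

    SQL> create tablespace ts_1 datafile 'D:\oracle\oradata\orcl\ts_1_01.dbf' size 100m autoextend on;

    impdp system/password schemas=test_user directory=DATA_PUMP_DIR dumpfile=big_file.dmp logfile=schema_import.log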

    Hemant K Chitale
    http://hemantoracledba.blogspot.com

  • Following the progress of a Data Pump network import?

    I'm on Oracle 10.2.0.4 (SunOS) and running a Data Pump network import of a list of tables from one schema.

    I see that the following views are available to track Data Pump jobs:

    DBA_DATAPUMP_JOBS - a list of the running Data Pump jobs and their status
    DBA_DATAPUMP_SESSIONS - the user sessions attached to each Data Pump job (which can be joined to V$SESSION)
    DBA_RESUMABLE - shows the work being imported and its status
    V$SESSION_LONGOPS - shows the total size of the import and the elapsed time for Data Pump jobs, if they run long enough

    What other options are available for monitoring the progress of network imports?
    Also, is it possible to see which table is currently being processed during a multi-table network import?

    That would have helped. :^)

    When you run a job, if you do not specify a job name, one will be generated for you. In your case, I don't see a job name specified, so it seems one would have been generated. The generated name looks like:

    SYS_IMPORT_FULL_01 if doing a full import
    SYS_IMPORT_SCHEMA_01 if doing a schema import
    SYS_IMPORT_TABLE_01 if doing a table import
    SYS_IMPORT_TRANSPORTABLE_01 if you are using transportable tablespaces

    The 01 means it is the first job created. If a second job starts while this job is running, it will get 02. The number can also advance if a job fails and is not cleaned up. In your case, you are doing a table import, so the default job name would be something like:

    SYS_IMPORT_TABLE_01 - and let's say you ran it as the SYSTEM schema.

    In this case, you can run this command:

    impdp system/password attach=system.sys_import_table_01

    This will bring you to the Data Pump interactive prompt, where you can type STATUS (or STATUS 10 for a refresh every 10 seconds), etc.
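    To see which object a job is currently working on from SQL, a sketch of a query against V$SESSION_LONGOPS (the OPNAME column carries the Data Pump job name):

    select opname, target_desc, sofar, totalwork, time_remaining
    from   v$session_longops
    where  opname like 'SYS_IMPORT%'
    and    sofar <> totalwork;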

    Dean

  • Updating a schema using Data Pump

    Hello

    Database version: 11.2.0.1.0

    Can we refresh a schema in a database without creating a dump file, similar to what network mode does?

    for example

    Say I have a database DB1 that contains two schemas, test and production. Can we refresh test with production data without creating a dump file on the database server?

    The whole idea is to perform the export and import in a single step, to reduce time and human intervention.

    Currently, I've followed the steps below:

    (1) Export the production data

    (2) Drop the test schema

    (3) Re-create the test schema

    (4) Import the production data into the test schema

    Thank you

    Sery

    Hello

    It is possible:

    SQL> create public database link impdpt connect to system identified by abc123 using 'DG1';

    Database link created.

    SQL>

    -----

    [oracle@prima admin]$ impdp system/abc123 directory=DATA_PUMP_DIR network_link=impdpt schemas=hr remap_schema=hr:hrtn logfile=test_same.log

    Import: Release 11.2.0.4.0 - Production on Tue Dec 29 02:53:24 2015

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production

    With partitioning, OLAP, Data Mining and Real Application Testing options

    Start "SYSTEM". "" SYS_IMPORT_SCHEMA_01 ": System / * Directory = network_link = impdpt schemas = hr remap_schema hr:hrtn logfile = test_same.log = DATA_PUMP_DIR

    Estimate in progress using BLOCKS method...

    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA

    Total estimation using BLOCKS method: 448 KB

    Processing object type SCHEMA_EXPORT/USER

    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT

    Processing object type SCHEMA_EXPORT/ROLE_GRANT

    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE

    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA

    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE

    Processing object type SCHEMA_EXPORT/TABLE/TABLE

    . . imported "HRTN"."COUNTRIES"     25 rows

    . . imported "HRTN"."DEPARTMENTS"   27 rows

    . . imported "HRTN"."EMPLOYEES"    107 rows

    . . imported "HRTN"."JOBS"          19 rows

    . . imported "HRTN"."JOB_HISTORY"   10 rows

    . . imported "HRTN"."LOCATIONS"     23 rows

    . . imported "HRTN"."REGIONS"        4 rows

    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT

    Processing object type SCHEMA_EXPORT/TABLE/COMMENT

    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE

    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX

    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT

    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS

    Processing object type SCHEMA_EXPORT/VIEW/VIEW

    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT

    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER

    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS

    Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" completed at Tue Dec 29 02:54:09 2015 elapsed 0 00:00:44

    [oracle@prima admin]$

  • Migration from 10g to 12c using Data Pump

    Hi. While I have used Data Pump at the schema level before, I'm relatively new to full database imports.

    We are attempting a full database migration from 10.2.0.4 to 12c using the full-database Data Pump method over a database link.

    The DBA has indicated that we should avoid moving SYSAUX and SYSTEM objects, but during the initial documentation review it appeared that these objects are not exported anyway, given TRANSPORTABLE=NEVER. Can anyone confirm this? The import/export log does refer to objects I thought would not be included:

    ...

    23-FEB-15 19:41:11.684: Estimated 3718 TABLE_DATA objects in 77 seconds

    23-FEB-15 19:41:12.450: Total estimation using BLOCKS method: 52.93 GB

    23-FEB-15 19:41:14.058: Processing object type DATABASE_EXPORT/TABLESPACE

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"SYSAUX" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"TEMP" already exists

    23-FEB-15 20:10:33.185: ORA-31684: Object type TABLESPACE:"USERS" already exists

    23-FEB-15 20:10:33.200: Completed 96 TABLESPACE objects in 1759 seconds

    23-FEB-15 20:10:33.208: Processing object type DATABASE_EXPORT/PROFILE

    23-FEB-15 20:10:33.445: Completed 7 PROFILE objects in 1 seconds

    23-FEB-15 20:10:33.453: Processing object type DATABASE_EXPORT/SYS_USER/USER

    23-FEB-15 20:10:33.842: Completed 1 USER objects in 0 seconds

    23-FEB-15 20:10:33.852: Processing object type DATABASE_EXPORT/SCHEMA/USER

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OUTLN" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"ANONYMOUS" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"OLAPSYS" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"MDDATA" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"SCOTT" already exists

    23-FEB-15 20:10:52.368: ORA-31684: Object type USER:"LLTEST" already exists

    23-FEB-15 20:10:52.372: Completed 1140 USER objects in 19 seconds

    23-FEB-15 20:10:52.375: Processing object type DATABASE_EXPORT/ROLE

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"SELECT_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"EXECUTE_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.255: ORA-31684: Object type ROLE:"DELETE_CATALOG_ROLE" already exists

    23-FEB-15 20:10:55.256: ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists

    ...

    Thanks for any insight.

    The SYS, CTXSYS, MDSYS and ORDSYS schemas are not exported using exp/expdp.

    Doc ID: Note 228482.1

    I guess the 12c software had already been installed and a database already created, it seems - so when you imported, you got these "already exists" errors.

    Whenever the software is installed and a database is created, the SYSTEM, SYS and SYSAUX objects are created by default.

  • Data Pump Export/Import

    Hello Forum,

    I have a question regarding Data Pump imports and exports - perhaps something I should already know.

    I need to purge a table that has about 200 million rows; I need to get rid of about three quarters of the data.

    My intention is to use Data Pump to export the table along with its indexes and constraints, etc.

    The table has no relationships to any other table; it is made up of approximately 8 columns with NOT NULL constraints.

    My plan is

    1. Truncate the table

    2. Disable or drop the indexes

    3. Leave the constraints in place?

    4. Use Data Pump to import only the rows I want to keep.

    My question

    Will my indexes and constraints be imported too, when I want to import only a subset of my exported table?

    or

    If I drop the table instead of just truncating it, will I be able to import my table and indexes, even if I use the subquery (QUERY) functionality as part of my import statement?

    When using the Data Pump QUERY (subquery) functionality, does my table have to exist in the database before doing the import,

    or will the Data Pump import handle it as usual, i.e. create the table, indexes, grants and statistics, etc.?

    Thank you for your comments.

    Regards

    Your approach is inefficient.

    What you need to do is:

    create table foo as select * from bar where ...;

    truncate table bar;

    insert /*+ APPEND */ into bar select * from foo;

    Rebuild the indexes on the table.

    Done.

    This whole exercise with expdp and impdp is just a waste of resources. My approach generates minimal redo.

    ----------

    Sybrand Bakker

    Senior Oracle DBA

  • Data Pump import errors

    Hi all

    I am getting the errors below when trying to use Data Pump to import a table from a dump file (taken from a pre-production database) into a different database (prod_db) of the same version. I used expdp to generate the dump file.

    The errors I am getting:
    ORA-39083
    ORA-00959
    ORA-39112

    Any suggestions or advice would be appreciated.

    Thank you


    Import: Release 10.2.0.1.0 - 64 bit Production

    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64 bit Production
    With partitioning, OLAP and Data Mining options
    Master table "SYSTEM"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
    Starting "SYSTEM"."SYS_IMPORT_TABLE_01": system/******** DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    ORA-39083: Object type TABLE failed to create with error:
    ORA-00959: tablespace "OXFORD_DATA_01" does not exist
    Failing sql is:
    CREATE TABLE "xxx"."CLAIMS" ("CLAIM_NUM" VARCHAR2(25) NOT NULL ENABLE, "LINE_NUM" NUMBER(3,0) NOT NULL ENABLE, "PAID_FLAG" CHAR(1), "ADJ_FLAG" CHAR(1), "CLAIM_TYPE" VARCHAR2(20), "MEM_ID" VARCHAR2(20), "MEM_SUF" VARCHAR2(20), "MEM_BEG_DATE" DATE, "MEM_REL" CHAR(2), "MEM_NAME" VARCHAR2(40), "MEM_DOB" DATE, "MEM_SEX" CHAR(1), "REC_DATE" DATE, "PAID_DATE" DATE, "FROM_DATE" DATE) ...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_IDX1" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    ORA-39112: Dependent object type INDEX:"xxx"."CLAIMS_PK" skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_IDX1" creation failed
    ORA-39112: Dependent object type INDEX_STATISTICS skipped, base object type INDEX:"xxx"."CLAIMS_PK" creation failed
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    ORA-39112: Dependent object type TABLE_STATISTICS skipped, base object type TABLE:"xxx"."CLAIMS" creation failed
    Job "SYSTEM"."SYS_IMPORT_TABLE_01" completed with 6 error(s) at 13:49:33


    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS

    Either create the tablespace beforehand, or use the REMAP_TABLESPACE clause on the import.
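    A sketch of the REMAP_TABLESPACE variant, assuming the existing tablespace USERS as the target (substitute your own):

    impdp username/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=EXP_Mxxxx_1203.DMP LOGFILE=Mxxxx_CLAIMS.LOG TABLES=CLAIMS REMAP_TABLESPACE=OXFORD_DATA_01:USERS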
