[4.0 AI2] DataPump export job: "SCN must be all positive"

I would like to recommend SQL Developer as the user interface for the Data Pump work (since we do not have OEM).

On the export job interface, choosing the "consistent read" view of the export data fails if the SCN is >= 2147483648. I get the error "SCN must be all positive".

While I'm here, the 'Deadline' option fails too, since the displayed date uses the local (client) settings while the generated PL/SQL code uses the database settings!

This is a show-stopper for recommending the tool, because I don't want exports without consistent reads...

Hey, I looked at the code and, in fact, we use an int there; I will have this changed. Can you clarify what you mean about the date? I don't see your problem...
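
For reference, a minimal sketch of how to check whether a database's SCN has already crossed the 32-bit boundary (2147483648 = 2^31):

select current_scn from v$database;
-- or, equivalently:
select dbms_flashback.get_system_change_number from dual;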

Thank you

Tags: Database

Similar Questions

  • By default, DataPump export is inconsistent...

    Oh my...

    I seem to have come to realize that the Oracle default for DataPump is an inconsistent export.

    Recently, I tried to refresh our production test environment and got:

    ORA-39083: Object type REF_CONSTRAINT failed to create with error:
    ORA-02298: cannot validate (VS2AU.FK_LEDGER_ROW_CALL) - parent key not found

    For reference, we use Oracle 11.1.0.7 on AIX 6.1.

    Oracle reference Note: 462645.1 says that we must use FLASHBACK_TIME or FLASHBACK_SCN with the NETWORK_LINK parameter.

    Questions I have are:

    Does this mean that we MUST have flashback retention and a flashback query area put in place?
    I hope not, because that has been problematic for the archivelogs and RMAN (for us).

    Then, how can I get the latest SCN in an automated cron script, to pass it as a parameter?

    Thanks in advance.


    First - make sure you have all of the SET options in place: set heading off, set linesize, set echo off, etc. When you run this script file, you will get a file named scn.par that contains something like:

    flashback_scn = 527968

    That's all you want in this file. You don't want the heading, the 'Connected to' lines, etc. The set linesize, set echo off, etc. commands that are defined in the script I gave you will create the parfile the way you want it.
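
    A minimal sketch of the kind of script described above (the file name scn.par and the SET options match the earlier description; adjust to taste):

    -- scn.sql: spool the current SCN into a Data Pump parfile
    set heading off
    set feedback off
    set echo off
    set verify off
    set trimspool on
    set linesize 80
    spool scn.par
    select 'flashback_scn=' || dbms_flashback.get_system_change_number from dual;
    spool off
    exit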

    Then run your datapump job as:

    expdp system/<password> \
        FULL=Y \
        COMPRESSION=ALL \
        parfile=scn.par \
        directory=<directory> \
        dumpfile=<dumpfile>.dmp \
        logfile=<logfile>.log

    DataPump and classic exp/imp allow you to set some parameters in a file that is called a parfile. You specify it as

    parfile = my_par_file.par

    In your case, the parfile is called scn.par and it contains a single line, as described above.

    Dean

    Edited by: Dean WINS on 14 August 2009 13:42

  • error during datapump export

    Hi gurus,

    Oracle Version: 10.2.0.3
    Operating system: Linux

    Today we had a problem when the datapump export ran on the production server; the error is:
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 9.126 GB
    Processing object type SCHEMA_EXPORT/USER
    Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
    Processing object type SCHEMA_EXPORT/ROLE_GRANT
    Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
    Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
    Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4869,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4868,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4867,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    Processing object type SCHEMA_EXPORT/SEQUENCE/SEQUENCE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/FUNCTION/FUNCTION
    Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
    Processing object type SCHEMA_EXPORT/PACKAGE/COMPILE_PACKAGE/PACKAGE_SPEC/ALTER_PACKAGE_SPEC
    Processing object type SCHEMA_EXPORT/FUNCTION/ALTER_FUNCTION
    Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
    Processing object type SCHEMA_EXPORT/VIEW/VIEW
    Processing object type SCHEMA_EXPORT/PACKAGE/PACKAGE_BODY
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4869,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4868,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    ORA-39127: unexpected error from call to export_string := SYS.DBMS_RMGR_GROUP_EXPORT.grant_exp(4867,1,...) 
    ORA-06502: PL/SQL: numeric or value error: NULL index table key value
    ORA-06512: at "SYS.DBMS_RMGR_GROUP_EXPORT", line 130
    ORA-06512: at line 1
    ORA-06512: at "SYS.DBMS_METADATA", line 5118
    . . exported "QFUNDAPAYPROD_JUN09"."ST_LO_TRANS"         213.3 MB 2017789 rows
    . . exported "QFUNDAPAYPROD_JUN09"."ST_EXTERNAL_LOG"     712.6 MB 14029747 rows
    . . exported "QFUNDAPAYPROD_JUN09"."ST_LO_MASTER"        198.1 MB  915035 rows
    . . exported "QFUNDAPAYPROD_JUN09"."ST_LO_APPORTIONS"    110.7 MB 3951926 rows
    Can someone help me please? This is a serious problem; how do I solve it?

    Thank you and best regards,
    Kahina Prasad.S

    SIDDABATHUNI wrote:
    Hi orawiss,

    Here is my export command:

    expdp username/password directory=dbdump dumpfile=D_JUN09_$DT.dmp exclude=statistics,grants,PROCOBJ logfile=D_JUN09_$DT.log job_name=QFU
    

    But only this morning I added the PROCOBJ parameter to exclude it.

    Thank you and best regards,
    Kahina Prasad.S

    This is Bug 4358907; see the MOS note: ORA-39127 using datapump expdp [ID 451987.1].

    Try running the export without the EXCLUDE option.
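
    For example, a sketch based on the command quoted above, with the EXCLUDE clause removed:

    expdp username/password directory=dbdump dumpfile=D_JUN09_$DT.dmp logfile=D_JUN09_$DT.log job_name=QFU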

  • Can't export (datapump) an Oracle 10gR2 (10.2.0.3) database from 10.2.0.4

    Hello

    Here's our environment (OS, RDBMS) for all our Oracle machines:

    - Windows Server 2003 SP2 and Windows x64 Enterprise servers
    - Oracle 10g R2, 10.2.0.3 and 10.2.0.4

    I'm simply trying to export a schema (from a Windows 64-bit server, Oracle 10.2.0.4) that is on an Oracle 10.2.0.3 database (using Datapump).

    I have no error messages, but nothing happens. Here is the log of the execution of the datapump export:
    ;;;

    Export: Release 10.2.0.4.0 - 64 bit Production on Wednesday, September 16, 2009 10:09:54

    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    ;;;

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
    With the options of Real Application Clusters, partitioning, OLAP and Data Mining

    But I am able to import (again using Datapump) a schema that was on an Oracle 10.2.0.3 database into the same Oracle (64-bit) 10.2.0.4 machine.

    I was sure that I was able to export (datapump) a database with a lower version than the version of the software running the datapump export.

    Any idea?

    Thank you

    Since the dump file will be written server-side anyway, why would you want to export 10.2.0.3 remotely using 10.2.0.4?
    Better to use the same version of the utility as the server.

    Moving data between different database Versions
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_overview.htm#CEGFCFFI
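
    For moving dumps between releases (the link above), Data Pump's VERSION parameter controls the dumpfile compatibility. A minimal sketch (schema and file names are placeholders):

    expdp system/<password> schemas=<schema> version=10.2.0.3 directory=DATA_PUMP_DIR dumpfile=schema_102.dmp logfile=schema_102.log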

  • Adobe Reader software has appeared on my desktop twice now, with all its subfolders etc.

    Adobe Reader software has appeared on my desktop twice now, with all its subfolders etc. How can I get it off my desktop?

    Original title: Desktop problems

    Hello

    Thanks for posting in the Microsoft community!

    (1) Did you make any changes before the problem appeared?

    (2) Did you change the desktop directory during the installation of Adobe Reader?

    (3) Have you tried to delete the folder?

    You can try to delete the files and re-install the application. Make sure that the installation directory is c:/program files.

    Clean sweep: how to remove unwanted files and programs

    http://www.Microsoft.com/athome/Setup/CleanSweep.aspx#fBid=SgPtt5ts3EM

  • Export DataPump with the query option

    Hi all

    My environment is IBM AIX, Oracle 10.2.0.4.0 database.

    I need to export a few sets of records from production using a query. The query joins several tables. Since we have BLOB data types, we export using datapump.

    We have lower environments, but they do not have the same set of data and tables, so I was not able to simulate the same query there. But I created a small table and faked the query.

    My command is:

    expdp system/<pwd>@orcl tables=dump.dump1 query=dump.dump1:\"where num < 3\" directory=DATA_PUMP_DIR dumpfile=exp_dp.dmp logfile=exp_dp.log

    The query in the command pulls exactly two records. After running the command above, I see an 80 KB dump file.
    In the export log file I see:

    Total estimation using BLOCKS method: 64 KB
    . . exported "DUMP"."DUMP1"    4.921 KB    2 rows

    My doubts are:
    (1) Is the command that I am running correct?
    (2) The estimate said 64 KB, while it also says 4.921 KB exported. But the dump file created is 80 KB. Was it exported correctly?
    (3) Given that I am running it as SYSTEM, will it export all the data, apart from the 2 rows? We must send the dump file to the other department; we should not export any data other than the query output.
    (4) If I do not use "tables=dump.dump1" in the command, the export file gets big. I don't know which is right.

    Your answers will be most helpful.

    The short answer is 'YES', it did the right thing.

    The long answer is:

    The query in the command pulls exactly two records. After running the command above, I see an 80 KB dump file.
    In the export log file I see:

    Total estimation using BLOCKS method: 64 KB
    . . exported "DUMP"."DUMP1"    4.921 KB    2 rows

    My doubts are:
    (1) Is the command that I am running correct?

    Yes. As long as your query is correct, DataPump will export only the rows that match that query.

    (2) The estimate said 64 KB, while it also says 4.921 KB exported. But the dump file created is 80 KB. Was it exported correctly?

    The estimate is made using the full table. Since you didn't specify an estimate method, it used the BLOCKS estimation method: basically, how many blocks have been allocated to this table. In your case, I guess it was 80 KB.

    (3) Given that I am running it as SYSTEM, will it export all the data, apart from the 2 rows? We need to send the dump file to the other department; we should not export any data other than the query output.

    It will not export all of the data, but it is going to export metadata. It exports the table definition, all indexes on it, all the statistics on the table and indexes, etc. This is why the dump file can be bigger. There is also a 'master' table that describes the export job, which also gets exported. It is used by export and import to find out what is in the dumpfile, and where in the dumpfile those things are. It is not user data, but this table needs to be exported and will take up space in the dumpfile.

    (4) If I do not use "tables=dump.dump1" in the command, the export file gets big. I don't know which is right.

    If you only want this table, then your export command is right. If you want to export more, then you need to change your export command. From what you say, it seems that your command is correct.

    If you do not want any metadata exported, you can add:

    content=data_only

    on the command line. This will export only the data, and when the dumpfile is imported, the table must already exist.
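
    For instance, a data-only variant of the parfile for the job above might look like this (a minimal sketch; only content=data_only is new relative to the original command):

    tables=dump.dump1
    query=dump.dump1:"where num < 3"
    content=data_only
    directory=DATA_PUMP_DIR
    dumpfile=exp_dp_data_only.dmp
    logfile=exp_dp_data_only.log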

    Dean

  • Import with datapump when the datapump export was executed as user SYS

    Hi all

    All I have is a dumpfile and a datapump export log file. The export was executed as user SYS:

    Export: Release 11.2.0.1.0 - Production on Wed Dec 3 12:02:22 2014

    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    ;;;
    Connected to: Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
    Starting "SYS"."SYS_EXPORT_FULL_01":  "sys/***@database AS SYSDBA" directory=data_pump_dir dumpfile=db_full.dmp logfile=db_full.log full=y
    Estimate in progress using BLOCKS method...
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 52.84 GB

    Now I have to import (with datapump) the users USER01 and USER02 from the export. But I don't know the names of the source database tablespaces.

    I want to keep the user names (USER01/USER02). That's why I created these users in the target database.

    The questions I have are:

    - Should I start the datapump import as user SYS?

    - What settings should I use to import users USER01, USER02 and their data into the target database? Since I do not know the names of the tablespaces
    in the source database, the REMAP_TABLESPACE parameter will not help.

    Any help will be appreciated

    J.

    Hi J.

    The questions I have are:

    - Should I start the datapump import as user SYS?

    No, you need to import with a user that has the "imp_full_database" role.

    - What settings should I use to import users USER01, USER02 and their data into the target database? Since I don't know the names of the tablespaces in the source database, the REMAP_TABLESPACE parameter will not help.

    Well, one idea is to generate a schema-import SQLFILE and see in the DDL which tablespaces it will try to create objects in.

    impdp \'/ as sysdba\' directory=<directory> dumpfile=<dumpfile> schemas=USER01,USER02 sqlfile=<sqlfile>.sql

    For more information, take a look at the import documentation

    http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_import.htm
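
    Once the SQLFILE DDL reveals the source tablespace names, a remapping import could then look like this (a minimal sketch; SRC_TBS and USERS are hypothetical tablespace names):

    impdp \'/ as sysdba\' directory=data_pump_dir dumpfile=db_full.dmp schemas=USER01,USER02 remap_tablespace=SRC_TBS:USERS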

    Hope this helps,

    Kind regards

    Anatoli A.

  • Parallel sessions on Export Datapump (10.2.0.4)

    Hello

    We use Oracle 10.2.0.4 on Solaris and I am exporting a table using Datapump export.

    The export includes a query that selects from three tables based on relevant conditions. The parfile specifies 'parallel=4' and the dumpfile setting uses %U so that it creates an appropriate number of files.

    When I run the export using my own (DBA) account (i.e. expdp mr_dba parfile=exp_xyz.par) the export completes in 15 minutes and creates four dumpfiles. When I run the export as the schema owner using the exact same parfile (i.e. expdp schema_own parfile=exp_xyz.par) the export takes more than two hours and creates only two dumpfiles.

    Could someone suggest things I could look at to find out why there is such a difference in elapsed time? The exports have been executed repeatedly as both users with similar loads on the box, and the results are pretty consistent: 15 minutes for my user and two hours for the schema owner.

    The schema owner has a different profile and a different resource consumer group, but both my profile and the schema owner's profile have 'sessions_per_user' set to unlimited. In the resource manager, the Parallel_Degree_Limit_P1 value is set to 16 for my consumer group and is not defined at all for the schema owner's consumer group.

    I did notice that when exporting under the schema owner, DBA_DATAPUMP_SESSIONS showed a DBMS_DATAPUMP session, a MASTER session and two WORKER sessions. When I run it under my user name, it shows these four sessions but also shows three EXTERNAL TABLE sessions. This suggests that it is using a different approach, but I don't know why that would be.

    Any advice would be very welcome. I have not posted specific information on the settings file or the tables because I don't know what information people may require - so if you need the details of anything, please let me know.

    Thank you very much.

    Is the schema that takes more time privileged or not? If it doesn't have the privilege, can you grant it and let me know if that makes a difference? I seem to remember hearing something about parallel and non-privileged accounts, but I can't seem to find anything about it.
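
    For example (a minimal sketch; EXP_FULL_DATABASE is the standard export role, and schema_own stands for the schema owner mentioned above):

    GRANT EXP_FULL_DATABASE TO schema_own;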

    Thank you

    Dean

  • EXPORT DATAPUMP with query on a table

    Hello

    I would like to ask how to include a query in a parfile using EXPDP.

    My statement is:

    select * from apt.sales where TO_CHAR(or_date, 'YYYY') = '2011';

    My goal is to export only the ORs (receipts) generated in 2011 from the table APT.SALES.

    Is my parfile correct?

    schemas = APT
    Query = APT. SALES:'"where TO_CHAR(or_date,'YYYY') = '2011'" ' "
    Directory = DATA_PUMP_DIR
    dumpfile = ORin2011.dmp
    logfile - ORin2011.log

    I'm working on an Oracle 10.2.0.1 database on Windows Server 2003.

    Help, please.

    Try to:
    1. use the TABLES keyword, otherwise all the schema data is exported
    2. use simpler quoting for the QUERY parameter
    3. use '=' for the LOGFILE parameter

    Example with the table HR.T on Windows:

    tables=HR.T
    query=HR.T:"where TO_CHAR(x,'YYYY') = '2011'"
    directory=DATA_PUMP_DIR
    dumpfile=ORin2011.dmp
    logfile=ORin2011.log
    
    C:\>expdp system/xxx parfile=et.par
    
    Export: Release 11.2.0.2.0 - Production on Tue Oct 18 08:17:36 2011
    
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    
    Connected to: Oracle Database 11g Express Edition Release 11.2.0.2.0 - Production
    Starting "SYSTEM"."SYS_EXPORT_TABLE_01":  system/******** parfile=et.par
    Estimate in progress using BLOCKS method...
    Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 64 KB
    Processing object type TABLE_EXPORT/TABLE/TABLE
    Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "HR"."T"                                    5.007 KB       1 rows
    Master table "SYSTEM"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for SYSTEM.SYS_EXPORT_TABLE_01 is:
      C:\ORACLEXE\APP\ORACLE\ADMIN\XE\DPDUMP\ORIN2011.DMP
    Job "SYSTEM"."SYS_EXPORT_TABLE_01" successfully completed at 08:17:40
    
  • Import and export datapump

    Hello

    I'm trying to export some tables from the hr schema and import them into the orcl schema, but after the import completed successfully, I tried to view the tables in the orcl schema and it shows me the error *'ORA-00942: table or view does not exist'*. All my scripts can be found below.


    CREATE USER ORCL IDENTIFIED BY HR;

    GRANT CREATE SESSION, IMP_FULL_DATABASE TO ORCL;

    ALTER USER ORCL QUOTA UNLIMITED ON USERS;

    GRANT UNLIMITED TABLESPACE TO ORCL;


    EXPORT:

    D:\> expdp hr/hr schemas=hr include=TABLE:\"IN (\'JOB_HISTORY\', \'REGIONS\', \'COUNTRIES\', \'LOCATIONS\', \'DEPARTMENTS\', \'JOBS\', \'EMPLOYEES\')\" directory=ORC_DIR dumpfile=hrschema.dmp logfile=hrschema.log

    /* Export: Release 10.2.0.1.0 - Production on Thursday, May 12, 2011 14:31:06

    Copyright (c) 2003, 2005, Oracle. All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With partitioning, OLAP and Data Mining options
    Starting "HR"."SYS_EXPORT_SCHEMA_01":  hr/******** schemas=hr include=TABLE:"IN (\'JOB_HISTORY\',\'REGIONS\',\'COUNTRIES\',\'LOCATIONS\',\'DEPARTMENTS\',\'JOBS\',\'EMPLOYEES\')" directory=ORC_DIR dumpfile=hrschema.dmp logfile=hrschema.log
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    Total estimation using BLOCKS method: 448 KB
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    . . exported "HR"."COUNTRIES"        6.093 KB      25 rows
    . . exported "HR"."DEPARTMENTS"      6.640 KB      27 rows
    . . exported "HR"."EMPLOYEES"        15.80 KB     107 rows
    . . exported "HR"."JOBS"             6.609 KB      19 rows
    . . exported "HR"."JOB_HISTORY"      6.585 KB      10 rows
    . . exported "HR"."LOCATIONS"        7.710 KB      23 rows
    . . exported "HR"."REGIONS"          5.296 KB       4 rows
    Master table "HR"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
    ******************************************************************************
    Dump file set for HR.SYS_EXPORT_SCHEMA_01 is:
    D:\ORACLE DIRECTORY\HRSCHEMA.DMP
    Job "HR"."SYS_EXPORT_SCHEMA_01" successfully completed at 14:31:21 */


    IMPORT:


    D:\> impdp orcl/hr schemas=hr include=TABLE:\"IN (\'JOB_HISTORY\', \'REGIONS\', \'COUNTRIES\', \'LOCATIONS\', \'DEPARTMENTS\', \'JOBS\', \'EMPLOYEES\')\" TABLE_EXISTS_ACTION=REPLACE directory=ORC_DIR dumpfile=hrschema.dmp logfile=hrschema.log

    /* Import: Release 10.2.0.1.0 - Production on Thursday, May 12, 2011 14:41:26

    Copyright (c) 2003, 2005, Oracle. All rights reserved.

    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With partitioning, OLAP and Data Mining options
    Table main "ORCL". "' SYS_IMPORT_SCHEMA_01 ' properly load/unloaded
    Departure "ORCL". "' SYS_IMPORT_SCHEMA_01 ': orcl / * schemas = hr include = TABLE:
    "IN (\'JOB_HISTORY\',\'REGIONS\',\'COUNTRIES\',\'LOCATIONS\',\'DEPARTMENTS\',\'J.
    (OBS\', \'EMPLOYEES\') ' TABLE_EXISTS_ACTION = REPLACE directory = dumpfile ORC_DIR = h
    Chema.dmp logfile = hrschema.log
    Processing object type SCHEMA_EXPORT/TABLE/TABLE
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    . . imported "HR"."COUNTRIES"        6.093 KB      25 rows
    . . imported "HR"."DEPARTMENTS"      6.640 KB      27 rows
    . . imported "HR"."EMPLOYEES"        15.80 KB     107 rows
    . . imported "HR"."JOBS"             6.609 KB      19 rows
    . . imported "HR"."JOB_HISTORY"      6.585 KB      10 rows
    . . imported "HR"."LOCATIONS"        7.710 KB      23 rows
    . . imported "HR"."REGIONS"          5.296 KB       4 rows
    Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type SCHEMA_EXPORT/TABLE/COMMENT
    Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type SCHEMA_EXPORT/TABLE/TRIGGER
    Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
    Work "ORCL". "" SYS_IMPORT_SCHEMA_01 "carried out at 14:41:31 * /.

    Use this at import time:
    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref340

  • Export DataPump - how to import specified users?

    Hi gurus,

    I have a big problem. I exported all users of an Oracle instance using a datapump command, and right now I have a problem importing this into a new instance.

    Is it possible to import only specified users (not the full instance) from the datapump dump file?

    You can use the SCHEMAS & REMAP_SCHEMA options.

    If you want to import with the same name:

    impdp directory=directory_name dumpfile=dumpfile_name.dmp logfile=logfile_name.log schemas=schema_name

    If you want to import with a different name:

    impdp directory=directory_name dumpfile=dumpfile_name.dmp logfile=logfile_name.log schemas=schema_name remap_schema=export_schema:import_schema

    You may also try importing into a different tablespace using REMAP_TABLESPACE.
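
    For example (a minimal sketch; the tablespace names are placeholders):

    impdp directory=directory_name dumpfile=dumpfile_name.dmp logfile=logfile_name.log schemas=schema_name remap_tablespace=source_tbs:target_tbs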

    Edited by: Riou on July 9, 2010 01:12

  • Import/Export DataPump for refreshing production ERP test

    Hi all

    We have Oracle Applications 11i running on AIX 5.3 ML8 in production and test. Production is standalone with a 10.2.0.4 database, and test is RAC with database version 10.2.0.4.

    Now the question is that we intend to refresh Test from production every night. We plan to use Datapump Import/Export for that. I just wanted to know from you guys if anybody has had a bad experience using this utility with ERP.

    Thank you and best regards,
    Vijay

    Vijay,

    I am apprehensive about using export/import because someone told me that with an ERP-type database we should not use Export/Import, although he was not able to give the exact reason for this. Is there any problem using Import/Export for the ERP database refresh?

    Using import/export is supported with the Oracle Apps database (for full database exp/imp, and for certain schemas such as custom ones). For the Apps schema, I believe it is not supported due to object dependencies and integrity constraints. However, you can open an SR to confirm this with Oracle Support.

    Kind regards
    Hussein

  • Compression and Export DataPump

    Is it possible to compress a Data Pump export?

    Which DB version?

    In Oracle 12c it is possible, but you need Oracle DB Enterprise Edition with the Advanced Compression Option:

    http://docs.Oracle.com/database/121/Sutil/dp_export.htm#SUTIL4051
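
    For example, a compressed export might look like this (a minimal sketch; schema and file names are placeholders, and compression=ALL requires the licensing noted above):

    expdp system/<password> schemas=<schema> compression=ALL directory=DATA_PUMP_DIR dumpfile=comp.dmp logfile=comp.log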

  • export DataPump

    Hi all experts.

    We are on Windows Server 2008 R2 and our database is 11.2.0.3 on two RAC nodes. I want to refresh a staging schema from the same production schema. I know I have to export the schema with EXPDP and then import it into staging. Are there other steps I have to perform? My steps are below; please let me know if I need to save any information in staging, or perform any steps in staging prior to the import:

    (1) back up the staging schema with expdp

    (2) export production schema X from production

    (3) drop the schema X user in staging

    (4) drop the tables, constraints, and other objects of schema X in staging

    (5) import production schema X into staging

    (6) recompile invalid objects for schema X in staging

    Do I have to save all the privileges and grants of schema X in staging before importing? Please give me your comments. Thank you.

    Hello

    Regarding 3 and 4: if your staging user contains objects, you can use the CASCADE option when dropping the user; this will also remove its objects.

    If you need to keep the grants and privileges, you can instead drop all of the staging user's objects and keep the user itself.
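
    A minimal sketch of both options (X is the schema from the steps above; run as a DBA):

    -- Option 1: drop the user and everything it owns
    DROP USER x CASCADE;

    -- Option 2: keep the user (and its grants); generate DROP statements
    -- for its objects instead, then run the spooled output
    select 'DROP ' || object_type || ' X.' || object_name ||
           case object_type when 'TABLE' then ' CASCADE CONSTRAINTS' end || ';'
    from   dba_objects
    where  owner = 'X'
    and    object_type in ('TABLE','VIEW','SEQUENCE','SYNONYM',
                           'PROCEDURE','FUNCTION','PACKAGE');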

    Kind regards

    --

    Bertrand

  • Import/export DataPump object grants given by another user

    Hello

    I searched through the forum and the documentation but couldn't find an answer. I have the following case:

    (1) I have a user USER1 who has been given grants on tables belonging to the user SCOTT.
    (2) User SCOTT gives the SELECT right on his table EMP to USER1:
    GRANT SELECT ON emp TO user1;

    (3) I export the USER1 schema (schemas=USER1). But in addition I also perform a full database export (full=y).
    (4) I drop the user USER1 and then import it back.

    After the import, the user USER1 no longer has the SELECT right on SCOTT.EMP (no matter whether I imported from the schema-mode or the full-mode export dump file). Is it possible to import the user so that it has the same exact privileges that it had at export time (including the ones given to it by other users)? I know that the privileges granted on objects owned by SYS are not exported, but what about other non-system users?

    Thanks in advance for any answers.

    Kind regards
    Swear

    Swear,

    Grants are imported when the objects they belong to are imported, not when the schema that the grants were awarded to is imported. So, given that scott made the grant on scott.emp to user1, this grant is imported when scott.emp is imported. Grants also only get exported when the object is exported. Because scott.emp was not exported, this grant will not be exported when only the USER1 schema is exported.

    I don't know of a single-step import that will get you what you want. If it's just the grants that you are looking for, you can do it in 2 steps, and it has to be from a full export dumpfile. The export could be trimmed down a bit so it would be a condensed dumpfile. Here's what you do:

    1. From the source database:
    a. do this if you want a condensed dumpfile:
    expdp system/manager directory=dpump_dir dumpfile=full_grant.dmp include=grant
    b. do this if you want a complete dumpfile:
    expdp system/manager directory=dpump_dir dumpfile=full.dmp

    2. From the source or the target database:
    a. if you just want the source grants, do this:
    impdp system/manager directory=dpump_dir dumpfile=full_grant.dmp sqlfile=grant_only.sql
    b. if you have a full dump and only want to see grants, do this:
    impdp system/manager directory=dpump_dir dumpfile=full.dmp include=grant sqlfile=grant_only.sql
    c. if you have only a complete dumpfile and you want to see everything, then:
    impdp system/manager directory=dpump_dir dumpfile=full.dmp sqlfile=full.sql

    Now, you can edit either of these .sql files to find the commands you want. Your best bet would be to look at grant_only.sql and search for "to user1". This will give you all the grants given to USER1. These could be run from sqlplus.

    A shortcut, if you have database links configured, would be something like:
    impdp system/manager directory=dpump_dir network_link=link_from_target_to_source include=grant sqlfile=grant_only.sql
    Then edit the file as shown above.
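
    As an alternative, if the source database is still available, USER1's object grants can also be reconstructed straight from the dictionary before the drop (a minimal sketch):

    select 'GRANT ' || privilege || ' ON ' || owner || '.' || table_name ||
           ' TO ' || grantee ||
           decode(grantable, 'YES', ' WITH GRANT OPTION') || ';'
    from   dba_tab_privs
    where  grantee = 'USER1';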

    I hope this helps.

    Dean
