Data pump export
Hello
I use a command of the form
expdp system/ar8mswin1256@nt11g schemas=dbo_mobile_webresults_test dumpfile=31082015.dmp
and I face this error:
UDE-00018: Data Pump client is incompatible with database version 11.01.00.07.00
I think it's a version problem.
I found that the database on the server I connect to is 11.2.0.1.0 - 64-bit,
while my client is 11.1.0.7.0.
I tried it on another PC and it worked.
Thank you very much
Tags: Database
Similar Questions
-
Excluding materialized views from a data pump export
Hello
I use Oracle 11g (11.1) RAC on Linux.
I am trying to exclude materialized views from a data pump schema export. I used EXCLUDE=MATERIALIZED_VIEW in the
parameter file. The materialized view is not exported, but the table associated with the view is still exported. Does anyone know how to exclude those as well?
Thank you
Richard

You will need to specify the tables in the EXCLUDE parameter.
EXCLUDE=MATERIALIZED_VIEW alone will not exclude the associated master tables.
Specify both in a single EXCLUDE parameter:
EXCLUDE=MATERIALIZED_VIEW,TABLE:"IN ('EMP','DEPT')"
-
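Putting the answer above into a complete parameter file might look like this (the schema, directory, and file names are illustrative, not from the thread; the quoting inside a parfile avoids shell escaping):

```
# exclude_mv.par -- a sketch; adjust names to your environment
schemas=scott
directory=dmpdir
dumpfile=no_mviews.dmp
logfile=no_mviews.log
exclude=MATERIALIZED_VIEW,TABLE:"IN ('EMP','DEPT')"
```

Run with: expdp scott/tiger parfile=exclude_mv.par
-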
Data Pump Export Wizard in TOAD
Hello
I am new to the TOAD interface.
I would like to export tables from one database and import them into another. I intend to use the Data Pump Export / Import Wizard in TOAD.
I installed Toad 9.1 on Windows XP and I connect to the box (the DB server, a UNIX machine) using the Oracle client.
I know the command-line data pump processes, i.e. $expdp and $impdp.
But I don't have direct access to the UNIX box, i.e. the host, so I would like to use TOAD to do the job.
I would like to know what the process for this is.
How different is it from using the command-line data pump? With TOAD, where do we create the DATAPUMP DIRECTORY?
Can I do it on the local computer?
Basically, I would like to know the process of import/export using TOAD with no direct access to the UNIX machine.
Thanks in advance.

user13517642 wrote:
> (full question quoted above)
I don't think you can do this with TOAD without physically copying the file to the remote host. However, you have another option: you can use a database link to load the data without copying it to the remote host, using the NETWORK_LINK parameter, as described below.

For export:
http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref144

The NETWORK_LINK parameter initiates an export using a database link. This means that the system to which the expdp client is connected contacts the source database referenced by source_database_link, retrieves data from it, and writes the data to a dump file on the connected system.

For import:
http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_import.htm#sthref320

The NETWORK_LINK parameter initiates a network import. This means that the impdp client initiates the import request, typically in the local database. That server contacts the remote source database referenced by source_database_link, retrieves data from it, and writes it directly into the target database. There is no dump file involved.

Kamran Agayev A.
Oracle ACE
- - - - - - - - - - - - - - - - - - - - -
My Oracle video tutorials: http://kamranagayev.wordpress.com/oracle-video-tutorials/ -
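Following the NETWORK_LINK suggestion in the answer above, a sketch of an export that pulls data over a database link (the link name, credentials, TNS alias, and file names are illustrative assumptions, not from the thread):

```
-- On the machine where the dump file should be written, create a link
-- to the source database, then run expdp against the local instance:
--   SQL> CREATE DATABASE LINK src_link
--          CONNECT TO scott IDENTIFIED BY tiger USING 'SRCDB';
--
-- Then from the OS shell on that same machine:
--   expdp scott/tiger DIRECTORY=dmpdir NETWORK_LINK=src_link \
--         SCHEMAS=scott DUMPFILE=scott_remote.dmp
```
-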
Data pump - export without data
To export a database without data, the old exp tool had the ROWS parameter, which could be set to N. How do I export a database schema without data using Data Pump?

You can see the answer by checking the expdp help on your command line, like this:
C:\Documents and Settings\nupneja>expdp -help

Export: Release 10.2.0.1.0 - Production on Friday, 09 April, 2010 18:06:09
Copyright (c) 2003, 2005, Oracle. All rights reserved.

The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:

Format: expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
         or TABLES=(T1:P1,T1:P2), if T1 is partitioned table

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
COMPRESSION           Reduce size of dumpfile contents where valid
                      keyword values are: (METADATA_ONLY) and NONE.
*CONTENT*             Specifies data to unload where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             Directory object to be used for dumpfiles and logfiles.
DUMPFILE              List of destination dump files (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
ESTIMATE_ONLY         Calculate job estimates without performing the export.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE              Specify the size of each dumpfile in units of bytes.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Export entire database (N).
HELP                  Display Help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of export job to create.
LOGFILE               Log file name (export.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile (N).
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to export a subset of a table.
SAMPLE                Percentage of data to be exported.
SCHEMAS               List of schemas to export (login schema).
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
TABLES                Identifies a list of tables to export - one schema only.
TABLESPACES           Identifies a list of tablespaces to export.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description
------------------------------------------------------------------------------
ADD_FILE              Add dumpfile to dumpfile set.
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.

C:\Documents and Settings\nupneja>

Setting the CONTENT parameter to METADATA_ONLY will export only the structure of the schema and skip the rows.
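Based on the help output above, a metadata-only (no rows) schema export might look like this (the user, directory, and file names are illustrative assumptions):

```
expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY
```

This is the Data Pump equivalent of the old exp ROWS=N.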
-
Hello Forum,
I have a question regarding data pump imports and exports; perhaps something I should already know.
I need to empty a table that has about 200 million rows; I need to get rid of about three quarters of the data.
My intention is to use data pump to export the table along with its indexes, constraints, etc.
The table has no relationship to any other table; it consists of approximately 8 columns with NOT NULL constraints.
My plan is:
1. truncate the table
2. disable or remove the indexes
3. leave the constraints in place
4. use data pump to import the rows to keep.
My questions:
Will my indexes and constraints be imported too, when I want to import only a subset of my exported table?
Or,
if I drop the table after truncation, will I be able to import my table and indexes, even if I use the query subsetting feature in my import statement?
With data pump's query subsetting feature, must my table exist in the database before doing the import,
or will data pump import handle it as usual, i.e. create the table, indexes, grants, statistics, etc.?
Thank you for your comments.
Regards
Your approach is inefficient.
What you need to do is:
create table foo as select * from bar where <rows to keep>;
truncate table bar;
insert /*+ APPEND */ into bar select * from foo;
Rebuild the indexes on the table.
Done.
This whole thing with expdp and impdp is just a waste of resources. My approach generates minimal redo.
----------
Sybrand Bakker
Senior Oracle DBA
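As a sketch of the approach above (the table name bar comes from the reply; the scratch table foo, the keep-predicate, and the index name are illustrative assumptions):

```sql
-- 1. Keep the rows you want in a scratch table.
CREATE TABLE foo AS SELECT * FROM bar WHERE keep_flag = 'Y';

-- 2. Empty the original table (fast, minimal redo).
TRUNCATE TABLE bar;

-- 3. Put the kept rows back with a direct-path insert.
INSERT /*+ APPEND */ INTO bar SELECT * FROM foo;
COMMIT;

-- 4. Rebuild the indexes, then drop the scratch table.
ALTER INDEX bar_idx REBUILD;
DROP TABLE foo;
```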
-
Data Pump: export/import tables in different schemas
Hi all
I use Oracle 11.2 and I would like to use data pump to export/import data between tables in different schemas. The table already exists in both the source and target schemas. Here are the commands that I use:
Working export script:
expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1','TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log
Working script to import all the tables:
impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate
Script that fails for some tables on import:
impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only
The export is fine, but I got the following error when I try to import the table TB_TEST1 only: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I import all the exported tables without the INCLUDE clause.
Is it possible to import some tables, but not all tables, from the export file?
Thanks for the help!
942572 wrote:
It works to import all the tables exported by scott when I do NOT have the INCLUDE clause.
I get the error only when I try to import some tables with the INCLUDE clause.
Can I import only some tables from the export dump file? Thank you!
You are using INCLUDE incorrectly!
Run the following yourself:
impdp help=yes
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
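Putting the advice above together, an import of just TB_TEST1 could be written with a parameter file, which avoids the command-line escaping of the IN list (the file name is an illustrative assumption; the other names come from the question):

```
# imp_tb_test1.par -- a sketch based on the commands in the question
remap_schema=source:target
directory=datapump_dir
dumpfile=test.dump
logfile=test_imp.log
content=data_only
table_exists_action=truncate
include=TABLE:"IN ('TB_TEST1')"
```

Run with: impdp scott/tiger@db12 parfile=imp_tb_test1.par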
-
Selective columns in a data pump export
Dear Experts,
I'm using Oracle 11g. Using data pump, is it possible to choose only some columns of a table to export?
Thanks in advance.

spur230 wrote:
> (question quoted above)
It is not possible in a data pump export to select only some table columns. However, you can try the following:
(1) create table export_selective as select c1, c2 from the source table (you can create it in the source db)
(2) expdp the export_selective table
(3) impdp export_selective into the target table
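The three steps above might look like this as a sketch (the table, column, directory, and credential names are illustrative assumptions):

```sql
-- (1) In the source database: keep only the wanted columns.
CREATE TABLE export_selective AS SELECT c1, c2 FROM source_table;

-- (2) Export just that table (run from the OS shell):
--     expdp system/password DIRECTORY=dmpdir DUMPFILE=sel.dmp TABLES=export_selective

-- (3) Import it into the target database, then copy into the real table:
--     impdp system/password DIRECTORY=dmpdir DUMPFILE=sel.dmp TABLES=export_selective
INSERT INTO target_table (c1, c2) SELECT c1, c2 FROM export_selective;
```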
Alternatively:
a dblink allows you to obtain this table created with the selective columns directly (e.g. CREATE TABLE ... AS SELECT over the dblink) -
I have this slow data pump export, and I have a few suggestions for settings that might improve the speed. But I can't seem to pass them through the DBMS_DATAPUMP package. Is this possible?
DECLARE
  PUMP_HANDLE NUMBER := DBMS_DATAPUMP.OPEN(OPERATION => 'EXPORT', JOB_MODE => 'TABLE', JOB_NAME => 'EXP_DATABASE_370');
BEGIN
  DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A1.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A2.DMP', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(PUMP_HANDLE, DIRECTORY => 'EXP_DATABASE_DIR', FILENAME => 'MY_DATA_A5.TXT', FILETYPE => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.METADATA_FILTER(PUMP_HANDLE, NAME => 'NAME_EXPR', VALUE => 'IN (''MY_DATABASE_370'')');
  DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'INCLUDE_METADATA', VALUE => 1);
  DBMS_DATAPUMP.SET_PARALLEL(PUMP_HANDLE, DEGREE => 4);
  <<THIS_LINE_FAILS>> DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'DIRECT_PATH');
  DBMS_DATAPUMP.START_JOB(PUMP_HANDLE);
  DBMS_DATAPUMP.DETACH(PUMP_HANDLE);
END;
/
The <<THIS_LINE_FAILS>> line throws an exception:
ORA-20020: Error: ORA-39001: invalid argument value. ORA-39049: invalid parameter name ACCESS_METHOD;
ORA-06512: at line 10
Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:
DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 'EXTERNAL_TABLES');
Replacing <<THIS_LINE_FAILS>> with this call fails with the same message:
DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'ACCESS_METHOD', VALUE => 1); /* INTEGER does not seem to work either */
Replacing <<THIS_LINE_FAILS>> with this call also fails with a similar message:
DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'PARALLEL_FORCE_LOCAL', VALUE => 1);
Replacing <<THIS_LINE_FAILS>> with this call also fails, with a quite different message:
DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE, NAME => 'Settings', VALUE => 'DISABLE_APPEND_HINT');
ORA-20020: Error: ORA-39001: invalid argument value. ORA-39207: NULL value is not valid for the parameter settings;
Hello
You have used ACCESS_METHOD; the parameter name is DATA_ACCESS_METHOD. Just give it a try.
Cheers,
Rich
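Following the suggestion above, the failing line would become something like this (a sketch; PUMP_HANDLE comes from the original post):

```sql
-- DBMS_DATAPUMP expects the parameter name DATA_ACCESS_METHOD,
-- with values such as 'AUTOMATIC', 'DIRECT_PATH' or 'EXTERNAL_TABLE'.
DBMS_DATAPUMP.SET_PARAMETER(PUMP_HANDLE,
                            NAME  => 'DATA_ACCESS_METHOD',
                            VALUE => 'DIRECT_PATH');
```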
-
Errors in the data pump export log
In the datapump export log, I see the error below:
=================
. . exported "SYSMAN"."MGMT_METRIC_DEPENDENCY_DEF"      7 KB      13 rows
. . exported "SYSMAN"."MGMT_CREDENTIAL_TYPES"       6.796 KB       4 rows
. . exported "SYSMAN"."MGMT_JOB_TYPE_DISPLAY_INFO"  7.085 KB      31 rows
ORA-31693: table data object "OE"."WAREHOUSES" failed to load/unload and is being skipped due to error:
ORA-06564: object TEST_DIR does not exist
. . exported "SH"."CHANNELS"                        6.695 KB       5 rows
. . exported "SYSMAN"."MGMT_METADATA_SETS"          6.757 KB      18 rows
. . exported "SYSMAN"."MGMT_HA_INFO_ECM"            6.523 KB       1 rows
=================
The remaining exports seem to be fine; I do not know why this error occurred for this object.
Does someone have an idea about it? I searched Metalink but did not find much relevant info.
Database version: 10.2.0.1

Abu,
Is it possible to run the same expdp command in a command-line session and see if it fails for the same reason?
Regards
-
I want to store Data Pump export files on a remote machine
Hello
I have two database servers running, A and B. I want to run a cron job on A that takes a dump and stores it in a directory on B every night.
Is this possible?
Thank you.

Another option would be to use the NETWORK_LINK parameter. Let's say you want the export of A's data to create its files on server B: create a database link from B to A, and then run expdp on B using the link pointing to A.
http://docs.Oracle.com/CD/B19306_01/server.102/b14215/dp_export.htm#sthref142
HTH
Srini -
Hi all
Is a datapump export consistent by default? If I do an export of the production database to refresh dev, do I need to start the production database in restricted mode? Surely not?
Please let me know.

Hello
[If I do an export of the production database to refresh dev, do I need to start the production database in restricted mode? Surely not?]
Answer: there is no need to start the production database in restricted mode.
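One point worth adding to the answer above (an assumption based on standard Data Pump behavior, not stated in the thread): by default each table is exported consistently on its own, but the export as a whole is not a single point-in-time snapshot. The FLASHBACK_TIME parameter can make the whole export consistent, for example in a parameter file (schema, directory, and file names are illustrative):

```
# exp_consistent.par -- a sketch; adjust names to your environment
schemas=app_schema
directory=dmpdir
dumpfile=app.dmp
flashback_time="TO_TIMESTAMP(TO_CHAR(SYSDATE,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"
```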
-
Data pump export of a scheduler job does not import
I have a data pump dump file from a 10.2.0.2.0 database, and the import into 11.2.0.3.0 fails with this error:
ORA-39083: Object type PROCOBJ failed to create with error:
ORA-06502: PL/SQL: numeric or value error: character to number conversion error
Failing sql is:
BEGIN
dbms_scheduler.create_job('"MY_JOB_NAME"', job_type=>'STORED_PROCEDURE',
job_action=>'MY_SCHEMA.MY_PROCEDURE', number_of_arguments=>0,
start_date=>'31-JUL-12 02.05.13.782000 PM EUROPE/BERLIN',
repeat_interval=>'FREQ=WEEKLY;BYDAY=FRI;BYHOUR=7;BYMINUTE=0;BYSECOND=0',
end_date=>NULL, job_class=>'"DEFAULT_JOB_CL

I extracted the SQL code from the dump file and it looks like this:

BEGIN
dbms_scheduler.create_job('"MY_JOB"', job_type=>'STORED_PROCEDURE',
job_action=>'MY_SCHEMA.MY_PROCEDURE', number_of_arguments=>0,
start_date=>'31-JUL-12 02.05.13.782000 PM EUROPE/BERLIN',
repeat_interval=>'FREQ=WEEKLY;BYDAY=FRI;BYHOUR=7;BYMINUTE=0;BYSECOND=0',
end_date=>NULL, job_class=>'"DEFAULT_JOB_CLASS"', enabled=>FALSE,
auto_drop=>FALSE, comments=>'bla bla comment');
dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH');
dbms_scheduler.enable('"MY_JOB"');
COMMIT;
END;
/

After the job is defined, the second statement fails:

SQL> exec dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH');
BEGIN dbms_scheduler.set_attribute('"MY_JOB"','logging_level','HIGH'); END;
*
ERROR at line 1:
ORA-06502: PL/SQL: numeric or value error: character to number conversion error
ORA-06512: at "SYS.DBMS_SCHEDULER", line 2851
ORA-06512: at line 1

From the source I see:

SQL> select logging_level from dba_scheduler_jobs where job_name = 'MY_JOB';
LOGG
----
FULL

In the docs I see these valid LOGGING_LEVELs:
http://docs.oracle.com/cd/E14072_01/server.112/e10595/scheduse008.htm#CHDFDFAB
DBMS_SCHEDULER.LOGGING_OFF
DBMS_SCHEDULER.LOGGING_FAILED_RUNS
DBMS_SCHEDULER.LOGGING_RUNS
DBMS_SCHEDULER.LOGGING_FULL

So please help me: I can't find anything useful on MOS about what Data Pump is exporting there that it cannot import again. Maybe I have overlooked a known bug?
Finally I found the bug myself:
MOS: Impdp of PROCOBJ objects fails with ORA-39083 and ORA-06502 [ID 780174.1]
https://support.Oracle.com/epmos/faces/DocContentDisplay?ID=780174.1
-
Schema export through Oracle Data Pump with Database Vault enabled
Hello
I installed and configured Database Vault on an Oracle 11gR2 (11.2.0.3) database to protect a specific schema (SCHEMA_NAME) via a realm. I followed the following doc:
http://www.Oracle.com/technetwork/database/security/TWP-databasevault-DBA-BestPractices-199882.PDF
to ensure that the sys and system users have sufficient rights to complete a scheduled Oracle Data Pump export operation.
I.e. I gave sys and system the following:
execute dvsys.dbms_macadm.authorize_scheduler_user ('sys', 'SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_scheduler_user ('system', 'SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user ('sys', 'SCHEMA_NAME');
execute dvsys.dbms_macadm.authorize_datapump_user ('system', 'SCHEMA_NAME');
I also created a second realm on the same schema (SCHEMA_NAME) to allow sys and system to manage indexes for the protected tables. This separate realm was created for all index object types: Index, Index Partition and Indextype; sys and system were authorized as OWNER of this realm.
However, when I try to complete an Oracle Data Pump export operation on the schema, I get two errors directly after the following line appears in the export log:
Processing object type SCHEMA_EXPORT/TABLE/INDEX/DOMAIN_INDEX/INDEX:
ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_NOTIFY_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
ORA-39127: unexpected error from call to export_string := SYS.DBMS_TRANSFORM_EXIMP.INSTANCE_INFO_EXP('AQ$_MGMT_LOADER_QTABLE_S','SYSMAN',1,1,'11.02.00.00.00',newBlock)
ORA-01031: insufficient privileges
ORA-06512: at "SYS.DBMS_TRANSFORM_EXIMP", line 197
ORA-06512: at line 1
ORA-06512: at "SYS.DBMS_METADATA", line 9081
The export completes, but with this error.
Any help, pointers, suggestions, etc. will be very welcome at this stage.
Thank you

I moved the forum thread to "Database - Security". If the document does not help, please open an SR with Support.
HTH
Srini -
Export the whole database (10 GB) using the Data Pump export utility
Hello
I have a requirement to export the whole database (10 GB) using the Data Pump export utility, because it is not possible to send a 10 GB dump on a single CD/DVD to the vendor of our application system (to analyze a few problems that we have).
Now, when I checked online, full export is available, but I am not able to understand how it works, as we have never used this data pump utility; we use the normal export method. In addition, will data pump reduce the size of the dump file so it can fit on a DVD, or can we use the parallel option of the full DB export utility to split the files so they can be included on DVDs? Is that possible?
Please correct me if I am wrong and kindly help.
Thank you for your help in advance.

Pravin,
The server saves files in the directory object that you specify on the command line. So what you want to do is:
1. From your operating system, find an existing directory or create a new one. In your case, C:\Dump is as good a place as any.
2. Connect to sqlplus and create the directory object; just use the path. I use Linux, so my directory looks like /scratch/xxx/yyy. If you use Windows, the path to your directory would look like C:\Dump.
3. Don't forget to grant access to this directory. You can grant access to a single user, a group of users, or public, just like any other object.

If this helps, or if it answered your question, please mark the posts with the appropriate tag.
Thank you
Dean
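The steps above, plus splitting the dump into DVD-sized pieces, might look like this (the directory path and name come from the thread; the grantee, file sizes, and credentials are illustrative assumptions):

```sql
-- 2. Create the directory object (as a DBA), pointing at the OS path:
CREATE DIRECTORY dump_dir AS 'C:\Dump';

-- 3. Grant access to the user who will run the export:
GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;

-- Then, from the OS shell, a full export split into ~4 GB pieces
-- (%U numbers the files full_01.dmp, full_02.dmp, ...):
--   expdp system/password FULL=Y DIRECTORY=dump_dir DUMPFILE=full_%U.dmp FILESIZE=4G LOGFILE=full_exp.log
```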
-
I need to perform a network-mode DataPump import from a 10.2.0.4 database on my old server (HP-UX 11.11) into my new 11.2.0.3 database on a new server (HP-UX 11.31). What I would REALLY like to do is import directly from my physical standby database (running in READ ONLY mode while I do the import) rather than having to quiesce my production database for a couple of hours while I do the import from there.
What I want to know is whether a network-mode Data Pump import running on 11.2.0.3 on the new server creates a Data Pump extract job in the old database as part of the direct network-link import. If so, I won't be able to use the physical standby as the source of my import, because Data Pump will not be able to create the master table in the old database. I can't find anything in the Oracle documentation on using a physical standby as a source. I know that I can't do a regular Data Pump export of this database, but I would really like to know if anyone has experience doing this.
Any comments would be greatly appreciated.

Good news, Harry - it worked for me on a standby database open in READ ONLY mode. Not sure what is different between your environment and mine, but there must be something... The read-only database is a 10.2.0.4 database running on an HP PA-RISC box under HP-UX 11.11. The target database runs under 11.2.0.3 on an HP Itanium box under HP-UX 11.31. The user I connect to the target database with has IMP_FULL_DATABASE privs, and this is the user id used for the DB_LINK and also the same user id on the source database (which, of course, it follows!). This user also has the required privs granted. My parameter file looks like this:
TABLES = AC_LIAB_ %
NETWORK_LINK = ARCH_LINK
DIRECTORY = DATA_PUMP_DIR
JOB_NAME = AMIBASE_IMPDP_ARCHDB
LOGFILE = DATA_PUMP_DIR:base_impdp_archdb.log
REMAP_TABLESPACE = ARCHIVE_BEFORE2003:H_DATA
REMAP_TABLESPACE = ARCHIVE_2003:H_DATA
REMAP_TABLESPACE = ARCHIVE_2004:H_DATA
REMAP_TABLESPACE = ARCHIVE_2005:H_DATA
REMAP_TABLESPACE = ARCHIVE_2006:H_DATA
REMAP_TABLESPACE = ARCHIVE_2007:H_DATA
REMAP_TABLESPACE = ARCHIVE_2008:H_DATA
REMAP_TABLESPACE = ARCHIVE_INDEXES:H_INDEXES
REUSE_DATAFILES = NO
SKIP_UNUSABLE_INDEXES = Y
TABLE_EXISTS_ACTION = REPLACE