Tablespace-level export, schema-level import: is it possible?
Hello
If I have a tablespace-level Data Pump export (run with TABLESPACES=<the tablespaces list>), is it possible to import only the tables and dependent objects of a specific schema residing in the exported tablespaces? The DB version is 11.2.0.3.0. According to the documentation it should be possible: http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm#i1011943 : "A schema import is specified using the SCHEMAS parameter. The source can be a full, table, tablespace, or schema mode export dump file set or another database."
A quick test, however, suggests that it is not:
(1) source DB - I have two tablespaces (TS1, TS2) and two schemas (USER1, USER2):
SQL> select segment_name, tablespace_name, segment_type, owner
  2  from dba_segments
  3  where owner in ('USER1', 'USER2');

OWNER  SEGMENT_NAME  SEGMENT_TYPE  TABLESPACE_NAME
------ ------------- ------------- ----------------
USER1 UQ_1 INDEX TS1
USER1 T2 TABLE TS1
USER1 T1 TABLE TS1
USER2 T4 TABLE TS2
USER2 T3 TABLE TS2
(2) Then I run a tablespace-level export:
$ expdp system directory=dp_dir tablespaces=ts1,ts2 dumpfile=test.dmp
Export: Release 11.2.0.3.0 - Production on Fri Jul 11 14:02:54 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
With partitioning, OLAP, Data Mining and Real Application Testing options
Start "SYSTEM". "" SYS_EXPORT_TABLESPACE_01 ": System / * Directory = dp_dir tablespaces = ts1, ts2 dumpfile = test.dmp
Current estimation using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Object type TABLE_EXPORT/TABLE/TABLE processing
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Object type TABLE_EXPORT/TABLE/CONSTRAINT/treatment
Object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS of treatment
. . exported "USER1". "" T1 "5,007 KB 1 lines
. . exported "USER1". "" T2 "5,007 KB 1 lines
. . exported "user2". "" T3 "5,007 KB 1 lines
. . exported "user2". "" T4 "5,007 KB 1 lines
Main table 'SYSTEM '. "' SYS_EXPORT_TABLESPACE_01 ' properly load/unloaded
******************************************************************************
"(3) I'm trying to import only the objects belonging to User1 and I get the 'ORA-39039: schema '(' USER1')' expression contains no valid schema" error: "
Impdp system directory $ = dp_dir patterns = USER1 dumpfile = test.dmp
Import: Release 11.2.0.3.0 - Production on Fri Jul 11 14:05:15 2014
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - Production
With partitioning, OLAP, Data Mining and Real Application Testing options
ORA-31655: no data or metadata objects selected for job
ORA-39039: Schema expression "('USER1')" contains no valid schemas.
(4) However, the dump file clearly contains the owner of the tables:
$ impdp system directory=dp_dir dumpfile=test.dmp sqlfile=imp_dump.txt
Excerpt from imp_dump.txt:
-- new object type path: TABLE_EXPORT/TABLE/TABLE
CREATE TABLE "USER1"."T1"
   ("DUMMY" VARCHAR2(1 BYTE)
   )
So is it possible to somehow filter the objects belonging to a certain schema?
Thanks in advance for any suggestions.
Swear
Hi Swear,
This prompted a little investigation on my part, which I thought was worthy of a blog post in the end...
Oracle DBA Blog 2.0: A datapump bug or a feature and an obscure workaround
Initially I thought you had made a mistake, but no, that really does seem to be the way it behaves. I've included a few possible solutions - see if one of them does what you want...
See you soon,
Rich
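In case it helps anyone landing here later: one route that stays inside documented behaviour is to fall back to a table-mode import, since the Import documentation states that a table-mode import may read from a tablespace-mode dump file set. A sketch using the names from the test above (the parfile name is just an example; the table list would come from the dba_segments query in step 1):

```
# imp_user1.par -- hypothetical parameter file
directory=dp_dir
dumpfile=test.dmp
tables=USER1.T1,USER1.T2
```

Then run: impdp system parfile=imp_user1.par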
Tags: Database
Similar Questions
-
Hello.
I want to export/import two schemas from my old 9i database to a new 11gR2.
I've done several times, but this time I am faced with an error that drives me crazy. I hope someone can tell me what I'm missing
The scenario.
Old machine: RedHat 8 with Oracle DB 9.2.0.1. New machine: RHEL 5.7 64-bit with Oracle DB 11.2.0.2 64-bit.
So, on the old machine, I export the user with a parameter file as follows:
/home/oracle/exptmp/exp_parameters.par:
file=/home/oracle/exptmp/NESE_20130829.dmp owner=nese compress=n grants=y indexes=y log=exp_NESE_20130829.log rows=y statistics=none
And then:
$ cd /home/oracle/exptmp/
$ exp system/*******@resesdb parfile=exp_parameters.par

Export: Release 9.2.0.1.0 - Production on Fri Aug 30 12:07:07 2013
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle9i Enterprise Edition Release 9.2.0.1.0 - Production
With the Partitioning and Oracle Data Mining options
JServer Release 9.2.0.1.0 - Production
Export done in WE8ISO8859P1 character set and AL16UTF16 NCHAR character set
Exporting the specified users ...
[...]
Export terminated successfully without warnings.
$
I copy the dump file with scp (since it transfers files in binary mode only; I also tried sftp, but got the same result):
$ scp NESE_20130829.dmp oracle@***.***.***.78:/home/oracle/imptmp/ oracle@***.***.***.78's password: NESE_20130829.dmp 100% |*******************************************************************************************************|
In the new machine:
User "NESE" and tablespaces/data files needed have already been created.
$ ls -la /home/oracle/imptmp/NESE_20130829.dmp -rwxr-xr-x 1 oracle oinstall 6196510720 ago 30 14:07 /home/oracle/imptmp/NESE_20130829.dmp
I use a parameter file for the import:
/home/oracle/imptmp/imp_parameters.par
file=/home/oracle/imptmp/NESE_20130829.dmp fromuser=nese touser=nese commit=y grants=y ignore=y indexes=y log=import_nese_20130829.log rows=y
But when I do the import, then:
$ cd /home/oracle/imptmp/
$ imp system/*******@seresdb parfile=imp_parameters.par

Import: Release 11.2.0.2.0 - Production on Fri Aug 30 14:13:45 2013
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
IMP-00009: abnormal end of export file
IMP-00000: Import terminated unsuccessfully
$
And that's all. I don't know what else to do.
On MOS there is hardly any information about this error, other than transferring the file by FTP in ASCII mode.
Is there something I am missing?
Any ideas?
Regards
OK, I found the problem.
I think the problem is the file system of the old machine.
The old machine's file system is reiserfs. I have not found any note on problems with reiserfs, but that is my conclusion, given all the tests I did.
The old machine had an NFS volume mounted from a backup server, so I moved the export dump file to that volume. Then I transferred the file via scp to the new machine, as I did before.
Now the import worked fine!
I repeated the steps, putting the export dump file back on the reiserfs volume, just to make sure I hadn't changed anything else, scp'd it to the new machine, and the import failed. So, no doubt: it seems reiserfs causes some kind of corruption of the dump files when they are copied to another file system (which is odd, because I ran md5sum on both file systems and the result was the same).
I'd like to do another test, so I'll post the results here to confirm and to help other people with the same problem.
Thank you all.
-
How to import a schema into a designated tablespace?
Hi all:
I created a tablespace and a temporary tablespace (with their files) for the new schema, but I don't know whether the schema will be imported into the new tablespace or into the default one. How can I check?
What is the command to change the tablespace, and when do I need to change it?
Thank you.
SQL> create temporary tablespace temp_demov tempfile '/u02/BIPLATFORM_DB/temp01.dbf' size 5m autoextend on;
SQL> create tablespace TBS1 datafile '/u02/BIPLATFORM_DB/tbs01.dbf' size 5m autoextend on
  2  ;

Tablespace created.
[obiee@elysium-d ~] $ sqlplus / as sysdba
SQL*Plus: Release 11.2.0.2.0 Production on Tue Aug 12 14:24 2014
Copyright (c) 1982, 2010, Oracle. All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
With the Partitioning, Real Application Clusters, Automatic Storage Management, OLAP,
Data Mining and Real Application Testing options
SQL> create or replace directory datapumpdir as '/tmp';

Directory created.
SQL> impdp system/arg0s@rhea directory=datapumpdir dumpfile=BISAMPLE.dmp logfile=BISAMPLE.log
SP2-0734: unknown command beginning "impdp syst..." - rest of line ignored.
SQL>
[obiee@elysium-d ~]$ impdp system/arg0s@rhea directory=datapumpdir dumpfile=BISAMPLE.dmp logfile=BISAMPLE.log
Import: Release 11.2.0.2.0 - Production on Tue 12 August 14:49:38 2014
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
....
You can remap the schema's tablespace with the REMAP_TABLESPACE parameter of the impdp command.
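As a concrete illustration, here is a minimal sketch, assuming the source objects live in a tablespace named USERS and the schema is named BISAMPLE (both assumptions; run an impdp with sqlfile= first to see the real names in the dump):

```
impdp system/arg0s@rhea directory=datapumpdir dumpfile=BISAMPLE.dmp logfile=BISAMPLE.log remap_tablespace=USERS:TBS1
```

Afterwards you can verify where the segments landed with: select segment_name, tablespace_name from dba_segments where owner = 'BISAMPLE';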
-
Data Pump: export/import tables in different schemas
Hi all
I use Oracle 11.2 and I would like to use the data pump to export / import data into the tables between the different schemas. The table already exists in the source and target schemas. Here are the commands that I use:
Working export script:
expdp scott/tiger@db12 schemas=source include=TABLE:\"IN ('TB_TEST1', 'TB_ABC')\" directory=datapump_dir dumpfile=test.dump logfile=test_exp.log
Script to import all the tables (works):
impdp scott/tiger@db12 remap_schema=source:target directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only table_exists_action=truncate
Script to import only some tables (fails):
impdp scott/tiger@db12 remap_schema=source:target include=TABLE:\"IN ('TB_TEST1')\" directory=datapump_dir dumpfile=test.dump logfile=test_imp.log content=data_only
The export is fine, but I get the following error when I try to import only the table TB_TEST1: "ORA-31655: no data or metadata objects selected for job". The user scott has the DBA role, and it works fine when I import all the exported tables without the INCLUDE clause.
Is it possible to import only some of the tables from the export file?
Thanks for the help!
942572 wrote:
Importing all the tables exported by scott works when I do NOT use the INCLUDE clause.
I get the error only when I try to import some tables with the INCLUDE clause.
Can I import only some tables from the export dump file? Thank you!
You are using INCLUDE incorrectly!
Run this yourself:
impdp help=y
INCLUDE
Include specific object types.
For example, INCLUDE=TABLE_DATA.
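The usual trap with INCLUDE is operating-system quoting: the double quotes and parentheses have to be escaped on the command line, and the escaping rules differ per shell. Putting the filter in a parameter file sidesteps escaping entirely. A sketch with the names from this thread (the parfile name itself is just an example):

```
# imp_tb_test1.par -- hypothetical file name; no shell escaping needed inside a parfile
directory=datapump_dir
dumpfile=test.dump
logfile=test_imp.log
content=data_only
remap_schema=source:target
include=TABLE:"IN ('TB_TEST1')"
```

Then run: impdp scott/tiger@db12 parfile=imp_tb_test1.par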
-
I have Windows XP and IE8. I checked Help for export/import of Favorites, and it says:
- Click the Favorites button, click the arrow next to the Add to Favorites button, and then click Import and Export.
However, there is no arrow next to the Add to Favorites button. I remember it was exactly like that in IE7.
http://support.Microsoft.com/default.aspx/KB/211089/en-us
Internet Explorer 8 to export the Favorites folder, follow these steps:
- In Internet Explorer, click Favorites, click the arrow next to Add to Favorites, and then click Import and Export.
- Click export to a file, and then click Next.
- Click to select the Favorites check box, and then click Next.
- Select the Favorites folder that you want to export. If you want to export all Favorites, select the top level Favorites folder. Otherwise, select the individual folder that you want to export.
- Click Next.
Note: By default, Internet Explorer creates a Bookmark.htm file in your Documents folder. If you want to use a different file name than Bookmark.htm, or store the exported Favorites in a folder other than Documents, specify the new file and folder name.
- Click Next.
Note: If you already have a file with the same name, Internet Explorer asks whether you want to replace it. Click Yes to replace the file, or No to provide a new file name.
- Click export.
- Click Finish.
Discussions in microsoft.public.internetexplorer.general
XP discussion groups:
The link above is to the XP newsgroups.
There is a list of XP discussion groups at the bottom of the left column.
You'll get the help you need there.
Here are the Vista Forums.
See you soon
Mick Murphy - Microsoft partner
-
9i export/import - multiple exports running
Hi all
I have a 9i database running on an AIX machine. I'm trying to export a few schemas.
I accidentally ran the export a second time after 2 minutes, because I wasn't able to find the first process (this is exp, not expdp).
To my surprise, it didn't show any error or complain that the same dump file and logfile were being reused.
Normally, I know that in Data Pump I can kill a job by attaching to it and using the export prompt commands, but I don't know how I'd do the same thing with traditional export/import.
(1) If I kill it at the operating-system level, will the original dump file have problems? I'm not sure which process uses the dump file to write data.
(2) Why didn't it give any error when run a second time? expdp normally writes an error to the logfile if it is reused.
(3) Is there a link to a good post with export/import information or tips?
Database: 9.2.0.6
Machine: AIX
Thank you
Hi,
"So, if the master table exists, doesn't that mean my export dump would be consistent? I don't see any errors recorded in the logfile anywhere. Is there a manual way to check the consistency of my export dump file?"
1. I was a little confused by the word/term "consistent". Can you specify what consistent means from your point of view, so that I can try to provide information from my side?
To be honest, if the export had run into problems, you would have seen them.
The export master table helps track the completion/progress of the job. It contributes to the collection (it lists all the objects the job needs to work through, based on your export type).
Since it is dynamic, it can also help you if you want to attach to an existing in-progress job.
- Pavan Kumar N
-
Hi all
I am trying to export/import a report to another FMS.
I found this good documentation:
Export:
C:\Quest_Software\foglight5\bin> fglcmd.bat -srv 10.4.118.110 -port 8080 -usr foguser -pwd fogpwd123 -cmd util:uiexport -m system:sa_salogsummary -f C:\Temp\LogFilterSummaryDashboard.zip
Import:
$FOGLIGHT_HOME/bin/fglcmd.sh -srv 10.4.118.110 -port 8080 -usr foguser -pwd fogpwd123 -cmd util:uiimport -f LogFilterSummaryDashboard.zip
But my problem is that in this example the module to export has a proper name: "system:sa_salogsummary".
If I look at my report, it bears the name "user:saladisc.3", so I cut off the ".3" and it works, but as expected it exports all my user settings, pictures and so on.
And if I extract the zip file that was produced, change only the wcf.xml, delete all the other reports/dashboards that I don't want, and import the xml without all the other stuff, the report does not appear :)
Any ideas how I could get this particular report into another FMS? Where is my mistake?
Thanks in advance
The blog article that you referenced, "Build a custom dashboard to summarize LogFilter alarms", describes the difference between data-driven drag-and-drop dashboards and reports and WCF query-based dashboards and reports.
The "special report" you are trying to copy to another FMS appears to be a drag-and-drop report. This report has a number of data views with direct references to the data in your original FMS. None of those specific data elements will exist in your target FMS, so it is unlikely that your migrated report will have any content once you've imported it there.
The blog post also describes the technique of exporting and importing a dashboard module from one FMS to another. It works at the level of a module, so all the views, queries, converters, etc. in the module are included in the export. If you want to export a single view/report and not the others in the same module, you need to create a new module, (deep) copy the target view/report from the original module into the new module, and then export the new module. The export/import technique works with both "system" and "user" modules, but WCF query-based "system" modules are better suited to sharing across different FMSs than "user" modules containing drag-and-drop views/reports with direct data references.
Kind regards
Brian Wheeldon
-
Import the schema from one instance to another
Hello
I'm copying a schema from one (production) Oracle instance to another (development) located on a different server. I last did this two years ago, and the data is now obsolete. We export all schemas every night, and I only want to import the schema "bob" into dev. I copied the export file to the dev server and used the following command:
imp sys/<password> file=server.dmp fromuser=bob ignore=y
The import worked fine for a bit, then I started getting "cannot create extents" errors on the tablespace, and eventually the imp stopped. I checked, and sure enough, the primary tablespace was nearly full, which is strange since prod and dev are the same size. Anyway, I went in and doubled the size of the index and data tablespaces and tried the import again.
This time, I got the following:
IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
Column 1 3956
Column 2 1074785
Column 3 20-APR-2001:08:57:44
Column 4 483
Column 5 52905
Column 6 CR
Column 7 -.72
Column 8
Column 9 47650
Column 10
Column 11
Column 12
IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
Column 1 3957
Column 2 1076007
Column 3 20-APR-2001:13:38:04
Column 4 483
Column 5 24290
Column 6 CR
Column 7 -.26
Column 8
Column 9 47839
Column 10
Column 11
Column 12
IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
Column 1 3958
Column 2 1076015
Column 3 20-APR-2001:13:38:54
Column 4 483
Column 5 24290
Column 6 CR
Column 7 -.09
Column 8
Column 9 47842
Column 10
Column 11
Column 12
I use the "ignore = y ' option it will ignore a problem if the object already exists, but it adds the data to import or clear and then add it? I'm guessing that he tries to add data, which are the cause of the errors.
I remember in the past, the only way I got this job was to open a session as long as sys, delete user Bob w/waterfall, and then re-create the user before importing. This seems to be a royal pain in the lacrosse.
Did I miss something on how to properly import the schema?
Thank you.
This works. If you have geometry changes in the tables, it's probably best to drop the schema, or at least the table. If all you want to do is refresh the data and nothing else has changed, then you can simply truncate the tables and then import the data.
Dean
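A sketch of the truncate-then-import approach described above, with a placeholder object name (the real list is bob's tables; ACCOUNT_TRANSACTION is inferred from the constraint name in the errors):

```
-- Empty the tables first so the old rows behind the unique keys are gone
truncate table bob.account_transaction;
-- ...repeat for bob's other tables...
```

Then re-run the import; ignore=y makes imp skip the "already exists" errors and insert the rows into the now-empty tables:

imp system/<password> file=server.dmp fromuser=bob touser=bob ignore=y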
-
Problem importing a whole schema into Oracle 10g.
I am importing a schema (structures only, no data) from Oracle 7 to 10g. While importing the schema, it reports "low disk space", but I have 55 GB of disk space available.
Please help if anyone knows about this. I dropped all the objects, performed the import again, and generated the create.sql script. I tried to remove the INITIAL/NEXT extent clauses from the tables, but found there are 183 tables.
If you are on 9i or above, are using LMTs with AUTOALLOCATE or UNIFORM size, and want to export and import these tables into another database, use this:
Prepare the list of 183 tables.

set long 9999999
exec dbms_metadata.set_transform_param(dbms_metadata.session_transform, 'STORAGE', FALSE);    -- removes the STORAGE clause from the generated DDL
exec dbms_metadata.set_transform_param(dbms_metadata.session_transform, 'TABLESPACE', FALSE); -- removes the TABLESPACE clause from the generated DDL
select dbms_metadata.get_ddl('TABLE','TABLE_NAME','SCHEMA_OWNER') from dual;                  -- generate for the 183 tables

Now you have the script; run it in the target database.
HTH
Anantha
Edit: it cannot be used for Oracle 7. Please ignore.
-
Hello
I am getting the error below while taking an expdp backup of tables with BLOB columns.
ORA-31693: Table data object "HCLM_ADMIN"."SCAN_UPLOADEDFILES_TEMP" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number  with name "" too small
ORA-22924: snapshot too old
ORA-31693: Table data object "HCLM_ADMIN"."TPA_FAXWATCHER" failed to load/unload and is being skipped due to error:
ORA-02354: error in exporting/importing data
ORA-01555: snapshot too old: rollback segment number  with name "" too small
ORA-22924: snapshot too old
We have already set undo retention to 50000. The table structures are below:
SQL> show parameter undo

NAME                                 TYPE        VALUE
------------------------------------ ----------- ------------------------------
undo_management                      string      AUTO
undo_retention                       integer     50000
undo_tablespace                      string      UNDOTBS1
SQL> alter table hclm_admin.SCAN_UPLOADEDFILES_TEMP modify lob (FILE_BLOB) (RETENTION);

Table altered.
SQL> select column_name, pctversion, retention
  2  from dba_lobs where owner = 'HCLM_ADMIN' and table_name = 'SCAN_UPLOADEDFILES_TEMP';

COLUMN_NAME                    PCTVERSION  RETENTION
------------------------------ ---------- ----------
FILE_BLOB                                      50000
SQL> alter table hclm_admin.TPA_FAXWATCHER modify lob (FILEDATA_BLOB) (RETENTION);

Table altered.
SQL> select column_name, pctversion, retention from dba_lobs where owner = 'HCLM_ADMIN' and table_name = 'SCAN_UPLOADEDFILES_TEMP';

COLUMN_NAME                    PCTVERSION  RETENTION
------------------------------ ---------- ----------
FILE_BLOB                                      50000
CREATE TABLE HCLM_ADMIN.TPA_FAXWATCHER
(
  FILENAME_VAR       VARCHAR2(50 BYTE),
  CREATED_DATE_DTE   DATE,
  FILEPATH_VAR       VARCHAR2(100 BYTE),
  TIMESTAMP_DTE      DATE,
  FAXNO_VAR          VARCHAR2(15 BYTE),
  DEPARTMENT_VAR     VARCHAR2(50 BYTE),
  REQUESTTYPE_VAR    VARCHAR2(50 BYTE),
  TAGTO_VAR          VARCHAR2(50 BYTE),
  REMARK_VAR         VARCHAR2(1000 BYTE),
  DOCTYPE_VAR        VARCHAR2(50 BYTE),
  TAGTOVALUE_VAR     VARCHAR2(50 BYTE),
  DOCTYPE_OTHER_VAR  VARCHAR2(50 BYTE),
  HEGIC_NO_VAR       VARCHAR2(50 BYTE),
  RECORDNO_NUM       NUMBER NOT NULL,
  FILEDATA_BLOB      BLOB,
  FAXLOCKDATE_DTE    DATE,
  FAXLOCKSTATUS_VAR  NUMBER,
  FAXLOCKBYUSER_VAR  VARCHAR2(50 BYTE)
)
LOB (FILEDATA_BLOB) STORE AS
(
  TABLESPACE HCLM_ALERTSVC
  ENABLE STORAGE IN ROW
  CHUNK 8192
  RETENTION
  NOCACHE
  LOGGING
  INDEX (
    TABLESPACE HCLM_ALERTSVC
    STORAGE (
      INITIAL 64K
      NEXT 1M
      MINEXTENTS 1
      MAXEXTENTS UNLIMITED
      PCTINCREASE 0
      BUFFER_POOL DEFAULT
    )
  )
  STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS UNLIMITED
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
  )
)
TABLESPACE HCLM_ALERTSVC
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
  INITIAL 64K
  NEXT 1M
  MINEXTENTS 1
  MAXEXTENTS UNLIMITED
  PCTINCREASE 0
  BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
ALTER TABLE HCLM_ADMIN.TPA_FAXWATCHER ADD (
  PRIMARY KEY
  (RECORDNO_NUM)
  USING INDEX
  TABLESPACE HCLM_ALERTSVC
  PCTFREE 10
  INITRANS 2
  MAXTRANS 255
  STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS UNLIMITED
    PCTINCREASE 0
  )
);
CREATE TABLE HCLM_ADMIN.SCAN_UPLOADEDFILES_TEMP
(
  TEMPID_NUM      NUMBER,
  SESSION_ID      VARCHAR2(200 BYTE),
  UPLOADFILE_NUM  NUMBER,
  DOCNO_NUM       NUMBER,
  SCANJOB_NUM     NUMBER,
  FILENAME_VAR    VARCHAR2(200 BYTE),
  FILETYPE_VAR    VARCHAR2(200 BYTE),
  FILE_BLOB       BLOB,
  FLAG            VARCHAR2(200 BYTE),
  USERID_NUM      NUMBER,
  CREATED_DATE    DATE
)
LOB (FILE_BLOB) STORE AS
(
  TABLESPACE PHCLMDBTBS
  ENABLE STORAGE IN ROW
  CHUNK 8192
  RETENTION
  NOCACHE
  LOGGING
  INDEX (
    TABLESPACE PHCLMDBTBS
    STORAGE (
      INITIAL 64K
      NEXT 1M
      MINEXTENTS 1
      MAXEXTENTS UNLIMITED
      PCTINCREASE 0
      BUFFER_POOL DEFAULT
    )
  )
  STORAGE (
    INITIAL 64K
    NEXT 1M
    MINEXTENTS 1
    MAXEXTENTS UNLIMITED
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
  )
)
TABLESPACE PHCLMDBTBS
PCTUSED 0
PCTFREE 10
INITRANS 1
MAXTRANS 255
STORAGE (
  INITIAL 64K
  NEXT 1M
  MINEXTENTS 1
  MAXEXTENTS UNLIMITED
  PCTINCREASE 0
  BUFFER_POOL DEFAULT
)
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING;
Kind regards
Hello
First check for corrupt LOBs, as follows:
SQL > create table corrupted_lob_data (corrupted_rowid rowid);
Table created.
SQL> set concat off
SQL> declare
  2    error_1555 exception;
  3    pragma exception_init(error_1555, -1555);
  4    num number;
  5  begin
  6    for cursor_lob in (select rowid r, &lob_column from &table_owner..&table_with_lob) loop
  7      begin
  8        num := dbms_lob.instr(cursor_lob.&lob_column, hextoraw('889911'));
  9      exception
 10        when error_1555 then
 11          insert into corrupted_lob_data values (cursor_lob.r);
 12          commit;
 13      end;
 14    end loop;
 15  end;
 16  /
Enter value for lob_column: FILE_BLOB
Enter value for table_owner: hclm_admin
Enter value for table_with_lob: SCAN_UPLOADEDFILES_TEMP
old   6: for cursor_lob in (select rowid r, &lob_column from &table_owner..&table_with_lob) loop
new   6: for cursor_lob in (select rowid r, FILE_BLOB from hclm_admin.SCAN_UPLOADEDFILES_TEMP) loop
old   8: num := dbms_lob.instr(cursor_lob.&lob_column, hextoraw('889911'));
new   8: num := dbms_lob.instr(cursor_lob.FILE_BLOB, hextoraw('889911'));

PL/SQL procedure successfully completed.
SQL > select * from corrupted_lob_data;
CORRUPTED_ROWID
------------------
AAASF/AAKAABacyAAA
To resolve this problem, we have 3 options:
(1) Restore and recover the LOB segment using a physical backup.
(2) Empty the affected LOBs using an UPDATE statement, as mentioned in MOS Note 787004.1:
SQL> update SCAN_UPLOADEDFILES_TEMP
  2  set FILE_BLOB = empty_blob()
  3  where rowid in (select corrupted_rowid
  4                  from corrupted_lob_data);
SQL> commit;
(3) Perform the export excluding the corrupt ROWIDs, by adding the following parameter to the export command:
query=\"where rowid not in ('AAASF/AAKAABacyAAA')\"
Kind regards
Alok Dwivedi
-
How to export/import the database from the Tools menu in SQL Developer.
Hello
I need to export the database on one server and import it on another DB server using the database export function under the Tools menu in SQL Developer. How do I import it? This is for Oracle Database 10g.
Thank you.
Hello,
You can do it, but I do not recommend it, because SQL Developer could eat all the system's memory parsing the script. Also, the question is whether you selected the storage option when you exported; if yes, make sure similar tablespaces exist or are available in environment B. If you want to use the file generated by SQL Developer, you can run it from a sqlplus session:
sqlplus newuser/password
SQL> @myexport.sql
But my advice/suggestion/recommendation would be to consider using export/import or Data Pump instead; the decision is yours. Hope this helps.
Regards
-
Migration 9i to 10g (ASM) using export/import
I am trying to migrate from 9i to 10g on a new server; I don't know whether it is possible to import directly into an ASM-managed database or not. Here are the steps:
1. Run a full export on 9i.
2. Install 10gR2 on the new server; apply the latest patches.
3. Install the 10gR2 ASM instance on the new server.
4. Create a new DB via DBCA, using the ASM instance.
5. Create tablespaces on the new database (managed by ASM). <- Concern: since the new DB uses ASM, the tablespaces will have different datafile names from the old DB; will that cause an import problem? Does a full import require the same structures?
Thank you.
Joe
Hello,
"When creating the tablespaces, they will have different datafile names from the old DB; will that cause an import problem?"
The tablespace names should be the same. The name or location of the datafiles does not matter; it's only the tablespace name that is important. Create the tablespaces, create the users, and then import using the FROMUSER/TOUSER parameters.
Anand
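A minimal sketch of that sequence on the 10g side; all names here are placeholders, and the datafile clause just names an ASM disk group, which is the only part that really differs from file-system datafiles:

```
-- Hypothetical names: disk group +DATA, tablespace APP_DATA, user JOE
SQL> create tablespace app_data datafile '+DATA' size 500m autoextend on;
SQL> create user joe identified by "secret" default tablespace app_data;
SQL> grant connect, resource to joe;
```

Then import the 9i dump with the classic imp client: imp system/<password> file=full9i.dmp fromuser=joe touser=joe ignore=y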
-
Export/import of the data extraction
Hello
Is it possible to export/import a data mining ACTIVITY (created via the Oracle Data Miner front end) from one schema/database to another schema/database?
The PL/SQL code generation method and dbms_data_mining.export_model/import_model only cover models, not activities.
I need to transfer a subset of mining activities from one database to another. Using a full schema export/import is not possible, because I want to move just some activities.
Help?
Regards
KS
Hello,
Sorry, but you cannot export mining activities.
Activity code generation is a deployment option (for carrying the process forward in some form of application).
Recreating the mining activities in the new DB would be my recommendation.
Thank you, Mark -
How to export/import prefs.js? Copying the file does not work.
I want to export/import my about:config changes. The path is "Firefox/Profiles/i63866ho.default-1384517947097/prefs.js". I copied prefs.js, the entire contents of the folder "i63866ho.default-1384517947097", and the "Profiles" folder. Nothing worked. When I copied prefs.js it did nothing, and it reverted to the old/first prefs.js after restarting Firefox. When I copy the folder, it says it cannot.
Can someone help me?
Thank you very much
I'm not very familiar with profiles.ini. There are probably other volunteers who can tell you the best way to add the new (old) profile name to the file.
The prefs.js file is small, but you mentioned copying your extensions as well, so I thought it would be more convenient to copy the entire folder. If you can afford the disk space, I think it is easiest to grab everything. That said, some of the information is redundant: for example, the bookmarkbackups folder contains many dated backups (mine are over 1 MB each). And if you use sites that rely on IndexedDB (such as Mega), you can have a very large indexedDB folder that you really don't need to carry with you.
-
The FF Help files on bookmarks refer to a menu item called Export/Import... Unfortunately, this menu choice is not in the FF 13 menu. So how do I export my bookmarks to an HTML file?
Open the Library window (History > Show All History, or Bookmarks > Show All Bookmarks), then use Import and Backup > Export Bookmarks to HTML...
Thank you