Dropped a table, and a curiously named table appeared in the schema's table list

I use Oracle XE and I'm not logged in as an administrator, just as a user called 'hr'.

I dropped an empty table called "Date_tine_demo" (sic) using

DROP TABLE date_tine_demo;

But when I checked the tables in my schema using

SELECT * FROM tab;

the 'date_tine_demo' table was missing, but there was a table with a really weird name

Before:

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
COUNTRIES                      TABLE
DATE_TINE_DEMO                 TABLE
DEPARTMENTS                    TABLE
EMPLOYEES                      TABLE
EMP_DETAILS_VIEW               VIEW
JOBS                           TABLE
JOB_HISTORY                    TABLE
LOCATIONS                      TABLE
PROMOTIONS                     TABLE
REGIONS                        TABLE

After:

TNAME                          TABTYPE  CLUSTERID
------------------------------ ------- ----------
BIN$zP+eIjJ7ShCOnQb4EGVHig==$0 TABLE
COUNTRIES                      TABLE
DEPARTMENTS                    TABLE
EMPLOYEES                      TABLE
EMP_DETAILS_VIEW               VIEW
JOBS                           TABLE
JOB_HISTORY                    TABLE
LOCATIONS                      TABLE
PROMOTIONS                     TABLE
REGIONS                        TABLE

If anyone can offer advice on what this table is and how to get rid of it, I would be grateful.

It is in the recycle bin.

If you don't want tables to go to the recycle bin when you drop them, either disable the recycle bin with ALTER SESSION/SYSTEM SET RECYCLEBIN = OFF or use DROP TABLE table_name PURGE. And to get rid of what is currently sitting in the recycle bin, use PURGE RECYCLEBIN.
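For example, a minimal sketch against the recycle bin (run as the same 'hr' user; either restore the table or purge it, not both):

-- see what ended up in the recycle bin
SELECT object_name, original_name, droptime FROM user_recyclebin;

-- bring the table back if the drop was a mistake...
FLASHBACK TABLE date_tine_demo TO BEFORE DROP;

-- ...or get rid of it for good
PURGE TABLE date_tine_demo;
PURGE RECYCLEBIN;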

SY.

Tags: Database

Similar Questions

  • Impossible to drop the schema

    Hi all

    I'm trying to drop a schema, but I'm getting the error below

    bdr_local (1)> drop user HILLS_ISSUE_TRACKER cascade;
    drop user HILLS_ISSUE_TRACKER cascade
    *
    ERROR at line 1:
    ORA-00604: error occurred at recursive SQL level 1
    ORA-00069: cannot acquire lock - table locks disabled for B_ISSUE_COMMENTS

    bdr_local (2)> alter table HILLS_ISSUE_TRACKER.B_ISSUE_COMMENTS enable table lock;
    alter table HILLS_ISSUE_TRACKER.B_ISSUE_COMMENTS enable table lock
    *
    ERROR at line 1:
    ORA-00054: resource busy and acquire with NOWAIT specified

    bdr_local (3)> SELECT * FROM v$lock
      2  WHERE id1 = (SELECT object_id FROM dba_objects WHERE owner = 'HILLS_ISSUE_TRACKER' AND object_name = 'B_ISSUE_COMMENTS');

    no rows selected

    I bounced the database to release the lock, if there was one. Please help if anybody has had the same issue.

    Thank you
    Jamsher.

    Something new to learn, thanks for sharing this.
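    For reference, a hedged sketch of how the blocking session could have been located (view names are from the standard data dictionary; the owner and table name are taken from the errors above):

    SELECT s.sid, s.serial#, s.username, lo.locked_mode
      FROM v$locked_object lo
      JOIN v$session s ON s.sid = lo.session_id
      JOIN dba_objects o ON o.object_id = lo.object_id
     WHERE o.owner = 'HILLS_ISSUE_TRACKER'
       AND o.object_name = 'B_ISSUE_COMMENTS';

    Killing (or waiting out) any session returned here should let the ALTER TABLE ... ENABLE TABLE LOCK, and then the DROP USER ... CASCADE, go through.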

  • How to find out the date, time and who dropped a schema at the database level?

    Hi all

    In my production database someone dropped one of the main schemas. Is it possible to know who dropped it, and at what time? And how can I find out who all have permission to drop another schema?

    Version: 11gR2

    Thank you

    903787 wrote:
    Hello

    I'm sorry, the db is in noarchivelog mode, as per the company's requirement.

    Given that the company doesn't care about the data enough to run the DB in archivelog mode,
    they are reaping what they have sown.

    Please let me know, are there any other options?

    Only if AUDITING had been enabled beforehand.
    I do not know of any other option you could use now.
    Please don't waste your time looking for one.
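    For completeness, a hedged sketch of what that auditing would look like, assuming AUDIT_TRAIL is set and the audit is switched on before the drop happens:

    -- statement auditing of user-management DDL (CREATE/ALTER/DROP USER)
    AUDIT USER;

    -- later: who issued a DROP USER, and when
    SELECT username, action_name, obj_name, timestamp
      FROM dba_audit_trail
     WHERE action_name = 'DROP USER';

    -- and who currently holds the DROP USER system privilege
    SELECT grantee FROM dba_sys_privs WHERE privilege = 'DROP USER';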

  • Import the schema from one instance to another

    Hello

    I'm copying a schema from one (production) Oracle instance to another (development) located on a different server. I last did this two years ago, and the data is now obsolete. We do a nightly export of all schemas, and I only want to import the "bob" schema into dev. I copied the export file to the dev server and used the following command:

    imp sys/<password> file=server.dmp fromuser=bob ignore=y

    The import worked well for a bit, then I started getting "cannot create extents in tablespace" errors and eventually the imp stopped. I checked and, sure enough, the main tablespace was nearly full, which is strange since prod and dev are the same size. In any case, I went in and doubled the size of the index and data tablespaces and tried the import again.

    This time, I got the following:

    IMP-00019: row rejected due to ORACLE error 1
    IMP-00003: ORACLE error 1 encountered
    ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
    Column 1 3956
    Column 2 1074785
    Column 3 20-APR-2001:08:57:44
    Column 4 483
    Column 5 52905
    Column 6 CR
    Column 7 -.72
    Column 8
    Column 9 47650
    Column 10
    Column 11
    Column 12
    IMP-00019: row rejected due to ORACLE error 1
    IMP-00003: ORACLE error 1 encountered
    ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
    Column 1 3957
    Column 2 1076007
    Column 3 20-APR-2001:13:38:04
    Column 4 483
    Column 5 24290
    Column 6 CR
    Column 7 -.26
    Column 8
    Column 9 47839
    Column 10
    Column 11
    Column 12
    IMP-00019: row rejected due to ORACLE error 1
    IMP-00003: ORACLE error 1 encountered
    ORA-00001: unique constraint (SYS.XPKACCOUNT_TRANSACTION) violated
    Column 1 3958
    Column 2 1076015
    Column 3 20-APR-2001:13:38:54
    Column 4 483
    Column 5 24290
    Column 6 CR
    Column 7 -.09
    Column 8
    Column 9 47842
    Column 10
    Column 11
    Column 12

    I used the ignore=y option so it ignores the problem if the object already exists, but does it add the data on import, or clear the table and then add it? I'm guessing it tries to add the data, which is what's causing the errors.

    I remember in the past the only way I got this to work was to log in as sys, drop user bob with cascade, and then re-create the user before importing. That seems to be a royal pain in the rear.

    Did I miss something on how to properly import the schema?

    Thank you

    That works. If you have changes to the table structure, it's probably best to drop the schema, or at least the table. But if all you want to do is refresh the data and nothing else has changed, then you can simply truncate and then import the data.

    Dean
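    A hedged sketch of that truncate-then-import path (the table name here is only a guess based on the constraint in the error messages):

    -- empty the tables that will be re-loaded, e.g.
    TRUNCATE TABLE bob.account_transaction;

    -- then re-run the same import as before:
    -- imp sys/<password> file=server.dmp fromuser=bob ignore=y
    -- with ignore=y the rows are appended into the now-empty tables
    -- instead of colliding with the old data.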

  • WARNING: The pool_config view or table is not found in the schema: APEX_LISTENER

    Hi, we get this error in a standalone ORDS deployment on Grizzly.

    August 14, 2014 09:59:05 oracle.dbtools.common.config.db.DatabasePoolConfig readPoolConfig

    WARNING: The pool_config view or table is not found in the schema: APEX_LISTENER

    August 14, 2014 09:59:05 oracle.dbtools.rt.web.WebErrorResponse internalError

    SEVERE: JDBCException [kind = NO_DATA]

    JDBCException [kind = NO_DATA]

    defaults.xml (the "XXXXX" values were masked for safety before posting):

    <?xml version="1.0" encoding="UTF-8" standalone="no"?>
    <!DOCTYPE properties SYSTEM "http://java.sun.com/dtd/properties.dtd">
    <properties>
      <comment>saved on Mon Aug 04 16:25:09 EDT 2014</comment>
      <entry key="cache.caching">false</entry>
      <entry key="cache.directory">/tmp/apex/cache</entry>
      <entry key="cache.duration">days</entry>
      <entry key="cache.expiration">7</entry>
      <entry key="cache.maxEntries">500</entry>
      <entry key="cache.monitorInterval">60</entry>
      <entry key="cache.procedureNameList"/>
      <entry key="cache.type">lru</entry>
      <entry key="db.hostname">XXXXXXXXX</entry>
      <entry key="db.password">XXXXXXXXXXX</entry>
      <entry key="db.port">XXXX</entry>
      <entry key="db.sid">XXXXXX</entry>
      <entry key="debug.debugger">false</entry>
      <entry key="debug.printDebugToScreen">false</entry>
      <entry key="error.keepErrorMessages">true</entry>
      <entry key="error.maxEntries">50</entry>
      <entry key="jdbc.DriverType">thin</entry>
      <entry key="jdbc.InactivityTimeout">1800</entry>
      <entry key="jdbc.InitialLimit">3</entry>
      <entry key="jdbc.MaxConnectionReuseCount">1000</entry>
      <entry key="jdbc.MaxLimit">10</entry>
      <entry key="jdbc.MaxStatementsLimit">10</entry>
      <entry key="jdbc.MinLimit">1</entry>
      <entry key="jdbc.statementTimeout">900</entry>
      <entry key="log.logging">false</entry>
      <entry key="log.maxEntries">50</entry>
      <entry key="misc.compress"/>
      <entry key="misc.defaultPage">apex</entry>
      <entry key="misc.enableOldFOP">true</entry>
      <entry key="security.disableDefaultExclusionList">false</entry>
      <entry key="security.maxEntries">2000</entry>
    </properties>

    Complete log

    August 14, 2014 11:28:13 oracle.dbtools.standalone.Standalone run

    INFO: NOTE:

    Standalone mode is designed for use in development and test environments. It is not supported for use in production environments.

    August 14, 2014 11:28:13 oracle.dbtools.standalone.Standalone run

    INFO: Starting standalone Web container in: /u00/home/oracle/ords/ords

    August 14, 2014 11:28:14 oracle.dbtools.standalone.Deployer deploy

    INFO: Deploy the application path = /u00/home/oracle/ords/ords/ords/WEB-INF/web.xml

    August 14, 2014 11:28:17 oracle.dbtools.standalone.Deployer deploy

    INFO: Deploy the application path = /u00/home/oracle/ords/ords/ords/WEB-INF/web.xml

    August 14, 2014 11:28:18 oracle.dbtools.common.config.file.ConfigurationFolder logConfigFolder

    INFO: Using configuration folder: /u00/home/oracle/ords/ords

    Configuration properties for: apex

    cache.caching=false
    cache.directory=/tmp/apex/cache
    cache.duration=days
    cache.expiration=7
    cache.maxEntries=500
    cache.monitorInterval=60
    cache.procedureNameList=
    cache.type=lru
    db.hostname=XXXXXXXXXXXXXXXXXXX
    db.password=***
    db.port=XXXXXXXXXXXXXXXXXXX
    db.sid=XXXXXXXXXXXXXXXXXXX
    debug.debugger=false
    debug.printDebugToScreen=false
    error.keepErrorMessages=true
    error.maxEntries=50
    jdbc.DriverType=thin
    jdbc.InactivityTimeout=1800
    jdbc.InitialLimit=3
    jdbc.MaxConnectionReuseCount=1000
    jdbc.MaxLimit=10
    jdbc.MaxStatementsLimit=10
    jdbc.MinLimit=1
    jdbc.statementTimeout=900
    log.logging=false
    log.maxEntries=50
    misc.compress=
    misc.defaultPage=apex
    misc.enableOldFOP=true
    security.disableDefaultExclusionList=false
    security.maxEntries=2000
    db.username=APEX_PUBLIC_USER

    August 14, 2014 11:28:26 oracle.dbtools.common.config.db.ConfigurationValues intValue

    WARNING: *** jdbc.MaxLimit in configuration apex is using a value of 10, this setting may not be sized adequately for a production environment ***

    August 14, 2014 11:28:26 oracle.dbtools.common.config.db.ConfigurationValues intValue

    WARNING: *** jdbc.InitialLimit in configuration apex is using a value of 3, this setting may not be sized adequately for a production environment ***

    Using JDBC driver: Oracle JDBC driver version: 11.2.0.3.0

    August 14, 2014 11:28:28 oracle.dbtools.rt.web.SCListener contextInitialized

    INFO: Oracle REST Data Services initialized
    Oracle REST Data Services version: 2.0.8.163.10.40
    Oracle REST Data Services server info: Grizzly/1.9.49

    August 14, 2014 11:28:28 com.sun.grizzly.Controller logVersion

    INFO: GRIZZLY0001: Starting Grizzly Framework 1.9.49 - 14/08/14 11:28

    August 14, 2014 11:28:28 oracle.dbtools.standalone.Standalone run

    INFO: http://localhost:8080/ords/ started.

    Configuration properties for: apex_al

    cache.caching=false
    cache.directory=/tmp/apex/cache
    cache.duration=days
    cache.expiration=7
    cache.maxEntries=500
    cache.monitorInterval=60
    cache.procedureNameList=
    cache.type=lru
    db.hostname=XXXXXXXXXXXXXXXXXXX
    db.password=***
    db.port=XXXXXXXXXXXXXXXXXXX
    db.sid=BXXXXXXXXXXXXXXXXXXX
    debug.debugger=false
    debug.printDebugToScreen=false
    error.keepErrorMessages=true
    error.maxEntries=50
    jdbc.DriverType=thin
    jdbc.InactivityTimeout=1800
    jdbc.InitialLimit=3
    jdbc.MaxConnectionReuseCount=1000
    jdbc.MaxLimit=10
    jdbc.MaxStatementsLimit=10
    jdbc.MinLimit=1
    jdbc.statementTimeout=900
    log.logging=false
    log.maxEntries=50
    misc.compress=
    misc.defaultPage=apex
    misc.enableOldFOP=true
    security.disableDefaultExclusionList=false
    security.maxEntries=2000
    db.username=APEX_LISTENER

    August 14, 2014 11:32:01 oracle.dbtools.common.config.db.ConfigurationValues intValue

    WARNING: *** jdbc.MaxLimit in configuration apex_al is using a value of 10, this setting may not be sized adequately for a production environment ***

    August 14, 2014 11:32:01 oracle.dbtools.common.config.db.ConfigurationValues intValue

    WARNING: *** jdbc.InitialLimit in configuration apex_al is using a value of 3, this setting may not be sized adequately for a production environment ***

    August 14, 2014 11:32:03 oracle.dbtools.common.config.db.DatabasePoolConfig readPoolConfig

    WARNING: The pool_config view or table is not found in the schema: APEX_LISTENER

    August 14, 2014 11:40:49 oracle.dbtools.common.config.db.DatabasePoolConfig readPoolConfig

    WARNING: The pool_config view or table is not found in the schema: APEX_LISTENER

    August 14, 2014 11:40:50 oracle.dbtools.rt.web.WebErrorResponse internalError

    SEVERE: JDBCException [kind = NO_DATA]

    JDBCException [kind = NO_DATA]

    at oracle.dbtools.common.jdbc.JDBCException.wrap(JDBCException.java:88)

    at oracle.dbtools.common.jdbc.JDBCQueryProvider.query(JDBCQueryProvider.java:63)

    at oracle.dbtools.common.jdbc.JDBCQueryProvider.query(JDBCQueryProvider.java:38)

    at oracle.dbtools.rt.jdbc.entity.JDBCTenantDispatcher.tenant(JDBCTenantDispatcher.java:98)

    at oracle.dbtools.rt.jdbc.entity.TenantDispatcherBase.target(TenantDispatcherBase.java:71)

    at oracle.dbtools.rt.jdbc.entity.TenantDispatcherBase.target(TenantDispatcherBase.java:37)

    at oracle.dbtools.rt.web.ReTargetingDispatcher.canDispatch(ReTargetingDispatcher.java:45)

    at oracle.dbtools.rt.web.RequestDispatchers.choose(RequestDispatchers.java:160)

    at oracle.dbtools.rt.web.RequestDispatchers.dispatch(RequestDispatchers.java:75)

    at oracle.dbtools.rt.web.ETags.checkPrecondition(ETags.java:93)

    at oracle.dbtools.rt.web.HttpEndpointBase.restfulServices(HttpEndpointBase.java:426)

    at oracle.dbtools.rt.web.HttpEndpointBase.service(HttpEndpointBase.java:164)

    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)

    at com.sun.grizzly.http.servlet.ServletAdapter$FilterChainImpl.doFilter(ServletAdapter.java:1059)

    at com.sun.grizzly.http.servlet.ServletAdapter$FilterChainImpl.invokeFilterChain(ServletAdapter.java:999)

    at com.sun.grizzly.http.servlet.ServletAdapter.doService(ServletAdapter.java:434)

    at oracle.dbtools.standalone.SecureServletAdapter.doService(SecureServletAdapter.java:91)

    at com.sun.grizzly.http.servlet.ServletAdapter.service(ServletAdapter.java:379)

    at com.sun.grizzly.tcp.http11.GrizzlyAdapter.service(GrizzlyAdapter.java:179)

    at com.sun.grizzly.tcp.http11.GrizzlyAdapterChain.service(GrizzlyAdapterChain.java:196)

    at com.sun.grizzly.tcp.http11.GrizzlyAdapter.service(GrizzlyAdapter.java:179)

    at com.sun.grizzly.http.ProcessorTask.invokeAdapter(ProcessorTask.java:849)

    at com.sun.grizzly.http.ProcessorTask.doProcess(ProcessorTask.java:746)

    at com.sun.grizzly.http.ProcessorTask.process(ProcessorTask.java:1045)

    at com.sun.grizzly.http.DefaultProtocolFilter.execute(DefaultProtocolFilter.java:228)

    at com.sun.grizzly.DefaultProtocolChain.executeProtocolFilter(DefaultProtocolChain.java:137)

    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:104)

    at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:90)

    at com.sun.grizzly.http.HttpProtocolChain.execute(HttpProtocolChain.java:79)

    at com.sun.grizzly.ProtocolChainContextTask.doCall(ProtocolChainContextTask.java:54)

    at com.sun.grizzly.SelectionKeyContextTask.call(SelectionKeyContextTask.java:59)

    at com.sun.grizzly.ContextTask.run(ContextTask.java:71)

    at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:532)

    at com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThreadPool.java:513)

    at java.lang.Thread.run(Thread.java:662)

    Caused by: java.sql.SQLSyntaxErrorException: ORA-00942: table or view does not exist

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:447)

    at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:396)

    at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:879)

    at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:505)

    at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:223)

    at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:531)

    at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:208)

    at oracle.jdbc.driver.T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:886)

    at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1175)

    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1288)

    at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3612)

    at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3656)

    at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1495)

    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)

    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)

    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)

    at java.lang.reflect.Method.invoke(Method.java:597)

    at oracle.ucp.jdbc.proxy.PreparedStatementProxyFactory.invoke(PreparedStatementProxyFactory.java:111)

    at com.sun.proxy.$Proxy44.executeQuery(Unknown Source)

    at oracle.dbtools.common.jdbc.JDBCQueryImpl.resultSet(JDBCQueryImpl.java:92)

    at oracle.dbtools.common.jdbc.JDBCResultRowIterator.<init>(JDBCResultRowIterator.java:29)

    at oracle.dbtools.common.jdbc.JDBCQueryImpl.execute(JDBCQueryImpl.java:52)

    at oracle.dbtools.common.jdbc.JDBCQueryProvider.query(JDBCQueryProvider.java:60)

    ... 33 more

    The DBA had forgotten to run apex_rest_config.sql. Doh!
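    For anyone else who lands here, a hedged sketch of the missing step: from the directory where the APEX installation files were unzipped, connect to SQL*Plus as SYS and run

    @apex_rest_config.sql

    The script prompts for passwords for the APEX_LISTENER and APEX_REST_PUBLIC_USER accounts; once it completes, restart ORDS so the apex_al pool can connect.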

  • Transformation script to set the schema and user on tables in the relational and physical models

    I imported my Designer repository into SDDM 4.0.1.836.  Now I want to use the APP_SYS subviews to set the schema on the tables in the relational model and the user in the physical model.  Our application systems in Designer are 90% schema-based.  Here's what I have so far, based on the example OE and HR schemas that I imported, engineered to logical, then back to relational.  At this point, I have the tables by subview...  Line 24 is obviously incorrect.  Can someone give me a step-by-step approach to understanding what I need in order to set the table's schema and the user?  I looked through the readme file, the xml files, the index, and searched the known universe and am still confused.

    importPackage(javax.swing);
    // variable to keep a status message for later
    
    var message="";
    var schema="";
    
    // Get all subviews for Relational
        subviews = model.getListOfSubviews().toArray();
    for (var isub = 0; isub<subviews.length;isub++){
        subview = subviews[isub];
        message = message + '\n'+isub+" "+ subviews[isub];
        if(subviews[isub]!=null){
              if(subview == "HR_ERD"){
                 shema = "HR";
                }else if(subview == "OE_ERD"){
                 schema = "OE";
                }
             tables = model.getTableSet().toArray();
             for (var t = 0; t<tables.length;t++){
                 table = tables[t];
                 //check for presentation in subview
                 tv = table.getFirstViewForDPV(subviews[isub]);
                 if(tv!=null){
                     table.setSchemaObject(schema);  
                      message = message + '\n    '+ t +" " +shema+"."+ tables[t] ;
                }
            }
        }
    }  
    
    JOptionPane.showMessageDialog(null, message);
    //Packages.oracle.dbtools.crest.swingui.ApplicationView.log(message);
    

    Hi Marcus,

    First of all, you must make sure that the schema already exists in your relational model.

    You can use the method

    public void setSchema (String schemaName)

    to associate your Table with that schema.

    (This method is defined on the container object class, so it can also be used for indexes.)

    If you already have a User in your physical model that implements your schema, it is not normally necessary to also set the user on the TableProxyOracle in the physical model.

    However, if you do want to set it, then the method to use is

    public void setUser (UserOracle user)

    David

  • DataPump - expdp.open creates tables in the schema

    Hello

    I am using Data Pump in Oracle 10g to archive old partitions from the main schema into another schema.

    I noticed that when dbms_datapump.open is called, a new table is created by dbms_datapump for internal purposes. This is confirmed in the Oracle documentation:

    http://docs.Oracle.com/CD/B12037_01/AppDev.101/b10802/d_datpmp.htm#997383

    Notes on use

    • When the job is created, a master table is created for the job under the caller's schema within the caller's default tablespace. A handle referencing the job is returned that attaches the current session to the job. Once attached, the handle remains valid until either an explicit or an implicit detach occurs. The handle is only valid in the caller's session. Other handles can be attached to the same job from a different session by using the ATTACH procedure.

    Does anyone know if this table can be removed by some Oracle dbms_datapump 'cleanup' call, or if it has to be cleaned up manually?

    I can confirm that this is what you do:

    v_job_handle := DBMS_DATAPUMP.OPEN('EXPORT', 'TABLE', NULL, v_job_name);

    -- Set parallelism to 1 and add the dump file
    DBMS_DATAPUMP.SET_PARALLEL(v_job_handle, 1);
    DBMS_DATAPUMP.ADD_FILE(v_job_handle, v_job_name || '_' || v_partition.partition_name || '.dmp', 'PARTITION_DUMPS');

    -- Apply filters to process only one partition of the table
    DBMS_DATAPUMP.METADATA_FILTER(v_job_handle, 'SCHEMA_EXPR', 'IN (''SIS_MAIN'')');
    DBMS_DATAPUMP.METADATA_FILTER(v_job_handle, 'NAME_EXPR', 'LIKE ''' || t_archive_list(i) || '''');
    DBMS_DATAPUMP.DATA_FILTER(v_job_handle, 'PARTITION_EXPR', 'IN (''' || v_partition.partition_name || ''')', t_archive_list(i), 'SIS_MAIN');

    -- Use statistics (rather than blocks) for the time estimate
    DBMS_DATAPUMP.SET_PARAMETER(v_job_handle, 'ESTIMATE', 'STATISTICS');

    -- Start the job. An exception is raised if something is not set up correctly
    DBMS_DATAPUMP.START_JOB(v_job_handle);

    -- The export job should now be running. Loop until it is finished
    v_percent := 0;
    v_job_state := 'UNDEFINED';

    WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED') LOOP
        DBMS_DATAPUMP.GET_STATUS(v_job_handle,
            DBMS_DATAPUMP.ku$_status_job_error + DBMS_DATAPUMP.ku$_status_job_status + DBMS_DATAPUMP.ku$_status_wip,
            -1, v_job_state, sts);
        js := sts.job_status;

        -- As the percentage complete changes in this loop, display the new value
        IF js.percent_done != v_percent THEN
            v_percent := js.percent_done;
        END IF;
    END LOOP;

    -- Once the job is finished, display the status before detaching
    PRC_LOG(f1, t_archive_list(i) || ': Export complete with status: ' || v_job_state);

    -- DBMS_DATAPUMP.DETACH(v_job_handle);
    -- Use STOP_JOB instead of DETACH, otherwise the 'master' table that is created when OPEN is called is not removed
    DBMS_DATAPUMP.STOP_JOB(v_job_handle, 0, 0);
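    And a hedged sketch for spotting master tables that were left behind by detached jobs (view name from the standard data dictionary):

    -- orphaned Data Pump master tables show up as NOT RUNNING jobs
    SELECT owner_name, job_name, state
      FROM dba_datapump_jobs
     WHERE state = 'NOT RUNNING';

    -- each such JOB_NAME is an ordinary table in OWNER_NAME's schema and can be
    -- dropped once you are sure the job will never be restarted, e.g.
    -- DROP TABLE <owner_name>.<job_name> PURGE;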

  • Issue storing XML in schema-generated tables, and XML validation against the schema

    Hello friends,

    I am facing a problem storing XML in tables generated from a registered schema.

    This is my schema

    --

    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema xmlns:ds="http://www.w3.org/2000/09/xmldsig#"
               xmlns:xs="http://www.w3.org/2001/XMLSchema"
               xmlns="http://www.abc.inf.in/test"
               targetNamespace="http://www.abc.inf.in/test"
               elementFormDefault="qualified"
               attributeFormDefault="unqualified">
      <xs:include schemaLocation="abc.xsd"/>
      <xs:element name="project" type="student">
        <xs:annotation>
          <xs:documentation>It is a Documentation</xs:documentation>
        </xs:annotation>
      </xs:element>
    </xs:schema>

    -- This is my XML document

    <project versao="2.00" xmlns="http://www.abc.inf.in/test">
      <test xmlns="http://www.abc.inf.in/test">
        <intest version="2.00" Id="testabc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
          <ide>
            <cUF>35</cUF>
            <cNF>59386422</cNF>
            <natOp>This is post</natOp>
            <indPag>1</indPag>
            <mod>55</mod>
            <serie>1</serie>
          </ide>...

    .....................

    Not giving the complete document because it is too long.

    1. I registered the schemas successfully in the database.

    2. Then I generated the tables from the registered schema.

    3. In my Java code I validated the XML document against the schema, and it validates successfully.

    4. But when I insert this XML file into the generated table, it gives me an error.

    Like this:

    INSERT INTO XMLTABLE
    VALUES
    (XMLTYPE(bfilename('MYDIR','testabc.xml'), NLS_CHARSET_ID('AL32UTF8')));

    Error report:

    SQL Error: ORA-31061: XDB error: XML event error
    ORA-19202: Error occurred in XML processing
    LSX-00333: literal "94032000" is not valid with respect to the pattern

    And I have to store this XML in these tables, so what do I have to do?

    Thanks for your reply odie_63.

    I found the solution to the error myself. My XML document was not valid against my registered XML schema.

    Meaning, my XML document contains an invalid value that does not match the pattern in my schema, so it gives this error:

    SQL Error: ORA-31061: XDB error: XML event error
    ORA-19202: Error occurred in XML processing
    LSX-00333: literal "94032000" is not valid with respect to the pattern

    For the solution, there are two ways:

    1. I changed this literal value "94032000" in my XML file, then saved it.

    2.

    - delete the schema,
    - change the schema's pattern for that particular element,
    like:

    - and then storing the XML in the database works...

    Thank you.
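    As an aside, a hedged sketch of checking the document inside the database before inserting it (the URL passed to isSchemaValid must be whatever URL the schema was registered under, which is not shown in the post):

    SELECT XMLTYPE(bfilename('MYDIR','testabc.xml'), NLS_CHARSET_ID('AL32UTF8'))
             .isSchemaValid('http://www.abc.inf.in/test') AS is_valid   -- 1 = valid, 0 = invalid
      FROM dual;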

  • How can I export the schema with all objects and a few tables without data

    Hi all

    Version 10g EA.
    I am exporting a schema with all its objects, but I need to skip the data in some of the tables.

    There are 4 tables with huge data; we don't need to export the data from those tables, but their structure must be exported.


    Thank you
    Nr

    You can do this with a single command.  Run your export as usual and add query parameters for the 4 tables you don't want any rows from:

    expdp ... query=schema1.table1:"where rownum = 0" query=schema2.table2:"where rownum = 0" ...

    It is best to place the query parameters in a parameter file so you don't have to worry about escaping the OS's special characters.

    Dean
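    A hedged sketch of such a parameter file (directory, schema and table names are placeholders for your own):

    directory=your_dp_dir
    dumpfile=full_schema.dmp
    logfile=full_schema.log
    schemas=your_schema
    query=your_schema.big_table1:"WHERE rownum = 0"
    query=your_schema.big_table2:"WHERE rownum = 0"
    query=your_schema.big_table3:"WHERE rownum = 0"
    query=your_schema.big_table4:"WHERE rownum = 0"

    and then run it with expdp <user>/<password> parfile=exp_no_big_data.par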

  • Creating a table in schema A with user B and granting permissions to user C

    Hello, I have a fun problem - we have a large table that a batch process needs to make millions of updates to, and it needs to finish more quickly. It takes several hours as an update, but a create-table-as-select did the same thing in about a minute. Yay! So we just drop the old table, rename the new table to the old name, rebuild all the indexes and grant select/insert/update/delete permissions to another user who needs access... and there is the problem: Oracle says insufficient privs.

    We have userid A, which owns the table, userid B which does all the application batch stuff, and userid C which performs inserts for a middleware process. None of them are actual people, of course. We want user B to do the CTAS, rename, index rebuild and re-grant of the permissions. But user B apparently cannot grant permissions on objects in schema A to user C, even though B created the table and has the DBA role (not ideal, I know)!

    What's really crazy is that there is one nonsensical way user B can grant permissions to user C, which is:

    grant select any table to c;
    grant insert any table to c;
    grant update any table to c;
    grant delete any table to c;

    It seems really perverse that user B can create tables in schema A and even grant C "any table" privileges, but cannot grant privileges specifically on this one object in schema A, even with DBA privs. I must be missing something... right?

    Yes. By default, stored procedures are DEFINER-rights stored procedures. If A owns a stored procedure, that stored procedure can do whatever A has the privileges to do directly. If A grants B the EXECUTE privilege on that stored procedure, then when B calls the stored procedure, it runs with A's privileges.

    I agree with Stew's point, however: re-creating objects in Oracle is generally a bad idea, and a TRUNCATE plus a direct-path insert, possibly combined with disabling and rebuilding the indexes, would be more appropriate than a CTAS. If you are going to stick with this approach, though, really do it in a stored procedure owned by A, so that B has no need for CREATE ANY TABLE.

    Justin
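    A hedged sketch of Justin's suggestion, with placeholder names: a definer-rights procedure owned by A does the refresh, so B only needs EXECUTE on it, and because the table is truncated rather than dropped, the existing grants to C survive.

    -- created once as A (or by a DBA)
    CREATE OR REPLACE PROCEDURE a.refresh_big_table AS
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE a.big_table';
      INSERT /*+ APPEND */ INTO a.big_table
      SELECT * FROM a.big_table_staging;   -- hypothetical source of the recomputed rows
      COMMIT;
    END;
    /
    GRANT EXECUTE ON a.refresh_big_table TO b;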

  • Prevent specific users from dropping table partitions

    Hello

    I'm having the following problem: on the CUSTOMERS table I implemented a policy that prevents specific users from deleting records whose REPORT_DATE (a DATE column) is more than 2 months old. It works very well.
    These users are actually OS logins that I identify using sys_context ('USERENV', 'OS_USER').

    However... I realized that these users can drop any partition of this table (the table is partitioned BY RANGE), so the policy I have implemented is useless.

    My question: is it possible to prevent specific users from dropping a partition?
    All the partitions in the CUSTOMERS table are associated with a single REPORT_DATE, which is why I want to prevent dropping any partition whose REPORT_DATE is more than two months old...


    Thank you
    MR. R.

    Hello

    Revoke the privilege from the Oracle user that is connected. (Privileges are granted to, and held by, Oracle users, not the OS users that you identify via SYS_CONTEXT.)

    If that user is the schema owner, then create a different schema for all those log-ins, one that has only the privileges these users actually need. Modify the application so that it connects as this Oracle log-in and not as the owner of the table's schema.
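    A hedged sketch of that approach, with placeholder names: the batch jobs connect as a separate account that has only DML on the table, so it cannot drop partitions.

    CREATE USER app_batch IDENTIFIED BY "change_me";
    GRANT CREATE SESSION TO app_batch;
    GRANT SELECT, INSERT, UPDATE, DELETE ON table_owner.customers TO app_batch;
    -- deliberately NOT granted: ALTER on the table, DROP ANY TABLE, etc.,
    -- so ALTER TABLE ... DROP PARTITION fails with insufficient privileges.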

  • Querying a table inside the OIM schema

    Hello

    I created a custom table in the OIM schema. I want to insert data into this custom table through my scheduled task. Since I am trying to insert data into a table that is located in the OIM schema itself, I hope I don't have to make a separate JDBC connection. Could someone please let me know how I can go about it?

    Thank you
    PETREA

    You can use a prepared statement for it and use tcDataProvider (OIM) to make the connection.

  • avoid triggers on all tables in the schema

    I need to populate a CREATE_TIMESTAMP column with the database server's timestamp (not the session timestamp) in all tables in a schema, on insert and update. Is creating a trigger on all the tables in the schema the least time-consuming way to do this?

    Similarly, I need to set up columns such as CREATE_USER, LAST_UPDATE_USER.

    Thank you in advance.

    You can easily generate the DDL for adding new columns.

    As far as filling the columns goes, your choice is either to use before-insert and before-update triggers on the tables to fill the columns, or to have the application provide the necessary information.

    The basic trigger logic would be pretty much the same for all tables, so writing a little SQL or PL/SQL to generate the trigger code should be simple enough.

    Depending on your application, for example web-based with only one Oracle user, you may need to have the application server logic pass the real user via dbms_application_info.

    HTH - Mark D Powell.

    Edited by: Mark D Powell 5 May 2010 07:48
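    A hedged sketch of one such trigger (table and column names are placeholders); the same text can be generated for every table in the schema from user_tables:

    CREATE OR REPLACE TRIGGER employees_audit_cols_trg
    BEFORE INSERT OR UPDATE ON employees
    FOR EACH ROW
    BEGIN
      IF INSERTING THEN
        :NEW.create_timestamp := SYSTIMESTAMP;   -- database server time, not session time
        :NEW.create_user      := SYS_CONTEXT('USERENV', 'SESSION_USER');
      END IF;
      :NEW.last_update_timestamp := SYSTIMESTAMP;
      :NEW.last_update_user      := SYS_CONTEXT('USERENV', 'SESSION_USER');
    END;
    /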

  • Versioning the schema via SQLite PRAGMA user_version

    I read some posts on versioning the schema via the PRAGMA user_version... From what I read, this method seems to be the right approach to implement. But over the last few days I've been running into a few problems with my implementation... So, I have 2 questions...

    1. When I update the user_version PRAGMA, the value is never persisted across application launches...

      1st round (Application does not close on the device):

      Initial DB Version 0
      DB Version 0
      DB Version 1
      DB Version 2
      DB Version 3

      2nd round

      Initial DB Version 1
      DB Version 0
      DB Version 1
      DB Version 2
      DB Version 3

      To my knowledge, on the 2nd run the PRAGMA user_version should contain the value 3... But it always starts at 1.

      This is the sequential code used to get and update the PRAGMA...

        // Retrieve the DB schema version #
        QSqlQuery sqlQuery(sqlda->connection());
        sqlQuery.setForwardOnly(true);
        sqlQuery.exec("PRAGMA user_version");
      
        if (!sqlQuery.isActive())
        {
          // error
          qDebug() << "Error fetching DB Version.";
        }
      
        if (sqlQuery.next())
        {
          version = sqlQuery.value(0).toInt();
          qDebug() << "Initial DB Version " << version;
        }
      
        QSqlQuery sqlUpdateQuery(sqlda->connection());
        sqlQuery.setForwardOnly(true);
        sqlUpdateQuery.exec("PRAGMA user_version=0");
      
        ...
      
        sqlUpdateQuery.exec("PRAGMA user_version=3");
      
    2. In my class, I've decoupled the get and the update of the version into the two separate C++ functions below...
      int ApplicationUI::getDatabaseVersion()
      {
        // DB Version initialization
        int version = 0;
      
        // Create SQL Data Access object binding to the DB file...
        SqlDataAccess *sqlda = new SqlDataAccess(DB_PATH);
      
        // Retrieve the DB schema version #
        QSqlQuery sqlQuery(sqlda->connection());
        sqlQuery.setForwardOnly(true);
        sqlQuery.exec("PRAGMA user_version");
      
        if (!sqlQuery.isActive())
        {
          // error
          qDebug() << "Error fetching DB Version.";
        }
      
        if (sqlQuery.next())
        {
          version = sqlQuery.value(0).toInt();
        }
      
        return version;
      }
      
      void ApplicationUI::updateDatabaseSchemaVersion(int version)
      {
        // Create SQL Data Access object binding to the DB file...
        SqlDataAccess *sqlda = new SqlDataAccess(DB_PATH);
      
        // Prepare the update statement
        QString sqlPragma = "PRAGMA user_version=" + QString::number(DB_VERSION);
        QSqlQuery sqlUpdateQuery(sqlda->connection());
        sqlUpdateQuery.setForwardOnly(true);
        sqlUpdateQuery.exec(sqlPragma);
        sqlda->connection().commit();
        qDebug() << "Updated PRAGMA to Version " << version;
      }
      
      void ApplicationUI::updateDatabaseSchema()
      {
        // Create SQL Data Access object binding to the DB file...
        SqlDataAccess *sqlda = new SqlDataAccess(DB_PATH);
      
        int version = 0;
      
        version = getDatabaseVersion();
        qDebug() << "Initial DB Version " << version;
        updateDatabaseSchemaVersion(2);
        version = getDatabaseVersion();
        qDebug() << "DB Version " << version;
        updateDatabaseSchemaVersion(4);
        version = getDatabaseVersion();
        qDebug() << "DB Version " << version;
        updateDatabaseSchemaVersion(6);
        version = getDatabaseVersion();
        qDebug() << "DB Version " << version;
        updateDatabaseSchemaVersion(7);
        version = getDatabaseVersion();
        qDebug() << "DB Version " << version;
      }
      

      In my test case I run a few sequential updates, but every time the application runs I get the following messages on the console:

      QSqlDatabasePrivate::removeDatabase: connection './data/bookDatabase.db' is still in use, all queries will cease to work.
      QSqlDatabasePrivate::addDatabase: duplicate connection name './data/bookDatabase.db', old connection removed.

      I tried adding sqlda->connection().close(); after the SQL calls, but the same message appears.

      One thing I forgot in my code... in C#, I usually wrap my SQL operations in a try/catch/finally block. In the finally block I close and dispose of the connections...

    Thanks... I dropped the PRAGMA approach and went with a metadata table, following the guidelines mentioned by perter9477 in the following thread...

    http://supportforums.BlackBerry.com/T5/native-development/best-approach-for-SQL-schema-version-upgra...
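    For reference, a hedged sketch of that metadata-table approach in SQLite (table and column names are made up):

    CREATE TABLE IF NOT EXISTS schema_info (version INTEGER NOT NULL);
    INSERT INTO schema_info (version)
      SELECT 0 WHERE NOT EXISTS (SELECT 1 FROM schema_info);
    -- read the current version
    SELECT version FROM schema_info;
    -- bump it after a successful migration
    UPDATE schema_info SET version = 3;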

  • Importing an export when the source schema is unknown

    Hello

    I have a dump file, exported some time ago, that contains some tables. The schema the dump was exported from is unknown, and we have only been provided the dump file.

    I have a schema named EXTRCT which has been granted the DBA role in my target database. I want to import the tables into the EXTRCT schema.

    Will the following command import the tables into the EXTRCT schema, or will it create in my target database the schema that was used during the export? If it is the latter, how can I force the tables to be imported into the EXTRCT schema? I guess I can't use REMAP_SCHEMA because I don't have the name of the schema the tables were exported from?

    impdp extrct/extrct dumpfile=exp.dmp logfile=imp.log directory=dp_dir

    The databases are 11gR2.

    Thank you

    Mathieu

    Run the import with

    sqlfile=my_test_file.sql

    This will write all the DDL statements to that file instead of executing them.  Nothing will be created.  You can then look through my_test_file.sql to see what the schemas are.  You can also see if there are tablespaces that will need to be remapped as well.

    Dean
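    A hedged sketch of that workflow, reusing the names from the question (the source schema is a placeholder to fill in once the sqlfile reveals it):

    impdp extrct/extrct directory=dp_dir dumpfile=exp.dmp sqlfile=my_test_file.sql

    impdp extrct/extrct directory=dp_dir dumpfile=exp.dmp logfile=imp.log remap_schema=<source_schema>:EXTRCT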
