Impdp fails via database link

Hello

I am trying to import a particular schema from another instance into my instance over a database link, using Data Pump.

Source instance: Solaris on SPARC, 10.2.0.4.0, 64-bit
Target instance: Linux on x86, 32-bit, 10.2.0.4.0

I created a database link on the target database and verified that the link works, i.e. I could query tables in the remote database as the user who should receive the data.

I created a directory "exp" on the target database and granted the appropriate permissions to 'mh03'.
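For reference, the directory setup looked roughly like this (a sketch; the filesystem path shown is an assumption):

```sql
-- Run as a DBA on the target instance; the path is an assumption.
CREATE OR REPLACE DIRECTORY exp AS '/u01/app/oracle/imp';
GRANT READ, WRITE ON DIRECTORY exp TO mh03;
```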

Then I tried importing and got an error:

-----

oracle@padsw7ora01 imp$ impdp mh03/* NETWORK_LINK=padsol25 SCHEMAS=test_bas_bkrus_d123120 REMAP_SCHEMA=test_bas_bkrus_d123120:mh03 DIRECTORY=exp LOGFILE=imp.log

Import: Release 10.2.0.4.0 - Production on Tuesday, January 20, 2009 14:47:30

Copyright (c) 2003, 2007, Oracle. All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With partitioning, OLAP, Data Mining and Real Application Testing options
ORA-39006: internal error
ORA-39113: Unable to determine database version
ORA-02083: database name has illegal character '-'

ORA-39097: Data Pump job encountered unexpected error -2083

Oracle@padsw7ora01 imp$

-----

I tried searching Metalink and the web but was not able to find any useful information. I tried using the VERSION='10.2.0.4.0' parameter, but this did not help either (the error message remains the same).

I found something that seemed somewhat related:

ORA-39113: Unable to determine database version
Cause: Data Pump was unable to determine the compatibility level of the current database version using SYS.DBMS_UTILITY.DB_VERSION.
Action: Make sure you have execute access to the DBMS_UTILITY package. If this is a network job, make sure that execute access on the DBMS_UTILITY package is granted to you on the remote instance.

But even 'grant execute on SYS.DBMS_UTILITY to mh03' did not help at all.

I see that the compatibility level of the target database is 10.2.0.3.0:

-----

  1  declare
  2    v varchar2(1000 char);
  3    c varchar2(1000 char);
  4  begin
  5    dbms_utility.db_version(v, c);
  6    dbms_output.put_line(v);
  7    dbms_output.put_line(c);
  8* end;
SQL> /

PL/SQL procedure successfully completed.

SQL> set serverout on
SQL> /
10.2.0.4.0
10.2.0.3.0

PL/SQL procedure successfully completed.

SQL >

-----

Any ideas what went wrong and how I can remedy this? Thank you very much!

Kind regards

Robert

Published by: rklemme on January 20, 2009 06:16

rklemme

In my case it had nothing to do with the host name of the server or with name resolution; it was related to the global_name of the instance.
I could query over the link, but not perform a remote import.

Try temporarily changing your global_name by removing the hyphen from the domain name, then test your import:
SQL> alter database rename global_name to pad.mycompany.net;
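A minimal sketch of the whole check-and-rename sequence (the hyphenated domain 'my-company.net' is an assumption for illustration):

```sql
-- On the target instance: show the current global name
SELECT * FROM global_name;

-- Suppose it shows PAD.MY-COMPANY.NET; the hyphen is what trips up
-- the Data Pump network import (ORA-02083). Rename it temporarily:
ALTER DATABASE RENAME GLOBAL_NAME TO pad.mycompany.net;
```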

PY

Tags: Database

Similar Questions

  • Timestamp query via database link

    We have a problem with time synchronization on one of our database servers. I thought it might be possible to check the time difference between two servers by querying the time from one server via a database link.
    But the script:
    connect myuser/xxxxxxxx@db1
    
    select 'DB1: ' server, to_char(systimestamp,'mm/dd/yyyy hh24:mi:ss,ff3') time from dual;
    
    connect myuser/xxxxxxxx@db2
    
    select 'DB2: ' server, to_char(systimestamp,'mm/dd/yyyy hh24:mi:ss,ff3') time from dual;
    
    select 'local (DB2):  ' server,
           to_char(systimestamp,'mm/dd/yyyy hh24:mi:ss,ff3') time
    from dual
    union 
    select 'remote (DB1): ',
           to_char(systimestamp,'mm/dd/yyyy hh24:mi:ss,ff3')
    from dual@db_link_to_db1;
    gives me this result:
    Connected.
    
    SERVE TIME
    ----- -----------------------------
    DB1:  03/08/2013 14:08:51,333
    
    Connected.
    
    SERVE TIME
    ----- -----------------------------
    DB2:  03/08/2013 14:08:51,208
    
    
    SERVER         TIME
    -------------- -----------------------------
    local (DB2):   03/08/2013 14:08:51,298
    remote (DB1):  03/08/2013 14:08:51,298
    You can see that the server I connect to later shows a time that is earlier than the timestamp I got before, which could not happen if the two servers were synchronized.
    But when I try to query the timestamps of both servers in a single statement, I get the local server's time for the remote server as well. Is there a way to get the 'real' time of DB1 in a SQL query that is run on DB2?

    Published by: UW (Germany) on 08.03.2013 14:56

    Hey UW,

    MOS note 165674.1 describes how.

    You need a remote function (or view) that returns the remote sysdate.
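A sketch of the approach (the function name is an assumption): create a function on DB1 that returns its own clock, then call it over the link, which forces evaluation on the remote side rather than locally:

```sql
-- On DB1 (the remote server):
CREATE OR REPLACE FUNCTION remote_now RETURN TIMESTAMP IS
BEGIN
  RETURN SYSTIMESTAMP;  -- evaluated on DB1's clock
END;
/

-- On DB2, via the link; the function executes on DB1:
SELECT remote_now@db_link_to_db1 FROM dual;
```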

    Regards
    Peter

  • How can I call a pipelined table function via DB link?

    I am calling a pipelined table function defined in a remote DB (DB_A) from my local DB (DB_B) via a DB link (DB_A_REMOTE).

    The pipelined table function is defined in a package with all the type specifications it needs, and it works fine when called locally, but it fails when called remotely.

    Here is an example configuration in DB_A:
    connect scott/tiger
    create or replace
    package pkg as
      type rec is record (
        dte date
      );
      type rec_set is table of rec;
      
      function dts(p_eff_date date) return rec_set pipelined;
      function dt(p_eff_date date) return date;
    end;
    /
    create or replace
    PACKAGE BODY pkg AS
    
      function dts(p_eff_date date) return rec_set pipelined AS
        r rec;
      BEGIN
        r.dte := p_eff_date;
        pipe row(r);
        r.dte := r.dte+1;
        pipe row(r);
        RETURN;
      END dts;
    
      function dt(p_eff_date date) return date as
      begin
        return p_eff_date;
      end;
    
    END pkg;
    /
    In DB_B, I have the following configuration:
    create database link DB_A_REMOTE connect to Scott identified by tiger using 'DB_A';
    create or replace synonym RPKG for PKG@DB_A_REMOTE;
    In DB_A, I can access both PKG functions just fine
    SQL> select pkg.dt(sysdate) from dual
    PKG.DT(SYSDATE)       
    ----------------------
    21-SEP-2012 11:26:31   
    
    SQL> select * from table(pkg.dts(sysdate))
    DTE                  
    ----------------------
    21-SEP-2012 11:26:31   
    22-SEP-2012 11:26:31   
    23-SEP-2012 11:26:31   
    24-SEP-2012 11:26:31   
    However, in DB_B I get the following:
    SQL> select rpkg.dt(sysdate) from dual
    RPKG.DT(SYSDATE)     
    ----------------------
    21-SEP-2012 11:29:05   
    
    SQL> select * from table(rpkg.dts(sysdate))
    
    Error starting at line 2 in command:
    select * from table(rpkg.dts(sysdate))
    Error at Command Line:2 Column:20
    Error report:
    SQL Error: ORA-06553: PLS-752: Table function DTS is in an inconsistent state.
    06553. 00000 -  "PLS-%s: %s"
    *Cause:    
    *Action:
    Selecting rpkg.dt shows I can reach the remote package and run functions in it, but the second statement is where my problem lies.

    Why is the table function in an inconsistent state, and how can I fix this problem so that it will work across the database link?

    Published by: Sentinel on September 21, 2012 11:35

    Come on! You have posted more than 1,000 times and know that you must provide your 4-digit Oracle version.
    >
    Why is the table function in an inconsistent state, and how can I fix this problem so that it will work across the database link?
    >
    You can't - it is not supported.

    See the note under the PIPELINED clause in the section on function declarations in the PL/SQL Language Reference:
    http://docs.Oracle.com/CD/E11882_01/AppDev.112/e25519/function.htm
    >
    Note:

    You cannot run a pipelined table function over a database link. The reason is that the return type of a pipelined table function is a SQL user-defined type, which can be used only within a single database (as explained in the Oracle Database Object-Relational Developer's Guide). Although the return type of a pipelined table function might appear to be a PL/SQL type, the database actually converts that PL/SQL type to a corresponding SQL user-defined type.
    >
    Your code uses PL/SQL types, so those types are implicitly converted to the SQL types needed to access the function from SQL. But those SQL types have an OID (object ID) that is not recognized on the other server, so the other server is unable to create the appropriate type.

    If you follow the link to the other doc given in that note, you will see that even though you can create a type and specify an OID, you still won't be able to use it as you wish.
    http://docs.Oracle.com/CD/E11882_01/AppDev.112/e11822/adobjbas.htm#ADOBJ7083
    >
    Restriction on Using User-Defined Types with a Remote Database

    Objects or user-defined types (specifically, types declared with a SQL CREATE TYPE statement, as opposed to types declared within a PL/SQL package) are currently useful only within a single database. Oracle Database restricts use of a database link as follows:

    You cannot connect to a remote database to select, insert, or update a user-defined type or an object REF on a remote table.

    You can use the CREATE TYPE statement with the optional keyword OID to create a user-specified object identifier (OID) that allows an object type to be used in multiple databases. See the discussion of assigning an OID to an object type in the Oracle Database Data Cartridge Developer's Guide.

    You cannot use database links within PL/SQL code to declare a local variable of a remote user-defined type.

    You cannot pass an argument value or return value of a user-defined type in a remote PL/SQL procedure call.

  • Moving a database from one server to another via Data Pump - some queries

    Hello

    I am planning to move my database from one Windows server to another. Part of the requirement is also to upgrade this database from 10g to 11.2.0.3, so I'm combining the two tasks by upgrading via the export/import method (via Data Pump).

    Regarding the export/import (which will be a full database Data Pump export), my plan is:

    Create an empty 11.2.0.3 target database on the new server (same number of control files, redo logs etc.) ready for an import of the source database.
    Q1. This will create the SYSTEM and UNDO tablespaces - I presume the Data Pump export doesn't include these tablespaces anyway?

    For the export, I intend to simulate CONSISTENT = Y using FLASHBACK_TIME = SYSTIMESTAMP.
    Q2. Which flashback features must be active on the source database to use this, i.e. should I have Flashback Database enabled on the source (as opposed to other flashback features)?

    My target is a virtual server with a single virtual processor.
    Q3. Is there any point using PARALLEL in the import parameter file (normally I would set this to the number of processors - however in the case of this virtual server, there is actually only one virtual processor)?

    For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server.

    Q4. If the import fails before completing, what is the best course of action? For example, do I just drop the data tablespaces and redo a complete import?

    Thank you
    Jim

    Jim,

    I'll take a pass on your questions:

    > Create an empty 11.2.0.3 target database on the server (same number of control files, redo logs etc.) ready for an import of the source database.
    > Q1. This will create the SYSTEM and UNDO tablespaces - I presume the Data Pump export doesn't include these tablespaces anyway?

    The SYSTEM tablespace is created when you create a database, but Data Pump will export it and try to import it; that part will fail with 'tablespace already exists'. I am sure the UNDO tablespace will also be exported and imported. If they are already there, the import will just report that they already exist.

    > For the export, I intend to simulate CONSISTENT = Y using FLASHBACK_TIME = SYSTIMESTAMP.
    > Q2. Which flashback features must be active on the source database to use this, i.e. should I have Flashback Database enabled on the source (as opposed to other flashback features)?

    I don't know for sure about this one. I thought you just need enough undo, but I hope others will chime in.

    > My target is a virtual server with a single virtual processor.
    > Q3. Is there any point using PARALLEL in the import parameter file (normally I would set this to the number of processors - however in the case of this virtual server, there is actually only one virtual processor)?

    We usually recommend 2 times the number of processors, so PARALLEL=2 should be OK.

    > For the import, I intend to use the REMAP_DATAFILE option to change the location of the data files on the target server.
    > Q4. If the import fails before completing, what is the best course of action? For example, do I just drop the data tablespaces and redo a complete import?

    It depends on what the failure is. Most failures will not stop the job, but if one does, then most such jobs can simply be restarted. To restart a job, you just need to know the job name, which is printed when you start the export/import, or which you gave the job in your Data Pump command. To restart, run:

    impdp user/password attach=job_name

    If you did not name the job, the job name will be something like

    user.SYS_IMPORT_FULL_01
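If you are not sure of the job name, it can be looked up before attaching (a sketch; requires DBA privileges):

```sql
-- List current Data Pump jobs and their state
SELECT owner_name, job_name, state
FROM   dba_datapump_jobs;
```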

    Hope that helps - and good luck with your migration.

    Dean

  • Export to Excel, via a link on the page of action

    I have two questions.

    (1) How can I export to Excel via a link? For example, I have a form; a user clicks submit, and I cfoutput the data in a table on the same page (the action page is identical to the entry page). Now, once the data is output, I'd like a link on this page so that the user can click it and it will export the table to Excel.

    I tried:

        <cfheader name="Content-Disposition" value="attachment; filename=test.xls">
        <cfcontent type="application/msexcel">
        <cfoutput query="qTest">
        <table>
        <tr>
        <th>Account</th>
        <th>Amount</th>
        </tr>
        <tr>
        <td>#qTest.ACCOUNT#</td>
        <td>#NumberFormat('#qTest.TotalAmt#', "_(999,999,999.99)")#</td>
        </tr>
        </table>
        </cfoutput>

    But as soon as the page loads, it tries to export automatically. How can I make it export via a link instead?

    Also, someone suggested wrapping the cfoutput in a <cfsavecontent variable="xyz"> tag, but once I have that, how do I insert the variable 'xyz' into the <cfcontent> tag? Or where can I use it?

    Thank you guys

    The variable then contains the output table. Just put another #session.export# in the appropriate place in your logic and the table will be displayed on screen in that scenario.

  • New Microsoft Data Link.UDL just appeared on my desktop. What is it? Can I delete it?

    After I had been searching for information on the web, an icon appeared on my desktop named New Microsoft Data Link.UDL.

    What is it?

    Is this a form of 'spyware'?

    Can I delete it?

    It is a link used by an ActiveX Data Object to access a database.  It was probably put there so that you can access a database on a web server.  Just delete it and make sure to run a thorough AV/anti-malware scan on your machine.

    John

  • How can you save an Audition session to a different hard disk, when it was sent via Dynamic Link from Premiere?

    I am a Mac user. I am training a PC user to edit my audio.

    After editing sequences in Premiere Pro I send them via Dynamic Link to Audition to make the audio changes, then back to Premiere Pro.

    How can I save the project/session from my Mac onto a shared external hard drive so the PC user can edit it in Audition?

    I tried sending the Premiere Pro project via Dynamic Link to Audition, then opening the audio files in Audition, then, with the video showing in Audition, File > Export > Session with "save copies of associated files" checked, and chose the external hard drive that we share. No dice. Transferring to the other computer and opening the Audition session, it has no related video, and the audio files I had opened are not there. Essentially it is an empty session.

    Help!

    Hi... basically Dynamic Link is not what you want if you need to send the project to someone else for sound editing, as far as I can tell. I think the best option for you would be to select 'Export DV Preview Video' rather than 'Send via Dynamic Link' (under 'Edit in Audition'), which I think should export a low-ish resolution video plus the audio files and an XML for Audition.

    Hope it is helpful

  • physical data link error

    Hello

    I upgraded to OBIEE 11g and BI APPS 7.9.6.3. Today I merged the OOTB RPD with an upgraded version of the 7.9.6.2 RPD and ran a global consistency check.

    Most of the 24 errors are:

    "does not have a valid data type"

    "comparisons are carried out between incompatible types"

    I checked the LTS for these errors, and the LTS is not physically mapped to the physical source table. When I tried to map the physical source, it says "...no physical data link". I edited the LTS and it is not joined to any table?

    Make sure that the data types used in your query are all the same data types. You can't compare two different data types in a CASE statement.

  • Premiere will not open my project after I save in SpeedGrade via Direct Link

    Premiere will not open my project after I save in SpeedGrade via Direct Link.

    Last night, I installed all the updates of CC, everything went well.

    All day I was doing color correction in SpeedGrade on a Premiere Pro project via Direct Link.

    After I finished, I wanted to go back to Premiere to make an export (my manager was waiting for a sample), only when Premiere attempted to open the project I had worked on in SpeedGrade, it could not open!

    It said: "This project was saved in a newer version of Adobe Premiere Pro and cannot be opened in this version."

    Before panicking my waiting director, I saved every look I did in SpeedGrade as an individual LUT that I could manually apply in an earlier version of the Premiere Pro project. But it took a long time, and it may not be a long-term option.

    What can be the problem between saving in SpeedGrade and then going back to Premiere Pro?

    I have Premiere Pro CC 2015 v. 9.02 (6) and SpeedGrade CC 2015 v. 2015.1.

    I'm working on a MacBook Pro with 15" Retina display.

    Simple... you have SpeedGrade 2015.1 (9.2) and PrPro 2015.0.1 (9.0.1). They don't 'work' together.

    You must get your PrPro up to 9.1, the 2015.1 release. If your Adobe CC desktop application does not show you as 'eligible' for the PrPro upgrade, sign out of it, then back in... that often helps. If that doesn't work, you may need to use the Adobe CC Cleaner tool to remove the CC desktop application, then reinstall it and sign in again.

    https://helpx.Adobe.com/Creative-Suite/KB/CS5-cleaner-tool-installation-problems.html

    Neil

  • Connection verification failed for data source - mysql.jdbc.Driver

    I get this message when trying to add a MySQL (don't ask) database in CF Admin:

    Connection verification failed for data source: shownets
    java.sql.SQLException: No suitable driver available for shownets, please check the driver configuration in the resources file, driver error: com.mysql.jdbc.Driver

    The root cause was that: java.sql.SQLException: No suitable driver available for shownets, please check the driver configuration in the resources file, driver error: com.mysql.jdbc.Driver

    I tried all sorts of instructions from posts and nothing seems to work.  It's on my 32-bit platform; trying to get all the bugs out before I install the 64-bit version.

    Thanks

    It seems that you have recently installed ColdFusion. I say this because recent versions of ColdFusion no longer ship with a JDBC driver for MySQL. You must therefore download and install it yourself.

    To do this, go to the MySQL JDBC driver download page. Select "Platform Independent" as the platform. Download the platform-independent ZIP file mysql-connector-java - x.x.xx.zip. (x.x.xx represents the version number.)

    Extract the ZIP file. Open the resulting folder and look for the driver. It is a JAR file called mysql-connector-java - x.x.xx - bin.jar (x.x.xx represents the version number). Copy the file into ColdFusion's lib directory.

    Restart ColdFusion. Voila - you're ready to go.

  • Trying to download the software: via the link provided in the email, I get an error page. Then I tried via my account by clicking the 'Download' link on the 'My orders' screen; then I get a message saying "You have no downloads". I'm almost ready to fi

    Hi, I just bought Premiere Elements. I paid online via PayPal. I got the Adobe confirmation email a few hours later. I then tried to download the software: via the link provided in the email, I get an error page. Then I tried via my account by clicking the 'Download' link on the 'My orders' screen; then I get a message saying "You have no downloads". I'm almost ready to file a complaint with PayPal for fraudulent online sales. What should I do?

    Hi H D,

    Tell us the name of the product you have purchased.

    Please visit: http://www.adobe.com/downloads/other-downloads.html

    Let me know if it helps.

    Regards

    Megha Rawat

  • Change timeline from Premiere Pro via Dynamic Link in CS5 Encore?

    I'm sure this has been covered before, but I can't find

    anything on Dynamic Link in this forum.

    I created my video in Premiere and exported the sequence to Encore via Dynamic Link.

    Then I created all of my menus etc. in Encore, after which I noticed a few glitches that I wanted to change, so I went back to Premiere and edited in the changes.

    How do I bring these changes back into Encore? It didn't happen automatically as I hoped, and I don't want to have to recreate my menu structure in Encore.

    So, basically, how can I make a change in Premiere and have it reflected in my Encore project without creating a new project?

    Thank you

    Andy

    If you have transcoded the Pr sequence or already built the project, then you need to use File > Revert to Original in Encore to pick up changes in the Pr sequences.

    -Jeff

  • Connection verification failed for data source

    I installed mysql-connector-java - 3.1.14 - bin and I'm trying to set up a MySQL data source.


    Data and Services > Datasources > other

    CF Data Source name   mysqldb
    JDBC URL              jdbc:mysql://localhost:3306:test
    Driver class          com.mysql.jdbc.Driver
    Driver name
    Username              root
    Password


    Connection verification failed for data source: mysqldb
    java.sql.SQLException: Malformed URL 'jdbc:mysql://localhost:3306:test'.
    The root cause was that: java.sql.SQLException: Malformed URL 'jdbc:mysql://localhost:3306:test'.


    Looks like I did it wrong; it should be like this: jdbc:mysql://localhost:3306/test

  • Data link question

    Hi all

    I just wanted to know what the purpose is of a data link that does not bind the columns of the two queries.

    It only links the groups involved.

    Thank you.
    Allen

    Allen,

    Sometimes a link cannot be defined column by column, or the condition is dynamic (i.e. we don't know it before the report starts). In that case you can link group to group and, in addition, code the link condition into the detail query. So for the simple emp-dept example, the main query is SELECT deptno AS department, dname, loc FROM dept and the detail query is SELECT * FROM emp WHERE deptno = :department, with the groups linked. In this example that has the same effect as writing the second query without the condition and linking the columns... but with this method you are able to build more complex scenarios.

    Regards
    Rainer

  • ibrd fails via GPIB-ENET/100 with ELCK on every other attempt

    I use my own code (C/C++) on Linux which reads data from a Boonton Electronics 4300 RF power meter running in talk mode via a GPIB-ENET/100 box.  The first ibrd command returns data properly but the second fails, and this pattern continues thereafter.  Changing the amount of delay between the ibrd calls does not appear to have any effect.

    Ibsta shows ERR, TIMO, CMPL, and the error seems to be ELCK whenever it fails.  There is only one instrument connected to the ENET/100 box and one copy of my program interacting with it.  My program is multi-threaded, but I hold a mutex lock on the GPIB in the thread doing the instrument read.

    I connected a GPIB analyzer to the ENET/100 box, and I see good handshakes and data transfers from the instrument to the listener at the same rate at which I get good data in the application.

    Using a packet sniffer, I see what I guess is the error response coming back from the ENET/100 box on every other ibrd attempt.

    Does anyone have an idea what the problem could be?  I would try to run the power meter in another mode where I could ibwrt a query and get a single response per ibrd, but I don't think that mode exists.  Thank you.

    Jim

    So I think I have figured out a solution to my problem... maybe someone could explain why this works?

    If I execute an ibtrg command before each ibrd, I never receive any errors.  What confuses me is that I thought the talk mode supported by the instrument does not require an external trigger, but apparently it does... half the time.

    BTW, I am using the older NI-488 drivers for the situation above.  I also tried using ibnotify callbacks with 488.2, without success.

    Jim
