Export/import of Data Mining activities

Hello

Is it possible to export/import a data mining ACTIVITY (created via the Oracle Data Miner front end) from one schema/database to another schema/database?

The PL/SQL code generation method and dbms_data_mining.export_model/import_model only cover the models themselves, not the activities.

I need to transfer a subset of the mining activities from one database to another. Using a full schema export/import is not possible because I want to move only some of the activities.

Help?

Regards

KS

Hello
Sorry, but you cannot export mining activities.
Generating code from an activity is a deployment option (to move the process into an application of some form).
Recreating the mining activities in the new DB would be my recommendation.
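
For the models themselves (not the activities), the export/import routines mentioned above can move individual models between schemas; a minimal sketch, with placeholder file, directory, model, and schema names:

    BEGIN
      -- writes a Data Pump style dump file (e.g. mining_models01.dmp) to the directory object
      DBMS_DATA_MINING.EXPORT_MODEL(
        filename     => 'mining_models',
        directory    => 'DATA_PUMP_DIR',
        model_filter => 'name = ''MY_MODEL''');
    END;
    /

    -- on the target database, after copying the dump file into its directory object
    BEGIN
      DBMS_DATA_MINING.IMPORT_MODEL(
        filename     => 'mining_models01.dmp',
        directory    => 'DATA_PUMP_DIR',
        model_filter => 'name = ''MY_MODEL''',
        schema_remap => 'SOURCE_SCHEMA:TARGET_SCHEMA');
    END;
    /
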
Thank you, Mark

Tags: Business Intelligence

Similar Questions

  • Full export and import and the data file locations

    Hi guys,

    can someone please confirm...

    If I export the whole database, and my tablespace X uses the data file /loc1/test1.dbf, while in the database I am now importing into that data file is at /loc2/test1.dbf, it doesn't make any difference and the data gets imported correctly, yes?

    No, it makes no difference and the data is imported correctly.

  • Import ONLY the DATA without firing the triggers

    Hi, I'm on 10.2.0.4 on Windows 2008. I did an export (EXPDP) of a user's data only, and I want to import (IMPDP) the data back into the user with the TRUNCATE option.

    Everything looks OK until I see that the triggers on my tables are firing, because the import does INSERTs...

    Here are my settings:


    DUMPFILE = "DESTRUCTION_DATA.dmp"
    LOGFILE = "imp_DESTRUCTION_DATA.log"
    DIRECTORY = DATA_PUMP_DIR
    CONTENT = DATA_ONLY
    TABLE_EXISTS_ACTION = TRUNCATE
    JOB_NAME = 'xxxxxx'

    What is the best way to EXPORT and IMPORT only a user's data without everything being triggered?

    What I want to do is refresh my test database with production data. I don't want to DROP the user and re-create all of its objects.

    Edited by: Jpmill 2010-11-09 12:01

    As the destination tables already have triggers created, you must disable them manually before the impdp and re-enable them afterwards.

    To disable the triggers, simply run the output of the following query, connected as the user that owns the data:

    SELECT 'ALTER TRIGGER ' || trigger_name || ' DISABLE;'
    FROM user_triggers;
    

    Or do the same thing with pl/sql:

    BEGIN
      FOR i IN (SELECT trigger_name FROM user_triggers) LOOP
          EXECUTE IMMEDIATE 'ALTER TRIGGER ' || i.trigger_name || ' DISABLE';
      END LOOP;
    END;
    /
    

    To enable them again is almost the same; just change DISABLE to ENABLE.
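
    For example, the same loop sketched with ENABLE:

    BEGIN
      FOR i IN (SELECT trigger_name FROM user_triggers) LOOP
          EXECUTE IMMEDIATE 'ALTER TRIGGER ' || i.trigger_name || ' ENABLE';
      END LOOP;
    END;
    /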

    The steps are:

    1. Disable the triggers
    2. Run impdp
    3. Re-enable the triggers

    Regards

  • How can I import all my data from RockMelt to Firefox, which I like more? I can't find an option to import RockMelt data into Firefox, such as bookmarks and passwords.

    I want to import data like passwords, history, and bookmarks, and everything, from RockMelt to Firefox. I tried going to "import from other browsers", but I found no RockMelt entry, and it also would not include passwords and such. So please, if you can, tell me the best way to do this. I need this quickly, please.

    To all of you who have this problem: I solved it myself.

    Just download Google Chrome, then go to C:\Users\username\AppData\Local\RockMelt and copy all the user data from that folder. Then go to C:\Users\TOSHIBA\AppData\Local\Google\Chrome, delete the user data there, and paste in what you copied. Now everything that was in RockMelt is in Chrome. Then in Firefox, import from the other browser and choose Google Chrome!

    Note! If you can't find the AppData folder, it is hidden: in the files and folders settings, check the box that says show all hidden files.

    Good luck!

  • IP address on a Windows virtual machine disappears after importing from the data store.

    I'm having a problem with the IP on a Windows VM disappearing after mounting the snapshot to restore. Here's what I do:

    Mount the snapshot LUNs on the ESX host.

    I browse the data store to find the VM I want, right-click, and click Import on the vmx file.

    The virtual machine imports, and when I start it up I am asked whether I moved or copied the VM. I have tried both "moved" and "copied", but when I connect to the virtual machine it is missing all its IP info.

    Any way to avoid the VM losing its IP info when I import it to the host from a snapshot LUN?

    Hello

    Moving this to the Virtual Machine and Guest OS forum.

    The MAC of a virtual machine is based on the UUID/name of the virtual machine, so if you are not using a static MAC (one that VMware recognizes) it will change, and the IP address might also change if you are using DHCP. So I suggest using a static MAC in the VM configuration.

    Another way is to set up the guest operating system in the virtual machine with a static IP address instead of relying on DHCP.

    But if you are unable to remove the virtual machine or copy within the virtual machine, there may be something locking those files, which would be another running virtual machine. If so, you need to find that VM and kill it. In this case, 'vmware-cmd -L' and 'vmware-cmd -l' on the host in question can be useful for finding where the "hidden" VM is running. Ultimately you can use 'vm-support -x' to get the list of virtual machines the hypervisor considers running. Then use 'vm-support -Z' to suspend the offending virtual machine, which should unlock everything.

    Yes, these commands are run from the command line.

    However, before using these commands, determine whether you can find where the offending VM lives.

    Best regards

    Edward L. Haletky

    Communities Host, VMware vExpert,

    Author: VMware vSphere and Virtual Infrastructure Security; VMware ESX and ESXi in the Enterprise, 2nd Edition

    Podcast: The Virtualization Security Podcast; Resources: The Virtualization Library

  • Importing trigger definitions from the data dictionary

    Hi, I just installed Oracle SQL Data Modeler and am trying to import a schema definition. It imports all the elements (tables, views, stored procedures, functions and so on) except triggers.
    Please advise.

    Best regards
    Virgil

    Hi Virgil,

    Triggers associated with tables or views should be imported. If you expand the node for the table or view in the relevant physical model part of the browser tree, you should find the triggers there.

    System triggers are not currently imported.

    Note that there is a forum specifically for the SQL Developer Data Modeler: SQL Developer Data Modeler

    Kind regards
    David

  • Following the progress of a network data pump import?

    I'm on Oracle 10.2.0.4 (SunOS) and running a network data pump import of a list of tables in a schema.

    I see that the following views are available to track the data pump tasks:

    DBA_DATAPUMP_JOBS - lists the running data pump jobs and their status
    DBA_DATAPUMP_SESSIONS - lists the user sessions attached to each data pump job (which can be joined to V$SESSION)
    DBA_RESUMABLE - shows the work being imported and its status
    V$SESSION_LONGOPS - shows the total size of the import and the elapsed time for data pump jobs, if they run long enough

    What other options are available for monitoring the progress of network imports?
    Also, is it possible to see which table is currently being processed during a multi-table network import?

    That would have helped. :^)

    When you run a job, if you do not specify a job name, then one will be generated for you. In your case, I don't see a job name specified, so it seems one would have been generated. The generated name looks like:

    SYS_IMPORT_FULL_01 if doing a full import
    SYS_IMPORT_SCHEMA_01 if doing a schema import
    SYS_IMPORT_TABLE_01 if doing a table import
    SYS_IMPORT_TRANSPORTABLE_01 if you are using transportable tablespaces.

    The 01 is for the first job created. If there is a second job while this job is still running, it will be 02. The number can also be bumped up if a job fails and is not cleaned up. In your case, you are importing tables, so the default job name would be something like:

    SYS_IMPORT_TABLE_01, and let's say you ran this as the SYSTEM schema.

    In this case, you can run this command:

    impdp system/password attach=system.sys_import_table_01

    This will bring you to the Data Pump interactive prompt, where you can type in commands such as STATUS or STATUS=10, etc.
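
    If you are not sure which name was generated, a quick look at one of the views mentioned above lists the running jobs; a minimal sketch:

    SELECT owner_name, job_name, operation, job_mode, state
    FROM   dba_datapump_jobs;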

    Dean

  • Can we export all the data in a table to Excel file format in Oracle?

    Hello

    Is it possible to export all the data in a table to Excel file format?

    Thank you

    Hi Mohammed_82!

    Oracle itself has no tool to export to CSV or Excel format. You have to look at third-party tools such as Toad or TOra.

    Regards

  • Datapump export/import between higher and lower versions

    Versions involved : 10.2.0.4, 11.2.0.3
    Operating System  : AIX 6.1
    +++++++++++++++++++++++++++++++++
    Let's say I export a schema from an 11.2 database using 11.2 expdp.

    Can I import the dump above (11.2) into a 10.2 database using 10.2 impdp?

    No, impdp version 10.2 will not recognize the file:

    [oracle@taurus ~]$ impdp system/******** dumpfile=teste.dmp
    
    Import: Release 10.2.0.5.0 - 64bit Production on Thursday, 06 June, 2013 13:55:06
    
    Copyright (c) 2003, 2007, Oracle.  All rights reserved.
    
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-39142: incompatible version number 3.1 in dump file "/opt/oracle/product/Ora10gR2/rdbms/log/teste.dmp"
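
    A common workaround, not shown in this reply, is to re-run the export on the 11.2 side with the VERSION parameter set to the target release, so the dump file is written in a format that the 10.2 impdp can read; a sketch with placeholder credentials, schema, and file names:

    expdp system/<pwd> schemas=TESTE version=10.2 directory=DATA_PUMP_DIR dumpfile=teste_v102.dmp logfile=exp_teste_v102.log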
    
  • Error exporting/importing an application with APEX 4.1 on Oracle 11gR2 XE

    We have an APEX application that has been migrated several times, from APEX 3 through 3.2 to 4.1.
    There is no problem exporting/importing the application when we use Oracle 11gR2 (SE or EE).

    A few days ago we put the application on an 11gR2 XE database. There were no significant problems running it.

    Now we have changed something and wanted to export the application from the XE database and import it back into a different database.
    Importing the application (no matter whether into the originating XE database or any other edition), we get an error:

    p_region_id => 83466451 p_form_element_id => null,

    ERROR at line 7:
    ORA-06550: line 7, column 26:
    PLS-00103: Encountered the symbol "P_FORM_ELEMENT_ID" when expecting one of the
    following:
    ) , * & = - + < / > at in is mod remainder not rem
    <an exponent (**)> <> or != or ~= >= <= <> and or like like2
    like4 likec between || multiset member submultiset
    The symbol "," was substituted for "P_FORM_ELEMENT_ID" to continue.

    Looking at the source, I found:

    declare
      s varchar2 (32767) := null;
    begin
      s := null;
      wwv_flow_api.create_report_columns (
        p_id => 83576346930344697 + wwv_flow_api.g_id_offset,
        p_region_id => 83466451 # p_form_element_id => null,
        p_column_alias => 'APP_USER2',
        p_column_display_sequence => 2,
        p_column_heading => 'Benutzer',
        ...

    The code is missing something at the place where I put the #.

    Does anyone have an idea what is wrong with the export?

    We use APEX 4.1.0 on Oracle 11gR2 XE on openSUSE 12.1. The export file is about 2 MB.

    Thanks for your help
    Tilman

    If you are still able to access the original (where you got the export)

    try to export on the command-line as well... http://ruepprich.wordpress.com/2011/07/15/Exporting-an-apex-application-via-Command-Line/

    Kind regards
    Richard
    -----
    blog: http://blog.warp11.nl
    Twitter: @rhjmartens
    If you answer this question, please mark the thread as answered and award points where earned...

  • Why don't pictures save when importing via the card reader?

    I just started using a card reader to import my photos on my Mac, but I realize that the pictures I imported have not been saved to the hard drive. If I remove the card and try to edit a photo from a previous import, it says that it cannot find the file. What is happening here?

    Cool. Thanks a million. One of those things that you would not be able to know without being told precisely. ;)

    Richard

  • Data Modeler: importing table comments from the data dictionary

    Hi, I have only just started using SQL Data Modeler and seem to have a problem with reverse-engineering table comments.

    When using Import > Data Dictionary, I have no trouble importing column comments, but the table comments always seem to come up empty. I see no obvious option for including/excluding table or column comments, so I am puzzled as to why column comments import fine but table comments do not (yes, the tables I'm importing have comments on both the tables and the columns).

    What am I missing? Any help is appreciated.

    -Andy.

    Hi Andy,

    There is no option for this. They should be included, but unfortunately there is a bug and they are not.

    Philippe

  • How can I export all the data collected on the EchoSign Manage page (list of all signed documents) to an Excel file?

    Hello, rather than saving contracts as individual PDFs, I was wondering if there is a way I can export all of the data collected from the various contracts into one big file?  Thank you

    Hello Olivia47511068,

    If you have used a library template to send the documents for signature, you can export the values of the individual transactions by going to the Manage tab, scrolling down to the Library Templates section, single-clicking the template, and then on the right side clicking the History tab and selecting "Export data". It will download a CSV file with the form field data of all transactions made using that library template.

    -Usman

  • Exporting Chinese data from Oracle 8i to Oracle 12c

    Hi all

    I'll be very grateful if someone can give me some advice. I have been testing for several days to find the cause of a problem displaying Chinese characters.


    Oracle server version on the export side (Unix machine):

    Oracle8i Enterprise Edition Release 8.1.7.0.0

    SELECT DECODE(parameter, 'NLS_CHARACTERSET', 'CHARACTER SET',
                  'NLS_LANGUAGE', 'LANGUAGE',
                  'NLS_TERRITORY', 'TERRITORY') name,
           value
    FROM v$nls_parameters
    WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_LANGUAGE', 'NLS_TERRITORY');

    NAME           VALUE
    LANGUAGE       AMERICAN
    TERRITORY      AMERICA
    CHARACTER SET  US7ASCII

    set nls_lang=american_america.us7ascii on my machine

    I can correctly display traditional Chinese characters in sqlplus

    exp userAdm/pwd01@sampleDB FULL=Y FILE=sampleDB.dmp

    the export succeeds
    ---------------------------------------------

    Oracle server version on the import side:
    Oracle Database 12c Release 12.1.0.1.0 - 64bit

    I create a new database on the import server (Windows 7)

    NAME           VALUE
    LANGUAGE       AMERICAN
    TERRITORY      AMERICA
    CHARACTER SET  ZHT16MSWIN950  <- cannot set US7ASCII, because Oracle 12c does not offer that choice

    set nls_lang=american_america.us7ascii before import

    IMP userAdm/pwd01@sampleDB file = sampleDB.dmp log = sampleDB.log full = y

    the import succeeds

    However, when I set nls_lang = american_america.us7ascii in my machine

    I can't display traditional Chinese characters in sqlplus correctly

    I also tried the following language settings, always in vain:

    set nls_lang=TRADITIONAL CHINESE_TAIWAN.ZHT16MSWIN950
    set nls_lang=AMERICAN_AMERICA.AL16UTF16
    set nls_lang=american_america.ZHT16MSWIN950

    Can someone give me some advice?
    Thank you!

    V$NLS_PARAMETERS already shows the defined database character set, so consulting NLS_DATABASE_PARAMETERS will not tell us more.

    What you have is a typical pass-through configuration. The binary codes of traditional Chinese characters are stored in the US7ASCII database, but are not recognized as such. These characters were entered from Windows clients, which suggests that their real encoding is ZHT16MSWIN950.
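
    One way to verify what byte codes are really stored is to look at a few of the affected values with DUMP; a minimal sketch, with placeholder table and column names:

    SELECT chinese_col, DUMP(chinese_col, 1016) AS stored_bytes
    FROM   some_table
    WHERE  ROWNUM <= 5;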

    You cannot directly export/import from the US7ASCII 8i database into any other database that is not US7ASCII as well without losing the affected character codes.

    Your choices:

    1. Create the 12c database with the US7ASCII character set. You can do this by calling DBCA directly (not through the installer) and choosing the Advanced installation mode. Then you can uncheck the box "Show only recommended character sets", and US7ASCII will be available. You can also use the heavyweight manual procedure with the CREATE DATABASE statement. Once you have a US7ASCII 12c database, you can export and import without data loss, provided NLS_LANG is set to US7ASCII, and continue running your application as you are used to.

    This choice is conceptually the simplest, but you will continue running in an unsupported configuration, and you will certainly encounter problems sooner or later (usually sooner). I do not recommend this option.

    2. Create the new database and export/import, as in option 1 above. However, before you run the application, migrate the database character set to Unicode AL32UTF8 using the Database Migration Assistant for Unicode, which comes with Oracle Database 12c. Then your database will be fit for worldwide deployment. You may need to adapt or change your application to work properly with the AL32UTF8 database. This is the most complex, but the most recommended option.

    3. Create the new database and export/import, as in option 1 above. However, before running the application, migrate the database to the real character set of your data. This character set must be determined according to which applications/clients/platforms entered the data into the 8i database. You do the migration using the Database Migration Assistant for Unicode and the csrepair script. Then you should be able to run your application without change, just after setting NLS_LANG appropriately for your platform.

    Thank you

    Sergiusz

  • Data Pump export with the QUERY option

    Hi all

    My environment is IBM AIX, Oracle 10.2.0.4.0 database.

    I need to export a few sets of records from production using a query. The query joins several tables. Since we have BLOB data types, we export using Data Pump.

    We have lower environments, but they do not have the same set of data and tables, so I am not able to simulate the same query there. But I created a small table and mocked up the query.

    My command is:

    expdp system/<pwd>@orcl tables=dump.dump1 query=dump.dump1:"where num < 3" directory=DATA_PUMP_DIR dumpfile=exp_dp.dmp logfile=exp_dp.log

    The query in the command pulls two records. Running the command above, I get an 80 KB dump file.
    In the export log file I see:

    Total estimation using BLOCKS method: 64 KB
    exported DUMP.DUMP1  4.921 KB  2 rows

    My doubts are:
    (1) Is the command I am running correct?
    (2) The estimate said 64 KB, whereas it also says 4.921 KB were exported. But the dump file created is 80 KB. Was it exported correctly?
    (3) Given that I run this as SYSTEM, does it export any data apart from the 2 rows? We must send the dump file to another department, and we should not export any data other than the query output.
    (4) If I do not use tables=dump.dump1 in the command, will the export file be a big mess? I don't know which is right.

    Your answers will be most helpful.

    The short answer is "YES", it did the right thing.

    The long answer is:

    The query in the command pulls two records. Running the command above, I get an 80 KB dump file.
    In the export log file I see:

    Total estimation using BLOCKS method: 64 KB
    exported DUMP.DUMP1  4.921 KB  2 rows

    My doubts are:
    (1) Is the command I am running correct?

    Yes, as long as your query is correct. Data Pump will export only the rows that match the query.

    (2) The estimate said 64 KB, whereas it also says 4.921 KB were exported. But the dump file created is 80 KB. Was it exported correctly?

    The estimate is made using the full table. Since you did not specify otherwise, it used the BLOCKS estimation method: basically, how many blocks have been allocated to that table. In your case, I guess that was 80 KB.

    (3) Given that I run this as SYSTEM, does it export any data apart from the 2 rows? We must send the dump file to another department, and we should not export any data other than the query output.

    It will not export all of the data, but it is going to export metadata. It exports the table definition, all the indexes on it, all the statistics on the table and indexes, etc. This is why the dump file could be bigger. There is also a "master" table that describes the export job, which gets exported. It is used by export and import to find out what is in the dumpfile and where in the dumpfile those things are. It is not user data. This table needs to be exported and takes up space in the dumpfile.

    (4) If I do not use tables=dump.dump1 in the command, will the export file be a big mess? I don't know which is right.

    If you only want this table, then your export command is right. If you want to export more, then you need to change your export command. From what you say, it seems your command is correct.

    If you do not want any metadata exported, you can add:

    content = data_only

    to the command line. This will export only the data, and when the dumpfile is imported, the table must already exist.
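
    Combined with the command above, a data-only export might look like this (a sketch reusing the same placeholder names):

    expdp system/<pwd>@orcl tables=dump.dump1 query=dump.dump1:"where num < 3" content=data_only directory=DATA_PUMP_DIR dumpfile=exp_dp_data.dmp logfile=exp_dp_data.log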

    Dean
