exp/imp without data creates huge empty db!

Hello
I have an 8.1.7.4 database on Windows 2000, about 400 GB in size.
I wanted to exp/imp it, but I don't want the data, just the structure and code.

So I did

exp x/y@db FILE=D:\exp_database.dmp ROWS=N CONSISTENT=Y FULL=Y CONSTRAINTS=Y GRANTS=Y INDEXES=Y

imp x/y@db2 FILE=D:\exp_database.dmp ROWS=N FULL=Y CONSTRAINTS=Y GRANTS=Y INDEXES=Y FEEDBACK=1000


The 400 GB database exported to a 7 MB file, which is good.
But when I import that 7 MB file, it creates an EMPTY 160 GB database!
I see that some of the data files are at full size, even though they are empty!
There is no reason for this; does the structure really take 40% of the database?




Thank you

The 'COMPRESS' export parameter determines how the CREATE TABLE statements are generated.

By default, COMPRESS=Y, meaning that each CREATE TABLE statement is generated with an INITIAL extent as large as the table's allocated size (all extents of the table combined), regardless of whether any rows exist in the table, are exported, or are imported.

You should export with COMPRESS=N, which generates CREATE TABLE statements with an INITIAL as large as the INITIAL reported for the exported table. However, if the source table is defined with an INITIAL of, say, 100M, then again, regardless of the number of rows present/exported/imported, the import will CREATE TABLE with an INITIAL of 100M. In such cases, you need to generate the CREATE TABLE statements yourself and create the tables manually.
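
As an illustration, here is a minimal sketch of the same export/import pair with COMPRESS=N, reusing the poster's placeholder user, database, and file names:

exp x/y@db FILE=D:\exp_database.dmp ROWS=N COMPRESS=N FULL=Y CONSTRAINTS=Y GRANTS=Y INDEXES=Y
imp x/y@db2 FILE=D:\exp_database.dmp ROWS=N FULL=Y IGNORE=Y FEEDBACK=1000

Note that COMPRESS=N only stops exp from folding all allocated extents into one large INITIAL; it cannot shrink a table whose definition already specifies a large INITIAL.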

Hemant K Chitale
http://hemantoracledba.blogspot.com

Edited by: Hemant K Chitale on July 29, 2010 14:00

Tags: Database

Similar Questions

  • How do I EXP/IMP only the data? 9i only, not using Data Pump

    Does anyone know how to exp or imp only the data in the tables?

    I have a dump of database user AA; it is rather small.
    All my data gets reset to defaults by running scripts, and I want to restore it to the point of the exp backup.

    But if I imp, the tables, views, and triggers are all already there.

    I chose "" ignore the imp aa/aa leader = aa.dmp = y ', it doesn't import paritial of discharge "

    In any case, can I choose to imp data only, as impdp can, or exp only the data?


    I can't drop the user, since it is already connected from many machines.

    How am I supposed to manage this easily?

    Thank you, really


    BR/ricky

    With the traditional exp and imp utilities there is no option to export only the data. The DDL for the exported objects is always exported. However, you can choose to export only the DDL without data using the ROWS=N parameter.

    You say that your import is only a partial import. How so? What was missing? And what were your exact import settings? Your command is missing either FULL=Y or the FROMUSER=/TOUSER= parameters that I would expect.

    If you truncate all existing objects and then run an import job with IGNORE=Y on a dmp file created as an export of the user, then object creation fails for all pre-existing objects, but the data will be loaded. If you are unable to truncate the target tables first, then the import will be slow because it will hit a lot of unique constraint violation errors. Another approach would be to drop the user and all objects (or just all of the user's objects) prior to the import and let the import re-create the user and the user's objects.

    If you use a dmp file from a full export, then you should do a FROMUSER=/TOUSER= import.
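
    For illustration, a minimal sketch of that kind of import (the DBA account and schema names are hypothetical):

    imp system/manager FILE=aa.dmp FROMUSER=aa TOUSER=aa IGNORE=Y

    With IGNORE=Y, the "object already exists" errors on the pre-existing tables are ignored and the rows are still loaded into them.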

    HTH - Mark D Powell.

  • Data Pump options compared to exp/imp

    I am using Data Pump for the first time, and I am exporting one table to restore it in another instance where it was accidentally truncated. I don't see the options in Data Pump that I'm used to from exp/imp. For example, to ignore create errors if the table structure already exists (IGNORE=) or to not load the indexes (INDEXES=). The Oracle material I have been reading so far doesn't even talk about these issues or how Data Pump handles them. Please let me know if you have experience with this.
    I ran Data Pump to export the table with data only and with metadata. I couldn't read the metadata: I expected nice readable CREATE statements, but it is binary with XML-like statements, and it's not yet clear how I could use it. Now I am trying to take my data-only export and load the data into the 2nd instance. The table already exists but is truncated, and I don't know how it will handle the indexes. Please bring me up to date if you can.

    Thank you
    Tony

    Hello

    You should read the Oracle documentation; it has very good examples.

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_overview.htm#sthref13

    How Data Pump Export parameters map to those of the original Export utility:

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14215/dp_export.htm#sthref181
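
    For what it's worth, a rough sketch of the Data Pump equivalents of the two options you mention (directory, dump file, and table names are hypothetical):

    impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=tab.dmp TABLES=emp TABLE_EXISTS_ACTION=APPEND EXCLUDE=INDEX

    TABLE_EXISTS_ACTION=APPEND (or TRUNCATE) plays roughly the role of the old IGNORE=Y, and EXCLUDE=INDEX that of INDEXES=N.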

    Regards

  • External table is created without data

    Hey, guys:

    Please help me with this problem: I tried to load some data from a bunch of CSV files on a Linux server into an external table. However, the table is created without data. There is no warning message, but when I check the CSV with cat, there is data. This is the query:
    create table worcs.ACBRD_0050_EXT(
    CODE VARCHAR2(4),
    POL_NBR VARCHAR2(8),
    CENT VARCHAR2(2),
    YR VARCHAR2(2),
    SEQ VARCHAR2(1),
    CLAIM_NBR VARCHAR2(4),
    SORT_INIT VARCHAR2(2),
    SORT_SEQ VARCHAR2(2),
    ENTER_CC_50 VARCHAR2(2),
    ENTER_YY_50 VARCHAR2(2),
    ENTER_MM_50 VARCHAR2(2),
    ENTER_DD_50 VARCHAR2(2),
    PREM_DUE_50 NUMBER(11,2),
    POL_STS_50 VARCHAR2(1),
    POL_AUDT_TYPE_50 VARCHAR2(1),
    CHANGE_50 VARCHAR2(1),
    REV_AUD_DED_50 VARCHAR2(1),
    AUDIT_ID_50 VARCHAR2(8),
    BILL_CC_50 VARCHAR2(2),
    BILL_YY_50 VARCHAR2(2),
    BILL_MM_50 VARCHAR2(2),
    BILL_DD_50 VARCHAR2(2)
    )
    organization external ( 
    default directory ksds
    access parameters
     ( records delimited by newline 
      badfile xtern_log_dir: 'xtern_acbrd_0050.bad'
     logfile xtern_log_dir:'xtern_acbrd_0050.log'
      discardfile xtern_log_dir:'xtern_acbrd_0050.dsc'
      ) location ('acbrd-0050.csv') ) REJECT LIMIT unlimited 
    ;
    And on Linux, it says:
    [oracle@VM-OracleBI ksds]$ cat acbrd-0050.csv
    0050|00508081|1|11|1|    |  |  |1|11|10|31| 000001638.00|L|C|Y|A|CONF    | |  |  |  |
    0050|01803167|1|10|1|    |  |  |1|11|10|27| 000000896.00|L|C|Y|A|CONF    | |  |  |  |
    [oracle@VM-OracleBI ksds]$

    Change your CREATE TABLE to:

    POL_NBR VARCHAR2(8),
    CENT VARCHAR2(8),
    YR VARCHAR2(2),
    SEQ VARCHAR2(2),
    CLAIM_NBR VARCHAR2(4),
    SORT_INIT VARCHAR2(2),
    SORT_SEQ VARCHAR2(2),
    IND_0115 VARCHAR2(2),
    CODE VARCHAR2(4)

    then you will get it. If you had looked in your LOG file, as I mentioned earlier, you would have found a load of ORA-12899: value too large for column errors.
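
    A side observation: the DDL quoted in the question contains no field delimiter clause even though the data is pipe-separated. If that clause is also missing from your real DDL, a sketch of the access parameters with an explicit delimiter (same directories as above) would be:

    access parameters
    ( records delimited by newline
      badfile xtern_log_dir:'xtern_acbrd_0050.bad'
      logfile xtern_log_dir:'xtern_acbrd_0050.log'
      discardfile xtern_log_dir:'xtern_acbrd_0050.dsc'
      fields terminated by '|'
      missing field values are null
    )

    Without a FIELDS clause, ORACLE_LOADER's default comma-delimited parsing would not match this data, so whole lines can end up in the first column, which is another way to collect ORA-12899 rejections in the log.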

  • How to move an alarm rule definition from the datastore level to the ESX level without re-creating the rule?

    Hi, I would like to move my alarm rule definition from the datastore level to the ESX level, without re-creating the rule. Is this possible?

    In fact, I don't want any alarm definition at the datastore level. So what is the best practice?



    Best regards

    It is currently not possible.  The rules are hierarchical, and there is no way to stop the inheritance, or move messages.  You will have to re-create them at the appropriate level.

    -KjB

    VMware vExpert

  • Does Data Pump really replace exp/imp?

    Hi guys,

    I've read some people saying that we should use Data Pump instead of exp/imp. But as far as I can see, if I have a database behind a firewall somewhere else, can't connect to that database directly, and need to get some data across, then Data Pump is useless to me and I can just use exp and imp for the data.

    OracleGuy777 wrote:
    my problem cannot be solved by Data Pump. Yet people are saying exp and imp will become obsolete and no longer supported. Will Oracle do anything for people who have the same problem as me?


    The sky is not falling. It will not fall for a few years yet. Deprecated != unsupported. Deprecated = not recommended.

    The time will come when exp/imp is no longer supported. But I don't think that will happen for a few more database versions, particularly because Data Pump does not yet cover all the problem areas.

  • Schema-wise exp/imp or full database exp/imp: which is better for a cross-platform 10g database migration?

    Hello

    When performing a cross-platform database migration (big-endian to little-endian) on 10g, which should be preferred: a full database export/import, or can it be done schema by schema using exp/imp?

    The database is about 3 TB, and it is an Oracle EBS server with many custom schemas in it.

    Your suggestions are welcome.

    For EBS, export/import of individual schemas is not supported because of the dependencies between the different EBS schemas; only a full export/import is supported.

    What are the exact versions of EBS and database?

  • Data Pump - export without data

    To export the database without data, the old exp tool had the ROWS parameter, which was set to N. How do I export the database schema without data using Data Pump technology?

    You can see it by checking the Data Pump export help on your command line, like this:

    C:\Documents and Settings\nupneja>expdp -help
    
    Export: Release 10.2.0.1.0 - Production on Friday, 09 April, 2010 18:06:09
    
    Copyright (c) 2003, 2005, Oracle.  All rights reserved.
    
    The Data Pump export utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
    
       Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    You can control how Export runs by entering the 'expdp' command followed
    by various parameters. To specify parameters, you use keywords:
    
       Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
       Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
                   or TABLES=(T1:P1,T1:P2), if T1 is partitioned table
    
    USERID must be the first parameter on the command line.
    
    Keyword               Description (Default)
    ------------------------------------------------------------------------------
    ATTACH                Attach to existing job, e.g. ATTACH [=job name].
    COMPRESSION           Reduce size of dumpfile contents where valid
                          keyword values are: (METADATA_ONLY) and NONE.
    *CONTENT*               Specifies data to unload where the valid keywords are:
                          (ALL), DATA_ONLY, and METADATA_ONLY.
    DIRECTORY             Directory object to be used for dumpfiles and logfiles.
    DUMPFILE              List of destination dump files (expdat.dmp),
                          e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
    ESTIMATE              Calculate job estimates where the valid keywords are:
                          (BLOCKS) and STATISTICS.
    ESTIMATE_ONLY         Calculate job estimates without performing the export.
    EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
    FILESIZE              Specify the size of each dumpfile in units of bytes.
    FLASHBACK_SCN         SCN used to set session snapshot back to.
    FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
    FULL                  Export entire database (N).
    HELP                  Display Help messages (N).
    INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
    JOB_NAME              Name of export job to create.
    LOGFILE               Log file name (export.log).
    NETWORK_LINK          Name of remote database link to the source system.
    NOLOGFILE             Do not write logfile (N).
    PARALLEL              Change the number of active workers for current job.
    PARFILE               Specify parameter file.
    QUERY                 Predicate clause used to export a subset of a table.
    SAMPLE                Percentage of data to be exported;
    SCHEMAS               List of schemas to export (login schema).
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
    TABLES                Identifies a list of tables to export - one schema only.
    TABLESPACES           Identifies a list of tablespaces to export.
    TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
    TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
    VERSION               Version of objects to export where valid keywords are:
                          (COMPATIBLE), LATEST, or any valid database version.
    
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed
    
    Command               Description
    ------------------------------------------------------------------------------
    ADD_FILE              Add dumpfile to dumpfile set.
    CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
    EXIT_CLIENT           Quit client session and leave job running.
    FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
    HELP                  Summarize interactive commands.
    KILL_JOB              Detach and delete job.
    PARALLEL              Change the number of active workers for current job.
                          PARALLEL=.
    START_JOB             Start/resume current job.
    STATUS                Frequency (secs) job status is to be monitored where
                          the default (0) will show new status when available.
                          STATUS[=interval]
    STOP_JOB              Orderly shutdown of job execution and exits the client.
                          STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                          Data Pump job.
    
    C:\Documents and Settings\nupneja>
    

    Setting the CONTENT parameter to METADATA_ONLY will export only the structure of the schema and skip the rows.
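
    For example, a minimal sketch (the schema, directory, and dump file names are just illustrative):

    expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott_meta.dmp SCHEMAS=scott CONTENT=METADATA_ONLY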

  • 10g Full exp/imp using the sys user

    Hello

    I plan to do a full export of data from one database into a different database created on another server/platform.
    The Oracle version/patch level is the same: 10.1.0.5.

    I also plan to use exp/imp instead of Data Pump, as I'm not sure how stable Data Pump is in this version.

    I want to export and import as "/ as sysdba" because it avoids granting SYS privileges to the database application users.

    exp file=filename.dmp log=filename.log full=y
    imp full=y log=filename.log ignore=y file=filename.dmp

    ignore=y, because the tablespaces will be created in advance, due to the different file system layout.

    Is the above method OK, or will it cause me problems?

    Thank you

    Hello

    exp/imp does not touch the SYS and SYSTEM accounts; they are not affected at all.

    Regards

  • How to make a basic HTTP connection over just wifi, without a data plan?

    Ok. Here's the situation: I have developed an application that makes a basic HTTP connection (Connector.open(url)). It works fine on the simulator, of course, since the simulator uses MDS. I'm targeting OS 5 and above. Now I have started testing on the actual device (a Curve 9300). The real problem is how to get by with just a default connection: I mean a generic one that doesn't care what type of transport the phone has and simply uses the default. As you know, the 9300 has wifi, so hopefully I should be able to connect using my wifi at home. On the phone, I can establish the wifi connection fine. The phone's browser does not get through, though. It complains: (Error! wifi detected... We are not able to authenticate you as you are not currently a Rogers... Please turn off your wifi connection and launch your browser to try again). Well, if it means that I'm not on Rogers, that is not correct, as my basic phone package is with Rogers and its SIM is in the phone. I tried its suggestion and of course it does not work!

    What encourages me that it should work in principle is that I have two phone applications (Opera Mini and Google Maps) that are able to go online and make the connection.

    I tried to use a combination of parameters attached to the URL, as follows:

    1. just the default, without any parameter

    2. deviceside=true

    3. deviceside=true;interface=wifi

    I also experimented with different options under Options > TCP APN:

    1. disabling the APN setting

    2. enabling the APN setting with internet.com, with an empty username and password

    3. enabling the APN setting with internet.com, with wapuser1 and wap for the username and password.

    To "manage connections", I have the following question similarly checked:

    - Rogers Mobile Network...

    - Dlink_RezaNetwork Wi-Fi...

    - Bluetooth is disabled

    I wonder what exactly that Mobile Network connection is? I guess that's what my basic phone plan puts online. I would be happy if my app went through there rather than the wifi connection, if that is indeed what Opera Mini and Google Maps use. But no, because even if I disable that first connection (i.e. Rogers Mobile Network...) I am still able to use those two apps without problem! That makes me happy, because it definitely says I should be able to go online using only my wifi, without a data plan!

    In the connection icons at the top right of the phone, I see the following:

    Rogers - dlink.RezaNetwork 3G (signal strength icon)

    WiFi

    As you can see, my phone is connected to the wifi in my house.

    Anyway, after all this detail, does anyone know if there is a solution to this problem? I mean, Google Maps and Opera Mini are able to go online without problem.

    With full respect, please don't refer me to read this or that doc; I did that before coming to ask questions. But if there is a doc that you think really addresses the above concerns more precisely, I'll gladly read it.

    I use Eclipse for development, and everything runs fine on the simulator.

    This is the result I get when I run the app through Eclipse on the device:

    (1) For the case when only a wifi connection is enabled on the device and Connector.open(url) is called with no additional parameters, I get the error: net.rim.device.internal.io.CriticalIOException: required radio isn't active.

    (2) For the case with the wifi interface parameter, defined as follows:

    String connStr = _theUrl + ";interface=wifi";
    StreamConnection s = (StreamConnection) Connector.open(connStr);
    HttpConnection httpConn = (HttpConnection) s;
    InputStream input = s.openInputStream();

    There is no error reported, even though I catch all the exceptions from open(). But the stream returned is always empty!

    Thanks again.

    Just an update:

    The problem turned out to be redirects. That was a subject, I admit, that I was not very familiar with. So once I read up on HTTP status responses, the solution was easy. It is also possible to simply use the wifi connection for HTTP connections in code. The built-in browser is another story, and it depends on how the connection preferences are managed internally by the OS and the carrier, I suppose, but launching connections in code works very well with just wifi.

  • Move the schema objects to a different database without data

    Hi all

    I have to migrate the schema objects from one database to another, without data. Please help.

    Kind regards

    Cherkaoui

    Hello

    Please read the doc; there is an option:

    CONTENT={ALL | DATA_ONLY | METADATA_ONLY}

    ALL loads all data and metadata contained in the source. It is the default value.

    DATA_ONLY loads only table row data into existing tables; no database objects are created.

    METADATA_ONLY loads only database object definitions; no table row data is loaded.

    So in your case, you can use CONTENT=METADATA_ONLY.
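
    A minimal sketch of both sides (the schema, directory, and file names are hypothetical, and a directory object is assumed to exist in each database):

    expdp hr/hr SCHEMAS=hr DIRECTORY=dmpdir DUMPFILE=hr_meta.dmp CONTENT=METADATA_ONLY
    impdp hr/hr DIRECTORY=dmpdir DUMPFILE=hr_meta.dmp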

    HTH

  • Transportable tablespace vs Data Pump exp/imp

    Hi guys,
    10.2.0.5

    I will need your advice here before running some tests on this subject.

    I have 2 databases.
    One of them is the production database (the main site) and the other is an mview site (read-only mviews).
    I need to migrate both of them from HP-UX to Solaris (different endianness).

    For the production database, which all the mview logs and master tables belong to, we can use transportable tablespaces. I assume transportable tablespaces should be able to transport the mview logs as well? Can you indicate what types of objects TTS does not migrate?

    For the mview site, it seems that transportable tablespaces cannot migrate the mviews over. Therefore, the only option is Data Pump exp/imp.

    Any suggestions for this scenario?

    Thank you!

    With TTS, all objects stored in the transported tablespaces are migrated.

    See if this is useful:
    Transportable Tablespace (TTS) Restrictions and Limitations: Details, Reference and Version Where Lifted [ID 1454872.1]
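
    As a quick pre-check before committing to TTS, you can run the self-containment check on the candidate tablespace set (a sketch; the tablespace names are hypothetical):

    EXEC DBMS_TTS.TRANSPORT_SET_CHECK('USERS,DATA_TS', TRUE);
    SELECT * FROM TRANSPORT_SET_VIOLATIONS;

    Anything that would prevent the set from being transported shows up as rows in TRANSPORT_SET_VIOLATIONS.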

  • exp/imp support question

    Hello all.
    I have seen in some other forums that the exp and imp tools will be dropped from the product. Does anyone know if this is true and when it will happen?
    Thank you very much.

    They have not been dropped yet; you can still use exp/imp in version 11gR2.

    Since Oracle 10g there has been a better tool called "Data Pump" for exports and imports.

  • Premiere Pro CS6: no 'Date Created' data

    I have searched the background on this in Adobe and public forums without seeing a similar discussion. I have a folder of .mts clips shot in AVCHD on a Sony HXR-MC2000. I can import or drag-and-drop clips and they come into my sequence fine; obviously a recent update I installed has tackled the problem of no audio importing with the clip, thank you, Adobe!  But for the life of me, I can't get the "Date Created" to come in with the clip info. I have that box checked in the PP6 panel, but that data does not import.

    Any ideas?  I am running CS6 6.0.1 on an iMac running OS X 10.8.1 Mountain Lion

    Hi Miltko,

    Using a Sony SLT-A37, I can't get PrPro to include creation or modification dates in the project file data either. So this is an even more general shortcoming. I would like to hear from Nikon and Canon users.

    You can add this metadata in Premiere Pro CC 2014.2 as of now, via Metadata Display > Basic. In the next release of Premiere Pro, the metadata will be sortable in the Project panel (it is not currently). I tested it and it works fine.

    I also tested Premiere Pro CS6, and the "Date Modified" and "Creation Date" metadata are sortable in that version with some standard .mov files I have on hand right now.

    If I understand correctly, all the metadata in the Project panel will be sortable in the next version. This is one of those unsung updates that those of you on this thread will enjoy.

    Tip:

    To ensure that the metadata you need is included in Premiere Pro, please follow the standard advice: copy your entire SD card contents to your computer's hard drive, and then import the files using the Media Browser. This way your audio, timecode, creation dates, and modification dates will be imported.

    If you have followed the standard procedures and you still do not see the metadata, file a bug report here: http://adobe.ly/ReportBug

    Premiere Pro CS6 gives users the Creation Date and Modification Date metadata for sorting by date, if your device is supported for use with CS6.

    Thank you

    Kevin

  • Oracle 11gR2: how to exp/imp

    Hello. I use Oracle version 11.2. I'm trying to take a logical backup with "EXP" in the Linux environment, but it failed for empty tables...

    Can you tell me how to take this backup including the empty tables?

    Thanks in advance

    Hello

    I'm trying to take a logical backup with "EXP" in the Linux environment, but it failed for empty tables

    This is due to a new feature of 11.2. By default, the DEFERRED_SEGMENT_CREATION parameter is set to TRUE, so that segments are only created when the first row is inserted:

    http://download.Oracle.com/docs/CD/E14072_01/server.112/e10820/initparams069.htm
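
    If you must stay with the original exp, one commonly used workaround (a sketch; the table name is hypothetical, and this requires 11.2.0.2 or later) is to force segment creation for each empty table first, so that exp can see it:

    ALTER TABLE my_empty_table ALLOCATE EXTENT;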

    Otherwise, please use DATAPUMP Export/Import instead of the original Export/Import; DATAPUMP handles this new feature better. You will find an example below:

    http://asktom.Oracle.com/pls/Apex/f?p=100:11:0:P11_QUESTION_ID:3014631100346711923

    You'll find tips for getting started with DATAPUMP at the following link:

    http://www.Oracle-base.com/articles/10G/OracleDataPump10g.php

    Hope this helps.
    Best regards
    Jean Valentine
