Export specific tables, with or without data

Hello
I am writing a script to export some tables with their data and other tables WITHOUT data. I want a reproducible process (script, procedure, or function), since I don't want to use the SQL Developer wizard. Can I just run an EXP command from SQL Developer? I tried a simple export with the following command but get an error:

begin
exp test/test@prod tables=TMP_TEST FILE=C:\Oracle\test.dmp LOG=C:\Oracle\test.log ROWS=N;
end;
/

Any help is greatly appreciated.
Thank you
John

What you can call from PL/SQL is the DBMS_DATAPUMP package. Here's the documentation: http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_datpmp.htm
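
For a concrete starting point, here is a minimal DBMS_DATAPUMP sketch (my own example, not from the reply above; it assumes the directory object DATA_PUMP_DIR exists and reuses the table name TMP_TEST from your command - adjust the names to your environment). The DATA_FILTER call with INCLUDE_ROWS => 0 is the Data Pump equivalent of ROWS=N:

DECLARE
  h          NUMBER;
  job_state  VARCHAR2(30);
BEGIN
  -- table-mode export job
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');

  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'test.dmp',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
  DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'test.log',
                         directory => 'DATA_PUMP_DIR',
                         filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- export only the listed table(s)
  DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR',
                                value  => 'IN (''TMP_TEST'')');

  -- structure only, no rows (like ROWS=N); add table_name => '...' to
  -- restrict this filter to the tables whose data you want to skip
  DBMS_DATAPUMP.DATA_FILTER(handle => h, name => 'INCLUDE_ROWS', value => 0);

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/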

Tags: Database

Similar Questions

  • External table is created without data

    Hey, guys:

    Please help me with this problem: I tried to load data from a bunch of CSV files on a Linux server into external tables. However, the table is created without data. There is no warning message, but when I check the CSV with cat, there is data. This is the query:
    create table worcs.ACBRD_0050_EXT(
    CODE VARCHAR2(4),
    POL_NBR VARCHAR2(8),
    CENT VARCHAR2(2),
    YR VARCHAR2(2),
    SEQ VARCHAR2(1),
    CLAIM_NBR VARCHAR2(4),
    SORT_INIT VARCHAR2(2),
    SORT_SEQ VARCHAR2(2),
    ENTER_CC_50 VARCHAR2(2),
    ENTER_YY_50 VARCHAR2(2),
    ENTER_MM_50 VARCHAR2(2),
    ENTER_DD_50 VARCHAR2(2),
    PREM_DUE_50 NUMBER(11,2),
    POL_STS_50 VARCHAR2(1),
    POL_AUDT_TYPE_50 VARCHAR2(1),
    CHANGE_50 VARCHAR2(1),
    REV_AUD_DED_50 VARCHAR2(1),
    AUDIT_ID_50 VARCHAR2(8),
    BILL_CC_50 VARCHAR2(2),
    BILL_YY_50 VARCHAR2(2),
    BILL_MM_50 VARCHAR2(2),
    BILL_DD_50 VARCHAR2(2)
    )
    organization external ( 
    default directory ksds
    access parameters
     ( records delimited by newline 
      badfile xtern_log_dir: 'xtern_acbrd_0050.bad'
     logfile xtern_log_dir:'xtern_acbrd_0050.log'
      discardfile xtern_log_dir:'xtern_acbrd_0050.dsc'
      ) location ('acbrd-0050.csv') ) REJECT LIMIT unlimited 
    ;
    And on Linux, it shows:
    [oracle@VM-OracleBI ksds]$ cat acbrd-0050.csv
    0050|00508081|1|11|1|    |  |  |1|11|10|31| 000001638.00|L|C|Y|A|CONF    | |  |  |  |
    0050|01803167|1|10|1|    |  |  |1|11|10|27| 000000896.00|L|C|Y|A|CONF    | |  |  |  |
    [oracle@VM-OracleBI ksds]$

    Change your table create to

    POL_NBR VARCHAR2 (8),
    CENT VARCHAR2 (8),
    YR VARCHAR2 (2),
    SEQ VARCHAR2 (2),
    CLAIM_NBR VARCHAR2 (4),
    SORT_INIT VARCHAR2 (2),
    SORT_SEQ VARCHAR2 (2),
    IND_0115 VARCHAR2 (2),
    CODE VARCHAR2 (4)

    then you will get it. If you had looked in your LOG file as I mentioned earlier, you would have found a load of ORA-12899: value too large for column errors.
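
    A side note, not part of the reply above (an assumption based on the pipe-delimited sample rows): the DDL never declares a field terminator, so the access parameters would likely also need an explicit delimiter clause, along these lines:

     ( records delimited by newline
       badfile xtern_log_dir:'xtern_acbrd_0050.bad'
       logfile xtern_log_dir:'xtern_acbrd_0050.log'
       discardfile xtern_log_dir:'xtern_acbrd_0050.dsc'
       fields terminated by '|'
       missing field values are null
     )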

  • Unable to export specific tables in the schema.

    I created a schema DUSER and imported the data; now I want to export a few tables from this schema.
    The tables are (partyhdr, partyaddressdtl, partycontactdtl, partytdsdtl, partytdsexcludedtl, partycurrencydtl)


    D:\> EXP DUSER/LOG@ORCL FILE=20121221PARTY0228PM.DMP TABLES=PARTYHDR,PARTYADDRESSDTL,PARTYCONTACTDTL,PARTYTDSDTL,PARTYTDSEXCLUDEDTL,PARTYCURRENCYDTL;

    Export: Release 10.1.0.2.0 - Production on Fri Dec 21 14:30:21 2012
    Copyright (c) 1982, 2004, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Release 10.1.0.2.0 - Production
    Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
    About to export specified tables via Conventional Path ...
    . . exporting table PARTYHDR 19387 rows exported
    . . exporting table PARTYADDRESSDTL 20747 rows exported
    . . exporting table PARTYCONTACTDTL 226 rows exported
    . . exporting table PARTYTDSDTL 53 rows exported
    . . exporting table PARTYTDSEXCLUDEDTL 2 rows exported
    EXP-00011: DUSER.PARTYCURRENCYDTL; does not exist
    Export terminated successfully with warnings.

    But when I check, the table PARTYCURRENCYDTL does exist in the schema AND IT HAS RECORDS.

    SELECT OBJECT_TYPE FROM ALL_OBJECTS WHERE OWNER = 'DUSER' AND OBJECT_NAME = 'PARTYCURRENCYDTL';

    OBJECT_TYPE
    -------------------
    TABLE

    I can't understand why this particular table does not export.

    Kindly help.

    Please use the command below, without the semicolon:

    EXP DUSER/LOG@ORCL FILE=20121221PARTY0228PM.DMP TABLES=PARTYHDR,PARTYADDRESSDTL,PARTYCONTACTDTL,PARTYTDSDTL,PARTYTDSEXCLUDEDTL,PARTYCURRENCYDTL
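
    As a general tip (my own sketch, not from the reply above; the file names are made up): putting the options in a parameter file avoids most command-line punctuation problems, e.g. a file party_tables.par containing:

    FILE=20121221PARTY0228PM.DMP
    LOG=20121221PARTY0228PM.log
    TABLES=(PARTYHDR,PARTYADDRESSDTL,PARTYCONTACTDTL,PARTYTDSDTL,PARTYTDSEXCLUDEDTL,PARTYCURRENCYDTL)

    run with: EXP DUSER/LOG@ORCL PARFILE=party_tables.par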

  • Create table with overlapping parallel dates in individual columns

    I am trying to combine data from two different tables into a single table.

    The data in table 1 contains the locations of patients in a hospital, where each record represents a single location. Patients can be transferred several times between different beds, resulting in multiple records for a single visit.

    The data in table 2 contains the operative activity of the patient at the hospital, where each record represents either the OR or the recovery room stay. A patient may have multiple operations in a single visit.

    I would like to join/merge/mash up the data into a single table that holds the two sets side by side. In other words, unit dates on one side of the table and OR activity on the other. The difficulty is that the two sets of in/out dates overlap. I would like the final table to split the original records into new records where the overlaps do not coincide.

    Example:

    Original two events (one per table):

    > Unit event A - from 14:00 to 18:00

    > OR event B - from 16:00 to 17:00

    Results in 3 records (in the final table):

    > Event 1 - unit from 14:00 to 16:00, null OR dates

    > Event 2 - unit from 16:00 to 17:00, OR from 16:00 to 17:00

    > Event 3 - unit from 17:00 to 18:00, null OR dates

    Of course the overlaps can be more complex than the example above, and I will be adding a code to flag the 'ghost' transfers as well.

    In the code below, the first OR visit occurs during the first unit stay.

    Jason

    Oracle 10g

    [code]
    create table delme_Unit_dates
    ( id             varchar2(20)
    , unit_rcd_id    varchar2(20)
    , Unit_desc      varchar2(20)
    , Unit_in_code   char(1)
    , Unit_in_dttm   date
    , Unit_out_dttm  date
    , Unit_out_code  char(1));

    create table delme_or_dates
    ( id             varchar2(20)
    , OR_rcd_id      varchar2(20)
    , OR_desc        varchar2(20)
    , OR_in_code     char(1)
    , OR_in_dttm     date
    , OR_out_dttm    date
    , OR_out_code    char(1));

    create table delme_all_dates
    ( id             varchar2(20)
    , Unit_OR_id     varchar2(40)
    , Unit_rcd_id    varchar2(20)
    , Unit_desc      varchar2(20)
    , Unit_in_code   char(1)
    , Unit_in_dttm   date
    , Unit_out_dttm  date
    , Unit_out_code  char(1)
    , OR_rcd_id      varchar2(20)
    , OR_desc        varchar2(20)
    , OR_in_code     char(1)
    , OR_in_dttm     date
    , OR_out_dttm    date
    , OR_out_code    char(1));

    insert into delme_unit_dates values ('123456', 'U1111', 'Unit A', 'A', to_date('2013-04-29 5:02:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-09 1:06:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'B');
    insert into delme_unit_dates values ('123456', 'U1112', 'Unit A', 'B', to_date('2013-05-09 1:06:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-09 4:53:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'B');
    insert into delme_unit_dates values ('123456', 'U1113', 'Unit A', 'B', to_date('2013-05-09 4:53:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-10 10:52:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 't');
    insert into delme_unit_dates values ('123456', 'U1114', 'Unit B', 't', to_date('2013-05-10 10:52:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-11 11:30:00 AM', 'yyyy-mm-dd hh:mi:ss am'), 'B');
    insert into delme_unit_dates values ('123456', 'U1115', 'Unit B', 'B', to_date('2013-05-11 11:30:00 AM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-12 4:00:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'B');
    insert into delme_unit_dates values ('123456', 'U1116', 'Unit B', 'B', to_date('2013-05-12 4:00:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-16 2:14:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 't');
    insert into delme_unit_dates values ('123456', 'U1117', 'Unit Z', 't', to_date('2013-05-16 2:14:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-17 2:26:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'B');
    insert into delme_unit_dates values ('123456', 'U1118', 'Unit Z', 'B', to_date('2013-05-17 2:26:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-20 11:30:00 AM', 'yyyy-mm-dd hh:mi:ss am'), null);

    insert into delme_or_dates values ('123456', 'OR2221', 'or 1', 'O', to_date('2013-05-09 7:35:00 AM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-09 10:56:00 AM', 'yyyy-mm-dd hh:mi:ss am'), 'R');
    insert into delme_or_dates values ('123456', 'OR2222', 'or 5', 'R', to_date('2013-05-09 10:56:00 AM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-09 3:20:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'U');
    insert into delme_or_dates values ('123456', 'OR3331', 'or 2', 'O', to_date('2013-05-16 7:59:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-16 10:43:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'R');
    insert into delme_or_dates values ('123456', 'OR3332', 'or 8', 'R', to_date('2013-05-16 10:43:00 PM', 'yyyy-mm-dd hh:mi:ss am'), to_date('2013-05-17 11:20:00 PM', 'yyyy-mm-dd hh:mi:ss am'), 'U');

    commit;

    -- this is as far as we got
    select
        u.*
      , o.*
    from
        delme_Unit_dates  u
      , delme_OR_dates    o
    where
        u.id = o.id
    and u.UNIT_IN_DTTM  <= o.OR_IN_DTTM
    and u.UNIT_OUT_DTTM >= o.OR_IN_DTTM
    order by u.UNIT_IN_DTTM, o.OR_IN_DTTM
    ;
    [/code]

    Post edited by: Jason_S (changed a single date '2013-05-16 15:20' to '2013-05-09 15:20')

    Hi, Jason.

    Jason_S wrote:

    I edited one of the dates in the original post.

    Also, the inpatient unit and OR events are contiguous for a given patient (no overlaps and no gaps - after the data are cleaned).

    ...

    The sample data you posted has gaps in the OR data.  The solution below works correctly whether or not there are gaps in the two tables.

    WITH got_dttm AS
    (
        SELECT  unit_in_dttm   AS dttm
        FROM    delme_unit_dates
        UNION
        SELECT  unit_out_dttm  AS dttm
        FROM    delme_unit_dates
        UNION
        SELECT  or_in_dttm     AS dttm
        FROM    delme_or_dates
        UNION
        SELECT  or_out_dttm    AS dttm
        FROM    delme_or_dates
    )
    , all_periods AS
    (
        SELECT  dttm                              AS in_dttm
        ,       LEAD (dttm) OVER (ORDER BY dttm)  AS out_dttm
        FROM    got_dttm
    )
    SELECT    NVL (u.id, o.id)  AS id
    ,         u.unit_rcd_id
    ,         u.unit_desc
    ,         u.unit_in_code
    ,         p.in_dttm
    ,         p.out_dttm
    ,         o.or_rcd_id
    ,         o.or_desc
    FROM              all_periods       p
    LEFT OUTER JOIN   delme_unit_dates  u  ON   u.unit_in_dttm  <= p.in_dttm
                                           AND  u.unit_out_dttm >= p.out_dttm
    LEFT OUTER JOIN   delme_or_dates    o  ON   o.or_in_dttm    <= p.in_dttm
                                           AND  o.or_out_dttm   >= p.out_dttm
    WHERE     p.out_dttm  IS NOT NULL
    ORDER BY  p.in_dttm
    ;

    You can use the query above in a CREATE VIEW or a CREATE TABLE ... AS command.

    If you have as much data as you say, a table or a materialized view might be faster to use.

    You will notice that I did not include all the columns; let me know if you have a problem including them.

    I don't know what role id plays in this problem.  It is difficult to tell when all the rows have the same value.

  • Export / import tablespace with all objects (data, users, roles)

    Hi, I have a problem or a question about exporting / importing a tablespace.

    On the one hand, I have a 10g database (A) and on the other hand, an 11g database (B).

    On A there is a tablespace called PRO.

    Also 3 users:

    PRO_Main - contains the data - tablespace PRO

    PRO_User1 with the role PRO_UROLE

    PRO_User2 with the role PRO_UROLE

    Now, I want to transfer the whole tablespace PRO (including the users PRO_Main, PRO_User1, PRO_User2 and the role PRO_UROLE) from A to B.

    On B, I created the user PRO_Main and the tablespace PRO.

    On A, I run the following statement:

    expdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log

    On B:

    impdp PRO_Main/XXX TABLESPACES=PRO DIRECTORY=backup_datapump DUMPFILE=TSpro.dmp LOGFILE=TSpro.log

    Result:

    The user PRO_Main has been imported with all data.

    But I am missing PRO_User1, PRO_User2 and the role PRO_UROLE...


    I guess I've used the wrong settings in my expdp and / or impdp.

    Would be nice, if someone can give me a hint.

    Thanks in advance.

    Best regards
    Frank

    When you perform a TABLESPACE mode export by simply specifying tablespaces, then all that gets exported are the tables and their dependent objects. Users, roles, and the tablespace definitions themselves do not get exported.

    When you perform a SCHEMA mode export by specifying the schemas, you will get the schema definitions (if the schema running the export is privileged) and all of the objects that the schema owns. The schema does not own roles or tablespace definitions.

    In your case, you want to move:

    1. the schemas - you have already created 1 on your target database
    2. the roles
    3. everything in the tablespaces owned by the various schemas.

    There is no single export/import command that will do all of that. This is how I would do it:

    1. move the schema definitions
    a. you can either create them manually, or
    b1. expdp ... schemas=... include=USER
    b2. impdp the results of b1

    2. move the roles (see the sketch after this list)
    expdp full=y include=ROLE ...
    remember, this will include all roles. If you want to limit what is exported, use:
    include=ROLE:"IN ('ROLE1', 'ROLE2', etc.)"
    then impdp the roles you just exported

    3. move the user data
    a. if you want to move all of the objects in the schemas, such as functions, packages, etc., then you need to use a schema mode export:
    expdp username/password schemas=a,b,c ...
    b. if you want to move only the objects in those tablespaces, then use a tablespace mode export:
    expdp username/password tablespaces=tbs1,tbs2 ...
    c. import the dumpfile generated in step 3:
    impdp username/password ...
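
    For example (my own sketch, not part of the original reply; the file and directory names are made up), step 2 could use a parameter file roles.par:

    full=y
    include=ROLE:"IN ('PRO_UROLE')"
    directory=backup_datapump
    dumpfile=roles.dmp
    logfile=roles_exp.log

    run as: expdp system/password parfile=roles.par, then impdp system/password directory=backup_datapump dumpfile=roles.dmp full=y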

    I hope this helps.

    Dean

  • Need to create a new row in a table with the same data as an existing row, but a new PK

    Hello gurus,

    I have a table with a column as the primary key. I need to create a new row in the table with the same data as one of the existing rows, but with a different primary key - basically a duplicate row with a different primary key...

    Any ideas of how it can be done without much complication?

    Thanks in advance for your answer.

    Regards,
    Karim idrissi

    user9970447 wrote:
    Hello gurus,

    I have a table with a column as the primary key. I need to create a new row in the table with the same data as one of the existing rows, but with a different primary key - basically a duplicate row with a different primary key...

    Any ideas of how it can be done without much complication?

    Thanks in advance for your answer.

    Regards,
    Karim idrissi

    something like

    insert into mytable
    select 'literal for new pk',
           non_pk_1,
           non_pk_2,
           non_pk_n
    from   mytable
    where  pk_col = 'literal for existing pk';
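
    For instance (a made-up illustration using the classic demo table emp, keyed on empno; not from the original reply):

    insert into emp
    select 9999,                                         -- the new primary key value
           ename, job, mgr, hiredate, sal, comm, deptno  -- copy the non-key columns
    from   emp
    where  empno = 7369;                                 -- the existing row to duplicate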
    
  • ADF table with checkbox refresh data binding problem

    Hello.

    I use JDeveloper 11.1.1.3. I need to use a table with a checkbox in each row of the table in my project. I use a VO with a transient 'Selected' attribute of boolean type.
    Everything works well, except for one thing:
    When you click the checkbox with a valueChangeListener and try to get the selected rows in the managed bean, you don't get any selected rows. Only after selecting a second one does the managed bean (wrongly) show that 1 single row is selected. This is my managedBean method:

    public void SelectCountyClick(ValueChangeEvent valueChangeEvent) {

        DCIteratorBinding it = ADFUtils.findIterator(ITERATOR_NAME);

        int selectedRowCount = 0;
        RowSetIterator rit = it.getRowSetIterator();
        Row r = rit.first();
        if (r != null) {
            if ((Boolean) r.getAttribute("Selected"))
                selectedRowCount++;
        }

        while (rit.hasNext()) {
            r = rit.next();
            if ((Boolean) r.getAttribute("Selected"))
                selectedRowCount++;
        }

        System.out.println("ALL SELECTED ROWS: " + selectedRowCount);

    }

    I tried changing this to a client event, I got the row number, and I set 'true' or 'false' on the data binding in code, but I still can't get correct data after the value change event.

    Please help me.

    My latest idea is to update the data binding after the checkbox click, I think. Please help me.

    Thank you!

    You should go through the concepts of the ADF page lifecycle. In simple terms, the Boolean value in the model is not yet set when the valueChangeListener runs. Try adding valueChangeEvent.getComponent().processUpdates(FacesContext.getCurrentInstance()); at the top of your listener method and see the effect.

    Reference:
    http://docs.Oracle.com/CD/E15051_01/Web.1111/b31974/adf_lifecycle.htm

  • Export a table with a partition name containing a ':'

    Hey all, I feel so stupid to ask this Q, but bear with my injured brain. I'm trying to export a table (mgmt) in the 'user' schema of a 10g DB, with the partition name "2010-05-30 00:00". Now, I know that for exporting a partition we use - schemaname.tablename:partition_name

    I tried various options but am unable to get the exp utility to accept that the ":" is itself part of the partition name

    exp user/pswd tables=user.mgmt:'2010-05-30 00:00' compress=n
    exp user/pswd tables=user.mgmt:2010-05-30\ 00\:00 compress=n
    exp user/pswd tables=user."mgmt:2010-05-30\ 00\:00" compress=n
    exp user/pswd tables=user.'mgmt:2010-05-30 00:00' compress=n

    And so I get errors such as:
    . . exporting table MGMT
    EXP-00051: "2010-05-30" - given partition or subpartition name is not part of "mgmt" table
    EXP-00011: user.00 does not exist
    EXP-00051: "00" - given partition or subpartition name is not part of "00" table

    Or:

    EXP-00019: failed to process parameters, type 'EXP HELP=Y' for help
    EXP-00000: Export terminated unsuccessfully

    I feel that I'm missing something simple but am not able to figure out what. I tried these links for clues, but they did not help
    http://download.Oracle.com/docs/CD/B10501_01/server.920/a96652/CH01.htm#1005947
    http://download.Oracle.com/docs/CD/B10501_01/server.920/a96652/CH01.htm#1006395

    I can't even just take an export of the entire table, since it has more than 30 partitions and the size of the dmp would be huge :(

    I see that your partition name contains white space; it seems your main problem is escaping reserved characters and spaces.
    In your case, you need to escape the space in the partition name part.
    Use this parameter file and you should be fine:

    exp.par contents:

    file=exp_1tabs_sample.dmp
    log=exp_1tabs_sample.log
    tables=user.mgmt:"2010-06-11 \ 00:00"
    compress=n
    

    Cheers.

  • How to add a primary key to a table with existing data?

    The table is already populated with data. There was no primary key before, so each column contains some duplicate values.

    I want to add a new column, which should be of an integer data type and automatically incremented, starting from 001. I tried with Oracle SQL Developer, but it says "ORA-01758: table must be empty to add mandatory (NOT NULL) column". How can I do it? Thank you!

    Hello

    Look at the [ALTER TABLE | http://download.oracle.com/docs/cd/B28359_01/server.111/b28286/statements_3001.htm#sthref4803] command to find out how to add a column (step (1)) and a constraint (step (3)) to an existing table.

    For step (2):

    CREATE SEQUENCE  employee_id_seq
    START WITH  1
    ;
    
    UPDATE  employee
    SET     id  = employee_id_seq.NEXTVAL;
    

    When you create a sequence, START WITH 1 is the default, so that line isn't really necessary above... I included it just to show how you could start with any number you choose.
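
    A sketch of steps (1) and (3) to go with that (my own illustration, reusing the employee / id names from the snippet above; the constraint name is made up):

    ALTER TABLE employee
      ADD id NUMBER;                                      -- step (1): add the column, nullable at first

    -- step (2): populate it from the sequence, as shown above

    ALTER TABLE employee
      ADD CONSTRAINT employee_pk PRIMARY KEY (id);        -- step (3): now enforce NOT NULL + uniqueness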

  • Flash Pro CS6 - export a sprite sheet with EaselJS JSON data

    Hello

    I have imported a PNG sequence into Flash Pro and tried to use the new sprite sheet generation functionality to export JSON data in the EaselJS format, but when I choose the EaselJS option in the drop-down menu, I only get an empty wrapper function:


    (function (window) {

    } (window));

    However, when I export to the JSON format, I get the appropriate JSON data.  Is this a known issue with EaselJS support in CS6, or should I do something differently?  I would prefer not to use Zoe, the SWF-to-EaselJS AIR utility, because the PNGs are fairly large and Zoe seems to have a maximum dimension cap for exported sprite sheets (I get 12 of them), while Flash Pro is able to automatically detect the dimensions and export a single sprite sheet, which is great; I just need to generate the EaselJS data with it.  Thanks in advance!

    try to use a movieclip with some symbols on the timeline?

  • help with querying a table with varchars as dates

    guys, I need a little help. I'm close, but I just can't close the deal.
    I have a table with a field that is datatyped as varchar2, but it holds a date, such as today's date: '20100615'.
    I know the first thing you guys are going to say is that this should be stored as a date, but it is not my table and I have to deal with this.
    Here's the problem: I'm trying to query for a range of dates, and I'm having a hell of a time doing it.
    As you can see from my query below, I'm trying to bring back only 15 days' worth of data by date.
    Can someone please point out what should be obvious. I have been at this for a day now trying to get this working.



    SELECT DISTINCT to_date (fwvitals_date, 'YYYYMMDD') "fwvitals_date"
    FROM fwvitals
    where fwvitals_date
    between ((SELECT MAX (fwvitals_date) FROM FWVITALS))
    and ((SELECT to_char (sysdate-15, 'YYYYMMDD') FROM dual))

    Hello

    user633029 wrote:
    guys, I need a little help. I'm close, but I just can't close the deal.
    I have a table with a field that is datatyped as varchar2, but it holds a date, such as today's date: '20100615'.
    I know the first thing you guys are going to say is that this should be stored as a date, but it is not my table and I have to deal with this.

    You are absolutely right!

    Here's the problem: I'm trying to query for a range of dates, and I'm having a hell of a time doing it.
    As you can see from my query below, I'm trying to bring back only 15 days' worth of data by date.
    Can someone please point out what should be obvious. I have been at this for a day now trying to get this working.

    SELECT DISTINCT to_date (fwvitals_date, 'YYYYMMDD') "fwvitals_date"
    FROM fwvitals
    where fwvitals_date
    between ((SELECT MAX (fwvitals_date) FROM FWVITALS))
    and ((SELECT to_char (sysdate-15, 'YYYYMMDD') FROM dual))

    "WHERE x BETWEEN y AND z" is equivalent to
    "WHERE x > = y AND x.<=>
    If y > z, no line will never be allowed, and if there is the great artist of value in your table, then lines (probably) very little, perhaps only 1 will satisfy the same if condition z > y.

    What exactly are you trying to do?
    It helps if you post a small example of the data (CREATE TABLE and INSERT statements) and the results you want from that data. If the results are conditional, give a couple of examples, for example, "if I run the query any time on June 15, I want ... but if it's June 16, then I want ..."

    If you want the most recent 15 days, including today (that is, when run on June 15, you want June 1 through June 15), then:

    SELECT DISTINCT
         fwvitals_date
    FROM      fwvitals
    WHERE       fwvitals_date     BETWEEN     TO_CHAR (SYSDATE - 14, 'YYYYMMDD')
                   AND     TO_CHAR (SYSDATE,      'YYYYMMDD')
    ;
    

    Fortunately, the strings are in a format that sorts correctly, so you don't have to run TO_DATE on each of them and risk conversion errors.

    Edited by: Frank Kulash, June 15, 2010 21:52

  • Syntax to create a table with the LONG data type

    I'm looking to create a table based on another table that has a LONG data type column. It throws the error ORA-00997: illegal use of the LONG data type


    I tried it:

    create table abc_long (ag number, bgd long);

    create table abc_long_dummy as (select * from abc_long); -- error ORA-00997: illegal use of the LONG data type

    How do I get around this?

    I'm looking to create a table based on another table that has a LONG data type column.

    You really don't want to do that.

    The LONG data type has been deprecated for some time now; use CLOB instead.

    The TO_LOB() function will do the conversion on the fly:

    create table abc_long_dummy
    as
    select ag
         , to_lob(bgd) as bgd
    from abc_long ;
    
  • Importing data with impdp into a table with 3 new columns

    Hello

    Is it possible to import data with impdp into a table that has 3 new columns?

    Kind regards

    William

    To do this, I use this method:

    Add the three columns to the source table and create this package:

    CREATE OR REPLACE PACKAGE DATAPUMP_TECH_COLS
    IS
       FUNCTION SET_DML_DATE (p1 IN TIMESTAMP)
          RETURN TIMESTAMP;

       FUNCTION SET_DML_TYPE (p2 IN VARCHAR2)
          RETURN VARCHAR2;

       FUNCTION SET_DML_SCN (p3 IN NUMBER)
          RETURN NUMBER;
    END DATAPUMP_TECH_COLS;
    /

    CREATE OR REPLACE PACKAGE BODY SYS.DATAPUMP_TECH_COLS
    IS
       FUNCTION SET_DML_DATE (p1 IN TIMESTAMP)
          RETURN TIMESTAMP
       IS
       BEGIN
          RETURN SYSDATE;
       END;

       FUNCTION SET_DML_TYPE (p2 IN VARCHAR2)
          RETURN VARCHAR2
       IS
       BEGIN
          RETURN ' ';
       END;

       FUNCTION SET_DML_SCN (p3 IN NUMBER)
          RETURN NUMBER
       IS
       BEGIN
          RETURN 0;
       END;
    END;
    /

    Export the table with remap_data:

    expdp DIRECTORY=TEMP_DIR PARALLEL=8 TABLES=PIVOTMAT2.ACCTG_LINE LOGFILE=expdp_acctg.log COMPRESSION=ALL EXCLUDE=STATISTICS \
    DUMPFILE=ACCTG1.dmp,ACCTG2.dmp,ACCTG3.dmp,ACCTG4.dmp,ACCTG5.dmp,ACCTG6.dmp,ACCTG7.dmp,ACCTG8.dmp REUSE_DUMPFILES=YES \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_TYPE:SYS.DATAPUMP_TECH_COLS.SET_DML_TYPE \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_DATE:SYS.DATAPUMP_TECH_COLS.SET_DML_DATE \
    REMAP_DATA=PIVOTMAT2.ACCTG_LINE.DML_SCN:SYS.DATAPUMP_TECH_COLS.SET_DML_SCN

    Import the table:

    impdp "/ as sysdba" DIRECTORY=SRC_PIVOT TABLE_EXISTS_ACTION=TRUNCATE REMAP_SCHEMA=PIVOTMAT2:STGPIV \
    DUMPFILE=ACCTG1.dmp,ACCTG2.dmp,ACCTG3.dmp,ACCTG4.dmp,ACCTG5.dmp,ACCTG6.dmp,ACCTG7.dmp,ACCTG8.dmp PARALLEL=8

    Finally, drop the columns afterwards if necessary.

  • Insert/update a column with the CLOB data type

    Hi all

    ORCL Version: 11g.

    I have a table with a CLOB data type column.

    Test12

    (col1 clob);

    I'm trying to insert/update the column with more than 4000 characters.

    But due to the 4000 character limitation, I could not insert/update it.

    Need your help in resolving this issue.

    THX

    Rod.

    The 4000 character limit does not apply here.  It pertains only to the varchar2 data type.  A clob can hold more than 4 GB.

    Here is an example I found that shows how to insert it...

    Otherwise, here is a 'dirty' way to do it.

    insert into your_table (COLA, COLB)
    values
    (PRIMARY_KEY, PART 1 OF DATA);

    update your_table
    set COLB = COLB || PART 2 OF BIG DATA
    where COLA = PRIMARY_KEY;

    update your_table
    set COLB = COLB || PART 3 OF BIG DATA
    where COLA = PRIMARY_KEY;

    ... and so on ...

    I can't say that I personally recommend the second style...  But it could do the job.
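
    As an illustration of the cleaner route (my own sketch, not from the reply above): the 4000-character limit applies to SQL string literals, so building the value in a PL/SQL variable (a bind) and inserting that avoids it entirely. Using the test12 table from the question:

    DECLARE
       l_text  CLOB;
    BEGIN
       -- build a value longer than 4000 characters in PL/SQL
       l_text := RPAD('x', 4000, 'x') || RPAD('y', 4000, 'y');

       INSERT INTO test12 (col1) VALUES (l_text);

       UPDATE test12
       SET    col1 = col1 || l_text;

       COMMIT;
    END;
    /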

  • How can I export a schema with all objects but a few tables without data

    Hi all

    Version 10g EA.
    I am exporting the schema with all objects, but I need to skip the data in some of the tables.

    There are 4 tables with huge data; we do not need to export the data from those tables, but their structure must be exported.


    Thank you
    Nr

    You can do this with a single command.  Run your export as usual and add a query parameter for each of the 4 tables you don't want any rows for:

    expdp ... query=schema1.table1:"where rownum = 0" query=schema2.table2:"where rownum = 0" ...

    It is best to place the query parameters in a parameter file so you don't have to worry about escaping special characters for the OS.
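
    For instance, a parameter file along these lines (a sketch only; the schema, table, and file names are made up):

    schemas=SCOTT
    directory=DATA_PUMP_DIR
    dumpfile=scott_schema.dmp
    logfile=scott_schema.log
    query=SCOTT.BIG_TABLE1:"where rownum = 0"
    query=SCOTT.BIG_TABLE2:"where rownum = 0"

    run with: expdp scott/tiger parfile=exp_schema.par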

    Dean

    Hi, I have a 8520 and I loaded the Facebook app on a few weeks ago. I don't know if anyone else has had problems. Half time do not show messages. When sending messages from systems sometimes fails. You cannot change the status of friends if you no lo