Loading data with SQL*Loader

Hi Experts,

I have a file in the following format, and I have to insert the data from these files into a table. Can I use SQL*Loader to load these files?

My other question is that I need to schedule the download of these files. Can I integrate SQL*Loader into a procedure?
Agent Id|Agent Type|Create Date|Termination CDC|Activation CDC|Deactivation CDC|Agent IdX|Agent Status|Status Date|Status Reason Code|Update CDC|Update Serial|Update User|New Owner Agent Id|Previous Owner Agent Id|Agent Name|Primary Address1|Primary Address2|Primary Address3|Secondary Address1|Secondary Address2|Secondary Address3| Primary City|Primary State|Primary Zip|Primary Zip Suffix|Primary Country|Secondary City|Secondary State|Secondary Zip|Secondary Zip Suffix|Secondary Country|Phone Number|Fax number|Mobile Number|Business Type|Field Rep|Bill to Chain Id|Mon Open Time|Mon Close Time|Tue Open Time|Tue Close Time|Wed Open Time|Wed Close Time|Thu Open Time|Thu Close Time|Fri Open Time|Fri Close Time|Sat Open Time|Sat Close Time|Sun Open Time|Sun Close Time|Zone Id|Line Charge Class|Chain Id|Chain Code| Primary Contact  Name| Primary Contact Title| Primary Contact Phone|Secondary Contact Name|Secondary Contact Title|Secondary Contact Phone|Tertiary contact Name|Tertiary Contact Title|Tertiary Contact Phone| Bank Id| Bank Account Id| bank Account Type| Bank Account Date| EFT Flag| Fund Limit|Invoicable|TaxCode|Tax Id|Sales Tax|Service Charge|Instant Cashing Type|Instant Telsel Rep| Instant Number of Bins| Instant Number Itvms| InstantCredit Limit|Auto Reorder| Instant Terminal Reorder| Instant Telsel Reorder| Instant Teleset Active CDC| Instant Initial Distribution|Auto Telsel Schedule| Instant Auto Settle| Instant Call Day| Instant Call Week| Instant Call Cycle| Instant Order Restriction| Instant Delivery Flag| Instant Account Type| Instant Settle Class| Region|County|Territory|Route|Chain Statement|Master Agent Id| Minority Owned| Tax Name| State Tax Id|Mailing Name| Bank Account Name| DSR
0|1|0|0|0|0|0|1|0|0|302|0|0|0|0|||||||||||||||||||||0|0|0|||||||||||||||0|0|0|||||||||||||0|-2145916800|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0||0|0|0|||||
1|1|1256213087|0|-39081|-39081|1|2|1256213087|999|302|0|0|0|0|Pseudo Outlet||||||||MU|||MU||MU|||MU||||0|0|1|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|06:00|23:59|0|0|0|||||||||||||
{code}

Edited by: Kevin CK on 02-Feb-2010 03:28

Here you go...

drop table agent_dump_csv_temp
/

create table agent_dump_csv_temp
  (Agent_Id                   NUMBER
  ,Agent_Type                 NUMBER
  ,Create_Date                NUMBER
  ,Termination_CDC            NUMBER
  ,Activation_CDC             NUMBER
  ,Deactivation_CDC           NUMBER
  ,Agent_IdX                  NUMBER
  ,Agent_Status               NUMBER
  ,Status_Date                NUMBER
  ,Status_Reason_Code         NUMBER
  ,Update_CDC                 NUMBER
  ,Update_Serial              NUMBER
  ,Update_User                NUMBER
  ,New_Owner_Agent_Id         NUMBER
  ,Previous_Owner_Agent_Id    NUMBER
  ,Agent_Name                 VARCHAR2(50)
  ,Primary_Address1           VARCHAR2(50)
  ,Primary_Address2           VARCHAR2(50)
  ,Primary_Address3           VARCHAR2(50)
  ,Secondary_Address1         VARCHAR2(50)
  ,Secondary_Address2         VARCHAR2(50)
  ,Secondary_Address3         VARCHAR2(50)
  ,Primary_City               VARCHAR2(50)
  ,Primary_State              VARCHAR2(50)
  ,Primary_Zip                VARCHAR2(50)
  ,Primary_Zip_Suffix         VARCHAR2(50)
  ,Primary_Country            VARCHAR2(50)
  ,Secondary_City             VARCHAR2(50)
  ,Secondary_State            VARCHAR2(50)
  ,Secondary_Zip              VARCHAR2(50)
  ,Secondary_Zip_Suffix       VARCHAR2(50)
  ,Secondary_Country          VARCHAR2(50)
  ,Phone_Number               VARCHAR2(50)
  ,Fax_number                 VARCHAR2(50)
  ,Mobile_Number              VARCHAR2(50)
  ,Business_Type              NUMBER
  ,Field_Rep                  NUMBER
  ,Bill_to_Chain_Id           NUMBER
  ,Mon_Open_Time              VARCHAR2(5)
  ,Mon_Close_Time             VARCHAR2(5)
  ,Tue_Open_Time              VARCHAR2(5)
  ,Tue_Close_Time             VARCHAR2(5)
  ,Wed_Open_Time              VARCHAR2(5)
  ,Wed_Close_Time             VARCHAR2(5)
  ,Thu_Open_Time              VARCHAR2(5)
  ,Thu_Close_Time             VARCHAR2(5)
  ,Fri_Open_Time              VARCHAR2(5)
  ,Fri_Close_Time             VARCHAR2(5)
  ,Sat_Open_Time              VARCHAR2(5)
  ,Sat_Close_Time             VARCHAR2(5)
  ,Sun_Open_Time              VARCHAR2(5)
  ,Sun_Close_Time             VARCHAR2(5)
  ,Zone_Id                    NUMBER
  ,Line_Charge_Class          NUMBER
  ,Chain_Id                   NUMBER
  ,Chain_Code                 NUMBER
  ,Primary_Contact_Name       VARCHAR2(50)
  ,Primary_Contact_Title      VARCHAR2(50)
  ,Primary_Contact_Phone      VARCHAR2(50)
  ,Secondary_Contact_Name     VARCHAR2(50)
  ,Secondary_Contact_Title    VARCHAR2(50)
  ,Secondary_Contact_Phone    VARCHAR2(50)
  ,Tertiary_contact_Name      VARCHAR2(50)
  ,Tertiary_Contact_Title     VARCHAR2(50)
  ,Tertiary_Contact_Phone     VARCHAR2(50)
  ,Bank_Id                    NUMBER
  ,Bank_Account_Id            NUMBER
  ,Bank_Account_Type          NUMBER
  ,Bank_Account_Date          NUMBER
  ,EFT_Flag                   NUMBER
  ,Fund_Limit                 NUMBER
  ,Invoicable                 NUMBER
  ,TaxCode                    NUMBER
  ,Tax_Id                     NUMBER
  ,Sales_Tax                  NUMBER
  ,Service_Charge             NUMBER
  ,Instant_Cashing_Type       NUMBER
  ,Instant_Telsel_Rep         NUMBER
  ,Instant_Number_of_Bins     NUMBER
  ,Instant_Number_Itvms       NUMBER
  ,InstantCredit_Limit        NUMBER
  ,Auto_Reorder               NUMBER
  ,Instant_Terminal_Reorder   NUMBER
  ,Instant_Telsel_Reorder     NUMBER
  ,Instant_Teleset_Active_CDC NUMBER
  ,Instant_Initial_Distribution NUMBER
  ,Auto_Telsel_Schedule       NUMBER
  ,Instant_Auto_Settle        NUMBER
  ,Instant_Call_Day           NUMBER
  ,Instant_Call_Week          NUMBER
  ,Instant_Call_Cycle         NUMBER
  ,Instant_Order_Restriction  NUMBER
  ,Instant_Delivery_Flag      NUMBER
  ,Instant_Account_Type       NUMBER
  ,Instant_Settle_Class       NUMBER
  ,Region                     NUMBER
  ,County                     NUMBER
  ,Territory_x                NUMBER
  ,Route                      NUMBER
  ,Chain_Statement            NUMBER
  ,Master_Agent_Id            NUMBER
  ,Minority_Owned             NUMBER
  ,Tax_Name                   VARCHAR2(50)
  ,State_Tax_Id               NUMBER
  ,Mailing_Name               VARCHAR2(50)
  ,Bank_Account_Name          VARCHAR2(50)
  ,DSR                        NUMBER
  )
  ORGANIZATION EXTERNAL (
   TYPE oracle_loader
    DEFAULT DIRECTORY TEST_DIR
    ACCESS PARAMETERS (
      RECORDS DELIMITED BY NEWLINE
      SKIP 1
      BADFILE 'test.bad'
      DISCARDFILE 'test.dis'
      LOGFILE 'test.log'
      FIELDS TERMINATED BY '|'
      MISSING FIELD VALUES ARE NULL
      REJECT ROWS WITH ALL NULL FIELDS
        (Agent_Id
        ,Agent_Type
        ,Create_Date
        ,Termination_CDC
        ,Activation_CDC
        ,Deactivation_CDC
        ,Agent_IdX
        ,Agent_Status
        ,Status_Date
        ,Status_Reason_Code
        ,Update_CDC
        ,Update_Serial
        ,Update_User
        ,New_Owner_Agent_Id
        ,Previous_Owner_Agent_Id
        ,Agent_Name
        ,Primary_Address1
        ,Primary_Address2
        ,Primary_Address3
        ,Secondary_Address1
        ,Secondary_Address2
        ,Secondary_Address3
        ,Primary_City
        ,Primary_State
        ,Primary_Zip
        ,Primary_Zip_Suffix
        ,Primary_Country
        ,Secondary_City
        ,Secondary_State
        ,Secondary_Zip
        ,Secondary_Zip_Suffix
        ,Secondary_Country
        ,Phone_Number
        ,Fax_number
        ,Mobile_Number
        ,Business_Type
        ,Field_Rep
        ,Bill_to_Chain_Id
        ,Mon_Open_Time
        ,Mon_Close_Time
        ,Tue_Open_Time
        ,Tue_Close_Time
        ,Wed_Open_Time
        ,Wed_Close_Time
        ,Thu_Open_Time
        ,Thu_Close_Time
        ,Fri_Open_Time
        ,Fri_Close_Time
        ,Sat_Open_Time
        ,Sat_Close_Time
        ,Sun_Open_Time
        ,Sun_Close_Time
        ,Zone_Id
        ,Line_Charge_Class
        ,Chain_Id
        ,Chain_Code
        ,Primary_Contact_Name
        ,Primary_Contact_Title
        ,Primary_Contact_Phone
        ,Secondary_Contact_Name
        ,Secondary_Contact_Title
        ,Secondary_Contact_Phone
        ,Tertiary_contact_Name
        ,Tertiary_Contact_Title
        ,Tertiary_Contact_Phone
        ,Bank_Id
        ,Bank_Account_Id
        ,Bank_Account_Type
        ,Bank_Account_Date
        ,EFT_Flag
        ,Fund_Limit
        ,Invoicable
        ,TaxCode
        ,Tax_Id
        ,Sales_Tax
        ,Service_Charge
        ,Instant_Cashing_Type
        ,Instant_Telsel_Rep
        ,Instant_Number_of_Bins
        ,Instant_Number_Itvms
        ,InstantCredit_Limit
        ,Auto_Reorder
        ,Instant_Terminal_Reorder
        ,Instant_Telsel_Reorder
        ,Instant_Teleset_Active_CDC
        ,Instant_Initial_Distribution
        ,Auto_Telsel_Schedule
        ,Instant_Auto_Settle
        ,Instant_Call_Day
        ,Instant_Call_Week
        ,Instant_Call_Cycle
        ,Instant_Order_Restriction
        ,Instant_Delivery_Flag
        ,Instant_Account_Type
        ,Instant_Settle_Class
        ,Region
        ,County
        ,Territory_x
        ,Route
        ,Chain_Statement
        ,Master_Agent_Id
        ,Minority_Owned
        ,Tax_Name
        ,State_Tax_Id
        ,Mailing_Name
        ,Bank_Account_Name
        ,DSR
      )
    )
    LOCATION ('test.txt')
  )
  PARALLEL
 REJECT LIMIT UNLIMITED
/

SQL> select * from agent_dump_csv_temp
  2  /

  AGENT_ID AGENT_TYPE CREATE_DATE TERMINATION_CDC ACTIVATION_CDC DEACTIVATION_CDC  AGENT_IDX AGENT_STATUS STATUS_DATE STATUS_REASON_CODE UPDATE_CDC UPDATE_SERIAL UPDATE_USER NEW_OWNER_AGENT_ID PREVIOUS_OWNER_AGENT_ID
---------- ---------- ----------- --------------- -------------- ---------------- ---------- ------------ ----------- ------------------ ---------- ------------- ----------- ------------------ -----------------------
AGENT_NAME                                         PRIMARY_ADDRESS1                                   PRIMARY_ADDRESS2                                   PRIMARY_ADDRESS3
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- --------------------------------------------------
SECONDARY_ADDRESS1                                 SECONDARY_ADDRESS2                                 SECONDARY_ADDRESS3                                 PRIMARY_CITY
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- --------------------------------------------------
PRIMARY_STATE                                      PRIMARY_ZIP                                        PRIMARY_ZIP_SUFFIX                                 PRIMARY_COUNTRY
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- --------------------------------------------------
SECONDARY_CITY                                     SECONDARY_STATE                                    SECONDARY_ZIP                                      SECONDARY_ZIP_SUFFIX
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- --------------------------------------------------
SECONDARY_COUNTRY                                  PHONE_NUMBER                                       FAX_NUMBER                                         MOBILE_NUMBER                                      BUSINESS_TYPE  FIELD_REP BILL_TO_CHAIN_ID
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- -------------------------------------------------- ------------- ---------- ----------------
MON_O MON_C TUE_O TUE_C WED_O WED_C THU_O THU_C FRI_O FRI_C SAT_O SAT_C SUN_O SUN_C    ZONE_ID LINE_CHARGE_CLASS   CHAIN_ID CHAIN_CODE PRIMARY_CONTACT_NAME                               PRIMARY_CONTACT_TITLE
----- ----- ----- ----- ----- ----- ----- ----- ----- ----- ----- ----- ----- ----- ---------- ----------------- ---------- ---------- -------------------------------------------------- --------------------------------------------------
PRIMARY_CONTACT_PHONE                              SECONDARY_CONTACT_NAME                             SECONDARY_CONTACT_TITLE                            SECONDARY_CONTACT_PHONE
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- --------------------------------------------------
TERTIARY_CONTACT_NAME                              TERTIARY_CONTACT_TITLE                             TERTIARY_CONTACT_PHONE                                BANK_ID BANK_ACCOUNT_ID BANK_ACCOUNT_TYPE BANK_ACCOUNT_DATE   EFT_FLAG FUND_LIMIT INVOICABLE
-------------------------------------------------- -------------------------------------------------- -------------------------------------------------- ---------- --------------- ----------------- ----------------- ---------- ---------- ----------
   TAXCODE     TAX_ID  SALES_TAX SERVICE_CHARGE INSTANT_CASHING_TYPE INSTANT_TELSEL_REP INSTANT_NUMBER_OF_BINS INSTANT_NUMBER_ITVMS INSTANTCREDIT_LIMIT AUTO_REORDER INSTANT_TERMINAL_REORDER INSTANT_TELSEL_REORDER INSTANT_TELESET_ACTIVE_CDC
---------- ---------- ---------- -------------- -------------------- ------------------ ---------------------- -------------------- ------------------- ------------ ------------------------ ---------------------- --------------------------
INSTANT_INITIAL_DISTRIBUTION AUTO_TELSEL_SCHEDULE INSTANT_AUTO_SETTLE INSTANT_CALL_DAY INSTANT_CALL_WEEK INSTANT_CALL_CYCLE INSTANT_ORDER_RESTRICTION INSTANT_DELIVERY_FLAG INSTANT_ACCOUNT_TYPE INSTANT_SETTLE_CLASS     REGION     COUNTY TERRITORY_X
---------------------------- -------------------- ------------------- ---------------- ----------------- ------------------ ------------------------- --------------------- -------------------- -------------------- ---------- ---------- -----------
     ROUTE CHAIN_STATEMENT MASTER_AGENT_ID MINORITY_OWNED TAX_NAME                                        STATE_TAX_ID MAILING_NAME                                          BANK_ACCOUNT_NAME                                 DSR
---------- --------------- --------------- -------------- -------------------------------------------------- ------------ -------------------------------------------------- -------------------------------------------------- ----------
         0          1           0               0              0                0          0            1       0                      0        302             0           0                  0                       0

                                                                                                                                                                                                                0          0                0
                                                                                             0                 0          0

                                                                                                                                                                                            0        -2.146E+09          0          0          0
         0                     0              0                    0                  0                      0                    0                   0            0                        0                      0                          0
                           0                    0                   0                1                 0          0                         0                     0                    0                    0          0          0           0
                         0               0              0

         1          1  1256213087               0         -39081           -39081          1            2  1256213087                999        302             0           0                  0                       0
Pseudo Outlet

MU                                                                                                                                               MU
                                                   MU
MU                                                                                                                                                                                                       0         0                1
06:00 23:59 06:00 23:59 06:00 23:59 06:00 23:59 06:00 23:59 06:00 23:59 06:00 23:59          0              0     0

                                                                                                                                                                                            1        -2.146E+09          1          0          1
         0                     0              0                    0                  0                      0                    0                   0            0                        0                      0                      -3287
                           0                    0                   0                1                 1          2                         0                     0                    0                    1          0        999           0
                         5               0              0

SQL>

It dislikes the identifier 'territory', so I renamed it to "territory_x" (I guess that's a reserved word).
The initial problem you had was that your fields were "SEPARATED"... instead of "TERMINATED"... BY. ;)
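
Since you also asked about scheduling: once the file is exposed as an external table, the load is just an INSERT ... SELECT, so you can wrap it in a procedure and schedule it from inside the database. A minimal sketch, assuming a hypothetical target table AGENTS with matching columns (the job name and calendar are illustrative too):

create or replace procedure load_agents as
begin
  -- the external table re-reads test.txt each time it is queried,
  -- so a plain INSERT ... SELECT performs the load
  insert into agents
  select * from agent_dump_csv_temp;
  commit;
end;
/

begin
  dbms_scheduler.create_job(
    job_name        => 'LOAD_AGENTS_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'LOAD_AGENTS',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
end;
/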

Tags: Database

Similar Questions

  • Member not found when loading data with SQL

    Hello everyone:

    I created a cube and can extract all the information with SQL statements, and it works perfectly. Now I'm trying to load data in the same way, but I can't.

    I created a view that returns data in this format:

    Dimension 1 member <tab> Dimension 2 member <tab> ... <tab> Dimension 5 member <tab> measure 1 <tab> measure 2 <tab> measure 3

    I designed a new load rule, indicating the dimension for each column, and for each measure which specific member of the Accounts dimension must be used. I have checked and everything is OK, but when I try to load data it does not work and gives me this error:

    + "Data value [3.5] met before that all the selected Dimensions, [1] Records duly filled.
    + Essbase error unexpected 1003007 "+"

    If I quote the member names in the SQL statement (because they contain spaces, which is perhaps the reason for the previous error, although the rule parses them correctly), Essbase deletes these quotes when it imports. I have to use another symbol instead, and have the rule change that symbol back to quotes. Is this normal? I know about this issue when importing formulas, but not here.

    Why don't I have this problem with quotes in 'Dimension build'?

    And when I change the quote symbols, this error occurs:

    "Member x not found in the database". But I checked the member and it is there in the outline. What's wrong with that?


    Thanks in advance

    Regards,

    Javier

    Published by: Javi M on 26-mar-2011 05:52

    Yes, SQL (and data files of all kinds) are supported with more than one column of data. As you noted, you just point each column to the member that it represents.

    That said, I bet that if you look at your view/table and your load rule against your outline, you will find a dimension is mismapped. For example, you think that column 4 points to Scenario, but it really points to Product, and what you thought was column 1 isn't, or you missed a dimension. Happens to me all the time.

    Kind regards

    Cameron Lackpour

  • Display data with sql

    Hello

    create table emp (
      emp_id      NUMBER,
      dt_from     DATE,
      dt_to       DATE,
      create_date DATE);

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('03/01/2001 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('06/30/2001 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('05/01/2001 05:44:20', 'MM/DD/YYYY HH24:MI:SS'));

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('07/01/2008 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/30/2012 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('05/08/2009 14:11:21', 'MM/DD/YYYY HH24:MI:SS'));

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('06/01/2008 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/30/2012 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('06/26/2009 15:48:15', 'MM/DD/YYYY HH24:MI:SS'));

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('09/30/2012 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('09/30/2012 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('09/28/2012 17:13:52', 'MM/DD/YYYY HH24:MI:SS'));

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('10/01/2012 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('04/30/2013 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('12/14/2012 11:42:15', 'MM/DD/YYYY HH24:MI:SS'));

    insert into emp (emp_id, dt_from, dt_to, create_date)
    values (100, TO_DATE('05/31/2013 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('05/31/2013 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('05/08/2013 13:26:30', 'MM/DD/YYYY HH24:MI:SS'));



    expected results:

    DT_FROM      DT_TO
    03/01/2001   06/30/2001
    06/01/2008   04/30/2012
    09/30/2012   09/30/2012
    10/01/2012   04/30/2013
    05/31/2013   05/31/2013




    The reason the 07/01/2008 row is excluded is that there is another row with a smaller dt_from value that was created after that row. Whenever a subsequent row (by create_date) has a smaller dt_from value, we need to keep that row and exclude the earlier rows like the one above.

    Thank you

    Hello

    So, you need to know if a smaller dt_from appears on a later row (where 'later' is determined by create_date). This sounds like a job for the analytic MIN function:

    WITH got_min_dt_from AS
    (
        SELECT dt_from, dt_to
             , MIN (dt_from) OVER (PARTITION BY emp_id   -- just guessing
                                   ORDER BY create_date DESC
                                  ) AS min_dt_from
        FROM emp
    )
    SELECT dt_from, dt_to
    FROM got_min_dt_from
    WHERE dt_from = min_dt_from
    ORDER BY dt_from
    ;

    Does emp_id play a role in this problem? It is difficult to say when all the rows of the sample data have the same value.

  • color change for data with sql developer

    I recently saw someone who had changed a SQL Developer setting to show a color for cells that contain null values. I cannot find where to set this property. Does anyone know how to set a color for null cells in SQL Developer?

    -app

    Go to Tools | Preferences | Database | Advanced. There is an option there to set the background color used for null values.

    Ashley
    SQL development team

  • export data with SQL-plus

    I have a table like this

    Column1 | Column2
    ----------------
    a | -2147483646
    b | -2147483638
    c | -2147483656
    d | -2147483651


    I exported the information with this script:
    SET LINESIZE 50
    set termout off
    set heading on
    set pagesize 1000
    SET NEWPAGE 0
    spool outfile.txt

    @query.txt 4

    spool off
    set termout on


    all values in column2 are displayed as -2.147E+09

    How do I display the actual value (without rounding, as if it were a string)?

    You do not say what is in your query.txt script, but assuming it is something like select * from your_table, you can use something like:

    SET LINESIZE 50
    set termout off
    set heading on
    set pagesize 1000
    SET NEWPAGE 0
    column column2 format 9999999999
    spool outfile.txt
    

    LW

  • Data wrapping problems when loading with SQL*Loader

    Hi all

    I use SQL*Loader to load data from a flat file on HP-UX.

    I find that if a NUMBER or date field gets wrapped onto a new line, the control file triggers errors.

    The data looks like this (the field delimiter is |, the record delimiter is ~):

    1 A87CCH | 1 A87CCH | PLAN_ACCOUNT_PROMOTION | HIR6A-1 | 20100706 06:06:24 | 1 DNE1 | DC?
    2010.7 FY1011 Promoiton | 1 A87AW0 | 1 HJEZE | Private | 20100730 00:00:00 | 00 20100710
    : 00:00 | 0 | Completed | 20100730 00:00:00 | 20100710 00:00:00 | 0 | 1 4A6PKP | TFMAAI | N
    | 0 | 0 | 0 | 0 | 0 | 0 | 1 4A6PKP | Approved | 1 8U4E-163 | 00:00:20110630 00 |
    20100708 01:45:35 | 20100707 00:00:00 | -1||| 0 | 9000 | 0 | 0 ||| 100. N | 0 | 0 | 0 | 0
    | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | N | 20110426 04:01:34 | 1 8U4E-163 | 0|||||||||
    ||||| ~


    The control file looks like:

    OPTIONS (ERRORS=1000, DIRECT=TRUE, PARALLEL=TRUE, DATE_CACHE=5000, DISCARDMAX=50)
    UNRECOVERABLE
    LOAD DATA
    INFILE '/home/bs7822/leon/leon.dat' "str '~'"
    BADFILE '/home/bs7822/leon/leon.bad'
    DISCARDFILE '/home/bs7822/leon/leon.discard'
    APPEND INTO TABLE LEON_123456
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    X_INTERFACED_DT EXPRESSION "to_date(replace(replace(:X_INTERFACED_DT_BF, chr(10), ''), chr(13), ''), 'YYYYMMDD hh24:mi:ss')",
    X_INTERFACED_DT_BF BOUNDFILLER,
    X_ACCRUAL_AMT DECIMAL EXTERNAL,
    X_PLAN_SHIPMENT_PATTERN_ID CHAR(90)
    )

    I think the replace() calls can deal with the wrapped dates, but I want to know whether there is a faster or easier way to tackle this at the source.

    Best regards

    Leon

    user12064076 wrote:
    Thank you for your response. But how can I ensure that a record is on a single line? For example, when unloading the data with spool?

    The table has more than 100 columns.

    Best regards
    Leon

    Uh... that guarantee is implemented by whoever or whatever generates the data in the first place.
    For example, if I am extracting data to CSV for a customer or another application, I make sure it is one record per line, with known delimiters, etc.

    Is it your own code that produces the data in the first place? If so, how are you producing it? With large amounts of data I wouldn't do it with the SQL*Plus spool command; I would do it with UTL_FILE from within PL/SQL, or generate the data into a CLOB and write the CLOB out in one go using one of the available CLOB file-writing methods.
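
    For the UTL_FILE route, a minimal sketch, assuming a hypothetical directory object EXP_DIR and a two-column source table LEON_SOURCE; stripping CR/LF from each field is what guarantees one record per line:

    declare
      l_file utl_file.file_type;
    begin
      l_file := utl_file.fopen('EXP_DIR', 'leon.dat', 'w', 32767);
      for r in (select col1, col2 from leon_source) loop
        -- strip embedded newlines so every record stays on a single line,
        -- then append the ~ record delimiter
        utl_file.put_line(l_file,
          replace(replace(r.col1 || '|' || r.col2, chr(10)), chr(13)) || '~');
      end loop;
      utl_file.fclose(l_file);
    end;
    /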

  • Parallel data loads through SQL connect, and replacing existing data

    I'm doing a multiple data load through SQL connect using the following statement. I am trying to set up a .bat file to automate the process, but I am not sure whether each execution of this statement adds to the existing data or replaces it. I tried to write an override for this, but it only works for statements which contain the import section.

    import database App.DB data connect as username identified by password

    using multiple rules_file 'rul1', 'rul2', 'rul3'

    to load_buffer_block starting with buffer_id 10 on error write to 'C:\\dataload.err';

    Can we add a buffer commit spec such as 'replace all data', 'create group', or 'add' to this import statement? Does the default statement overwrite the values whenever the data is loaded?

    Thanks in advance!

    You have an option in the rules file: 'Overwrite existing values'.

    Rules file > Data Load Settings > Data Load Values

  • Capturing the Windows OS user when loading data using SQL*Loader

    Hello
    I have an Oracle 11g database on Linux.
    I have Excel sheets with data that must be loaded into the database.
    I have the Oracle 11g client running on my Windows machine.

    When I or anyone else runs the SQL*Loader script to load the data, I need to capture the OS user (the Windows login of whoever runs the script) and pass it through SQL*Loader so that it is inserted into the table as a column value in addition to the columns from the Excel sheet.

    I think there could be a very simple solution to this; I did some research but got nowhere.
    Can someone guide me on this?

    Thanks in advance.
    Philip.

    Well, I built this example on the fly to show the answer, but it is based on the documentation and previous experience with SQL*Loader and SYS_CONTEXT.

    SQL*Loader Control File Reference
    http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/ldr_control_file.htm#SUTIL005

    SQL*Loader Field List Reference
    http://download.Oracle.com/docs/CD/E11882_01/server.112/e16536/ldr_field_list.htm

    SQL Language Reference - SYS_CONTEXT
    http://download.Oracle.com/docs/CD/E11882_01/server.112/e17118/functions184.htm#SQLRF06117

    This forum is a good source of examples, as well as asktom.oracle.com
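
    Putting those references together, a minimal sketch of the control-file approach (the table and column names here are hypothetical): an EXPRESSION column is evaluated by the database during the load, and for a conventional-path load SYS_CONTEXT('USERENV','OS_USER') reflects the Windows login of whoever ran sqlldr:

    LOAD DATA
    INFILE 'excel_export.csv'
    INTO TABLE ratings_stage
    FIELDS TERMINATED BY ','
    TRAILING NULLCOLS
    (
      col1,
      col2,
      -- not in the data file: filled in by the database at load time
      loaded_by EXPRESSION "SYS_CONTEXT('USERENV','OS_USER')"
    )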

  • Loading data from SQL Server to an Oracle database

    I want to create a table in an Oracle DB from a table in SQL Server. The table is huge: it has 97,456,789 records.
    I created an (HS) DB link in the Oracle database that points to the SQL Server, and I can query the table over the link from Oracle.

    select * from "dbo"."T1"@dblink;

    I ran the statement below to create the table:
    create table t2 nologging parallel (degree 3) as select * from "dbo"."T1"@dblink;
    and it's taking a long time... but it's running...

    Is there any other method to do this and fill the table in the Oracle DB faster?

    Please advise. Thank you.

    vhiware wrote:
    create table t2 nologging parallel (degree 3) as select * from "dbo"."T1"@dblink;
    and it's taking a long time... but it's running...

    I doubt that parallel processing will be used, because it is Oracle-specific (generally using rowid ranges) and does not apply to SQL Server.

    Is there any other method to do this and fill the table in the Oracle DB faster?

    Part of the performance overhead is pulling that data from SQL Server to Oracle across the network link between them. This can be sped up by compressing the data first and only then transferring it over the network.

    For example: use bcp to export the data on the SQL Server box to a CSV file, compress/zip the file, scp/sftp the file to the Oracle server, and then unzip it there. Parallel and direct-path load processing can then be done using SQL*Loader to load the CSV file into Oracle.

    If it is a basic Linux/Unix system, the decompression/unzip process can be run in parallel with the SQL*Loader process by creating a pipe between the two, where the decompression process writes uncompressed data into the pipe and SQL*Loader reads and loads the data as it becomes available through the pipe.

    Otherwise you can roll your own PQ (parallel query) transformation. Assume that the data is date-ranged. You can create a procedure on Oracle that looks like this:
    {code}
    create or replace procedure copyday (day date) as
    begin
      insert /*+ append */ into local_tab select * from remote_tab@remotedb where col_day = day;
      -- add logging info, validation, etc.
    end;
    {code}

    You can now start 10 or more of these for different days and run them in the background using DBMS_JOB, as sketched below.
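
    A minimal sketch of submitting those jobs, assuming the copyday procedure above (the ten-day range is purely illustrative):

    declare
      l_job binary_integer;
    begin
      for i in 0 .. 9 loop
        dbms_job.submit(
          job  => l_job,
          what => 'copyday(trunc(sysdate) - ' || i || ');');
      end loop;
      commit;  -- the jobs only start running once the submits are committed
    end;
    /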

  • Loading data from a SQL table to Essbase

    Hello

    I'm loading data from a SQL table to Essbase using a rules file. The number of rows in the source table is 7 million. I use the SUNOPSIS MEMORY ENGINE as the staging area, LKM SQL to SQL, and IKM SQL to Hyperion Essbase (DATA).

    Question:

    1. Can I use any other LKM, such as LKM MSSQL to MSSQL (BCP), to load data into staging instead of LKM SQL to SQL? What do I have to change in the staging area then? Loading data using LKM SQL to SQL seems quite slow.

    2. If it is mandatory to use LKM SQL to SQL, can someone please tell me which parameters I can change to make it faster?

    3. Is it compulsory to use the SUNOPSIS MEMORY ENGINE when loading data from SQL Server to Essbase?

    Thank you...

    (1) Yes, I highly recommend looking at using a KM which uses native database technology, as these will usually be more efficient than the generic KMs (like LKM SQL to SQL), especially when large volumes of data are involved. Your staging area will change depending on where you stage the data; for example, if you are using a SQL Server specific KM such as LKM MSSQL to MSSQL (BCP), you must have a staging area available on a MSSQL database and have access to the BCP utility.

    (2) It is not mandatory to use that KM; you can use any of the KMs supported by your database technology.

    (3) It is absolutely not obligatory to use the SUNOPSIS MEMORY ENGINE. It should only be used when you have relatively small amounts of data, since all the processing happens in memory, or when you have no other relational technology to perform the staging on. However, in your case, where you are processing such large volumes of data, you should stage on a physical database such as SQL Server or Oracle if one is available.

  • Problem loading data with SmartView

    Hello

    When I try to load data with Smart View (Lock & Send), I receive this message:

    "+ OLAP_error (1020011): Maximum number of lines [5000] exceeded +".

    Is there a row limit for loads in Smart View?

    Thank you.

    Virgil.

    Yes, there is a limit, but it can be extended.

    See: Maximum rows exceeded in SmartView

    Kind regards

    Cameron Lackpour

  • Error loading data with SQLLDR in Oracle 10g

    Hello

    Can anyone suggest what the problem is in the control file below, used for loading data via SQL*Loader?

    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    LOAD DATA
    INFILE 'D:\test\temt.txt'
    BADFILE 'test.bad'
    DISCARDFILE 'test.dsc'

    INTO TABLE "TEST"
    INSERT
    (SRNO INTEGER(7),
    PROD_ID INTEGER(10),
    PROMO_ID INTEGER(10),
    CHANNEL_ID INTEGER(10),
    UNIT_COST INTEGER(10),
    UNIT_PRICE INTEGER(10)
    )

    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    I'm trying to load the data into the SCOTT schema as user scott.

    Why am I getting these errors? Please see the attached log file.

    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    SQL * Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 14:43:35 2009

    Copyright (c) 1982, 2005, Oracle. All rights reserved.

    Control File:   D:\test\temt.ctl
    Data File:      D:\test\temt.txt
    Bad File:       test.bad
    Discard File:   test.dsc
    (Allow all discards)

    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:   none specified
    Path used:      Conventional

    Table "TEST", loaded from every logical record.
    Insert option in effect for this table: INSERT

       Column Name                  Position   Len  Term Encl Datatype
    ------------------------------ ---------- ----- ---- ---- ---------------------
    SRNO                                FIRST     7           INTEGER
    PROD_ID                              NEXT    10           INTEGER
    PROMO_ID                             NEXT    10           INTEGER
    CHANNEL_ID                           NEXT    10           INTEGER
    UNIT_COST                            NEXT    10           INTEGER
    UNIT_PRICE                           NEXT    10           INTEGER

    Record 1: Rejected - Error on table "TEST".
    ORA-01460: unimplemented or unreasonable conversion requested

    Record 2: Rejected - Error on table "TEST".
    ORA-01460: unimplemented or unreasonable conversion requested

    [... the same error repeats for records 3 through 51 ...]


    MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.

    Table "TEST":
      0 Rows successfully loaded.
      51 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.


    Space allocated for bind array:      3648 bytes (64 rows)
    Read buffer bytes:                1048576

    Total logical records skipped:          0
    Total logical records read:            64
    Total logical records rejected:        51
    Total logical records discarded:        0

    Run began on Fri Mar 20 14:43:35 2009
    Run ended on Fri Mar 20 14:43:43 2009

    Elapsed time was:     00:00:07.98
    CPU time was:         00:00:00.28



    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Here is how I run SQLLDR, plus the table details:


    --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    SQL> desc test
     Name                          Null?    Type
     ----------------------------- -------- ------------
     SRNO                                   NUMBER(7)
     PROD_ID                                NUMBER(10)
     PROMO_ID                               NUMBER(10)
     CHANNEL_ID                             NUMBER(10)
     UNIT_COST                              NUMBER(10)
     UNIT_PRICE                             NUMBER(10)




    The sqlldr process I use is:

    at the cmd prompt:

    d:\> sqlldr scott/tiger control=D:\test\temt.ctl

    SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 15:55:50 2009

    Copyright (c) 1982, 2005, Oracle. All rights reserved.

    Commit point reached - logical record count 64

    -----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    I even tried a few other examples:

    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Which of the control files below makes sense?

    -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    -- 1

    LOAD DATA
    INFILE 'D:\test\temt.txt'
    BADFILE 'test.bad'
    DISCARDFILE 'test.dsc'

    INTO TABLE "TEST"
    INSERT
    FIELDS TERMINATED BY ','

    (SRNO INTEGER(7),
    PROD_ID INTEGER(10),
    PROMO_ID INTEGER(10),
    CHANNEL_ID INTEGER(10),
    UNIT_COST INTEGER(10),
    UNIT_PRICE INTEGER(10)
    )





    -- 2

    LOAD DATA
    INFILE 'D:\test\temt.txt'
    BADFILE 'test.bad'
    DISCARDFILE 'test.dsc'

    INTO TABLE "TEST"
    INSERT
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'

    (SRNO INTEGER(7),
    PROD_ID INTEGER(10),
    PROMO_ID INTEGER(10),
    CHANNEL_ID INTEGER(10),
    UNIT_COST INTEGER(10),
    UNIT_PRICE INTEGER(10)
    )




    For control file 1 I get the error below:

    D:\> sqlldr scott/tiger control=D:\test\temt.ctl

    SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:36 2009

    Copyright (c) 1982, 2005, Oracle. All rights reserved.

    SQL*Loader-350: Syntax error at line 8.
    Expecting "(", found "FIELDS".
    FIELDS TERMINATED BY ','
    ^




    And for control file 2 I get the error below:

    D:\> sqlldr scott/tiger control=D:\test\temt.ctl

    SQL*Loader: Release 10.2.0.1.0 - Production on Fri Mar 20 16:39:22 2009

    Copyright (c) 1982, 2005, Oracle. All rights reserved.

    SQL*Loader-350: Syntax error at line 8.
    Expecting "(", found "FIELDS".
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    ^
    ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

    Hello,

    This should help you. INTEGER(7) in your control file means a 7-byte binary integer, but your text file contains character data, so every record fails conversion; use INTEGER EXTERNAL so SQL*Loader reads the numbers in character form:

    LOAD DATA
    INFILE 'D:\test\temt.txt'
    BADFILE 'test.bad'
    DISCARDFILE 'test.dsc'
    INSERT
    INTO TABLE "TEST"
    FIELDS TERMINATED BY ','
    (SRNO INTEGER EXTERNAL ,
    PROD_ID INTEGER EXTERNAL,
    PROMO_ID INTEGER EXTERNAL,
    CHANNEL_ID INTEGER EXTERNAL,
    UNIT_COST INTEGER EXTERNAL,
    UNIT_PRICE INTEGER EXTERNAL
    )
    

    Thank you

  • view the data in SQL

    I was able to use ASP to retrieve data from a SQL database using something like the code below:

    SQL = "SELECT PageName",

    SQL = SQL & "CONVERT (NUMERIC (6,2), AVG(Rating * 1.00))" AVERAGE ".

    SQL = SQL & 'COUNT (Rating) AS Total',

    SQL = SQL & "SUM(CASE WHEN Rating = 1 THEN 1 ELSE 0 END) AS [Star1Total]"

    SQL = SQL & "SUM(CASE WHEN Rating = 2 THEN 1 ELSE 0 END) AS [Star2Total]"

    SQL = SQL & "SUM(CASE WHEN Rating = 3 THEN 1 ELSE 0 END) AS [Star3Total]"

    SQL = SQL & "SUM(CASE WHEN Rating = 4 THEN 1 ELSE 0 END) AS [Star4Total]"

    SQL = SQL & "SUM(CASE WHEN Rating = 5 THEN 1 ELSE 0 END) AS [Star5Total].

    SQL = SQL & "FROM [SDBI]. [dbo]. [GnieRatePage] "

    SQL = SQL & "GROUP BY PageName".

    SQL = SQL & "ORDER BY PageName".

    I then display it on the page:

    Response.Write(Recordset("PageName"))

    What I need is to pass this data to Flash and have Flash display the ratings. How do I do that by way of ASP?

    Thank you

    I may not have the proper syntax for writing variable/value pairs with ASP, but if that is what you output, use:

    var myTextLoader:URLLoader = new URLLoader();

    myTextLoader.dataFormat = URLLoaderDataFormat.VARIABLES;

    myTextLoader.addEventListener(Event.COMPLETE, onLoaded);

    function onLoaded(e:Event):void
    {
        for (var s:String in e.target.data)
        {
            trace(s, e.target.data[s]);
        }
    }

    myTextLoader.load(new URLRequest("read_page_rating.asp"));

  • Problem with SQL Developer capture

    Hi guys,

    I have been facing a problem with SQL Developer for the last 4 days. I am using it to migrate a SQL Server database to Oracle. It worked well before, and I had already migrated some tables. But a few days ago I suddenly hit an error: I am unable to capture objects. When I try to capture the SQL Server database or its tables, the progress dialog box is displayed as usual, but it suddenly gets stuck, and the green light in the upper left corner turns red. At that point the progress message at the bottom of the box says 'stored jdbc:jtds:sqlserver://<hostname>:<port>/<database name>'. If I use another Oracle server, then I can capture objects.

    Because of this error I am not able to move forward. I have tried to solve this problem in many ways, but nothing helps! Please help me solve this problem, as it's an emergency.

    Thanks in advance.

    RAM.

    Published by: Ram Dhilip on March 22, 2010 08:16

    Published by: Ram Dhilip on March 22, 2010 08:16

    Published by: Ram Dhilip on March 22, 2010 08:19

    Published by: Ram Dhilip on March 22, 2010 08:34

    You can use the offline capture method:
    Click Tools -> Migration -> Third Party Database Offline Capture -> Create Database Capture Scripts.
    Now select the output directory for these scripts and the source database, and make sure you have selected Batch mode. Once this is done, copy all the scripts to a machine with a SQL Server client installed, or run the export on the SQL Server machine directly.
    Run the script and you will get 2 directories with a couple of DAT files in them. Move ALL of this to the SQL Developer machine and load it into SQL Developer:
    Click Tools -> Migration -> Third Party Database Offline Capture -> Load Database Capture Script Output.

  • Separation of data with multiple headers

    Hello

    I have a data set that has 6 columns, with headers, containing information for a certain cycle (the data file is attached). About 200 lines later I have a different set of 6 columns of data for cycle 2, preceded by a blank line and a new header, and another 400 lines after that I have data for cycle 5, again with a blank line and a new header. This pattern repeats throughout the data set.

    I need a way to separate this data so that I can plot the different cycles. When I import this data set into DIAdem with the wizard, it does not recognize the blank lines and the new headers. It labels the blank lines and headers "NO value", so there are discontinuities in the data. Is there a way to separate the cycles in this data set in DIAdem?

    For example, I would have lines 15 thru 220, lines 225 thru 630, lines 635 thru 1046, lines 1051 thru 1462, etc. This way I can plot the different cycles against the execution time.

    Hi wils01,

    Here's a script I created that loads your data file with each cycle in a separate group in the Data Portal.

    Brad Turpin

    DIAdem Product Support Engineer

    National Instruments
