Data loading options

Hi all

Are there any options other than SQL*Loader and external tables to load external data into Oracle?

Thank you
Rod.

SamFisher wrote:
Hi all

Are there any options other than SQL*Loader and external tables to load external data into Oracle?

Thank you
Rod.

There are many ways to read the data.

SQL*Loader is an external command-line utility.
External tables are an internal mechanism that uses the same "engine" as the SQL*Loader utility, so the data can be read as if it were already in a table (this is why they can be used through SQL).
You can also use the UTL_FILE package to read files line by line, or as a low-level byte mechanism.
There are also methods such as reading a BFILE, as shown by... (goes off to find an example)... here:

{message:id=9732075}

There are also other built in packages that contain functions or procedures that offer a form of file reading and writing.
And you can also create your own by using Java in the database.

So take your pick... many choices... it all depends on what you want to do.
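To make the external-table option concrete, here is a minimal sketch (the directory path, file name, and column layout are assumptions for illustration):

```sql
-- Assumes a directory object pointing at the folder that holds emp.csv
CREATE DIRECTORY ext_dir AS '/data/in';

CREATE TABLE emp_ext (
  empno  NUMBER(4),
  ename  VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY ext_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('emp.csv')
);

-- Queried like any other table, which is the point made above
SELECT * FROM emp_ext;
```

Once created, the file can be queried, joined, and filtered like any table, without a separate load step.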

Tags: Database

Similar Questions

  • Schema name is not displayed in the data loading

    Hi all

    I'm trying to load a CSV file using the Oracle APEX data loading option. The options used are a new (.csv) file upload and a table. On the data load page, the schema select list does not show my current schema, because of which I cannot upload the CSV file.
    Can someone please help with that?


    I use Oracle APEX 4.1.1.

    Regards,
    Rajendrakumar.P

    Raj,

    If it works on apex.oracle.com (4.2) and not in your case (4.1.1), my suspicion is that this is a bug that has been fixed in APEX 4.2. Apart from upgrading to APEX 4.2, I'm not sure there really is a viable alternative.

    Thank you

    -Scott-

    http://spendolini.blogspot.com
    http://www.enkitec.com

  • ASO - ignore zeros and missing values on data loads

    Hello

    There is an option to ignore zero values & missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the technical reference.

    I have 12 months in the columns of my data feed. At least 1/4 of my data is zeros. Ignoring the zeros keeps the cube small and fast.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is that it's hidden in the alter database (aggregate storage) command, where you create the data load buffer.  If you are not sure what a data load buffer is, see "Loading Data Using Buffers".
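    For the MaxL side, a hedged sketch of what that looks like (the application/database names and file paths are placeholders):

    ```
    alter database AsoApp.AsoDb initialize load_buffer with buffer_id 1
      property ignore_missing_values, ignore_zero_values;

    import database AsoApp.AsoDb data from data_file '/tmp/data.txt'
      to load_buffer with buffer_id 1 on error write to '/tmp/err.txt';

    import database AsoApp.AsoDb data from load_buffer with buffer_id 1;
    ```

    The properties set on the buffer, rather than on the import statement itself, are what make the zeros and #Missing values get skipped.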

  • Incremental data loads with FDM/ERPi from an Oracle data source

    Hello

    I use ERPi 11.1.2.1. In the workspace, it is possible to set the data load rule option to be snapshot or incremental. However, I use FDM/ERPi to load data from Oracle GL into Essbase. Is it possible for me to set up FDM so that the data load rules run incremental loads? Could there be a parameter in the ERPi source adapter?

    Thanks for any information you could provide.

    Yes, in the ERPi source adapter there is an option for "Data Load Method" that will allow you to define how the DL rule is run. By default it is "FULL REFRESH" but it can be changed.

    (A) Connect to the application via the Workbench and the source system adapters.
    (B) Right-click the ERPI source adapter and choose "Options".

    You will see a load method option, and it will be set to the full refresh value; choose the value you want from the drop-down menu and save.

  • Oracle Data Loader On Demand on EHA Pod

    Oracle Data Loader does not work correctly.
    I downloaded it from Staging (EHA Pod).
    And I did the following work.

    1. Go to the "config" folder and update 'OracleDataLoaderOnDemand.config':
    hosturl = https://secure-ausomxeha.crmondemand.com
    2. Go to the "sample" folder and change the Owner_Full_Name in 'account-insert.csv'.

    Then, at the command prompt, run the batch file.
    It runs successfully, but the records are not inserted on the EHA Pod. The records exist on the EGA Pod.
    Here is the log.
    Does the data loader only target the EGA Pod? Could you please give me some advice?


    [2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Execution begin.
    [2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Loaded configuration list: {sessionkeepchkinterval=300, maxthreadfailure=1, testmode=production, logintimeoutms=180000, csvblocksize=1000, maxsoapsize=10240, impstatchkinterval=30, numofthreads=1, hosturl=https://secure-ausomxeha.crmondemand.com, maxloginattempts=1, routingurl=https://sso.crmondemand.com, manifestfiledir=.\Manifest\}
    [2012-09-19 14:49:55,281] DEBUG - BulkOpsClient.main() [main]: Loaded options list: {datafilepath=sample/account-insert.csv, waitforcompletion=False, clientlogfiledir=, datetimeformat=usa, operation=insert, username=XXXX/XXXX, help=False, disableimportaudit=False, clientloglevel=detailed, mapfilepath=sample/account.map, duplicatecheckoption=externalid, csvdelimiter=,, importloglevel=errors, recordtype=account}
    [2012-09-19 14:49:55,296] DEBUG - BulkOpsClientUtil.getPassword() [main]: Entering.
    [2012-09-19 14:49:59,828] DEBUG - BulkOpsClientUtil.getPassword() [main]: Exiting.
    [2012-09-19 14:49:59,828] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Entering.
    [2012-09-19 14:49:59,937] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Host lookup request to send: https://sso.crmondemand.com/router/GetTarget
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Lookup returned: <?xml version="1.0" encoding="UTF-8"?>
    <HostUrl>https://secure-ausomxega.crmondemand.com</HostUrl>
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Successfully extracted host URL: https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.lookupHostURL() [main]: Exiting.
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Entering.
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL from the routing app = https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL from the config = https://secure-ausomxeha.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Updated the config file: .\config\OracleDataLoaderOnDemand.config
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Host URL set to https://secure-ausomxega.crmondemand.com
    [2012-09-19 14:50:03,953] DEBUG - BulkOpsClientUtil.determineWSHostURL() [main]: Exiting.
    [2012-09-19 14:50:03,953] INFO - [main] Attempting to log in...
    [2012-09-19 14:50:10,171] INFO - [main] Successfully logged in as: XXXX/XXXX
    [2012-09-19 14:50:10,171] DEBUG - BulkOpsClient.doImport() [main]: Execution begin.
    [2012-09-19 14:50:10,171] INFO - [main] Requesting Oracle Data Loader On Demand import validation...
    [2012-09-19 14:50:10,171] DEBUG - FieldMappingManager.parseMappings() [main]: Execution begin.
    [2012-09-19 14:50:10,171] DEBUG - FieldMappingManager.parseMappings() [main]: Execution complete.
    [2012-09-19 14:50:11,328] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Submitting BulkOpImportGetRequestDetail WS call
    [2012-09-19 14:50:11,328] INFO - [main] A SOAP request was sent to the server to create the import request.
    [2012-09-19 14:50:13,640] DEBUG - SOAPImpRequestManager.sendImportGetRequestDetail() [Thread-3]: SOAP request sent successfully and a response was received
    [2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: BulkOpImportGetRequestDetail WS call done
    [2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: SOAP response status code = OK
    [2012-09-19 14:50:13,640] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Going to sleep for 300 seconds.
    [2012-09-19 14:50:20,328] INFO - [main] A response to the SOAP request to create the import request on the server has been received.
    [2012-09-19 14:50:20,328] DEBUG - SOAPImpRequestManager.sendImportCreateRequest() [main]: SOAP request sent successfully and a response was received
    [2012-09-19 14:50:20,328] INFO - [main] Oracle Data Loader On Demand import validation PASSED.
    [2012-09-19 14:50:20,328] DEBUG - BulkOpsClient.sendValidationRequest() [main]: Execution complete.
    [2012-09-19 14:50:20,343] DEBUG - ManifestManager.initManifest() [main]: Creating manifest directory: .\\Manifest\\
    [2012-09-19 14:50:20,343] DEBUG - BulkOpsClient.submitImportRequest() [main]: Execution begin.
    [2012-09-19 14:50:20,390] DEBUG - BulkOpsClient.submitImportRequest() [main]: Sending CSV data segments.
    [2012-09-19 14:50:20,390] DEBUG - CSVDataSender.CSVDataSender() [main]: CSVDataSender will use 1 thread.
    [2012-09-19 14:50:20,390] INFO - [main] Submitting Oracle Data Loader On Demand import request with the following request id: AEGA-FX28VK...
    [2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: Creating thread 0
    [2012-09-19 14:50:20,390] INFO - [main] Import Request Submission Status: Started
    [2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: Starting thread 0
    [2012-09-19 14:50:20,390] DEBUG - CSVDataSender.sendCSVData() [main]: There are pending requests. Going to sleep.
    [2012-09-19 14:50:20,406] DEBUG - CSVDataSenderThread.run() [Thread-5]: Thread 0 submitting CSV data segment: 1 of 1
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A response to the import data SOAP request sent to the server has been received.
    [2012-09-19 14:50:24,328] DEBUG - SOAPImpRequestManager.sendImportDataRequest() [Thread-5]: SOAP request sent successfully and a response was received
    [2012-09-19 14:50:24,328] INFO - [Thread-5] A SOAP request containing the import data was sent to the server: 1 of 1
    [2012-09-19 14:50:24,328] DEBUG - CSVDataSenderThread.run() [Thread-5]: There are no more pending requests to be picked up by Thread 0.
    [2012-09-19 14:50:24,328] DEBUG - CSVDataSenderThread.run() [Thread-5]: Thread 0 finishing now.
    [2012-09-19 14:50:25,546] INFO - [main] Import Request Submission Status: 100.00%
    [2012-09-19 14:50:26,546] INFO - [main] Oracle Data Loader On Demand import submission completed successfully.
    [2012-09-19 14:50:26,546] DEBUG - BulkOpsClient.submitImportRequest() [main]: Execution complete.
    [2012-09-19 14:50:26,546] DEBUG - BulkOpsClient.doImport() [main]: Execution complete.
    [2012-09-19 14:50:26,546] INFO - [main] Attempting to log out...
    [2012-09-19 14:50:31,390] INFO - [main] XXXX/XXXX is now logged out.
    [2012-09-19 14:50:31,390] DEBUG - ODWSSessionKeeperThread.Run() [Thread-3]: Interrupted.
    [2012-09-19 14:50:31,390] DEBUG - BulkOpsClient.main() [main]: Execution complete.

    Hello

    by default the Data Loader points to the production environment, regardless of whether you downloaded it from staging or production.
    To change the pod, edit the configuration file and set the content below:

    hosturl = https://secure-ausomxeha.crmondemand.com
    routingurl = https://secure-ausomxeha.crmondemand.com
    testmode = debug

  • SQL*Loader option

    Hi all

    First I created a table like:
    create table load_sql (id number(4), ins_time date);

    Then I want to load the data below from a text file.

    I have a data text file like:

    10,23:05:07
    20,23:54:34
    30,13:09:20
    .
    .
    .
    .

    Then I saved the file in a path like C:\temp_file\datatext.txt

    Then I created the ctl file like:

    LOAD DATA
    INFILE 'C:\temp_file\datatext.txt'
    BADFILE 'C:\temp_file\badtext.txt'
    INTO TABLE LOAD_SQL
    FIELDS TERMINATED BY ','
    (
    ID,
    INS_TIME "TO_DATE(:INS_TIME,'HH24:MI:SS')"
    )

    Then I saved it as "C:\temp_file\controltext.ctl".

    Then I ran the sqlldr command to load the data.

    The data loaded successfully into the table, but when I ran
    select * from load_sql;

    this is what I see:

    10 01-JUN-11
    20 01-JUN-11
    30 01-JUN-11

    Then I ran

    select to_char(ins_time,'HH24:MI:SS') from load_sql;
    and I got the original time values.

    On this basis, my question is: is there an option to store a particular time with sysdate, or with the date 01/01/01, in the ins_time column? I don't want it stored as 01-JUN-11.

    How can I do that?

    Can anyone please share the answer?

    866916 wrote:
    Can I add 00-00-00 as the date part here?

    Record 1: Rejected - Error on table LOAD_SQL, column INS_TIME.
    ORA-01847: day of month must be between 1 and last day of month

    >

    I think I can't, because dates start from JAN 01 - right?

    That's true...
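    One way to get a fixed date is to concatenate a constant date literal in front of the incoming time in the control file. A sketch, assuming you want every row stored on 01-JAN-2001 (pick whatever constant you prefer):

    ```
    LOAD DATA
    INFILE 'C:\temp_file\datatext.txt'
    BADFILE 'C:\temp_file\badtext.txt'
    INTO TABLE load_sql
    FIELDS TERMINATED BY ','
    (
      id,
      ins_time "TO_DATE('2001-01-01 ' || :ins_time, 'YYYY-MM-DD HH24:MI:SS')"
    )
    ```

    Without an explicit date part, TO_DATE defaults the day to the first of the current month, which is why the rows above all came out as 01-JUN-11.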

  • Update quantities available through data loader

    Dear all,

    I need to update the on-hand quantities of a number of items (5000 items).

    I hope I can do this with Data Loader, but which screen do I need to update the stock? Is that the way, or is there another way I can accomplish this stock update...

    Please advise...

    Thanks in advance...

    Hello
    Updating the quantity of 5000 items through Data Loader will not be a good option. Updating/loading data via Data Loader is good for fewer records only; when the volume is high, it is preferable to use an interface program/API, and that way you can save a lot of time and energy :)

    Why don't you use the approach below for updating on-hand quantity:

    1) Populate the on-hand quantity interface table (apps.mtl_transactions_interface) with the data for the items whose on-hand quantity you want to change.
    2) Then use the API below to update the on-hand quantity.

    apps.inv_txn_manager_pub.process_transactions(
        p_api_version      => p_api_version,
        p_init_msg_list    => p_init_msg_list,
        p_commit           => p_commit,
        p_validation_level => p_validation_level,
        x_return_status    => x_return_status,
        x_msg_count        => x_msg_count,
        x_msg_data         => x_msg_data,
        x_trans_count      => x_trans_count,
        p_table            => p_table,
        p_header_id        => p_header_id);

    This is a backend (on-hand API) suggestion.

    Updating the items' on-hand quantity this way will be relatively faster than Data Loader. Please take this as a suggestion; I'm in no way challenging your Data Loader approach. :)
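    A hedged sketch of step 1 (every id, flag, and quantity below is a placeholder; the exact columns required depend on the transaction type you use):

    ```sql
    -- Stage one on-hand adjustment in the interface table (placeholder values)
    INSERT INTO mtl_transactions_interface
           (transaction_interface_id, transaction_header_id,
            source_code, source_header_id, source_line_id,
            process_flag, transaction_mode,
            inventory_item_id, organization_id,
            transaction_type_id, transaction_quantity,
            transaction_uom, transaction_date)
    VALUES (mtl_material_transactions_s.NEXTVAL, :header_id,
            'ONHAND_FIX', 1, 1,
            1, 3,                      -- process_flag 1 = ready, mode 3 = background
            :item_id, :org_id,
            :txn_type_id, :qty_delta,  -- signed adjustment quantity
            'Ea', SYSDATE);
    ```

    After staging the rows, the process_transactions call shown above (or the background transaction manager) picks them up.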

    Kind regards
    S.P DASH

  • Forcing errors on non-leaf data loads in Essbase

    Hi all

    I was wondering if anyone has experience forcing data load errors when a load rule tries to push data into non-leaf members of an Essbase cube.

    I obviously have error handling at the ETL level to prevent non-leaf members from reaching the load, but I would like the additional error handling as well (so not only fixing the errors upstream).

    I'd much prefer the errors to be rejected and shown in the log, rather than being silently crushed by aggregation in the background.

    Have you tried creating a security filter for the user the load runs as, allowing write only at level 0 and no higher access?
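    A hedged MaxL sketch of that idea (the application, database, member specification and user name are all placeholders):

    ```
    create or replace filter Sample.Basic.lev0write
      write on '@LEVMBRS("Product",0)';

    grant filter Sample.Basic.lev0write to load_user;
    ```

    With only level-0 write access, rows targeting upper-level members should be rejected by the load and surface in the error log instead of disappearing into an aggregation.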

  • Data loading convert timestamp

    I'm using APEX 5.0 and the Data Load wizard to transfer data from Excel.

    I defined update_date as timestamp(2), and the data in Excel is 2015/06/30 12:21:57.

    NLS_TIMESTAMP_FORMAT is DD/MM/RR HH12:MI:SSXFF AM.

    I created a transformation rule for update_date with a PL/SQL function body as below:

    declare

    l_input_from_csv varchar2(25) := to_char(:update_date, 'MM/DD/YYYY HH12:MI:SS AM');

    l_date timestamp;

    begin

    l_date := to_timestamp(l_input_from_csv, 'MM/DD/YYYY HH12:MI:SS AM');

    return l_date;

    end;

    I keep getting a transformation rule error.  If I don't create a transformation rule, it complains about an invalid month.

    Please help me to fix this.

    Thank you very much in advance!

    Hi DorothySG,

    DorothySG wrote:

    Please test it on the data loading demo page.

    I pasted a few examples into the page instructions; you can use copy and paste, and it will give the same result.

    Please note that changing the comma separator ',' to anything else will give the message "do not load".

    Check your application 21919. I changed the transformation rule:

    Data is loading properly.
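    For reference, a minimal transformation rule matching the 2015/06/30 12:21:57 source format would look something like this (a sketch assuming :UPDATE_DATE still holds the raw text at that point, not necessarily the exact rule used in application 21919):

    ```sql
    begin
      -- parse the raw CSV text directly; no intermediate to_char needed
      return to_timestamp(:update_date, 'YYYY/MM/DD HH24:MI:SS');
    end;
    ```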

    Kind regards

    Kiran

  • Move rejected records to a table during a data load

    Hi all

    When I run my interfaces, I sometimes get errors caused by invalid records; I mean, some field contents are not valid.

    I would then like to move these invalid records to another table while the data load carries on, so as not to interrupt the loading of data.

    How can I do? There are examples that I could follow?

    Thanks in advance

    concerning

    Hi Alvaro,

    Here you can find different ways to achieve this goal and choose according to your requirement:

    https://community.Oracle.com/thread/3764279?SR=Inbox

  • Data load wizard

    In Oracle APEX, is there a way to change the default functionality?

    1. Can we convert the data load wizard from insert-only to insert/update functionality based on the source table?

    2. Possibility of a validation - if "record count < target table" is true, then the user should get a choice to continue with the insert or cancel the data loading process.

    I use APEX 5.0.

    Please advise on these 2 points.

    Hi Sudhir,

    I'll answer your questions below:

    (1) Yes, data loading can insert/update.

    That is the default behavior; if you choose the right columns for detecting duplicate records, you will be able to see which records are new and which are updated.

    (2) It will be a little tricky, but you can get there using the underlying collections. Data loading uses several collections to perform the operations, and on the first step we load all of the user's records into the "CLOB_CONTENT" collection. By checking this against the number of records in the underlying table, you can easily add a new validation between step 1 and step 2.

    Kind regards

    Patrick

  • The data load has run in ODI but some interfaces still show as running in the operator tab

    Hi Experts,

    I'm working on a customization, and we run the incremental load every day. The data load completed successfully, but the status icons in the operator tab show some interfaces still running.

    Could you please explain why the interfaces still show as running in the operator tab? Thanks in advance; your valuable suggestions are very helpful.

    Kind regards

    REDA

    That is what we call a stale session, and it can be removed by restarting the agent.

    You can also manually clean up expired sessions in the operator.

  • Data load failed with error 10415 when exporting data to an EPMA Essbase app

    Hi Experts,

    Can someone help me solve this issue? I am facing it on FDM 11.1.2.3.

    I'm trying to export data from FDM to an EPMA Essbase application.

    Import and validate worked fine, but when I click export, it fails.

    I am getting the error below:

    Failed to load data

    10415 - data load errors

    Essbase API procedure: [EssImport] threw code: 1003029 - 1003029

    Encountered formatting error in the spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv

    My dimension members are:

    1 account

    2 entity

    3 scenario

    4 year

    5 period

    6 regions

    7 products

    8 acquisitions

    9 Servicesline

    10 Functionalunit

    When I click the export button, it fails.

    I checked one thing more: the .DAT file, but this file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem.

    In my case I was loading data to a classic Planning application. When all the dimension members are ignored in the mapping for the combination you try to load, then when you click Export you will get the same message, and an empty .DAT file is created.

    You can check this

    Thank you

    Praveen

  • FDMEE data loaded successfully to Planning but data not visible in Planning - export fish icon shows in FDMEE

    Hi all

    We loaded data to Planning with FDMEE; the data loaded successfully, but we are not able to see the data in the Planning application.

    In the process log I can see it says the data was loaded into the cube. Please advise.

    Thank you

    Roshi

    Two things:

    - I wasn't talking about the method you import data with, but how you export data. You are using the SQL method. Go to Target Applications, select your Planning/Essbase application, and set the load method to file. Save your settings.

    2014-06-19 12:26:50,692 INFO [AIF]: Successfully locked rules file AIF0028

    2014-06-19 12:26:50,692 INFO [AIF]: Launching rules file to load data into the cube...

    2014-06-19 12:26:50,692 INFO [AIF]: Loading data into the cube using sql...

    2014-06-19 12:26:50,801 INFO [AIF]: The data has been loaded by the rules file.

    2014-06-19 12:26:50,801 INFO [AIF]: Unlocking rules file AIF0028

    2014-06-19 12:26:50,801 INFO [AIF]: Successfully unlocked rules file AIF0028

    - Then export again and review the .DAT file in the Outbox folder. Is it empty?

    - You need to add a new dimension to your import format (Add Dimension > Currency). Then add Local as the expression.

    - Import, validate and export the data again.

  • Data load detects NUMBER instead of VARCHAR2

    Hello

    In my database, I have a table that stores information about vehicle components. I created a new data load process with the manufacturer and reference fields as the unique key.

    (For example: manufacturer 'BOSCH', reference '0238845')

    When the system runs the data load mapping, it detects the reference column as a number and deletes the leading zero character: '0238845' -> 238845.

    How can I solve this problem?

    Kind regards

    Have you tried a transformation in the data load to cast it to a character?
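    If the reference is a fixed width, one hedged sketch of such a transformation rule is to pad the value back out (the width of 7 and the :REFERENCE bind name are assumptions for illustration):

    ```sql
    begin
      -- restore leading zeros dropped by the NUMBER detection
      return lpad(:reference, 7, '0');
    end;
    ```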

    Thank you

    Tony Miller

    Software LuvMuffin
