Data Load Wizard

In Oracle APEX, is it feasible to change the following default behaviour?

1. Can we change the Data Load Wizard from insert-only to insert/update functionality, based on the source table?

2. Is it possible to add a validation: if the count of records being loaded is less than the count in the target table, the user should get a choice to continue with the insert or cancel the data load process.

I use APEX 5.0

Please advise on these 2 points.

Hi Sudhir,

I'll answer your questions below:

(1) Yes, the Data Load Wizard can both insert and update records.

This is the default behaviour: if you choose the right columns for detecting duplicate records, you will be able to see which records are new and which ones will be updated.

(2) This will be a little trickier, but you can do it using the underlying collections. The Data Load Wizard uses several collections to perform its operations; in the first step, all of the user's records are loaded into the collection "CLOB_CONTENT". By checking this against the number of records in the underlying table, you can easily add a validation before moving from step 1 to step 2. A sketch of such a validation is shown below.
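For example, a page validation of type "PL/SQL Function Returning Boolean" on the first wizard page could compare the two counts (a minimal sketch only; MY_TARGET_TABLE is a placeholder for your real table, and the collection name is the one mentioned above):

declare
  l_coll_count number;
  l_tab_count  number;
begin
  -- rows staged by the wizard in its collection (name as described above)
  select count(*)
    into l_coll_count
    from apex_collections
   where collection_name = 'CLOB_CONTENT';

  -- rows already present in the target table (placeholder name)
  select count(*)
    into l_tab_count
    from my_target_table;

  -- pass the validation only when at least as many rows are being loaded
  -- as already exist; otherwise the user is stopped and can cancel
  return l_coll_count >= l_tab_count;
end;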

Kind regards

Patrick

Tags: Database

Similar Questions

  • Forcing data load errors when loading Essbase non-leaf data

    Hi all

    I was wondering if anyone has experience forcing data load errors when a load rule tries to push data to non-leaf members in an Essbase cube.

    I obviously have error handling at the ETL level to prevent non-leaf members from reaching the load, but I would like the additional error handling as well (so not just fixing the errors upstream).

    I'd much prefer the errors to be rejected and shown in the log, rather than being silently absorbed by aggregation in the background.

    Have you tried creating a security filter for the user that runs the load, allowing write access only at level 0 and nothing higher?

  • Data loading convert timestamp

    I use APEX 5.0 and the Data Load Wizard to transfer data from Excel.

    I defined Update_Date as TIMESTAMP(2), and the data in Excel is 2015/06/30 12:21:57.

    NLS_TIMESTAMP_FORMAT is DD/MM/RR HH12:MI:SSXFF AM

    I created a transformation rule for update_date as a PL/SQL function body, as below:

    declare
      l_input_from_csv varchar2(25) := to_char(:update_date, 'MM/DD/YYYY HH12:MI:SS AM');
      l_date timestamp;
    begin
      l_date := to_timestamp(l_input_from_csv, 'MM/DD/YYYY HH12:MI:SS AM');
      return l_date;
    end;

    I keep getting a transformation rule error. If I don't create a transformation rule, it complains about an invalid month.

    Please help me to fix this.

    Thank you very much in advance!

    Hi DorothySG,

    DorothySG wrote:

    Please test the data load on the demo application.

    I have pasted a few examples into the page instructions; you can use copy and paste, and it will give the same result.

    If you change the comma separator ',' to anything else, you will get the message "do not load".

    Check your application 21919. I changed the transformation rule:

    Data is loading properly.
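    For reference, a corrected rule along these lines should work (a minimal sketch; it assumes the Excel values arrive as text in the form 2015/06/30 12:21:57, so the format mask is an assumption rather than the exact rule used in application 21919):

    declare
      l_date timestamp;
    begin
      -- parse the incoming text directly; the mask must match the source data,
      -- not NLS_TIMESTAMP_FORMAT (source format assumed from the question)
      l_date := to_timestamp(:update_date, 'YYYY/MM/DD HH24:MI:SS');
      return l_date;
    end;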

    Kind regards

    Kiran

  • Move rejected records to a table during a data load

    Hi all

    When I run my interfaces I sometimes get errors caused by invalid records; I mean, the content of some fields is not valid.

    I would like to move these invalid records to another table while the data load continues, so as not to interrupt the loading of data.

    How can I do this? Are there any examples I could follow?

    Thanks in advance

    Regards

    Hi Alvaro,

    Here you can find different ways to achieve this goal and choose according to your requirement:

    https://community.Oracle.com/thread/3764279?SR=Inbox

  • The data load has run in ODI but some interfaces still show as running in the Operator tab

    Hi Experts,

    I'm working on customizations, and we run the incremental load every day. The data load completes successfully, but the status icons in the Operator tab show some interfaces still running.

    Could you please explain why these interfaces still show as running in the Operator tab? Thanks in advance; your valuable suggestions are much appreciated.

    Kind regards

    REDA

    These are what we call stale sessions, and they can be removed by restarting the agent.

    You can also manually clean up the stale sessions in the Operator.

  • ASO - ignore zeros and missing values during data loads

    Hello

    There is an option to ignore zero values and missing values in the dialog box when loading data into an ASO cube interactively via EAS.

    Is there an option to specify the same in the MaxL import data command? I couldn't find one in the technical reference.

    I have 12 months across the columns in the data feed. At least a quarter of my data is zeros. Ignoring zeros keeps the cube small and fast.

    We are on 11.1.2.2.

    Appreciate your thoughts.

    Thank you

    Ethan.

    The thing is, it's hidden in the Alter Database (Aggregate Storage) command, where you create the data load buffer. If you are not sure what a data load buffer is, see "Loading Data Using Buffers".

  • Data load error 10415 when exporting data to an Essbase EPMA app

    Hi Experts,

    Can someone help me solve this issue? I am on FDM 11.1.2.3.

    I'm trying to export data from FDM to an EPMA Essbase application.

    Import and Validate worked fine, but when I click Export it fails.

    I am getting the error below:

    Failed to load data

    10415 - data loading errors

    Proceedings of Essbase API: [EssImport] threw code: 1003029 - 1003029

    Encountered in the spreadsheet file (C:\Oracle\Middleware\User_Projects\epmsystem1\EssbaseServer\essbaseserver1\app\Volv formatting

    I have these dimension members:

    1. Account
    2. Entity
    3. Scenario
    4. Year
    5. Period
    6. Regions
    7. Products
    8. Acquisitions
    9. Servicesline
    10. Functionalunit

    When I click the Export button it fails.

    One more thing I checked: the .DAT file, but this file is empty.

    Thanks in advance

    Hello

    I was facing a similar problem.

    In my case I was loading data to a classic Planning application. When all the dimension members are ignored in the mapping for the combination you are trying to load, and you click Export, you get the same message and an empty .DAT file is created.

    You can check this

    Thank you

    Praveen

  • FDMEE to Planning data loaded successfully but not able to see the data in Planning - export shows the fish icon in FDMEE

    Hi all

    We loaded data to Planning via FDMEE; the data loaded successfully, but we are not able to see the data in the Planning application.

    In the process log, I can see it mentions that the data was loaded into the cube. Please advise on this.

    Thank you

    Roshi

    Two things:

    - I wasn't talking about the method you use to import data, but the one used to export data. You are using the SQL method. Go to Target Applications, select your Planning/Essbase application, and set the load method to File. Save your settings.

    2014-06-19 12:26:50,692 INFO [AIF]: Locked rules file AIF0028 successfully
    2014-06-19 12:26:50,692 INFO [AIF]: Loading data into the cube by launching the rules file...
    2014-06-19 12:26:50,692 INFO [AIF]: Loading data into the cube using sql...
    2014-06-19 12:26:50,801 INFO [AIF]: The data has been loaded by the rules file.
    2014-06-19 12:26:50,801 INFO [AIF]: Unlocking rules file AIF0028
    2014-06-19 12:26:50,801 INFO [AIF]: Successfully unlocked rules file AIF0028

    - Then export again and review the .DAT file in the Outbox folder. Is it empty?

    - You need to add a new dimension to your import format (Add Dimension > Currency). Then add Local as the expression.

    - Import, validate and export the data.

  • Data loader detects NUMBER instead of VARCHAR2

    Hello

    In my database, I have a table that stores information about vehicle components. I created a new data load process with the manufacturer and reference fields as the unique keys.

    (For example: manufacturer 'BOSCH', reference '0238845')

    When the system runs the data mapping, it detects the reference column as a NUMBER and deletes the leading zero: '0238845' -> 238845.

    How can I solve this problem?

    Kind regards

    Have you tried adding a transformation in the data loader to cast it to a character? See the sketch below.
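    For instance, a column transformation rule along these lines could keep the value as text and preserve the leading zero (a minimal sketch only; the :reference bind name and the 7-character width are assumptions for illustration):

    declare
      l_reference varchar2(50);
    begin
      -- keep the value as a string and restore the leading zero that the
      -- NUMBER detection stripped (assumed fixed width of 7 characters)
      l_reference := lpad(trim(:reference), 7, '0');
      return l_reference;
    end;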

    Thank you

    Tony Miller

    Software LuvMuffin

  • ODI - SQL for Hyperion Essbase data loading

    Hello

    We have created a view in SQL Server that contains our data. The view currently has every year and every period from Jan 2011 to present. Each period is about 300,000 records. I want to load only one period at a time, for example May 2013. Currently we use ODBC through a data load rule, but the customer wants to use ODI to be consistent with how the dimension metadata builds are done. Here's the SQL on the view, which works very well. Is there a way I can run this SQL in an ODI interface so it pulls only what I declare in the WHERE clause? If so, where do I do it?

    SELECT
      CATEGORY, YEAR, LOCATION, SCENARIO, DEPT, PROJECT, EXPCODE, PERIOD, ACCOUNT, AMOUNT
    FROM
      PS_LHI_HYP_PRJ_ACT
    WHERE
      YEAR >= '2013' AND PERIOD = 'MAY'
    ORDER BY
      CATEGORY ASC, YEAR ASC, LOCATION ASC, SCENARIO ASC, DEPT ASC, PROJECT ASC, EXPCODE ASC, PERIOD ASC, ACCOUNT ASC;

    Hello

    Simply use the following KM to load the data - IKM SQL to Hyperion Essbase (DATA) - in an ODI interface that has the view you created as the source model. You can add filters on the source that are populated dynamically by ODI variables to build the WHERE clause based on the month and year. Make sure you specify a data load rule for the load method in the KM options. An example filter is sketched below.
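    As an illustration, the filter on the source datastore could look something like this (a sketch only; the #HYP_LOAD project code and the v_YEAR / v_PERIOD variable names are assumptions, and the column names follow the view above):

    -- filter expression on the source datastore in the ODI interface
    PS_LHI_HYP_PRJ_ACT.YEAR >= '#HYP_LOAD.v_YEAR'
    AND PS_LHI_HYP_PRJ_ACT.PERIOD = '#HYP_LOAD.v_PERIOD'

    At execution time, a package sets v_YEAR and v_PERIOD (for example to '2013' and 'MAY') before running the interface, so only that slice of the view is extracted.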

  • Essbase in MSCS Cluster (metadata and data load failures)

    Hello

    If there is a power failure on the active node of the Essbase cluster (call it node A) and the cube needs to be rebuilt on cluster node B, how will the cube be rebuilt on node B?

    What will orchestrate the activities required to rebuild the cube? Both Essbase nodes are set up on Microsoft Cluster Services.

    In essence, I want to know

    (A) How do we handle a metadata load that failed on Essbase node 1 so that it is handled on node 2?

    (B) Does the running metadata/data load session continue on the second Essbase node when the first Essbase node fails?

    Thank you for your help in advance.

    Kind regards

    UB.

    If the failover occurs, all connections on the active node will be lost as Essbase restarts on the second node. Just treat it the same as if you had restarted the Essbase service while a metadata load was running: it would fail at the point where Essbase went down.

    Cheers

    John

    http://John-Goodwin.blogspot.com/

  • Problems with JSON data loading

    Hello

    I have followed Simon Widjaja's (EDGEDOCKS) YouTube lesson on loading external JSON data, but I am not even able to log the data to the console.

    I get this error: "Javascript error in event handler! Event Type = element".

    content.json is located in the same folder. The data in it is very simple:

    [
      {
        "title": "TITLE 1",
        "description": "DESCRIPTION 1"
      },
      {
        "title": "TITLE 2",
        "description": "DESCRIPTION 2"
      }
    ]

    And here's the code in edgeActions.js:

    (function ($, Edge, compId) {

      var Composition = Edge.Composition, Symbol = Edge.Symbol; // aliases for commonly used Edge classes

      //Edge symbol: "stage"
      (function (symbolName) {

        Symbol.bindElementAction(compId, symbolName, "document", "compositionReady", function (sym, e) {
          // load external json data
          $.ajax({
            type: 'GET',
            cache: false,
            url: 'content.json',
            dataType: 'json',
            success: function (data) { console.log("data: ", data); },
            error: function () { console.log("something went wrong"); }
          });
        });
        //Edge binding end

      })("stage");
      //Edge symbol end: "stage"

    })(window.jQuery || AdobeEdge.$, AdobeEdge, "EDGE-11125477");

    I also tried $.getJSON as mentioned in the YouTube video.

    Please note: I do not even get 'something went wrong' logged to the console.

    I use the free trial version. Is this a limitation of the free trial version?

    Well, same question as here: loading external data using ajax

    $.ajax() or $.getJSON() cannot run because the jQuery file is missing.

    You must add the jQuery file as shown below:

    See: http://jquery.com/download/

    Note: without loading the jQuery file, you can use these functions instead: the Adobe Edge Animate CC JavaScript API.

  • Which LKM and IKM for fast data loading between MSSQL 2005 and Oracle 11g?

    Hello

    Can anyone help me decide which LKMs and IKMs are best for data loading between MSSQL and Oracle?

    The staging area is Oracle. I need to load around 400 million rows from MSSQL to Oracle 11g.

    Best regards
    Muhammad

    "LKM MSSQL to ORACLE (BCP SQLLDR)" may be useful in your case which uses BCP and SQLLDR to extract and laod of MSSQL and Oracle database.

    Please see the details on KMs at http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/ms_sqlserver.htm#BGBJBGCC

  • APEX data load table configuration missing when importing into a new workspace

    Hi, has anyone seen this issue before, and is there a workaround?

    I export/import my application into a different workspace, and everything works fine except for one data load table.

    In my application I use data load tables, and it seems that they are not set up properly for the data load table object. This causes my application to fail when a user tries to upload a text file.
    Does anyone have a workaround other than recreating the data load table object?

    Breadcrumb: Shared Components -> Data Load Tables -> Add/Modify Data Load Table

    Before exporting, the app in workspace OOS displays:
    Unique column 1 - CURRENCY_ID (Number)
    Unique column 2 - MONTH (Date)

    When I import the app into the new workspace (OOS_UAT), the data type is missing:
    Unique column 1 - CURRENCY_ID
    Unique column 2 - MONTH

    When I import the app into the same workspace (OOS), I do not see this problem.

    APEX version: Application Express 4.1.1.00.23

    Hi all

    If you are running 4.1.1, this was bug 13780604 (data load wizard failed when exported to another workspace) and it has been fixed. You can download the patch for 13780604 (support.us.oracle.com) associated with 4.1.1.

    Kind regards
    Patrick

  • Do we need to re-create data load rules if we move from EBS 11 to 12?

    If so, please explain why. Thank you.

    If you switch from EBS 11 to EBS 12, you need to create a new source system entry in ERPi.

    Once the new source system is created, you then need to initialize the source system in ERPi.

    From there, you need to associate it with an import format and locations in ERPi, and your data load rules are then based on the location.
