Calculation script loading data into every block?

Hello

I have a calculation script that is supposed to replace only existing values with 1000. It should do nothing if the value is 0 or #Missing, but I found that it actually loads numbers into every block regardless. Am I doing something wrong?

The script:

FIX (@RELATIVE(Dimension1, 0), @RELATIVE(Dimension2, 0))
FIX (&CURRENT_SCENARIO, &CURRENT_YEAR)
"ACC100" (
    IF (ACC100 != 0 OR ACC100 != #Missing)
        ACC100 = 1000;
    ENDIF
);
ENDFIX
ENDFIX


Any help will be appreciated.


Kind regards

You want to use AND, not OR.

As written, the condition always returns true: any value is always either not 0 or not #Missing, so the assignment runs on every block.

If you use AND, the assignment only runs when the value is both not 0 and not #Missing.
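A minimal corrected sketch, using the same FIX statements and member name as your script:

FIX (@RELATIVE(Dimension1, 0), @RELATIVE(Dimension2, 0))
FIX (&CURRENT_SCENARIO, &CURRENT_YEAR)
"ACC100" (
    /* only overwrite when a real value exists: not 0 AND not #Missing */
    IF (ACC100 != 0 AND ACC100 != #Missing)
        ACC100 = 1000;
    ENDIF
);
ENDFIX
ENDFIX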

Tags: Business Intelligence

Similar Questions

  • If I run a calc script to aggregate a BSO cube, does it lock and release each block within a few seconds?

    Hello

    If I run a calc script to aggregate a BSO cube, does it lock and release each block within a few seconds? Or does it keep blocks locked even after aggregation of that block is finished?

    For example, if I fix on the sparse members Forecast, FY15, Dec, and my Accounts dimension is the only dense one, once the calc has aggregated my upper-level Accounts members, does it release the block right after updating it (i.e. in fractions of a second), or does the agg keep it locked?

    I ask because I want to run aggregation scripts while our users are updating the cube. I have never had a problem starting an agg while users are updating, but maybe I have been lucky. If a user updates a locked block, they will receive an error message, I think. They can try the update again a few seconds later, I hope.

    Thank you.

    Locking behavior for BSO Essbase is described in the Database Administrator's Guide: http://docs.oracle.com/cd/E57185_01/epm.1112/essbase_db/dstinteg.html

    It is certainly theoretically possible for a user to hit a lock because of a calc, although I can't say I have seen it be a problem in real-world applications (perhaps because uncommitted access is the default).

  • Need a SQL*Loader script to load data into a table

    Hello

    I'm new to Oracle... learning some basic things... and now I want the steps to load data from a dump file into a table...

    and the script for SQL*Loader

    Thanks in advance

    Hello

    You can follow these steps for loading data...

    Step 1:

    Create a table in Toad to hold your data...

    Step 2:

    Creating a data file... Create your data file with column headers...

    Step 3:

    Creating a control file... Create your control file to load the data file into the table (there is a standard structure for a control file; you can search the net, or see the sketch after these steps)...

    Step 4:

    Move the data file and the control file to the server path...

    Step 5:

    Load the data into the staging table using SQL*Loader:

    sqlldr userid=username/password@instance control=<control_file> data=<data_file>
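    A minimal control file sketch for step 3 (table and column names are placeholders, assuming a comma-delimited file with one header row to skip):

    OPTIONS (SKIP=1)
    LOAD DATA
    INFILE 'your_data_file.csv'
    APPEND
    INTO TABLE your_staging_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      column1,
      column2,
      column3
    )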

  • Can you run calc scripts while loading data using a load rule?

    Can you run calc scripts while loading data using a load rule and, if so, what would that process be?

    Please advise

    Isn't the requirement really a quest for the Holy Grail (instantaneous aggregation in BSO Planning)? We could have a whole conversation about "aggregate down to level 1 when there are 10,000 members" type options without getting anywhere.

  • Loading data to Essbase using EAS versus back-end scripts

    Good afternoon

    We have noticed recently that loading our ASO cube with back-end scripts (esscmd etc.) seems to carry much more overhead than loading the data files individually through EAS. When loading via scripts, the total size of the cube was 1.2 GB. When loading the files individually through EAS, the size of the cube was 800 MB. Why the difference? Is there anything we can do in the scripts to reduce this overhead?

    Thank you

    Are you really using EssCmd to load ASO cubes? You should use MaxL with load buffers. By default, EAS uses a load buffer when you load multiple files; EssCmd (and MaxL without the buffer clauses) does not. That means longer loads and larger files. An ASO load takes the existing .dat file and adds the new data. When you are not using a load buffer, it takes the .dat file and the .tmp file and merges them together; then, for the second file, it takes the .dat file (which now includes the first load's data) and repeats the process. Every time it does that it has to merge the .dat file (twice over), and the .dat file grows. For lack of a better word I'd call it fragmentation, but I don't think it's as simple as that; I think it's just the way the data gets stored. When you use a single buffer and no slices, it only has to do this once.
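    A minimal MaxL sketch of a buffered multi-file load (application, database, and file names here are placeholders):

    /* stage both files into one load buffer, then commit to the cube once */
    alter database MyApp.MyDb initialize load_buffer with buffer_id 1;
    import database MyApp.MyDb data from data_file '/data/file1.txt'
        to load_buffer with buffer_id 1 on error abort;
    import database MyApp.MyDb data from data_file '/data/file2.txt'
        to load_buffer with buffer_id 1 on error abort;
    import database MyApp.MyDb data from load_buffer with buffer_id 1;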

  • Change in precompute percent after loading data

    Hi all

    I have a cube for which I set the precompute percent to 50 (for the bottom partition). I loaded data into one partition. Now I want to change the precompute percent to 35 and load data into another partition. Can I do this without dropping and re-creating the cube? Can I change this parameter dynamically at any time?

    OLAP Version: 11.2.0.3.4

    Thank you

    Changing the precompute level modifies the AGGMAP, and this in turn requires the data in the cube to be re-aggregated. You don't need to reload anything, and other objects should not need to be touched. This should do the trick (with appropriate changes for MY_CUBE and X).

    exec dbms_cube.build('MY_CUBE USING (CLEAR AGGREGATES, SOLVE)', parallelism=>X, add_dimensions=>false)
    
  • Calculating a due date in Adobe Acrobat XI using JavaScript...

    I have two fields in a form. One holds the current date, calculated by the following:

    var f = this.getField("Today"); f.value = new Date();

    In the second field, I am trying to add 7 days to that date, but I can't figure it out. What script do I need so that the existing field fills in today's date and the new field fills in the due date? I have tried a lot of what I found on the internet but nothing seems to work...

    "In addition, please note that I use the ' Document Javascripts" Editor-in-Chief of Adobe Pro XI.

    Any help is appreciated


    It is possible. To make it work the way you want, the custom calculation script in the Today field should be:

    Custom calculation script

    (function () {

        // Get a reference to the DueDate field
        var f = getField("DueDate");

        // Don't proceed with this script if it was triggered by a change in the DueDate field
        if (event.source && event.source === f) {
            return;
        }

        // Update the value of this field to show the current date
        var d = new Date();
        event.value = util.printd("mm/dd/yyyy", d);

        // Add seven days to today's date
        d.setDate(d.getDate() + 7);

        // Update the DueDate field
        f.value = util.printd("mm/dd/yyyy", d);

    })();

    But first remove the DueDate field's own calculation script. And if you do not make the DueDate field read-only, a user will be able to change it, but it will reset whenever any field value changes.

  • Calculation script on page 2 with data from page 1?

    I'm missing something simple here. I want to write a calculation script using the sum of two numeric fields on page one divided by the sum of two numeric fields on page two. The only thing I can get to work is if I put a numeric field on page one and have it calculate the sum there. As soon as I move it to page two, it's just empty. What am I missing?

    Hello

    You can do this; you simply need to reference the fields correctly.

    First of all, make sure that you have named the two pages. Leaving pages/subforms (and objects) unnamed makes them more difficult to reference in script.

    There is a very practical trick when you are in the Script Editor. Hold down Control and move the mouse over another object: the cursor turns into a "V". Still holding Control, click the object, and LC Designer will insert the reference into the script for you. It's easy enough when the objects are close together, so practice a bit first. If the objects are on different pages, you need to scroll to the object you want to reference, click back into the Script Editor, and then click the object while still holding Control.

    Here is an example of referencing objects: http://www.assuredynamics.com/index.php/category/portfolio/referencing-objects/

    Hope that helps,

    Niall

    Assure Dynamics

  • Reg: using dynamic calc in a calculation script

    Can we use a dynamic calc sparse dimension member in a calculation script? In what situations do we use intelligent calculation, and in what situations do we not use it?

    Generally, you disable smart calc for the manual calculation scripts you want to run. Intelligent calculation means that "if you then load a subset of data, on subsequent calculations Essbase calculates only the blocks of data that have not been calculated, plus the calculated blocks that require recalculation because of the new data".

    Taken directly from: http://download.oracle.com/docs/cd/E12825_01/epm.111/esb_dbag/dcaoptic.htm
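    To turn it off for a single script (rather than server-wide with the UPDATECALC configuration setting), a minimal sketch:

    SET UPDATECALC OFF; /* disable intelligent calculation for this script only */
    CALC ALL;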

  • Optimization of data loading

    Hello

    I have a cube with the dimension information below, and it requires optimization for data loading; its data is deleted and reloaded each week from a SQL data source using a load rule. It loads 35 million records, and the load is so slow that the data load alone, excluding the calculation, takes 10 hrs. Is that normal? Is there a change in the structure I should make to load faster, such as repositioning the sparse dimensions or changing the order of the dimensions? Is my block size too large? 52,920 B seems a little absurd. I also have the following cache settings, so please have a look and give me suggestions on this.

    Dimension  Type      Storage  Members
    MEASURE    Accounts  Dense        245
    PERIOD     Time      Dense         27
    CALC       None      Sparse         1
    SCENARIO   None      Sparse         7
    GEO_NM     None      Sparse        50
    PRODUCT    None      Sparse      8416
    CAMPAIGN   None      Sparse        35
    SEGMENT    None      Sparse        32

    Cache settings:

    Index cache setting: 1024
    Index cache current value: 1024
    Data file cache setting: 32768
    Data file cache current value: 0
    Data cache setting: 3072
    Data cache current value: 3049

    I'd appreciate any help on this. Thank you!

    If the order of your dimensions is the way you show it, with the dense dimensions first, AND your SQL follows this order, you will have the WORST possible load. You will cause extreme fragmentation, and blocks will be revisited thousands of times, paging in and out of memory to disk. The most efficient load is to have your sparse dimensions first, then the dense dimensions, and to sort the input from the first column to the last (left to right); that way blocks are visited once, or at least kept in memory, so there is no physical I/O.

    I had a client who did what it looks like you did, and by changing the order I got a data load down from 7 hrs to 3 minutes. To help things along, you might want to restructure your database before trying this, to get rid of the fragmentation you have probably caused. (Of course, I would do this on a test database and clear all data before trying the loads, so I can get apples-to-apples comparisons.)
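    On the SQL side, a hedged sketch of the idea, assuming source columns named after the dimensions above (adjust to your actual schema):

    -- Sparse dimension columns first (left to right) and sorted, dense columns last,
    -- so each block is built in one visit instead of being revisited.
    SELECT calc, scenario, geo_nm, product, campaign, segment,
           account, period, data_value
    FROM   weekly_fact_source
    ORDER  BY calc, scenario, geo_nm, product, campaign, segment;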

  • BSO calculation script for exchange rates - shared members

    Hello everyone,

    Happy new year!!!

    I need your help with a BSO calculation script for exchange rates.

    Here you can see how the entity & currency hierarchies are structured:

    Entity hierarchy

    Hierarchie1 (stored member, EUR) - all children appear only once in this hierarchy
        H1Parent1 (stored member, EUR)
            H1P1Child1 (stored member, EUR)
            H1P1Child2 (stored member, EUR)
        H1Parent2 (stored member, SEK)
            H1P2Child1 (stored member, SEK)
            H1P2Child2 (stored member, SEK)

    Hierarchie2 (stored member, EUR) - a child might appear several times in this hierarchy!
        H2Parent1 (stored member, USD)
            H1Parent1 (shared member, EUR)
            H1Parent2 (shared member, SEK)
        H2Parent2 (stored member, CAD)
            H1Parent2 (shared member, SEK)
            H1P1Child1 (shared member, EUR)

    Currency hierarchy

    Local currency
        EUR
        USD
        SEK
        CAD
    Reported (group currency)
        Euro
    Parent currency
        EUR_USD
        EUR_CAD
        EUR_SEK

    First script:

    For the exchange rate from local currency into euro (Hierarchie1) it was very easy, because I could give each level-0 member a UDA for the currency. So it's pretty easy to create one script for all the level-0 members (duration (DEV): 12 minutes):

    "Euro" (
        IF (@ISUDA("Accounts", "AR"))
            IF (@ISUDA("Entities", "cEUR"))
                "Euro" = "EUR";
            ELSEIF (@ISUDA("Entities", "cAED"))
                "Euro" = @SUMRANGE("AED", "January":&CurrMonth) / "AED"->"Average_Rate";
            ENDIF
        ENDIF
    );

    Second script - Part 1:

    But for the second hierarchy (Hierarchie2) I have the problem that there are a lot of shared members, and I can't give them an unambiguous UDA from which I can derive the currency (duration (DEV): 12 minutes):

    ("Parent currency"

    IF (@ISUDA ("Accounts", "AR"))

    IF (@ISMBR (@RDescendants("H2Parent1",0))) 'Euro_USD' = (@SUMRange ("Euro", "January": "January")) * "USD"-> "Average_Rate";

    ElseIf (@ISMBR (@RDescendants("H2Parent2",0))) 'Euro_CAD' = (@SUMRange ("Euro", "January": "January")) * "CAD"-> "Average_Rate";

    ENDIF

    ENDIF);

    Second script - Part 2:

    This formula works, but it is very slow, because I need several FIXes (duration (DEV): about 1 h):

    FIX (@RDESCENDANTS("H2Parent1", 0))
        "Parent currency" (
            IF (@ISUDA("Accounts", "AR"))
                "Euro_USD" = @SUMRANGE("Euro", "January":"January") * "USD"->"Average_Rate";
            ELSEIF (@ISUDA("Accounts", "CR"))
                "Euro_USD" = @SUMRANGE("Euro", "January":"January") * "USD"->"Closing_Rate";
            ENDIF
        );
    ENDFIX


    Do you have an idea how I can improve my second calculation script in a dynamic and efficient way?


    I hope that I have described my problem well.


    Thanks a lot for your help


    Ben

    change "Motto of Parent" to "Euro_USD" would not make a difference if the hierarchy of currencies is Sparse

    I'd try making the currency dimension dense, as long as that does not make your block size too big. If you make it dense AND change "Parent currency" to "Euro_USD", it should increase the speed a lot. However, changing a dimension from sparse to dense is a very big change for the database.

  • Issue loading data using SQL*Loader into staging tables and then into the main tables!

    Hello

    I'm trying to load data into our main database tables using SQL*Loader. The data will be provided as pipe-separated csv files.

    I have developed a shell script to load the data, and it works fine except for one thing.

    Here are the details and some data to re-create the problem.

    Staging table structures into which the data will be loaded using SQL*Loader:

    create table stg_cmts_data (cmts_token varchar2 (30), CMTS_IP varchar2 (20));

    create table stg_link_data (dhcp_token varchar2 (30), cmts_to_add varchar2 (200));

    create table stg_dhcp_data (dhcp_token varchar2 (30), DHCP_IP varchar2 (20));

    Data in the csv files -

    for stg_cmts_data-

    cmts_map_03092015_1.csv

    WNLB-CMTS-01-1|10.15.0.1
    WNLB-CMTS-02-2|10.15.16.1
    WNLB-CMTS-03-3|10.15.48.1
    WNLB-CMTS-04-4|10.15.80.1
    WNLB-CMTS-05-5|10.15.96.1

    for stg_dhcp_data-

    dhcp_map_03092015_1.csv

    DHCP-1-1-1|10.25.23.10, 25.26.14.01
    DHCP-1-1-2|56.25.111.25, 100.25.2.01
    DHCP-1-1-3|25.255.3.01, 89.20.147.258
    DHCP-1-1-4|10.25.26.36, 200.32.58.69
    DHCP-1-1-5|80.25.47.369, 60.258.14.10

    for stg_link_data

    cmts_dhcp_link_map_0309151623_1.csv

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2
    DHCP-1-1-2|WNLB-CMTS-03-3,WNLB-CMTS-04-4,WNLB-CMTS-05-5
    DHCP-1-1-3|WNLB-CMTS-01-1
    DHCP-1-1-4|WNLB-CMTS-05-8,WNLB-CMTS-05-6,WNLB-CMTS-05-0,WNLB-CMTS-03-3
    DHCP-1-1-5|WNLB-CMTS-02-2,WNLB-CMTS-04-4,WNLB-CMTS-05-7
    WNLB-DHCP-1-13|WNLB-CMTS-02-2

    Now, after loading this data into the staging tables, I have to fill the main database tables:

    create table subntwk (subntwk_nm varchar2 (20), subntwk_ip varchar2 (30));

    create table link (link_nm varchar2 (50));

    The SQL scripts that I created to load the data are like this:

    spool load_cmts.log

    set serveroutput on

    DECLARE

        CURSOR c_stg_cmts IS
            SELECT *
            FROM stg_cmts_data;

        TYPE t_stg_cmts IS TABLE OF stg_cmts_data%ROWTYPE INDEX BY PLS_INTEGER;

        l_stg_cmts t_stg_cmts;
        l_cmts_cnt NUMBER;
        l_cnt      NUMBER;
        l_cnt_1    NUMBER;

    BEGIN

        OPEN c_stg_cmts;
        FETCH c_stg_cmts BULK COLLECT INTO l_stg_cmts;
        CLOSE c_stg_cmts;

        FOR i IN l_stg_cmts.FIRST .. l_stg_cmts.LAST
        LOOP

            -- insert the CMTS token only if it is not already in subntwk
            SELECT COUNT(1)
            INTO l_cmts_cnt
            FROM subntwk
            WHERE subntwk_nm = l_stg_cmts(i).cmts_token;

            IF l_cmts_cnt < 1 THEN
                INSERT INTO subntwk (subntwk_nm)
                VALUES (l_stg_cmts(i).cmts_token);
                DBMS_OUTPUT.put_line('token has been added: ' || l_stg_cmts(i).cmts_token);
            ELSE
                DBMS_OUTPUT.put_line('token is already present');
            END IF;

            EXIT WHEN l_stg_cmts.COUNT = 0;

        END LOOP;

        COMMIT;

    EXCEPTION
        WHEN OTHERS THEN
            DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    spool off

    for dhcp


    spool load_dhcp.log

    set serveroutput on

    DECLARE

        CURSOR c_stg_dhcp IS
            SELECT *
            FROM stg_dhcp_data;

        TYPE t_stg_dhcp IS TABLE OF stg_dhcp_data%ROWTYPE INDEX BY PLS_INTEGER;

        l_stg_dhcp t_stg_dhcp;
        l_dhcp_cnt NUMBER;
        l_cnt      NUMBER;
        l_cnt_1    NUMBER;

    BEGIN

        OPEN c_stg_dhcp;
        FETCH c_stg_dhcp BULK COLLECT INTO l_stg_dhcp;
        CLOSE c_stg_dhcp;

        FOR i IN l_stg_dhcp.FIRST .. l_stg_dhcp.LAST
        LOOP

            -- insert the DHCP token only if it is not already in subntwk
            SELECT COUNT(1)
            INTO l_dhcp_cnt
            FROM subntwk
            WHERE subntwk_nm = l_stg_dhcp(i).dhcp_token;

            IF l_dhcp_cnt < 1 THEN
                INSERT INTO subntwk (subntwk_nm)
                VALUES (l_stg_dhcp(i).dhcp_token);
                DBMS_OUTPUT.put_line('token has been added: ' || l_stg_dhcp(i).dhcp_token);
            ELSE
                DBMS_OUTPUT.put_line('token is already present');
            END IF;

            EXIT WHEN l_stg_dhcp.COUNT = 0;

        END LOOP;

        COMMIT;

    EXCEPTION
        WHEN OTHERS THEN
            DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    spool off

    for link -.

    spool load_link.log

    set serveroutput on

    DECLARE

        l_cmts_1      VARCHAR2(4000 CHAR);
        l_cmts_add    VARCHAR2(200 CHAR);
        l_dhcp_cnt    NUMBER;
        l_cmts_cnt    NUMBER;
        l_link_cnt    NUMBER;
        l_add_link_nm VARCHAR2(200 CHAR);

    BEGIN

        FOR r IN (
            SELECT dhcp_token, cmts_to_add || ',' AS cmts_add
            FROM stg_link_data
        )
        LOOP

            l_cmts_1   := r.cmts_add;
            -- peel off the first CMTS token from the comma-separated list
            l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));

            SELECT COUNT(1)
            INTO l_dhcp_cnt
            FROM subntwk
            WHERE subntwk_nm = r.dhcp_token;

            IF l_dhcp_cnt = 0 THEN
                DBMS_OUTPUT.put_line('device not found: ' || r.dhcp_token);
            ELSE
                WHILE l_cmts_add IS NOT NULL
                LOOP
                    l_add_link_nm := r.dhcp_token || '_TO_' || l_cmts_add;

                    SELECT COUNT(1)
                    INTO l_cmts_cnt
                    FROM subntwk
                    WHERE subntwk_nm = TRIM(l_cmts_add);

                    SELECT COUNT(1)
                    INTO l_link_cnt
                    FROM link
                    WHERE link_nm = l_add_link_nm;

                    IF l_cmts_cnt > 0 AND l_link_cnt = 0 THEN
                        INSERT INTO link (link_nm)
                        VALUES (l_add_link_nm);
                        DBMS_OUTPUT.put_line(l_add_link_nm || ' has been added.');
                    ELSIF l_link_cnt > 0 THEN
                        DBMS_OUTPUT.put_line('link is already present: ' || l_add_link_nm);
                    ELSIF l_cmts_cnt = 0 THEN
                        DBMS_OUTPUT.put_line('NO CMTS FOUND for device to create the link: ' || l_cmts_add);
                    END IF;

                    -- advance to the next CMTS token in the list
                    l_cmts_1   := TRIM(SUBSTR(l_cmts_1, INSTR(l_cmts_1, ',') + 1));
                    l_cmts_add := TRIM(SUBSTR(l_cmts_1, 1, INSTR(l_cmts_1, ',') - 1));

                END LOOP;
            END IF;

        END LOOP;

        COMMIT;

    EXCEPTION
        WHEN OTHERS THEN
            DBMS_OUTPUT.put_line('ERROR ' || SQLERRM);
    END;
    /

    spool off

    control files -

    LOAD DATA
    INFILE 'cmts_data.csv'
    APPEND
    INTO TABLE stg_cmts_data
    WHEN (cmts_token != '') AND (cmts_token != 'NULL') AND (cmts_token != 'null')
     AND (cmts_ip != '') AND (cmts_ip != 'NULL') AND (cmts_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      cmts_token "RTRIM(LTRIM(:cmts_token))",
      cmts_ip    "RTRIM(LTRIM(:cmts_ip))"
    )

    for dhcp -


    LOAD DATA
    INFILE 'dhcp_data.csv'
    APPEND
    INTO TABLE stg_dhcp_data
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (dhcp_ip != '') AND (dhcp_ip != 'NULL') AND (dhcp_ip != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      dhcp_token "RTRIM(LTRIM(:dhcp_token))",
      dhcp_ip    "RTRIM(LTRIM(:dhcp_ip))"
    )

    for link -

    LOAD DATA
    INFILE 'link_data.csv'
    APPEND
    INTO TABLE stg_link_data
    WHEN (dhcp_token != '') AND (dhcp_token != 'NULL') AND (dhcp_token != 'null')
     AND (cmts_to_add != '') AND (cmts_to_add != 'NULL') AND (cmts_to_add != 'null')
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      dhcp_token  "RTRIM(LTRIM(:dhcp_token))",
      cmts_to_add CHAR(4000) "RTRIM(LTRIM(:cmts_to_add))"
    )

    SHELL SCRIPT -

    if [ ! -d log ]
    then
        mkdir log
    fi

    if [ ! -d done ]
    then
        mkdir done
    fi

    if [ ! -d bad ]
    then
        mkdir bad
    fi

    time nohup sqlldr username/password@SID control=load_cmts_data.ctl log=log/ldr_cmts_data.log bad=log/ldr_cmts_data.bad discard=log/ldr_cmts_data.reject errors=100000 direct=true parallel=true &

    time nohup sqlplus username/password@SID @load_cmts.sql

    time nohup sqlldr username/password@SID control=load_dhcp_data.ctl log=log/ldr_dhcp_data.log bad=log/ldr_dhcp_data.bad discard=log/ldr_dhcp_data.reject errors=100000 direct=true parallel=true &

    time nohup sqlplus username/password@SID @load_dhcp.sql

    time nohup sqlldr username/password@SID control=load_link_data.ctl log=log/ldr_link_data.log bad=log/ldr_link_data.bad discard=log/ldr_link_data.reject errors=100000 direct=true parallel=true &

    time nohup sqlplus username/password@SID @load_link.sql

    mv *.log ./log

    The problem I'm encountering is in loading the data into the link table: I check whether the DHCP is present in the subntwk table and, if not, log an error and continue; likewise, if the CMTS is not found, I log an error instead of creating the link.

    Note that multiple CMTS can be associated with a single DHCP.

    So the script does create links, but for the last item of each comma-separated CMTS list in the stg_link_data table it logs "CMTS not found".

    for example

    DHCP-1-1-1|WNLB-CMTS-01-1,WNLB-CMTS-02-2

    Here, I expect dhcp-1-1-1 to be linked with wnlb-CMTS-01-1 and wnlb-CMTS-02-2.

    All of this data is present in the subntwk table, but the log still says wnlb-CMTS-02-2 could not be FOUND, even though it has already been loaded into the subntwk table.

    The same thing happens with every CMTS in stg_link_data that comes last in its list (I think you see what I'm trying to explain).

    But when I run the SQL scripts separately in SQL Developer, they insert all the valid links into the link table.

    It should create 9 rows in the link table, whereas right now it creates only 5 rows.

    I use COMMIT in my scripts too, but that does not help.

    Run these scripts on your machine and let me know if you get the same behavior I do.

    And please give me a solution; I have tried many things since yesterday, but it's always the same.

    This is the link-table log:

    link is already present: dhcp-1-1-1_TO_wnlb-cmts-01-1
    NO CMTS FOUND for device to create the link: wnlb-CMTS-02-2
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-03-3
    link is already present: dhcp-1-1-2_TO_wnlb-cmts-04-4
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-5
    NO CMTS FOUND for device to create the link: wnlb-CMTS-01-1
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-8
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-6
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-0
    NO CMTS FOUND for device to create the link: wnlb-CMTS-03-3
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-02-2
    link is already present: dhcp-1-1-5_TO_wnlb-cmts-04-4
    NO CMTS FOUND for device to create the link: wnlb-CMTS-05-7
    device not found: wnlb-dhcp-1-13

    IF YOU NEED MORE INFORMATION, PLEASE LET ME KNOW

    Thank you

    I realized later that night that the data files had DOS line endings, so a carriage return was being appended to the last field of each line while loading into the staging table. That is why the last CMTS in each list was never found. I converted the files with dos2unix and it started to work perfectly.

    It was a DOS line-ending (dos2unix) issue!
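    For reference, a minimal sketch of the fix, using the file names from the example above:

    # strip DOS carriage returns before running sqlldr
    dos2unix cmts_map_03092015_1.csv dhcp_map_03092015_1.csv cmts_dhcp_link_map_0309151623_1.csv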

    Thank you all for your interest; I got to learn new things, as I have only about 10 months of experience in PL/SQL and SQL.

  • Dynamo admin: which method starts a data load?

    Hello

    The files are being generated in the folder I want. Now I need to test that the data load takes place successfully rather than wait until the next scheduled run time.

    Which method in the Dynamo admin do I call in order to start loading data?

    I tried calling the loadAllAvailable() method under each of the loaders, but it keeps returning 0 even though there are a lot of log files in the folder.

    Please let me know.

    Thank you

    Saud

    Hello

    I finally managed to find where I had made the mistake. Even though the ARF.base module had been added to the modules for the production build, it was not in the list of modules in the startup script for the production instance.

    Thus the LogRotationSink component was not enabled in dynamoMessagingSystem.xml

    Once the ARF.base module was included in the startup script, everything worked like a charm.

    Thank you

    Saud

  • Loading YTD data into a periodic scenario

    Dear all,

    I think this is an easy one, but I want to be 100% sure of what I should expect from HFM.

    I need to load YTD balances into a periodic scenario (default view = periodic, zeroview = periodic, consolidate YTD = no).

    Do I have to specify the "YTD" view string in the file that I load?

    Will periodic data automatically be calculated as the difference between consecutive YTD balances?

    Is there any advice or alert I should consider?

    A big thanks to everyone.

    Yes, you must specify YTD in the files that you load. As a general rule, you want to avoid loading YTD data into a periodic scenario, and vice versa. In your example, a problem arises if you load YTD data to a flow account and the YTD amount in the following month goes to zero, so that your data file has no record for that intersection the next month. For example, if you load $1,000 YTD in January to your sales account but the YTD amount in February is zero (rare, but it happens), then you would have no data record for that sales account in February, and after loading you would expect to see a zero YTD number in HFM. However, with zeroview set to periodic, HFM expects a periodic amount, and when one is not supplied it assumes that the periodic (not YTD) value is zero and derives the YTD number from that. In this case it would show a February periodic value of $0 and $1,000 YTD, when you expect $0 YTD and a periodic value of -$1,000. Bottom line: you should load periodic data into a periodic scenario and vice versa.

  • Import a custom calculation script

    I have a general purchase order form I'm about to build. I have a lot of subtotal fields needing a custom calculation script on them.

    Is it possible to load a script (XML / import data) into all my desired fields with the appropriate script?

    Or is there a little JavaScript that can do the calculation per line?

    Sample:

    https://DL.dropboxusercontent.com/u/2944617/sample.PDF

    Calc script:

    event.value = this.getField("__PRICE-01__").value * this.getField("__QTY-01__").value;

    if (+event.value == 0) event.value = "";

    You can use a script to set a field's custom calculation script, like this:

    this.getField("FieldName").setAction ("calculate", "code as string");
