Date logic in a DataExport calc script

Hi all


I'm working on a DATAEXPORT calc script where the requirement is to export all data for which the member "FTE Start Date" falls between 2012-10-01 and 2013-09-30.
I am new to this kind of date logic in Planning and Essbase and have never worked on a condition like that. I scoured the technical reference and found that the functions @TODATE or @FORMATDATE can be used for this.
I was hoping someone could guide me on which function to use in my calc script to limit the data to these two dates.
Please let me know if I'm on the right track, or if there is a different approach through which this can be achieved.
FYI, I'm working on version 11.1.2.

I appreciate your advice and suggestions

Thank you
Ranjan

Hi Ranjan

Planning accounts using the 'Date' data type are stored in Essbase as numbers in the format YYYYMMDD; for example, 20120829 is today's date.

I think that to export based on the actual values of a member, rather than just a subset of members, you need to look at DATAEXPORTCOND in the Essbase technical reference. An example of the syntax from that doc is provided below:

SET DATAEXPORTOPTIONS
{
DataExportLevel ALL;
};
DATAEXPORTCOND (Actual >= 2 AND Sales > 2000 OR COGS > 600);
FIX("100-10","East");
DATAEXPORT "File" "," "E:\temp\2222.txt";
ENDFIX;

The DATAEXPORTCOND line is the important bit for you; set the condition to match your scenario, for example:

DATAEXPORTCOND ("FTE Start Date" >= 20121001 AND "FTE Start Date" <= 20130930);
FIX("FTE Start Date")
...etc
ENDFIX;
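
Putting it together, a minimal sketch for your case might look like the following; the member name "FTE Start Date", the FIX scope and the output path are placeholder assumptions you would adjust to your own outline:

SET DATAEXPORTOPTIONS
{
DataExportLevel ALL;
};
/* keep only cells whose FTE start date falls between 1 Oct 2012 and 30 Sep 2013 */
DATAEXPORTCOND ("FTE Start Date" >= 20121001 AND "FTE Start Date" <= 20130930);
FIX ("FTE Start Date")
DATAEXPORT "File" "," "E:\temp\fte_export.txt";
ENDFIX;

Because the dates are stored as YYYYMMDD numbers, a plain numeric comparison is enough here; no @TODATE conversion should be needed for this range check.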

Hope this helps
Stuart

Tags: Business Intelligence

Similar Questions

• A good resource to learn WebLogic scripting

    I need a good resource with examples.

    The first step is reading [http://edocs.bea.com/wls/docs103/pdf/config_scripting.pdf].

    That is the totality of the WLST documentation. Beyond that, it is useful to know that WLST scripts use Jython and Python to make things happen, so the more you know about those two, the better. The caveat is that WLST does not use the latest version of either of them.

    I wrote a blog post on how to use getopt with WLST: [http://davidmichaelkarr.blogspot.com/2008/10/make-wlst-scripts-more-flexible-with.html]

  • Sending data to the PHP Script

    I am writing an application that needs to send a GET request with parameters (ie: http://blah.com/GET?name=bob) to a PHP script that interacts with a MySQL database on a server.

    Anyone know where I can find a good tutorial on this or point me to another thread?

    You can use URLEncodedPostData (check the documentation for the API).

    You should do something like this:

    
    URLEncodedPostData postData = new URLEncodedPostData(URLEncodedPostData.DEFAULT_CHARSET, true);
    
    postData.append("www.myurl.com/foobar?");
    postData.append("name",m_nameParam);
    postData.append("id",Integer.toString(m_locationId));
    postData.append("text",m_checkinText);
    
    return postData.toString();
    
• Defect: Data Modeler beta does not allow optionals in logical UKs

    When entering a logical model, I am not able to choose an optional attribute or relationship to include in a unique key definition.

    The physical level correctly allows me to do this.

    Hi David,

    We'll fix it.

    Thank you
    Philippe

• Problem with logic groups combining data at the import stage

    I really hope that you will be able to help me with a problem I've had for the last few days.

    The scenario I have requires a logic group to be created, as the source data must be mapped differently depending on whether it is +ve or -ve, even though the source account is the same.

    I created a Complex logic group and added the corresponding logic group account to the account mapping table. The group uses the rule definition 001806*.

    The accounts that would be affected by this rule, and their values, are below:
    18060000 - 2093096.69
    18060000 2093096.69
    18060005 - 2474955.48
    18060005 2474955.48
    18060015 11319512.13
    18060015 - 8000000 (an ICP partner)
    18060015 - 3319512.13 (an ICP partner)

    The +ve values need to go to an asset account and the -ve values to a liability account.

    At the import stage, I get a zero value which seems to be the sum of the first 4 values; the last 3 values then appear on individual lines.

    At the validate step, the asset HFM account displays the 3 +ve values (15.8M), although 2 of them apparently net to zero against the first and third -ve values. However, the liability account shows only the bottom 2 (11.3M), as it is missing the other 2 -ve values. The asset and liability HFM values should be the same.

    In the account mapping, I'm using the following script for the +ve values (it has been reformatted for this forum):
    ' If varValues (9) > '0' Result = "BSA401000" Else Result = "Ignore" End If

    The script - ve is:
    ' If varValues (9) < "0" Result = "BSL625600" Else Result = "Ignore" End If

    I later changed the logic group to be Simple, but I get a similar result, although what I am trying to achieve seems to be detailed in the FDM Administrator's Guide under the heading on creating accounts within simple logic groups.

    I understand that this is a little lengthy, so apologies; if you need more information, please let me know and I will be happy to provide more.

    Thank you, James

    You could even put in place two mapping rules and avoid the complexity of a logic rule, if you wish. The mapping will continue to the next line if the criterion is not met and there is no result.

    The asset side would be a rule with the name vAsset (you can change this, but it must be unique relative to the next line):
    If varValues(9) > 0 Then Result = "BSA401000"

    and then you add a second line to your map with a rule name of vLiab and check for the sign flip:

    If varValues(9) < 0 Then Result = "BSL625600"

    In the past I have avoided logic rules to the extent possible, because adding them removes the ability to drill back to the source file.

    Regards,
    JTF

• [SOLVED] Oracle SQL Data Modeler export is missing a PRIMARY KEY in the DDL script

    I use Data Modeler 4.1.888 to create an ER diagram and generate a DDL script from it.

    The diagram contains more than 40 tables, most of which have a primary key defined.

    For some reason there is one table that has a primary key defined, but it is ignored when I export the model to a DDL script.

    It is the "wrong" key (even if it is checked that it is not found on the generated DDL script):

    4faS5.png

    This is where the key is set:

    O5mPb.png

    And this is the DDL preview (yes, the primary key shows up there):

    SrwMu.png

    This is what happens if I try to generate the DDL for just this table (the primary key is still not generated):

    MljUm.png

    Has anyone had the same problem? Any ideas on how to solve it?

    There is no error in the log file, but there was when I ran the generated DDL script, and then I realized that I was doing something wrong:

    The table MEMBERS had a mandatory foreign key from another table, which in turn had a mandatory key against MEMBERS itself.

    So even if I generated this primary key on MEMBERS myself, and then ran the constraint definition that returned an error in the DDL script, I could not perform an insert on either of these two tables because of the constraint.

    I revised my design and realized the relationship was not mandatory. I unchecked the mandatory box on the constraint definition and everything went well.

    I could reproduce the problem and the solution on a diagram with only two tables, so I'm sure that's it.

    Anyway, the Data Modeler "fails" silently in this kind of situation. It would be fairly obvious to an experienced designer that I was doing something wrong, but it is not so obvious when you are dealing with dozens of tables and all their relationships and this is your first time using the Modeler.

    Thanks for your reply :-)

• Using an event script to load the Excel template data for users

    Hello

    Has anyone done this in FDM 11.1.2.1?

    We want to leverage the event script to load the validated FDM data into a standard template, so that users can make some adjustments and then load the same data back through FDM. Is there a better alternative? Basically, we are trying to avoid making adjustments in HFM data forms, given the number of columns we have and the validations required for each line with adjustments.

    Thank you

    PM

    Hello

    google + Administrator's Guide.

    It's just a matter of thinking through the solution:

    - Run a SQL query to export the data from the table (TDATASEG). You can filter the query by LOADID.

    - Write the records to a CSV file.

• Script to list VM I/O latency and the datastore the virtual machine is on

    Hello.  We have a vSphere 5.0 environment and we are experiencing heavy IO latency.  I'm looking for a PowerCLI script that will get the I/O latency for each virtual machine and the name of the datastore it is currently on.  We access our storage over fibre channel.  I'm trying to get a good overview of the IO latency in a nice view in a csv file.  I found what could be a good basis at https://communities.vmware.com/thread/304827?start=0&tstart=0 , but I'm not sure how to get the datastore name into the table, and I think it is written for NFS storage in any case.  Thanks in advance for any info/advice!

    Try the next version; it includes the average read/write latency for the virtual machine and the average IOPS for the virtual machine.

    Since the CSV has a row for each data store, the values for the virtual machine are repeated.

    I also added the host name

    $vmName = "VM*"
    
    $stat = "datastore.totalReadLatency.average","datastore.totalWriteLatency.average",  "datastore.numberReadAveraged.average","datastore.numberWriteAveraged.average"$entity = Get-VM -Name $vmName$start = (Get-Date).AddHours(-1)
    
    $dsTab = @{}Get-Datastore | Where {$_.Type -eq "VMFS"} | %{  $key = $_.ExtensionData.Info.Vmfs.Uuid  if(!$dsTab.ContainsKey($key)){    $dsTab.Add($key,$_.Name)  }  else{    "Datastore $($_.Name) with UUID $key already in hash table"  }}
    
    Get-Stat -Entity $entity -Stat $stat -Start $start |Group-Object -Property {$_.Entity.Name} | %{  $vmName = $_.Values[0]  $VMReadLatency = $_.Group |    where {$_.MetricId -eq "datastore.totalReadLatency.average"} |    Measure-Object -Property Value -Average |    Select -ExpandProperty Average  $VMWriteLatency = $_.Group |    where {$_.MetricId -eq "datastore.totalWriteLatency.average"} |    Measure-Object -Property Value -Average |    Select -ExpandProperty Average  $VMReadIOPSAverage = $_.Group |    where {$_.MetricId -eq "datastore.numberReadAveraged.average"} |    Measure-Object -Property Value -Average |    Select -ExpandProperty Average  $VMWriteIOPSAverage = $_.Group |    where {$_.MetricId -eq "datastore.numberWriteAveraged.average"} |    Measure-Object -Property Value -Average |    Select -ExpandProperty Average  $_.Group | Group-Object -Property Instance | %{    New-Object PSObject -Property @{      VM = $vmName      Host = $_.Group[0].Entity.Host.Name      Datastore = $dsTab[$($_.Values[0])]      Start = $start      DSReadLatencyAvg = [math]::Round(($_.Group |           where {$_.MetricId -eq "datastore.totalReadLatency.average"} |          Measure-Object -Property Value -Average |          Select -ExpandProperty Average),2)      DSWriteLatencyAvg = [math]::Round(($_.Group |           where {$_.MetricId -eq "datastore.totalWriteLatency.average"} |          Measure-Object -Property Value -Average |          Select -ExpandProperty Average),2)      VMReadLatencyAvg = [math]::Round($VMReadLatency,2)      VMWriteLatencyAvg = [math]::Round($VMWriteLatency,2)      VMReadIOPSAvg = [math]::Round($VMReadIOPSAverage,2)      VMWriteIOPSAvg = [math]::Round($VMWriteIOPSAverage,2)    }  }} | Export-Csv c:\report.csv -NoTypeInformation -UseCulture
    
• How to delete a data file and rename another data file in a batch script

    DELETE or DROP d:\NDM\Data\StampFiles\STAMPLOADBKUP.csv

    RENAME the data file d:\NDM\Data\StampFiles\STAMPLOAD_cwoo.csv to d:\NDM\Data\StampFiles\STAMPLOADBKUP.CSV

    I need the exact syntax for a batch script that does the above, please. I am poor at batch scripting.

    Regards,
    Soma

    ================
    D:

    CD d:\NDM\Data\StampFiles

    del STAMPLOADBKUP.csv

    Ren STAMPLOAD_cwoo.csv STAMPLOADBKUP.CSV

    ================

    That should do it for you!

• Error in one of the Data Mover scripts during the Campus Solutions installation

    Hello everyone,
    This is my first attempt at installing one of the PeopleSoft products.
    I am installing HCM 9.0 on Windows 2008 64-bit, Oracle 11g.
    I am now at task 7-16-9: Updating PeopleTools System Data.
    I ran pt849tls.dms with success
    but pt850tls.dms failed with the error:

    File: Data Mover SQL error. Stmt #: 0 Error Position: 25 Return: 904 - ORA-00904: "PT_RETENTIONDAYS": invalid identifier
    Failed SQL stmt: UPDATE PS_PRCSSYSTEM SET PT_RETENTIONDAYS = RETENTIONDAYS
    Error: SQL execute error for UPDATE PS_PRCSSYSTEM SET PT_RETENTIONDAYS = RETENTIONDAYS

    Is there a script that I missed?
    I followed the instructions step by step
    Thanks for your help,

    Did you successfully run the SQL script generated by the project build (it is neither rel849un.sql nor rel850un.sql)?

    Nicolas.

    Edited by: Gasparotto N on April 12, 2010 11:36

  • Data Modeler: generate the Delta Scripts against an existing schema

    Can I generate DDL scripts with 'ALTER TABLE' statements?

    For example: the relational model and the database are synchronized. That means each table of the DB is stored in the Data Modeler's data model.

    Now I need new columns in a table. I have set these columns in the data model and now I need a script with:

    ALTER TABLE my_table ADD (new_column DATE);

    In Oracle Designer it was easy to create such delta scripts, but I can't find anything similar in the Data Modeler.

    Thanks in advance
    Gerd

    Hi Gerd,

    We called the "Swap target model" at the last page of the wizard 'import data dictionary' check box. If you select your database will target the model and you can get ALTER statements in the DDL of the Merge dialog preview. It's the same thing with the DDL script.

    Philippe

• Command or script required for the datastore, NAA identifier and multipathing

    The following script/command gets me the multipathing policy into a csv file; however, I also need to identify the datastore name along with the NAA number.

    Get-Cluster 'windows 01' | Get-VMHost | Get-ScsiLun -LunType disk | Export-Csv c:\lun_multipath.csv

    any help would be appreciated.

    Try something like this

    $datastore = get-datastore
    
    $disks = get-cluster 'windows 01' | get-vmhost | get-scsilun -luntype disk
    $entry = @()
    $output = @()
    ForEach ($disk in $disks){
      $entry = "" | Select DataStorename, HostName, Canonicalname, Multipathing
      $entry.datastorename = $datastore | Where-Object {($_.extensiondata.info.vmfs.extent | %{$_.diskname}) -contains $disk.canonicalname} | select -expand name
      $entry.HostName = $disk.VMHost.Name
      $entry.canonicalname = $disk.canonicalname
      $entry.multipathing = $disk.multipathpolicy
      $output += $entry
    }
    $output | Export-Csv c:\lun_multipath.csv
    
  • Problem with sending data to the Php Mail Script.

    So, I'm working with this Flash template that my boss bought. It has a contact form that you are supposed to be able to fill in, and it sends an e-mail to a specified address.

    However, it does not work. Not at all!

    There are 4 fields on the form

    name

    E-mail

    Phone

    Message

    The Code for the button send is

    -Code button-

    onClipEvent (load) {this.t.v = _root.contacts_txt7;}
    
    on (rollOver) {this.gotoAndPlay("s1");}
    
    on (rollOut) {this.gotoAndPlay("s2");}
    
    on (release) {
    
    _parent.loadVariables("inc/mail.php", "POST");
    
    }

    -Code button-

    The php script is

    Php script-

    <?
    
    $name = $_POST['name'];
    
    $email = $_POST['email'];
    
    $phone = $_POST['phone'];
    
    $message = $_POST['message'];
    
    $ToEmail = "email@here";
    
    $ToSubject = "The message from your site";
    
    $EmailBody = "Name: $name\n".
    
    "E-mail: $email\n".
    
    "Phone: $phone\n".
    
    "Message: $messages\n";
    
    $Message = $EmailBody;
    
    $headers .= "Content-type: text; charset=iso-8859-1\r\n";
    
    $headers .= "From: ".$name." / ".$email."\r\n";
    
    mail($ToEmail, $ToSubject, $Message, $headers);
    
    ?>

    Php script-

    It seems like it should work. The mail function on the server works, as I am able to set up a basic PHP contact page and it works; however, he wants this Flash contact form. It's as if the Flash does not send anything to the PHP script. Is there something missing? I have looked for hours and have resorted to hitting my head on the wall in an effort to jar loose ideas; so far, nothing.

    1. don't set the $ToEmail variable

    2. I do not see where $headers is initialized

    3. what directory (relative to the html that embeds your swf) contains mail.php? and

    4. use the trace function to ensure that these variables (name, etc.) are set on _parent when your php file is called.

• Update a UDA on an outline member from a calc script?

    Happy new year everyone!

    Is it possible to update a UDA on an outline member from a calc script? Is there a custom function for that?

    We want to check the data, and then update the UDA based on what our data is.

    Thank you.

    Not that I know of, and it is not possible to write one, since you can't restructure the outline while the calculation is running (chicken and egg).

    More likely, you will need a multi-step process: export the members that you want to tag to a file via DATAEXPORT conditions or a report script, and then use this output to feed a dimension build with an appropriate load rule.

    Or write a fully customized program with the Java API.
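
    As a rough illustration of the export step, a calc script sketch along these lines could produce the candidate member list; the "Actual" member, the "Entity" dimension, the threshold and the output path are all placeholder assumptions:

    SET DATAEXPORTOPTIONS
    {
    DataExportLevel ALL;
    };
    /* placeholder condition: pick only the cells whose Actual value should drive the UDA */
    DATAEXPORTCOND ("Actual" > 1000);
    FIX (@RELATIVE("Entity", 0))
    DATAEXPORT "File" "," "E:\temp\uda_candidates.txt";
    ENDFIX;

    The resulting file would then feed a dimension build load rule that assigns the UDA, as described above.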

    I'm also curious about the driving condition, as it is a rather unusual request.

• How to pass a command line input parameter into a DAL script

    Hello guys,

    I want to pass the XML extract name (one of the input parameters for the Documaker batch) to one of my DAL triggers.

    I plan to use the Ext2GVM function for this. I need to know the syntax of the following parameters:

    Say the exported variable for the batch is XMLNAME.


    (1) What should be the syntax for the AFGJOB file?

    Is this correct -> ;Ext2Gvm;2;ParameterName = ~GETENV XMLNAME;

    (2) What should be my entry in the trndfdfl.dfd?

    Is this correct->

    < FIELD: parametername>

    EXT_LENGTH = 21

    EXT_TYPE = CHAR_ARRAY_NO_NULL_TERM

    INT_LENGTH = 20

    INT_TYPE = CHAR_ARRAY

    KEY = O

    REQUIRED = Y

    FIELDNAME = parametername


    (3) Is it necessary to add details under < Trn_Fields > in the INI files?

    If so, please help me with the syntax.


    (4) Finally, how can I reference this variable in the DAL script?

    Is this OK ->

    Metafilename = GVM ('parametername')


    Thank you


    Let me rephrase what I think you are doing.

    You have a file name that you want to reference in a DAL script. This file is NOT the extract file that you would use in the creation and mapping of your transactional data, but some additional file that you expect to process directly from DAL.

    If this is correct, two possibilities come to mind.

    1. Assuming it's a file that you can copy using your Linux script, copy the file to a consistent name. Your DAL script can then use the hard-coded name you always copy to.

    2. A method that does not require copying the file would be to create, from the Linux script, an environment variable that contains the name of the file passed in. In your INI, you would define your own INI option, something like this:

    < MyINIGroup >
    Filename = ~GetEnv MyExternalFile

    The DAL script would do this to get the name:

    xmlFileName = GetINIString ("MyINIGroup", "Filename");

    When DAL requests the INI value, the underlying logic sees the ~GetEnv, which is a built-in function that gets the environment variable named after it. So if you have assigned the environment variable from your Linux script, the name gets passed through and DAL has a way to get it.

    (Note there is no magic in the name of the INI option or group or the environment variable name you choose to use. The magic is using ~GetEnv to retrieve the value from the environment when the option is requested.)

    (Second note: the GetINIString function has an optional first parameter to name an INI context. In this case, I am not suggesting that you use this parameter; just include a comma to indicate that the first parameter is omitted.)

Maybe you are looking for

  • Satellite C70-C-199 - dual channel mode is not active for RAM modules

    Hello, I recently bought a Satellite C70-C-199 laptop with 4 GB RAM and I installed one more 4 GB RAM module. The module I installed is not recognized by the BIOS. What I tried: (1) reseating the RAM modules repeatedly. (2) swapping the two RAM modules between the tw

  • Import of questions book Audio CDs

    Hello, I have some old audio books on CD that I have started to import into iTunes.  I noticed that iTunes gives me multiple options for track/CD info, but basically it was a guessing game about which are the right ones, because there has been no consistent na

• Share button in the Finder, Safari and others does not work

    I have been using the El Capitan release for a while now, but recently the 'Share' button in the Finder, Safari and other applications, when pressed to share for example to Mail, leaves the application unresponsive, and it ends up having to b

• How do you turn up the volume of a laptop

    Increase the sound on the laptop

  • Vista update problem; Please help me.

    OK, so I'll try to be descriptive, so I can receive assistance. So I recently installed Vista on my new PC that I built (first time vista user). It runs, decently, I suppose. Well, I am informed of updates so I DL them. I noticed that this button app