API error in an import script

I'm trying to concatenate the result of a dimension mapping lookup onto the source account string. I have attached the script below to the account dimension in the import format as a DataPump import script.
____________________________
Function CCRule(strField, strRecord)

    Dim strCC

    strCC = DW.Utilities.fParseString(strRecord, 8, 3, ",")

    If strCC = "" Then
        CCRule = strField
    Else
        CCRule = strField & "_" & API.SqlMgr.fMapItemLookup(800, "UD5", strCC)
    End If

End Function
_______________________________________________________


When I run this script I get the error "Object required: 'API'".

Please let me know your thoughts on what I'm doing wrong here.

Thank you

The only thing I would point out is to keep an eye on the performance of this; there are ways to do it better if it is not performing well.

Also, if this answered your problem, do not hesitate to award points. If I get 1,000 points do I get a free slinky or something? :-)

Tags: Business Intelligence

Similar Questions

  • problem with fdmContext ["PERNAME"] in the import script

    Hello

    I am struggling to understand why the following script returns an empty string when assigned to a dimension in an import format.

    def Parse_Period(strfield, strrec):
        sPeriod = fdmContext["PERIODNAME"]
        return sPeriod

    I tried several combinations of the fdmContext variable (LOCNAME, PERIODKEY, etc.) without joy. I also noticed that there is a discrepancy between the Oracle EPRI_Admin_11.1.2.3.510 guide and scripts used in the forums: the guide uses fdmContext["LOCNAME"] while the forums seem to have success with fdmContext["LOCNAME"].

    Thank you

    KM

    Hi KM.

    You can try to add this line in your script:

    fdmAPI.logInfo(', '.join(map(str, fdmContext.values())))

    and see in your log file if you have any value at all in fdmContext.
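    If it helps, here is a slightly longer version of the same idea. This is just a sketch: it assumes fdmContext supports key lookups the way your own example does, and it only uses keys already mentioned in this thread.

    # Debugging sketch for an FDMEE Jython import script
    for key in ["LOCNAME", "PERIODNAME", "PERIODKEY"]:
        # log each key with its value so you can see which entries are populated
        fdmAPI.logInfo("fdmContext[" + key + "] = " + str(fdmContext[key]))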

    You should also include the content of your log file; it might help.

    Julien

  • Permission errors on import of a custom column in the Person table

    I have made a schema extension to our D1IM system in our dev environment, adding a new column and the appropriate permissions to the Person table. I could successfully view, modify, and import into this new column. Then I migrated my changes to the test system, and now I get a permission error whenever I run the import. I can still change this field using the Manager. The error is:

    [810024] employees: look at a permission denied for value "import updated HR."

    Any thoughts on what I'm missing would be fantastic!

    Thank you
    Blair

    I had the same problem as well; I had to drop the field and recreate it rather than import it. I even created a separate change label just for the single column that I had extended. I would like to know the right way to import as well.

  • error in the windows script host

    When I start my laptop, an error message is displayed:
    Windows Script Host
    Script: c:\winset\mine.vbs
    Line: 2
    Char: 1
    Error: The system cannot find the file specified.
    Code: 80070002
    Source: (null)

    How can I solve this problem?

    Hi San,

    1. Are you aware of any changes made to the computer before this issue appeared?

    2. What antivirus application is installed on the computer?

    This could be a Visual Basic script loading in the background that is causing the problem.

    By placing the computer in a clean boot state, we can narrow down which program is causing the issue.

    Reference:

    How to perform a clean boot for a problem in Windows Vista, Windows 7 or Windows 8
    http://support.Microsoft.com/kb/929135

    Note: See "How to reset the computer to start as usual after clean boot troubleshooting" under More Information to return the computer to its usual startup after the repair.

    Hope this solves the problem. If the problem persists, you can write to us with the name of the program that causes this problem.

  • Connection error during the import of metadata in OBIEE 11g

    Hi all Experts,

    I'm trying to import metadata from an Oracle 11g R2 database (a supported version) into the OBIEE 11g R1 (11.1.1.5) physical layer with the Admin Tool. When I select a data source with connection type OCI 10g/11g, it returns a "connection failed" error, although all of the Oracle database services are up and running. (The Repository Creation Utility ran successfully against this database when I installed OBIEE 11g.)

    Also, I created an ODBC connection to the Oracle 11g R2 database for the import; the connection is created successfully, but the import returns the same error message with all users (sys and others).
    When I tried another data source, such as SQL Server 2005, there is no error and I can import successfully.

    Please give some solutions

    Thank you

    Hi Reda,

    Paste the tnsnames.ora file in these two places:

    {Oracle_BI1}\network\admin directory &

    {oracle_common}\network\admin directory

    Rgds,
    DpKa

  • Error after loading a DDL script

    Hi all

    I am running Oracle 10g on my machine. I have created an ER diagram in Toad Data Modeler, which includes all the keys, constraints, etc.
    I generated a DDL script that I want to use to build my tables in Oracle.

    I loaded the DDL script using the SQL*Plus worksheet and there was no problem.
    I then tried to insert test data into my tables and I get an error:

    ORA-00942 table or view does not exist.


    Now, I know that the tables have been created, and I also checked the data dictionary using:

    Select table_name
    from user_tables;

    TABLE_NAME
    ==========
    Table1
    Table2
    Table3
    .
    .
    etc.


    It then shows all 20 of my tables, as shown above. I use a Visual Basic front end and I can see all 20 tables listed with the attribute names for each table.
    So guys, where am I going wrong?

    Thanks in advance,
    OracleTechie

    Log in as SYSADMIN & issue the SQL below:

    select * from "customer";

  • Error importing tables with nested table types

    Hello

    I have two tables with a nested table type. When I try to import from one schema to another, all tables using that type fail with a "different identifier" error.

    Here is my import command:
    imp file=tables_nested.dmp ignore=y fromuser=ABC touser=DEV_SCHEMA toid_novalidate=sup_payment_type
    I tried to import with the TOID_NOVALIDATE option, but it says:
    IMP-00086: TOID 'SUP_PAYMENT_TYPE' not found in export file


    Here is the relevant part of my log file:

    . . importing table "CONTRACT"          788 rows imported
    . . importing table "EQUIPMENT"        4301 rows imported
    . . importing table "CONT_EQUIPMENT"   4300 rows imported
    IMP-00017: following statement failed with ORACLE error 2304:
     "CREATE TYPE "SUP_PAYMENT_TYPE" TIMESTAMP '2007-11-28:10:50:50' OID '3FF6F10"
     "CADC08A99E040A8C0010178F9' AS OBJECT ("
     "CONT_NO NUMBER,"
     "EQP_NO NUMBER,"
     "PMT_NO NUMBER,"
     "PLAN_PMT_DATE DATE,"
     "S_NO NUMBER,"
     "BATCH_NO NUMBER,"
     "TRAN_DT DATE,"
     "ACTUAL_PMT_DATE DATE,"
     "ACTUAL_PAID_AMT_CURR NUMBER,"
     "ACTUAL_PAID_AMT_KZT NUMBER,"
     "PMT_CURSTYP_CD NUMBER,"
     "PMT_EXG_RATE NUMBER);"
    IMP-00003: ORACLE error 2304 encountered
    ORA-02304: invalid object identifier literal
    IMP-00063: Warning: Skipping table "DEV_SCHEMA"."SUPPLIER_PAYMENT" because object type
    "DEV_SCHEMA"."SUP_PAYMENT_TYPE" cannot be created or has different identifier
    About to enable constraints...

    Thank you

    Baptist

    Looking up the error on Tahiti, I realized this must be a common problem and would be described in Metaclunk.
    So in Metaclunk I searched for 'ora-02304 imp' and came up with ML note 1066139.6.
    It describes your situation.
    So many times it is very easy to solve your problems in a few minutes. I always wonder why people go immediately into shock and horror and do nothing when they hit an error.

    ----
    Sybrand Bakker
    Senior Oracle DBA

  • The importer reported a generic error

    Hello

    I can't import any file into my Premiere Elements 10. I get the error message "The importer reported a generic error." It used to work fine before (I imported the same file earlier). My laptop crashed, so I restored it and reinstalled PE10. I lost my original PE10 serial number and got a new serial number from Tech Support. Now I cannot import any file. I see a lot of discussion on the same topic, but it seems that nothing is resolved. Please help me.

    Thank you

    The Anji

    Let us begin with general information.

    What operating system is your Premiere Elements 10 running on? And does your computer use an NVIDIA GeForce video/graphics card?

    If it is an NVIDIA GeForce card, then stop and let us know. There is a serious Premiere Elements 10/NVIDIA GeForce problem which is corrected by rolling the card's driver back to a version from about May 2013. We need to rule this factor in or out before considering other troubleshooting paths.

    See the announcement at the top of this forum.

    What are the properties of this file that was compatible with Premiere Elements 10 before but is not now?

    Video compression

    Audio compression

    Size of the image

    Frame rate

    Interlaced or progressive

    File extension

    Format of the pixels

    The brand/model/settings of the camera that recorded the video might give us the information we need.

    How long ago did all of this work?

    What project preset are you setting to match the properties of your source media?

    Then the usual... is the latest version of QuickTime installed... are you running from a user account with administrative privileges, and have you also applied Run As Administrator?

    Let us start here and then look at the troubleshooting plan.

    Thank you.

    RTA

  • Simple logic group to run before a custom import script?

    Hi all

    Thank you for taking the time to read my question. I will gladly mark this thread as helpful or answered if you can help me. I am a novice to FDM, so please bear with me.

    I have a custom import script that assigns ICP None to a specific account (overriding any ICP detail). However, I now need the ICP detail for this account in a second, statistical account. I set up a simple logic group to create a logic account that I can map to the statistical account, but then realized that the import script runs before the logic group, so I lose all of the ICP detail in the logic account as well.

    Is it possible to run the logic group before the import script, or is there a better way to accomplish what I'm doing?

    I don't know how relevant it is, but I'm using FDM v11.1.1.3.01 with the 11x-G5-C adapter.

    Published by: user4591089 on August 17, 2011 14:10

    Published by: user4591089 on August 17, 2011 14:50

    Follow these steps:

    (1) Remove the custom import script.
    (2) Create a complex logic group account and set the ICP dimension's Group By column to the value [ICP None]. This is what will be displayed on the Import screen for this logic account.
    (3) Map the original source account to the statistical account, and the logic account as appropriate.

    Published by: SH on August 18, 2011 09:48

  • Import script API

    Hi guys,

    Here's my question! We know we can't use the API in import scripts, but is it possible to use the API in import scripts at all? :)

    Without using the API in import scripts, how can you call a script from another script or from within the same script (using functions such as fImportScripts, etc.)?

    Thank you in advance!

    I have no problem understanding your question. However, the key sentence in my answer was that you'll get better answers if you ask specific questions. The reason I ask you to be specific is that if I know WHY you want to use the API in an import script, I might be able to suggest how to meet your requirement by going a different way.

    I want to help you, but I need something concrete to go on.

    On the issue of the API, you ask:

    Well, I guess the general answer is no. The API object is out of scope when executing an import script. I imagine this prevents users from coding something during import that would cause FDM to tie itself in knots by allowing direct access to the data and runtime objects. Whatever the reason, it's the way FDM is designed, and you need to code within this design in order to get where you want. It's a long way around to the same answer I gave you earlier.

    However, for the second part of your question, can you call other scripts from an import script: yes, absolutely. You can start processes too.

    Kind regards
    Robb Salzman

  • [SOLVED] Oracle SQL Data Modeler export is missing a PRIMARY KEY in the DDL script

    I am using Data Modeler 4.1.888 to create an ER diagram and generate a DDL script from it.

    The diagram contains more than 40 tables, most of which have a primary key defined.

    For some reason there is one table that has a primary key defined but which is ignored when I export the model to a DDL script.

    This is the problem key (it is checked, and yet it is not found in the generated DDL script):

    4faS5.png

    This is where the key is set:

    O5mPb.png

    And this is the DDL preview (yes, the primary key shows up there):

    SrwMu.png

    This is what happens if I try to generate the DDL for just this table (the primary key is still not generated):

    MljUm.png

    Has anyone had the same problem? Any ideas on how to solve it?

    There is no error in the log file, but when I ran the generated DDL script I realized that I was doing something wrong:

    The table MEMBERS had a mandatory foreign key from another table, which in turn had a mandatory key against MEMBERS itself.

    So even if I generated this primary key on MEMBERS myself, and then ran the constraint definition that had returned an error in the DDL script, I could not perform an insert on either of these two tables because of the constraints.

    I revised my design and realized the relationships did not need to be mandatory. I unchecked the Mandatory box on the constraint definition and everything went well.

    I could reproduce the problem and the solution on a diagram with only two tables, so I'm sure that's it.

    Anyway, the Data Modeler "fails" silently in this kind of situation. It would probably be fairly obvious to an experienced designer that I was doing something wrong, but it is not so obvious when you are dealing with dozens of tables and all their relationships and it is your first time using the Modeler.

    Thanks for your reply :-)

  • How to check the user POV period and year in the mapping script?

    Hello

    I want to use the period and year information in a mapping script.

    For example: if I load March data, I use Q1 in the Custom 3 dimension. So I want to check the active user POV period in my C3 mapping automatically, as in an import script, something like this:

    Select Case Month(dblPerKey)

        Case 1 To 3

            Result = "Q1"

        Case 4 To 6

            Result = "Q2"

    ......

    The test results with dblPerKey:

    It seems that if I use this in an import script, then dblPerKey contains the value, but when I use it in a mapping script, dblPerKey is empty. At least when I tested it, the script line 'Result = Year(dblPerKey)' returned 1899.

    Summary:

    In the import script: Year(dblPerKey) returns the year of the user POV selection

    In the mapping script: Year(dblPerKey) always returns 1899

    Question: Is there a way to check the user POV in mapping scripts?

    Thanks a lot for your answers

    BR

    Karnie

    Replying to myself: I need to use varValues(3) in mapping scripts, not dblPerKey.

  • Using .split in an import script with a comma-delimited data file

    Hi everyone-

    I am attempting to create an amount-field import script to remove the apostrophes (') from an account description field in a .csv data import file (any record with an apostrophe gets rejected during the import phase).  Now, if it were a file delimited by semicolons (or any separator other than commas), I could remove all the apostrophes from the record with a string.replace command and then return the amount with a string.split command.  Unfortunately, there is a problem with this solution when using comma delimiters. My data file is a comma-delimited .csv file with several amount fields that have commas in them.  Even though the fields are surrounded by quotes, the .split ignores the quotes and treats the commas inside the amount fields as delimiters.  Therefore, the script does not return the correct amount field.

    Here is an example data record for reference:

    "", "0300-100000", "description of the account with an apostrophe ' ', '$1 000.00",' $1 000.00 "," $1 000.00 ","$1 000.00"," $1 000,00 "" "

    My goal is to remove the apostrophes from field 3 and return the amount in field 8.

    Some things to note:

    • If possible, I would like to keep this as an amount import script to simplify administration - but I am willing to go with a BefImport event script if that is the only option, or if it is more straightforward than the import-script-based solution.
    • I tried using regular expressions, which seems conceptually the simplest option for respecting the quotes as a text qualifier, but I think I am not implementing it properly and could not find regex examples for FDMEE.
    • I know that we cannot use the Jython csv module in import scripts, per Francisco's blog post - Fishing with FDMEE: import scripts do not use the same version of Jython as event/custom scripts (PSU510). This may be a factor in going with an event script instead.
    • It's probably a bit of an over-engineered solution, but I have considered trying to write a script to determine where all the quotes start and end.  Assuming there are no quotation marks inside my account description (or I could remove them beforehand), I could then use the positions of the quotes to remove the commas inside those positions, leaving the delimiter commas as they are.  I could then use .split, since the description/amount fields would have no commas.  I think it may be better to create an event script rather than go down this route, from the point of view of keeping administration as simple as possible.
    • Yes, we could do a search and replace in the Excel file to remove the apostrophes before import, but that's no fun.

    Thanks for any advice or input!

    Dan

    Hi Dan,

    If your line is comma-delimited and quote-qualified, you can consider the delimiter to be QuoteCommaQuote, or "," because that is what comes between each field.  Think of it like that, then simply split on this value:

    split("\",\"")

    Here's something I put together in Eclipse:

    '''

    Created on Aug 26, 2014

    @author: robb salzmann

    '''

    import re

    strRecord = "\"\",\"0300-100000\",\"Account description with an apostrophe ' \",\"$1,000.00\",\"$1,000.00\",\"$1,000.00\",\"$1,001.00\",\"$1,002.00\""

    strFields = strRecord.split("\",\"")

    strDescriptionWithoutApos = strFields[2].replace("'", "")   # remove the apostrophe

    strAmountInLastCol = strFields[-1].replace("\"", "")        # strip off the trailing quote from the last field

    print strDescriptionWithoutApos

    print strAmountInLastCol

    Account description with an apostrophe

    $1,002.00
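
    Since you mentioned wanting a regex-based approach but not finding examples, here is a minimal sketch of that alternative too. It is only an illustration, assuming every field in the record is quote-qualified exactly as in the sample above; the variable names are made up for the example.

    import re

    strRecord = "\"\",\"0300-100000\",\"Account description with an apostrophe ' \",\"$1,000.00\",\"$1,000.00\",\"$1,000.00\",\"$1,001.00\",\"$1,002.00\""

    # Capture everything between each pair of double quotes; the commas inside
    # the quoted amount fields are kept because they are matched as field content.
    strFields = re.findall(r'"([^"]*)"', strRecord)

    strDescription = strFields[2].replace("'", "")   # field 3: description without apostrophes
    strAmount = strFields[-1]                        # field 8: last amount, already unquoted

    print strDescription
    print strAmount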

  • 12.2.4 WebADI - custom integrator fails to trigger the cleanup script when the import returns error lines

    Hi, hoping for help because there is not much useful or detailed documentation on this topic out there.

    I built a custom WebADI integrator that uses custom procedures to validate and upload the data into a custom table.  I then call an importer script that checks file balances (and other things) and marks the rows that do not pass with an error status.  I also managed to get these errors picked up by the error line and error message lookup definition section, and all is right with the world.

    Except that when the import runs and reports errors and the sad faces appear in the worksheet, I expect the cleanup script to then fire and delete the records so that, when the errors are corrected in the spreadsheet, we don't get duplicate rows... and it does not.  No amount of tweaking, raising application errors, etc. will make the cleanup fire.

    Can someone tell me what my PL/SQL import program needs to do when it detects an error so that the cleanup procedure is executed?

    Thanks in advance

    Mike

    I FIXED it myself.

    The importer must be a function and return a VARCHAR2 error message.  If you return NULL, then the cleanup is not called as you might expect. If you return any string, then the cleanup is called.

  • FDMEE: Unable to capture the successful import status flag in the AfterImport script

    Hello Experts,

    I am facing a problem capturing the successful import flag in FDMEE.

    I have a requirement to send e-mail notifications on successful and unsuccessful imports, mentioning the status.

    I tried to use variables:

    1 set objProcessStatus = objFDMAPI.API.MaintenanceMgr.fProcessStatus (strLoc, strCat, strPer, strRulID)

    objProcessStatus.blnImp

    The objProcessStatus.blnImp value is always returned as 'False', regardless of the state of the import (in both the successful and unsuccessful cases).

    2. I tried with objProcessStatus.lngStatus,

    Here also, the returned value is always '0', regardless of the status of the import.

    3. I tried to capture "PROCESSIMP" from TLOGPROCESS, and the value of PROCESSIMP is always 0, even on a successful import.

    Even though the Import fish turns orange, indicating a successful import.

    I need help with an effective way to capture the successful import status flag.

    Sample code I use to capture the State:

    Dim lngStateCheck

    Dim objProcessStatus

    Dim stat

    strLoc = objFDMAPI.API.State.LocName

    strLocKey = objFDMAPI.API.State.LocKey

    strCat = objFDMAPI.API.State.CatName

    strCatKey = objFDMAPI.API.State.CatKey

    strPer = objFDMAPI.API.State.PeriodKey

    strRulID = objFDMAPI.API.State.RuleID

    WScript.Echo "key location" & strLocKey

    WScript.Echo "ID of the rule" & strRulID

    WScript.Echo "Partition key" & strLocKey

    WScript.Echo "Period " & strPer

    "The value of object process status indicator.

    Set objProcessStatus = objFDMAPI.API.MaintenanceMgr.fProcessStatus (strLoc, strCat, strPer, strRulID)

    Set FSO = CreateObject("Scripting.FileSystemObject")

    Set WriteFile = FSO.OpenTextFile("E:\FDMEE\data\scripts\debug.txt", 8)

    WriteFile.WriteLine "I am here and my value is"

    WriteFile.WriteLine objProcessStatus.blnImp

    WriteFile.WriteLine objProcessStatus.lngStatus

    WriteFile.Close

    ' Destroy objects

    objFDMAPI.Dispose

    Set objFDMAPI = Nothing

    ********************************

    Log file entry for PROCESSIMP (the following query appears in the log file, indicating that PROCESSIMP is updated to 0):

    2015-09-01 11:15:08,008 [AIF] DEBUG:

    UPDATE TLOGPROCESS

    SET PROCESSENDTIME = CURRENT_TIMESTAMP,

    PROCESSSTATUS = 0,

    PROCESSIMP = 0,

    PROCESSVAL = 0,

    PROCESSEXP = 0,

    PROCESSENTLOAD = 0,

    PROCESSENTVAL = 0,

    BLNWCDIRTY = 0,

    BLNLOGICDIRTY = 0,

    BLNVALDIRTY = 0,

    PROCESSIMPNOTE = NULL,

    PROCESSVALNOTE = NULL,

    PROCESSEXPNOTE = NULL,

    PROCESSENTLOADNOTE = NULL,

    PROCESSENTVALNOTE = NULL

    WHERE PARTITIONKEY = 37 AND CATKEY = 1 AND PERIODKEY = '2017-01-31' AND RULE_ID = 56

    I tried to print the value of IMPSTATUS in the AfterImport & BefValidate scripts.

    The value in AfterImport is 0

    While in BefValidate it is 1

    This proves that the IMPSTATUS status flag is only changed after the AfterImport script has executed.

    Thanks for your time and your help.
