Question about data quality products

Hello,
I am preparing an offer for one of our clients, and I have questions about ODI's advanced features:
(1) Are 'Oracle Data Quality for Data Integration with Oracle Data Profiling 11g' and 'Oracle Enterprise Data Quality' different products, or is the second the standalone version of the first?
(2) Where can I find the list of countries supported by the country-specific name and address standardization transformations built into 'Oracle Data Quality for Data Integration with Oracle Data Profiling 11g'? Or perhaps you know another way to check whether, for example, Poland is supported?
I can't find clear answers to my questions, so I decided to ask for help.
Thanks in advance,
Kind regards
Piast


Tags: Business Intelligence

Similar Questions

  • Question about Data Quality and CKM

    If I understand correctly, a CKM only supports checks based on database constraints. If I want more complicated audits built with business logic, is Data Quality a good choice, or are there other suggestions?

    In my case, I will have to check data in a source table against data in tables from different sources (both source and target tables). This should be doable with Data Quality, correct? I am new to ODI. When I first installed ODI, I chose not to install the Data Quality module. I assume I can install DQ separately and bind it to ODI? Do they share the same master repository?

    Sorry for the naïve questions, your help is greatly appreciated.

    -Wei

    Hi Wei,

    Not necessarily.

    You can create your own constraint that will exist only in ODI, and it can be complex.

    Right-click on "Constraints" in any datastore, and you can navigate between them...

    Does this help?
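    For a concrete illustration (table and column names here are hypothetical, not from this thread), a complex business rule of this kind can be written as the SQL clause of an ODI condition-type constraint; the CKM then flags the rows for which it is false:

    ```sql
    -- Hypothetical ODI condition constraint (a sketch, not from this thread).
    -- Attached under "Constraints" on the datastore, it goes beyond a plain
    -- db constraint: the CKM marks as errors all rows where this is false.
    ORDER_AMOUNT > 0
    AND ORDER_DATE <= SYSDATE
    AND CUSTOMER_ID IN (SELECT CUSTOMER_ID FROM CUSTOMERS WHERE STATUS = 'ACTIVE')
    ```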

  • ODI certification: Oracle Enterprise Data Quality: Match, Parse, Profile, Audit, Operate

    Please suggest how to prepare for the ODI certification topic: Oracle Enterprise Data Quality: Match, Parse, Profile, Audit, Operate.

    There is a very good OEDQ forum here - the product team is very active there, so go there to ask.

    There are also related videos on YouTube (they are made by the product team and therefore reflect best practices) - just search YouTube for EDQ.

    As always, the best way to learn is to do - set up your own EDQ system on your PC and try things out.

  • Challenges using dynamic SQL for data quality tests

    (posted in the wrong forum)

    Background:

    I'm developing a data quality mechanism on Oracle 11g which uses five PL/SQL blocks per test script:
    - two to supervise the test event, by inserting a record for it in a fact table and updating it after the test with the resulting counts;
    - two to create a "should be" image of the tested data set and store it in a staging table;
    - and the remaining block to perform the analysis, by selecting a union of the staging-table data and the "as is" data of the target table, dropping all matching records so that only mismatches (errors) remain, and storing this information in a results table, with each record labeled by source: either "should be" (staging) or "as is" (target).

    I intend to make this data-driven, so that each block is stored as a value in a scripts table, associated with one or more test scripts through record key values, and pulled into a dynamic SQL execution by a stored procedure in an Oracle package called by Informatica. In this way, a variety of data quality tests can be programmed as automated controls that run every night and trigger alerts if problems are detected.

    I have two challenges:
    - The PL/SQL blocks that create the "should be" data set can be very long, and I learned through the OTN discussion forums that the 32767-byte maximum size of a PL/SQL VARCHAR2 variable may not be large enough to hold the entire block, and that the EXECUTE IMMEDIATE statement does not accept a CLOB.
    - If there are no anomalies, the COUNT() OVER (PARTITION BY) trick that I use to get the subquery counts won't work, because the whole analysis block is one INSERT INTO table (column1, column2, etc.) SELECT (comparison query) statement, to avoid having to hit the target table several times. My approach here is driven by performance concerns.

    First question: can I nest EXECUTE IMMEDIATE, so that the dynamically executed SQL block itself issues another EXECUTE IMMEDIATE statement calling another SQL block? This would solve my first problem.
    Second question: what is the most effective way to get the record counts of the subqueries in an INSERT INTO (SELECT) statement? I feel I'm too close to the code to see the obvious answer here.
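    For reference, the number of rows an INSERT INTO ... SELECT actually produced can be read afterward from SQL%ROWCOUNT - a partial workaround for the zero-anomaly case, sketched here with hypothetical table names (not from this thread):

    ```sql
    DECLARE
      mismatch_rows PLS_INTEGER;
    BEGIN
      -- Hypothetical, simplified version of the comparison insert.
      INSERT INTO test_event_detail (column1)
      SELECT columnname1 FROM test_event_stage;

      -- Rows written by the INSERT ... SELECT; zero mismatches is still
      -- observable even though the analytic counts never materialize.
      mismatch_rows := SQL%ROWCOUNT;
      IF mismatch_rows = 0 THEN
        DBMS_OUTPUT.PUT_LINE('No mismatches found');
      END IF;
    END;
    /
    ```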

    Here is a shell of the analysis block (second question):
    DECLARE
    
    StartDate DATE;
    EndDate DATE;
    TEST_KEY NUMBER;
    
    BEGIN
    
    INSERT INTO TEST_EVENT_DETAIL 
        (TEST_EVENT_KEY,TEST_EVENT_DETAIL_KEY,COLUMN1,COLUMN2,COLUMN3,COLUMN4,
        COLUMN5)
    SELECT
        TEST_KEY as TEST_EVENT_KEY,
        TEST_EVENT_DETAIL_SEQ.NEXTVAL AS TEST_EVENT_DETAIL_KEY,
        RESULTS.TABLENAME as COLUMN1,
        RESULTS.COLUMNNAME1 as COLUMN2,
        RESULTS.COLUMNNAME2 as COLUMN3,
        RESULTS.subqry_count as COLUMN4,
        null as COLUMN5  -- there are more generic columns, this is just an example; columns not used by a particular test are filled with nulls
    FROM
    (SELECT MIN(TABLENAME) as TABLENAME,
        min(subqry_count) as subqry_count, 
        COLUMNNAME1,
        COLUMNNAME2
      FROM 
    (SELECT TABLENAME as TABLENAME,
        count(TABLENAME) over (partition by TABLENAME) subqry_count,
        COLUMNNAME1,
        COLUMNNAME2
      from
    (
    /** Source Data **/
     SELECT 'SOURCE' as TABLENAME,
        COLUMNNAME1,
        COLUMNNAME2
       FROM TEST_EVENT_STAGE A
     WHERE A.TEST_EVENT_KEY=TEST_KEY
      UNION ALL
    /** Target Data **/
      SELECT 'TARGET' as TABLENAME,
        COLUMNNAME1,
        COLUMNNAME2
      FROM TABLENAME B
      WHERE ____________
    ) TMP
    )
    GROUP BY COLUMNNAME1, COLUMNNAME2
    HAVING COUNT(*) = 1 -- this drops out all records that don't match
    ORDER BY COLUMNNAME1, COLUMNNAME2, TABLENAME ASC
    ) RESULTS;
    
    END;
    and here's some pseudocode for the stored procedure that would call the PL/SQL blocks (first question):
    Declare
    
    TestProcessStart DATE;
    TestKey     NUMBER;
    StartDate      DATE;
    EndDate      DATE;
    BlockStatus        varchar2(200);
    BlockSQL        varchar2(32767);
    
    begin
     
    select SCRIPT_BLOCK into BlockSQL from DIM_SCRIPTS where SCRIPT_KEY=(select SQL_BLOCK2 from DIM_TEST where TEST_KEY=TestKey);
    
    execute immediate 'begin ' || BlockSQL || '; end;'
       using in BlockSQL, in out BlockStatus ;
    
     if BlockStatus != 'OK' then
        dbms_output.put_line('error');
     end if;
    end;
    Any ideas/recommendations?

    The PL/SQL blocks that create the "should be" data set can be very long, and I learned through the OTN discussion forums that the 32767-byte maximum size of a PL/SQL string variable may not be large enough to hold the entire block, and the EXECUTE IMMEDIATE statement does not accept a CLOB.

    Not in 11g anymore - see the EXECUTE IMMEDIATE statement documentation:

    »
    dynamic_sql_stmt

    A string literal, string variable, or string expression that represents a SQL statement. Its type must be CHAR, VARCHAR2, or CLOB.
    «
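    A minimal sketch of both points, reusing the (hypothetical) DIM_SCRIPTS naming from the question: in 11g the dynamic statement can be a CLOB, so the 32767-byte VARCHAR2 limit no longer applies, and the block being executed may itself issue EXECUTE IMMEDIATE, since dynamic PL/SQL runs like any other PL/SQL:

    ```sql
    DECLARE
      BlockSQL CLOB;  -- CLOB is accepted by EXECUTE IMMEDIATE from 11g onward
    BEGIN
      SELECT SCRIPT_BLOCK
        INTO BlockSQL
        FROM DIM_SCRIPTS        -- hypothetical scripts table from the question
       WHERE SCRIPT_KEY = 1;    -- hypothetical key value

      -- The fetched block may itself contain another EXECUTE IMMEDIATE,
      -- so nesting dynamic blocks inside dynamic blocks works.
      EXECUTE IMMEDIATE 'BEGIN ' || BlockSQL || '; END;';
    END;
    /
    ```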

  • Suggestion for the Oracle Data Mining product team

    Oracle product team,

    Here's a suggestion for Data Mining product marketing and development.

    For a data mining specialist or data scientist, it does not register well when they see a product named "SQL Developer" that has data mining capability - unless Oracle is intentionally marketing and selling only to its existing Oracle client base, which I hope is not the case.

    Continue to have the data mining capabilities as part of SQL Developer, but also make it a standalone product. This would put Oracle in a much better position to compete with vendors such as RapidMiner, etc., and it would also gel with many statisticians and data scientists.

    Bottom line: start using "SQL" less and 'algorithms, libraries, APIs' more - market the data mining product to a non-Oracle customer base and make them buy the DB as a result. It seems to me that you sell these products mostly to Oracle DB clients now, but you should also have a good salesforce that can sell the Data Mining capabilities to customers who are not on Oracle DB and make them buy Oracle DB as well.


    I like SQL and it is the way to go for most data extraction, but go with the wind and use both SQL and non-SQL terms to get a foot in the door of the market. Oracle has amazing technology beyond the Oracle database... it just needs to market it better. (I say this with 22 years of Oracle experience.)


    I'm curious what other people who love Oracle and use SQL Developer's Data Mining capabilities say about it. Please post your comments. Thanks.


    D21,

    You make a very good point. This isn't a comment that went unnoticed or that we ignored. Unfortunately, Oracle cannot make statements about the future. Thank you very much for the comments. Keep them coming!

    Charlie

  • Where to find the 'Data Quality Customer Data Services Pack Installation Guide'?

    Does anyone know where to find this document:

    Oracle Enterprise Data Quality Customer Data Services Pack Installation Guide


    It is referenced in the Oracle® Enterprise Data Quality Customer Data Services Pack Siebel Integration Guide.


    I tried to import the project into my test system, but there are some error messages. Therefore, I would like to stick to the documentation.


    Kind regards

    Daniel

    Hello

    All the technical documentation for EDQ 11g is online here: http://docs.oracle.com/cd/E48549_01/index.htm

    The CDS Installation Guide is here: content

    The EDQ documentation page on OTN has the docs for 9.0 if necessary:

    http://www.Oracle.com/technetwork/middleware/oedq/documentation/index.html

    Kind regards

    Mike

  • Hello, I just purchased a Leica; Lightroom should be free; I try to download it but there is a problem. Can anyone help?  Dear Leica customer, we are delighted that you are interested in one of our high quality products and are hereby sending you the rede

    I just bought a Leica; Lightroom should be included for free.

    I try to download it, but there is a problem.

    Can anyone help?

    Dear Leica customer,
    We are delighted that you are interested in one of our high quality products and are hereby sending you the redemption code for downloading the software directly from Adobe.

    Please enter this code at the following Web site to download your version of Lightroom:

    Your redemption code: xxxx.xxxx.xxxx.xxx.xxx

    https://redeem.licenses.Adobe.com/getserial

    We hope you enjoy our product.
    Leica Camera AG

    Please check How to download, install and activate Adobe applications

  • In order to become an Adobe Reader volume distributor (my computers have no internet access), I need to fill out the Volume Distribution License Agreement, but it will not accept the answer I gave to the question: "Please indicate the product or service

    In order to become an Adobe Reader volume distributor (my computers have no internet access), I need to fill out the Volume Distribution License Agreement, but it will not accept the answer I gave to the question: "Please indicate the product or service name and description."

    This form must be completed and submitted online. Javascript must also be enabled in the browser.

    https://distribute.Adobe.com/mmForm/index.cfm?name=distribution_form&PV=RDR

  • OWB Data Quality & Data Profiling option - basic or premium license?

    Hello

    Does anyone know whether the Data Quality & Data Profiling options are already included in the base Oracle Warehouse Builder license or not?

    I look forward to your answers :)

    Published by: BartSimpson81 on 11.04.2012 07:38

    It's not basic/free.

    Cheers
    David

  • Tearing issue with all Adobe products using Mercury Transmit?

    Hello.

    I am running the latest version of the Adobe CC Suite (applications are up to date). My computer is an AMD FX 8150 with an R9 390 graphics card (whose driver is also up to date) running the newest retail version of Windows 10. The issue I have is a kind of 'tearing' effect on all monitors using Mercury Transmit, whichever renderer I choose, hardware or software. The 'tearing', as best I can describe it, appears in all footage loaded into any Adobe program, even still images panned in After Effects. It's almost as if my graphics card is refreshing each horizontal quarter of the screen at different times, which translates into 4 or 5 'break points' along the preview. It's not bad until you have a shot with a horizontal pan; then it's like 4 or 5 different 'zones' of the screen refreshing at different times... Given that it is clearly not a problem with the original files (how could it be?), and it does not appear when the sequence is rendered, what might it be? It must be a problem purely with Mercury Transmit, given that the tearing does NOT appear in the preview viewport inside any of the Adobe applications - only on the monitors designated as 'external' via Mercury Transmit. My viewport image looks perfectly functional and tear-free.

    I don't see it being a frame-rate issue, because it appears no matter what footage is loaded in... even a still image with a camera pan in After Effects will show it.

    So to summarize, there seems to be a problem with Mercury Transmit that I can't solve. No matter what I do, the tearing will not disappear when viewing footage on a monitor designated as 'external' in the 'Playback' preferences. All drivers and software are updated, and selecting hardware vs software rendering makes no difference either.

    Does anyone know of a solution to this? Does anyone else have the same problem? It's very annoying and almost enough to make me want to give up using Adobe products altogether, it's that bad.

    My main graphics card is the I/O device.

    Ahhh.  That is probably the issue right there.  You should run a dedicated I/O unit from companies like AJA or Blackmagic.

  • Question about the quality of PREL files

    Hello

    I started transferring VHS-C cassettes by placing the tapes in an adapter, which I then played in a VCR. The VCR is attached to my PC tower (Win 7) via a SCART cable at the VCR end and a USB cable connected to the tower.

    The software supplied with this equipment is called ShowBiz and is produced by ArcSoft. So I connect everything, open the ShowBiz software, press 'play' on the VCR, then
    'Capture' in the software. The capture saves to an MPG file which I then import into Premiere Elements 11.

    When I play the .prel file in my Premiere Elements 11 monitor, the quality is significantly lower than the file played from the VCR - particularly noticeable when I edit the footage and use fonts. The letters are not crisp or 'white white' if I use a white font.

    To avoid this loss of quality, I think it would be beneficial if I could bypass the ArcSoft ShowBiz software completely and capture the VHS-C tape directly
    into Elements 11. Is it reasonable to surmise that, and if so, is it possible to 'capture' the tape directly in Elements 11, and how would I do it? This is a UK VHS-C tape.

    Thank you

    Steve

    On the ArcSoft site, ShowBiz looks to be a competing product to PRE. According to its technical specifications it can output the AVI, M2TS, MOV, MP4, MPEG1, MPEG2 and WMV formats. For VHS capture I suggest you try setting the capture save format to AVI 720 x 576 PAL_B. This should significantly improve the quality for use in PRE. Be sure to check the other capture settings in the product options - there are likely other quality-related settings.

    For the second part of your question, the answer is no: PRE can't capture from analog USB converter devices.

    I use a Canopus Firewire converter so I can capture directly into PRE (although by choice, I use WinDV), but I recently helped a friend who had problems with a Pinnacle USB converter because he had lost the software disc it came with. I found an interesting FREE product, "NCH Debut", that can capture from USB converters. Its native save file format is .avi, but it can also capture and record in .dv format, which is as good as you will get for VHS video capture. Note: to support .dv it sets up another free product, ffmpeg - some have reported it giving problems on their systems, but I've never had problems with it, and for support of an unusual video format Adobe themselves once suggested I install it (though it did not solve that problem). Note that .dv files use about 13 GB per hour, so you will need a lot of disk space if you capture in this format. So if you are unhappy with ShowBiz, give NCH Debut a try; if you do and have problems with it, post back here and I'll walk you through its options. During the install beware of the checkboxes - it offers to install trial versions of other NCH products - deselect them all.

    Cheers,
    --
    Neale
    Insanity is hereditary, you get it from your children.

  • Problem launching ODI data quality/profiling in Vista

    Hello

    I have a weird problem with Oracle Data Integrator installation in Vista. Hope someone could help me.

    I installed ODI (10.1.3.5.0) along with Data Quality and Profiling. The installation was successful; however, while trying to launch the Data Quality / Metabase server, etc., I get an error: "admin.exe is not found" / "discovery.exe not found." In addition, the bin folder of the oracledq folder is empty. I would expect 'something' to be in the bin directory if the installation was error-free!

    Could you please let me know if something is wrong with the installation, or is this a Vista problem? Do I need to install anything else as a prerequisite?


    Your help is very appreciated...

    Ludovic Gupta

    Hi guys,

    I would like to contribute a little:

    ODQ/ODP is not certified (compatible) with Win Vista. Please see the Excel sheet below for more info.

    http://www.Oracle.com/technology/products/Oracle-data-integrator/10.1.3/htdocs/documentation/odi_certification.xls

    Thank you
    Guru

  • Complex little question about transferring a product key

    I have an HP (SR5518F) computer which originally came with Vista. I downgraded to XP, but as time passed it gradually began to fail, so I decided to restore Vista - only to find the restore partition was broken. Not wanting to pay 40 to 50 dollars for the restore disks, I borrowed a friend's disc to reformat. It wasn't a full or upgrade disc; as far as I can tell it was just a restore disc (a Dell one, because that's what it says in My Computer).

    Basically, because of this I cannot activate with the original product key. I do have an upgrade to Windows 7, but I'm sure it checks for activation. Would it be possible to activate and/or upgrade in this situation?

    You cannot use a Dell recovery disk on an HP computer.

    Spend the money on the HP recovery disks:

    http://welcome.HP.com/country/us/en/contact_us.html

    Contact your computer manufacturer and ask them to send a Vista recovery disk set.

    Normally, they do this for a small cost.

    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    You can't upgrade an unactivated Vista to Windows 7!

    For any other question of Windows 7:

    http://social.answers.Microsoft.com/forums/en-us/category/Windows7

    The link above is the Windows 7 forum for questions on Windows 7.

    Windows 7 questions should be directed to it.

    You are in the Vista Forums.

    Cheers.

    Mick Murphy - Microsoft partner

  • Capture product data and display recently viewed items (add to favorites)

    Hey guys!

    I was wondering if I could get help with 2 things...

    1. Has anyone managed to use {tag_addtofavorites} without the user being logged in?

    2. Is there some way to capture product data and display it as 'recently viewed' at the bottom of the page, like Argos.co.uk or some other stores...?

    Thank you

    Hi Ricardo,

    You can't use it without being logged in, but you can create a solution with Liquid/JavaScript to achieve this. You can contact brett [at] prettdigital.com.au if you are not able to implement one yourself, so we can quote and build it for you.

  • Question on WebService data control data persistence

    Hello

    We use a Web Service data control on two jspx pages of the application. The data control attributes are added on both pages as data bindings.

    Scenario: we submit certain data from page 1 using the web service data control and go to page 2. We fill in some data on page 2, make another request through the web service data control, and after a successful response go back to page 1. The same thing repeats again and again in this scenario.

    Now the issue is: we submit data from page 1, move to page 2, submit some data from page 2 and return to page 1. If we repeat this process, then while submitting data on page 1, the data that we submitted earlier on page 2 also gets submitted, which it should not. We want to submit only the data bound on the current page. Page 1 is in an unbounded taskflow and page 2 is in a bounded taskflow.

    We tried setting CacheResults=false on the iterator, the refresh condition to ifNeeded, and UsePersistentStructure=false on the data control. We also tried resetting and clearing the data using the API and resetInputState on the bindings. But nothing works. The only method that works is clearForRecreate on the page 1 iterator, but since it is an internal method, it might not be good to use. Also, after using it we sometimes receive an intermittent error indicating that AttributeList$IRB cannot be cast to a String, and an 'argument is not an array' exception a few times.

    Please let me know how I can ensure that the correct data is sent to the web service data control.

    Using JDev 11.1.1.6.

    Thank you

    Solved by removing the data control option related to the actions in the taskflow.

    Thank you
