Oracle Streams and impdp?

Hi, I am not a DBA or anything close :)... In a project that I am taking over, there are 2 or more prod servers that are connected
through Streams for bidirectional data replication. Now we must develop a backup/restore mechanism using scripts. The backup script takes a dump of a schema using expdp, and the restore script uses impdp to apply it. Is there a way I can apply the dump with impdp to one database and have Streams apply it to all the other databases? I tried Googling, Yahooing, everything, but to no avail. We also use Data Guard, and it is strange to me that Data Guard applies the changes on the standby each time I do an impdp on the primary database.

Maybe my understanding of Streams is wrong...

Any help is greatly appreciated.

Thank you
Renjith

Renjith,

Answering your question:

Yes, if all the databases are configured in the Streams replication. Take the simple example of two databases, A and B, with Streams replication set up from A to B: if you perform an impdp of a schema (one that is configured for replication) into database A, then all the changes made by the impdp will be replicated to database B.

If you have configured bidirectional replication between 2 or more databases (it can be an n-database star), then the impdp changes should be replicated to all the other databases.

However, if you see that replication does not occur after performing the impdp, then you need the fix for bug 5220845. This bug is fixed in the 10.2.0.4 patch set (and also in 11.1.0.6, the 11g Release 1 base release) and later. If you run a patch set lower than that (such as 10.2.0.3), then apply the one-off patch for this bug. Take a look at the following note on Metalink:

"Water does not replicate data imported with Datapump."

Thank you
Florent
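
As a quick sanity check that the impdp changes really are flowing through Streams, you can watch the apply side on the other databases. A minimal sketch (the views below exist in 10g/11g; the apply process name is whatever you configured):

-- On each destination database, after running impdp on the source:
select apply_name, status from dba_apply;

-- High-water mark of what the apply process has already processed
select apply_name, hwm_message_number, hwm_time
from   v$streams_apply_coordinator;

If the apply process is not ENABLED, or the high-water mark stops advancing during the import, that points to the bug/patch situation described above.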

Tags: Database

Similar Questions

  • Which database objects can be replicated using Oracle Streams, and which objects cannot be replicated?

    Hi Experts,

    I need clarification on the following sub-questions.


    Which database objects can be replicated using Oracle Streams, and which objects cannot be replicated?

    How can we check which schemas and objects are used in Streams replication, and which schemas and objects are not?

    Thanks in advance.

    select *
    from dba_streams_unsupported
    where owner || '.' || table_name (...)
    order by 1, 2, 3;
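
    For the second part of the question, the Streams rule views give a reasonable picture of what is configured for replication; a minimal sketch (these dictionary views exist in 10g/11g):

    -- Whole schemas covered by Streams rules (capture / propagation / apply)
    select * from dba_streams_schema_rules;

    -- Individual tables covered by Streams rules
    select * from dba_streams_table_rules;

    Anything listed there can then be cross-checked against dba_streams_unsupported (quoted above) to see which of those objects Streams cannot actually replicate.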

  • Oracle Streams and CLOB column

    Hello

    We use "Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64 bit. My question is "Does Oracle Streams capture, spreads (source capture method) and applies the CLOB column changes?"

    If so, is this the default behavior? Can we tell Streams to exclude all CLOB columns from the capture-propagation-apply process?

    Thanks in advance!

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14229/strms_capture.htm#i1006263
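
    On the exclusion part: one option, which I believe is available from 10.2 onward, is a declarative rule-based transformation via DBMS_STREAMS_ADM.DELETE_COLUMN. A rough sketch, where the rule, table and column names are all placeholders - check the DBMS_STREAMS_ADM documentation for your exact release before relying on it:

    BEGIN
      -- Strip the (hypothetical) CLOB column "notes" from LCRs that match the
      -- named capture rule, so it is neither propagated nor applied.
      DBMS_STREAMS_ADM.DELETE_COLUMN(
        rule_name   => 'strmadmin.dept_docs_capture_rule',
        table_name  => 'scott.dept_docs',
        column_name => 'notes');
    END;
    /

    By default Streams does capture and apply CLOB changes, subject to the LOB restrictions described in the linked chapter.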

  • Oracle Streams Advanced Queuing and advanced replication

    Hi experts,

    I don't have much experience with Oracle. Please tell me what "Oracle Streams" is in simple words, and what it is used for, with an example.

    Also, what is the difference between Oracle Streams, Advanced Queuing and Advanced Replication?

    Reg,

    Harsh

    Hi harsh,

    I'll try and simply summarize for you.

    (1) Advanced Replication - the old way of replicating data between databases - no one really uses this method any more (even though it is still there)

    (2) AQ (now renamed Streams AQ, I think) is a queuing technology for publishing and subscribing to messages - it is not a replication method on its own, but it is a key underlying technology for Streams

    (3) Streams - a fully-fledged replication technology with a huge amount of flexibility (and complexity) - one of the best tools in the Oracle product set - but it is now being deprecated in favor of GoldenGate (in large part, I personally think, because GG is a separately charged option)

    (4) GoldenGate - much like Streams, but it can also replicate to/from non-Oracle databases (SQL Server, Sybase, etc.)

    Streams and GoldenGate do roughly the same job but are implemented very differently - Streams is mostly PL/SQL and queues, GG means learning another language (really a scripting tool though)

    hope that helps.

    Cheers,

    Rich

  • Supplemental logging with Oracle 10gR2 Streams and Data Guard

    Hello

    I have a physical standby environment with Oracle 10gR2 and a Data Guard DR configuration. This environment will now be extended to a replication scheme using two-way Oracle Streams replication (for replication from this branch to the central office; other branches will be added soon). The primary DB is replicated to the other primary DB (at the remote central office).

    So here is my question: is it absolutely necessary to enable supplemental logging on the data source (the primary) to set up two-way Streams replication? And if it is, can supplemental logging be configured on the primary without affecting its physical standby, or do I have to do something special?

    Thanks in advance.

    Yes, Streams needs supplemental logging. There is no impact on physical standby databases.
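
    For reference, supplemental logging is just DDL on the primary and only adds a little extra redo, which the physical standby simply receives and applies like any other redo. A minimal sketch (the table name below is a placeholder):

    -- Database-level identification key logging on the primary
    ALTER DATABASE ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY, UNIQUE) COLUMNS;

    -- Or, more selectively, per replicated table
    ALTER TABLE app.orders ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;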

  • Oracle Streams heterogeneous support (non-Oracle and Oracle databases) environments

    Oracle streams heterogeneous Support (non-Oracle and Oracle databases) environments?

    Is it possible to move data from Oracle to DB2 using Oracle Streams?

    Hello

    Visit this link

    http://download-West.Oracle.com/docs/CD/B28359_01/server.111/b28322/hetero.htm

    Kind regards
    Deepak

  • Oracle Streams without a DB link

    Hi all

    We have a requirement to move data for about 20 tables from a database in one network to a database in another network. The databases on both sides run 11g and have a RAC implementation.

    To perform this replication, the options are:

    -Oracle Streams
    -Oracle Golden Gate
    -Data Pump export and then import

    We have the following restrictions on the transfer:

    -No DB link can be created
    -No additional licenses can be purchased, so buying Golden Gate is out

    Given the above limitations, can we still go with Oracle Streams (from my reading of the Oracle documentation, it says a DB link is mandatory), or am I just left with the Data Pump option?

    Thanks in advance,
    Jayadeep

    Archived logs can be transferred by redo transport services; details below (see the parameter sketch after the links).

    http://docs.Oracle.com/CD/B28359_01/server.111/b28294/log_transport.htm

    Also, before settling on CDC (Change Data Capture), take a look at the following note:

    Planned desupport of Change Data Capture
    http://docs.Oracle.com/CD/E18283_01/server.112/e17222/changes.htm#CJAECCIJ
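
    If the Streams downstream-capture route is taken, the redo shipping described in the first link is configured on the source with an archive destination; a rough sketch with placeholder database names (check the downstream capture chapter for the exact attributes your setup needs - downstream capture can, I believe, be configured without a database link, but it then needs extra manual steps for build and instantiation):

    ALTER SYSTEM SET log_archive_config = 'DG_CONFIG=(src_db,dest_db)' SCOPE=BOTH;

    ALTER SYSTEM SET log_archive_dest_2 = 'SERVICE=dest_db ASYNC NOREGISTER VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dest_db' SCOPE=BOTH;

    ALTER SYSTEM SET log_archive_dest_state_2 = ENABLE SCOPE=BOTH;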

  • Oracle Streams is a licensed product

    Hi guys,

    I need to know whether or not Oracle Streams is a separately licensed product.


    Thanks and greetings
    Andrea Ballarin

    Hello

    Oracle Streams is part of Oracle Database Enterprise Edition. If you have that edition, then you can use Streams without an additional license.

    Herald tiomela
    htendam.WordPress.com

  • Difference between Oracle Designer and Forms Developer

    Hello

    Could someone explain the difference to me?


    Rgds

    sexy

    Oracle Designer is a complete computer-aided design and software development tool. It gives you a set of tools for logical design, such as an Entity Relationship Diagrammer, a Process Modeler and a Dataflow Diagrammer, and a set of tools to take the design through to physical database design. It can also help design application modules for Oracle Forms, Oracle Reports, the PL/SQL Web Toolkit or Visual Basic (VB6, I think, but not versions as late as VB.Net), and it will then generate those modules. It stores your designs in a metadata repository in an Oracle database, and you can produce some nice reports on your design data.

    Oracle Forms Developer does one thing and does it very well - it allows you to design and develop Oracle Forms modules. Note that Designer can do this too, but because Designer generates your forms, you don't have quite as much control over the way they look and work as you would with Oracle Forms Developer.

    One last thing: these tools are considered "mature". For Designer, this means that not much has changed or improved for a while. Forms, however, still gets a few changes and improvements with each new version - nothing really major, but more than Designer does. See the tools Statement of Direction for more information.

  • Full expdp and impdp: one db to another

    Hello! Nice day!

    I would like to ask for any help with my problem.

    I would like to create a full database export and import it into a different database. These 2 databases are on separate computers.
    I tried to use the expdp and impdp tools for this task. However, I experienced some problems during the import.

    Here are the details of my problems:

    When I try to impdp the dump file, it seems that I am not able to import the users' data and metadata.

    Here are the exact commands that I used for the export and import tasks:

    Export (Server #1)

    expdp user01/* directory=ora3_dir full=y dumpfile=db_full%U.dmp filesize=2G parallel=4 logfile=db_full.log

    Import (Server #2)
    impdp user01/* directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log sqlfile=db_full.sql estimate=blocks parallel=4

    Here is the log that was generated during the impdp run:

    ;;;
    Import: Release 10.2.0.1.0 - 64 bit Production on Friday, 27 November 2009 17:41:07

    Copyright (c) 2003, 2005, Oracle. All rights reserved.
    ;;;
    Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - 64 bit Production
    With partitioning, OLAP and Data Mining options
    Master table "PGDS"."SYS_SQL_FILE_FULL_01" successfully loaded/unloaded
    Starting "PGDS"."SYS_SQL_FILE_FULL_01": PGDS/* directory=ora3_dir dumpfile=ssmpdb_full%U.dmp full=y logfile=ssmpdb_full.log sqlfile=ssmpdb_full.sql
    Processing object type DATABASE_EXPORT/TABLESPACE
    Processing object type DATABASE_EXPORT/PROFILE
    Processing object type DATABASE_EXPORT/SYS_USER/USER
    Processing object type DATABASE_EXPORT/SCHEMA/USER
    Processing object type DATABASE_EXPORT/ROLE
    Processing object type DATABASE_EXPORT/GRANT/SYSTEM_GRANT/PROC_SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLESPACE_QUOTA
    Processing object type DATABASE_EXPORT/RESOURCE_COST
    Processing object type DATABASE_EXPORT/SCHEMA/DB_LINK
    Processing object type DATABASE_EXPORT/TRUSTED_DB_LINK
    Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/SEQUENCE
    Processing object type DATABASE_EXPORT/SCHEMA/SEQUENCE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/DIRECTORY/DIRECTORY
    Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/DIRECTORY/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/CONTEXT
    Processing object type DATABASE_EXPORT/SCHEMA/PUBLIC_SYNONYM/SYNONYM
    Processing object type DATABASE_EXPORT/SCHEMA/SYNONYM
    Processing object type DATABASE_EXPORT/SCHEMA/TYPE/TYPE_SPEC
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/PRE_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SYSTEM_PROCOBJACT/POST_SYSTEM_ACTIONS/PROCACT_SYSTEM
    Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/PRE_TABLE_ACTION
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/COMMENT
    Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/PACKAGE_SPEC
    Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/FUNCTION
    Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/PROCEDURE
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/FUNCTION/ALTER_FUNCTION
    Processing object type DATABASE_EXPORT/SCHEMA/PROCEDURE/ALTER_PROCEDURE
    Processing object type DATABASE_EXPORT/SCHEMA/VIEW/VIEW
    Processing object type DATABASE_EXPORT/SCHEMA/VIEW/GRANT/OWNER_GRANT/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/VIEW/GRANT/CROSS_SCHEMA/OBJECT_GRANT
    Processing object type DATABASE_EXPORT/SCHEMA/VIEW/COMMENT
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/REF_CONSTRAINT
    Processing object type DATABASE_EXPORT/SCHEMA/PACKAGE_BODIES/PACKAGE/PACKAGE_BODY
    Processing object type DATABASE_EXPORT/SCHEMA/TYPE/TYPE_BODY
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/FUNCTIONAL_AND_BITMAP/INDEX
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_TABLE_ACTION
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TRIGGER
    Processing object type DATABASE_EXPORT/SCHEMA/VIEW/TRIGGER
    Processing object type DATABASE_EXPORT/SCHEMA/JOB
    Processing object type DATABASE_EXPORT/SCHEMA/DIMENSION
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCACT_INSTANCE
    Processing object type DATABASE_EXPORT/SCHEMA/TABLE/POST_INSTANCE/PROCDEPOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCOBJ
    Processing object type DATABASE_EXPORT/SCHEMA/POST_SCHEMA/PROCACT_SCHEMA
    Job "PGDS"."SYS_SQL_FILE_FULL_01" successfully completed at 17:43:09

    Thank you in advance.

    The good news is that your dumpfile seems fine. It has metadata and data.

    I looked through your impdp command and found your problem. You have added the sqlfile parameter. This tells Data Pump to create a file that can be run from SQL*Plus; it does not actually create the objects. It also excludes the data, because data could get pretty ugly in a sqlfile.

    Here's your impdp command:

    impdp user01/* directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log sqlfile=db_full.sql...

    Just remove the

    sqlfile=db_full.sql

    After you run your first job, you will have a file named db_full.sql that has all the create statements inside it. After you remove the sqlfile parameter, your import will work (see the corrected command sketched below).

    Dean
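
    For clarity, once the sqlfile parameter is removed, the import command sketched from the one above would look roughly like this (the masked password is left as posted):

    impdp user01/* directory=ora3_dir dumpfile=db_full%U.dmp full=y logfile=db_full.log parallel=4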

  • need advice on expdp and impdp

    I have an Oracle 10g database on a Windows server.
    I installed a new server with Oracle 11g on GNU/Linux.
    Now I would like to clone the data from the old server to the new server.
    I think expdp and impdp is the best option.

    If so, what is the maximum amount of data I can handle this way?
    Is it better to do it schema by schema, or as a full database export?

    help me guys

    If you have SQL Developer, there is an option on the Tools menu to copy a database. It makes this an easy one.
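
    If you stay with Data Pump instead, a schema-mode export/import is usually the simplest route for a 10g Windows to 11g Linux move, since the dump files are platform independent and importing into a higher version needs no special handling. A minimal sketch (directory object, schema and file names are placeholders):

    Export (on the 10g Windows server):

    expdp system/******** directory=dp_dir schemas=app_owner dumpfile=app_owner_%U.dmp parallel=2 logfile=app_owner_exp.log

    Import (on the 11g Linux server, after copying the dump files into the directory mapped by dp_dir):

    impdp system/******** directory=dp_dir schemas=app_owner dumpfile=app_owner_%U.dmp parallel=2 logfile=app_owner_imp.log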

  • Oracle Streams update conflict handler does not work

    I use three databases: DB1, DB2 and DB3. I use Streams to replicate the
    scott.dept table across all three databases. I use an update conflict handler for the dept
    table. The database version is 9.2.0.8.

    Here is the code I ran on all three databases.


    DECLARE
      cols DBMS_UTILITY.NAME_ARRAY;
    BEGIN
      cols(1) := 'deptno';
      cols(2) := 'dname';
      cols(3) := 'loc';
      DBMS_APPLY_ADM.SET_UPDATE_CONFLICT_HANDLER(
        object_name       => 'scott.dept',
        method_name       => 'OVERWRITE',
        resolution_column => 'deptno',
        column_list       => cols);
    END;
    /


    Here are the contents of the table in all three databases.


    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON


    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON

    [email protected] >


    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON

    [email protected] >


    Now, I inserted a record on DB1 and it was replicated to all databases.


    [email protected] > insert into dept
    2 values(50,'IT','HOUSTON');

    1 row created.

    [email protected] > commit;

    Commit complete.

    After inserting the record, it is replicated to all three DBs.


    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 IT HOUSTON

    [email protected] >

    [email protected] > /

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 IT HOUSTON

    [email protected] >

    [email protected] > /

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 IT HOUSTON

    [email protected] >


    Now, I want to test the update conflict handler. I'm updating the same record
    in DB1, DB2 and DB3.



    [email protected] > update dept
    2 set dname = 'db1', loc = 'db1'
    3 where deptno = 50;

    1 row updated.

    [email protected] > commit;

    Commit complete.

    [email protected] >


    [email protected] > update dept set dname = 'db2',
    2 loc = 'db2' where deptno = 50;

    1 row updated.

    [email protected] > commit;

    Commit complete.

    [email protected] >

    [email protected] > update
    2 dept set dname = 'db3', loc = 'db3'
    3 where deptno = 50;

    1 row updated.

    [email protected] > commit;

    Commit complete.

    [email protected] >


    After the above change, the output is as follows.


    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 db2 db2

    [email protected] >

    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 db1 db1

    [email protected] >

    [email protected] > select * from dept;

    DEPTNO DNAME LOC
    ---------- -------------- -------------
    10 ACCOUNTING NEW YORK
    20 RESEARCH DALLAS
    30 SALES CHICAGO
    40 OPERATIONS BOSTON
    50 db2 db2

    [email protected] >

    So the deptno = 50 record is not consistent across all
    three databases. Please help me understand why the record (deptno = 50)
    is not consistent.

    The before and after images of the columns that were manipulated, along with key values to identify the target row, are what must be part of an LCR. You can check this by writing a DML handler and looking at the columns in the LCR.

    Some operations require that additional columns be logged unconditionally. These requirements are set out in the Oracle Streams documentation.
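
    For illustration, this is the kind of unconditional supplemental logging the answer refers to, applied to the columns in the conflict handler's column_list (a sketch; run it on each source database, and check the Streams documentation for the exact requirements on 9.2):

    ALTER TABLE scott.dept ADD SUPPLEMENTAL LOG GROUP dept_conflict_cols (dname, loc) ALWAYS;

    With the old values of dname and loc always present in the redo (and therefore in the LCR), the OVERWRITE handler has the column values it needs to detect and resolve the conflict.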

  • Oracle 11g and partitions on a star schema

    Hello

    I am the manager of the Business Intelligence department in a company. We use SAP BI 4.1 SP3 and an Oracle 11g server.

    We have big questions about some of our queries.

    One of them concerns partitions and how they are used.

    Here's the situation. I have a big fact table holding inventory data. This table is partitioned with 1 partition per day, using the foreign key "DATE_KEY" (numeric YYYYMMDD).

    I have a CALENDAR dimension whose primary key is also DATE_KEY (numeric YYYYMMDD). The other attribute columns are DAY (in a date format), YEAR, WEEK and so on. Very classic.

    Now, I want to execute the following queries:

    (1) retrieve the stock for 1 day, using the date-format attribute of my calendar dimension

    select sum(stock_quantity)
    from FACT_STOCK s
    inner join DIM_CALENDAR d
    on s.DATE_KEY = d.DATE_KEY
    where d.DAY = to_date('16/02/2016','DD/MM/YYYY')

    (2) retrieve the stock for 1 day, using the primary key of my calendar dimension

    select sum(stock_quantity)
    from FACT_STOCK s
    inner join DIM_CALENDAR d
    on s.DATE_KEY = d.DATE_KEY
    where d.DATE_KEY = 20160216

    (3) retrieve the stock for 1 week, using the week attribute of my calendar dimension

    select sum(stock_quantity)
    from FACT_STOCK s
    inner join DIM_CALENDAR d
    on s.DATE_KEY = d.DATE_KEY
    where d.WEEK = 201607

    Results of the explain plan command:

    1)

    Expected: it should use 1 partition, because the partition key is in the join.

    Actual: it does not prune; all the table partitions are scanned.

    2)

    Expected: it should use 1 partition, as the partition key is in the WHERE filter.

    Actual: it uses 1 partition.

    3)

    Expected: it should use 7 partitions, as the partition key is in the join.

    Actual: it does not prune; all the table partitions are scanned.

    Is this behavior normal? I think it isn't, but I'm not a DBA.

    "If it's normal, partition are completely unnecessary, since I can't say my professional user: 'Please use the technical dimension instead of the calendar attribute key' or 'Please list the day of your week instead of using the attribute of the week'"

    Is it possible to adjust database settings so that partition pruning is used?

    Notice the :BF0000 used for pstart and pstop, and also the references to PART JOIN FILTER in the plan.

    This is Oracle creating and using a Bloom filter from the first (small) input to identify the partitions it needs to access in the second table.

    It is possible for Bloom filters to identify redundant data (false positives) when filtering rows, but for identifying partitions they seem to be very accurate.

    It seems that you should have a unique constraint on d.day. For your first query this would allow Oracle to see that it only has to access a single partition - and it could replace the Bloom filter with a subquery for pstart/pstop, i.e. a KEY/KEY (possibly KEY(SQ)/KEY(SQ)) entry instead of :BFnnnn.

    Regards

    Jonathan Lewis
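
    For what it's worth, the unique constraint Jonathan suggests is a one-line change on the dimension (a sketch, assuming DIM_CALENDAR really does have exactly one row per DAY value):

    ALTER TABLE dim_calendar ADD CONSTRAINT dim_calendar_day_uk UNIQUE (day);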

  • What is Oracle SRM and why is it used?

    Hello

    What is Oracle SRM and why is it used?

    Thank you

    POOJA

    Hi Pooja,

    SRM (Social Relationship Management) is used to manage the social networks that businesses use for marketing.

    The Oracle SRM documentation is available at: https://docs.oracle.com/cloud/social/index.htm

    Kind regards

    eDynamic Experts Eloqua

  • Oracle Forms and Reports 12c (12.2.1.0) Installation

    Hello

    I downloaded the file for

    Oracle Forms and Reports 12c (12.2.1.0)

    Published on 23 October 2015

    http://www.Oracle.com/technetwork/developer-tools/forms/downloads/index.html .

    Now I am trying to install it, but it does not install on the computer. Can you guide me on how to install it?

    Sandy

    Two things:

    1. You did not read my last update carefully.  I said that you need to install JDK 8 (more precisely, 8u51 or newer).  That suggests you have Java 7 installed and tried to use it, given this command that you posted:

    C:\Program Files\Java\jdk1.7.0_79\bin>java -jar fmw_12.2.1.0.0_infrastructure.jar

    You must UNINSTALL 7u79 (which you cited above) and install the latest v8, which is 8u66.

    http://www.Oracle.com/technetwork/Java/javase/downloads/index-JSP-138363.html

    2. You have not shared the details necessary to understand what could be happening.  For example:

    • On what platform and version are you installing?
    • Have you reviewed the product Certification Matrix to ensure that this platform is certified for use?
    • Is your Windows user a member of the Windows Administrators group?
    • Does the machine have at least 6 GB of RAM?
    • Is the installer staged locally, or are you accessing it from a network share?
    • Did you review the installation logs?  If not, do so, as they will probably tell you what is happening.  They are located in:

      C:\Program Files\Oracle\Inventory\logs

    I highly recommend that you go through the documentation (Installation Guide, System Requirements, etc.) before you install or use the product:

    http://docs.Oracle.com/middleware/1221/formsandreports/index.html
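
    For illustration, once a JDK 8 (8u51 or newer) is installed, the launch command would look like this; the path below assumes a default 8u66 install location, so adjust it to wherever your JDK actually is:

    "C:\Program Files\Java\jdk1.8.0_66\bin\java" -jar fmw_12.2.1.0.0_infrastructure.jar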
