Using Oracle Streams schema replication

Hi DBAs,


I am on 11g and need to configure schema/table replication between 2 databases. I have never used Streams before.

I spent time reviewing the Oracle documentation to understand how Streams works, but the implementation of Streams replication is confusing. Please help with step-by-step instructions, or any OTN article, tutorial, or demo on configuring Streams.



Thank you
-Samar-

Hi Samar,

[Metalink Note 753158.1: How to Configure a Streams Real-Time Downstream Environment | https://metalink2.oracle.com/metalink/plsql/f?p=130:14:8789336070734834078:p14_database_id, p14_docid, p14_show_header, p14_show_help, p14_black_frame, p14_font:NOT, 753158.1, 1, 1, 1, helvetica]

The note is quite elaborate on setting up a basic Streams configuration, and matches your requirement exactly.

On both servers (Source and downstream)
=============================

conn / as sysdba

CREATE TABLESPACE streams_tbs DATAFILE 'streams_tbs_01.dbf' SIZE 100M REUSE AUTOEXTEND ON MAXSIZE UNLIMITED;

CREATE USER strmadmin IDENTIFIED BY strmadmin
DEFAULT TABLESPACE streams_tbs
QUOTA UNLIMITED ON streams_tbs;

GRANT DBA TO strmadmin; ---> Due to a bug, you must grant the DBA privilege to STRMADMIN

BEGIN
  DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(
    grantee          => 'strmadmin',
    grant_privileges => TRUE);
END;
/

-- Check that the Streams administrator is created

SELECT * FROM dba_streams_administrator;

Point LogMiner at the streams_tbs tablespace on the downstream site:
=============================================

exec DBMS_LOGMNR_D.SET_TABLESPACE('streams_tbs');

Create the connection between the source and downstream:
=========================================

1. Check that there is proper connectivity between the two databases.

2. Since this is a downstream setup, the redo transport service requires the same password for SYS on both sides.

3. Set GLOBAL_NAMES = TRUE on both sides.

4. Create a DB link from both sides, so that the strmadmin users can communicate with each other.

create database link STREAMS1 connect to strmadmin identified by strmadmin using 'STREAMS1';

select * from global_name@STREAMS1;

Setting the archiving parameters for downstream capture:
============================

Configure the source database so that it can transfer its redo logs to the destination site. (See the documentation on configuring a standby database.)
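As a rough sketch of the source-side settings (the TNS alias DOWNSTRM and the DB_UNIQUE_NAMEs here are placeholders, not from the note; take the exact attribute syntax from the redo transport documentation):

```sql
-- On the source database; DOWNSTRM is a hypothetical TNS alias for the downstream site
ALTER SYSTEM SET log_archive_dest_2 =
  'SERVICE=DOWNSTRM ASYNC NOREGISTER
   VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE)
   DB_UNIQUE_NAME=DOWNSTRM' SCOPE=BOTH;
ALTER SYSTEM SET log_archive_dest_state_2 = ENABLE SCOPE=BOTH;

-- LOG_ARCHIVE_CONFIG should list the unique names of both databases
ALTER SYSTEM SET log_archive_config = 'DG_CONFIG=(SOURCEDB,DOWNSTRM)' SCOPE=BOTH;
```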

Creating standby redo logs to receive the redo data from the source (only needed if you want real-time mining):
================================================================

- On the source:

(1) Determine the size of the online redo log files on the source database:

select THREAD#, GROUP#, BYTES/1024/1024 from V$LOG;

- On the downstream site:

(2) Add the standby redo logs:

- For example, if the source database has three online redo log file groups and each log file is 50 MB, use statements like the following to create the appropriate standby log file groups.
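Something along these lines (the file paths are placeholders; the usual recommendation is at least one more standby group than the number of online groups, sized no smaller than the source log files):

```sql
-- On the downstream database; paths are hypothetical
ALTER DATABASE ADD STANDBY LOGFILE GROUP 4 ('/u01/oradata/slog4.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE GROUP 5 ('/u01/oradata/slog5.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE GROUP 6 ('/u01/oradata/slog6.rdo') SIZE 50M;
ALTER DATABASE ADD STANDBY LOGFILE GROUP 7 ('/u01/oradata/slog7.rdo') SIZE 50M;
```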

Setting up the Streams environment on the downstream site
================================

BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'strmadmin.DOWNSTREAM_Q_TABLE',
    queue_name  => 'strmadmin.DOWNSTREAM_Q',
    queue_user  => 'STRMADMIN');
END;
/

Select name, queue_table from user_queues;

BEGIN
  DBMS_APPLY_ADM.CREATE_APPLY(
    queue_name     => 'strmadmin.DOWNSTREAM_Q',
    apply_name     => 'DOWNSTR_APPLY',
    apply_captured => TRUE);
END;
/

BEGIN
  DBMS_APPLY_ADM.SET_PARAMETER(
    apply_name => 'DOWNSTR_APPLY',
    parameter  => 'DISABLE_ON_ERROR',
    value      => 'N');
END;
/ -- This ensures the apply process does not stop even if there is an error

BEGIN
  DBMS_CAPTURE_ADM.CREATE_CAPTURE(
    queue_name         => 'strmadmin.DOWNSTREAM_Q',
    capture_name       => 'DOWNSTR_CAP',
    rule_set_name      => NULL,
    start_scn          => NULL,
    source_database    => 'STREAMS1',
    use_database_link  => TRUE,
    first_scn          => NULL,
    logfile_assignment => 'implicit');
END;
/

BEGIN
  DBMS_CAPTURE_ADM.SET_PARAMETER(
    capture_name => 'DOWNSTR_CAP',
    parameter    => 'downstream_real_time_mine',
    value        => 'Y');
END;
/ -- Instructs the capture process to mine in real time

== Adding positive schema rules for the capture process ==

BEGIN
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name        => 'SANTU1',
    streams_type       => 'capture',
    streams_name       => 'DOWNSTR_CAP',
    queue_name         => 'strmadmin.DOWNSTREAM_Q',
    include_dml        => TRUE,
    include_ddl        => TRUE,
    include_tagged_lcr => FALSE,
    source_database    => 'STREAMS1',
    inclusion_rule     => TRUE);
END;
/

BEGIN
  DBMS_STREAMS_ADM.ADD_SCHEMA_RULES(
    schema_name        => 'SANTU2',
    streams_type       => 'capture',
    streams_name       => 'DOWNSTR_CAP',
    queue_name         => 'strmadmin.DOWNSTREAM_Q',
    include_dml        => TRUE,
    include_ddl        => TRUE,
    include_tagged_lcr => FALSE,
    source_database    => 'STREAMS1',
    inclusion_rule     => TRUE);
END;
/
== Excluding one of the tables using a negative rule ==
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'SANTU1.XYZ',
    streams_type    => 'capture',
    streams_name    => 'DOWNSTR_CAP',
    queue_name      => 'strmadmin.DOWNSTREAM_Q',
    include_dml     => TRUE,
    include_ddl     => TRUE,
    source_database => 'STREAMS1',
    inclusion_rule  => FALSE); -- FALSE specifies the negative rule set
END;
/
==================================================================

Now it's time to instantiate the schemas. You can use legacy exp/imp or Data Pump to instantiate the schemas. I use exp/imp:

exp system/manager file=<dump file> log=<log file> owner=(SANTU1,SANTU2) object_consistent=y
imp system/manager file=<dump file> log=<log file> fromuser=(SANTU1,SANTU2) touser=(SANTU1,SANTU2) streams_instantiation=y

If the instantiation completes successfully, start the apply and capture processes:

exec DBMS_APPLY_ADM.START_APPLY(apply_name => 'DOWNSTR_APPLY');
exec DBMS_CAPTURE_ADM.START_CAPTURE(capture_name => 'DOWNSTR_CAP');

Check the state of the capture and apply processes:

select capture_name, status from dba_capture;
select capture_name, state from v$streams_capture;

select apply_name, status from dba_apply;
select apply_name, state from v$streams_apply_server;

Test the configuration. Also note that it is not necessary to keep the downstream database in archivelog mode.
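A minimal smoke test might look like this (the table STREAMS_TEST is a hypothetical example object in one of the replicated schemas, not part of the note):

```sql
-- On the source database (STREAMS1):
CREATE TABLE santu1.streams_test (id NUMBER);
INSERT INTO santu1.streams_test VALUES (1);
COMMIT;
ALTER SYSTEM SWITCH LOGFILE;  -- push the redo over to the downstream site

-- On the downstream database, after a short delay:
SELECT * FROM santu1.streams_test;
```

Since the rules include DDL, both the CREATE TABLE and the row should show up downstream once capture and apply are running.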

I have tried to explain the basics in short, but please do go through the documentation.

Kind regards
S.K.

Tags: Database

Similar Questions

  • With Oracle Streams schema replication

    Hi all

    I have Oracle 11gR2 on RHEL 6.4.

    I want to know if there is any way I can find out which schemas are involved in the replication.

    For example:

    I have schemas a, b, c, d, e and I want only schemas a and b to be replicated, not c, d, e. Once the configuration is done, how can I check that these are the tables/schemas that will be replicated?

    Thank you.

    Hello

    You can query the dictionary views below to find all tables/schemas that are currently replicated using Oracle Streams.

    Source database:

    =======================

    TABLES PREPARED FOR CAPTURE:

    select * from dba_capture_prepared_tables order by table_owner, table_name;

    SCHEMAS PREPARED FOR CAPTURE:

    select * from dba_capture_prepared_schemas order by schema_name;

    DATABASE PREPARED FOR CAPTURE:

    Select * from dba_capture_prepared_database;

    Target database:

    ===============

    APPLY INSTANTIATED TABLES:

    ------------------------------------------------

    select source_database, source_object_owner || '.' || source_object_name OBJECT,
           ignore_scn, instantiation_scn, apply_database_link DBLINK
    from dba_apply_instantiated_objects
    order by source_database, object;

    APPLY INSTANTIATED SCHEMAS and DATABASE:

    -----------------------------------------------------------

    select source_database, source_schema OBJECT,
           instantiation_scn INST_SCN, apply_database_link DBLINK,
           'SCHEMA' global_flag
    from dba_apply_instantiated_schemas
    UNION
    select source_database, 'GLOBAL' OBJECT,
           instantiation_scn INST_SCN, apply_database_link DBLINK,
           'GLOBAL' global_flag
    from dba_apply_instantiated_global
    order by source_database, object;

    Thank you

    MazharAli

  • Can we use Oracle Streams for replication

    I use Oracle Streams for replication of data between one and several databases.
    It works for me, but sometimes propagation from the source to the destination database stops. When we reboot our databases it starts to work again. In 10g I never faced this problem.
    I am facing this problem in 11g.
    Can someone help me please, it's urgent...?

    Yes

  • Which database objects can be replicated using Oracle Streams and which objects cannot be replicated?

    Hi Experts,

    I need clarification on the questions below.


    Which database objects can be replicated using Oracle Streams and which objects cannot be replicated?

    How can we check which schemas and objects are used in Streams replication, and which schemas and objects are not?

    Thanks in advance.

    select *
    from dba_streams_unsupported
    where owner || '.' || table_name (...)
    order by 1, 2, 3;

  • Oracle streams heterogeneous Support (non-Oracle and Oracle databases) envi

    Does Oracle Streams support heterogeneous (non-Oracle and Oracle databases) environments?

    Is it possible to move data from Oracle to DB2 using Oracle Streams?

    Hello

    Visit this link

    http://download-West.Oracle.com/docs/CD/B28359_01/server.111/b28322/hetero.htm

    Kind regards
    Deepak

  • We are on Streams; we want to use free third-party database replication tools for Oracle. Please suggest.

    We are on Streams; we want to use free third-party database replication tools for Oracle. Please suggest.

    Hello

    GoldenGate and SharePlex are major replication tools for Oracle across heterogeneous platforms.

    SharePlex replicates data between heterogeneous platforms, for example a source on Linux and a target on Windows. SharePlex works from queues, so once we define a few tables in the source configuration files, they automatically get replicated in the target database.

    SharePlex 8.6.2 technical documentation

    Concerning

    Rami

  • Oracle Streams Advanced Queuing and advanced replication

    Hi experts,

    I don't have much experience with Oracle. Please tell me what "Oracle Streams" is, in simple words, and what it is used for, with an example.

    What is the difference between Oracle Streams, Advanced Queuing and Advanced Replication?

    Reg,

    Harsh

    Hi harsh,

    I'll try and summarize it simply for you.

    (1) Advanced Replication - the ancient mode of replicating data between databases - no one really uses this method any more (even if it's still there)

    (2) AQ (now renamed Streams AQ, I think) is a queuing technology for publishing and subscribing to messages - it is not a replication method on its own, but is a key technology underpinning Streams

    (3) Streams - a complete replication technology with a huge amount of flexibility (and complexity) - one of the best tools in the Oracle product set - but it is now being deprecated in favor of GoldenGate (in large part because GG is a costly option, personally, I think)

    (4) GoldenGate - much like Streams but can replicate to/from non-Oracle databases (SQL Server, Sybase etc.)

    Streams and GoldenGate do roughly the same job but are implemented very differently - Streams is mostly PL/SQL and queues, GG means learning another language (though really a scripting tool)

    Hope that helps.

    Cheers,

    Rich

  • How to use references from web third party service with service Cloud Computing to Oracle database schema

    APEX 5.0

    Cloud Computing service for the Oracle database schema


    I'm in the middle of doing a proof of concept.  Basically, I need an application with stored data, a UI, user security, data loading, and the ability to post data to an external third-party web service.  It seems that with the Oracle Database Schema Cloud Service, it is not possible to use web service references that are outside the domain.

    If I try to use a service reference via http, I get:

    ORA-20987: APEX - The requested URL was forbidden. Contact your administrator.

    If I try to use the same service reference via https, I get:

    ORA-29273: HTTP request failed

    ORA-06512: at "SYS.UTL_HTTP", line 1130

    ORA-29259: end-of-input reached

    I read somewhere that with cloud services only https can be used.  Is this true?

    And then I read that to use the https protocol, a wallet must be configured to store certificates, etc.  However, I read somewhere else that the wallet cannot be configured because there is no access to the database instance with the Oracle Database Schema Cloud Service.  Is this true?

    If both are true, how can I make a call to post data to an external web service?  Or do I need to use a different cloud service?  Or do I need my own instance of Oracle DB?

    Any help would be great.  Thank you!

    It turns out there was a problem with the remote REST service.  After successfully calling a REST service that was created using SQL Workshop, I tried different remote REST services and they all work.  Sorry for the confusion.  I thought it was very strange that the Database Schema Service wouldn't be able to do it easily.

  • Synchronize two database schemas on a Linux server using Oracle GoldenGate

    I followed this guide ( http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/goldengate/12c/OGG12c_Installation/index.html?cid=9167 & ssid = 0 ) for schemas in two different databases running on the same database instance.


    What I have now is a Linux server with two Oracle database schemas.
    I have access through my Linux virtual machine. I want to make synchronization between them possible.

    I am using Oracle GoldenGate to make this possible. What should I do now? Does GoldenGate have to be installed on the server too? I'm totally new to this. Can someone give me at least the major steps before I go on to a more comprehensive study?

    Well understood. This helped.

    Install Oracle GoldenGate on Linux to synchronize two database schemas on a single server

  • How to change the password of a schema by using Oracle SQL Developer

    Hi, I need to change the password of a schema by using Oracle SQL Developer. How can I do it?

    or maybe http://www.thatjeffsmith.com/archive/2012/11/resetting-your-oracle-user-password-with-sql-developer/

  • Instance and host details for Oracle Streams

    Hi all

    I'm using Oracle 11gR2 on RHEL 5.

    I have Oracle Streams installed. How can I find out the instances and host names between which the data stream is configured?

    Also, how can I get the schemas that are synchronized as part of the Streams setup?

    Kind regards.

    Check log_archive_dest_n to see where the archive logs are delivered. tnsnames.ora will give you the hostnames related to the services used.
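    For example, a quick sketch of checking the archive destinations from the dynamic views (adjust to taste):

    ```sql
    -- Where is redo/archive being shipped, and is the destination healthy?
    select dest_id, destination, status
    from v$archive_dest
    where destination is not null;
    ```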

    Also, you can look at

    dba_services

    sys.streams$_capture_process

    dba_capture

  • URG: How can I force APEX to use the specific schema: spaces of work APEX_040100 or entitled to another schema?

    Hi all

    Here's the scenario:

    1. My computer crashed. I had to restore from a full export (expdp 11.2.x to impdp 12c). Successfully done.
    2. Configured ADR blah blah... can access the Apex admin.
    3. Problem is: the administration interface does not SEE my Apex workspaces (only the INTERNAL workspace).
    4. After investigation it seems that APEX (after the upgrade) is somehow configured / running with schema APEX_040200 and not

    APEX_040100 (as I had upgraded Apex through the old database versions).


    5. If I run the following:

    ALTER session set current_schema = APEX_040100;

    Select short_name, display_name

    of wwv_flow_companies

    where source_identifier is not null;


    I SEE all my beloved workspaces.

    6. My questions:

    a. How can I force APEX to use the specific schema: APEX_040100 instead of APEX_040200?

    b. or are there other alternatives to assign these workspaces in the current environment?


    Concerning

    Etay G

    Hello Brad,

    Thank you for your response. Appreciated.

    • Yet, as I had a slightly corrupted Apex environment after the FULL database import (impdp), this revocation method (above) failed several times for me.
    • Here's what I did (it solved the problem):
    • Deleted APEX_040200 (the Apex 4.2 installation).
    • The only Apex schema left in the DB is APEX_040100.
    • Then installed version 5.0.3 and it worked!

    More importantly, after the installation of APEX, all workspaces, users, etc. automatically resided correctly in APEX.
    • I think that if you have only 1 corrupted version of Apex in the DB, it should work as well. Remove the schema again using the above script (check the version) with caution (after a backup, etc.).

    Kind regards

    Etay G

  • Connection to the database to Oracle using Oracle Apex 5.0.1

    Hi all

    Is it possible to connect to the Oracle database (as with Oracle SQL Developer 4.1.1) using Oracle Apex 5.0.1 instead of using the Object Browser? We are running out of workspace after uploading the data we have in CSV/Excel.

    For example: whatever tables I've created in the Application Builder are created in the Object Browser. We have a workspace limit of 500 MB. This is the reason why we cannot use this method to load the data and retrieve it. (Note: I can ask for 2 GB of workspace on the manager's approval, which is still not sufficient.)

    Is it possible that I create an apex.oracle.com application using the Application Builder and create the tables in the database, so that when I load a file in the application, the data is stored in the Oracle database instead of the Object Browser?

    Please let me know if this is possible, and if yes, how I can proceed.

    If this isn't the case, please let me know how I can get more workspace.

    As per my understanding, the other possibilities are: 1) taking a backup of old files and freeing up space, 2) creating another workspace to get extra work space.

    Thanks for your help!

    Kind regards

    Vinod

    Hello

    When you load data using the APEX SQL Workshop utilities, for example, the data is stored in your database tables.

    Object Browser is just a tool to view your schema objects, such as tables and the data in them, just like SQL Developer.

    If you have no good reason to keep the files you uploaded to the workspace, delete them. The data is stored in database tables.

    Kind regards

    Jari

  • Oracle Streams without link DB

    Hi all

    We have a requirement to move data for about 20 tables from a database in one network to a database in another network. The databases on both sides run 11g and have a RAC implementation.

    To perform this replication, the options are:

    -Oracle Streams
    -Oracle Golden Gate
    -Export and import Datapump then

    We have the following restrictions to make the transfer

    -No DB link can be created
    -No additional licenses can be purchased for Golden Gate

    Given the above limitations, can we still go with Oracle Streams (as I read the Oracle documentation, it says a DB link is mandatory), or am I just left with the Data Pump option?

    Thanks in advance,
    Jayadeep

    Archive logs can be transferred by redo transport services. Details below:

    http://docs.Oracle.com/CD/B28359_01/server.111/b28294/log_transport.htm

    Also consider CDC (Change Data Capture).
    Take a look at the following note:

    Planned desupport of Change Data Capture
    http://docs.Oracle.com/CD/E18283_01/server.112/e17222/changes.htm#CJAECCIJ

  • Oracle Streams and CLOB column

    Hello

    We use "Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit". My question is: "Does Oracle Streams capture, propagate (source capture method) and apply CLOB column changes?"

    If so, is this the default behavior? Can we tell Streams to exclude all CLOB columns from the capture-propagation-apply process?

    Thanks in advance!

    http://download.Oracle.com/docs/CD/B19306_01/server.102/b14229/strms_capture.htm#i1006263
