This example configures a two-database Oracle Streams replication environment that replicates data manipulation language (DML) changes to all of the tables in the hr schema, using a downstream capture process at the destination database. The example uses the global database names src.example.com and dest.example.com; however, you can substitute databases in your environment to complete the example. See "About Two-Database Replication Environments" for more information about two-database replication environments.
In this example, the downstream capture process runs on the destination database dest.example.com. Therefore, the resources required to capture changes are freed at the source database src.example.com. This example configures a real-time downstream capture process, not an archived-log downstream capture process. The advantage of real-time downstream capture is that it reduces the time required to capture changes made at the source database, because the real-time downstream capture process does not need to wait for a redo log file to be archived before it can capture data from it.
This example assumes that the replicated database objects are used for reporting and analysis at the destination database. Therefore, these database objects are assumed to be read-only at the dest.example.com database.
This example uses the Setup Streams Replication Wizard in Oracle Enterprise Manager to configure the two-database replication environment. This wizard is the fastest and simplest way to configure an Oracle Streams environment that replicates one or more schemas. In addition, the wizard follows established best practices for Oracle Streams replication environments.
The database objects being configured for replication might or might not exist at the destination database when you run the wizard. If the database objects do not exist at the destination database, then the wizard instantiates them at the destination database using a Data Pump export/import. During instantiation, the instantiation SCN is set for these database objects. If the database objects already exist at the destination database, then the wizard retains the existing database objects and sets the instantiation SCN for them. In this example, the hr schema exists at both the src.example.com database and the dest.example.com database before the wizard is run.
Figure: Two-Database Replication Environment with a Downstream Capture Process provides an overview of the environment created in this example.
Two-Database Replication Environment with a Downstream Capture Process
Note: Local capture processes provide more flexibility than downstream capture processes in replication environments with different platforms or different versions of Oracle Database. See Oracle Streams Concepts and Administration for more information.
To configure this two-database replication environment:
Complete the following tasks to prepare for the two-database replication environment:
Configure network connectivity so that the src.example.com database and the dest.example.com database can communicate with each other.
See Configuring the Network Environment for information about configuring network connectivity between databases.
Configure an Oracle Streams administrator at each database that will participate in the replication environment. See "Tutorial: Configuring an Oracle Streams Administrator" for instructions. This example assumes that the Oracle Streams administrator is strmadmin.
Set initialization parameters properly at each database that will participate in the Oracle Streams replication environment. See "Preparing for Oracle Streams Replication" for instructions.
Configure both databases to run in ARCHIVELOG mode. For a downstream capture process to capture changes generated at a source database, both the source database and the downstream capture database must be running in ARCHIVELOG mode. In this example, the src.example.com and dest.example.com databases must be running in ARCHIVELOG mode. See Oracle Database Administrator's Guide for information about configuring a database to run in ARCHIVELOG mode.
Configure authentication at both databases to support the transfer of redo data.
Redo transport sessions are authenticated using either the Secure Sockets Layer (SSL) protocol or a remote login password file. If the source database has a remote login password file, then copy it to the appropriate directory on the downstream capture database system. The password file must be the same at the source database and the downstream capture database.
In this example, the source database is src.example.com and the downstream capture database is dest.example.com. See Oracle Data Guard Concepts and Administration for detailed information about authentication requirements for redo transport.
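As a quick sanity check when password file authentication is used, you can confirm that a remote login password file is configured and see which users it records. For example:

```sql
-- Confirm that a remote login password file is in use
SHOW PARAMETER remote_login_passwordfile

-- List the users recorded in the password file
SELECT USERNAME, SYSDBA, SYSOPER FROM V$PWFILE_USERS;
```

Run these at both databases; if the password file was copied correctly, the results should match.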
Configure initialization parameters at both the source database and the downstream database to support downstream capture.
The initialization parameters must be set properly at both databases for redo transport services to transmit redo data from the online redo log at the source database src.example.com to the standby redo log at the downstream database dest.example.com.
To configure initialization parameters to support downstream capture:
In Enterprise Manager, log in to the source database as the Oracle Streams administrator.
In this example, the source database is src.example.com.
Go to the Database Home page for the database instance.
Click Data Movement to open the Data Movement subpage.
Click Setup in the Streams section.
The Streams page appears, showing the setup options.
Select Setup Downstream Capture.
In the Host Credentials section, enter the username and password for an administrative user on the host computer system.
Click Continue.
The Setup Downstream Capture page appears.
In the Downstream Database Details section, identify the downstream capture database host name, port, and SID or service name, and enter the credentials for the Oracle Streams administrator at the downstream database.
In the Capture Process Details section, enter capture in the Capture Process Name field and ensure that Real-Time Downstream Capture is selected.
In the Log Details section, enter a location for the archived redo log files on the computer system running the downstream database in the Standby Redo Log File Location field.
Specify either a valid path name for a disk directory or, to use a flash recovery area, specify USE_DB_RECOVERY_FILE_DEST. This location is the local destination for archived redo log files written from the standby redo logs. Log files from a remote source database should be kept separate from local database log files. See Oracle Database Backup and Recovery User's Guide for information about configuring a flash recovery area.
In the Log Details section, ensure that Configure Log Parameters for Downstream Capture is selected and that the LOG_ARCHIVE_DEST_n initialization parameter shown in Log Parameter is not already in use at the source database.
Click OK.
On the Schedule Job page, either select Immediately or specify a time for the job to run later.
Click OK.
On the Confirmation page, optionally click the job link to monitor the job.
When the job completes successfully, downstream capture is configured. Do not proceed until the job completes successfully.
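For reference, the parameter changes this job makes are similar to the following statements. The destination slot (LOG_ARCHIVE_DEST_2), DB_UNIQUE_NAME values, and log file location are illustrative assumptions; the wizard chooses an unused LOG_ARCHIVE_DEST_n slot and uses the values you entered:

```sql
-- At the source database src.example.com: ship redo from the online
-- redo log to the standby redo log at the downstream database
-- (illustrative settings)
ALTER SYSTEM SET LOG_ARCHIVE_DEST_2 =
  'SERVICE=dest.example.com ASYNC NOREGISTER
   VALID_FOR=(ONLINE_LOGFILES,PRIMARY_ROLE) DB_UNIQUE_NAME=dest';
ALTER SYSTEM SET LOG_ARCHIVE_DEST_STATE_2 = ENABLE;
ALTER SYSTEM SET LOG_ARCHIVE_CONFIG = 'DG_CONFIG=(src,dest)';

-- At the downstream database dest.example.com: archive the standby
-- redo log to the location entered in the wizard (illustrative path)
ALTER SYSTEM SET LOG_ARCHIVE_DEST_2 =
  'LOCATION=/u01/app/oracle/standby_logs
   VALID_FOR=(STANDBY_LOGFILE,PRIMARY_ROLE)';
ALTER SYSTEM SET LOG_ARCHIVE_CONFIG = 'DG_CONFIG=(src,dest)';
```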
While still logged in to the source database in Enterprise Manager as the Oracle Streams administrator, go to the Database Home page for the database instance.
Click Data Movement to open the Data Movement subpage.
Click Setup in the Streams section.
The Streams page appears, showing the setup options.
Select Replicate Schemas in Setup Streams Replication.
In the Host Credentials section, enter the username and password for an administrative user on the host computer system.
Click Continue.
On the Object Selection page, select HR and click Next.
On the Destination Options page, identify the destination database by specifying its host name, port, SID or service name, and Oracle Streams administrator credentials.
Click Next.
Complete the Replication Options page:
In the Directory Path section, leave the directory paths for the source and destination database unchanged if the host user you specified in Step 7 can read from and write to the directories and the directories have enough space for a Data Pump export dump file. Otherwise, specify different directory paths, or create directory objects that meet these requirements and specify those.
Expand Advanced Options.
In the Options section, ensure that Capture, Propagate and Apply data manipulation language (DML) changes is selected.
In the Options section, deselect Capture, Propagate and Apply data definition language (DDL) changes and ensure that Setup Bi-directional replication is not selected.
In the Capture Process section, select Downstream Capture.
In the Capture Process section, identify the downstream capture host, database, and capture process, and enter the credentials for the Oracle Streams administrator at the downstream database. You configured downstream capture in Step 2. In this example, the name of the downstream capture process is capture.
In the Capture Process section, ensure that the correct LOG_ARCHIVE_DEST_n initialization parameter is selected in Log Parameter. You configured this initialization parameter in Step 2.
In the Apply Process section, enter apply.
On the Schedule Job page, either select Immediately or specify a time for the job to run later.
Click Next.
On the Review page, review the configuration information and click Submit.
On the Confirmation page, click the job link to monitor the job. Do not proceed to the next step until the job completes successfully.
When the job is running, information about its progress is recorded in the following data dictionary views: DBA_RECOVERABLE_SCRIPT, DBA_RECOVERABLE_SCRIPT_PARAMS, DBA_RECOVERABLE_SCRIPT_BLOCKS, and DBA_RECOVERABLE_SCRIPT_ERRORS. If the job stops because it encounters an error, then see Oracle Streams Replication Administrator's Guide for instructions about using the RECOVER_OPERATION procedure in the DBMS_STREAMS_ADM package to recover from these errors.
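You can watch the job's progress from SQL*Plus with queries like the following against the views listed above; the columns shown are the commonly used ones:

```sql
-- Monitor the configuration script's progress (run as strmadmin)
SELECT SCRIPT_ID, STATUS, DONE_BLOCK_NUM, TOTAL_BLOCKS
  FROM DBA_RECOVERABLE_SCRIPT;

-- If the job stopped on an error, inspect the error before using
-- DBMS_STREAMS_ADM.RECOVER_OPERATION
SELECT SCRIPT_ID, BLOCK_NUM, ERROR_MESSAGE
  FROM DBA_RECOVERABLE_SCRIPT_ERRORS;
```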
In SQL*Plus, connect to the source database src.example.com as an administrative user.
See Starting SQL*Plus and Connecting to the Database for more information about starting SQL*Plus.
Archive the current log file at the source database:
ALTER SYSTEM ARCHIVE LOG CURRENT;
Archiving the current log file at the source database starts real-time mining of the source database redo log.
When you complete the example, a two-database replication environment with the following characteristics is configured:
At the src.example.com database, supplemental logging is configured for the tables in the hr schema.
The dest.example.com database has the following components:
A downstream capture process named capture. The capture process captures changes to the hr schema in the redo log information sent from the source database src.example.com.
A queue with a system-generated name. This queue is for the capture process and apply process at the database.
An apply process named apply. The apply process applies changes to the hr schema.
To check the Oracle Streams replication configuration:
At the dest.example.com database, ensure that the capture process is enabled and that the capture type is downstream. To do so, follow the instructions in "Viewing Information About a Capture Process", and check the Status and Capture Type fields on the Capture subpage.
At the dest.example.com database, ensure that the apply process is enabled. To do so, follow the instructions in "Viewing Information About an Apply Process", and check the Status field on the Apply subpage.
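Equivalently, you can check both components from SQL*Plus at the dest.example.com database with queries like these:

```sql
-- Capture process: STATUS should be ENABLED and CAPTURE_TYPE DOWNSTREAM
SELECT CAPTURE_NAME, STATUS, CAPTURE_TYPE FROM DBA_CAPTURE;

-- Apply process: STATUS should be ENABLED
SELECT APPLY_NAME, STATUS FROM DBA_APPLY;
```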
To replicate changes:
At the src.example.com database, make DML changes to any table in the hr schema, and commit the changes.
After some time has passed to allow for replication of the changes, use SQL*Plus to query the modified table at the dest.example.com database to view the DML changes.
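For example, assuming the standard hr sample schema, a change like the following at src.example.com should appear at dest.example.com after a short delay. The table, column, and employee_id values are illustrative:

```sql
-- At src.example.com: make and commit a DML change
UPDATE hr.employees SET salary = salary * 1.05 WHERE employee_id = 100;
COMMIT;

-- At dest.example.com, after allowing time for capture, propagation,
-- and apply: verify that the change arrived
SELECT employee_id, salary FROM hr.employees WHERE employee_id = 100;
```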
Note: The wizard does not configure the replicated tables to be read only at the destination database. If they should be read only, then configure privileges at the destination database accordingly. However, the apply user for the apply process must be able to make DML changes to the replicated database objects. In this example, the apply user is the Oracle Streams administrator. See Oracle Database Security Guide for information about configuring privileges.
See Also:
"About Hub-And-Spoke Replication Environments"
"When to Replicate Data with Oracle Streams"
"Administering an Oracle Streams Replication Environment"
Oracle Streams Concepts and Administration for more information about downstream capture processes
Oracle Streams Replication Administrator's Guide for an example that configures this replication environment using the DBMS_STREAMS_ADM supplied PL/SQL package