Staging siteRepository configuration

Hi all.

I am trying to solve a problem that sits between the staging configuration and the other instances of the production environment. The main problem is that each store has a "productionURL" configuration. I can see the value of this property in two places. First, in the ITC Site Configuration, where I can set this value for each store in the environment. The value set there is reflected in the dynAdmin "siteRepository"; second, I can go to the siteConfiguration item descriptor and see the "productionUrl" of each store.

As far as I know, staging has no back-end configuration of its own, so its configuration depends on the production it points to. The BCC will then set the "productionURL" for all instances of the environment, but staging must have a different value. I'll try to explain with actual values.

In the ITC Site Configuration, I set the following values for each store in the environment:

-store 1 (site):

Base URL of the site: www.host-name.es

-store 2 (site):

Base URL of the site: www.host-name.es

But on the staging instance the property must have this value: stg.host-name.es

I can manually change this value in the dynAdmin of the staging instance, but whenever I restart the servers the dynAdmin change is obviously lost. Can someone tell me a way to automate this configuration? Thanks to you all.

You must use the BCC site property "Alternative URL" to set up an alternative URL for your site. It is a list-type property.

You can add your staging URL there.
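If you want the staging override to survive restarts without hand-editing dynAdmin, one option is a repository data import run against the staging instance. This is only a sketch: the item ids and the "productionURL" property name are assumptions taken from the question, so verify them against your own siteRepository item descriptors first.

```xml
<!-- staging-sites.xml: sets the staging base URL on each site's
     siteConfiguration item. Item ids and the property name are
     assumptions; check them in the dynAdmin siteRepository view. -->
<gsa-template>
  <update-item item-descriptor="siteConfiguration" id="store1">
    <set-property name="productionURL" value="stg.host-name.es"/>
  </update-item>
  <update-item item-descriptor="siteConfiguration" id="store2">
    <set-property name="productionURL" value="stg.host-name.es"/>
  </update-item>
</gsa-template>
```

The file could then be imported on the staging instance (for example with startSQLRepository, using -repository /atg/multisite/SiteRepository -import staging-sites.xml) as part of the staging deployment, so the value is reapplied automatically after every restart.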

~ Yvan

Tags: Oracle Applications

Similar Questions

  • OEDQ - Siebel staging database configuration

    I am setting up standalone OEDQ version 8.1.1.11 with 11g. I'm at the staging database configuration section in the URL below:

    http://docs.oracle.com/cd/E48549_01/doc.11117/e40741/toc.htm

    The document says to run scripts to create the various tables for the batch.

    What is the best place to create these tables? Should they be in the SIEBEL schema, or can they be created in the EDQCONFIG or EDQRESULTS schema?

    Thank you


    Hello

    As the staging tables are used only for temporary data when running Siebel batch DQ, they fit quite well either as an extension of the EDQ Results schema or (better) as an additional schema in the same database. That's what we normally recommend.

    Note that if you extend the Results schema, you may need to re-run the scripts after EDQ is upgraded to a new version.
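    As a concrete sketch of the "additional schema in the same database" option: everything below (user name, password, tablespace) is an assumption for illustration, not part of the documented steps, so adapt it before use.

```sql
-- Hypothetical dedicated schema for the Siebel DQ staging tables,
-- created in the same database as the EDQ results schema.
-- User name, password and tablespace are placeholders.
CREATE USER edq_stage IDENTIFIED BY change_me
  DEFAULT TABLESPACE users
  QUOTA UNLIMITED ON users;
GRANT CREATE SESSION, CREATE TABLE TO edq_stage;
-- Then run the staging-table creation scripts from the documentation
-- while connected as EDQ_STAGE.
```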

    Kind regards

    Mike

  • Question on the ATG staging site and DS configuration, as well as preview

    I am a newbie to ATG and have a question. Please read my understanding below and find my questions at the end; correct me if my understanding is wrong as well.

    A typical ATG staging setup will look like the below (at a base level), according to my understanding so far in learning ATG.

    Asset Management Server - stores/manages internal users (BCC/CA/Merchandising/ACC users), versioned commerce assets & other versioned repositories

    Staging server - unversioned commerce and other repositories

    Production server - unversioned commerce & other repositories, and external store users (customers) - "core" schema.

    That is, external (customer) profiles are stored only on the production site.

    As the staging site is basically a replica of the production site, should the Store (customer-facing) application be deployed to the staging server as well? If so, how will it point to the production database schema?

    Along with this, I've also heard of a "preview" feature/server. Is this not staging? What is the difference?

    Your interpretation is correct.

    On the staging server, the same EAR as production is deployed. There will be a replica of the core and switchA/switchB schemas, as in production.

    The preview server is basically just another instance of the BCC server, though it need not have all of the BCC modules running.

    You can do preview on the same BCC server, or create a separate BCC-like instance as a preview server.

    Peace

    Shaik

  • CPQ reference integration setup with CIM - db error

    I am installing the CPQ reference integration using http://docs.oracle.com/pdf/E66154_01.pdf

    I have installed:

    • Java
    • JBoss
    • MDEX
    • Platform Services
    • Tools and Frameworks
    • ATG

    Then I downloaded Oracle Commerce and the CPQ reference integration (11.1.0.0.0) and unzipped it to endeca/ATG/ATG11.1

    Now I'm stuck in CIM when I import the data for part 1 of 3, the Production schema of the Oracle 11g database; up to that point everything goes without problem.

    The CIM output looks like:

    Production core

    * [I] Import Data

    [S] Skip

    >

    -PRODUCTION CORE DATA IMPORT-

    Enter [h] Help, [m] Main Menu, [q] Quit to exit

    Combining template tasks... Success

    Importing (1 of 2) /CIM/tmp/import/nonswitchingCore-import1.xml:

    /MotorpriseJSP/install/data/contracts.xml to /atg/commerce/contract/Contracts

    /MotorpriseJSP/install/data/inventoryRepository.xml to /atg/commerce/inventory/InventoryRepository

    /MotorpriseJSP/install/data/giftlists.xml to /atg/commerce/gifts/Giftlists

    /MotorpriseJSP/install/data/profileAdapterRepository.xml to /atg/userprofiling/ProfileAdapterRepository

    /MotorpriseJSP/install/data/orderRepository.xml to /atg/commerce/order/OrderRepository

    /MotorpriseJSP/install/data/profilePurchaseLists.xml to /atg/userprofiling/ProfileAdapterRepository

    ... Error

    Importing (2 of 2) /CIM/tmp/import/nonswitchingCore-import2.xml:

    /MotorpriseJSP/install/data/ProductCatalog.xml to /atg/commerce/catalog/ProductCatalog

    /MotorpriseJSP/install/data/priceLists.xml to /atg/commerce/pricing/priceLists/PriceLists

    ... Error

    Updating password (1 of 1). Update skipped, no action required.

    Please see the cim.log for more details.

    -------DATA IMPORT FAILED-------------------------------------------------------

    Enter [h] Help, [m] Main Menu, [q] Quit to exit

    Make sure that you have configured the connection details and have created the schema.

    2 of 2 data imports had errors. Please see /usr/local/endeca/ATG/ATG11.1/CIM/log/cim.log for more details.

    And when I look at the first error, cim.log says:

    Info Thu Sep 17 08:24:13 UTC 2015 1442478253837 atg.cim.task.ant.utility.AntLogger [exec] ** Error Thu Sep 17 08:24:13 UTC 2015 1442478253835 /atg/multisite/SiteRepository Table 'B2B_SITE_ATTRIBUTE' in item descriptor: 'siteConfiguration' does not exist in a table space accessible by the data source. DatabaseMetaData.getColumns returns no columns. Catalog=null Schema=PRODUCTION

    Please, can someone help me find out why B2B_SITE_ATTRIBUTE is not created?

    It turns out that, before importing initial data for the production schema, you have to do step 5 - adding the CPQ data.

    In my case, the files were in endeca/ATG/ATG11.1/sql/db_components

    and I had to run the DDL files.
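    A quick way to confirm the DDL actually created the table CIM was complaining about (the table and schema names come from the error above) is to query the data dictionary:

```sql
-- Should return one row once the CPQ DDL has been run in the
-- PRODUCTION schema; an empty result reproduces the CIM error.
SELECT table_name
  FROM all_tables
 WHERE owner = 'PRODUCTION'
   AND table_name = 'B2B_SITE_ATTRIBUTE';
```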

  • Use of the staging table name parameter of the prepareBulk / completeBulk functions

    I am trying to load 1.6 billion triples into an RDF semantic graph instance. I am using the prepareBulk / completeBulk approach described in "7.12 Bulk Loading Using RDF Semantic Graph Support for Apache Jena". I loaded the triples from .ttl.gz files into a staging table with prepareBulk, following "Example 7-10 Loading Data into the Staging Table (prepareBulk)".

    Following Example 7-10, I used null for the "staging table name" parameter of prepareBulk. I then ran a separate program to run completeBulk following "Example 7-11 Loading Data from the Staging Table into the Semantic Network (completeBulk)". Example 7-11 also shows null as the value for the "staging table name" parameter. The prepareBulk operations seem to have executed successfully with a null staging table name. However, null does not seem to be a valid value for completeBulk's staging table name parameter. Running "completeBulk(null, null);" displays the following error message:

    Hit the exception ORA-00942: table or view does not exist

    What is the relationship between the "staging table name" parameters of prepareBulk and completeBulk? Is null a valid value for this parameter to prepareBulk, and if so, what should be the corresponding value passed to completeBulk?

    Hello

    This seems odd. We have a test for this case and will try it. By default, the staging table is created under the same user schema, and the table name is "RDFB_" followed by the model name.

    Can you please verify the existence of such a table in your schema? It should have 1B+ rows. If so, you can proceed by passing in the name of that table directly.

    Since you are dealing with a good amount of data, the following should be helpful for performance:

    (1) remove the indexes on the application table before you run the completeBulk call;
    (2) enable parallel DML before the call: oracle.executeSQL("alter session enable parallel dml");

    (3) use the parallel load options. An example is the following. The degree of parallelism is set to 4, and you will need to customize it to your own configuration.

    "PARSE PARALLEL=4 PARALLEL_CREATE_INDEX mbv_method=shadow"

    Thank you

    Zhe Wu

  • Getting errors in CIM: initial data import configuration

    Hello

    I installed ATG 11 and WebLogic 12c on my system and connect to an Oracle DB (12c) installed on another system. I use CIM.bat to configure the connections. I created the schema successfully, but cannot import the data. I get the following errors:

    CIM CONSOLE:

    Combining template tasks... Success

    Importing (1 of 12) /CIM/tmp/import/management-import1.xml:

    /DAS/install/data/dynAdminRepo.xml to /atg/dynamo/security/AdminSqlRepository

    /DPS/InternalUsers/install/data/das-security.xml to /atg/userprofiling/InternalProfileRepository

    /DPS/InternalUsers/install/data/dcs-security.xml to /atg/userprofiling/InternalProfileRepository

    /DPS/InternalUsers/install/data/security.xml to /atg/userprofiling/InternalProfileRepository

    /DPS/InternalUsers/install/data/searchadmin-security.xml to /atg/userprofiling/InternalProfileRepository

    ... Error

    Importing (2 of 12) /Publishing/base/install/epub-role-data.xml to /atg/userprofil

    Importing (3 of 12) /Publishing/base/install/epub-file-repository-data.xml to /atg.

    Loading (4 of 12) DSS/atg/registry/data/scenarios/DSS/*.sdl & DSS/atg/registry/data/scenarios/recorders/*.sdl... Error

    Importing (5 of 12) /CIM/tmp/import/management-import2.xml:

    /DCS/install/data/initial-segment-lists.xml to /atg/userprofiling/PersonalizationRepository

    /DCS/versioned/install/data/internal-users-security.xml to /atg/userprofiling/InternalProfileRepository

    /WebUI/install/data/profile.xml to /atg/userprofiling/InternalProfileRepository

    /WebUI/install/data/external_profile.xml to /atg/userprofiling/ProfileAdapterRepository

    /CommerceReferenceStore/store/knowledgebase/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /CommerceReferenceStore/store/storefront/data/catalog-versioned.xml to /atg/commerce/catalog/ProductCatalog

    ... Error

    Importing (6 of 12) /CommerceReferenceStore/store/storefront/data/pricelists.xml to /atg/commerce/pricing/priceLists/PriceLists... Error

    Loading (7 of 12) Store.Storefront.NoPublishing /atg/registry/Slots/*.properties... Error

    Loading (8 of 12) Store.Storefront.NoPublishing /atg/registry/RepositoryTargeters/ProductCatalog/*.properties... Error

    Loading (9 of 12) Store.Storefront.NoPublishing /atg/registry/RepositoryGroups/*.properties... Error

    Loading (10 of 12) Store.Storefront.NoPublishing /atg/registry/RepositoryGroups/UserProfiles/*.properties... Error

    Loading (11 of 12) Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/abandonedorders/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/global/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/homepage/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/category/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/orders/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/Store/returns/*.sdl & Store.Storefront.NoPublishing /atg/registry/data/scenarios/DCS/*.sdl... Error

    Importing (12 of 12) /CIM/tmp/import/management-import3.xml:

    /CommerceReferenceStore/store/storefront/data/sites.xml to /atg/multisite/SiteRepository

    /CommerceReferenceStore/store/storefront/data/stores.xml to /atg/commerce/locations/LocationRepository

    /CommerceReferenceStore/store/storefront/data/promos.xml to /atg/commerce/catalog/ProductCatalog

    /CommerceReferenceStore/store/storefront/data/claimable.xml to /atg/commerce/claimable/ClaimableRepository

    /CommerceReferenceStore/store/storefront/data/storecontent.xml to /atg/store/stores/StoreContentRepository

    /CommerceReferenceStore/store/storefront/data/seotags.xml to /atg/seo/SEORepository

    /BIZUI/install/data/portal.xml to /atg/portal/framework/PortalRepository

    /BIZUI/install/data/profile.xml to /atg/userprofiling/InternalProfileRepository

    /BIZUI/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /BCC/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /DPS-UI/AccessControl/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /DPS-UI/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /AssetUI/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /AssetUI/install/data/assetManagerViews.xml to /atg/web/viewmapping/ViewMappingRepository

    /SiteAdmin/versioned/install/data/siteadmin-role-data.xml to /atg/userprofiling/InternalProfileRepository

    /SiteAdmin/versioned/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /SiteAdmin/versioned/install/data/templates.xml to /atg/multisite/SiteRepository

    /DPS-UI/versioned/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /DPS-UI/versioned/install/data/examples.xml to /atg/web/viewmapping/ViewMappingRepository

    /DCS-UI/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /DCS-UI/install/data/viewmapping_preview.xml to /atg/web/viewmapping/ViewMappingRepository

    /CommerceReferenceStore/store/storefront/versioned/install/data/sites-templates.xml to /atg/multisite/SiteRepository

    /CommerceReferenceStore/store/knowledgebase/install/data/basic-urls.xml to /atg/multisite/SiteRepository

    /CommerceReferenceStore/store/eStore/versioned/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /CommerceReferenceStore/store/storefront/versioned/install/data/site-model-viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /CommerceReferenceStore/store/storefront/versioned/install/data/internal-users-security.xml to /atg/userprofiling/InternalProfileRepository

    /DCS-UI/versioned/install/data/users.xml to /atg/userprofiling/InternalProfileRepository

    /DCS-UI/versioned/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    /DCS-UI/siteadmin/versioned/install/data/viewmapping.xml to /atg/web/viewmapping/ViewMappingRepository

    ... Error

    Updating password (1 of 1). The administrator password has not been updated in the database; the update was bypassed. Please see the cim.log for details.

    -------DATA IMPORT FAILED-------------------------------------------------------

    Enter [h] Help, [m] Main Menu, [q] Quit to exit

    Make sure that you have configured the connection details and have created the schema.

    12 of 12 data imports had errors. Please see C:\ATG\ATG11.0\CIM\log\cim.log for more details.

    * [D] Done - mark import as done

    [C] Continue

    I solved it. It was because of an incompatible ojdbc.jar... Use the ojdbc jar compatible with your Oracle database.

  • What rate to expect from the destaging process off the staging area?

    Hello!

    Could someone confirm whether it is expected behavior to see the SSD write buffer being destaged to the hard drives only twice per day, or even less in some circumstances. Currently we have up to 40 concurrent Horizon linked-clone users in our new environment, but we will gradually increase the number up to 250 by moving users from the old cluster to the new one, and the current amount of concurrent writes (IOs) at destaging time scares me. The VSAN whitepaper does not clearly state what rate the cache algorithm is expected to produce, but I think that flushing more often would put less pressure on the magnetic disks. Please take a look at the attachment.

    Thank you.

    Perttu

    No, you would typically see destaging much more often than that, but it usually comes down to the amount of data being written to the write buffer in hybrid configurations.

    I believe there is a statement in http://www.vmware.com/files/pdf/products/vsan/VSAN-Troubleshooting-Reference-Manual.pdf indicating that destaging begins when the write buffer is 30% full.

    There are no customer-facing tunable parameters for these algorithms.

    I would use the VSAN Observer utility to monitor write-buffer usage as you add more desktops. There are details on how to use it in the troubleshooting guide.

    HTH

    Cormac

  • Failed to load data into the staging area using the SAP ERP ABAP mapping

    Hello

    I am new to ODI and facing some challenges. Using ODI 11g (11.1.1.6.0), I want to move data from a SAP EHP6 system to an Oracle warehouse. I got the metadata of the table that I need using the SAP reverse-engineering metadata browser. I then created the interface, and on execution it fails at task/session 13 - Loading - SrcSet0 - Load data into staging. I use a shared directory and set FTP_TRANSFER_METHOD = FSMOUNT_DIRECT. The error message I get is:

    org.apache.bsf.BSFException: exception from Jython:
    Traceback (most recent call last):
    File "<string>", line 14, in <module>
    Load Error: See /home/dwsap/ZODI_13001_11001_GLOBAL.log for more details

    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
    at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
    at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:744)
    Caused by: Traceback (most recent call last):
    File "<string>", line 14, in <module>
    Load Error: See /home/dwsap/ZODI_13001_11001_GLOBAL.log for more details

    at org.python.core.PyException.doRaise(PyException.java:219)
    at org.python.core.Py.makeException(Py.java:1166)
    at org.python.core.Py.makeException(Py.java:1170)
    at org.python.pycode._pyx0.f$0(<string>:50)

    at org.python.pycode._pyx0.call_function(<string>)
    at org.python.core.PyTableCode.call(PyTableCode.java:165)
    at org.python.core.PyCode.call(PyCode.java:18)
    at org.python.core.Py.runCode(Py.java:1204)
    at org.python.core.Py.exec(Py.java:1248)
    at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
    at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
         ... 19 more


    I checked the log file and it contains the following:


    $ more ZODI_13001_11001_GLOBAL.log

    SQL*Loader: Release 11.2.0.4.0 - Production on Thu Mar 27 11:19:17 2014

    Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

    SQL*Loader-704: Internal error: ulconnect: OCIServerAttach [0]
    ORA-12504: TNS:listener did not have the SERVICE_NAME in CONNECT_DATA


    I checked my tnsnames.ora and it seems to be OK:


    # tnsnames.ora Network Configuration File: /u01/app/oracle/product/11.2.0/dbhome_1/network/admin/tnsnames.ora
    # Generated by Oracle configuration tools.

    XXXXXX =
      (DESCRIPTION =
        (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxxx)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = xxxxxx)
          (SID = xxxxxx)
          (GLOBAL_NAME = xxxxxx)
        )
      )


    Here's my listener.ora


    # listener.ora Network Configuration File: /u01/app/oracle/product/11.2.0/dbhome_1/network/admin/listener.ora
    # Generated by Oracle configuration tools.
    LISTENER =
      (DESCRIPTION_LIST =
        (DESCRIPTION =
          (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
          (ADDRESS = (PROTOCOL = TCP)(HOST = xxxxx)(PORT = 1521))
        )
      )
    ADR_BASE_LISTENER = /u01/app/oracle



    What could be the problem?

    Regards

    Thanks a lot for your help. The problem was that I had put in an incorrect instance name.

  • Can 2 writers add to the same staged data?


    Hello

    How can we get 2 writers to write to the same staged data without data loss? I need to write rejected records to one staged data set from 2 different processes before exporting.

    Kind regards

    Ravi

    No, you cannot do this. Write to different staged data sets, then use a process with a Merge Data Streams processor to bring them together.

    Otherwise, use two different staged data sets or data interfaces, and append on export. With data interfaces the data is never staged; with staged data you can choose whether or not to stage it in the configuration of the task (or run profile).

  • Cannot configure flat file reconciliation in OIM 9.1.0.1

    Hello
    I am facing this weird problem with OIM in my new project. I followed the steps below to set up flat file reconciliation using it as a trusted source.
    1. Shared drive transport provider
    2. CSV format provider
    I filled out the staging and archiving (parent) directory locations. I'm using Cp1251 for the file encoding.

    The problem I'm facing is that as soon as I move on to the mapping step, I do not see all the fields of the Source and Reconciliation Staging at
    step 3: Modify Connector Configuration. What could be the reason? The flat file is already in the parent directory location before starting the Flatfile GTC setup. There are no logs generated for this, for obvious reasons. It would be awesome if someone could answer soon, because I have been facing this problem since yesterday morning.

    Remove the password...

    OIM generates its own password internally (i.e. the user login in CAPITALS) in trusted recon...

    Also, please give us a sample line so that we know what the error is...

    Hope it's something like this:

    #GTC Connector
    Login, firstname, lastname, User type, Employee type, organization
    A000001, John, Thompson, End-User, Full-Time, Xellerate Users

    Typical errors are with the role (the ideal case should be Employee Type --> Full-Time) and the Xellerate Type (the ideal case should be User Type --> End-User)

    First create a 'Reconciliation Rule', because the GTC connector creates no reconciliation rule:

    User Login --> User ID

    Re-apply the matching rule if the event is stuck in 'Event Received' status...

    Send us the error

  • Colleagues, I need some help with a 'staged system' approach / autoconfig / upgrade

    Gentlemen - I have a problem,

    We will upgrade R12.0.4 on RAC 10.2 (32-bit Linux) to R12.0.6 on RAC 11.2 (64-bit Linux) on new hardware, with minimal downtime.

    The plan is to use a 'staged system' approach where we will clone the system to new servers and a new topology (the new system has a few different midtiers etc.) while current production is still running.
    Then, while the business is still running on the old kit, we will upgrade the copy on the new servers (which is not used at this time).

    So, with the system cloned onto the new servers, we will:
    change the DB word size to 64-bit
    change the DB version to 11.2.0.1
    then run the 12.0.6 patch to upgrade the system as a whole (db and app nodes) to 12.0.6
    export the patch history from the db using adphmigr.pl (no db)

    (So at this stage we have 2 systems: a) old production and b) a completely new system on the new kit.)

    Now we must make the new (b) system the prod system, so we need to get current prod data transferred with minimal downtime, so we intend to (over a weekend):
    (1) halt the business on the current prod
    (2) drop the upgraded database on the new system (b) (ignore system (a) from here on)
    (3) rapidclone the prod (a) database to the new servers for system (b) (replacing the db we just dropped) (we only configure the db - no application tiers yet, since they're already at 12.0.6 awaiting us) (database (b) now contains current information)
    (4) convert the newly cloned db on the new (b) servers to 64-bit - then upgrade it to 11.2.0.1
    (5) run autoconfig on one 12.0.6 apps tier (sys b) to associate it with the new db (X), then run adpatch (can I safely run the 12.0.6 adpatch from the new levels against the 12.0.4 db?) from a 12.0.6 apps tier with nocopyportion and nogenerateportion, so that it only upgrades the db to 12.0.6
    (6) run autoconfig on all the other midtiers associated with the db
    (7) load the patch history into the db via adpatch for each APPL_TOP

    Does the above sound reasonable?
    (x) is the above safe? (associating a 12.0.6 midtier with a 12.0.4 db?)

    comments gratefully received
    Martin

    Does the above sound reasonable?

    It seems reasonable to me. However, I recommend you log an SR and confirm with Oracle Support whether your approach is supported or not. Additionally, try it first on a TEST instance set up the same way, to avoid errors and problems you might run into.

    (x) is the above safe? (associating a 12.0.6 midtier with a 12.0.4 db?)

    Yes, but are you planning to create the appsutil.zip file, copy it to the database tier node and extract it? If yes, I would expect a few problems because of the mismatch between the 12.0.4 database and the 12.0.6 application.

    Thank you
    Hussein

  • Staging, leading CTE and trailing CTE

    Hello
    Can someone please shed some light on the environments available to us for CRM On Demand. Can we have 4 environments in total? (production, staging, leading CTE, trailing CTE)

    Would appreciate a quick response.

    Thank you
    my

    Staging environment - this will be a copy of your production environment, but the data and configuration will only be as fresh as the last time your environment was refreshed. You can use this as a test environment to test all changes before migrating to production.

    Leading CTE - this will be a base CRMOD environment without any of your business data or configuration. I think that for a certain fee Oracle will do a refresh for you (not sure). You can use it as a dev environment to do all your development changes and migrate the code to your test environment using the new migration tools provided by Oracle. This environment will be the first environment that is upgraded. Any changes coming out in new releases will be in the leading CTE environment even before the upgrade is done on staging.

    Trailing CTE - this is similar to the leading CTE, except the upgrade happens only after your production upgrade (correct me if I am wrong).

  • Site configuration: multiple users

    We have an intranet site with 20,000 files and 20+ Dreamweaver authors. With a site this size, it is impractical for us to keep a local copy and keep it synchronized with the remote site, which is a staging server. I want to know if someone has defined the local site to be the staging server, forgoing the Dreamweaver-recommended local C-drive site, and what the problems with this configuration are. I can't be the only person in the user community with this issue.

    A shared local site is supported!

    http://www.adobe.com/cfusion/knowledgebase/index.cfm?id=tn_16432

  • Where is ALSB configuration data stored?

    Does ALSB use an internal database to save the configuration data of all services developed in ALSB, across development, staging and production environments?

    Hi Abhishek,

    Can you point me to any doc that says that?

    As far as I know, ALSB comes with a PointBase DB and can be configured to use other databases as well.
    But it does not store service-related configuration data there. Configuration is stored in the local filesystem only.

    Other databases can be configured for reporting etc.

    Let me know if I'm wrong.

    Thank you

  • Tips on adding a VPN router to my current network configuration

    Dear all

    My apologies if the answer to this question already exists, however, I searched in many situations and none seem to match what I'm after.

    I currently have an ISP modem/router in bridge mode connected to an Apple TC, which is my wireless router, and 2 AirPort Expresses connected to it acting as range extenders. I have a VPN service through MyPrivate Network that I activate on the desired device when required, and everything works fine.

    What I want to do now is to be able to use my Apple TV and Amazon Fire via the VPN as well, so I need to add a VPN router to the configuration. I want to end up with 2 wireless networks running together: one for the devices that need the VPN and one for those that don't. I don't want to lose the ability to extend the network with the AirPort Expresses, however.

    If someone could explain to me whether this is possible and, if so, how to set up the network.

    Thanks in advance

    Mark

    Basically you would need a device that supports VPN passthrough and VLANs for your networking goals. MyPrivate Network seems to be an SSL VPN, which is a client-server configuration. In other words, you install a VPN client on your Mac and connect to the MyPrivate Network VPN server to establish a VPN tunnel.

    Networking two or more "separate" networks would require a router that supports VLAN services. Each VLAN segment, in essence, would be a separate network, either wired or wireless or a combination of both. This would probably be the "easiest" part of the setup.

    Now, how to combine the two is the question, and I don't know what the best way would be, or even whether it is possible.

    A few thoughts:

    • Use a router that supports VLANs. Create at least two VLAN segments: one for the Apple TV & Fire, one for general Internet access. Connect the VPN client host device to the first segment, and configure it for Internet sharing.
    • Find a dedicated VPN network appliance that supports hosting third-party VPN clients like yours. You would still need a router that supports VLANs to provide separate network segments.
    • Hire a network consultant. Let them know your networking goals and ask them to offer potential solutions.
