Synchronizing ODI datastores with database table changes

Hi all

What is the best approach to synchronize database changes into ODI?

Is it possible to automate?

For example, if we have a datastore used as a target and some column datatypes are changed in the database, a re-reverse-engineer does not seem to pick up those changes...

Thank you

Hello

Try this...

http://oracledwbi.WordPress.com/2010/03/12/synchronising-ODI-datastore-with-database-changes/

Kind regards
Its

Tags: Business Intelligence

Similar Questions

  • How to load matrix report data into a base table using ODI

    Hello

    How can I load matrix report data into a base table using Oracle Data Integrator?

    Description of the requirement:

    This is the matrix report data:

    JOB        DEPT10    DEPT20
    -------    ------    ------
    ANALYST              6000
    CLERK      1300      1900

    I need to convert it to the format below:

    JOB        Dept      Salary
    -------    ------    ------
    ANALYST    DEPT10
    ANALYST    DEPT20    6000
    CLERK      DEPT10    1300
    CLERK      DEPT20    1900
        
    Thank you for your help in advance. Let me know if any other explanation is needed.

    Your approach seems a bit restrictive; you can do much more with ODI procedures.

    Create a new procedure and add a step. In the 'Command on Source' tab, define the technology and schema for your source database. Use the UNPIVOT operator as described in the link, but instead of using 'SELECT *', list the column names and aliases, for example:

    SELECT job,
           deptsal AS deptsal,
           saldesc AS saledesc
    FROM   pivoted_data
    UNPIVOT (
             deptsal
             FOR saldesc
             IN (d10_sal, d20_sal, d30_sal, d40_sal)
            )

    Then, in the 'Command on Target' tab, define the technology and schema for your target database and put your INSERT statement there, for example:

    INSERT INTO job_sales
      (job,
       deptsal,
       saledesc
      )
    VALUES
      (:job,
       :deptsal,
       :saledesc
      )

    This way you use bind variables from the source to load the data into the target.

    Obviously, if the source and target tables are in the same database, you can have it all in a single statement in the 'Command on Target' tab, such as:

    INSERT INTO job_sales
      (job,
       deptsal,
       saledesc
      )
    SELECT job,
           deptsal AS deptsal,
           saldesc AS saledesc
    FROM   pivoted_data
    UNPIVOT (
             deptsal
             FOR saldesc
             IN (d10_sal, d20_sal, d30_sal, d40_sal)
            )

    Also set the log counter to 'Insert' on the tab containing your INSERT statement, so that you know how many rows you insert into the table.

    I hope this helps.

    BUT remember that the UNPIVOT operator only came out in Oracle 11g.
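
    On releases before 11g you can get a similar result without UNPIVOT, for example by cross joining the pivoted table to a small generated row source. This is only a sketch and assumes the same pivoted_data columns used above:

    SELECT p.job,
           CASE n.rn WHEN 1 THEN 'D10_SAL'
                     WHEN 2 THEN 'D20_SAL'
                     WHEN 3 THEN 'D30_SAL'
                     WHEN 4 THEN 'D40_SAL' END AS saledesc,
           CASE n.rn WHEN 1 THEN p.d10_sal
                     WHEN 2 THEN p.d20_sal
                     WHEN 3 THEN p.d30_sal
                     WHEN 4 THEN p.d40_sal END AS deptsal
    FROM   pivoted_data p
           CROSS JOIN (SELECT LEVEL AS rn FROM dual CONNECT BY LEVEL <= 4) n;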

  • Installing HRCS 9.0 R5 on Linux: database HRCS90 has no PSSTATUS table

    People,


    Hello. I was installing HCM and Campus Solutions 9.0 with PeopleTools 8.53. The server machine is Oracle Linux 5.10 and the client computer is Windows XP.  My internet architecture is WebLogic 11g / Tuxedo 11g / Oracle Database 11gR1. PeopleTools 8.53 works correctly in the browser.


    On the Oracle Linux 5.10 database server machine, I ran the scripts createdb10.sql, utlspace.sql, hrcddl.sql, dbowner.sql, psroles.sql, psadmin.sql and connect.sql one by one. Then I used Data Mover on Windows XP to load the data into the HRCS90 DB instance on Oracle Linux 5.10. The Data Mover script hrcs90ora.dms completed correctly on Windows XP. Configuration Manager is configured correctly.  But when I log in to Application Designer, I cannot connect and get the message below:


    "Security Table Manager (Get): the database is at release 8.52. The PeopleTools being run require the database to be at release 8.53."


    I followed the document http://docs.oracle.com/cd/E37306_02/psft/acrobat/PeopleTools-8.53-Upgrade_02-2013.pdf to upgrade the HCM and HR 9.0 revision 5 database instance HRCS90 on the Oracle database server on Linux. I completed chapters 4 and 5. But connecting in Application Designer and starting the Tuxedo application server still gives the same error message: "the database is at release 8.52. The PeopleTools being run require the database to be at release 8.53."


    I ran the Task 4-3-2 script ptddlupg.sql, and the PSIMAGE2 tablespace was created successfully.

    When I ran the Task 4-3-5 script rel853n.sql, I saw on the first line that the table PSSTATUS does not exist.

    The big problem is that there is no PSSTATUS table in my HRCS90 database. I checked the Windows XP Data Mover log C:\PT8.53\log\hcengs.log, which shows:

    ... ...

    Records remaining: 7687

    Import PSSTATUS

    Creating Table PSSTATUS

    Import PSSTATUS

    Building required indexes for PSSTATUS

    Update statistics for PSSTATUS

    Records remaining: 7686

    ... ...


    As we can see above, the hrcs90ora.dms script loaded the PSSTATUS table into the HRCS90 database instance. But the PSSTATUS table is not in HRCS90.



    My questions are:

    First, why is there no PSSTATUS table in my HRCS90 database? At which step did the problem occur?  How do I fix it?

    Second, does running the hrcs90ora.dms script in Data Mover on Windows XP actually load the data into HRCS90 on Linux, given that there is no PSSTATUS table?


    Thank you.

    PSSTATUS is owned by the access id, i.e. the Oracle user that appears in PSDBOWNER.

    You are not connected as that user, but you can still query PSDBOWNER because there is a public synonym for it, which is not the case for PSSTATUS.

    But I think you missed the point that the upgrade scripts must be run as this access id (again, it is the owner of the PeopleSoft tables).
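
    A quick way to check this from SQL*Plus, as a sketch (SYSADM is only an example access id; the objects named are standard PeopleTools ones):

    -- Which Oracle user is the access id for this PeopleSoft database?
    SELECT * FROM psdbowner;

    -- Does PSSTATUS exist at all, and under which schema?
    SELECT owner, table_name FROM all_tables WHERE table_name = 'PSSTATUS';

    -- Connected as that access id (e.g. SYSADM), the upgrade scripts can then see it:
    SELECT toolsrel FROM psstatus;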

    Given all your previous posts on this same forum, I am surprised you are still making this mistake; if I remember correctly, this point has been explained to you several times.

    Nicolas.

    PS:

    > Moreover, I can't copy and paste content from my computer into the forum. Does my browser have a problem, or does the forum?

    It is a known bug of the forum with certain browser/operating system configurations...

  • Changing the host name of the database server: what must change accordingly?

    Hello

    I have an Oracle database (instance 11.2.0.3.0) running on a Linux machine (Red Hat Enterprise Linux Server release 5.8 (Tikanga)).

    The database is currently up and running.

    My questions:

    1. Which files need to change after the server host name changes? (tnsnames.ora? listener.ora? anything else?)

    2. Which steps must be taken, and in what order? Thank you very much in advance.

    (Do we need to bounce the database and the listener?)

    PL see also MOS Doc 578169.1

    HTH
    Srini

  • Display data from the database in a table

    Hello

    How do I display data from my database in an ADF table using a backing bean? I created an ArrayList in the bean, but only the last row of my query is displayed in the table...

    Thank you...

    Hello

    Create a simple Java class that implements Serializable, with attributes that represent each column of your table. This class represents one row of your table. Build a List of these objects and you can populate your af:table with it.

    Visit this link below for an example.
    Re: Is it possible to create a static array of ADF and the tree?

    Kind regards
    Amélie Chan

  • Transpose rows of an Oracle table into columns using ODI

    Here's what I'm trying to do:
    Source table that contains the columns:
    Time, Blob, Item-ID

    The expected target has the columns:
    Time, Data, Item1, Item2, ..., ItemN

    1. Item1, Item2, ... are as shown: say, for example, the source table has 10 rows with different IDs; then the target table has 10 columns, Item1 to Item10. (Assume the number of distinct IDs is fixed.)
    How do I do this? I have been learning ODI for a few weeks now. I understand the basics and the terminology, but I don't know how to approach this.

    2. Data (in the target table) is the return value of a stored procedure that transforms the Blob (from the source table).
    The stored procedure is ready, but how do I use it to get from the source to the target?

    Any help/direction would be greatly appreciated.
    -Vincent

    Published by: user12397263 on December 29, 2009 23:53

    1. One question: how would you determine which row goes to which column? You can say that the value of the 1st row becomes Item1, but without an ORDER BY, "row 1" will always be random. If you have a way to guarantee the order of the rows (say, a column that stores the rownum), then in the interface you specify for the n-th item a mapping of MAX(CASE WHEN rownum = n THEN item ELSE '' END). That should take care of it, as in the sketch below.
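
    For example, that mapping written as plain SQL would look roughly like this (source_table, time_col, item_id and the ordering column item_seq are placeholder names):

    SELECT time_col,
           MAX(CASE WHEN item_seq = 1 THEN item_id END) AS item1,
           MAX(CASE WHEN item_seq = 2 THEN item_id END) AS item2,
           MAX(CASE WHEN item_seq = 3 THEN item_id END) AS item3
    FROM   source_table
    GROUP BY time_col;

    In the ODI interface the MAX(CASE ...) expression goes into the mapping of each ItemN target column, with time_col as the grouping key.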

    2. There is a way to do this in ODI. Since you are a beginner, I would recommend that you create a temporary table and use a PL/SQL block to run the stored procedure over the BLOBs from the source table, so that the temporary table can then feed the data column of the target table. See the sketch below.
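
    A rough sketch of that second point (transform_blob, stg_items and the column names are placeholders; the stored function is the one you said is already written):

    BEGIN
      FOR r IN (SELECT time_col, item_id, blob_col FROM source_table) LOOP
        INSERT INTO stg_items (time_col, item_id, data_col)
        VALUES (r.time_col, r.item_id, transform_blob(r.blob_col));
      END LOOP;
      COMMIT;
    END;
    /

    The staging table can then be used as the source of the pivoting interface described in point 1.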

  • How to display database results from several tables in descending order?

    I have a keyword search for images on my website that allows visitors to search for specific things. My MySQL db has 15 tables, and for the search function I use a UNION ALL operation to combine them so any visitor can search all the tables at the same time. The problem is that all the results come back in ascending order. How can I get the results in descending order?

    SELECT *
    FROM table1
    WHERE keyword LIKE %s OR id LIKE %s

    UNION ALL

    SELECT *
    FROM table2
    WHERE keyword LIKE %s OR id LIKE %s

    UNION ALL

    SELECT *
    FROM table3
    WHERE keyword LIKE %s OR id LIKE %s

    PS

    I tried this and it doesn't work:

    SELECT * FROM table1 WHERE keyword LIKE %s OR id LIKE %s ORDER BY id DESC

    And I know that it is not conventional to use SELECT * with so many tables, but believe me, I have my reasons. What I am trying to avoid is wrapping the whole UNION within an outer SELECT.

    Once more: the ORDER BY clause can only appear at the end of the last SELECT statement in a UNION query, where it orders the whole combined result; it cannot go on each individual SELECT.
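
    For example, something along these lines should work (a sketch; the table names and %s placeholders are taken from the post):

    SELECT * FROM table1 WHERE keyword LIKE %s OR id LIKE %s
    UNION ALL
    SELECT * FROM table2 WHERE keyword LIKE %s OR id LIKE %s
    UNION ALL
    SELECT * FROM table3 WHERE keyword LIKE %s OR id LIKE %s
    ORDER BY id DESC

    Here the single ORDER BY after the last SELECT sorts the combined result of all the UNIONed queries.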

  • Synchronizing the database and the model does not work for some FKs in 4.0.2

    It does not sync, either DB to model or model to DB:

    1. The FK in the model has DEFERRABLE INITIALLY DEFERRED = YES, but also ENABLED = YES and VALIDATE = YES.

    2. In the database the FK is DISABLED NOVALIDATE.

    Sync shows no difference at all.

    Never mind! It's the same problem I had before. David said it was a known problem.

  • Synchronize the Oracle database

    I have a live Oracle 9i R2 database on Tru64 UNIX 5.1b and a test/development environment on Windows 2003.

    The test database snapshot is 2 months old; now we want to refresh the data with the latest from live. I want to import the main schema with the setting ignore=y, but I see some tables have no unique or primary key, and there are more than 10,000 tables in all, so the import could duplicate data.

    Even an incremental exp/imp might not help, because there are no structural changes in the database.

    What is the best way to refresh/synchronize the database with the most recent data without duplicating records?

    Using Oracle or third-party tools?

    Appreciate the comments.

    Thank you.

    To export only the data changed since the last export, you would have to ensure that every table in the database has a column recording when a row is inserted or updated, plus a downtime window in production for each refresh to guarantee that no data changes happen while the export is running. That seems pretty unlikely.

    While replication is a theoretical possibility, it is probably not practical. You do not want the objects in the lower environments to be materialized views; you want them to behave like their production counterparts. Generally, you do not want to incur the overhead of tracking and storing changes in the production database just to refresh the lower environments. In addition, you do development in the lower environments, so you would constantly have to change the replication code to match the changed environment.

    I suppose you could buy a third-party replication tool, but it will have the same scope of issues as using Oracle replication. You would need a potentially huge staging area holding all the data changes, you would put overhead on the production databases, and you would still have issues with schema changes breaking the replication process.

    The safe and sane solution is almost always to blow away the lower environment and do a complete refresh from production.
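
    For example, on 9i a full schema refresh with the classic export/import utilities might look roughly like this (a sketch only; the schema name, passwords and file names are placeholders):

    exp system/password@prod owner=APPSCHEMA file=appschema.dmp log=appschema_exp.log

    (drop and recreate the schema, or its objects, on the test server, then)

    imp system/password@test fromuser=APPSCHEMA touser=APPSCHEMA file=appschema.dmp log=appschema_imp.log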

    Justin

  • Can we express the relationships of a batch XML structure in database tables?

    Hello
    Please help me...
    I have a batch XML structure... can we express the relationships of the batch XML structure in database tables?

    Yes... so how do we do it?

    Thank you
    Amou

    Published by: amu_2007 on March 25, 2010 18:57

    Published by: amu_2007 on March 25, 2010 19:03

    But what is the problem with the original solution, given that it splits the XML up into the data?

    I mean, you could do something like this:

    SQL> create table batch (customer    VARCHAR2(10)
      2                     ,cust_name   VARCHAR2(10)
      3                     ,cust_type   VARCHAR2(10)
      4                     )
      5  /
    
    Table created.
    
    SQL>
    SQL> create table section (customer    VARCHAR2(10)
      2                       ,sect_name   VARCHAR2(10)
      3                       ,sect_depend VARCHAR2(10)
      4                       )
      5  /
    
    Table created.
    
    SQL> create table job_sections (customer        VARCHAR2(10)
      2                            ,sect_name       VARCHAR2(10)
      3                            ,job_sect_name   VARCHAR2(10)
      4                            ,job_sect_depend VARCHAR2(10)
      5                            )
      6  /
    
    Table created.
    
    SQL> create table job (customer        VARCHAR2(10)
      2                   ,sect_name       VARCHAR2(10)
      3                   ,job_sect_name   VARCHAR2(10)
      4                   ,job_type        VARCHAR2(10)
      5                   ,job_sub_type    VARCHAR2(10)
      6                   ,job_depend      VARCHAR2(10)
      7                   )
      8  /
    
    Table created.
    
    SQL>
    SQL>
    SQL> insert all
      2    when batch_rn = 1 then
      3      into batch (customer, cust_name, cust_type) values (customer, cust_name, cust_type)
      4    when section_rn = 1 then
      5      into section (customer, sect_name, sect_depend) values (customer, sect_name, sect_dependency)
      6    when job_sections_rn = 1 then
      7      into job_sections (customer, sect_name, job_sect_name, job_sect_depend) values (customer, sect_name, job_sect_name, job_sect_dependency)
      8    when 1=1 then
      9      into job (customer, sect_name, job_sect_name, job_type, job_sub_type, job_depend) values (customer, sect_name, job_sect_name, job_type, job_sub_type, job_dependency)
     10  --
     11  WITH t as (select XMLTYPE('
         [XML test data from the original post (lines 12 to 45) not shown]
     46  ') as xml from dual)
     47  --
     48  -- END OF TEST DATA
     49  --
     50  ,flat as (select a.customer, a.cust_name, a.cust_type
     51                  ,b.sect_name, NULLIF(b.sect_dependency,'NULL') as sect_dependency
     52                  ,c.job_sect_name, NULLIF(c.job_sect_dependency,'NULL') as job_sect_dependency
     53                  ,d.job_type, d.job_sub_type, NULLIF(d.job_dependency,'NULL') as job_dependency
     54            from t
     55                ,XMLTABLE('/BATCH'
     56                          PASSING t.xml
     57                          COLUMNS customer     VARCHAR2(10) PATH '/BATCH/@customer'
     58                                 ,cust_name    VARCHAR2(10) PATH '/BATCH/@name'
     59                                 ,cust_type    VARCHAR2(10) PATH '/BATCH/@type'
     60                                 ,bat_sections XMLTYPE      PATH '/BATCH/BATCH_SECTIONS'
     61                         ) a
     62                ,XMLTABLE('/BATCH_SECTIONS/SECTION'
     63                          PASSING a.bat_sections
     64                          COLUMNS sect_name       VARCHAR2(10) PATH '/SECTION/@name'
     65                                 ,sect_dependency VARCHAR2(10) PATH '/SECTION/@dependency'
     66                                 ,section         XMLTYPE      PATH '/SECTION'
     67                         ) b
     68                ,XMLTABLE('/SECTION/JOB_SECTIONS'
     69                          PASSING b.section
     70                          COLUMNS job_sect_name       VARCHAR2(10) PATH '/JOB_SECTIONS/@name'
     71                                 ,job_sect_dependency VARCHAR2(10) PATH '/JOB_SECTIONS/@dependency'
     72                                 ,job_sections        XMLTYPE      PATH '/JOB_SECTIONS'
     73                         ) c
     74                ,XMLTABLE('/JOB_SECTIONS/JOBS/JOB'
     75                          PASSING c.job_sections
     76                          COLUMNS job_type       VARCHAR2(10) PATH '/JOB/@type'
     77                                 ,job_sub_type   VARCHAR2(10) PATH '/JOB/@sub_type'
     78                                 ,job_dependency VARCHAR2(10) PATH '/JOB/@dependency'
     79                         ) d
     80           )
     81  --
     82  select customer, cust_name, cust_type, sect_name, sect_dependency, job_sect_name, job_sect_dependency, job_type, job_sub_type, job_dependency
     83        ,row_number() over (partition by customer order by 1) as batch_rn
     84        ,row_number() over (partition by customer, sect_name order by 1) as section_rn
     85        ,row_number() over (partition by customer, sect_name, job_sect_name order by 1) as job_sections_rn
     86  from flat
     87  /
    
    16 rows created.
    
    SQL> select * from batch;
    
    CUSTOMER   CUST_NAME  CUST_TYPE
    ---------- ---------- ----------
    ABC        ABC1       ABC_TYPE
    
    SQL> select * from section;
    
    CUSTOMER   SECT_NAME  SECT_DEPEN
    ---------- ---------- ----------
    ABC        X
    ABC        Y          X
    ABC        Z          Y
    
    SQL> select * from job_sections;
    
    CUSTOMER   SECT_NAME  JOB_SECT_N JOB_SECT_D
    ---------- ---------- ---------- ----------
    ABC        X          JOB1
    ABC        Y          JOB2       X
    ABC        Z          JOB3
    ABC        Z          JOB4
    
    SQL> select * from job;
    
    CUSTOMER   SECT_NAME  JOB_SECT_N JOB_TYPE   JOB_SUB_TY JOB_DEPEND
    ---------- ---------- ---------- ---------- ---------- ----------
    ABC        X          JOB1       X          xx
    ABC        X          JOB1       X          yy
    ABC        X          JOB1       X          zz
    ABC        Y          JOB2       Y          xx         X
    ABC        Y          JOB2       Y          yy         X
    ABC        Y          JOB2       Y          zz         X
    ABC        Z          JOB3       .....      ....
    ABC        Z          JOB4       ....       ....
    
    8 rows selected.
    
    SQL>

    But it would depend on what you are really after regarding primary keys and relationships between the tables etc.

    And I would just like to put this to you...

    h1. UNLESS YOU PROVIDE US THE OUTPUT YOU NEED, WE CANNOT GIVE YOU AN ANSWER

  • Read records from a file and write them to the database

    Hello

    I am reading a file using a File Adapter and writing to a database table using the DBAdapter. In BPEL, I used a Receive activity to receive the input from the file and an Invoke activity to call the DBAdapter; between the Receive and the Invoke I used a Transform activity. The problem is that, after deploying it, I get the following two errors:

    (1) Non-recoverable system fault:
    <bpelFault><faultType>0</faultType><bindingFault xmlns="http://schemas.oracle.com/bpel/extension"><part name="summary"><summary>Exception occurred when binding was invoked. Exception occurred during invocation of JCA binding: "JCA Binding execute of Reference operation 'insert' failed due to: JCA Binding Component connection issue. JCA Binding Component is unable to create an outbound JCA (CCI) connection. ReadWriteDB:WriteDB [ WriteDB_ptt::insert(MydbCollection) ]: The JCA Binding Component was unable to establish an outbound JCA CCI connection due to the following issue: BINDING.JCA-12510 JCA Resource Adapter location error. Unable to locate the JCA Resource Adapter via the .jca binding file element <connection-factory/>. The JCA Binding Component is unable to start up the Resource Adapter specified in the <connection-factory/> element: location='eis/DB/null'. The most likely reason is that either 1) the Resource Adapter RAR file has not been deployed successfully to the WebLogic Application Server, or 2) the '<jndi-name>' element in weblogic-ra.xml has not been set to eis/DB/null. In the latter case you will have to add a new WebLogic JCA connection factory (deploy a RAR). Please correct this and then restart the Application Server. Please make sure that the JCA connection factory and any dependent connection factories have been configured with a sufficient limit for max connections. Please also make sure that the physical connection to the backend EIS is available and the backend itself is accepting connections.". The invoked JCA adapter raised a resource exception. Please examine the above error message carefully to determine a resolution.</summary></part><part name="detail"><detail>JCA Resource Adapter location error. Unable to locate the JCA Resource Adapter via the .jca binding file element <connection-factory/>. The JCA Binding Component is unable to start up the Resource Adapter specified in the <connection-factory/> element: location='eis/DB/null'. The most likely reason is that either 1) the Resource Adapter RAR file has not been deployed successfully to the WebLogic Application Server, or 2) the '<jndi-name>' element in weblogic-ra.xml has not been set to eis/DB/null. In the latter case you will have to add a new WebLogic JCA connection factory (deploy a RAR). Please correct this and then restart the Application Server.</detail></part><part name="code"><code>12510</code></part></bindingFault></bpelFault>

    (2) Non-recoverable system fault:
    Exception occurred when binding was invoked. Exception occurred during invocation of JCA binding: "JCA Binding execute of Reference operation 'insert' failed due to: JCA Binding Component connection issue. JCA Binding Component is unable to create an outbound JCA (CCI) connection. ReadWriteDB:WriteDB [ WriteDB_ptt::insert(MydbCollection) ]: The JCA Binding Component was unable to establish an outbound JCA CCI connection due to the following issue: BINDING.JCA-12510 JCA Resource Adapter location error. Unable to locate the JCA Resource Adapter via the .jca binding file element <connection-factory/>. The JCA Binding Component is unable to start up the Resource Adapter specified in the <connection-factory/> element: location='eis/DB/null'. The most likely reason is that either 1) the Resource Adapter RAR file has not been deployed successfully to the WebLogic Application Server, or 2) the '<jndi-name>' element in weblogic-ra.xml has not been set to eis/DB/null. In the latter case you will have to add a new WebLogic JCA connection factory (deploy a RAR). Please correct this and then restart the Application Server. Please make sure that the JCA connection factory and any dependent connection factories have been configured with a sufficient limit for max connections. Please also make sure that the physical connection to the backend EIS is available and the backend itself is accepting connections.". The invoked JCA adapter raised a resource exception. Please examine the above error message carefully to determine a resolution.

    Please suggest how to solve this.

    Thank you

    Tejas

    Check your source data and the connection factory in your DbAdapter deployment.
    Check whether you have completed all the steps in http://docs.oracle.com/cd/E15523_01/integration.1111/e10231/life_cycle.htm#BABBEDBF.
    Also check the connection-factory location named in the .jca file of your JDeveloper project; if it does not match, change it to the one you created in the DbAdapter deployment.

  • Duplicate the database and incremental backups

    Hello

    I am reviewing options for maintaining a test server that is kept "relatively" up to date with respect to the production server.
    This is my first contact with RMAN and I went through the books, but as I have not had a chance to play with it I could have missed something.

    It is an Oracle 11.2 database dedicated to a Red Hat Enterprise Linux host (not sure of the complete version number).

    The test server must be writable and must be synchronized with the production server about once or twice a month. Due to the size of the database (~600 GB) we are looking at solutions that could make use of the incremental backups that are already taken on the production database.

    While building the test server initially using RMAN DUPLICATE seems straightforward, I'm not sure how the "sync" can then be done as efficiently as possible, especially because we are changing the data on the test server.

    From the description of the RMAN RECOVER command, it is not clear to me whether it could be used to "recover" the test database from the incremental backups up to the "current state" of the production database, even though the test database has been changed. After reading a few posts here, that does not seem to be the case.

    Would a restore point make such an incremental restore possible? I.e. create a (named?) restore point immediately after the duplication and, once the tests are finished, flash back to the restore point and then apply the incremental backups that have accumulated since then.


    Another option seems to be to create a standby database. Then, when we want to run tests, we convert the standby database into a snapshot standby database that is open read/write and run the tests. Once we have finished, we convert it back to a physical standby. This conversion rewinds our changes, and the accumulated redo is applied as an "incremental" catch-up with production, which is basically what we want.

    Is there another solution to this? Or is using a snapshot standby the most efficient solution? If possible, we would like to avoid transferring the 600 GB every time we want to re-synchronize the test database.

    Thanks in advance
    Thomas

    Hi Thomas. I think you're right. A snapshot standby is the most effective solution in your situation.
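
    For reference, the snapshot-standby cycle on 11.2 is roughly the following (a sketch only; it assumes an existing physical standby managed without the Broker, otherwise DGMGRL's CONVERT DATABASE command does the same job):

    -- on the standby, before testing
    ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;
    ALTER DATABASE CONVERT TO SNAPSHOT STANDBY;
    ALTER DATABASE OPEN;   -- now read/write for the tests

    -- when the tests are finished
    SHUTDOWN IMMEDIATE;
    STARTUP MOUNT;
    ALTER DATABASE CONVERT TO PHYSICAL STANDBY;   -- discards the test changes; redo apply then catches up with production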

  • Windows Media Player 11 - Wrong information in the online album database

    Hello

    I have an Erik Satie CD from a Classic FM piano music collection.

    When I insert my CD in the drive and pull in the album information from the WMP database, it picks the wrong album. I had a quick look through the many other Satie albums in the database in order to switch to the right match, but it isn't there; the version I have is not in the database. Furthermore, I can't simply edit the information from another CD, because my CD has 38 tracks and none of the others do: when I click the edit button, it does not give me enough text fields to use, or on some versions it gives me too many, and the album art is also wrong.

    My question is: how do I add my CD to the database of Satie CDs already available to WMP?

    When someone else inserts this version of Satie into their computer, I want the right information to be retrieved from the database for them as well.

    Thank you very much

    David

    Hi HeathD,
    Thank you for visiting the Microsoft Answers forum.
    To submit this request, please visit the Microsoft Connect portal at https://connect.microsoft.com

    What is Connect?
    This site is a connection point between you and Microsoft, and ultimately the entire community. Your feedback helps Microsoft make its software and services the best they can be, and you can learn about and contribute to exciting projects. Visit the site to learn more.

    Thank you

    Martin
    Microsoft Answers Support Engineer
    Visit our Microsoft answers feedback Forum and let us know what you think

  • Display-only value set by a dynamic action is not saved to the database

    Good day,
    I have a select list of products and a dynamic action that updates a display-only item with the price, based on the product selected.
    Once the selection is made and the page is submitted, I noticed that the price is not stored in the database. If I change the display-only
    item to a text field, the value is saved. Is this expected behaviour? If so, can I add something to the text field to make it uneditable?

    Thanks for any help you can provide.

    Version is Application Express 4.1.1.00.23

    Steve

    stmontgo wrote:
    Hello
    Thanks for your advice. I changed Save Session State to Yes, with the other values staying the same, per your recommendations.
    When I change the value, I get the error below. Could it be because the value is set by a dynamic action?

    Yes. That is because the item is "display only" but its value is being changed in the browser.

    You could change the item to a text item and make it read-only by adding the readonly attribute in the 'HTML Form Element Attributes' property.

  • Database crash after "alter user"

    Hello

    I have a problem with Oracle 8i databases on Windows 2000 Server, clustered (active-passive).
    When I try to change the password for SYS as SYSDBA, the instance crashes with no error messages in the alert log or dump files.

    sqlplus /nolog
    SQL> conn sys/pwd@db as sysdba
    SQL> alter user sys identified by pwd2;

    The command works and the password file is changed, but the database cluster resource changes its status to "failed".
    Is this related to the Microsoft Cluster Service and the password file, or to intrinsic Oracle security?

    Does this also happen on a stand-alone server, or on a clustered 10g database?

    Yes, I think so.

    Check the MOS note "Fail Safe Database Goes Offline After Changing the Password for SYS" - Doc 167496.1

    HTH
    -André
