date of creation of database

Hello
On FSCM 8.49, tools 91:
in which PeopleSoft table can I see the database creation time?

Any kind of DB (Oracle, SQL Server, DB2).

Thank you.

In most cases, you can use LASTREFRESHDTTM in the PSSTATUS table if you don't want to use a specific data source.
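
For example, a minimal query sketch (assuming you have SELECT access to the PeopleTools tables):

-- Last refresh timestamp that PeopleSoft records for this database,
-- which the answer above suggests as a stand-in for the creation time.
SELECT LASTREFRESHDTTM FROM PSSTATUS;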

Kind regards
Bob

Tags: Oracle Applications

Similar Questions

  • Comparing the data set of a local Oracle database to a remote SQL Server database

    Hello

    I have a table in a local Oracle database that is incrementally updated from a table in SQL Server through a database link. There is a "Date of creation" field in the SQL Server table that I use to find newly created records, pull them, and insert them into the local table in the Oracle database.

    The issue now is that records in the remote SQL Server table can be deleted as well; I can see that a lot of rows have already been deleted, and I couldn't find a way to capture the deleted rows so that I can remove them from my local table.

    I tried

    SELECT id
    FROM   local_table
    WHERE  NOT EXISTS
           (SELECT 1  /* or SELECT * */
            FROM   remote_table@database_link
            WHERE  remote_table.id = local_table.id);

    I also tried replacing NOT EXISTS with NOT IN.

    The problem is that the query is far too slow over the database link with either the NOT IN or the NOT EXISTS clause. So I'm trying to find an efficient way to capture the rows deleted on the remote side so that I can remove them from my local table.

    Any help is appreciated.

    Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit

    Thank you

    Chrystelle

    It is perhaps easier:

    SELECT pk /* or surrogate key */ FROM oracle.tab
    MINUS
    SELECT pk /* or surrogate key */ FROM sqlserver.tab

    This returns the rows that exist in Oracle but not in SQL Server -> the rows deleted in SQL Server.
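
    For example, a hedged sketch of using that result to clean up the local side (reusing the table and link names from the question above):

    -- Delete local rows whose keys no longer exist on the remote SQL Server side.
    DELETE FROM local_table
    WHERE id IN (SELECT id FROM local_table
                 MINUS
                 SELECT id FROM remote_table@database_link);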

    Another way is to create a DELETE trigger on the SQL Server side and track the deleted rows in a table.

    HTH - Antonio NAVARRO

    http://SQL1.WordPress.com/

  • date of creation in v$datafile

    Hello

    This is my system config.

    >
    Development Server
    OEL 5.0

    DB version
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
    PL/SQL Release 10.2.0.1.0 - Production
    CORE 10.2.0.1.0 Production
    TNS for Linux: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production

    >

    I've added a datafile to a user tablespace and then ran the following query:
    select * from v$datafile;
    There is a column called CREATION_TIME ("date of creation"); it shows today's date for the new datafile, but it shows 30 June 2005 for the other datafiles that belong to the SYSTEM and SYSAUX tablespaces. So my question is: why does it show an old date for the system datafiles? Is it because the Oracle software was packaged by Oracle Corp. in 2005?

    My second question: since I am new to Oracle and mostly use the SQL Developer tool instead of SQL*Plus, is it true that at some point I'll have to use SQL*Plus? SQL Developer does not support some commands; for example, when I run 'archive log list' it works fine in SQL*Plus, but the same does not work in SQL Developer. So I'm just trying to understand whether I am doing something wrong, or whether my assumption is correct that I should also start learning how to use SQL*Plus.

    Regards

    Learner

    If the database was created with a CREATE DATABASE script, the CREATION_TIME in V$DATAFILE_HEADER would correspond to that date.

    If the database was "created" via DBCA using Oracle's pre-configured templates, the seed files don't actually get "created"; they get "extracted" from an RMAN backup that ships with the template, then restored and renamed. As a result, the CREATION_TIME in those files is that of the database Oracle created and put into its templates.

    See the 25 July 2010 update to my post at http://hemantoracledba.blogspot.com/2010/07/vdatabasecreated-is-this-database.html, where I show that the CREATION_TIME of the SYSTEM datafile of a 10.2.0.4 template database is "12 March 08".
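
    To see this on your own database, a minimal sketch using the standard dynamic performance views:

    -- Compare the datafile header creation times with the database creation time.
    SELECT name, creation_time FROM v$datafile_header ORDER BY creation_time;
    SELECT name, created FROM v$database;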

    Hemant K Collette

  • Date of creation and date of importation

    When you import photos or videos into Photos from a folder, the application uses the import date rather than the original creation date.  The result is that the imports are all presented together under "Today."  Many of the photos and videos were taken on different dates, so I would like them listed according to creation date rather than grouped under the import date.  I went to "View" and checked "date of creation".  "Sort" doesn't work in Photos because it is always greyed out.  Any help would be greatly appreciated!

    If you look at the Photos window, photos and videos are sorted by date with the oldest items at the top.  This sort order cannot be changed.

    In the Pictures window, the items are sorted by the date they were imported into the library, with the oldest at the top.  The sort order cannot be changed here either.

    So, you can use a smart album to include all the photos in the library and then sort them in one of these ways:

    The smart album could be created with this criterion:

    That would include all the photos in the library.  Now you can sort them as you like.  Just make sure that the date is early enough to catch all the photos in the library.

    Moments in Photos are the new Events, i.e. groups of photos sorted by capture date.

    When the iPhoto library was first migrated to Photos, a folder titled iPhoto Events was created in the sidebar, and all migrated iPhoto Events (which are now Moments) are represented by an album in that folder. Use the Command + Option + S key combination to open the sidebar if it is not already open.

    NOTE: several users have reported that if the Event albums are moved out of the iPhoto Events folder in the sidebar, they disappear.  It is not widespread, but several users have reported this problem.  Therefore, if you want to ensure that you keep these Event albums, do not move them out of the iPhoto Events folder.

    There is a way to simulate Events in Photos:

    When new photos are imported into the Photos library, go to the Last Import smart album, select all the photos, and use the File ➙ New Album menu option or the Command + N key combination.  Call it what you want.  It appears just above the iPhoto Events folder, and you can then drag it into the iPhoto Events folder.

    When you click on the iPhoto Events folder, you get a simulated iPhoto Events window.

    Albums and smart albums can be sorted by title, by date with the oldest first, or by date with the most recent first.

    Tell Apple what missing features you want restored, or what new features you want added to Photos, via Apple Feedback for Photos.

  • How to copy "Date modified" to "Date created"?

    For some reason, all my photos have had their "Date created" changed to a date in 2015. I do not have a backup I can restore (more than 30,000 photos), but both the "Date modified" and the "Last opened" date remain the same as the original "Date created".

    Is it possible to copy the "Date modified" or the "Last opened" date over the "Date created"?

    ANY suggestions much appreciated.

    Thank you and happy new year,

    Kenneth

    I use a MacBook Air with the latest Photos and operating system installed.

    Where do you see the "creation date" in Photos?  The Info panel in Photos only shows the capture date embedded in the original image file's IPTC and EXIF tags.

    The file creation date that you see in the Finder, if you access the originals using the Finder, may change if your library is synchronized with iCloud. All the original images in my iCloud library on my MacBook Pro with Retina display show a creation date sometime in July, when I downloaded the iCloud library.

  • Can you use Windows Explorer (in Windows XP) to view files according to their DATE OF PUBLICATION (rather than the date of creation or modification)?

    I have a lot of clippings in My Documents. They are in various formats (TXT, Word, and PDF). Using Windows Explorer, I know exactly how to display these documents based on their creation or modification date. But how do I display these documents based on their publication date?

    (1) There must be a way to set "date of publication" as a property of TXT, Word, and PDF files.

    (2) There must be a way to display the files in Windows Explorer in chronological order - not according to their creation or modification date, but according to their publication date.

    Can someone please help? Maybe someone who writes essays with many historical sources can feel my pain. I have been playing with MS Word properties and trying to create custom attributes, but so far I still have not figured out how to a) correctly define a new attribute or b) search on this custom Publication Date attribute in Windows Explorer.

    In short, I want to see all my files by the date they were published, so that I can build a "timeline of events" out of saved newspaper clippings.
     
    Rabindra

    Name/rename the files with dates at the front of the file name; for example, 01-11-2009 - historicaldocument1.doc, 02-13-1957 - historicaldocument2.doc. There is no way for Windows to read the text of a file and tell whether it refers to the Great Chicago Fire on such and such a date.

    If you are doing historical research, you could check some user forums/newsgroups for your particular area to see whether other people use document management software. That would be the best way to go. MS-MVP - Elephant Boy Computers - Don't Panic!

  • Insertion of data in blackberry java database

    Here is my code to insert data into the database

    package mypackage;
    
    import net.rim.device.api.database.Database;
    import net.rim.device.api.database.DatabaseFactory;
    import net.rim.device.api.database.Statement;
    import net.rim.device.api.io.URI;
    import net.rim.device.api.ui.component.LabelField;
    import net.rim.device.api.ui.component.RichTextField;
    import net.rim.device.api.ui.container.MainScreen;
    
    // Class to insert data into the People database table
    
    class InsertDataScreen extends MainScreen
    {
        Database d;
        public InsertDataScreen()
        {
            LabelField title = new LabelField("SQLite Insert Data " +
                                              "Schema Sample",
                                              LabelField.ELLIPSIS |
                                              LabelField.USE_ALL_WIDTH);
            setTitle(title);
            add(new RichTextField("Attempting to insert data into " +
                                   "MyTestDatabase.db on the SDCard."));
            try
            {
                URI myURI = URI.create("file:///Store/Databases/SQLite_Guide/" +
                                       "MyTestDatabase2.db");
                d = DatabaseFactory.open(myURI);
    
                Statement st = d.createStatement("INSERT INTO People(Name,Age) " +
                                                 "VALUES ('John',37)");
                st.prepare();
                st.execute();
                st.close();
                d.close();
                add(new RichTextField("inserting data successful"));
                System.out.println( "inserting data successful" );
            }
            catch ( Exception e )
            {
                System.out.println( e.getMessage() );
                e.printStackTrace();
            }
    
        }
    }
    

    I checked that this database exists. When I run the above code, my data is never inserted into the database, and the code below

    add(new RichTextField("inserting data successful"));
    

    is never called.

    Does anyone know what my mistake is? Please help.

    Problem solved!

    I successfully inserted data and could view the database once I changed the storage location to the SD card.

    I don't know why, but it's probably that the BlackBerry can't save the SQLite database in internal memory (OS 7).

  • date of creation of a user account in XP and Windows 7

    Hello

    I am managing a project on information security. The goal of the project is to take admin privileges away from users, so that they cannot create unauthorized accounts, install new software, or change company policy on their machines without assistance.

    I want to know: if a user with an admin-privileged account has created another account, say abc or test, with administrator privileges, can we get the creation date of such accounts, and if so, how? And is there any report available on XP/Windows 7 that can help?

    I hope the knowledgeable people in the forum can guide me on this.

    Regards

    Maneesh Kumar

    You can get some of this information when you type this command at the command prompt:

    net user Kumar

  • Is it possible to duplicate a database on another server without losing any data on the auxiliary database?

    Hi guys.

    I have following scenario

    - I have a compressed backup set of a database in NOARCHIVELOG mode.

    - I want that database duplicated on a different server, but without losing the data already on the auxiliary database.

    - Both servers run the same version of Oracle 11g.

    - Both servers are UNIX/Linux.

    Please reply if you need more info.

    DB version: 11.2.0.1

    Name of the source DB: A

    Name of the duplicate DB: B, with a different file structure.



    Can someone help me please?

    The target and the duplicate database cannot have the same DBID, so DUPLICATE is not what you need. If you have database FOO on server A and want to copy it to FOO on server B with a different file structure, then simply copy the RMAN backup sets to server B and restore the database there, renaming the datafiles. There are many examples on the web.
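
    As a quick sanity check, a minimal sketch using a standard view: a restored copy keeps the source DBID (whereas RMAN DUPLICATE generates a new one), so running this on both servers after the restore should report the same DBID.

    -- Show the DBID, name, and creation time of the current database.
    SELECT dbid, name, created FROM v$database;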

  • Data dictionary synchronizing (reading) - database

    Hello

    I have been working for a while with Oracle Data Modeler, using the JDBC tab for the database connection. The database is SQL Anywhere and the connection details look like: jdbc:sybase:Tds:localhost:2638?ServiceName=Hades&CHARSET=utf8.

    Unfortunately, I still have no clear idea about the relationship between the data dictionary and the connected database (via the JDBC adapter).

    - When is the database read? Is it read into the dictionary? I guess the information from the connected database is read into Data Modeler at startup?

    - If a change occurs in the database (for example, a column, a table, or a foreign key is added), can I manually update the Data Modeler dictionary?

    - In the connection details, I can replace localhost with an IP address and press the Test button successfully. But I'm not sure whether the data dictionary is then "filled" from the newly connected database without restarting Data Modeler?


    Any clarification, or a pointer to documentation on how and when the data dictionary and the database are synchronized, would be very welcome.

    Best regards

    Robert

    Hi Robert,

    Just to clarify: "data dictionary" refers to the metadata definitions held in the database.  File > Import > Data Dictionary imports these definitions into Data Modeler.

    (There is no separate "data dictionary" inside Data Modeler.)

    The blue button on the right (Synchronize Data Dictionary with Model) is similar in effect to opening your model and then doing File > Import > Data Dictionary (and setting the option to swap the target model in step 2 of the wizard).

    Both show a comparison between the current relational model and the current definitions in the database.

    David

  • Cluster database creation requires the default listener to be configured and running in the Grid Infrastructure home

    Hello!

    I created a new database with DBCA in a RAC configuration (2 nodes).
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit

    Red Hat Enterprise Linux Server release 5.8

    The prerequisite checks passed successfully:

    [oracle@db1-mng ~]$ cluvfy stage -pre dbcfg -n db1-mng,db2-mng -d
    /u01/app/oracle/product/11.2.0.3/dbhome_1

    Performing pre-checks for database configuration

    Checking node reachability...

    Node reachability check passed from node "db1-mng"

    Checking user equivalence...

    User equivalence check passed for user "oracle"

    Checking node connectivity...

    Checking hosts config file...

    Verification of the hosts config file successful

    Check: Node connectivity of interface "bondeth0"

    Node connectivity passed for interface "bondeth0"

    Checking TCP connectivity of subnet "10.116.176.196"

    Check: Node connectivity of interface "bondib0"

    Node connectivity passed for interface "bondib0"

    Checking TCP connectivity of subnet "192.168.18.10"

    Checking subnet mask consistency...

    Subnet mask consistency check passed for subnet "10.116.176.196"

    Subnet mask consistency check passed for subnet "192.168.18.10"

    Subnet mask consistency check passed.

    Node connectivity check passed

    Checking multicast communication...

    Checking subnet "10.116.176.196" for multicast communication with multicast group "230.0.1.0"...

    Check of subnet "10.116.176.196" for multicast communication with multicast group "230.0.1.0" passed.

    Checking subnet "192.168.18.10" for multicast communication with multicast group "230.0.1.0"...

    Check of subnet "192.168.18.10" for multicast communication with multicast group "230.0.1.0" passed.

    Check of multicast communication passed.

    Total memory check passed

    Available memory check passed

    Swap space check passed

    Free disk space check passed for "db1-mng:/u01/app/oracle/product/11.2.0.3/dbhome_1"

    Free disk space check passed for "db2-mng:/u01/app/oracle/product/11.2.0.3/dbhome_1"

    Free disk space check passed for "db1-mng:/tmp"

    Free disk space check passed for "db2-mng:/tmp"

    Check for multiple users with UID value 1000 passed

    User existence check passed for "oracle"

    Group existence check passed for "oinstall"

    Group existence check passed for "dba"

    Membership check for user "oracle" in group "oinstall" [as Primary] passed

    Membership check for user "oracle" in group "dba" passed

    Run level check passed

    Hard limits check passed for "maximum open file descriptors"

    Soft limits check passed for "maximum open file descriptors"

    Hard limits check passed for "maximum user processes"

    Soft limits check passed for "maximum user processes"

    System architecture check passed

    Kernel version check passed

    Kernel parameter check passed for "semmsl"

    Kernel parameter check passed for "semmns"

    Kernel parameter check passed for "semopm"

    Kernel parameter check passed for "semmni"

    Kernel parameter check passed for "shmmax"

    Kernel parameter check passed for "shmmni"

    Kernel parameter check passed for "shmall"

    Kernel parameter check passed for "file-max"

    Kernel parameter check passed for "ip_local_port_range"

    Kernel parameter check passed for "rmem_default"

    Kernel parameter check passed for "rmem_max"

    Kernel parameter check passed for "wmem_default"

    Kernel parameter check passed for "wmem_max"

    Kernel parameter check passed for "aio-max-nr"

    Package existence check passed for "make"

    Package existence check passed for "binutils"

    Package existence check passed for "gcc (x86_64)"

    Package existence check passed for "libaio (x86_64)"

    Package existence check passed for "glibc (x86_64)"

    Package existence check passed for "compat-libstdc++-33 (x86_64)"

    Package existence check passed for "elfutils-libelf (x86_64)"

    Package existence check passed for "elfutils-libelf-devel"

    Package existence check passed for "glibc-common"

    Package existence check passed for "glibc-devel (x86_64)"

    Package existence check passed for "glibc-headers"

    Package existence check passed for "gcc-c++ (x86_64)"

    Package existence check passed for "libaio-devel (x86_64)"

    Package existence check passed for "libgcc (x86_64)"

    Package existence check passed for "libstdc++ (x86_64)"

    Package existence check passed for "libstdc++-devel (x86_64)"

    Package existence check passed for "sysstat"

    Package existence check passed for "ksh"

    Check for multiple users with UID value 0 passed

    Current group ID check passed

    Starting check for consistency of primary group of root user

    Check for consistency of primary group of root user passed

    Checking CRS integrity...

    Clusterware version consistency passed

    CRS integrity check passed

    Checking node application existence...

    Checking existence of VIP node application (required)

    VIP node application check passed

    Checking existence of NETWORK node application (required)

    NETWORK node application check passed

    Checking existence of GSD node application (optional)

    GSD node application is offline on nodes "db1-mng,db2-mng"

    Checking existence of ONS node application (optional)

    ONS node application check passed

    Time zone consistency check passed

    Pre-check for database configuration was successful.

    ------------------------------------------

    [oracle@db1-mng ~]$ ps -aef | grep lsnr

    oracle   10435      1  0  2014 ?        00:09:57 /u01/app/oracle/product/11.2.0.3/dbhome_1/bin/tnslsnr LISTENER -inherit

    oracle   10447      1  0  2014 ?        00:14:10 /u01/app/11.2.0.3/grid/bin/tnslsnr LISTENER_SCAN1 -inherit

    oracle   58226  74449  0 14:00 pts/0    00:00:00 grep lsnr

    ------------------------------------------

    [oracle@db1-mng ~]$ lsnrctl status LISTENER

    LSNRCTL for Linux: Version 11.2.0.3.0 - Production on 16 April 2015 11:56:48

    Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=IPC)(KEY=LISTENER)))

    STATUS of the LISTENER

    Alias LISTENER

    Listener Parameter File   /u01/app/oracle/product/11.2.0.3/dbhome_1/network/admin/listener.ora

    Listener Log File         /u01/app/oracle/product/11.2.0.3/dbhome_1/log/diag/tnslsnr/db1-mng/listener/alert/log.xml

    Listening Endpoints Summary...

    (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=LISTENER)))

    (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=10.116.176.129)(PORT=1521)))

    (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=10.116.176.100)(PORT=1521)))

    Services Summary...

    Service "+ASM" has 1 instance(s).

    Instance "+ASM1", status READY, has 1 handler(s) for this service...

    Service "COM" has 1 instance(s).

    Instance "COM1", status READY, has 1 handler(s) for this service...

    Service "COMXDB" has 1 instance(s).

    Instance "COM1", status READY, has 1 handler(s) for this service...

    Service "PM" has 1 instance(s).

    Instance "PM1", status READY, has 1 handler(s) for this service...

    Service "PMXDB" has 1 instance(s).

    Instance "PM1", status READY, has 1 handler(s) for this service...

    Service "TEST" has 1 instance(s).

    Instance "TEST1", status READY, has 1 handler(s) for this service...

    Service "TESTXDB" has 1 instance(s).

    Instance "TEST1", status READY, has 1 handler(s) for this service...

    The command completed successfully

    ----------------------------

    [oracle@db1-mng ~]$ lsnrctl status LISTENER_SCAN1

    LSNRCTL for Linux: Version 11.2.0.3.0 - Production on 16 April 2015 14:12:13

    Connecting to (DESCRIPTION=(ADDRESS=(PROTOCOL=IPC)(KEY=LISTENER_SCAN1)))

    STATUS of the LISTENER

    Alias LISTENER_SCAN1

    Listener Parameter File   /u01/app/11.2.0.3/grid/network/admin/listener.ora

    Listener Log File         /u01/app/11.2.0.3/grid/log/diag/tnslsnr/db1-mng/listener_scan1/alert/log.xml

    Listening Endpoints Summary...

    (DESCRIPTION=(ADDRESS=(PROTOCOL=ipc)(KEY=LISTENER_SCAN1)))

    (DESCRIPTION=(ADDRESS=(PROTOCOL=tcp)(HOST=10.116.176.197)(PORT=1521)))

    Services Summary...

    Service "COM" has 2 instance(s).

    Instance "COM1", status READY, has 1 handler(s) for this service...

    Instance "COM2", status READY, has 1 handler(s) for this service...

    Service "COMXDB" has 2 instance(s).

    Instance "COM1", status READY, has 1 handler(s) for this service...

    Instance "COM2", status READY, has 1 handler(s) for this service...

    Service "PM" has 2 instance(s).

    Instance "PM1", status READY, has 1 handler(s) for this service...

    Instance "PM2", status READY, has 1 handler(s) for this service...

    Service "PMXDB" has 2 instance(s).

    Instance "PM1", status READY, has 1 handler(s) for this service...

    Instance "PM2", status READY, has 1 handler(s) for this service...

    Service "TEST" has 2 instance(s).

    Instance "TEST1", status READY, has 1 handler(s) for this service...

    Instance "TEST2", status READY, has 1 handler(s) for this service...

    Service "TESTXDB" has 2 instance(s).

    Instance "TEST1", status READY, has 1 handler(s) for this service...

    Instance "TEST2", status READY, has 1 handler(s) for this service...

    The command completed successfully

    ------------------------

    The same commands are also successful on the second node, db2-mng, for LISTENER_SCAN2 and LISTENER_SCAN3.

    When I try to run the command "dbca -silent -responseFile /home/oracle/PR/PRN.rsp", it displays an error:

    Cluster database creation requires the default listener to be configured and
    running in the Grid Infrastructure home. Use NETCA in the Grid Infrastructure home
    "/u01/app/11.2.0.3/grid" to configure a listener before
    proceeding.

    The RAC databases are already running in production, so I would rather not touch the existing listeners.

    How can I get around the error?

    In fact, all the databases were created in the same way, to tell the truth, about a year ago.

    Thank you, Thomas! I chose the first method. OK

  • Time zone and date of creation of PDF

    Hello

    I have two questions which I would like to please your help:

    - Can the creation date of a PDF file be changed? Is it strong evidence that the document was actually created on that date? I should specify that the document was created with the Pages application on a Mac, and that the creation date was looked up by checking the document properties in the Adobe Reader software.

    - I am located in France/Paris (GMT+1). Will the creation date be displayed/converted in my local time zone (GMT+1), or is it possible that the document appears to have been created later than it should (later than the sender claims) because the creator or sender was in a different time zone?

    Thank you for your help

    The creation date of a PDF document can easily be changed using Adobe Acrobat or many other third-party PDF tools. There is no hard proof of when the file was actually created.

  • How to configure a second Primavera web service to return data from a second database?

    Hello world


    We have P6 with a first WS (web services) deployment on a single WebLogic domain server. The first WS returns data from the first database instance.

    Then we deployed a second WS instance on a separate WebLogic domain server with a different port. We configured the second WS with <WS2_INSTALL_HOME>/bin/dbconfig.sh, creating a new configuration branch that points to a second database instance. However, this configuration is ignored, and the second web services still return data from the first database.

    We have a single domain, including notably the following servers:

    Name / host / Port / deployments

    P6 / localhost / 0001 / P6 (v8.3), p6ws1 (v8.3)

    p6ws2 / localhost / 0002 / p6ws2 (v8.3)

    We now have two different BREBootstrap.xml files.

    P6 BREBootstrap.xml:

    <database>

    <URL>jdbc:oracle:thin:@db1:1521:db1</URL>

    <username>pubuser</username>

    <password>anycriptopass1</password>

    <driver>oracle.jdbc.OracleDriver</driver>

    <PublicGroupId>1</PublicGroupId>

    </database>

    <CfgVersion>8.330</CfgVersion>

    <configurations>

    <BRE name="P6 Config_DB1" instances="1" logDir="anydir\P6EPPM\p6\PrimaveraLogs"/>

    </configurations>

    p6ws2 BREBootstrap.xml:

    <database>

    <URL>jdbc:oracle:thin:@DB2:1521:DB2</URL>

    <username>pubuser</username>

    <password>anycriptopass2</password>

    <driver>oracle.jdbc.OracleDriver</driver>

    <PublicGroupId>1</PublicGroupId>

    </database>

    <CfgVersion>8.330</CfgVersion>

    <configurations>

    <BRE name="P6 Config_DB2" instances="1" logDir="anydir\P6EPPM\ws2\PrimaveraLogs"/>

    </configurations>

    "P6 Config_DB1" and "P6 Config_DB2" including the property database for the database 1 and 2 respectively.

    How can I set up the second web service to return data from the second database?


    Thanks in advance!


    Kind regards

    Dmitry

    So, the answer from Oracle Support:

    It looks like, per the documentation, Web Services cannot be configured this way, unlike the other modules. See the following topics:

    BUG 19516437 - IS IT POSSIBLE TO HARDCODE A P6 WEB SERVICES DEPLOYMENT TO A DATABASE INSTANCE? (asking whether this is possible)

    BUG 19579735 - ABILITY TO HARDCODE A P6 WEB SERVICES DEPLOYMENT TO A DATABASE INSTANCE (the corresponding enhancement request so that it can be done).

    The problem has been resolved by the following:

    1. Created a separate WebLogic domain.

    2. Deployed P6 and p6ws on managed servers.

    3. Configured P6 to use the second database instance; P6 itself is not started.

    4. Result: the p6ws (from the additional WebLogic domain) returns data from the second database instance.

    Kind regards

    Dmitry

  • Date of creation of a VM - help with a script

    Hi people,

    I need to know the creation date of a virtual machine, and apparently the only way is to run a script against the event logs. I found this script: determining VM creation time from vCenter events « vmdev.info

    I copied the script into Notepad and saved it as a .ps1 file.

    I'm a newbie with PowerShell and PowerCLI, but I have everything installed and working. I connected to a host (5.5) and ran the script with .\script.ps1. Nothing: no errors, no results. The cursor just returned.

    Can someone help me with what I'm missing? Or is there another way to determine the creation date? Now that I've started, I'm very keen to learn PowerCLI. Thank you

    Did you actually call the function (see the last line)?

    Function Get-VMCreationTimes {
        $vms = Get-VM
        $vmevts = @()
        $vmevt = New-Object PSObject
        foreach ($vm in $vms) {
            # Progress bar (shows the previously found VM while searching for the next one):
            $foundString = "found: " + $vmevt.name + " " + $vmevt.createdTime + " " + $vmevt.IPAddress + " " + $vmevt.createdBy
            $searchString = "searching for: " + $vm.Name
            $percentComplete = $vmevts.count / $vms.count * 100
            Write-Progress -Activity $foundString -Status $searchString -PercentComplete $percentComplete

            # The oldest event recorded for this VM approximates its creation time.
            $evt = Get-VIEvent -Entity $vm | Sort-Object CreatedTime | Select-Object -First 1

            $vmevt = New-Object PSObject
            $vmevt | Add-Member -MemberType NoteProperty -Name createdTime -Value $evt.CreatedTime
            $vmevt | Add-Member -MemberType NoteProperty -Name name -Value $vm.Name
            $vmevt | Add-Member -MemberType NoteProperty -Name IPAddress -Value $vm.Guest.IPAddress
            $vmevt | Add-Member -MemberType NoteProperty -Name createdBy -Value $evt.UserName

            # Uncomment the following lines to extract the datastore(s) each VM is stored on
            #$datastore = Get-Datastore -VM $vm
            #$datastore = $vm.HardDisks[0].Filename -replace '\[(.*)\].*', '$1'   # faster than Get-Datastore
            #$vmevt | Add-Member -MemberType NoteProperty -Name datastore -Value $datastore

            $vmevts += $vmevt
            #$vmevt   # uncomment this to print the results line by line
        }
        $vmevts | Sort-Object createdTime
    }

    Get-VMCreationTimes

  • Unable to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the VM creation date

    I am unable to export a list of virtual machines created in the past 7, 30, 120 and 180 days from an imported CSV file containing the VM creation date. My question is whether this is the correct condition for the variable $VmCreated7DaysAgo: $_.CreatedOn -lt $CDate7.

    ##SCRIPT_START

    $file = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\VM-Repo.csv"

    $Import = Import-Csv $file

    $VMCreatedLast7RDayRepoFile = "C:\Users\Admin\Documents\WindowsPowerShell\08-18-2014\Last7Days.csv"

    $start7 = (Get-Date).AddMonths(-1)

    $CDate7 = $start7.ToString('MM/dd/yyyy')

    $VmCreated7DaysAgo = $Import | Select-Object -Property Name, Powerstate, vCenter, VMHost, Cluster, file, Application, CreatedBy, CreatedOn, NumCpu, MemoryGB | Where-Object { $_.CreatedOn -lt $CDate7 } | Sort-Object CreatedOn

    $TotalVmCreated7DaysAgo = $VmCreated7DaysAgo.count

    $VmCreated7DaysAgo | Export-Csv -Path $VMCreatedLast7RDayRepoFile -NoTypeInformation -UseCulture

    Write-Host "$TotalVmCreated7DaysAgo VMs created in 7 days" -BackgroundColor Magenta

    Invoke-Item $VMCreatedLast7RDayRepoFile

    ##SCRIPT_END

    You can use the New-TimeSpan cmdlet in the Where clause; it returns the time difference between two DateTime objects.

    An example of this cmdlet:

    New-TimeSpan -Start (Get-Date).AddDays(-7) -End (Get-Date) | Select-Object -ExpandProperty Days

    In your case, you could do:

    Where-Object { (New-TimeSpan -Start ([DateTime]$_.CreatedOn) -End $start7).Days -gt 7 }

    But beware of negative numbers.
