Information contained in various log files

Hello

I am new to Oracle Fusion Middleware. I have two Linux systems with Oracle SOA Suite 11g installed, clustered together; there is a single domain named "soa_domain", which has an Admin Server and a two-managed-server (SOA) cluster. Now, when I look in the AdminServer folder, I see 5 different log files.

They are:

  1. AdminServer-diagnostic.log
  2. AdminServer.log
  3. AdminServer.out
  4. soa_domain.log
  5. Access.log

This server is bounced every day using WLST and NodeManager scripts.

Can someone please explain to me the meaning of each log file?

Thank you

Alisson

Hi Laurent,

I mean that each individual server has its own server log.

For example, in your case there are logs like: adminserver.log, wls_soa1.log, and wls_soa2.log.

There will be only one domain log for each domain.

Let me know if you have any questions.
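For orientation, in a default 11g domain layout all five files normally sit under the server's logs directory. The sketch below just prints the typical paths (the DOMAIN_HOME value is a hypothetical example; adjust to your installation), with a one-line summary of what each file holds:

```shell
# Hypothetical domain home - adjust to your installation
DOMAIN_HOME=/u01/oracle/user_projects/domains/soa_domain

# AdminServer.log            - the per-server log of the Admin Server
# AdminServer-diagnostic.log - the ODL diagnostic log written by Fusion Middleware components
# AdminServer.out            - stdout/stderr captured when the server is started via Node Manager
# access.log                 - the HTTP access log of requests served by this server
# soa_domain.log             - the domain log, aggregating messages from all servers in the domain
for f in AdminServer.log AdminServer-diagnostic.log AdminServer.out access.log soa_domain.log; do
    echo "$DOMAIN_HOME/servers/AdminServer/logs/$f"
done
```

Because the server is bounced daily via Node Manager, the .out file is the one that gets recreated on each restart.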

Thank you

Sharmela

Note: Please mark the replies as correct or helpful if they address your query.

Tags: Fusion Middleware

Similar Questions

  • We get a "We couldn't create a new partition or locate an existing one. For more information, see the Setup log files" error message while trying to install Windows 8

    * Original title: Win 8 - Error Message: failed to create new partition...

    I have Win XP on my Dell Dimension 5150, which dual-boots with Linux Mint 12 Lisa, and this is my favorite of the bunch.

    I bought the Windows 8 Pro DVD from an Australian retailer.

    Win XP is on a 39 GB partition along with other application files. I had to delete several files to get the free space necessary for Win 8, and finally ended up formatting the partition and going for a completely fresh installation.

    Unfortunately, I now get the "We couldn't create a new partition or locate an existing one. For more information, see the Setup log files" error message. I can't find a Setup log file when I boot from the DVD.

    I tried removing all external drives and other USB devices, including my modem, but not my wired USB keyboard/mouse.

    The two internal hard drives are as follows:
    Disk 0 Partition 1 - 110 GB - System (LINUX)
    Disk 0 Partition 2 - 3.5 GB - Logical

    Disk 1 Partition 1 - 47 MB - OEM (reserved) [DellUtility]
    Disk 1 Partition 2: Win - 39.1 GB - System
    Disk 1 Partition 3: DATA - 39.1 GB - Logical
    Disk 1 Partition 4: OfficeProgs - 19.5 GB - Logical
    Disk 1 Partition 5: PROJECTS - 39.1 GB - Logical
    and so on up to partition 8, with 9 MB of unallocated space.

    I have tried both the 64-bit and 32-bit discs with the same result.

    As I now no longer have any Windows at all on my computer, what's the next step? If there is one ;-)

    Hello

    I solved the problem. It seems that you cannot install to a secondary partition from within an earlier version of Windows. You must restart the computer and run the installation by booting from the DVD or other media. Once you get the installation running that way, you should be able to install to another partition without any problems.

    Have a nice day

  • Archive log file size

    The redo log file size is 16 MB, but the generated archive log files are approximately 10 MB in size. What could be the reason?

    Oracle Version: 10.2.0.4

    OS: Windows

    Published by: Deccan charger 19 April 2010 23:31

    First of all:

    a redo log contains a lot of things needed for instance recovery etc.; an archived log is used only for restores - it does not need all the stuff that is in a redo log, so when the archiver writes the archive log it does not write everything, only what is required for restores.

    Second:

    Archive logs are created with smaller, irregular sizes than the original redo logs. Why? [ID 388627.1]

    --------------------------------------------------------------------------------

    Last updated: 2 June 2007     Type: HOWTO     Status: MODERATED

    In this Document
    Goal
    Solution
    References

    --------------------------------------------------------------------------------

    This document is made available to you through Oracle Support's Rapid Visibility (RaV) process and therefore has not been subject to an independent technical review.

    Applies to:
    Oracle Server - Enterprise Edition - Version: 8.1.7.4 to 11.1
    Information in this document applies to any platform.

    Goal
    Archive logs are created with smaller, irregular sizes than the original redo logs.
    Commands like:
    ALTER SYSTEM SWITCH LOGFILE
    or
    ALTER SYSTEM ARCHIVE LOG...
    are not being used to force log switches or to generate archives. Also, the ARCHIVE_LAG_TARGET parameter is not set.
    What else could cause this behaviour?
    Solution
    From:
    Bug 5450861: ARCHIVE LOGS ARE GENERATED WITH A SIZE SMALLER THAN THE REDO LOG FILES
    the explanation of this situation has 2 main reasons:
    1. Archive logs do not have to be the same size. This was decided very long ago, when blank padding of archive logs was stopped, for a very good reason - in order to save disk space.

    2. The log switch does not occur when a redo log file is 100% full. There is an internal algorithm that determines when to switch logs. This also has a very good reason - switching logs at the very last moment could incur performance problems (for various reasons, outside the scope of this note).
    So, after the log switch occurs, the archivers only copy the information from the redo log files. As redo logs are not 100% full after the log switch, and archive logs are not blank-padded after the copy operation is complete, this results in unevenly sized files smaller than the original redo log files.
    This is very apparent for very small (less than 10 MB) log files; for example, 2.5 MB archive logs produced from 5 MB redo logs are very noticeable.
    Just note that nowadays the default redo log files are 100 MB in size. If the archive log files were between 98 and 100 MB, nobody would notice.
    The main concern one must have about archive log files is possible corruption. This can easily be verified by attempting a test recovery. When that is OK, the uneven archive log size should be of no concern, as it is expected.
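    The size mismatch described above is easy to measure from the data dictionary; a quick sketch using the standard V$LOG and V$ARCHIVED_LOG views (sizes converted to MB):

```sql
-- Configured size of each online redo log group
SELECT group#, bytes/1024/1024 AS redo_mb
FROM   v$log;

-- Actual size of each generated archive log
SELECT sequence#, (blocks * block_size)/1024/1024 AS arch_mb
FROM   v$archived_log
ORDER  BY sequence#;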

  • Where to see the runtime log files

    For my setup, the location mentioned in the documentation here http://download.oracle.com/docs/cd/E13154_01/bpm/docs65/admin_guide/index.html?t=modules/logging/c_Head_Logging.html does not contain the log files. Where can I find more information on locating the log files, and also, more importantly, how do I log from a BPM process? Is there an out-of-the-box logging activity available in BPM? I found nothing in the documentation.

    Go to this link: Re: Logging a Message in Studio

    Paragraph number 4 shows how to find the current log file.

    Dan

  • Error message "357 errors: more information can be found in the log file: /Library/Application Support/ContactSheetII.log" when making a contact sheet in PS

    Hello

    I'm making a contact sheet in PS and I keep coming up with this error message: 357 errors: more information can be found in the log file: applications/library

    I did one this way before in an earlier version of PS, but this new version is not letting me. Any suggestions on how I can get past this?

    Thank you very much

    Jenn

    Please close Photoshop and try renaming the Photoshop preferences file once.

    Location: Preference file functions, names, locations | Photoshop CC 2014

  • What do redo log files hold?

    Hello Experts,

    I have read articles on redo log and undo segment files. I was wondering something very simple. What do redo log files hold in there? Do they store the SQL statements?

    Let's say my update statement modifies 800 blocks of data. A single update statement can modify 800 different data blocks, right? Yes, that can be true. I think those data blocks cannot be held in the redo log buffer, right? I mean, I know exactly what the redo log buffer and redo log file do, and I know the task of the LGWR background process. But I wonder, does it hold the data blocks? It is not supposed to hold data blocks the way the buffer cache does, right?

    My second question is: doesn't rollback affect the redo log buffer? Because it does not need the redo log buffer to take effect. Conversely, the rollback statement is written into the redo log buffer when someone issues it, am I right?

    As far as I know, rollback interacts directly with the UNDO tablespace?

    I hope that I have expressed myself clearly.

    Thanks in advance.

    Here's my question:

    My second question is: doesn't rollback affect the redo log buffer? Because it does not need the redo log buffer to take effect. Conversely, the rollback statement is written into the redo log buffer when someone issues it, am I right?

    As far as I know, rollback interacts directly with the UNDO tablespace?

    Yes, where else would the undo data come from? The undo tablespace contains the undo segments that contain the undo data required for the rollback of your transaction.

    I can say that rollback does not alter the data already written to the redo log buffer. In other words, the change vectors will remain the same as before the rollback. Conversely, the rollback command is itself also recorded in the redo log. As the name suggests, all commands are saved in the REDO LOGS.

    I hope I am not wrong so far?

    Not sure why you even need the redo log buffer for a rollback? This is why I asked what it was for - where does the undo actually happen? And the answer to this is that it happens in the buffer cache. Before you worry about the change vectors, you must understand that it does not really matter what is contained where, as long as there is a transaction recorded in the transaction table of the undo segment. If the transaction table indicates that the transaction is no longer there, there must have been a rollback of the transaction. Change vectors are saved in the redo log file, while the rollback happens on the data blocks stored in the data files, using the undo blocks stored in the undo "data" files.

    Meanwhile I read an article about redo and undo, in which transaction processing is explained. Here is the link http://pavandba.files.wordpress.com/2009/11/undo_redo1.pdf

    I found some interesting information in this article, as follows.

    It is worth noting that during the rollback process, the redo logs never participate. The only time redo logs are read is during recovery and archiving. This is the key tuning concept: redo logs are written to; Oracle does not read them during normal processing. As long as you have sufficient devices so that when ARC is reading a file, LGWR is writing to a different device, then there is no contention for redo logs.

    If redo logs are never involved in the rollback process, how does Oracle then know the order of the transactions? As far as I know that is only written in the redo logs.

    I am very impressed by Aman's thoughts.

    Why do you ask?

    Now, before giving a response, let me say two things. One, I know Pavan and he is a regular contributor to this forum and to several other forums and Facebook, and two, with all due respect to him, a little advice for you: when you try to understand a concept, stick to the Oracle documentation and do not read and merge articles/blog posts from all over the web. Everyone who publishes on the web has their own way of expressing things, and many times the context of the writing makes things more confusing. Maybe that explains the doubts you can get after reading the various search results on the web.

    Redo logs are used for recovery, not for rollback. The reason is that redo log files are applied in sequential order, and that is not what we are looking for in a rollback. A rollback is required only for a few blocks. Basically, what happens in a rollback is that the undo records required for a data block are applied in the reverse order of their creation. The entry for the transaction is in the ITL slot of the data block, which points to the required Undo Byte Address (UBA), through which Oracle also knows which undo blocks would be needed for the rollback of your transaction. As soon as the data blocks are rolled back, the ITL slots are cleared as well.

    In addition, you must remember that until the transaction is marked as finished, via either a commit or a rollback, the undo data for it remains intact. The reason for this is that Oracle must ensure that the undo data is available to perform the rollback of the transaction. The reason undo data is also recorded in the redo logs is to ensure that, in the event of the loss of the undo data file, recovering it is still possible. Because that would also require the changes that happened to the undo blocks, the change vectors associated with the undo blocks are also saved in the redo log buffer and, subsequently, in the redo log files.

    HTH

    Aman...
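    The undo bookkeeping described above can be watched from an open transaction; a small illustrative sketch (the table name t and column c are hypothetical; V$TRANSACTION with its USED_UBLK/USED_UREC columns is the standard view):

```sql
UPDATE t SET c = c + 1;      -- open a transaction (t and c are hypothetical)

SELECT used_ublk,            -- undo blocks this transaction currently holds
       used_urec             -- undo records
FROM   v$transaction;

ROLLBACK;                    -- applies the undo records in reverse order; the
                             -- rollback itself also generates redo, as noted above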

  • imageFORMULA CR-180 check scanner log file

    Hello

    I work with an imageFORMULA CR-180 check scanner, and when running a good quantity (more than 100) of checks, I noticed that scanning stops and a pop-up window appears asking whether I would like to overwrite my log file and start over.

    The software I use is called "Scan Utility for CR-180II", and the log file contains the MICR line information of the scanned checks, as well as the date of the scan and the file location of the images captured by the scanner.

    The log file options are under "Options -> MICR data settings".

    What I do is click "No, I want to overwrite the file", stop the scan, move the 'old' log file elsewhere, restart the scanning, and save to a new log file.

    Is there any way to work around this problem and record continuously to the same log file?

    Is there a memory limit on the scanner?

    This is the Web page of the scanner that I use:

    http://www.USA.Canon.com/Cusa/support/Office/imageformula_scanners/imageformula_cr_180_cr_180ii/imag...

    Hi lendjones!

    Thanks for posting in the forum! Canon does not provide direct support for the imageFORMULA product series, but your dealer will be able to help you! If you do not have a reseller, please call us at 1-800-OKCANON (652-2666) and we will be happy to refer you to dealers in your area.

  • Generate a log file from the dialog box

    Hi all


    I'm generating a .txt log file from the information in the dialog box. That means that if the checkbox is checked, the log file should read "checkbox1 - 01. Check the ratio, sizes against job ticket and slug information" is checked

    If the checkbox is not checked, the log file should read "checkbox1 - 01. Check the ratio, sizes against job ticket and slug information" is not checked

    and the entry in "myText2" also needs to go into the log file.

    Can someone help with this? Help would be appreciated!



    var w = new Window('dialog');

    var myGroup1 = w.add("panel", undefined, 'P&&G Check List');

    myGroup1.alignChildren = 'left';

    var checkbox1 = myGroup1.add("checkbox", undefined, '01.   Check the ratio, sizes against job ticket and slug information');

    var checkbox2 = myGroup1.add("checkbox", undefined, '02.   Check images are linked');

    var checkbox3 = myGroup1.add("checkbox", undefined, '03.   Visually check the progression of KV/Model/CP images');

    var checkbox4 = myGroup1.add("checkbox", undefined, '04.   Visually check the progression of other elements like Logo and Bottle');

    var checkbox5 = myGroup1.add("checkbox", undefined, '05.   Check the placement of Language Tagging');

    var checkbox6 = myGroup1.add("checkbox", undefined, '06.   Ensure that all measurements are calculated based on Live area');

    var checkbox7 = myGroup1.add("checkbox", undefined, '07.   After resizing the KV image frame opened up to trim and bleed');

    var checkbox8 = myGroup1.add("checkbox", undefined, '08.   Fill Magenta if there is inadequate image');

    var checkbox9 = myGroup1.add("checkbox", undefined, '09.   Ensure the document has bleed, crop, gutter and slug information marks');

    var checkbox10 = myGroup1.add("checkbox", undefined, '10.   Ensure the final artwork is updated in the Server');

    var checkbox11 = myGroup1.add("checkbox", undefined, '11.   Enter time in CMD');

    var myGroup2 = w.add('panel', undefined, 'Operator Name');

    var myText2 = myGroup2.add("edittext", undefined, "");

    myText2.characters = 25;

    myGroup2.orientation = 'left';

    var buttons = w.add("group");

    buttons.add('button', undefined, 'Export to PDF', {name: 'ok'});

    buttons.add('button', undefined, 'Cancel');

    w.show();

    //~ group();

    //~ if (myGroup1.alignChildren.value != true) {

    //~     alert('yes')

    //~ }

    myDoc = app.activeDocument;

    w = [];

    // DESCRIPTION: Make a TXT file

    myDoc = app.activeDocument;

    log1 = makeLogFile(app.activeDocument.name.split('.')[0], myDoc, true);

    log(log1, app.activeDocument.name);

    //~ log2 = makeLogFile("test", myDoc, false);

    //~ log(log2, "Base 2 text file log");

    log1.execute();

    //~ log2.execute();

    function makeLogFile(aName, aDoc, deleteIt) {

        var logLoc; // path to the folder that will contain the log file

        try {

            logLoc = aDoc.filePath;

        } catch (e) {

            logLoc = getmyDoc().parent.fsName

        }

        var aFile = File(logLoc + "/" + aName + ".txt");

        if (deleteIt) {

            aFile.remove();

            return aFile;

        }

        var n = 1;

        while (aFile.exists) {

            aFile = File(logLoc + "/" + aName + String(n) + ".txt");

            n++

        }

        return aFile

    }

    function getScriptPath() {

        try {

            return app.activeScript;

        } catch (e) {

            return File(e.fileName);

        }

    }

    function log(aFile, message) {

        var today = new Date();

        if (!aFile.exists) {

            // make the new log file

            aFile.open("w");

            aFile.write(String(today) + "\n");

            aFile.close();

        }

    }

    function log(aFile, message) {

        var text = w;

        if (!aFile.exists) {

            // make the new log file

            aFile.open("w");

            aFile.write(message + "\n" + "\n" + String(w) + "\n");

            aFile.close();

        }

        //~ aFile.open("e");

        //~ aFile.seek(0, 2);

        //~ aFile.write("\n" + message);

        //~ aFile.close();

    }

    myDoc.close(SaveOptions.NO);

    Thanks in advance

    Steve

    Hi Steve,

    There are some errors in your code.

    1. Function 'getmyDoc' is used but never created.
    2. Function 'getScriptPath' is created but not used. (In any case, this will not give you an error.)
    3. Function 'log' is defined twice with the same number of parameters.

    etc...

    Here, I have modified your code. Try this.

    var w = new Window ("dialog");
    var myGroup1 = w.add('panel', undefined, 'P&&G Check List');
    myGroup1.alignChildren = "left";
    var checkbox1 = myGroup1.add ("checkbox", undefined, "  01.  Check the ratio, sizes against job ticket and slug information");
    var checkbox2 = myGroup1.add ("checkbox", undefined, "  02.  Check images are linked");
    var checkbox3 = myGroup1.add ("checkbox", undefined, "  03.  Visually check the progression of KV/Model/CP images");
    var checkbox4 = myGroup1.add ("checkbox", undefined, "  04.  Visually check the progression of other elements like Logo and Bottle");
    var checkbox5 = myGroup1.add ("checkbox", undefined, "  05.  Check the placement of Language Tagging");
    var checkbox6 = myGroup1.add ("checkbox", undefined, "  06.  Ensure that all measurements are calculated based on Live area");
    var checkbox7 = myGroup1.add ("checkbox", undefined, "  07.  After resizing the KV image frame opened up to trim and bleed");
    var checkbox8 = myGroup1.add ("checkbox", undefined, "  08.  Fill Magenta if there is inadequate image");
    var checkbox9 = myGroup1.add ("checkbox", undefined, "  09.  Ensure the document has bleed, crop marks, gutter marks and slug information");
    var checkbox10 = myGroup1.add ("checkbox", undefined, "  10.  Ensure the final artwork is updated in the Server");
    var checkbox11 = myGroup1.add ("checkbox", undefined, "  11.  Enter time in CMD");
    var myGroup2 = w.add('panel', undefined, ' Operator Name');
    var myText2 = myGroup2.add("edittext", undefined, "");
    myText2.characters = 25;
    myGroup2.orientation = "left";
    var buttons = w.add ("group");
    buttons.add ("button", undefined, "Export PDF", {name: "ok"});
    buttons.add ("button", undefined, "Cancel");
    w.show ();
    myDoc = app.activeDocument;
    log1 = makeLogFile(app.activeDocument.name.split('.')[0], myDoc, true);
    log(log1, app.activeDocument.name);
    log1.execute();
    function makeLogFile(aName, aDoc, deleteIt)
    {
        var logLoc = "";
        try
        {
            logLoc = aDoc.filePath;
            } catch (e) {}
        var aFile = File(logLoc + "/" + aName + ".txt");
        var n = 1;
        while (aFile.exists)
        {
            aFile = File(logLoc + "/" + aName + String(n) + ".txt");
            n++;
            }
        return aFile
        }
    function log(aFile, message)
    {
        var text = w;
        var rep = "";
        if (!aFile.exists)
        {
            aFile.open("w");
            var today = new Date();
            rep += String(today) + "\n";
            rep += message + "\n" + "\n\n";
        for (var i = 0; i < myGroup1.children.length; i++)
        {
            // write one "is checked" / "is not checked" line per checkbox
            rep += myGroup1.children[i].text + (myGroup1.children[i].value ? " is checked" : " is not checked") + "\n";
            }
        rep += "\nOperator Name : " + myText2.text;
        aFile.write(rep);
        aFile.close();
        }
    }

    Kind regards

    Cognet

  • Cleaner thread prematurely deleting log file

    Dear community,

    I'd like to post a stack trace. It seems that the cleaner thread removed a log file that was still needed.

    We run JE 4.0.71 on Solaris 10, x86, 64-bit Java HotSpot (1.6u20), ZFS.

    The database environment had to be closed with this exception:
    com.sleepycat.je.EnvironmentFailureException: (JE 4.0.71) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.0.71) /klikit/database fetchTarget of 0x2c3/0x564bb4 parent IN=29622582 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2dd/0x38ae2f parent.getDirty()=true state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.

    The root cause was:
    Caused by: java.io.FileNotFoundException: /klikit/database/000002c3.jdb (No such file or directory)

    Here's the complete stack trace:

    [#|2011-02-02T03:53:14.179+0100|SEVERE|sun-appserver2.1|javax.enterprise.system.container.web|_ThreadID=17;_ThreadName=httpWorkerThread-80-1;_RequestID=df4c9c76-F069-4098-BE75-566b9f216008;|StandardWrapperValve[Dispatcher]: PWC1406: Servlet.service() for servlet Dispatcher threw exception
    org.apache.velocity.exception.MethodInvocationException: Invocation of method 'getSportResultsForAssociation' in class lu.luggel.web.model.ContentManager threw exception lu.kpmg.core.db.store.StoreException: com.sleepycat.je.EnvironmentFailureException: (JE 4.0.71) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.0.71) /klikit/database fetchTarget of 0x2c3/0x564bb4 parent IN=29622582 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2dd/0x38ae2f parent.getDirty()=true state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed. at commune/sportResults/sportResultsForAssociation.html[line 3, column 27]
    at org.apache.velocity.runtime.parser.node.ASTMethod.handleInvocationException(ASTMethod.java:337)
    at org.apache.velocity.runtime.parser.node.ASTMethod.execute(ASTMethod.java:284)
    at org.apache.velocity.runtime.parser.node.ASTReference.execute(ASTReference.java:252)
    at org.apache.velocity.runtime.parser.node.ASTReference.value(ASTReference.java:493)
    at org.apache.velocity.runtime.parser.node.ASTExpression.value(ASTExpression.java:71)
    at org.apache.velocity.runtime.parser.node.ASTSetDirective.render(ASTSetDirective.java:142)
    at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:336)
    at org.apache.velocity.runtime.directive.Parse.render(Parse.java:260)
    at org.apache.velocity.runtime.parser.node.ASTDirective.render(ASTDirective.java:175)
    at org.apache.velocity.runtime.parser.node.SimpleNode.render(SimpleNode.java:336)
    at org.apache.velocity.Template.merge(Template.java:328)
    at org.apache.velocity.Template.merge(Template.java:235)
    at org.springframework.web.servlet.view.velocity.VelocityLayoutView.renderScreenContent(VelocityLayoutView.java:180)
    at org.springframework.web.servlet.view.velocity.VelocityLayoutView.doRender(VelocityLayoutView.java:150)
    at org.springframework.web.servlet.view.velocity.VelocityView.renderMergedTemplateModel(VelocityView.java:291)
    at org.springframework.web.servlet.view.AbstractTemplateView.renderMergedOutputModel(AbstractTemplateView.java:167)
    at org.springframework.web.servlet.view.AbstractView.render(AbstractView.java:250)
    at org.springframework.web.servlet.DispatcherServlet.render(DispatcherServlet.java:1060)
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:798)
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:716)
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:647)
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:552)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:734)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:847)
    at org.apache.catalina.core.ApplicationFilterChain.servletService(ApplicationFilterChain.java:427)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:333)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
    at com.sun.appserv.web.cache.filter.CachingFilter.doFilter(CachingFilter.java:291)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88)
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:246)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:214)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:313)
    at org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:287)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:218)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:648)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:593)
    at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:94)
    at com.sun.enterprise.web.PESessionLockingStandardPipeline.invoke(PESessionLockingStandardPipeline.java:98)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:222)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:648)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:593)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:587)
    at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:1093)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:166)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:648)
    at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:593)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:587)
    at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:1093)
    at org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:291)
    at com.sun.enterprise.web.connector.grizzly.DefaultProcessorTask.invokeAdapter(DefaultProcessorTask.java:666)
    at com.sun.enterprise.web.connector.grizzly.DefaultProcessorTask.doProcess(DefaultProcessorTask.java:597)
    at com.sun.enterprise.web.connector.grizzly.DefaultProcessorTask.process(DefaultProcessorTask.java:872)
    at com.sun.enterprise.web.connector.grizzly.DefaultReadTask.executeProcessorTask(DefaultReadTask.java:341)
    at com.sun.enterprise.web.connector.grizzly.DefaultReadTask.doTask(DefaultReadTask.java:263)
    at com.sun.enterprise.web.connector.grizzly.DefaultReadTask.doTask(DefaultReadTask.java:214)
    at com.sun.enterprise.web.connector.grizzly.TaskBase.run(TaskBase.java:264)
    at com.sun.enterprise.web.connector.grizzly.WorkerThreadImpl.run(WorkerThreadImpl.java:117)
    Caused by: lu.kpmg.core.db.store.StoreException: com.sleepycat.je.EnvironmentFailureException: (JE 4.0.71) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.0.71) /klikit/database fetchTarget of 0x2c3/0x564bb4 parent IN=29622582 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2dd/0x38ae2f parent.getDirty()=true state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at lu.kpmg.core.db.store.Store.execute(Store.java:230)
    at lu.kpmg.core.db.store.RecordManager.forEach(RecordManager.java:164)
    at lu.kpmg.core.db.store.RecordManager.queryRange(RecordManager.java:158)
    at lu.kpmg.core.db.store.RecordManager.queryAll(RecordManager.java:147)
    at lu.luggel.web.model.ContentManager.getSportResultsForAssociation(ContentManager.java:436)
    at sun.reflect.GeneratedMethodAccessor160.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.velocity.util.introspection.UberspectImpl$VelMethodImpl.doInvoke(UberspectImpl.java:389)
    at org.apache.velocity.util.introspection.UberspectImpl$VelMethodImpl.invoke(UberspectImpl.java:378)
    at org.apache.velocity.runtime.parser.node.ASTMethod.execute(ASTMethod.java:270)
    ... 58 more
    Caused by: com.sleepycat.je.EnvironmentFailureException: (JE 4.0.71) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.0.71) /klikit/database fetchTarget of 0x2c3/0x564bb4 parent IN=29622582 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2dd/0x38ae2f parent.getDirty()=true state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.EnvironmentFailureException.wrapSelf(EnvironmentFailureException.java:197)
    at com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:1392)
    at com.sleepycat.je.dbi.CursorImpl.checkEnv(CursorImpl.java:2813)
    at com.sleepycat.je.Cursor.checkEnv(Cursor.java:2846)
    at com.sleepycat.je.Cursor.close(Cursor.java:439)
    at lu.kpmg.core.db.store.Store.execute(Store.java:226)
    ... 68 more
    Caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.0.71) /klikit/database fetchTarget of 0x2c3/0x564bb4 parent IN=29622582 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2dd/0x38ae2f parent.getDirty()=true state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1241)
    at com.sleepycat.je.tree.BIN.fetchTarget(BIN.java:1300)
    at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2362)
    at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2389)
    at com.sleepycat.je.dbi.CursorImpl.getCurrentAlreadyLatched(CursorImpl.java:1410)
    at com.sleepycat.je.Cursor.searchInternal(Cursor.java:2174)
    at com.sleepycat.je.Cursor.searchAllowPhantoms(Cursor.java:2058)
    at com.sleepycat.je.Cursor.search(Cursor.java:1926)
    at com.sleepycat.je.SecondaryCursor.search(SecondaryCursor.java:1364)
    at com.sleepycat.je.SecondaryCursor.getSearchKeyRange(SecondaryCursor.java:1176)
    at lu.kpmg.core.db.store.Store.execute(Store.java:197)
    ... 68 more
    Caused by: java.io.FileNotFoundException: /klikit/database/000002c3.jdb (No such file or directory)
    at java.io.RandomAccessFile.open(Native Method)
    at java.io.RandomAccessFile.<init>(RandomAccessFile.java:212)
    at java.io.RandomAccessFile.<init>(RandomAccessFile.java:98)
    at com.sleepycat.je.log.FileManager$1.<init>(FileManager.java:992)
    at com.sleepycat.je.log.FileManager.openFileHandle(FileManager.java:991)
    at com.sleepycat.je.log.FileManager.getFileHandle(FileManager.java:887)
    at com.sleepycat.je.log.LogManager.getLogSource(LogManager.java:1073)
    at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:779)
    at com.sleepycat.je.log.LogManager.getLogEntryAllowInvisibleAtRecovery(LogManager.java:743)
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1225)
    ... 78 more



    The exception has occurred several times a day. The root cause was always the missing file "000002c3.jdb".

    When we run the DbVerify tool on this database, it fails with the following exception:

    Key tree verification
    Error encountered (continuing):
    com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    Error key 592d83a084 Y-\83\a0\84
    Error data 5a4e3398 ZN3\98
    Error encountered (continuing):
    com.sleepycat.je.EnvironmentFailureException: (JE 4.1.7) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed. fetchTarget of 0x2de/0x78b409 parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0
    Error key 592d83a084 Y-\83\a0\84
    Error data 5a4e341f ZN4\1f
    Error encountered (continuing):
    com.sleepycat.je.EnvironmentFailureException: (JE 4.1.7) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed. fetchTarget of 0x2de/0x7a568e parent IN=29622781 IN class=com.sleepycat.je.tree.DIN lastFullVersion=0x2de/0x8cce1a parent.getDirty()=false state=0
    Error key 592d87a06842fed4e17b2afa Y-\87\a0hB\fe\d4\e1{*\fa
    Error data UNKNOWN
    Error encountered (continuing):
    com.sleepycat.je.EnvironmentFailureException: (JE 4.1.7) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed. fetchTarget of 0x2de/0x79dae8 parent IN=16651289 IN class=com.sleepycat.je.tree.BIN lastFullVersion=0x2de/0x91e179 parent.getDirty()=false state=0
    Error key 592d88a06842fed362b24baa Y-\88\a0hB\fe\d3b\b2K\aa
    Error data UNKNOWN
    Error encountered (continuing):
    ...
    ...
    com.sleepycat.je.EnvironmentFailureException: (JE 4.1.7) Environment must be closed, caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.EnvironmentFailureException.wrapSelf(EnvironmentFailureException.java:196)
    at com.sleepycat.je.dbi.EnvironmentImpl.checkIfInvalid(EnvironmentImpl.java:1439)
    at com.sleepycat.je.Database.checkEnv(Database.java:1778)
    at com.sleepycat.je.Database.closeInternal(Database.java:377)
    at com.sleepycat.je.Database.close(Database.java:314)
    at com.sleepycat.je.util.DbVerify.verify(DbVerify.java:293)
    at com.sleepycat.je.util.DbVerify.main(DbVerify.java:98)
    Caused by: com.sleepycat.je.EnvironmentFailureException: Environment invalid because of previous exception: (JE 4.1.7) d:\transfer\database_FileNotFound fetchTarget of 0x2c3/0x4d677e parent IN=29622782 IN class=com.sleepycat.je.tree.DBIN lastFullVersion=0x2de/0x8ccc00 parent.getDirty()=false state=0 LOG_FILE_NOT_FOUND: Log file missing, log is likely invalid. Environment is invalid and must be closed.
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1337)
    at com.sleepycat.je.tree.BIN.fetchTarget(BIN.java:1367)
    at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2499)
    at com.sleepycat.je.dbi.CursorImpl.fetchCurrent(CursorImpl.java:2526)
    at com.sleepycat.je.dbi.CursorImpl.getCurrentAlreadyLatched(CursorImpl.java:1545)
    at com.sleepycat.je.dbi.CursorImpl.getNextWithKeyChangeStatus(CursorImpl.java:1692)
    at com.sleepycat.je.dbi.CursorImpl.getNext(CursorImpl.java:1617)
    at com.sleepycat.je.dbi.DatabaseImpl.walkDatabaseTree(DatabaseImpl.java:1473)
    at com.sleepycat.je.dbi.DatabaseImpl.verify(DatabaseImpl.java:1420)
    at com.sleepycat.je.util.DbVerify.verifyOneDbImpl(DbVerify.java:366)
    at com.sleepycat.je.util.DbVerify.verify(DbVerify.java:285)
    ... 1 more
    Caused by: java.io.FileNotFoundException: d:\transfer\database_FileNotFound\000002c3.jdb (The system cannot find the file specified)
    at java.io.RandomAccessFile.open(Native Method)
    at java.io.RandomAccessFile.<init>(Unknown Source)
    at java.io.RandomAccessFile.<init>(Unknown Source)
    at com.sleepycat.je.log.FileManager$1.<init>(FileManager.java:995)
    at com.sleepycat.je.log.FileManager.openFileHandle(FileManager.java:994)
    at com.sleepycat.je.log.FileManager.getFileHandle(FileManager.java:890)
    at com.sleepycat.je.log.LogManager.getLogSource(LogManager.java:1074)
    at com.sleepycat.je.log.LogManager.getLogEntry(LogManager.java:778)
    at com.sleepycat.je.log.LogManager.getLogEntryAllowInvisibleAtRecovery(LogManager.java:742)
    at com.sleepycat.je.tree.IN.fetchTarget(IN.java:1320)
    ... 11 more
    Exit code = false



    It seems that the cleaner thread had deleted the file "000002c3.jdb" while it was still needed.
    As the database runs with the parameter "je.cleaner.expunge=false", the file was kept as "000002c3.del".

    So we renamed this file to "000002c3.jdb".
    Running the DbVerify tool on this database then succeeded (exit code = true).
    The "FileNotFoundException" mentioned above no longer occurred either.
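    As a quick sketch of that recovery step in shell (paths and file name are from our environment; this assumes the JE environment is shut down and that "je.cleaner.expunge=false" preserved the cleaned file as ".del"):

```shell
#!/bin/sh
# Hypothetical sketch: restore a JE log file that the cleaner renamed to .del
# (possible only because je.cleaner.expunge=false kept the file on disk).
# Run this only while the JE environment is closed.
restore_del() {
    dir=$1    # JE environment directory, e.g. /klikit/database
    name=$2   # log file number, e.g. 000002c3
    if [ -f "$dir/$name.del" ] && [ ! -f "$dir/$name.jdb" ]; then
        mv "$dir/$name.del" "$dir/$name.jdb" && echo "restored $name.jdb"
    else
        echo "nothing to restore for $name"
    fi
}

# In our case: restore_del /klikit/database 000002c3
```

    After the rename, re-running DbVerify shows whether the environment is readable again.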


    There is another piece of information that may be useful.
    The DbVerify run on the database with the missing file reported a record with "Error key 592d83a084".

    Before the "FileNotFoundException" appeared, we had found a record in the database with a corrupted secondary index. We found that
    the key 592d83a084 points to that record.
    Deleting or updating this record fails with the following exception:

    Caused by: com.sleepycat.je.SecondaryIntegrityException: (JE 4.0.71) Secondary is corrupt: the primary record contains a key that is not present in the secondary
    at com.sleepycat.je.SecondaryDatabase.deleteKey(SecondaryDatabase.java:937)
    at com.sleepycat.je.SecondaryDatabase.updateSecondary(SecondaryDatabase.java:900)
    at com.sleepycat.je.SecondaryTrigger.databaseUpdated(SecondaryTrigger.java:42)
    at com.sleepycat.je.Database.notifyTriggers(Database.java:2004)
    at com.sleepycat.je.Cursor.putNotify(Cursor.java:1692)
    at com.sleepycat.je.Cursor.putInternal(Cursor.java:1616)
    at com.sleepycat.je.Database.putInternal(Database.java:1178)
    at com.sleepycat.je.Database.put(Database.java:1050)
    at lu.kpmg.core.db.store.Store.store(Store.java:325)
    ... 51 more

    I hope this helps in finding the root cause of the problem and fixing it.

    Holger
  • Parsing & analyzing log files

    Afternoon,

    Once again, thank you in advance for looking over my post.

    Here's what I'm working on at the moment: an e-mail monitor that runs via a scheduled task on the hour, every hour. It is, of course, in ColdFusion.

    What I have to do is track sent e-mails. The only record of an e-mail message's status is in a daily log on the e-mail server. The log file can be anywhere from 20 KB to 120 MB. The format of the log file itself is somewhat variable, depending on what stage of the e-mail process is being logged.

    The file is saved as sysMMDD.txt, and we have a process that runs every 20 minutes to check the size of the log file for the current date. If it is greater than 10 MB, we rename it sysMMDD_1.txt. This is really irrelevant to my question, but I wanted to provide all the information.
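    That size check could be sketched roughly as follows (the names and the 10 MB threshold are taken from the description above; the script itself is only an illustration, not our actual process):

```shell
#!/bin/sh
# Illustrative sketch: rename the current day's mail log once it passes 10 MB
# (sysMMDD.txt -> sysMMDD_1.txt), then start a fresh file.
rotate_if_big() {
    log=$1
    limit=$((10 * 1024 * 1024))      # the 10 MB threshold mentioned above
    size=$(wc -c < "$log")
    if [ "$size" -gt "$limit" ]; then
        mv "$log" "${log%.txt}_1.txt"
        : > "$log"                   # recreate an empty current log
        echo rotated
    else
        echo ok
    fi
}

# Example: rotate_if_big "sys$(date +%m%d).txt"
```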

    Getting back to the actual format of the log, it looks like this:

    HH:MM:SS:MS TYPE (HASH) [IP ADDRESS] etc.

    Type = the type of e-mail or service called

    Hash = a unique hash code for the e-mail, used to link the log steps together

    etc. is all the text after the [IP ADDRESS]; it has no set structure and varies based on the stage.

    The monitor needs to catch all sends in this log file within a one-hour window. Don't forget, the log could contain up to a day's worth of data. As it stands now, I am able to count sends by searching for the number of times "ldeliver" appears in the log.

    Does anyone have suggestions for parsing a log like this? I fear that the way I do it now, which is a hack, is not good enough, and there is probably a better way to do it.

    Basically, right now I use a cfloop with index="line" to go through the file. You can imagine how that performs with large log files, which is why we created the scheduled task mentioned above to rename the log files. Now, if I start adding time extraction as well, I'm sure this process is going to fall over.

    I know this post is scattered, but it's just one of those days where everything seems to happen at once. Does anyone have other ideas for going about this process? Someone suggested an ODBC data source over the text file, but will that work when it is only space-delimited and only the first four pieces have a reliable format?

    Any help is appreciated!

    Sorry, yes.  I didn't see you mention that another application generates the log.

    Looping through the file line by line doesn't really add too much resource overhead.  It does not need to load the entire file into RAM; it reads each line in turn.  I tried looping through a 1 GB file on a CF instance with only 512 MB of RAM allocated to it, and it churned away processing quite a few lines per millisecond and never broke a sweat.  It took about 7 minutes to process 1 million lines and never consumed more than a marginal amount of memory.

    Do you really know that doing it this way will hurt you?  It doesn't look like the kind of process that must be lightning fast: it's a background process, is it not?

    I suppose if you were concerned about it, you could pump the file through grep at the file-system level first to extract the lines you want, and then process that much smaller file.  The file system should handle files of this size quite quickly and efficiently.
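
    For example, the pre-filter could look like this (the "ldeliver" marker comes from the earlier post; the file names are made up for illustration):

```shell
#!/bin/sh
# Sketch: filter the big mail log down to just the delivery lines before
# handing it to ColdFusion, and count sends with grep instead of cfloop.
count_delivers() {
    # prints the number of lines containing "ldeliver" (0 if none)
    grep -c 'ldeliver' "$1" 2>/dev/null || true
}

# Typical use (hypothetical file names):
#   grep 'ldeliver' sys0412.txt > delivers_only.txt  # small file for CF to loop over
#   count_delivers sys0412.txt                       # quick send count
```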

    I wouldn't bother trying to put this stuff into a DB and then processing it: it would probably be more work than the loop over the file that you have now.

    --

    Adam

  • shell script for oracle alert.log file

    Hi gurus,

    I want to write a shell script to find the timings of the last 10 shutdowns of the database from the alert log file. I'm working on Oracle 9i.

    Could someone please advise on this?

    Thanks in advance

    Kind regards
    Shaan

    Edited by: Shaan_dmp on January 5, 2009 13:27

    Edited by: Shaan_dmp on January 5, 2009 13:28

    Use awk. I don't have a 9i at hand, but here is a very simple version for 10g XE.

    My awk file (the line numbers are for the notes below - don't include them):

    01: BEGIN { prevline = ""; }
    02:
    03: /Completed: alter database close/ { print prevline, FS, $0; }
    04:
    05: { prevline = $0; }
    

    The command line and the results (from my 300K alert log):

    $ awk -f alert.awk.txt alert_xe.log
    Fri Apr 11 18:08:40 2008   Completed: alter database close normal
    Fri May 16 18:53:21 2008   Completed: alter database close normal
    Tue May 20 17:28:23 2008   Completed: alter database close normal
    Thu Jul 17 19:08:52 2008   Completed: alter database close normal
    Fri Aug 15 15:12:48 2008   Completed: alter database close normal
    Wed Nov 05 08:52:59 2008   Completed: alter database close normal
    Fri Nov 14 16:36:03 2008   Completed: alter database close normal
    Tue Dec 09 10:46:23 2008   Completed: alter database close normal
    Mon Jan 05 11:12:22 2009   Completed: alter database close normal
    

    What it means:

    (1) the BEGIN section on line 01 defines the variable that holds the previous line
    (2) the /search string/ on line 03 looks for the shutdown marker in the file, then performs the requested action (print the time that was on the previous line, then this line), using FS (the awk field separator - normally a space) as a separator
    (3) line 05 is an action performed on every line - the line is remembered in case it holds the timestamp for the shutdown.

    Now, you can cover various corner cases for shutdowns by adding more search patterns, etc. For more information, google for awk examples.

    AWK is really good at this sort of thing!

    HTH

    Regards,
    Nigel
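
    To limit the output to the last 10 shutdowns Shaan asked about, the awk output can simply be piped through tail. A minimal sketch (it prints only the timestamp line preceding each "Completed: alter database close" marker, rather than both lines as above):

```shell
#!/bin/sh
# Sketch: last 10 shutdown timestamps from an Oracle alert log, using the
# same "Completed: alter database close" marker as the awk script above.
last_shutdowns() {
    awk '/Completed: alter database close/ { print prev }
         { prev = $0 }' "$1" | tail -10
}

# Example: last_shutdowns alert_xe.log
```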

  • Log files filling up quickly

    I noticed that whenever I mount an smbfs share, wireless stops working after a few minutes and must be reset. The log files seem to fill up quickly.
    The daemon and sys logs contain similar lines.
    Jan 2 21:51:34 TOSHIBA-user NetworkManager: supplicant changed state: 0
    Jan 2 21:51:40 TOSHIBA-user NetworkManager: supplicant changed state: 1
    Jan 2 21:53:48 TOSHIBA-user NetworkManager: supplicant changed state: 0
    Jan 2 21:53:54 TOSHIBA-user NetworkManager: supplicant changed state: 1
    Jan 2 21:56:03 TOSHIBA-user NetworkManager: supplicant changed state: 0
    Jan 2 21:56:09 TOSHIBA-user NetworkManager: supplicant changed state: 1
    Jan 2 21:58:10 TOSHIBA-user gdmgreeter[21619]: Gtk-CRITICAL: gtk_tree_view_get_selection: assertion 'GTK_IS_TREE_VIEW (tree_view)' failed
    Jan 2 21:58:10 TOSHIBA-user gdmgreeter[21619]: Gtk-CRITICAL: gtk_tree_selection_unselect_all: assertion 'GTK_IS_TREE_SELECTION (selection)' failed
    Jan 2 21:58:10 TOSHIBA-user gdmgreeter[21619]: Gtk-CRITICAL: gtk_tree_selection_select_iter: assertion 'GTK_IS_TREE_SELECTION (selection)' failed
    Jan 2 21:58:10 TOSHIBA-user gdmgreeter[21619]: Gtk-CRITICAL: gtk_tree_view_scroll_to_cell: assertion 'GTK_IS_TREE_VIEW (tree_view)' failed
    Jan 2 21:58:17 TOSHIBA-user NetworkManager: supplicant changed state: 0
    Jan 2 21:58:23 TOSHIBA-user NetworkManager: supplicant changed state: 1
    Jan 2 21:58:41 TOSHIBA-user NetworkManager: updating allowed wireless network lists.
    Jan 2 22:00:31 TOSHIBA-user NetworkManager: supplicant changed state: 0
    Jan 2 22:00:37 TOSHIBA-user NetworkManager: supplicant changed state: 1
    Jan 2 22:02:45 TOSHIBA-user NetworkManager: supplicant changed state: 0

    This could have an impact on another problem already reported.

    Any ideas?

    malcolli

    Also, after reading and checking the logs further, it seems this may be two separate faults:
    NetworkManager and the GNOME Display Manager.

    Now, to learn more.

    malcolli

  • KB954430 continues to appear in Windows Update; error message about a missing log file during manual re-install

    I have followed several other discussions about being unable to install this update, but without success.

    My problem started when I did a system restore to solve a problem with a newly installed program.

    Under update history, Windows Update believes this update has been successfully installed several times.

    The 954430.log log file is not present in my windows directory.

    The update is not present in the list of installed updates.

    I moved msxml4.dll and msxml4r.dll into the system32 directory.  When I try to reinstall KB954430 manually, the following error message is received: "Error opening installation log file.  Verify that the specified log file location exists and is writable."

    Thanks in advance for your help.

    Thanks for attempting to help, Umesh P.  It was resolved before your answer.
    Something I did solved the problem, though I'm not 100% sure of the exact solution.

    So, I found some of those random-number folders containing KB954430 log files in the root directory of my swap-file partition.  I deleted them.

    I tried to reinstall manually; it still gave me the same "log file" error described in my post above.  Then I tried to install it via the Automatic Updates service, and it stated that the update was not needed.  The update kept bugging me about it, but finally, after a reboot or two, I noticed it had stopped asking.
    My computer seems to work now, as far as I know, although the patch does not appear in my list of installed updates, and the msxml DLLs never came back.  Is there a chance the patch was never installed and it just gave up?
  • find the entries in the log file for sfc /scannow in XP

    How do I analyze the log file entries generated by the Microsoft Windows Resource Checker (SFC.exe) program in Windows XP?

    What does "[SR]" mean in the findstr /c:"[SR]" %windir%\logs\cbs\cbs.log > sfcdetails.txt command entered in the cmd window to search the logs?

    What are you trying to do?

    If you read that article about Vista (KB 928228) - which does not apply to XP - what the command does is use the fact that "each SFC.exe program entry in this file has an [SR] tag" to extract the entries related to SFC.exe from the Component-Based Servicing (CBS) log, which contains other entries as well.

    If you're asking what the letters SR actually stand for in this context, you'll have to search for that yourself, but I think documentation on this subject is buried quite deeply at Microsoft.
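
    As an aside, the filtering trick itself is generic: findstr /c:"[SR]" just extracts lines containing the literal string [SR]. A rough grep equivalent (file names are hypothetical) would be:

```shell
#!/bin/sh
# Sketch: pull only the [SR] (SFC.exe) entries out of a CBS-style log,
# mirroring: findstr /c:"[SR]" %windir%\Logs\CBS\CBS.log > sfcdetails.txt
extract_sr() {
    # -F matches the literal string "[SR]"; without it, [SR] would be
    # treated as a regex bracket expression matching a single S or R.
    grep -F '[SR]' "$1"
}

# Example: extract_sr CBS.log > sfcdetails.txt
```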

  • Need help with the following error message: ERROR OPENING WET7CABLE.LOG FILE on my Windows XP laptop

    Please, I need help with an error message on my laptop. The message is as follows: ERROR OPENING WET7CABLE.LOG FILE

    This message came up after running a disc provided with the Belkin Easy Transfer cable (FU279) on my old laptop with Windows XP Home Edition, while trying to transfer my files from my old laptop w/Win XP to a new laptop w/Win 7. The disc is for upgrading Windows XP to Windows 7 and transferring the files.

    I want to thank everyone in advance for your answers.

    Nelson Santiago

    Hi NELSONSANTIAGO,

    1. When exactly do you receive the error message?

    2. Is the Belkin Easy Transfer cable recognized by the Windows XP computer?

    This file may be located on the Belkin Easy Transfer Cable installation disc.

    For more information on how to use or configure the Belkin Easy Transfer cable in Windows XP, see the link below to the manual on the Belkin site and check if it helps.

    http://en-UK-support.Belkin.com/app/product/detail/p/4825
