Log Insight Agent - tracking log files / does it pick up where it left off?

Hello

I want to confirm that the Log Insight agent (Linux in my case) actually 'catches up' after a restart of the liagentd service or a server restart.  It seems to track its current position in the log file in the operational records that the agent keeps.  Please let me know if this is the case, or if my assumption is wrong.

Thank you

Pete Boguszewski

You are right: as soon as the agent first finds a file, it will begin collecting at the next event, will always keep track of where it stopped, and will catch up if anything interrupts collection.
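
Purely as an illustration of that bookkeeping (this is not the agent's actual code; the bookmark file name and log path are made up), the idea looks roughly like this: record how far you have read, and on the next run resume from that saved offset.

// Sketch of the "remember where I stopped" idea, NOT liagentd's real implementation.
var fs = require("fs");

var BOOKMARK = "/var/tmp/example-bookmark.json"; // hypothetical bookmark file

function collect(logPath) {
    // Load the byte offset reached on the previous run, if any.
    var offset = 0;
    if (fs.existsSync(BOOKMARK)) {
        offset = JSON.parse(fs.readFileSync(BOOKMARK, "utf8")).offset || 0;
    }
    var size = fs.statSync(logPath).size;
    if (size <= offset) {
        return; // nothing new since the last run
    }
    // Read only the bytes appended since the saved offset.
    var stream = fs.createReadStream(logPath, { start: offset, encoding: "utf8" });
    stream.on("data", function (chunk) {
        process.stdout.write(chunk); // ship the new events somewhere
    });
    stream.on("end", function () {
        // Persist the new offset so a restart picks up where we left off.
        fs.writeFileSync(BOOKMARK, JSON.stringify({ offset: size }));
    });
}

collect("/var/log/syslog");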

Tags: VMware

Similar Questions

  • How to read a file returned by the file picker

    I'm obviously going to sound stupid here, but for the life of me I can't figure out how to read in a text file returned by the file picker.  I am currently getting the full path of the file from the file picker, but where do I go from there?  BlackBerry.IO.file does not seem to exist in BlackBerry 10 and I'm confused as to how I would use the HTML5 File API with only a file path returned by the file picker.

    Can someone please give me a quick and dirty example showing the rudiments of reading in data from a file path returned by the file picker?

    Thank you very much

    DM

    As long as you have the path to the file, you can go from there. One thing to note: the file MUST be inside your BAR (sandbox), OR you need to add the <permit>access_shared</permit> permission,
    and then the file can live in the shared folder on the device.

    Here is some code that you can use to read a file. I poached it from our docs, found here: https://developer.blackberry.com/html5/apis/blackberry.io.html
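
    Something along these lines (a minimal sketch of the HTML5 FileSystem API approach with blackberry.io.sandbox; the shared-folder path and error handling here are illustrative, so use the docs page above for the exact sample):

    // Sketch: read a text file from a path returned by the file picker (BB10 WebWorks).
    function readTextFile(filePath) {
        // Allow access to paths outside the application sandbox (e.g. the shared folder).
        blackberry.io.sandbox = false;

        function onError(e) {
            console.log("File read failed: " + e.code);
        }

        window.webkitRequestFileSystem(window.TEMPORARY, 1024 * 1024, function (fileSystem) {
            fileSystem.root.getFile(filePath, { create: false }, function (fileEntry) {
                fileEntry.file(function (file) {
                    var reader = new FileReader();
                    reader.onloadend = function () {
                        console.log(this.result); // the file contents as text
                    };
                    reader.readAsText(file);
                }, onError);
            }, onError);
        }, onError);
    }

    // Example: a path handed back by the file picker (illustrative only)
    readTextFile("/accounts/1000/shared/documents/example.txt");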

    
    
  • Trying to collect events from a log file with the agent installed on Linux - need help.

    I modified liagent.ini per the documentation... if I understood it correctly... actually I have changed it so many times my eyes hurt.

    Here it is:

    ; VMware Log Insight Agent configuration. Please save in UTF-8 format if you use non-ASCII names / values!
    ; The actual configuration is this file combined with the server-side settings to form liagent-effective.ini
    ; Note: The agent does not need to be restarted after a configuration change
    ; Note: It may be more convenient to configure the agent from the server's Agents page!

    [server]
    hostname = 192.168.88.89
    ; Hostname or IP address of your Log Insight server / cluster load balancer. Default:
    ; hostname = LOGINSIGHT
    ; Protocol can be cfapi (Log Insight REST API) or syslog. Default:
    proto = cfapi
    ; Log Insight server port to connect to. Default ports for the protocols (all TCP):
    ; syslog: 514; syslog with ssl: 6514; cfapi: 9000; cfapi with ssl: 9543. Default:
    port = 9000
    ; Use SSL. Default:
    ssl = no
    ; Example configuration with a certificate authority:
    ; ssl = yes
    ; ssl_ca_path = /etc/pki/tls/certs/ca.pem
    ; Time in minutes to force reconnection to the server.
    ; This option mitigates imbalances caused by long-lived TCP connections. Default:
    reconnect = 30

    [logging]
    ; Logging verbosity: 0 (no debug messages), 1 (essentials), 2 (verbose with more impact on performance).
    ; This option should always be 0 under normal conditions. Default:
    debug_level = 1

    [storage]
    ; Max local storage usage limit (data + logs) in MBs. Valid range: 100-2000 MB.
    max_disk_buffer = 2000

    ; Uncomment the appropriate section to collect log files
    ; The recommended way is to enable the Linux content pack from the LI server
    [filelog|bro]
    directory = /data/bro/logs/2015-03-04
    ; include = *.log
    parser = auto



    Should I post it here, or should I create a support bundle?


    Post edited by kevinkeeneyjr: added a screenshot of the agent status

    Post edited by kevinkeeneyjr: added liagent.ini

    Ah! Yes, the agent collects events in real time. If no new events are being written, nothing will be collected. If you want to collect logs that were generated in the past, use the Log Insight Importer, which was released with LI 3.3. I hope this helps!

  • View the log files and message tracking

    Just got our ES3300 running. I have looked through the Administrator's Guide and searched a bit; is there another way to view a log of what has happened in the system, other than having to download the log files?

    Message tracking is done in the audit log.

    It is not necessary to download and browse through the logs unless you are trying to troubleshoot a problem.

  • Using the Windows agent to send log files

    Hello

    I really need to understand sending log files using the Windows agent.  I have seen your doc about this and it is a bit light.  The event_marker part is complicated.  The example shows a bunch of control characters and braces.    I need to understand the event_marker parameter.  What in a text file indicates that there must be an event_marker?

    Thanks in advance,

    Michael

    Hey Michael - don't know how I missed this one! The event_marker is a regular expression that is known to match the beginning of a new log message. For example, say I have the following logs:

    2014-03-27 10:29:38,534 [pool-35-thread-1] DEBUG opId = com.vmware.sps.qs.notify.VasaProviderInfoNotifier - polling for VASA provider changes

    2014-03-27 10:29:38,540 [pool-35-thread-1] DEBUG opId = com.vmware.vim.query.client.impl.QueryDispatcherImpl - send request:

    declare default element namespace "urn:vasa";

    declare namespace qs = "urn:vmware:queryservice";

    declare namespace vim25 = "urn:vim25";

    declare namespace xlink = "http://www.w3.org/1999/xlink";

    declare namespace vasaData = "http://data.vasa.vim.vmware.com/xsd";

    declare namespace vasa = "http://com.vmware.vim.vasa/1.5/xsd";

    for $doc in //VendorProviderInfo

    return

    {data($doc/vasa:return/vasaData:uid)}

    2014-03-27 10:29:38,543 [pool-35-thread-1] DEBUG opId = com.vmware.vim.query.client.impl.QueryDispatcherImpl - recorded flow (QueryResponseMonitor): 27227

    You can see how the second message is a multiline message. If you do not specify an event_marker, then the multiline message will be divided into individual events. However, if you specify an event_marker with a regular expression that matches the start of an event, then it will not break the message apart. For the type of message above, I could use an event_marker like:

    event_marker = ^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}

    Which matches: 2014-03-27 10:29:38,534

    The event_marker can be as short or as long as you want, but you must ensure that the regular expression that you specify is guaranteed to match a new event. I hope this helps!
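
    As a quick way to sanity-check a marker, here is a small sketch (plain JavaScript; the sample lines are abbreviated from the log above) showing that this regex matches the timestamped first line of each event but not the continuation lines:

    // Check which lines start a new event according to the event_marker regex.
    var eventMarker = /^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}/;

    var lines = [
        "2014-03-27 10:29:38,534 [pool-35-thread-1] DEBUG ...",   // starts a new event
        "declare namespace qs = \"urn:vmware:queryservice\";",    // continuation line
        "2014-03-27 10:29:38,543 [pool-35-thread-1] DEBUG ..."    // starts a new event
    ];

    lines.forEach(function (line) {
        console.log(eventMarker.test(line) + "  " + line);
    });
    // Prints true for the timestamped lines and false for the continuation line,
    // so the continuation stays attached to the previous event.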

  • Question about displaying which log file matched in the LogFilter agent

    Hello

    The LogFilter agent allows you to have up to 4 different log files (and paths) to match against the strings in the list.

    Is there a way to make a rule that kicks in when the LogFilter has a match - and to have access to which filepath had the match?

    So, for example, if I have:

    /path1/server.log

    /path3/server.log

    and the rule fires when the LogFilter has a match, I would like it to show which of the 2 filepaths contained the match.

    Thank you

    "mark".

    Hi Mark,

    The default LogFilter rule creates an alarm that contains the path of the log file by evaluating the severity-level variable "text".

    This script uses entry.get("LogName") to extract the name of the log file, which is displayed in the alarm Message field and in the alarm dialog box, along with the text that triggered the alarm:

    Kind regards

    Brian Wheeldon

  • Log file that can be used to track the HTML preview of HFR reports

    Hello

    Could you please let me know which log file can be used to track the HTML preview of HFR reports?

    Thank you
    Aparna

    You can check FinancialReporting0.log and FRLogging.log. You can find the same information in
    "http://docs.oracle.com/cd/E17236_01/epm.1112/epm_install_troubleshooting_11121.pdf", page 47.

  • Foglight log file monitoring

    Hi all

    We use the IC 5.6.7 cartridge for our infrastructure monitoring purposes. We have a requirement for Windows log file monitoring so that we can trigger alerts based on specific words, and we no longer have the legacy cartridge.

    Can we monitor logs using the IC cartridge, or is there a separate cartridge available for monitoring log files?

    Please let us know what is required for this.

    Kind regards

    Guenoun

    If it's just normal log file monitoring, you can use the LogFilter agent, which does not require a FglAM on the target computer.

    http://en.community.Dell.com/TechCenter/performance-monitoring/Foglight-administrators/w/Admins-wiki/5646.monitoring-application-availability-using-Foglight-utility-agents

    Best regards

    Golan

  • URL and log file monitoring using Foglight

    Hi all

    We are using Foglight 5.6.4 and running Foglight Agent Manager 5.6.7 with it.

    We are trying to monitor a URL and log files that are hosted on a Windows 2012 machine...

    Do you know whether this will support our requirements (i.e. Windows 2012 monitoring)?

    Could you please share your ideas on this...

    Thank you & best regards,

    Guenoun

    This is going to be an interesting one.

    The article doesn't specify it, but there is an interesting platform question.

    FglAM for Foglight 5.6.7 includes support for Windows 2012:

    http://eDOCS.quest.com/Foglight/567/doc/core/SystemRequirements/platforms.2.php

    But the legacy agents do not have a Windows 2012 bundle.

    Here is what I would try (it is not guaranteed, since we don't have much Windows 2012 coverage):

    I would try the Windows 2003 legacy package (because the Windows 2008 one was missing the WebMonitor agent by mistake) and see if it works. The article below suggests using the Windows 2003 package, but again, we have the unknown of the platform being Windows 2012:

    https://support.quest.com/SolutionDetail.aspx?ID=SOL69379&category=solutions&SKB=1

    If it is a test environment and you can upgrade to Foglight 5.6.10, you may have a bit of an easier time, because we have a NEW WebMonitor agent which is not part of the legacy package (it's also prettier):

    http://eDOCS.quest.com/Foglight/5610/files/CartridgeForWebMonitor_5610_ReleaseNotes.html

    And then I would complement that with the Windows 2008 legacy package to monitor the logs. It is not tested yet and I don't know if it is officially supported, but I suspect it should work (just remember to select 'Show packages for all platforms' when you deploy the package).

    Hope this helps.

    Golan

  • Check WebLogic Console output in the log files

    Hello

    I want to save my WebLogic console messages and System.out.println (SOP) output to a log file so I can analyze the results and exceptions. For example, suppose I have a standalone JVM domain on which a WAR or EAR file is deployed. The output of the print statements and thrown exceptions appears in the black JVM console window, but I want it saved for history and tracking. Does WebLogic Server log this and put it somewhere, or do we need to do it manually? If so, what is the way to do it?

    Regards

    -Magwaa

    Hi Magwaa,

    Take a look at this blog post explaining how to redirect the console output: Fusion Blog: WebLogic: Redirecting console output to a log file

  • Monitor log file for Netbackup

    Hello

    We are looking to monitor NetBackup backup errors in Foglight and I was wondering the best way to go about it.  In essence, the NetBackup software writes to a file (C:\Program Files\Legato\nsr\logs\backup_failure.log) and I want to raise an alert when we get certain messages, such as "Impossible" or "unsuccessful save sets". I tried to use the LogFilter legacy agent, but this doesn't seem to work, or I haven't set it up correctly.  Can anyone help me out?  Also, whatever we use, will we be able to pull in the complete line, or will it just be a generic alarm saying "there is a problem with a NetBackup backup" kind of thing?

    All advice appreciated.

    Thank you

    Davie

    Hey Davie,

    The LogFilter legacy agent is probably the best way to monitor this log file. Here's an example of how to configure the match list:

    For your case, you should be able to enter "Down" and "Unsuccessful save sets" on separate lines in the Match string column, and then map them to the appropriate alarm severities.

    The resulting configuration raises alarms like this:

    The alarm message reports the first 255 characters of the matching log line.

    After setting up and activating the LogFilter agent, check the agent log to verify that it was able to find and read the target log file.

    Kind regards

    Brian Wheeldon

  • How to minimize the log files in CS?

    Hi all

    We have database logs, archive logs and content server logs that are stored on disk. Is there any configuration setting that we can use to minimize the number of files that are stored?

    Also, is there a configuration that allows us to set different absolute paths for these log files, so that folder / space management becomes easier?

    Hello Mohan,

    These are the configuration variables that control the log file size, the directory, and the number of logs to keep:

    The TraceDirectory configuration entry defines which directory you want the trace log files written to.

    TraceDirectory =

    The configuration entry TraceFileSizeLimit sets the size of each log file.  The default value for this setting is 1048576, which is equivalent to 1 megabyte.

    TraceFileSizeLimit = 10000000

    The configuration entry TraceFileCountLimit sets the number of log files that are in the rotation.  The default value for this parameter is 10.

    TraceFileCountLimit =

    Event tracing output can be controlled with a setting similar to the one used for the trace directory.

    EventDirectory =

    Thank you

    Srinath

  • Problems with 'Connection to the target site' - where is the log file?

    Hi all!

    I am deploying vSphere Replication to 2 vCenter servers with 1 Platform Services Controller in a lab environment. I successfully deployed and registered the VRM appliances to both vCenter servers. I also installed the SRM agent on the same vCenter servers as well. Both plugins are appearing in both vCenter servers. When I try to connect to the target site from either vCenter, I get the error below after I click 'OK'. What log file should I look at to determine my problem?

    (Screenshots attached: vrm1.png, vrm2.png)

    I solved this problem today.

    I had both vCenter servers configured to use TCP 8080 for HTTP traffic. I uninstalled/reinstalled vCenter at both ends, accepted the default TCP 80 for HTTP this time, and I was then able to connect to my remote/target vCenter in the connections section.

    What is strange is that the VRM appliances said they used TCP 8080 to register the VRM instance/database with the Platform Services Controller (vCenter) and registered everything just fine. I was also able to perform local replication just fine with TCP 8080 configured.

    My company sometimes flags TCP 80 as a vulnerability and we try to use other ports where possible.

  • Generic Unix connector 11.1.1.7.0 - blank log file

    Hello

    We installed the Generic Unix 11.1.1.7.0 connector for OIM 11.1.1.5.4. The connector works well, but no logging occurs in the log file. After doing the configuration described in the documentation for enabling logging, the log file is generated, but there is no log message inside it, even when trying with incorrect connection details for the target. Exceptions are seen in the server logs, but not in the connector log file.

    Here are the contents of my logging.xml file

    <?xml version="1.0" encoding="UTF-8"?>
    <logging_configuration>
      <log_handlers>

        <log_handler name="console-handler" class="oracle.core.ojdl.logging.ConsoleHandler" formatter="oracle.core.ojdl.weblogic.ConsoleFormatter" level="WARNING:32"/>

        <log_handler name="odl-handler" class="oracle.core.ojdl.logging.ODLHandlerFactory" filter="oracle.dfw.incident.IncidentDetectionLogFilter">
          <property name="path" value="${domain.home}/servers/${weblogic.Name}/logs/${weblogic.Name}-diagnostic.log"/>
          <property name="maxFileSize" value="10485760"/>
          <property name="maxLogSize" value="104857600"/>
          <property name="encoding" value="UTF-8"/>
          <property name="useThreadName" value="true"/>
          <property name="supplementalAttributes" value="J2EE_APP.name,J2EE_MODULE.name,WEBSERVICE.name,WEBSERVICE_PORT.name,composite_instance_id,component_instance_id,composite_name,component_name"/>
        </log_handler>

        <log_handler name="wls-domain" class="oracle.core.ojdl.weblogic.DomainLogHandler" level="WARNING"/>

        <log_handler name="owsm-message-handler" class="oracle.core.ojdl.logging.ODLHandlerFactory">
          <property name="path" value="${domain.home}/servers/${weblogic.Name}/logs/owsm/msglogging"/>
          <property name="maxFileSize" value="10485760"/>
          <property name="maxLogSize" value="104857600"/>
          <property name="encoding" value="UTF-8"/>
          <property name="supplementalAttributes" value="J2EE_APP.name,J2EE_MODULE.name,WEBSERVICE.name,WEBSERVICE_PORT.name"/>
        </log_handler>

        <log_handler name="em-log-handler" level="NOTIFICATION:32" class="oracle.core.ojdl.logging.ODLHandlerFactory" filter="oracle.dfw.incident.IncidentDetectionLogFilter">
          <property name="path" value="${domain.home}/servers/${weblogic.Name}/sysman/log/emoms.log"/>
          <property name="format" value="ODL-Text"/>
          <property name="useThreadName" value="true"/>
          <property name="maxFileSize" value="5242880"/>
          <property name="maxLogSize" value="52428800"/>
          <property name="encoding" value="UTF-8"/>
        </log_handler>

        <log_handler name="em-trc-handler" level="TRACE:32" class="oracle.core.ojdl.logging.ODLHandlerFactory">
          <property name="logreader:" value="off"/>
          <property name="path" value="${domain.home}/servers/${weblogic.Name}/sysman/log/emoms.trc"/>
          <property name="format" value="ODL-Text"/>
          <property name="useThreadName" value="true"/>
          <property name="locale" value="fr"/>
          <property name="maxFileSize" value="5242880"/>
          <property name="maxLogSize" value="52428800"/>
          <property name="encoding" value="UTF-8"/>
        </log_handler>

        <log_handler name="unix-handler" level="NOTIFICATION:1" class="oracle.core.ojdl.logging.ODLHandlerFactory">
          <property name="logreader:" value="off"/>
          <property name="path" value="${domain.home}/servers/${weblogic.Name}/logs/unixConnector.log"/>
          <property name="format" value="ODL-Text"/>
          <property name="useThreadName" value="true"/>
          <property name="locale" value="fr"/>
          <property name="maxFileSize" value="5242880"/>
          <property name="maxLogSize" value="52428800"/>
          <property name="encoding" value="UTF-8"/>
        </log_handler>

      </log_handlers>

      <loggers>

        <logger name="" level="WARNING:1">
          <handler name="odl-handler"/>
          <handler name="wls-domain"/>
          <handler name="console-handler"/>
        </logger>

        <logger name="org.identityconnectors.genericunix" level="NOTIFICATION:1" useParentHandlers="false">
          <handler name="unix-handler"/>
          <handler name="console-handler"/>
        </logger>

        <logger name="oracle.iam.connectors.icfcommon" level="NOTIFICATION:1" useParentHandlers="false">
          <handler name="unix-handler"/>
        </logger>

        <logger name="oracle" level="NOTIFICATION:1"/>

        <logger name="oracle.adf"/>
        <logger name="oracle.adf.desktopintegration"/>
        <logger name="oracle.adf.faces"/>
        <logger name="oracle.adf.controller"/>
        <logger name="oracle.adfinternal"/>
        <logger name="oracle.adfinternal.controller"/>
        <logger name="oracle.jbo"/>
        <logger name="oracle.adfdt"/>
        <logger name="oracle.adfdtinternal"/>

        <logger name="oracle.bam"/>
        <logger name="oracle.bam.adapter"/>
        <logger name="oracle.bam.common"/>
        <logger name="oracle.bam.system"/>
        <logger name="oracle.bam.middleware"/>
        <logger name="oracle.bam.adc.security"/>
        <logger name="oracle.bam.common.security"/>
        <logger name="oracle.bam.adc.ejb.BamAdcServerBean"/>
        <logger name="oracle.bam.reportcache.ejb.ReportCacheServerBean"/>
        <logger name="oracle.bam.eventengine.ejb.EventEngineServerBean"/>
        <logger name="oracle.bam.ems.ejb.EMSServerBean"/>
        <logger name="oracle.bam.adc.api"/>
        <logger name="oracle.bam.adc"/>
        <logger name="oracle.bam.eventengine"/>
        <logger name="oracle.bam.ems"/>
        <logger name="oracle.bam.webservices"/>
        <logger name="oracle.bam.web"/>
        <logger name="oracle.bam.reportcache"/>

        <logger name="oracle.bpm"/>
        <logger name="oracle.bpm.analytics"/>
        <logger name="oracle.integration"/>
        <logger name="oracle.integration.platform.blocks.cluster"/>
        <logger name="oracle.integration.platform.blocks.deploy.coordinator"/>
        <logger name="oracle.integration.platform.blocks.event.saq"/>
        <logger name="oracle.integration.platform.blocks.java"/>
        <logger name="oracle.integration.platform.faultpolicy"/>
        <logger name="oracle.integration.platform.testfwk"/>
        <logger name="oracle.soa"/>
        <logger name="oracle.soa.adapter"/>
        <logger name="oracle.soa.b2b"/>
        <logger name="oracle.soa.b2b.apptransport"/>
        <logger name="oracle.soa.b2b.engine"/>
        <logger name="oracle.soa.b2b.repository"/>
        <logger name="oracle.soa.b2b.transport"/>
        <logger name="oracle.soa.b2b.ui"/>
        <logger name="oracle.soa.bpel"/>
        <logger name="oracle.soa.bpel.console"/>
        <logger name="oracle.soa.bpel.engine"/>
        <logger name="oracle.soa.bpel.engine.activation"/>
        <logger name="oracle.soa.bpel.engine.agents"/>
        <logger name="oracle.soa.bpel.engine.bpel"/>
        <logger name="oracle.soa.bpel.engine.compiler"/>
        <logger name="oracle.soa.bpel.engine.data"/>
        <logger name="oracle.soa.bpel.engine.delivery"/>
        <logger name="oracle.soa.bpel.engine.deployment"/>
        <logger name="oracle.soa.bpel.engine.dispatch"/>
        <logger name="oracle.soa.bpel.engine.sensor"/>
        <logger name="oracle.soa.bpel.engine.translation"/>
        <logger name="oracle.soa.bpel.engine.ws"/>
        <logger name="oracle.soa.bpel.engine.xml"/>
        <logger name="oracle.soa.bpel.entity"/>
        <logger name="oracle.soa.bpel.jpa"/>
        <logger name="oracle.soa.bpel.system"/>
        <logger name="oracle.soa.dvm"/>
        <logger name="oracle.soa.management.facade.api"/>
        <logger name="oracle.soa.mediator"/>
        <logger name="oracle.soa.mediator.common"/>
        <logger name="oracle.soa.mediator.common.cache"/>
        <logger name="oracle.soa.mediator.common.error"/>
        <logger name="oracle.soa.mediator.common.error.recovery"/>
        <logger name="oracle.soa.mediator.common.message"/>
        <logger name="oracle.soa.mediator.dispatch"/>
        <logger name="oracle.soa.mediator.dispatch.resequencer.toplink"/>
        <logger name="oracle.soa.mediator.filter"/>
        <logger name="oracle.soa.mediator.instance"/>
        <logger name="oracle.soa.mediator.management"/>
        <logger name="oracle.soa.mediator.metadata"/>
        <logger name="oracle.soa.mediator.monitor"/>
        <logger name="oracle.soa.mediator.resequencer"/>
        <logger name="oracle.soa.mediator.resequencer.besteffort"/>
        <logger name="oracle.soa.mediator.resequencer.fifo"/>
        <logger name="oracle.soa.mediator.resequencer.standard"/>
        <logger name="oracle.soa.mediator.service"/>
        <logger name="oracle.soa.mediator.serviceEngine"/>
        <logger name="oracle.soa.mediator.transformation"/>
        <logger name="oracle.soa.mediator.utils"/>
        <logger name="oracle.soa.mediator.validation"/>
        <logger name="oracle.soa.scheduler"/>
        <logger name="oracle.soa.services.common"/>
        <logger name="oracle.soa.services.identity"/>
        <logger name="oracle.soa.services.notification"/>
        <logger name="oracle.soa.services.rules"/>
        <logger name="oracle.soa.services.rules.obrtrace"/>
        <logger name="oracle.soa.services.workflow"/>
        <logger name="oracle.soa.services.workflow.common"/>
        <logger name="oracle.soa.services.workflow.evidence"/>
        <logger name="oracle.soa.services.workflow.metadata"/>
        <logger name="oracle.soa.services.workflow.persistency"/>
        <logger name="oracle.soa.services.workflow.query"/>
        <logger name="oracle.soa.services.workflow.report"/>
        <logger name="oracle.soa.services.workflow.runtimeconfig"/>
        <logger name="oracle.soa.services.workflow.soa"/>
        <logger name="oracle.soa.services.workflow.task"/>
        <logger name="oracle.soa.services.workflow.task.dispatch"/>
        <logger name="oracle.soa.services.workflow.task.routing"/>
        <logger name="oracle.soa.services.workflow.user"/>
        <logger name="oracle.soa.services.workflow.verification"/>
        <logger name="oracle.soa.services.workflow.worklist"/>
        <logger name="oracle.soa.services.workflow.performance"/>
        <logger name="oracle.soa.services.cmds"/>
        <logger name="oracle.soa.wsif"/>
        <logger name="oracle.soa.xref"/>

        <logger name="oracle.ucs"/>
        <logger name="oracle.sdp"/>
        <logger name="oracle.sdpinternal"/>
        <logger name="oracle.sdp.messaging"/>
        <logger name="oracle.sdp.messaging.client"/>
        <logger name="oracle.sdp.messaging.driver"/>
        <logger name="oracle.sdp.messaging.engine"/>
        <logger name="oracle.sdp.messaging.parlayx"/>
        <logger name="oracle.sdp.messaging.server"/>

        <logger name="oracle.wsm"/>

        <logger name="oracle.wsm.msg.logging" level="NOTIFICATION:1" useParentHandlers="false">
          <handler name="owsm-message-handler"/>
          <handler name="wls-domain"/>
        </logger>

        <logger name="oracle.sysman" level="NOTIFICATION:32" useParentHandlers="false">
          <handler name="em-log-handler"/>
          <handler name="em-trc-handler"/>
        </logger>

      </loggers>
    </logging_configuration>

    Let me know if I missed any configuration.

    Regards

    Cédric Michel

    This has been resolved. Use Patch 14271576.

  • Location of the FRReportSRV.log file?

    Hello

    We are eager to see a log of when our users run reports.  This log should have the time, report, and username.  I heard that the FRReportSRV.log file would be very useful, but I can't seem to find it.  As far as I can tell, there is no BI or BIplus folder anywhere on our servers.  We have only Financial Reporting and the other BI applications.  Could this be a version problem?  We are on version 11.1.2.2.300.  We had this file in our old 11.1.1 environment, but I can't seem to find it in our new environment.  Is there a different log that I should be looking for?  Any help on this would be welcome.

    Thank you

    Jeff C.

    Hi Jeff,

    Yes, there is a change in the directory structure in 11.1.2.x.  You should find the FR logs under user_projects\domains\EPMSystem\servers\FinancialReporting0\logs.

    FRLogging.log is the log name you should look for instead of FRReportSRV.log.

    Here are the details on the EPM 11.1.2.2 logs: Installation and Configuration Troubleshooting Guide Release 11.1.2.2

    In addition, you can enable user tracking in the Workspace to track FR activity.

    Kind regards

    Santy.
