Scope of data logging

Hello

I'm trying to create a scope that will record a section of data so that it can be consulted at a later date... I use a queue to store the data that I log, but when the section of my code that stores the data in the queue runs, no data ends up there and no error is produced. I really don't see what I'm doing wrong. If someone could have a look at the attached VI (saved for LabVIEW 8.0) and tell me what I'm doing wrong, it would be greatly appreciated!

Thank you, Alec

You are AND-ing the Capture event boolean with a FALSE constant, which will always give you FALSE, so you will never go to the "Stored update" event. Remove this piece of code and it will work as you expect.

The best way to determine which button was pressed would be to use an event structure. In addition, it would be better to have a single Stop button that stops all the loops. See this post for options on how to do that: http://forums.ni.com/t5/LabVIEW/What-is-the-preferred-way-to-stop-multiple-loops/m-p/1035851

Tags: NI Software

Similar Questions

  • Access to ntuser.dat.log in the NETWORK SERVICE folder refused even in safe mode

    I am trying to access ntuser.dat.log in the NETWORK SERVICE folder, even in safe mode, but access is denied.  I am logged in as ADMINISTRATOR.  I assume it would be the same if I logged in as any other user?

    I looked in TASK MANAGER to try to identify which process is blocking this, but I am at a loss.  Does anyone know which hoops I need to jump through to be able to view this file?

    Guessing that you mean this file:

    c:\Documents and Settings\NetworkService

    You cannot open the file because XP is running and the file is in use, and Task Manager will not help you identify which process(es) are blocking your attempt to open the file (that is not what Task Manager is for).

    Sometimes you can copy an open file and then open the copy, but that will not work with ntuser.dat.log, and your efforts to access the file will give results like:

    If you want to open the file, you can log on under a different name and access another user's ntuser.dat.log file that way (since the other user's file will not be in use).

    For example, if I am logged on as user ElderK I can't access my own ntuser.dat.log file, but I can access the file owned by another user such as Jose by looking here:

    c:\Documents and Settings\Jose

    Or, you can boot into something like a Hiren's Boot CD and access the file from there, since your XP will not be running.

    I see no reason to look at the file - ntuser.dat.log is binary data - so maybe you can tell us what you're trying to do and why (or whether you are just practicing).

  • Ntuser.dat.log file size

    Hello

    I found this hidden file, ntuser.dat.log (located in the C:\Document and Settings\MyUsername\ folder), and it keeps changing by itself... (every few seconds/minutes).
    Its size is 1 KB to ~400 KB.
    Is this normal?
    I thought it should only be updated when logging in or out, am I wrong?
    My OS is Windows XP SP3.

    Hello

    NTUSER.DAT is a registry file. Each user's NTUSER.DAT file contains the registry settings for their individual account. The Windows registry is a "central hierarchical database" that contains information about user profiles and the hardware and software on a computer. Windows constantly references its registry files throughout its operation. The "HKEY_CURRENT_USER" branch of the registry is backed by the current user's NTUSER.DAT file.

    Note: this section, method, or task contains steps that tell you how to modify the registry. However, serious problems can occur if you modify the registry incorrectly. Therefore, make sure that you proceed with caution. For added protection, back up the registry before you edit it; then you can restore the registry if a problem occurs. For more information about how to back up and restore the registry, check out the following link:
    http://support.Microsoft.com/kb/322756

  • Are ntuser.dat.LOG files important to include in a backup used to transfer files to a new computer?

    Are ntuser.dat.LOG files important to include in a backup that is used to transfer files to a new computer? When I try to back up or copy the files, I get a message that these files (there are 4 of them) cannot be saved or copied because they are already open and in use on the computer. One file is 256 KB and another is 0 KB. I cannot see the other two.

    Hi MAnnetteFox,

    Thanks for posting the query on Microsoft Community.

    You do not need to back up these files; they are normal files that are generated automatically.

    In the future, if you have problems with Windows, get back to us. We will be happy to help you.

  • Unable to see journalized data in simple CDC?

    Hello

    I want only the changed data to be inserted into the target table.

    To do this:

    I create the model & select JKM Oracle Simple.

    I create the data stores for the source table (with data) & the target table (empty).

    Next, I add the source table to CDC and start the journal.

    Then I insert a row in the source table. After that I check source table -> CDC -> journalized data --> the changed data shows up successfully.


    cdc_err2.png


    I create an interface to insert only the changed data into the target table.

    I drag the journalized table as the source and the required target. On the source datastore, I check the "JOURNALIZED DATA ONLY" option box. For this example I used the IKM SQL update SQL command.

    I checked the source data after selecting the "JOURNALIZED DATA ONLY" option in the properties of the source. I see no journalized data.

    cdc_err1.png

    Please help me,

    Thanks in advance,

    A.Kavya.


    You must define which subscriber the journalized data is consumed with; in this case, 'CONTROLLER'.

  • ORA error renaming data/log files

    Hello

    I wanted to move my data files to a new location, and now my TEMP file has not been moved properly.

    SQL> startup mount
    ORACLE instance started.

    Total System Global Area 4259082240 bytes
    Fixed Size                  2166488 bytes
    Variable Size             922747176 bytes
    Database Buffers         3321888768 bytes
    Redo Buffers               12279808 bytes
    Database mounted.
    SQL> ALTER DATABASE RENAME FILE '/oracleGC/oem11g/oradata/oem11g/temp01.dbf' TO '/oradata/oem11g/data/temp01.dbf';

    Database altered.

    SQL>
    SQL>
    SQL> alter database open;

    Database altered.


    SQL> SELECT name FROM v$datafile;

    NAME
    --------------------------------------------------------------------------------
    /oradata/oem11g/data/System01.dbf
    /oradata/oem11g/data/undotbs01.dbf
    /oradata/oem11g/data/sysaux01.dbf
    /oradata/oem11g/data/users01.dbf
    /oradata/oem11g/data/Mgmt.dbf
    /oradata/oem11g/data/mgmt_ecm_depot1.dbf

    Now, I get the following errors:


    When I try to rename, I get the error below; the dbf exists in both locations.

    SQL> ALTER DATABASE RENAME FILE '/oracleGC/oem11g/oradata/oem11g/temp01.dbf' TO '/oradata/oem11g/data/temp01.dbf';
    ALTER DATABASE RENAME FILE '/oracleGC/oem11g/oradata/oem11g/temp01.dbf' TO '/oradata/oem11g/data/temp01.dbf'
    *
    ERROR at line 1:
    ORA-01511: error in renaming log/data files
    ORA-01516: nonexistent log file, data file, or temporary file
    '/oracleGC/oem11g/oradata/oem11g/temp01.dbf'

    user771256 wrote:
    Yes, it is working now.
    Shouldn't it appear with the following?

    SELECT NAME FROM V$DATAFILE;

    Nope.
    Given that the file you are interested in is a temp file (temporary tablespace) and not a datafile, it will not show up in v$datafile but in v$tempfile.

    Regards
    Anurag
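
    As a quick illustration (a minimal SQL*Plus sketch, not taken from this thread), the two views can be compared directly; the view names below are standard Oracle dictionary views:

    -- permanent datafiles only; temp files never appear here
    SQL> SELECT name FROM v$datafile;

    -- temp files of temporary tablespaces show up here instead
    SQL> SELECT name FROM v$tempfile;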

  • Data logging challenge

    I designed the attached VI to achieve a certain task. I've been working on it since yesterday but decided to take a different approach.

    I want to save data to the log.xls file every five seconds (or whatever the timer specifies), continuously. When the loop counter of the inner loop reaches 2, the inner loop ends. This means that there will always be three records in the log at any time.

    By clicking the button, a 'snapshot' of the log.xls file must be taken, meaning that the entire content of the log.xls file should be stored in the snapshot.xls file. This should happen only once, and the inner loop should continue to run afterwards (i.e. continue to record data in the log.xls file). Please make sure that the snapshot.xls file is not overwritten with new data from the log.xls file.

    In effect, the snapshot file would contain the whole log.xls file as it stands at the point the inner loop ends, once the button is pressed.

    I ran the code several times, but nothing appears in the snapshot.xls file.

    Where should I go from here please?


  • Smaller data logs

    I use SignalExpress 3.5.0, full version, with a cDAQ-9174, a 9205 and a 9217.

    The temperature data are at 1 sample/s.  The voltage channels are at 1000 samples/s.  I use the DC function on the voltage data, which seems to give 10 samples/s.  I also monitor the AC RMS voltage after a high-pass filter.

    The test runs for a day or two, so I would like to log in a way that keeps a data point every 10 seconds or so.

    From what I found, I should be able to use the statistics function, set it to average, set the time over which the average is computed to ten seconds, check the option to restart the measurement at each iteration, and then log the statistics data.

    However, I can't find the setting for the length of the iteration, and the restart checkbox is not available on the DC step.

    Thanks for your help!

    Hi Arvo,

    The DAQmx acquisition step returns an array of data samples to process - for the length of the iteration, I would change the number of samples on the DAQmx Acquire input.  For example, if you acquire at 1000 Hz, acquiring 10 k samples will cause DAQmx Acquire to return 10 seconds of data to be processed.  Log the DC values computed at the end of that processing and the result should be one point every 10 seconds.

    Best regards

    John

  • CPU, RAM and disk/file system information for the data, log, temp and archive files of a database

    Hello gurus,

    How can we see what amount of CPU and RAM is available for the database? Following earlier posts on this great site, I was able to do
    SQL> show parameter sga_max_size
    
    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------------------------------
    sga_max_size                         big integer 800M
    SQL> exit
    So that gives the RAM size for the database to which I am connected.
    rootd2n3v5# ioscan -kfn |grep -i processor
    processor    0  13/120         processor CLAIMED     PROCESSOR    Processor
    processor    1  13/121         processor CLAIMED     PROCESSOR    Processor
    processor    2  13/122         processor CLAIMED     PROCESSOR    Processor
    processor    3  13/123         processor CLAIMED     PROCESSOR    Processor
    rootd2n3v5#
    The command above was run at the OS level; I don't know what information it gives, or for which database, since there are 12 databases on this server.

    How can I get the CPU information?

    And where can I find information about the data files, log files, temp files and archive logs?
    SQL> select * from v$version
    BANNER                                                           
    ---------------------------------------------------------------- 
    Oracle Database 10g Enterprise Edition Release 10.2.0.5.0 - 64bi 
    PL/SQL Release 10.2.0.5.0 - Production                           
    CORE     10.2.0.5.0     Production                                         
    TNS for HPUX: Version 10.2.0.5.0 - Production                    
    NLSRTL Version 10.2.0.5.0 - Production

    Ask the system administrator whether the database server is a physical machine, whether it is hardware-partitioned, or whether it was created under a virtual machine manager. Check the operating system utilities. On AIX, use uname to identify whether you are on an LPAR.

    In Oracle, you can query v$parameter for cpu_count to see how many CPUs Oracle says it sees.

    HTH - Mark D Powell.
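
    As a hedged illustration (standard dynamic performance views, not specific to this system), the CPU count and the data/log/temp/archive file locations could be queried roughly like this:

    -- CPU count as seen by this instance
    SQL> SELECT value FROM v$parameter WHERE name = 'cpu_count';

    -- file locations for data, redo log, temp and archived log files
    SQL> SELECT name   FROM v$datafile;
    SQL> SELECT member FROM v$logfile;
    SQL> SELECT name   FROM v$tempfile;
    SQL> SELECT name   FROM v$archived_log WHERE deleted = 'NO';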

  • Using journalized data in an interface with an aggregate function

    Hello

    I'm trying to use journalized data from a source table in one of my interfaces in ODI. The problem is that one of the mappings on the target columns involves an aggregate function (SUM). When I run the interface, I get a "not a group by expression" error. I checked the code and found that the columns JRN_SUBSCRIBER, JRN_FLAG, and JRN_DATE are included in the SELECT statement but not in the GROUP BY statement (the GROUP BY contains only the remaining two columns of the target table).

    Is there a way to get around this? Do I have to manually change the KM? If so, how would I go about doing it?

    Also, I'm using the Oracle GoldenGate JKM (OGG Oracle to Oracle).

    Thanks, I really appreciate the help.

    Ajay

    "ORA-00979"when the CDC feature (logging) using ODI with Modules of knowledge including the aggregate SQL function works [ID 424344.1]
    Updated 11 March 2009 Type status MODERATE PROBLEM

    In this Document
    Symptoms
    Cause
    Solution
    Alternatives:

    This document is made available to you through Oracle Support's Rapid Visibility (RaV) process and therefore has not been subject to an independent technical review.

    Applies to:
    Oracle Data Integrator - Version: 3.2.03.01
    This problem can occur on any platform.
    Symptoms
    After successfully testing an ODI integration Interface using an aggregate function such as MIN, MAX or SUM, it becomes necessary to implement Changed Data Capture operations using journalized tables.

    However, when the integration Interface is executed to retrieve only the journalized records, the load data step of the Loading Knowledge Module has problems and the following message appears in the ODI log:

    ORA-00979: not a GROUP BY expression
    Cause
    Using both CDC (journalizing) and aggregate functions gives rise to complex problems.
    Solution

    Technically, there is a workaround for this problem (see below).
    WARNING: Oracle engineers have issued a severe caution that this type of setup may give results that are not what would be expected. This is related to the way ODI journalizing is implemented, in the form of specific journalizing tables. In this case, the aggregate function works only on the subset that is stored (referenced) in the journalizing table, and not on the entirety of the Source table.

    We recommend that you avoid this type of integration Interface setup.
    Alternatives:

    1. The problem is due to the JRN_* columns missing from the "group by" clause of the generated SQL (a hypothetical illustration of the resulting query shape follows these alternatives).

    The workaround is to duplicate the Loading Knowledge Module (LKM) and, in the clone, change the "Load Data" step by editing the 'Command on Source' tab and substituting the following statement:
    <%=odiRef.getGrpBy()%>

    with
    <%=odiRef.getGrpBy()%>
    <%if ((odiRef.getGrpBy().length() > 0) && (odiRef.getPop("HAS_JRN").equals("1"))) {%>
    JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE
    <%}%>

    2. It is possible to develop two alternative solutions:

    (a) Develop two separate and distinct integration Interfaces:

    * The first integration Interface loads the data into a temporary Table; specify the aggregate functions to use in this initial integration Interface.
    * The second integration Interface uses the temporary Table as its Source. Note that if you create the Table in the first Interface, it is necessary to drag and drop that Interface into the Source panel of the second.

    (b) Define two separate and distinct database connections so that the integration Interface references two server Data Sources (one for the journal tables, one for the other Tables). In this case, the aggregate function will be executed on the Source schema.
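
    For illustration only, here is a hypothetical shape of the query the patched LKM from work-around 1 would then generate (the table and column names other than JRN_* are invented, not taken from this note); the point is simply that the JRN_* columns appear in both the SELECT list and the GROUP BY clause:

    SELECT   CUST_ID,
             SUM(AMOUNT) AS TOTAL_AMOUNT,
             JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE
    FROM     J$SRC_SALES
    GROUP BY CUST_ID, JRN_FLAG, JRN_SUBSCRIBER, JRN_DATE;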

    Related information
    Products

    * Middleware > Business Intelligence > Oracle Data Integrator (ODI) > Oracle Data Integrator

    Keywords
    ODI; AGGREGATE; ORACLE DATA INTEGRATOR; KNOWLEDGE MODULES; CDC; SUNOPSIS
    Errors
    ORA-979

    Please find above the content of the note.
    It should show up if you search for this ID in the Knowledge Base.

    Cheers
    Sachin

  • Data logging using queues

    Hello

    I'm continuously measuring data (i.e. a voltage signal) using DAQmx and storing the data.

    In an old setup, I stored the data in an array, and whenever the array reached a certain size, I wrote it to a file. However, this resulted in data loss while the file was being written.

    Then I read about using queues, had a look at the examples, read a few tutorials and thought I understood the principle.

    However, I can't get my example to work.

    Thanks for any help.

    Kind regards

    Jack

    If the 2nd while loop does not execute, it means that the local variable still holds the old TRUE value; each time you run the code, that local variable stops the 2nd loop. Clear the local variable at the beginning of the code, then run it again and check, using the VI I attached.

  • How to display date information in a data log table

    Hello

    I want to display data during the data acquisition process. How can I add the time stamp as one of the columns in the 'results' Table located at the bottom left of the sample code?

    Thank you

    Ryan

    Here is an example using an ordinary table instead of the Express one. You will need to convert your dynamic data to a scalar with the Convert from Dynamic Data function.

  • ASCII data logging / or something readable!

    Hi, I was using the attached code to collect data from a GC system, and it used to be saved as a text file which I opened before transferring the data to Excel or Origin Lab. For some reason, the format it is written to the file in has changed, and if I try to import it directly into Origin or something similar as ASCII, it now says the ASCII data is not readable by columns. If I try to open it in something like Notepad, it just displays as lines of symbols, perhaps ANSI?

    I have absolutely no idea what I might have changed to cause this, or how to return to a "readable" format; any help would be greatly appreciated!

    Thanks, Kelly

    OK, convert your timestamp to a DBL and use the following format (be sure to set the display of the format string constant to '\' codes first!):

    %.5e\t%.5e\n

  • ORA-38729: Not enough flashback database log data to do FLASHBACK.

    Hello

    I'm trying to flash back the database by just one hour. My flashback retention is set to 1440 (1 day), db_recovery_file_dest_size is set to 200G (and only 9G is used). I can see flashback logs for the last 2 days, and archive logs for the last 7 days, available on ASM. But I'm not able to flash back the database by even 10 minutes.

    SQL> flashback database to timestamp to_date('06-JAN-2012 22:10:00','DD-MON-YYYY HH24:MI:SS');
    flashback database to timestamp to_date('06-JAN-2012 22:10:00','DD-MON-YYYY HH24:MI:SS')
    *

    SQL> select * from v$flashback_database_log;

    OLDEST_FLASHBACK_SCN OLDEST_FLASHBAC RETENTION_TARGET FLASHBACK_SIZE
    -------------------- --------------- ---------------- --------------
    ESTIMATED_FLASHBACK_SIZE
    ------------------------
              2.3219E+12 05-FEB-12                   1440     9567993856
                  4230193152


    SQL> select flashback_on from gv$database;

    FLASHBACK_ON
    ------------------------------------------------------
    YES


    SQL> show parameter recov

    NAME                                 TYPE        VALUE
    ------------------------------------ ----------- ------------------------------
    db_recovery_file_dest                string      +DGSSD
    db_recovery_file_dest_size           big integer 200G
    recovery_parallelism                 integer     0

    It is not a problem with the archive logs or the flashback logs. Can you please give me some advice?


    Thank you
    Krmreddy.

    Hello
    You wrote: *"I'm trying to flash back the database by just one hour."* But you issued the following:

    You wrote:
    SQL> flashback database to timestamp to_date('06-JAN-2012 22:10:00','DD-MON-YYYY HH24:MI:SS'); 
    

    January 6, 2012 is a month ago!

    Edited by: Manguilibe KAO on Feb 6, 2012 21:55
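
    A minimal sketch of flashing back by one hour, as originally intended (assuming flashback logging still covers that window and the database can be restarted; run as SYSDBA):

    SQL> SHUTDOWN IMMEDIATE
    SQL> STARTUP MOUNT
    SQL> FLASHBACK DATABASE TO TIMESTAMP (SYSDATE - 1/24);
    SQL> ALTER DATABASE OPEN RESETLOGS;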

  • Data logging application

    Hello

    Wondering if you could help me with my problem.

    I am developing a data logging application. The user will watch live data being read from a cDAQ thermocouple card and, when they are happy that the temperature conditions have settled, press a button to log a data point for each of the 16 channels, then watch the live data again, ready to log the next set. There must be a three minute delay between each logged point.

    Status of the program: I'm getting live data and I have a button that can be pressed to save the data. When this button is pressed, the program moves to another state in the state machine. At this point, I would like the set-point temperatures to be placed in a table, and each time a set point is recorded, the data should be placed in a new row of the table.

    Currently, I can't get the data into a table with each logged point on a new row of the table. Please could you help.

    My program is attached.

    I looked at the FOR loop that records the data.   You have N = 1, but you're autoindexing a 1D data array.  This means that only the first data point is actually recorded.  The array you initialize will have 16 data points, all with the same value.  I don't understand why you are creating a single-element array on each iteration of the loop and inserting it into your 'data' table.  Also, why is N = 1 on the outer FOR loop?

    BTW, you might want to take a step back and think about the overall structure of your program.  A real state machine architecture with an event structure would be preferable.
