Data logging frequency with the SCC-68

How fast can I sample during data acquisition? I am taking 4 analog inputs and logging to ASCII via LabVIEW SignalExpress.

I keep getting buffer overflow errors saying that the sampling period is too short. I am currently using a 20 ms period.

How short can I go? It seems a little ridiculous that this thing cannot sample any faster than that.

Hi carleethian,

The SCC-68 is just the terminal block; which specific data acquisition card are you using (probably a PCI card inside your computer)? That card determines the maximum rate at which you can sample. Are you using any SCC modules with the SCC-68?

Whatever the exact DAQ card, you should be able to sample well above 50 Hz. What have you set as the number of input samples in your DAQmx Acquire (voltage) step? It should be large enough that the SignalExpress loop can keep up with the data coming from the DAQ card. I usually aim for about 1/5 of my sampling rate per read, which keeps the loop running about 5 times per second.
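
A minimal sketch of the same setup, assuming the nidaqmx Python API instead of SignalExpress (the device name "Dev1" and the numbers are illustrative, not from the original post): the key point is reading a block of roughly rate/5 samples per loop iteration so the application keeps up with the driver's buffer.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    SAMPLE_RATE = 1000                    # Hz, hardware-timed; well above 50 Hz
    SAMPLES_PER_READ = SAMPLE_RATE // 5   # ~1/5 of the rate -> loop runs ~5x per second

    with nidaqmx.Task() as task:
        # Four analog inputs, as in the original question.
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
        task.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                        sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        for _ in range(25):               # ~5 seconds of acquisition
            # Blocks until the requested samples per channel are available.
            # Reading often enough is what prevents buffer-overflow errors.
            data = task.read(number_of_samples_per_channel=SAMPLES_PER_READ)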

Best regards

Tags: NI Products

Similar Questions

  • Generating a sine wave of a given frequency with a 6251 DAQ

    How can I generate a sine wave of a given frequency with a 6251 DAQ? I tried to use the waveform generation VIs from the Signal Processing toolbox, but it seems that only the first or the last sample gets written, not all of them. I ran into this when I tried to write a multichannel analog waveform as a 1D array of doubles.

    In the shipping examples, look at Cont Gen Voltage Wfm - Int Clk.

    If you continue to have problems, post the code you have written.
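
    For reference, a minimal sketch of continuous sine generation, assuming the nidaqmx Python API rather than LabVIEW ("Dev1/ao0" and all numbers are illustrative): precompute a whole number of cycles and let the driver regenerate the buffer, which is what the shipping example does graphically.

        import numpy as np
        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        RATE = 100_000      # update rate in S/s
        FREQ = 1_000        # desired sine frequency in Hz

        # One whole cycle per buffer keeps the regenerated output phase-continuous.
        samples = RATE // FREQ
        waveform = np.sin(2 * np.pi * np.arange(samples) / samples)

        with nidaqmx.Task() as task:
            task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
            task.timing.cfg_samp_clk_timing(RATE,
                                            sample_mode=AcquisitionType.CONTINUOUS)
            task.write(waveform, auto_start=False)  # write the whole buffer at once
            task.start()                            # driver regenerates it endlessly
            input("Generating... press Enter to stop.")
            task.stop()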

  • Logged data is not loaded into the target table with CDC consistent?

    Hello

    I first tried the CDC simple setup on a single table.

    The journalized table's records (only the changed records) were inserted into the target table with CDC simple. That works fine.

    But when I work with CDC consistent, the logged data is not loaded into the target table.

    For this I used a data model with 3 data stores. Without the journalized data option, it works very well.

    When I try to load the journalized tables into the target table through an interface, it runs fine.

    For this, I checked the "Journalized Data Only" option.

    But the changed records do not reach the target table; the target table is empty after the interface is executed.

    (Screenshots attached: err1.png, err4.png, err2.png, err3.png)

    I set the insert option to true in the IKM, but the logged data is still not inserted into the target table.

    Please help me.

    Thanks in advance,

    A.Kavya.

    Hello

    You must EXTEND WINDOW and LOCK SUBSCRIBERS before consuming the CDC data:

    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/data_capture.htm#ODIDG283

    Afterwards, you UNLOCK SUBSCRIBERS and PURGE JOURNAL.

    It is best to build a package to automate the whole thing.

  • Difference between redo log files, undo tablespace, and archived log files

    Can someone please highlight the difference between undo and redo logs? Also, why do we need separate archive logs when the redo log data could be written directly to the data files...

    Your help will be highly appreciated...

    Hello

    Ed gave you a very good answer.

    Remember that the database files are online and are written to by the DBWR process. So we have the data files, the redo log files, and the archived logs.
    To avoid crawling the entire log when you perform a recovery, the database takes checkpoints that summarize its state. A checkpoint provides a shortcut for recovery: at the checkpoint, the database knows that all dirty pages have been written to disk (i.e. to the data files). At recovery time, the log (which contains both finished and unfinished transactions) is used to bring the database to a consistent state. The system locates the time of the last checkpoint and goes back to that position in the log file. It then rolls forward all completed transactions (i.e. committed ones) that occurred after the last checkpoint, and rolls back all transactions that were not committed but had begun before the last checkpoint. This is where the online log files are used.

    Now imagine that you had to back up your 100+ GB database every 10 minutes. What a waste of space! So instead, you take a backup of your database at time t, and the archiver process periodically backs up the redo logs to archived logs so that the redo log files can be reused; RMAN then uses the last backup plus the archived log files to recover your database to a point in time.

    Now, I mentioned the checkpoint process. The checkpoint process (CKPT) regularly initiates a checkpoint that uses DBWR to write all the dirty blocks to the data files, thereby synchronizing the database. Imagine a heavy load is running and all the redo log files have filled up. At this point, Oracle will wait until all the dirty blocks queued in the buffer cache have been written to disk (the database files) before the redo log files can be considered superfluous and available for reuse (i.e. can be overwritten). This results in messages like the following in the alert.log:

    Thread 1 advanced to log sequence 17973
      Current log# 3 seq# 17973 mem# 0: /oracle/data/HAM1/log3aHAM1.dbf
      Current log# 3 seq# 17973 mem# 1: /oracle/data/HAM1/log3bHAM1.dbf
    Thread 1 cannot allocate new log, sequence 17974
    Checkpoint not complete
      Current log# 3 seq# 17973 mem# 0: /oracle/data/HAM1/log3aHAM1.dbf
      Current log# 3 seq# 17973 mem# 1: /oracle/data/HAM1/log3bHAM1.dbf
    Thread 1 advanced to log sequence 17974
    

    I am sure you have done the following:

    alter database mount;
    
    Database altered.
    

    When you mount a database, Oracle associates the started instance with the database. The Oracle control files are opened and read. However, no checks such as restore/recovery are performed.

    alter database open;
    
    Database altered.
    

    The open command opens the data files and redo logs, and performs automatic recovery and database consistency checks. At this point, the database is ready to be used by all valid users.

    HTH,

    Mich

    Edited by: Mich Talebzadeh on November 19, 2011 16:57

  • Last Logged In date and points not updated in profile

    Under my profile, the Last Logged In date says 24 October 2008 08:29, yet I have been logging in once or twice a day!

    I have also replied to a number of discussions today, answering questions etc., yet my points have not been updated in my profile.

    Has anyone else seen this?

    Hi malala.

    There is a bug in the "last logged in" display - it appears that a number of users (perhaps all) see a last logged in date of October 24 or 25, 2008. I have asked my development team to investigate.

    Points calculation is sometimes delayed by several hours - that is normal. Calculating points is a relatively expensive operation, so overall site performance improves if we do not calculate points in real time. This is a change we made at the beginning of 2008.

    Having to log in several times per day is a known bug. We believe there are two root causes: we have a fix for one that will be released in February, and we are investigating the other.

    Best regards, Robert

    Robert Dell'Immagine, Director of VMware communities

  • The most effective way to log and read data simultaneously (DAQmx, TDMS) at high data rates

    Hello
     
    I want to acquire data from several cDAQ modules across several chassis at high data rates (100 k samples per second if possible). Let's say the measurement time is 10 minutes and we have a large number of channels (40, for example). The measured data is written to a TDMS file. I guess memory or hard-disk speed is the limit. The user must also be able to view a selection of channels in a graph during the measurement.

    My question: what is the best and most effective way to save and read data at the same time?

    First of all, I am using a producer-consumer architecture, and I don't want to write and display the data in the same loop. I see two possibilities:

    [1] Use 'DAQmx Configure Logging.vi' with the 'Log and Read' operation to write the data to a TDMS file. To display the data in a second loop, I would create a DVR (data value reference) of the samples read and 'send' the DVR to the second loop, where the data would be displayed in a graph. This method has the disadvantage that the data of all channels is copied into memory. Correct me if I'm wrong.

    [2] Use 'DAQmx Configure Logging.vi' with only the 'Log' operation to write the data to a TDMS file. To view the selected data, I would read a number of samples from the TDMS file in the second loop (while the TDMS file is still being written). In this case, I copy only the data of the selected channels (not all of it), but more hard-drive accesses are necessary.

    What is the most effective and efficient solution in this case?

    Are there other ways to log and read data at high sampling rates?

    Thank you for your help.

    You say that the measurement time is 10 minutes. If you have 40 channels and you sample all of them at 100 kHz, that is quite a number of values.

    In such cases, I always try to approach the problem from the conditions of use. If a measurement is only 10 minutes, I would just log all the data to TDMS and create a graphing module, which could live in the same consumer loop where you log the data. You can always work on the big raw data files offline afterwards, extracting all the information you need (have a look at the product called NI DIAdem: http://www.ni.com/diadem/).
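
    For what it's worth, the driver-level TDMS logging is also exposed in the nidaqmx Python API; here is a hedged sketch of the "log everything, read for display" idea from option [1] (channel names, file path, and rates are assumptions, not from the original):

        import nidaqmx
        from nidaqmx.constants import (AcquisitionType, LoggingMode,
                                       LoggingOperation)

        RATE = 100_000
        SAMPLES_PER_READ = RATE // 10     # consumer loop runs ~10x per second

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("cDAQ1Mod1/ai0:3")
            task.timing.cfg_samp_clk_timing(RATE,
                                            sample_mode=AcquisitionType.CONTINUOUS)
            # The driver streams raw data straight to the TDMS file while
            # also returning the same samples for on-screen display.
            task.in_stream.configure_logging(
                "measurement.tdms",
                logging_mode=LoggingMode.LOG_AND_READ,
                operation=LoggingOperation.CREATE_OR_REPLACE)
            task.start()
            for _ in range(100):
                block = task.read(number_of_samples_per_channel=SAMPLES_PER_READ)
                # ...reduce 'block' for the graph here (see the sketch below)...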

    The main question is what the user needs to see in the graph (or perhaps a chart could be useful too). Let's say the graph is 1024 pixels wide. It makes no sense to show more than 1024 data points, yes? Every second you will produce 100k data points per channel. What is the useful information your user should see? It depends on the application. In similar cases, I usually use some kind of data reduction method: a moving average (Mean PtByPt.vi, for example) with an interval size of 100. That way you get 1000 data points out of the 100k per channel every second. If you feed your chart with these averaged values, it can store 1024 data points (the default) per channel (plot), which covers a little more than 10 minutes, so the user will see the entire measurement.

    So it depends on the frequency at which you send data to the consumer. For example, you collect 1024 values per producer iteration and send them to the consumer. There you can do a normal mean calculation or a rolling one (according to your needs) and plot it on a chart. This way your chart will display only the values of the last ~10 seconds...
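
    The data-reduction step described above is easy to sketch in NumPy. This uses a non-overlapping block average (a simpler stand-in for the running mean of Mean PtByPt.vi); with an interval of 100, the 100k samples per second per channel become 1000 plotted points:

        import numpy as np

        def block_mean(samples: np.ndarray, interval: int = 100) -> np.ndarray:
            """Reduce a (channels, n) array by averaging non-overlapping blocks."""
            channels, n = samples.shape
            usable = n - (n % interval)     # drop any ragged tail
            blocks = samples[:, :usable].reshape(channels, -1, interval)
            return blocks.mean(axis=2)      # shape: (channels, usable // interval)

        # Example: 40 channels, 1 second of data at 100 kS/s -> (40, 1000).
        raw = np.random.randn(40, 100_000)
        reduced = block_mean(raw, 100)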

    Once I programmed a module where I used a chart and not a graph, and the user could specify the absolute-timestamp interval to be plotted. If the data size was larger than the width of the chart in pixels, the module performed an averaging calculation to reduce the number of data points. Of course, if you need to see the raw data, you can specify a small interval. It all depends on how you program the zoom functions, etc. In my case I had a rate of 1 Hz, so I just kept all the data in RAM, limiting the arrays to hold 24 hours of data, so that technicians could monitor the system. In your case, given the enormous amount of data, only a file read/write approach can work if you really need access to all of the raw data on the fly. But I hope the moving-average values will be enough?

  • Data logging application

    Hello

    Wondering if you could help me with my problem.

    I am developing a data logging application. The user will watch live data being read from a cDAQ thermocouple card and, when happy that the temperature conditions have settled, press a button to log a data point for each of the 16 channels, then watch the live data again, ready to log the next set. There must be a three-minute delay between each logged point.

    Status of the program: I am getting live data and I have a button that can be pressed to save the data. When this button is pressed, the program moves to another state in the state machine. At this point I would like the set-point temperatures to be placed in a table, and each time a set point is recorded the data should be placed in a new row of the table.

    Currently, I can't get the data into a table with each logged point on a new row. Please could you help?

    My program is attached.

    I looked at the FOR loop that records the data. You have N = 1, but you are auto-indexing a 1D data array. This means that only the first data point is currently being recorded. The array you initialize will have 16 data points, all with the same value. I don't understand why you are creating a single-element array on each iteration of the loop and inserting it into your 'data' array. Also, why does the outer FOR loop have N = 1?
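
    In text form, the intended logic looks like this (Python is used here only as a sketch of the LabVIEW diagram; names and values are hypothetical): each button press should append one complete 16-element row to the growing table, rather than rebuilding a single-element array.

        from typing import List

        table: List[List[float]] = []       # grows by one row per logged set
                                            # (the shift-register equivalent)

        def log_point(channel_readings: List[float]) -> None:
            assert len(channel_readings) == 16   # one value per thermocouple channel
            table.append(list(channel_readings)) # append a whole ROW at once

        # Called from the state machine's "log" state on each button press:
        log_point([25.1, 25.3, 24.9, 25.0, 25.2, 25.1, 24.8, 25.0,
                   25.4, 25.1, 25.0, 24.9, 25.2, 25.3, 25.1, 25.0])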

    BTW, you might want to take a step back and think about the overall structure of your program. A proper state machine architecture with an event structure would be preferable.

  • Acquire, analyze, and log data using the state machine technique

    I am learning the state machine technique to develop data acquisition and logging code. I think I wrote the correct code for this purpose, but I could not get the following to work:

    All warning values (>0) should appear in a 1D array on the front panel. The code writes only the first value.

    All values should be recorded using Write To Spreadsheet File from the File I/O palette and written to the Excel sheet. The code writes only the first value.

    I wonder what the error is. The code I created is attached; I would be grateful if someone could fix the code and post it in response.

    Thank you in advance for your help

    Cheers

    Hi kwaris,

    "What is the error": well, you use a lot of unnecessary conversions. Really a lot of...

    Why do you convert a scalar value to dynamic data, then convert that to an array, then index out one element of the array? Why convert to dynamic at all when everything you need is a simple Build Array node?

    OK, I've included a shift register where you store your array. It's just a simple 'how to' and not the best solution for all cases, but it should give you a clue...
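
    The same shift-register idea, sketched in Python for clarity (sample values are made up): accumulate every warning value across iterations, then write the whole collection out once, much like Write To Spreadsheet File.vi producing a tab-delimited file that Excel can open.

        import csv

        readings = [0.0, 1.7, 0.0, 2.4, 3.1, 0.0]   # made-up sample data
        warnings = []                   # plays the role of the shift register

        for value in readings:          # one state-machine iteration per value
            if value > 0:               # keep EVERY warning value, not just the first
                warnings.append(value)

        # Write all accumulated values as one tab-delimited row.
        with open("warnings.txt", "w", newline="") as f:
            csv.writer(f, delimiter="\t").writerow(warnings)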

  • How can I delete login data?

    When I open Google Mail and start typing, a list of logins is presented to me... I erased all my history, but that is not enough; these names keep appearing. How can I remove them, please?

    Can you attach a screenshot to show which entry field you mean?

    Use a compressed image type such as PNG or JPG to save the screenshot.

    If these names are not shown in the Password Manager, and deleting formhistory.sqlite doesn't help either, then you may be using an extension that stores this data, or Google might be storing it.

    Start Firefox in Safe Mode to check whether one of the extensions (Firefox/Tools > Add-ons > Extensions) or hardware acceleration is the cause of the problem (switch to the DEFAULT theme: Firefox/Tools > Add-ons > Appearance).

    • Do NOT click the Reset button in the Safe Mode start-up window.
  • Logging data source expressions

    Hello

    We use NI TestStand 2012 here at our company, and I have a simple question.

    For any pass/fail test (Numeric Limit, String Value, regular Pass/Fail), is it possible to have TestStand include the Data Source expression in the default report?

    I can add it manually via the "Additional Results" component, but I would have to do that for each step.

    I see that the condition expressions of flow-control steps are included in the report, but not my pass/fail expressions, which are found on the Data Source tab.

    It seems obvious to me that the data source should be there; otherwise all I get in the report is the name of the step and the result, with no information about what was actually tested.

    Thank you in advance!

    Leandro

    It does not show what the value '10' means: speed, voltage, current, RPM, etc., unless we actually name the step that way.

    Thanks again.

    There is a 'units' setting on numeric limit steps which addresses this problem. You can even specify custom units if you like, although there are many built-in ones to choose from as well. Also, the name of the step should usually give an idea of what exactly is being tested.

    For logging the Boolean expressions, probably the simplest thing is to use a step template that is preconfigured to log the data source, or you could create a new custom step type based on the Pass/Fail Test that logs the data source by default, or you could write a tool that walks through the sequence files and changes all pass/fail tests to log their data source.

    -Doug

  • DB2 PA question on data capture frequency

    Hi all

    How do I get the data collected on the middleware written to the Performance Repository on a daily basis?

    Thank you for any input!

    You should look at this community post by Darren as I believe it will meet your needs:

    Ordering the interim repository load in Foglight Performance Analysis

    David Mendoza

    Foglight Consultant

  • WHAT IS THE SAF-T-LOG DATA PROVIDED BY ETI LTD?

    WHAT IS THE SAF-T-LOG DATA PROVIDED BY ETI LTD?

    Original title: ETI LTD

    Moved from feedback

    Hello

    A Bing search for "ETI Ltd SAF t log" (SAF-T, not safe-t) indicates that it is legitimate software, not malware. Are you using a work computer? Could another user have downloaded this?

    If you are *certain* you have no use for this data, it can be removed, but if there is any doubt, I would not delete it immediately.

    http://www.bing.com/search?q=ETI+Ltd+SAF+t+log

    Don

  • BlackBerry smartphone call log showing an incorrect date

    When I access the call log from the main screen, select a call, and select View History, the date appears wrong. Even if I make a call, hang up, and view the log, it shows the date as the next day. Example: I made a few calls last night (March 08), when the date and time were showing correctly; after all the calls, I checked the call log for the times and it showed the calls as made on March 09. However, if I go to Messages / View Folder / Phone Call Logs, the dates are shown correctly. Any idea why this is?

    Weird.

    Try a simple reboot: with the BlackBerry device powered on, remove the battery for a few seconds and then reinsert it to restart.

    See if the dates agree now.

  • Unable to see the logged data with CDC simple?

    Hello

    I want only the changed data inserted into the target table.

    For this:

    I create a model & select JKM Oracle Simple.

    I create the source data store (with data) & the target (empty) table.

    Next, I add the source table to CDC and start the journal.

    Then I insert a row in the source table. After that I check source table -> CDC -> Journal Data, and the changed data shows up successfully.


    cdc_err2.png


    I create an interface to insert only the changed data into the target table.

    I drag the journalized table in as the source, along with the required target. On the source data store I check the "Journalized Data Only" option box. For this example I used the SQL incremental update IKM.

    I checked the source data after selecting the "Journalized Data Only" option in the source properties: no journalized data shows up.

    cdc_err1.png

    Please help me,

    Thanks in advance,

    A.Kavya.


    You must define which subscriber is used for the journalized data; in this case, 'CONTROLLER'.

  • Unable to see the data in the journalized table?

    Hello

    I want only the changed data inserted into the target table.

    For this:

    I create a model & select JKM Oracle Simple.

    I create the source data store (with data) & the target (empty) table.

    Next, I add the source table to CDC and start the journal.

    Then I insert a row in the source table. After that I check source table -> CDC -> Journal Data, and the changed data shows up successfully.


    cdc_err2.png


    I create an interface to insert only the changed data into the target table.

    I drag the journalized table in as the source, along with the required target. On the source data store I check the "Journalized Data Only" option box. For this example I used the SQL incremental update IKM.

    I checked the source data after selecting the "Journalized Data Only" option in the source properties: no journalized data shows up.


    Please help me,

    Thanks in advance,

    A.Kavya.


    Hello

    In the J$ table you have the subscriber named 'CONTROLLER', but in your interface filter (judging by your screenshots) the subscriber name is 'SUNOPSIS'. Please change the subscriber name in your interface filter to 'CONTROLLER' and then try to run. That should do it.

    Thank you

    Ajay
