NI-6120 - actual sample rate resolution
I use the NI-6120 card and LabVIEW 8.2.
I have not found any information on the resolution of the sampling rate.
So I wonder if anyone knows what the resolution is?
Also, can I get the actual value of the card's sampling frequency once I have programmed the sampling frequency I want?
According to the technical specifications, the sampling frequency for a PCI/PXI-6120 is 800 kS/s with the NI-DAQmx drivers.
If you want to get the actual sampling frequency with LabVIEW and the DAQmx drivers, use a DAQmx Timing property node and select SampleClock > Rate.
Tags: NI Hardware
My experience with LabVIEW is quite limited, and so far I have mostly been editing and debugging code that was bequeathed to me. I use a USB DAQ card sampling at 10 samples/second and have not been too concerned with precise timing. I am now interested in a new application with NI hardware and software, and I have identified the USB-6343 card as the one to use (500 kS/s input and 900 kS/s output). The plan is to generate an accurate output signal (amplitude and timing accuracy) which is slightly offset from the input signal. I don't plan to write the logged data to a file at that rate, just to capture, process and output.
Is it relatively simple to sample and output analog signals at these high sampling frequencies? Currently I just use loops timed to the millisecond, but this would require microsecond timing precision. Is this something that is pretty standard to achieve, or are these sampling frequencies handled in a totally different way? Can anyone provide links to point me in the right direction?
I'm sorry if this question has already been asked, but I hope it is generic enough not to be a burden. It is a project I am taking on in my spare time, so I am just in the planning stage, which includes determining how difficult the software side is going to be.
The DAQ acquisition clocks will give you the best timing you can get. If you retrieve the data as a waveform, t0 represents the start time and dt the time interval between samples. If you read continuously, all data samples are synchronized relative to t0. The accuracy of dt is equal to the accuracy of the timebase oscillator on the DAQ hardware, usually +/-100 ppm.
I don't remember the clock frequency on the USB-6343, but many devices use 80 MHz. You won't get exactly 900 kHz from an 80 MHz source, because it isn't an integer ratio: 80 MHz / 900 kHz = 88.888...
Using the millisecond clock is called polling, and it is neither fast enough nor free enough from jitter to do something like what you have in mind.
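The integer-ratio point above can be sketched numerically. This is a minimal illustration assuming an 80 MHz timebase divided by an integer, as the answer describes; it is not the driver's actual coercion algorithm:

```python
# Sketch: which rates are achievable when the timebase only
# supports integer divisors (assumed model, not the driver itself).
TIMEBASE_HZ = 80_000_000  # common timebase on many NI devices

def achievable_rates(requested_hz):
    """Return the two achievable rates bracketing the request."""
    ratio = TIMEBASE_HZ / requested_hz          # 88.888... for 900 kHz
    lo_div, hi_div = int(ratio), int(ratio) + 1
    return TIMEBASE_HZ / hi_div, TIMEBASE_HZ / lo_div

below, above = achievable_rates(900_000)
print(round(below, 1))  # 898876.4  (80 MHz / 89)
print(round(above, 1))  # 909090.9  (80 MHz / 88)
```

So a request for 900 kHz has to land on one of the neighboring divisor rates, which is why dt never hits the requested value exactly.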
I am preparing a demonstration and want to display resource allocation in real time using the vSphere performance dashboards. Unfortunately, the default sampling rate and averaging periods are not optimal for displaying resource allocations in our demonstration. Can someone tell me if there is an easy way to change these default values?
The sampling rate for the real-time charts is 20 seconds, which is probably OK, but I would like to change it to 10 seconds. However, the main concern is the number of samples over which the average is calculated. This value seems to be very high (I think I saw 300 in a document). Because this number is so high, the changes I make, say every minute (for demonstration purposes), don't really show up very well in the vSphere performance dashboards. For example:
If my host has had 10 GB of active memory during the last 5 minutes and my demo suddenly releases a very large part of this memory, the performance chart averages the sudden deallocation with the last 10 minutes of data and shows only a gradual decline in memory allocation.
This behavior is probably what is desired in a monitoring system, BUT for my demonstration I want to show a more realistic view of resource allocation using the vSphere performance dashboards. This could be accomplished if the sampling rate and averaging periods were configurable.
Any help is greatly appreciated.
On the other hand, I found that the real-time settings cannot be changed. That is good enough for my purpose.
I don't know whether the slow response is due to the memory alloc/dealloc.
By the way, if you think the answer above resolved your problem, you might mark it as the solution.
I'm running a VI for the NI-9239.
In it, I set the sample clock to 2048.
I then set the number of samples to 20480, i.e. run for 10 seconds.
However, instead of the final point being measured at 10 s, it is measured at 9.82992 s.
If it is set to 2048 samples, the final time is 0.98256 s.
I realize that, say, if I had a rate of 5 samples per second and took five samples, I'd get times 0, .2, .4, .6, .8.
But if I took 10 samples, I'd get
0, .2, .4, .6, .8, 1, 1.2, 1.4, 1.6, 1.8.
The important point here is that the spacing remains the same.
In addition, 1 - 1/2048 = 0.99951171875, not 0.98256.
Anyone know what is happening here?
To add to that:
You made a very good point about the first sample being taken at t = 0. Accounting for this, we get:
(20480 / 2083.33333) * (1 - 1/20480) = 9.82992
(2048 / 2083.33333) * (1 - 1/2048) = 0.98256
which gives exactly the numbers you reported.
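A quick check of that arithmetic: the 2083.33333 S/s in the formulas is the rate the 9239 actually runs at after coercing the requested 2048 S/s (consistent with a 50 kS/s base divided by the integer 24, an assumed model), and with the first sample at t = 0 the last of n samples lands at (n - 1)/rate:

```python
# Sketch: last-sample timestamp after rate coercion.
# Assumed coercion model: fs = 50 kS/s / n for integer n,
# which reproduces the 2083.33 S/s seen in the answer above.
coerced = 50_000 / 24            # 2083.333... S/s

def last_sample_time(n_samples, rate_hz):
    """First sample at t = 0, so the last of n lands at (n - 1)/rate."""
    return (n_samples - 1) / rate_hz

print(round(last_sample_time(20480, coerced), 5))  # 9.82992
print(round(last_sample_time(2048, coerced), 5))   # 0.98256
```

Both values match the final times reported in the question, so the "missing" time comes entirely from the coerced rate plus the t = 0 first sample.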
I have a somewhat periodic signal, each cycle having its own max, min, mean, RMS, etc.
I would like to write *ALL* of this data to a single file, with an absolute time scale, where the cycle data are stamped with the time of the cycle maximum.
The cycle data columns should be empty except at the time of the signal's peak value.
Which raises 2 questions:
1. Can I build a 'waveform' with a non-constant sampling rate, where I just stamp each data point arbitrarily? Will this help me get the result I want?
2. Is there a better way to convince LabVIEW to write the cycle data next to the signal, with a correct time stamp?
An example is attached.
Thanks for the tips...
Thanks Heuter, your example is useful - the problem is resolved.
I use LabVIEW 11 and the NI-2920 (Ettus radio) to build an FSK receiver, with the Modulation Toolkit.
My carrier is 169 MHz, my data rate is 256 kbits/sec, my frequency deviation is around 89.5 kHz, I chose 16 samples/symbol, and my FSK has 2 symbols (0-1).
My receiver is close to working; however, I am getting an extra 5 bits in my 408-bit data frame.
My sync bits are all at the tail and the front end of my data frame, which also has a 16-bit frame counter at the front that runs from 0 to 65535 without losing data.
I think the extra bits are the result of the IQ rate, which I try to set to 4096k based on 256k * 16 = 4096k.
The radio seems to default to 4.166667 M for the IQ rate.
The ratio 408 bits / 415 bits = 4096 / 4166 - coincidence or not?
Is there a way around this problem in the configuration of the NI-2920, or in how I manage the sampling and data recovery?
Any guidance would be greatly appreciated.
For all who are interested:
The answer is that LabVIEW has an I/Q resampling module.
Connect it between the radio fetch and the demodulation modules, and set its input to the actual sampling frequency and its output to the rate you want = data rate * samples/symbol.
It works so far; I don't know what happens when other clock tolerances begin to influence the data, but for now it works. I also don't know why this is not incorporated into the receiver or fetch modules, so that you get the sampling rate you actually need from the beginning, but I'm still learning.
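The "coincidence or not" above checks out numerically. Assuming the USRP derives its IQ rate from a 100 MHz master clock with an integer decimation (an assumed model, but 100 MHz / 24 matches the 4.166667 MS/s default reported), the extra bits per frame follow directly:

```python
# Sketch: extra bits caused by IQ-rate coercion. Assumed model:
# actual IQ rate = 100 MHz / integer decimation (matches 4.166667 MS/s).
requested_iq = 4_096_000                  # 256 kbit/s * 16 samples/symbol
decimation = round(100e6 / requested_iq)  # 24 (100e6 / 4.096e6 = 24.41)
actual_iq = 100e6 / decimation            # 4.166667 MS/s

# Demodulating as if the rate were 4.096 MS/s stretches each frame:
bits_out = 408 * actual_iq / requested_iq
print(round(bits_out, 2))  # 415.04 bits recovered per 408-bit frame
```

That stretch of roughly 7 bits per 408 is exactly why resampling the IQ data to the rate the demodulator expects fixes the problem.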
I'm looking at possible solutions for data acquisition. I use 4 or 5 analog inputs and two digital inputs. Of the analog inputs, most will not need an extremely quick sampling rate, except for one which needs at least 100 kS/s. I noticed that cost-effective solutions have an aggregate sampling rate (e.g. 250 kS/s) which is spread across all channels. For these products, such as the NI-9205 CompactRIO module, is it possible to distribute the sampling rate unevenly between channels (i.e. could I give 100 kS/s to a single channel and spread the remaining 150 kS/s between the other channels in use)? Thanks in advance for any help,
To answer your question on sharing the sampling rate: it is not possible to have different sampling frequencies on a single module, as described in this KB: here (http://forums.ni.com/t5/Multifunction-DAQ/Aggregate-sampling-rate-and-Multichannel-Scanning-Rate/td-....)
In the case of the 9205, the module multiplexes between all channels (32 channels in single-ended mode or 16 in differential), which means the sampling rate of 250 kS/s is the total across all channels.
If you are using differential mode with all 16 channels, the samples per second on each channel will be 250 kS/s divided by 16, i.e. 15.625 kS/s. However, if you only specify 4 channels, the maximum sampling frequency will be 250 kS/s divided by 4, i.e. 62.5 kS/s.
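The division above is simple enough to sketch as a helper; this is just the arithmetic from the answer, not a driver call:

```python
# Sketch: per-channel rate on a multiplexed module with an
# aggregate maximum (250 kS/s total on the NI-9205).
AGGREGATE_MAX = 250_000  # samples/s, shared across scanned channels

def per_channel_rate(n_channels):
    """Maximum per-channel rate when n channels are in the scan list."""
    return AGGREGATE_MAX / n_channels

print(per_channel_rate(16))  # 15625.0 S/s -> 15.625 kS/s
print(per_channel_rate(4))   # 62500.0 S/s -> 62.5 kS/s
```

Note the per-channel rate depends only on how many channels are in the scan list, which is why dropping unused channels from the task raises the rate on the ones that remain.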
One way around this is to use 2 x 9205 modules in one of our new CompactDAQ chassis, which has 3 AI timing engines. This allows you to set different timing on 3 different modules. This is described in this KB: here (http://digital.ni.com/public.nsf/allkb/E7036C1870F6605686257528007F7A72)
I'm sorry this reply took so long, and I hope the above information helps.
Please do not hesitate to ask further questions.
If you like, I could get one of our technical sales engineers to give you a call to discuss your data acquisition system further with you.
I was wondering if someone could clarify the relationship between samples per second and the sampling rate.
I have USB-6008 and USB-6363 cards that I work with in Measurement Studio.
For the USB-6008 in differential input mode (up to four AI channels), if I set a sampling frequency of 8192 samples per second, and I set 2048 samples per channel with:
Task.Timing.ConfigureSampleClock("", 8192, SampleClockActiveEdge.Rising, SampleQuantityMode.ContinuousSamples, 2048)
Am I right in assuming that:
- across the 4 analog inputs, 8192 samples will be collected every second
- 2048 samples will be taken per analog input for each read
- the time between two successive data points is 1/2048 seconds (about 0.5 ms)
The help page for the Timing.ConfigureSampleClock method has a hyperlink to the "Sample Clock" page in the NI-DAQmx help, which contains this text: "The sample clock sets the time interval between samples. Each tick of this clock initiates the acquisition or generation of one sample per channel." So when the sample clock frequency is 8192, the time between two successive data points for a single channel is 1/8192 seconds. The time between data points on two adjacent channels is controlled by the convert clock, which can be controlled independently on most devices (up to a point).
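To make the correction concrete: with the sample clock at 8192 Hz, successive samples on one channel are 1/8192 s apart (not 1/2048 s), and a read of 2048 samples per channel covers a quarter second. Plain numbers shown here as an illustrative sketch (Measurement Studio itself is .NET):

```python
# Sketch: sample-clock timing for rate = 8192 Hz, 2048 samples/channel.
rate_hz = 8192
samples_per_read = 2048

dt = 1 / rate_hz                       # interval between successive samples
read_duration = samples_per_read * dt  # time span covered by one read

print(round(dt * 1e6, 1))  # 122.1 microseconds between samples
print(read_duration)       # 0.25 seconds of data per read
```

So the 2048 in ConfigureSampleClock sizes the read, while the 8192 alone sets the spacing between points.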
In the DAQ Assistant window, there are two options for reading samples:
1. "Samples to Read"
2. "Rate (Hz)"
Could someone please define the meaning of these? The context-sensitive help does not clarify their meaning for me.
Examples might help too:
For example, right now I have the settings at:
Samples to Read: 1
Rate (Hz): 200
What does that mean in terms of samples per second?
Am I recording data at 200 Hz?
Any help would be greatly appreciated.
Cheers,
Hey thanks crossrulz, I use Continuous for my sample type. OK, so I am taking 200 samples per second because I have my rate fixed at 200, but what is 'Samples to Read'?
I set 'Samples to Read' to 1 - what does that mean?
It means you'll read 1 sample whenever the DAQ Assistant is called. A sample is added to the buffer every 5 ms, so there is probably more data in the buffer. That's why you might want to read more. For example, if your loop takes one second per iteration, you should read at least 200 samples to keep the buffer cleared.
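The buffer arithmetic above can be sketched with a toy backlog model, assuming the 200 Hz acquisition and a loop that reads once per second (this models the bookkeeping only, not the DAQ Assistant itself):

```python
# Sketch: backlog growth when reading fewer samples than are acquired.
rate_hz = 200        # samples/s entering the buffer (one every 5 ms)
loop_period_s = 1.0  # how often the read is called

def backlog_after(iterations, samples_per_read):
    """Samples left in the buffer after a number of loop iterations."""
    produced = rate_hz * loop_period_s * iterations
    consumed = samples_per_read * iterations
    return int(produced - consumed)

print(backlog_after(10, 1))    # 1990 unread samples after 10 seconds
print(backlog_after(10, 200))  # 0 -> buffer stays cleared
```

Reading 1 sample per second while 200 arrive per second is why the buffer eventually overflows; matching Samples to Read to the loop period keeps it drained.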
I have a function generator (5406) that I use to generate a frequency sweep (1 ms per frequency).
When I read the signal with a digitizer (5112), it seems that the duration at each frequency is wrong.
I've attached a picture (I hope) to identify the problem.
The sine begins here, at a frequency of 20 kHz, at sample 1710 (coordinates, left panel).
Because I sample at 3 MS/s and each frequency step is 1 ms, I expect to see the next step at sample 4710.
That is not the case (right panel).
Any help appreciated
Also attached to this message is a copy of part of the block diagram.
The explanation of what you see is that even though you are setting your NI-5112 sample rate to 3 MS/s, you are actually sampling at 3.03 MS/s.
If you refer to the specs of the device, you will notice that this board has a maximum sampling rate of 100 MS/s and can sample at a decimated rate of 100 MS/s / n, where n is an integer between 1 and 100e6. If the sampling frequency is not achievable by the digitizer, the driver will coerce the value to the next achievable sampling frequency that is higher than what you asked for. In this case n = 33, and your actual sampling rate is 3.03 MS/s.
By your calculation, in 1 ms you would get 3000 samples at the 3 MS/s sampling rate. This becomes 3030 samples at a rate of 3.03 MS/s. I think this corresponds to what you see in the image. To be certain of the rate your board is currently sampling at, try using the "niScope Sampling Rate" VI to query the actual sampling frequency of your digitizer.
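The decimation arithmetic can be checked directly; this is just the math from the answer above (100 MS/s maximum, integer decimation n, rate coerced upward):

```python
# Sketch: NI-5112 rate coercion, fs = 100 MS/s / n for integer n,
# coerced UP to the next achievable rate (per the answer above).
import math

MAX_RATE = 100e6  # samples/s

def coerce_rate(requested_hz):
    """Largest n with 100e6/n >= requested, i.e. round the rate up."""
    n = math.floor(MAX_RATE / requested_hz)
    return MAX_RATE / n

actual = coerce_rate(3e6)    # n = 33 -> 3.0303... MS/s
print(round(actual))         # 3030303 S/s
print(round(actual * 1e-3))  # 3030 samples in each 1 ms frequency step
```

Those 3030 samples per 1 ms step are why the next sweep step appears at sample ~4740 instead of the expected 4710.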
I'm trying to get a better idea of what is happening when I do DAQ using LabVIEW RT compared to using the DAQ Assistant (which cannot be used in LVRT). In the DAQ Assistant I could just say to acquire 300 samples at a rate of 3000 S/s. The buffer waits for 300 samples to fill; I could then take those samples and average them in Statistics to get a nice clean data point every 100 ms. I can't figure out how to do something similar in LVRT.
I am currently using a data collection loop separate from my time-critical loop running at 100 ms. I didn't tell it how many samples to grab, so I guess it just grabs a real-time data point every 100 ms. A number of data items (pressure, temperature, flow, etc.) are written into a cluster that is sent to a network-published shared variable (which is sent to the host loop). If I run the host loop at 1000 ms (where I save the data to a spreadsheet), am I just grabbing every 10th data point that the network shared variable tries to send to the host? Is there any way to use LVRT in a similar way to the DAQ Assistant, so that I can get smoother data?
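The DAQ Assistant behavior described above (block reads of 300 samples at 3000 S/s, averaged to one clean point per 100 ms) amounts to a block-average filter; in LVRT the same thing would be a DAQmx Read of 300 samples inside the loop followed by a mean. A numeric sketch of the idea (the noise values are made up for illustration):

```python
# Sketch: block-average filtering - read N samples per loop iteration
# and emit one averaged point, as the DAQ Assistant setup did.
rate_hz = 3000
block = 300  # 300 samples at 3000 S/s -> one point per 100 ms

def averaged_point(samples):
    """One clean data point from one 100 ms block of raw samples."""
    return sum(samples) / len(samples)

# Simulated noisy block around a true value of 5.0 (fabricated data):
fake_block = [5.0 + ((i * 37) % 7 - 3) * 0.01 for i in range(block)]
print(round(averaged_point(fake_block), 2))  # 5.0 after averaging
```

Reading a full block per iteration (instead of single points) is what gives the smoothing; the loop period is then set by block / rate rather than by a software timer.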
Oops - I don't know what happened to the code snippet (perhaps it was too big?) that I posted. I will attach the VI itself here...
The specifications for a PCI-6221 or 6251 card tell me that the external clock frequency for the counters/timers can be up to 20 MHz. But nothing states the maximum gate (sample clock) rate for a buffered count-on-demand measurement. I find that on the 6221 I can set the sample clock up to 2.6 MHz, but at a 2.8 MHz clock nothing happens. In other words, no error is generated, but the counts in the data buffer do not change either.
Can anyone tell me anything about the limits of this application?
Thank you, Aaron.
One reason I took this up was because my client talked with an NI engineer, but they got confused between the input clock and the gate (which NI sometimes calls the "sample clock"). So my client believed that my code was not giving him the best possible performance.
He is now convinced that I am doing it right, and he has found another solution to his problem.
Hello, this is my first post here,
I've been wondering about this:
When you configure a new DAQ task with the DAQ Assistant, is the resolution applied over the voltage range / number of inputs you define, or is it a fixed value per volt per channel?
I have a USB-6015 data acquisition card; it has 16 analog inputs, 16-bit resolution, 250 kS/s, -10 to 10 volts.
So is (2^16) / (16 channels * (-10 to 10 V range)) my resolution per volt?
Or does this have nothing to do with the number of channels?
If I want to measure only 0 to 5 volts, do my 16 bits apply from 0 to 5, or over the full scale of the data acquisition card (which is -10 to 10 volts)?
You will get full 16-bit resolution on each channel regardless of how many channels are configured. Most DAQ devices are multiplexed, so the sampling RATE specification usually has an aggregate meaning: you can only scan channels at a sampling frequency = MAX RATE / number of CHANNELS. Also, most NI DAQ devices have a programmable-gain amplifier on the front end, so if you specify a smaller voltage range in your task configuration, the amplifier will automatically increase the gain to use as many of the ADC's bits as possible. See your device's specifications for more details...
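The resolution arithmetic from the answer, made concrete (assuming an ideal 16-bit ADC and the ±10 V full scale of the USB-6015; the channel count never enters, since every channel sees the full 16 bits):

```python
# Sketch: LSB (smallest step) size for a 16-bit ADC over a given range.
BITS = 16

def lsb_volts(v_min, v_max):
    """Smallest distinguishable voltage step over the selected range."""
    return (v_max - v_min) / 2**BITS

print(round(lsb_volts(-10, 10) * 1e6, 1))  # 305.2 uV at the +/-10 V range
print(round(lsb_volts(0, 5) * 1e6, 1))     # 76.3 uV with the PGA at 0-5 V
```

This is why selecting the smaller 0-5 V range in the task is worthwhile: the PGA stretches the 16 bits over 5 V instead of 20 V, giving a finer step.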
To get the best quality image on the external monitor, should I optimize for the iMac or the monitor? The monitor's native resolution is 1920 x 1080 and the iMac's is 1680 x 1050. I'm mirroring the iMac.
If it's of any value to you: look up what resolution your iMac will drive on an external screen, and that will give you your answer. This info is in the system specification for YOUR iMac. You will need to match that spec with the resolutions your external display supports.
I use the NI-WSN kit, which includes a gateway (9792) and nodes (3202, 3212, 3226, all programmable), and I am also experimenting with the NI-USRP 2921, so I thought of replacing the NI-9792 real-time Ethernet gateway with the NI-USRP 2921, as it also works on the same 2.4 GHz band.
Can the USRP 2921 act as a gateway or not? If this is possible, please advise.
The short answer is no.
The longer answer is that a WSN node uses a fixed-function radio with a full protocol stack. The USRP is a general-purpose radio which covers a set of common frequencies, but there is no such thing as a fully functional turnkey protocol stack for it. Could one be designed? Maybe, with many man-years of targeted development.
The USRP is mainly used as a research device. Most research implements a small aspect of a protocol, usually focusing on the physical or MAC layer, then customizes it and evaluates the results.
The right solution for WSN is a WSN gateway.