DAQ Assistant reading wrong voltage

Hello

I use the DAQ Assistant VI to read an analog input voltage on a National Instruments PCI-6221 card. I read the voltage on pin AI0. I apply a voltage directly to this pin from a DC power supply, but the voltage measurement from the DAQ Assistant is incorrect: it seems to be reduced by a factor of about 1/3. For example, if I apply 4 volts to pin AI0, the DAQ Assistant reads 1.43 volts. I used a multimeter to confirm that the voltage on pin AI0 really is 4 volts, so I know the problem is with my LabVIEW program and not my power supply.

Here are the steps that lead to my problem:

1. In the block diagram, I insert a DAQ Assistant block.

2. In the Properties of the DAQ Assistant, I select Analog Input -> Voltage.

3. I select channel ai0.

4. I click "Test" to test the channel.

5. The voltage is shown as 1.43 volts, even though 4 volts is applied to the pin (confirmed with a multimeter).

6. I click OK to complete the configuration of the DAQ Assistant. I run the program and plot the data. The graph also shows 1.43 volts.

Does anyone have an idea why this might happen? I have spent a good 4 hours trying to diagnose this and haven't found anything.

Thank you

Alnaif adnane

This is normal. Do a search on the topic of "ghosting".

Tags: NI Software

Similar Questions

  • DAQ Assistant: Reading samples that are no longer available

    I'm trying to use the DAQ Assistant to read some data, but I get "Error 200279 occurred at DAQmx Read (Analog 1D Wfm NChan NSamp).vi:2". The error message suggests increasing the buffer size, reading the data more frequently, or specifying a fixed number of samples to read instead of reading all the available samples. My timing settings are as follows: Continuous Samples for the Acquisition Mode, 1k Samples to Read, 1k Sample Rate (Hz). I'm curious what the best settings would be here. If I set the mode to 1 Sample (On Demand), would it simply take one sample each time the loop runs? Help, please.

    Thank you

    Brian

    Brian,

    Please post on the NI forums. Adnan is right about why you get this error. If you use Sample On Demand, you will get only one value returned per loop iteration. If you only need to sample occasionally and the timing doesn't need to be very consistent, then this might work for your application, but it depends on what type of application you have. How fast do you need to sample?
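    As a plain-Python sketch (not NI code; the function name is mine), the error -200279 condition above boils down to production outpacing consumption: samples arrive at the hardware rate, and each loop iteration drains a fixed chunk, so a slow loop lets the driver's circular buffer overwrite unread samples.

```python
# Plain-Python sketch (not NI code) of the error -200279 condition: samples
# accumulate at the hardware sample rate, and each loop iteration removes a
# fixed chunk.  If an iteration takes too long, the backlog grows without
# bound and old samples in the ring buffer get overwritten before being read.
def backlog_grows(sample_rate_hz, samples_read_per_loop, loop_period_s):
    produced_per_loop = sample_rate_hz * loop_period_s
    return produced_per_loop > samples_read_per_loop

# 1 kHz continuous acquisition reading 1000 samples per iteration:
fast_loop_ok = not backlog_grows(1000, 1000, 0.9)   # loop keeps up
slow_loop_bad = backlog_grows(1000, 1000, 1.5)      # processing too slow
```

    The same inequality explains the suggested fixes: a bigger buffer only delays the overwrite, while reading more often or reading larger chunks changes which side of the inequality you are on.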

  • Units for the number of samples and rate in the DAQ Assistant

    Hello

    I use the DAQ Assistant to read an analog voltage input on an NI data acquisition card. What is the difference between the rate and the number of samples in the DAQ Assistant, and what are the units of each?

    Thank you.

    The number of samples is how many discrete measurements to take. The rate (in samples per second) is how fast to acquire the specified number of samples.

    If the number of samples is 100 and the rate is 1000 samples per second, then the acquisition takes 0.1 second (100 / 1000).

    -AK2DM
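    The arithmetic above can be written out as a one-line helper (plain Python, just to illustrate the relationship; not part of any NI API):

```python
def acquisition_time_s(num_samples, rate_hz):
    """Duration of a finite acquisition: samples divided by samples-per-second."""
    return num_samples / rate_hz

duration = acquisition_time_s(100, 1000)   # the example above: 0.1 second
```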

  • Using the DAQ Assistant to read voltage from a 9205

    I am new to this, and I am trying to read a voltage level from a 9205 in a 9172 chassis. I use it in XP Mode (a virtual machine) because Windows 7 does not support LabVIEW 8.9. I installed the DAQ drivers.

    When I check the device in Measurement & Automation Explorer, it works fine, meaning it reads my DC voltage supply. When I created a VI with the DAQ Assistant, I chose the right channel and Analog Input -> Voltage, but the numeric indicator just shows values from -10 to 10. I noticed it does the same thing even when the USB is disconnected, which means the DAQ Assistant is not reading the data from the 9205.

    Can someone help me?

    Hi aaclabview,

    The way you added the device to MAX makes it act as a simulated device. Simulated devices only generate sine-wave data, so you can test a piece of code without using any hardware. A simulated device has the yellow icon shown in the screenshot you provided and is completely dissociated from any hardware, so adding or removing the USB device has no effect.

    The problem with using Windows XP Mode, as mentioned in the KB above, is that USB passthrough must be enabled for Measurement & Automation Explorer inside the virtual machine to detect the device. Using the device this way is not supported or recommended by National Instruments and can lead to instability and latency errors in the data acquisition even if a connection is established.

    It is a safer bet to install LabVIEW and the DAQ hardware drivers on the real Windows 7 machine rather than trying to run inside XP Mode.

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hello

    First post here.

    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input.  My hardware, an SCXI-1102C module, does not allow changing this property on a running task, so I would like either to set the analog input voltage range before the DAQ Assistant starts, or to pause the DAQ Assistant immediately after it starts, set the values, and then resume.

    I don't know how to change the task ahead of time, because the DAQ Assistant creates the task when it runs, and there is no task before that.

    In the attached picture, I have a conditional section configured to run only on loop iteration 0.  I take the task from the DAQ Assistant, send it to the Stop Task VI, set the property, and then restart the task with the Start Task VI. I can watch it run with execution highlighting on, and everything seems to work properly, but on the second (and every subsequent) iteration of the loop, I read AI.Max and it seems that the DAQ Assistant has reset it to 5 V.  Can you see what's going wrong here?

    BTW, this is a continuous acquisition, and the code doesn't produce error messages when executing.

    I found a similar question someone posted here in 2006, but his question was specifically about a LabVIEW API (VB, I think) and not an actual G solution.

    Attached are the actual VI in question and a PNG of the block diagram.

    Thank you!

    Ruby K

    First of all, if you want to get beyond the basics with DAQ hardware, you have to stop using the DAQ Assistant and do it with the lower-level DAQmx VIs.  There are hundreds of examples in the Example Finder.  You can even right-click on the DAQ Assistant and select Open Front Panel.  This converts it to a subVI you can open to see what is happening behind the scenes.  Do it.  I think you will find that the DAQ task is recreated on each call (although I'm not 100 percent sure how parameters are set or maintained in each section of that subVI).

    The second problem is that you have a bit of a race condition on iteration 0.  Those two DAQ property nodes run at the same time.  Thus, when you read AI.Max, it can happen before or after AI.Max is set in your case structure.

    Thirdly, make sure you wire up your error wires.

  • Using the DAQ Assistant for a voltage vs. time graph

    I'm relatively new to LabVIEW and to programming in general. I've read tutorials and am trying to understand things. One thing I have a problem with is the DAQ Assistant. Suppose I place the DAQ Assistant on the LabVIEW block diagram and have everything set up so that the voltage travels into the DAQ hardware. How would I set up my block diagram so that I can get a graph of voltage vs. time in which the data only starts recording once the voltage reaches a certain threshold that I can input and change, such as 30 or 40 volts? The data should also stop recording when the voltage falls back to that same number. I also want to be able to multiply the voltage coming out by a number I can change myself before it is graphed over time. For example, say I want the voltage to start recording when it reaches 40 volts. If the voltage coming out of the DAQ Assistant reads, say, 10 volts, and the number I want to multiply by is 5, then I want to multiply the voltage by 5, and since that gives 50, it would begin graphing that number over time.

    You would need a Boolean that checks whether the (amplified) voltage is greater than N.

    If so, it sends the value to a graph; if not, the voltage does not get graphed.

    Here is an example (don't copy this code exactly, because it does not use a real signal, but rather an integer that is generated):
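    The Boolean gating described above can also be sketched in text form (plain Python with hypothetical names; in G the real solution is a comparison node feeding a case structure):

```python
def gated_samples(raw_volts, gain, threshold):
    """Amplify each reading, and keep (graph) it only while the amplified
    value is at or above the threshold, as in the answer above."""
    return [v * gain for v in raw_volts if v * gain >= threshold]

# a 10 V reading with gain 5 becomes 50 and passes a 40 V threshold,
# while a 6 V reading (30 after the gain) is dropped
kept = gated_samples([10, 6, 9, 12], 5, 40)
```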

  • Precise triggering of voltage input and output generation in the DAQ Assistant

    Hello

    I wonder if anyone has come across a similar problem with synchronizing voltage input and output. I use LabVIEW 11 and an NI USB-6259. I have been using the DAQ Assistant to configure the input and output channels. In particular, my task is to generate a single rectangular voltage "pulse" as the output to drive a coil, and once the pulse is done, to acquire a signal from a magnetic field sensor and compute a power spectrum. This means the order and timing in which the DAQ Assistants run is extremely important. For example, the output voltage channel must be open first, for 2 seconds. Afterwards, the input voltage channel must be open for 1 second, during which the sensor signal is acquired and post-processed. Only after these tasks are performed in this order can the whole thing be repeated in a loop until the experiment is over. I don't know how to trigger the two DAQ Assistants (one for input, the other for voltage output) correctly. Is there a trick?

    Cheers

    Michael

    Hi Dave,

    Thanks, I wired up the error wires, but the timing issue was unrelated to that. In the DAQ Assistant, I simply had to switch from "Continuous Samples" to the "N Samples" acquisition mode for both the input and output voltage, and everything works fine now.

    Thanks again

    Michael

  • I have a DAQ Assistant configured to read several channels at the same time. When I wire a graph indicator to the output, I see all my signals mixed together. How do I split them into separate signals?

    I have a DAQ Assistant configured to read 2 channels at the same time. When I wire a graph indicator to the output, I see the 2 signals mixed together. How do I split them into separate signals?

    When I wire any other type of indicator, it only shows the output of a single channel.

    I want 2 indicators showing 2 different signals, as expected from the 2 configured channels. How do I do this?

    I tried to use Split Signals, but it ended up showing only 1 of the 2 signals on both indicators.

    Thanks in advance.

    Yes, you are right. I tried that, but I didn't get the result.

    I just found the way. When you drop the Split Signals function, you should expand it (the Split Signals icon) from the top, not the bottom. It took me a while to figure that out.

    Thank you
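    For reference, what Split Signals does here can be mimicked in text form (plain Python; the interleaved data layout, one (ch0, ch1) pair per sample, is a hypothetical stand-in for the dynamic data type):

```python
def split_channels(samples):
    """Split interleaved 2-channel data into one list per channel, so each
    list can be wired to its own indicator."""
    ch0 = [pair[0] for pair in samples]
    ch1 = [pair[1] for pair in samples]
    return ch0, ch1

ch0, ch1 = split_channels([(1.0, 4.0), (2.0, 5.0), (3.0, 6.0)])
```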

  • Consecutive calls to the DAQ Assistant

    Hello

    I'm working on something that is very probably simple.  Maybe the problems stem from a bad initial design choice.   The VI (and subVIs) are used to output a voltage, read another voltage, and react accordingly.

    First, the error I get is "Error -200547".

    Here's how the program works:

    1. MOVR.vi

    This generates two analog output signals, controlled by the same signal generator.  There is also a digital signal, but I don't think that's the problem.

    2. MTUL.vi (and MDTL.vi)

    These use MOVR and read another voltage.  Essentially, the output voltage is generated until the limit is reached, at which point it stops.

    These two work as expected on their own.

    3. IsoMeasure.vi

    This is where the problems occur.  Basically, this VI takes MTUL and MDTL and runs them in a for loop, changing the frequency on each increment.  The observed behavior is that MDTL runs, and then it tries to start MTUL; that is when Error -200547 is thrown.  The error code seems understandable, but "autostart" isn't clear to me when using the wizard.

    I would rather avoid using all-DAQmx code, but I will if I have to.  If that's the suggestion, a good example of that sort would be great.  If I can set autostart on the Assistant, I guess that would help as well.

    Thanks for any leads.  I think it should work.

    Hi drevniok,

    The reason you get this error is that you try to restart your DAQ Assistant several times in your application. One important thing to note is that a DAQmx task is configured and started only once, the first time the DAQ Assistant is called. Therefore, since you are stopping and starting the DAQ Assistant in your application, the second time you call the wizard it does not start the task. This is compounded by the fact that the DAQmx Write function inside the DAQ Assistant has its autostart input set to False.

    Using the DAQ Assistant for Analog Output returns an error-200547

    That being said, the DAQ Assistant is mainly meant as a quick and easy way to set up and use your DAQ hardware; however, it is a bit limited in functionality compared to the lower-level DAQmx VIs. This is a case that illustrates that limitation, and therefore I believe the best solution to this problem would be to use the DAQmx LabVIEW VIs. A lot of shipping examples can help you get started developing your application. These are found in the NI Example Finder under the Help menu. The example I want to point you to is Cont Gen Volt Wfm - Int Clk - Non Regeneration.vi, under Hardware Input and Output -> DAQmx -> Analog Generation -> Voltage.

    Another resource I want to point you to is Getting Started with NI-DAQmx, a collection of online tutorials on DAQmx programming.

    I hope this helps.

  • DAQ Assistant acquires data in segments

    I'm writing code that reads and records voltage, temperature, and pressure on a cDAQ-9174 using NI 9221 and NI 9173 cards.  The problem is that when the DAQ Assistant is set to N Samples, it outputs the data in blocks on the graphs.  I wish it were a continuous stream so I can see what is actually happening.  I tried changing it to Continuous, but then it either gets an error or produces about 16000 points in 10 seconds, which is a lot more than I would prefer.  The code I am using is borrowed from another person; with exactly the same setup it works on their computer, but not mine.  Does anyone have any suggestions on how to solve this problem?  I attach the code as well as the subVIs it uses.

    Thanks for all the help.  I didn't realize at first that the NI 9237 card has a minimum sampling rate of ~1600 Hz.  I am now able to sample at 2000 Hz and then use decimation to keep writing to my file at 200 Hz.
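    The decimation step in that follow-up can be sketched as a simple stride (plain Python; a careful design might low-pass filter first to avoid aliasing, which this sketch skips):

```python
def decimate(samples, factor):
    """Keep every `factor`-th sample: 2000 Hz data with factor 10 logs at 200 Hz."""
    return samples[::factor]

block_2khz = list(range(2000))          # stand-in for one second of 2 kHz data
block_200hz = decimate(block_2khz, 10)  # 200 points, one per 5 ms
```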

  • DAQ Assistant does not export the scaled values

    Hello all-

    Potentially stupid question, but here goes: I'm using the DAQ Assistant to read in 4 analog input voltages (continuous sampling, acquiring data 1 point at a time at 10 Hz), using LabVIEW 12 on a machine with a simulated USB-6341 data acquisition device (because my office is more comfortable than the lab!). I want to scale the first two voltage signals to temperature and humidity, respectively. I used "Create New..." in the "Custom Scaling" drop-down of the "Voltage Configuration" tab for each of these channels, named them, gave the slope and y-intercept for the respective linear scales, and clicked OK.

    When I test the code (and again, I'm not on a machine with a "real" DAQ system; I use a simulated device, and NI MAX seems to generate a long-period sine wave with a little noise on top for it), I do not get the scaled results for my "signal"; I get the raw voltage. (If you run my code, which I attach, the relative humidity should be between 0 and 100 and the temperature between -40 and 60, not 0 to 5, for example.)

    So, what's going on? Is there some flag or setting I missed? Does the scaling only work on "real" voltage data from a "real" DAQ instrument, rather than a simulated one (which is why I mentioned it twice!)? Do I have to do something in NI MAX as well as in LabVIEW?

    Thanks for any help you can give.

    John Easton

    Simulated devices will not respond to custom scales.  They are just meant to let you configure your device without errors when you do not have the unit on site.

    "NI-DAQmx simulated devices create a noisy sine wave on all analog inputs. Other simulated data set-up is not available at this time."

    http://www.NI.com/white-paper/3698/en

    They generate a sine wave with an amplitude equal to half of your specified input range.  If you want to work with simulated data that is more realistic for your application, you could write a VI to generate the data and use a case structure to handle both "simulated" and "real" modes; then you could switch between them depending on whether you have access to the hardware.

    I just checked this with a PCI-6254 I have installed and a simulated PCI-6254.
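    The custom linear scale being configured above is just y = slope * x + intercept. A plain-Python sketch (the slope and intercept values below are illustrative, not taken from the post):

```python
def linear_scale(volts, slope, intercept):
    """Apply a DAQ-style custom linear scale: scaled = slope * volts + intercept."""
    return slope * volts + intercept

# hypothetical 0-5 V humidity sensor mapped to 0-100 %RH (slope 20, intercept 0)
rh = linear_scale(2.5, 20.0, 0.0)
# hypothetical temperature sensor spanning -40 to 60 over 0-5 V (slope 20, intercept -40)
temp = linear_scale(0.0, 20.0, -40.0)
```

    On a simulated device the scale is simply never applied, so you would see the raw 0-5 V sine wave instead of these mapped values.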

  • Connection diagram missing in signal-generating DAQ Assistant blocks

    This is my first post so please excuse the quality of my description.

    When I double-click a DAQ Assistant block, there is no Connection Diagram tab I can access to see how things are wired up. I have an NI USB-6211 connected by USB, and it is used to control many different sensors and a power supply. Currently it works for everything and is hard-wired correctly, but only one of the DAQ Assistant blocks has a connection diagram available; the others do not. The one that has a connection diagram is used to measure a voltage. The others, which do not, are used to generate a signal. I would really like to be able to see the connection diagrams for each block.

    -Any help would be appreciated

    -Thank you

    You can always do like those who never use the DAQ Assistant and read the manual. Right-clicking the device in MAX and selecting "Device Pinouts" works too.

  • DAQ Assistant sampling question

    Hi all

    I have attached a VI that I use to record data from a few different voltage and strain channels. All the channels and sensors work perfectly and the VI reads the data fine, but it will only log for about 9 seconds. I found that the reason is that within that period, the set number of samples to read is reached. I also found that the cause is that, rather than sampling at the 50 or 100 Hz set on the DAQ Assistant, it samples at a rate of about 1.6 kHz (0.000620 seconds per sample). Nothing I do seems to change that sample rate, so if anyone has any ideas it would be much appreciated!

    Thank you very much

    Andrew

    I think I can explain what is happening. In a previous post, the OP mentions using a 9237, which has a minimum sampling rate of 1.63 kS/s. That matches the rate in the file.

    Andrew,

    Reading the manual would have answered your question.
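    The explanation checks out numerically (plain Python; 1.63 kS/s is the figure quoted above):

```python
def sample_period_s(rate_hz):
    """Sample spacing is the reciprocal of the sample rate."""
    return 1.0 / rate_hz

# a module coerced up to its 1.63 kS/s minimum yields roughly 0.000613 s per
# sample, close to the 0.000620 s spacing the original poster saw in the file
period = sample_period_s(1630.0)
```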

  • Timing and data collection problems with the DAQ Assistant

    Hello NI Developer Zone,

    I am a LabVIEW novice and have seen how helpful you all can be, so I have come to ask for your help.

    I'm having some trouble with a VI I built that takes an input voltage from an SCB-100 connected to a PCI-6031E and converts that voltage into a temperature displayed on a waveform chart. The goal is to give a constant reading of the temperature and display it on a chart for as long as the VI is running (and to reset the chart the next time the VI runs).

    The problems I've encountered currently are:

    - After a few minutes of the VI running, I get error 200279: Attempted to read samples that are no longer available. The requested sample was previously available, but has since been overwritten. (At the DAQ Assistant Express VI.)

    - I don't know how to change my chart so that the minimum X value is the time at which the VI was launched, with the maximum X value increasing on each loop iteration. Currently I have the VI get the system time and wire it to the X Scale property node. This worked for the voltage graph, but not for the temperature chart.

    I appreciate those of you who took the time to read my post.

    Thank you all for your help.

    Sincerely,

    Ethan A. Klein

    SB candidate in Chemistry & Physics

    Massachusetts Institute of Technology

    Class of 2015

    PS I enclose my VI to give you a better understanding of my current situation.

    E A Klein wrote:

    Thanks for writing.

    Which property node are you talking about?
    I don't understand all the many different data types. How should I go about processing all the data?

    (Did you mean I should wire the "blue" data to the math functions rather than using the voltage property node?)

    Sincerely,

    Actually, one of the property nodes.  I specifically mean the voltage property node.  But in reviewing it, I noticed the other property nodes for the chart.  Just set auto-scaling on the X scale and that should take care of two of the property nodes (right-click on the graph, X Scale -> AutoScale X Scale).  I also recommend placing your math functions in a subVI to make things easier to read.  Attached is what I think you're after.

    I hope these small tweaks will speed things up enough to avoid your error.  If not, then we should start looking at a producer/consumer design pattern, or at taking readings in parallel.  It might also be worth looking beyond the DAQ Assistant at the real DAQmx VIs.  But one step at a time.
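    A minimal text-language sketch of the producer/consumer pattern mentioned above (Python stdlib only; the "acquisition" is fake data, not DAQmx calls): one loop acquires, the other logs, so slow charting or file writing cannot stall the acquisition and trigger error 200279.

```python
import queue
import threading

def producer(q, n):
    for i in range(n):
        q.put(i)          # in a real VI this would be the DAQ read
    q.put(None)           # sentinel: acquisition finished

def consumer(q, out):
    while True:
        item = q.get()
        if item is None:
            break
        out.append(item)  # in a real VI this would be charting / file logging

q, out = queue.Queue(), []
t1 = threading.Thread(target=producer, args=(q, 100))
t2 = threading.Thread(target=consumer, args=(q, out))
t1.start(); t2.start()
t1.join(); t2.join()
```

    In LabVIEW the queue plays the same role as the Queue Operations VIs between two parallel while loops.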

  • About DAQ Assistant precision

    Hi all

    I am new to LabVIEW, so this could very easily just be a simple problem. I apologize that the VI is a little cluttered. The goal of the program is to measure the voltage of batteries in a circuit when a 5 V pulse is applied. It does this and almost gives me the results I need, but all the timing is a bit off. For example, the graph connected to the data shows that the pulse starts at about 100 ms, when I specified 150 ms, and even that shifts a little with each run, usually no more than +/- 5 ms, but I feel the equipment should be much more precise than that. I also noticed that it will not produce the 16 ms pulse I asked for. The shortest it seems to manage is about 23 ms, and again, that varies a little with each run. The big thing here is that it is not as accurate as I know it should be. I looked at using DAQmx rather than the DAQ Assistant, but that doesn't really seem to make a difference. What could I be doing wrong or overlooking? I use an NI 9264 to produce the signal and a 9205 to read it. Thanks in advance for the help!

    Thank you very much

    Bradley

    Your problem is that you rely on Windows timing.  It's NEVER accurate.  How accurate do you need the timing to be?  You should configure the analog output task for multiple points and provide a waveform for the output, instead of just setting point values in software.
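    The advice above, precompute a waveform and let the hardware sample clock place the edges, can be sketched in plain Python (the rates and times below are illustrative, not from the post):

```python
def pulse_waveform(rate_hz, delay_s, width_s, total_s, low=0.0, high=5.0):
    """Build one output sample per clock tick: `low`, then `high` for
    width_s starting at delay_s, then `low` again.  Driven by a hardware
    sample clock, edge placement is accurate to one sample period
    (e.g. 0.1 ms at 10 kHz), instead of Windows scheduling jitter."""
    n = int(round(total_s * rate_hz))
    start = int(round(delay_s * rate_hz))
    stop = int(round((delay_s + width_s) * rate_hz))
    return [high if start <= i < stop else low for i in range(n)]

# a 16 ms, 5 V pulse beginning 150 ms into a 300 ms buffer, at 10 kHz
wf = pulse_waveform(10000, 0.150, 0.016, 0.300)
```

    The whole buffer is then handed to the output task once, so no software loop decides when the pulse starts or ends.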
