VI to convert NI 9402 input signals to an RPM value, based on the frequency of the pulses


I'm looking for a VI to convert an NI 9402 input signal to an RPM value, based on the frequency of the pulses. Does such a thing exist in the National Instruments library?

I run LabVIEW 2014 integrated control and monitoring on a cRIO-9082 high-performance integrated system with an NI 9402 4-channel LVTTL ultra-high-speed digital I/O C Series module.

Any help would be greatly appreciated.

The easiest way is to use the FPGA to measure the time between edges of your pulse train (shift registers to hold the previous state and timestamp will be needed).  This will give you the period.  If it's a single pulse per revolution, then the RPM is just 60/T, where T is the period in seconds.

Tags: NI Hardware

Similar Questions

  • How to compare analog input signals?

    Hi all

    I use a PCIe-6363 DAQ to collect analog input signals. The input mode is continuous, single channel, N samples. The sampling rate is 2 MS/s, with 100 kS or fewer per read. This means the DAQ collects 100 kS and draws a line/curve. I want to compare the two curves. The problem is that the DAQ collects data and plots continuously. Could you please tell me whether it is possible to compare curves during this continuous operation? The main goal is to verify whether or not the incoming signal stays consistent.

    Thank you very much


    You can store a waveform in a shift register.  Then you can compare the waveform held in memory to the new waveform.

  • No input signal (4-20 mA expected) on NI 9208

    Hello world.

    I'm having trouble getting the analog input signal (4-20 mA) using an NI 9208 mounted on a cDAQ-9188.

    Here is the configuration:

    - pressure transmitter, 2-wire connected (IN: +24V) (OUT: NI 9208 pin AI6) (powered from an external source sharing 0V and ground with NI 9208 pin COM)

    - NI 9208: Vsup = +24V (pins 19 & 37); COM = 0V = GND (pins 9, 10, 28, 29); AI6 = transmitter pin

    [Clip, a drawing of the wiring]

    - NI-DAQmx driver 9.3

    - LabVIEW 10

    -Windows 7

    Using a DAQ Assistant set up to acquire the analog current signal, the value I get on screen fluctuates between 0.001 and 0.1 mA.

    I inserted an in-line measuring device between the transmitter OUTPUT pin and the 9208 AI6 pin, and there I actually get the 4-20 mA signal.

    As additional information, I measure 28V between Vsup and COM; 0.3V between AI6 and COM; and 27.7V between Vsup and AI6.

    To make it as clear as possible, I might add that at first every input channel was connected to a transmitter, each showing a value of ~0.01 mA on screen (the in-line ammeter test on each channel, the next step in my investigation, had not yet been performed). Then I tried to isolate the problem by connecting a single transmitter (AI6). The result is always the same.

    As the NI 9208 functioned well a few months earlier, I think the problem comes from a transmitter that may have been wired incorrectly. (I'm currently checking all connections.)

    My question would be: could a component of the 9208 (the ADC?) be damaged? Or is there something I misunderstand?

    Any piece of advice would be greatly welcomed.

    Also, please forgive my approximate English.

    Thank you for your consideration.

    Adrien QUEUCHE

    Well, that was my mistake. I was reading mA where I should have read A...

    To any administrator: this thread may be deleted, I guess. Unless I deserve to be ashamed for eternity...

    Thank you.

    Adrien QUEUCHE

  • Error, input signal is out of range on installing Windows 7

    Hello again. I'm installing Windows 7 on my mother's computer, but during installation, when the computer restarts, the screen does not work; it says "input signal out of range". Could this be because the video card has no driver? We do have a disk with the driver, though. Thanks for any help.

    I want to thank you all. I solved the problem: I connected the PC to the TV via a VGA cable, Windows started normally, and I went to Control Panel, then System and Security, then Administrative Tools, then System Configuration. When the System Configuration window opened, I went to the "Boot" tab, clicked "Base video", applied the settings, clicked "OK", and restarted the computer. When the computer restarted, I unplugged the TV and plugged the VGA cable into the monitor. I waited for Windows to load, and it loaded successfully with everything fine on the monitor. I changed the screen resolution to the highest my monitor supports. And now everything is fine. Thank you.

  • Count the number of 1s present in digital waveforms obtained by converting pulse signals


    I use Analogtodigital.vi to convert the pulse sequences into digital signals. I am able to get the digital waveform representation of the pulses.

    But how do I count the number of 1s present in the converted digital waveform?

    Thanks in advance.

    Have you tried a block diagram similar to the attached one?

    It creates an uncompressed 2D array of 1s and 0s, which is the 16-bit binary A/D conversion of each element in the Y array of the input waveform. You can use DWDT Digital to Boolean to convert it to a 2D Boolean array. Then convert the Boolean values to 1/0 and sum the integer array. The sum should be the number of 1 bits in the digital waveform.


    Note: The attached VI is saved in version 8.6. When I saved it for a previous version, a warning was generated about possible differences between versions. Let me know if it doesn't work, and which LV version you are using.

  • Monitors flicker and "Input Signal Out of Range"


    I have a Mac Mini (late 2012), 2.3 GHz Intel Core i7, running up-to-date El Capitan, with two HP 2071d monitors.

    These monitors have DVI connections. One is connected with a Mini DisplayPort to DVI adapter, and the other with an HDMI to DVI adapter (provided with the Mac Mini).

    Everything was fine for a year, until this week. Both monitors:

    (1) intermittently began flickering with scrambled images

    (2) flickering intermittently at black or white noise

    (3) intermittently flickering white noise

    (4) show the message "Input Signal Out of Range. Settings 1600 x 900 at 60 Hz."

    Somehow the problem comes and goes, but today it has mostly persisted.

    After research, I tried to start safe mode, but the problem persists.

    I read that putting the computer to sleep and then waking it could solve this problem... it worked a couple of times, but the issue soon returned.

    I tried resetting NVRAM/PRAM and SMC several times.

    I double-checked all the connections.

    Under System Preferences > Displays (when the image is stable long enough for me to see), everything looks OK. Under the Display tab, "Default for display" is selected. On the Color tab, three HP2071d profiles are listed (with one selected) in the display profiles window. I tried selecting each of the other "HP2071d" profiles listed, with no noticeable change.

    I also reset both monitors to factory settings via the menu screen of the monitors.

    I also came across the Mac mini EFI Firmware Update 1.7 for flickering on late 2012 Mac Minis, but when I tried to run it I got "this software is not supported on your system." Ugh. It turns out my Mac Mini is already updated to MM61.0106.B0A (2015-002).

    My Information System > Hardware > graphics/display info is:

    Graphics Intel HD 4000:

    Chipset model: Intel HD Graphics 4000

    Type: GPU

    Bus: integrated

    VRAM (dynamic, Max): 1536 MB

    Vendor: Intel (0x8086)

    Device ID: 0x0166

    Revision ID: 0x0009


    HP W2071d:

    Resolution: 1600 x 900 at 60 Hz

    Per pixel: 32-bit color (ARGB8888)

    Display serial number: 6CM3171PFF

    Mirror: On

    Mirror status: Hardware mirror

    Online: Yes

    Rotation: Supported

    HP W2071d:

    Resolution: 1600 x 900 at 60 Hz

    Per pixel: 32-bit color (ARGB8888)

    Display serial number: 6CM3170P31

    Main screen: Yes

    Mirror: On

    Mirror status: Master mirror

    Online: Yes

    Rotation: Supported

    I am out of ideas for what else to check or try.

    Thanks for any assistance that anyone can provide.

    bdcs64 wrote:


    I have a Mac Mini (late 2012), 2.3 GHz Intel Core i7, running up-to-date El Capitan, with two HP 2071d monitors.

    These monitors have DVI connections. One is connected with a Mini DisplayPort to DVI adapter, and the other with an HDMI to DVI adapter (provided with the Mac Mini).

    Quite the same configuration except here, with 2 Samsung S22B310 DVI monitors on an i5 2.5 GHz in late 2012.

    Everything was fine for a year, until this week. Both monitors:

    (1) intermittently began flickering with scrambled images

    (2) flickering intermittently at black or white noise

    (3) intermittently flickering white noise

    (4) show the message "Input Signal Out of Range. Settings 1600 x 900 at 60 Hz."

    Somehow the problem comes and goes, but today it has mostly persisted.

    Both monitors failing isn't good. That would lead us to think the Intel HD 4000 is overheating and the problem is not with any of the adapters or cables. Although, if one of the adapters were failing internally, it could put additional heat stress on the graphics chip.

    Personally, I would try a few things.

    1. Turn off, unplug everything, open the bottom cover, and give it a good cleaning to make sure it cools properly.

    2. Try each monitor individually for a while, in different configurations, to see if the problem follows the monitor/cable/adapter setup.

    (I know it will be a drag after getting used to dual monitors.)

    3. Try using MacFanControl to monitor and help control the GPU ITPP temperature.

    FWIW, I do not use screensavers, but I do watch a lot of videos and play some light games on both my 2010 and 2012 Minis, and I really believe MacFanControl has helped extend the graphics lifetime on both systems.

    I use sensor-based control on the GPU ITPP sensor, configured so that it slowly starts ramping up fan power above 129 °F.

  • "Input signal out of range" message when turning on PC - HP S2031 monitor


    I have an HP S2031 monitor that I got a few weeks ago.  After that, I got a new video/graphics card for the computer (because the original card couldn't handle 1600 x 900 at 60 Hz).  However, I just noticed that when I turn on the monitor and the PC - first, before Windows starts - there is a message indicating that the input signal is out of range of the settings, at 1600 x 900 at 60 Hz.

    Now, I don't remember seeing this message when the original video card was in the PC ~ but I also don't always stay right there while Windows starts, so I can't say with certainty that the message did not appear with the original video card.

    I went into the monitor's on-screen menu, and the settings are set for 1600 x 900 at 60 Hz; I also right-clicked on the desktop and checked the display settings there, and everything is set correctly.

    This isn't a problem unless I'm trying to access the BIOS.    I tried to go into the BIOS to see what options there are for video configuration, and the only video configuration options were AGP Aperture Size (64 MB), Primary (AGP) graphics card, and AGP hardware detected - AGP card. The only problem was that by the time I got that far into the BIOS, the monitor had gone to sleep on me, which makes it difficult to exit the BIOS and boot the computer normally with the monitor asleep.

    Does anyone have an idea why I'm getting that message about the input signal and how to fix it, as well as how to stop the monitor from going to sleep while I'm in the BIOS?

    I am running Windows XP Professional with SP3 and all updates, and the monitor is the HP S2031.  The graphics card is an NVIDIA GeForce FX5200, and the monitor is connected to the PC through the monitor's DVI port, since this card has only 2 DVI ports. I had an older card in the PC (which had a VGA port), but it could not display 1600 x 900 (and I'm not sure whether the message appeared with the old card, since most of the time I'm not just sitting there waiting for everything to start).

    I didn't notice the message until today, and nothing has changed - other than the new video card, which has been in the PC for about 2 weeks now.

    Any help would be much appreciated.


    Sorry it took so long to get back to you, but things have been very busy here (between getting hit with a foot of snow before Halloween and losing power for a week, things are still getting back to normal).

    In any case, as for the video card - it is an ASUS NVIDIA GeForce FX 5200 card which has 2 DVI connectors on it.  According to the information on the card:


    Expansion / connectivity: Compatible slots: 1 x AGP. Interfaces:

    1 x S-video input - 4-pin mini-DIN (with adapter), 1 x S-video output - 4-pin mini-DIN (with adapter), 1 x composite video input - RCA (with adapter), 1 x composite video output - RCA (with adapter), 2 x DVI-I - 29-pin combined DVI, VGA - HD D-Sub (HD-15) 15-pin (with adapter)

    Anyway, I tried plugging the monitor into the other DVI port, and it now works fine.  I guess one port is bad, or it is not intended for the connection used with this monitor.

    Thanks for the help and suggestions.  They were very much appreciated.

  • PXIe-4304 causing ripple on the input signal?

    I have a PXIe-1082 chassis that contains a PXIe-6341 and a PXIe-4304 module.  To check my code, I connected the analog input (channel 0) of the 4304 to a digital output (PFI 12) of the 6341.  My VI shows a 0.2 Vpp ripple on the analog input that I do not see using a scope.

    The wiring is PXIe-6341 [PFI 12, DGND] -> SCB-68A -> TB-4304 [AI0+, AI0-] -> PXIe-4304

    I have attached photos of the scope trace versus the VI graph.  The scope is across the AI0+ and AI0- terminals of the TB-4304.

    Is there additional grounding I should use, or is it normal for the 4304 to add this ripple?

    Thank you


    The short answer is that there is nothing wrong with what you see.

    You have connected a low-impedance digital output signal to a high-input-impedance analog digitizer. Since a digital signal is essentially a time-varying square wave, and square waves have transition edges that contain very-high-frequency information, you will almost always see some form of "ripple" (see the Fourier-synthesis animation of a square wave on the relevant Wikipedia page). A digital output signal is designed for timing and slew rate rather than for being a perfect square wave.

    In addition, you may see extra "ripple" because of differences between the PXIe-4304 and the scope you used. The scope may have a combination of a higher input bandwidth (various causes, such as a low sampling frequency on the 4304, mean the scope captures higher-frequency information and sharper-looking transitions) and possibly a lower input impedance (causing less, if any, signal reflection, which would otherwise make the signal ring).

  • Python DAQmx: triggering an output reaction from an input signal

    Hello world

    I use an NI 6251 board with a Python GUI.

    I want to send data on the output channel on each falling edge of the input signal when I click on a start button.

    The output signal level may be 5 V (logic 1) or 0 V (logic 0).

    The input signal is a 2 MHz square wave with 50% duty cycle (see the attachment for more information).

    I don't know if it is better to use the analog inputs or the counter input for the input signal. I think I have to set a clock pulse at a frequency of 10 MHz.

    I tried several solutions with bad results.

    Does anyone have the answers to my problem?

    Thanks in advance.

    Thanks for your reply.

    It works...

  • USB-6211: analog input signal affecting another AI on the same board


    I use the NI USB-6211 DAQ board and DAQmx functions to read a hammer and an accelerometer signal, and then use other LabVIEW functions to compute the FFT of these analog input signals. However, it seems that the analog inputs where the hammer and the accelerometer are connected generate a kind of noise, or influence other inputs of the acquisition board that are not connected to any sensor.

    I've run different experiments to check whether the problem is with the board's readings: I put the accelerometer and struck the hammer on a different table from where the DAQ board was located (to avoid vibrations on the board and possible noise); input ai1 was connected in differential mode to the hammer, and input ai4 was connected to the output (z axis) of the accelerometer. The other 2 inputs, ai2 and ai3, which can also be read by my LabVIEW program, are open (i.e., no sensor is connected to the board). When the structure where the accelerometer sits is struck by the hammer, the ai2 signal ("x axis" in the first attached document) shows a curve (in the time domain) that starts almost at the same time as the hammer's, and input ai3 shows a weak signal, but with the same swing as the ai4 signal. The document "hammer ai1 + z_axis connected_ _x_axis disconnected ai2 + y_axis ai3 ai4" contains images I captured of the charts created in LabVIEW. On these graphs, it is possible to see on the FFT that the ai3 and ai4 signals have the same behavior (with different intensities), and in the enlarged time-domain image, we can see that the ai2 signal rises almost at the same time as the hammer signal (ai1). The signals picked up by the sensors are probably creating a sort of noise on the open inputs ai2 and ai3.

    Another experiment was conducted to check whether the signal from a single input affects the signals read from nearby inputs: in the DAQmx Create Channel task, one physical channel was changed: input ai3 was replaced by ai7 (keeping the same connection mode: differential), and the results are visible in the second attached document. In the graphs from this experiment, it seems that the hammer input (ai1) affects the signals of inputs ai2 and ai7, which are not connected. And the ai4 signal does not seem to influence the other inputs, because it has a different curve on the FFT graph.

    The same experiment was conducted using the RSE connection (changing the wires and the DAQmx Create Channel configuration), but the results were the same as those found using the differential connection.

    Finally, if the output of the accelerometer is connected to ai2, the signals of the other open inputs ai4 and ai7 seem to be affected by the accelerometer signal on ai2 (last attached document).

    Could you tell me, from the information I gave you, whether the problem I encounter is caused by the DAQ board? And if the answer is yes, do you know if there is a way to avoid this noise being created on one input by another, please?

    Thank you

    Maybe ghosting or crosstalk? Just an idea.

  • In LabVIEW 8.5, what happens if the input signal exceeds the signal range defined in the DAQ Assistant?

    Hi all

    This should be a pretty simple question, but I can't seem to find the answer online and currently do not have the functionality to test this:

    I'm using LabVIEW 8.5 and have a VI that imports sensor data through the DAQ Assistant. In the configuration tab, there is a signal input range. What happens if my sensor exceeds this range? Do I get a warning? Does the value default to the maximum (or minimum)? I'm interested in writing code to display an error as I approach the limits of this range, but I didn't know if I also need to include code to display an error if the range is exceeded as well.

    Thanks for the help,


    Hello Tristan,

    The behavior depends on the selected range and the device you are using.

    If you are using a device with a single valid input range, that range will be used even if you set a smaller minimum and maximum in the DAQ Assistant.  So, if your device only supports ±10V and you set the range to ±8V, you will still keep getting valid data after your sensor goes above 8V, until it approaches 10V.  When you reach the limit of your device's range, the output will "rail" and simply return the maximum value until the signal drops below that maximum again.

    Note: A device that is nominally ±10V usually has some overrange (such as ±10.2V), which is usually specced in the manual.

    However, if you use a device with several input ranges, then things become more complex.

    The NI-DAQmx driver will choose the smallest range that entirely covers the interval you choose.  For example, suppose your device supports the following input ranges: ±0.2V, ±1V, ±5V, ±10V, and you choose 0V - 3V as the range in the DAQ Assistant.  The NI-DAQmx driver will look at the requested range and the list of input ranges your hardware supports, and choose the smallest one encompassing the entire range you set.  This would be ±5V, because it is the smallest range that contains up to 3V.  Thus, any input signal between ±5V is returned, and anything outside this range will "rail" to the maximum or minimum value.

    We do this because smaller ranges make more efficient use of the resolution of the ADC.  So, we try to use the most effective range based on what you ask for, without picking a range that would make you miss data.

    Let me know if I can clarify it more.

  • -200141: Data was overwritten before it could be read by the system. If the data transfer mechanism is interrupts, try using DMA or USB Bulk. Otherwise, divide the input signal before taking the measurement.


    Setup:

    2 x PCI-6602


    Sampling five 50 kHz PWM signals using five counters (2 on one board and 3 on the other) for about 10-15 seconds, logging continuously.

    All counter tasks are configured for DMA transfer.


    I get -200141 errors from time to time.


    I tried increasing the buffer size, and all counter tasks are set to DMA. In the error message, the last suggestion is to "divide the input signal before taking the measurement". I don't understand this suggestion. What is meant by "divide the input signal before taking the measurement"?

    I am open to other solutions to the problem.


    Yes, I know that 2 MB/s doesn't sound like much, but this is a high-overhead, very-low-tolerance way of trying to get 2 MB/s.  You have 5 DMA controllers negotiating access to the bus, and each transfers only 1 or 2 32-bit samples whenever it gets access.

    I've seen published benchmark data where the maximum sustained rate was < 1 million/sec (don't recall if it was MBytes or MSamples).  As I recall, finite acquisition mode allowed higher rates for shorter durations.

    Ah yes, here is a link that leads to the other links.  See the section on "the counter FIFO" in the first message.  You'll see a *very* significant difference in performance between the M series and the X series.  There are also data for E-series counters.  (It is fair to note that the benchmarking was done with much older PC hardware.)  The 6602's counter chip was designed between the E series and the M series, so you can probably expect performance in between.

    Also note that the benchmarks seem to have been done with a single task trying to own all the PCI bandwidth it could.  Since you would have 5 tasks negotiating access, you definitely lose even more to overhead.  In addition, for a fair comparison, your 50 kHz PWM acts like a 100 kHz measurement, since you have 2 semiperiods to capture per 50 kHz cycle.

    Now that I've seen the benchmarks again, I am convinced that it is a no-go for you with just the 6602.  The good news is that the X series seems capable of even more ridiculous rates than I remembered.

    -Kevin P

  • I'm using a USB-6009 for input signal acquisition

    I'm acquiring a DC signal on an analog input (RSE). An input voltage of 1.9 V is read on the input port (ai0) when I use a multimeter, but at the same time I used an indicator in my code to show the data read on the input port, and the indicator shows 2.5 V, sometimes 4 V... If anybody could help me with this, please answer...

    I have another doubt too: I read a document which mentioned that, when using RSE, the positive end of the source must be connected to the analog input channel and the negative end to ground... but since I'm using a DC signal, I don't know how to connect it or what to do...

    Hi Lisette,

    I don't know what could be happening. However, this could be a wiring problem.

    First of all, to read this signal, I would recommend tying the analog GND to your digital GND. You should also wire the ground of the signal.

    Another thing you could do is measure in differential mode; you can connect your NI 6009 the same way as the multimeter.

    Here's a tutorial that may be useful for you:

    Here are also a few KBs talks about what concerns you. Take a look to see if something, it might help.

    Finally, don't forget to use an op-amp in follower configuration to avoid damaging your 6009.

    I wish you success!

  • The NI 9215 input signal voltage

    Hi all

    I tried to test an NI 9215 BNC in MAX, and the data acquisition ran without any input signal.

    The voltage I read on all 4 channels is about 10.4 volts.

    Is this right? I thought the voltage should be 0 without an input signal.

    Best regards


    If you leave the input open, it will tend to float to one of the rails.  In my experience, it's almost always the top one, hence the 10.4 volts.

    If you short-circuit the input, you'll certainly get 0 volts.  You will probably also get 0, or close to it, if you put a resistor across the input.

  • How do I get the analog input signal and send it to the analog output (real time)?

    Hello world

    I'm doing a simple task in Visual C++ and I use a PCI-6221 (37-pin).

    Basically, I want to send the same "analog input" signal to the "analog output" at the same time (or almost), to make a real-time application.

    Can someone provide me with a sample program, please?

    I would be grateful if you could point me to a good tutorial that explains step by step everything about NI-DAQmx for C/C++ programming.

    Best regards


    Here is my code in C++; you can optimize it if it seems too messy. This code reads the analog input signal and exports it through the analog output.

    To make this code work, the NI include directories and library directories must be added to the project.

    I hope it helps someone.

    #include <stdio.h>
    #include <conio.h>
    #include "NIDAQmx.h"

    #define DAQmxErrChk(functionCall) if( DAQmxFailed(error=(functionCall)) ) goto Error; else

    int main(int argc, char *argv[])
    {
        Int32       error = 0;
        TaskHandle  taskHandleRead = 0, taskHandleWrite = 0;
        Int32       read = 0;
        float64     dataRead[1000];
        char        errBuff[2048] = {'\0'};
        bool32      done = 0;
        Int32       written;

        // AI task: Dev1/ai0, continuous sampling at 100 S/s
        DAQmxErrChk(DAQmxCreateTask("", &taskHandleRead));
        DAQmxErrChk(DAQmxCreateAIVoltageChan(taskHandleRead, "Dev1/ai0", "", DAQmx_Val_Cfg_Default, -10.0, 10.0, DAQmx_Val_Volts, NULL));
        DAQmxErrChk(DAQmxCfgSampClkTiming(taskHandleRead, "", 100.0, DAQmx_Val_Rising, DAQmx_Val_ContSamps, 0));

        // AO task: Dev1/ao0, clocked by the AI sample clock
        DAQmxErrChk(DAQmxCreateTask("", &taskHandleWrite));
        DAQmxErrChk(DAQmxCreateAOVoltageChan(taskHandleWrite, "Dev1/ao0", "", -10.0, 10.0, DAQmx_Val_Volts, NULL));
        DAQmxErrChk(DAQmxCfgSampClkTiming(taskHandleWrite, "ai/SampleClock", 100.0, DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000));

        DAQmxErrChk(DAQmxStartTask(taskHandleRead));
        DAQmxErrChk(DAQmxStartTask(taskHandleWrite));

        // Copy each sample from the input to the output until a key is pressed
        while (!done && !_kbhit()) {
            DAQmxErrChk(DAQmxReadAnalogF64(taskHandleRead, 1, 10.0, DAQmx_Val_GroupByScanNumber, dataRead, 1000, &read, NULL));
            DAQmxErrChk(DAQmxWriteAnalogF64(taskHandleWrite, read, 0, 10.0, DAQmx_Val_GroupByChannel, dataRead, &written, NULL));
        }

    Error:
        if (DAQmxFailed(error))
            DAQmxGetExtendedErrorInfo(errBuff, 2048);
        if (taskHandleRead != 0) {
            DAQmxStopTask(taskHandleRead);
            DAQmxClearTask(taskHandleRead);
        }
        if (taskHandleWrite != 0) {
            DAQmxStopTask(taskHandleWrite);
            DAQmxClearTask(taskHandleWrite);
        }
        if (DAQmxFailed(error))
            printf("DAQmx error: %s\n", errBuff);
        printf("End of program, press the Enter key to quit\n");
        getchar();
        return 0;
    }
