LabVIEW shunt measurement question

Greetings,

I'm working on a shrimp pump. I'm having trouble measuring the voltage across a shunt once the supply voltage reaches a certain point (~13 V or more).

As seen in the attached Excel worksheet, the reading takes a dive until the voltage measured on another channel drops below 13 V (that channel is actually read through a 1/5 voltage divider).

Hardware:

NI cDAQ-9184

NI 9211

NI 9402

NI 9207

R shunt = 0.0005 ohms

I have attached the Excel data file; the incorrect current measurements appear in red. The same thing happened on another pump as well.

When the input voltage is greater than 13.0 V (on the 1st channel of the NI 9211), the measured shunt voltage difference (on the 2nd channel of the NI 9211) jumps down in value.

This ruins my data collection above 13.0 V.

I'm looking for advice on how to solve the problem. I'm not sure whether this is a hardware problem or something in the software that needs to be changed.

The supply voltage is read through a voltage divider, and it has been accurate to date.

Thanks in advance,

-Tom

It is the voltage at the DAQ input that matters here.

For differential inputs, it is not just the differential voltage that matters.  Each input of the differential pair must be less than 10.2 V with respect to ground.
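A quick back-of-the-envelope check (sketched in Python, using the values from the post and the 10.2 V per-input limit quoted above) shows why the supply rail, not the tiny shunt voltage, is the number to watch:

```python
# Values from the post; the 10.2 V limit is the per-input-to-ground
# figure quoted in the answer above.
R_SHUNT = 0.0005      # ohms
V_SUPPLY = 13.0       # volts, where the readings start to fail
CM_LIMIT = 10.2       # volts to ground, per differential input

def shunt_current(v_shunt):
    """Current through the shunt from the measured differential voltage."""
    return v_shunt / R_SHUNT

def input_within_limit(v_to_ground):
    """True if an input sitting at v_to_ground stays inside the limit."""
    return abs(v_to_ground) <= CM_LIMIT

# The divider channel is fine: 13 V / 5 = 2.6 V to ground.
print(input_within_limit(V_SUPPLY / 5))   # True
# But a shunt riding on the 13 V rail exceeds the per-input limit:
print(input_within_limit(V_SUPPLY))       # False
# For scale: a 10 mV differential reading is 20 A through the shunt.
print(shunt_current(0.01))                # 20.0
```

If the shunt sits at the positive rail, both differential inputs ride near the full supply voltage, so the reading breaks down right around 13 V even though the shunt voltage itself is only millivolts. Moving the shunt to the low side, or dividing down the common-mode voltage, would be the usual directions to investigate.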

Tags: NI Software

Similar Questions

  • Question about support for LabVIEW DLLs and Unicode

    Hello

    I have a question about LabVIEW and DLL functions calls.

    I use a DLL (sorry, I can't share it) that was written in C. It was written to support Unicode and non-Unicode function calls.

    If Unicode is enabled, FunctionNameW is called; otherwise, FunctionNameA is called.

    I am building a few VIs to access the library. I have the regular FunctionNameA functions working.

    My question is: does LabVIEW support the Unicode FunctionNameW versions, and if so, is it even necessary, given that LabVIEW is already working with the standard function calls?

    Would building in Unicode support be redundant, or is it something I should do?

    The first time I tried to test the Unicode functions, I got an error; I suspect it is a system setting.

    Thank you for your time in advance.

    DB_IQ wrote:

    I don't think I HAVE to implement Unicode, but I'd like to if I can.

    For what I'm doing, it probably doesn't matter much. But I wanted to know whether it could be used.

    The short answer is "Yes, you can do it."  However, it may open a Pandora's box.  If you're not careful, problems and complications can spread even to other projects that are not using Unicode!  It is better not to summon this monster unless there is absolutely no other way to do the job.

  • Absolute beginner question on connecting a GPIB device to LabVIEW

    Hi all

    First, sorry if my question has already been posted, but my searches turned up nothing.

    My situation: I have an HP digital oscilloscope, an HP 54602B, using GPIB, and a trial version of LabVIEW 8.6. Now I want to communicate with it from LabVIEW. The big question is: how? Before this, I had only used LabVIEW with NI DAQ cards and serial-port instrument communication.

    The detailed questions:

    1. Does it matter (for programming in LabVIEW later) which GPIB interface I use in my computer? Does it have to be from NI? I don't have an interface yet.

    2. I searched for the oscilloscope on the NI website and found an instrument driver at:

    http://sine.NI.com/apps/UTF8/niid_web_display.download_page?p_id_guid=E3B19B3E93F7659CE034080020E748...

    How do I install and use it?

    3. I know there are a few examples for GPIB, but since I do not have the interface, I can't try them. Is it possible to simulate GPIB instruments?

    Thanks for all your help and assistance

    Hi opiq;

    Welcome to the world of instrument in LabVIEW Control!  I'll try and answer your questions one-at-a-time:

    1. No. From a LabVIEW point of view, you can use any interface supported by VISA and your instrument's bus.  This includes (but is not limited to) NI GPIB boards, 3rd-party GPIB, USB, LAN, LXI, VXI, PXI, etc.  Of course, if you do not yet have a GPIB interface, I would recommend an NI one; this is the safest way to get the "it just works" experience.  I tend to use a GPIB-USB adapter (DISCLAIMER: I work for NI).
    2. You have two options for installing the instrument driver.  The simplest is, in LabVIEW, to select Tools»Instrumentation»Find Instrument Drivers... and use our instrument driver search tool, which will step you through locating, downloading, and installing the driver.  The other option is to download the zip file, unzip it into \National Instruments\LabVIEW 8.6\instr.lib, and restart LabVIEW.  Then, if you look in your instrumentation palette, you should see the driver listed under Instrument Drivers.  The driver includes some examples of its use (all certified instrument drivers do).

      In fact, I would recommend this driver rather than the one you linked.  It is a more modern design and installs examples into the Example Finder (Help»Find Examples...).

      Here's a video of the above process (for a different instrument, but the process is the same).

    3. Unfortunately, there is no easy way to simulate this instrument.  I would just recommend familiarizing yourself with the examples before acquiring a GPIB interface.
  • Question on lock-in amplifier design in LabVIEW

    Hi all

    I use a myDAQ with the NI DAQPad-6259 pinout for the input signal measurement, and I want to build a lock-in amplifier in LabVIEW to measure the input signal. Does anyone have tutorials or videos that would help me build this? I'm not using a toolkit; I'm trying to figure out how to build it and why it works the way it does.

    Thank you.

    Look here:

    http://www.NI.com/white-paper/5613/en/

    http://forums.NI.com/T5/dynamic-signal-acquisition/lock-in-amplifier/TD-p/1057126

    http://forums.NI.com/T5/LabVIEW/lock-in-amplifier-using-NI-USB-6251/TD-p/1550972

    http://forums.NI.com/T5/dynamic-signal-acquisition/lock-in-amplifier-and-DAQmx/TD-p/282419
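The linked discussions describe the core of a lock-in amplifier: multiply the input by in-phase and quadrature references at the reference frequency, then low-pass filter the products. A minimal numerical sketch (plain Python; simple averaging stands in for the low-pass filter, and all signal values are illustrative):

```python
import math

def lock_in(signal, fs, f_ref):
    """Digital lock-in: multiply by quadrature references and average."""
    n = len(signal)
    i_sum = q_sum = 0.0
    for k, x in enumerate(signal):
        t = k / fs
        i_sum += x * math.sin(2 * math.pi * f_ref * t)
        q_sum += x * math.cos(2 * math.pi * f_ref * t)
    i_avg, q_avg = i_sum / n, q_sum / n
    # Factor 2: averaging sin^2 over full periods yields half the amplitude.
    return 2 * math.hypot(i_avg, q_avg)

# Recover a 0.1 V signal at 100 Hz buried under a large DC offset.
fs, f_ref, amp = 10000.0, 100.0, 0.1
sig = [amp * math.sin(2 * math.pi * f_ref * k / fs) + 5.0
       for k in range(10000)]
print(round(lock_in(sig, fs, f_ref), 4))  # 0.1
```

The LabVIEW version is the same data flow: a reference sine generator, a multiply node per quadrature, and a low-pass filter VI in place of the averaging loop.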

  • LabVIEW model IO question with VeriStand 2013

    During a project I'm working on, we decided to update to NI LabVIEW 2013 and VeriStand 2013. For this project, we run a combination of models: .lvmodel files and .dll files (compiled Simulink models). These models are deployed to an RMC-8354 using VeriStand. Since this update, we have had problems deploying our LabVIEW models successfully. I was able to reproduce the problem with a very stripped-down, isolated version, which I attach. In this model there is a numeric control, a Boolean control, and a cluster containing a numeric control. There is also an indicator for each corresponding data type. All numeric values are double precision. The model block diagram is empty except for the controls and indicators; there are no wires or other elements. Without the controls and indicators connected, the model deploys fine. If I connect the 3 controls and 3 indicators of the different data types, I get an error; the deployment log is attached to the post.

    I am also attaching the VeriStand project files, the model VI, and the built version of the model.

    I wonder whether anyone can reproduce this, or has already run into this or a similar problem? Does anyone have ideas on what may have caused this error or how to solve it?

    A patch is also available as a manual download here. More information on patches can be found here.

  • LabVIEW for continuous control of a camera and a light meter (Minolta T-10A)

    Hello world

    I'm totally new to data acquisition software, but I've found that I should use LabVIEW for continuous measurements (15-minute intervals for 1 month or maybe more). The thing is, I need to take pictures simultaneously (Canon EOS camera) along with illumination readings (in lux) from the device: a Minolta T-10A, Vision series T-10MA. My question is which LabVIEW software I (or rather my institution) need to buy in order to do this. I've looked at the LabVIEW software options and cannot figure out what is needed. I assume that I will also need drivers for each of the devices (Minolta and Canon). Can you please tell me if this is possible and how to do it? I would really appreciate it.

    Steph.

    The following items are required: LabVIEW Full, all the device drivers (including DAQmx and the Vision drivers), and the Vision Acquisition Software package.

    The Minolta lux meter has a PC connector and some communication software (which I have not inspected) which, I suppose, would allow you to take readings.  There is a note about power draw when the USB cable is connected; I did not check whether there is a power connector on the lux meter (since you will need to leave it plugged in for a month; the same concern applies to the camera).  As for the camera, I don't know whether it can be controlled from a PC; my experience has been with (640 x 480 pixel) video cameras, which can readily be controlled by LabVIEW.

    As for simultaneous readings, that is something LabVIEW does quite well.  I advise you to consult a professional if this is anything other than a school project, and maybe even hire someone for a few weeks of work.

    Bob Schor

  • Protocol Bluetooth Stack in LabVIEW

    Hello

    What I'm trying to do is use LabVIEW on a PC with an attached Bluetooth dongle to talk to a proprietary Bluetooth device designed to connect with Bluetooth phones. I need the program to send some commands to the proprietary device, disconnect, then connect and talk to another device. The proprietary device I want to connect to is essentially just a server waiting for a cell phone to communicate with it. Once connected, data is exchanged between the phone and the device, and once authenticated, the unit completes its function.

    I've tested my Bluetooth dongle using the Simple Bluetooth Client.vi and the Simple Bluetooth Server.vi running on two different PCs. They were able to connect and transfer data properly.

    My question arises, however, because LabVIEW requires you to specify a channel to connect via Bluetooth. The proprietary device, being essentially a dumb device, wants to follow the typical Bluetooth stack protocol and start frequency hopping once paired. It will not be able to dedicate itself to a single channel and frequency with the host computer.

    So what I want to know is how LabVIEW handles the protocol. Does the channel you specify in the Bluetooth VIs actually affect the protocol, or is that just how LabVIEW communicates with the Bluetooth dongle? I just need to know whether LabVIEW already supports the full Bluetooth protocol, or whether there is a way around the dedicated channel. I want the computer to appear and behave like a mobile device from the device's point of view.

    I have read through several of the NI tutorials, examples, and discussion forums, but I couldn't find answers to my question. I could be missing something simple here, but I want to be sure it is actually possible before I start to develop the code.

    What I use:

    LabVIEW Full 2012

    IOGear Bluetooth 2.1 USB Micro adapter

    I must admit that I'm not the most technically savvy person when it comes to Bluetooth stacks and protocols, so please excuse any mistakes in the technical information I may have given. I do, however, have work colleagues who are very experienced in Bluetooth (but not in LabVIEW), so don't worry about the level of technical language you use.

    Thank you for your consideration of my problem. Do not hesitate to ask for further details (within reasonable proprietary limits) or clarification.

    Kind regards

    Bronson

    bronsonmock,

    The LabVIEW Bluetooth VIs and functions use RFCOMM, which is a connection protocol exposed through the Winsock interface. RFCOMM is a simple protocol that emulates serial communication. The RFCOMM interface defines Bluetooth clients and servers.

    Creating Bluetooth client and server applications in LabVIEW is similar to creating client and server applications for TCP communication. A Bluetooth server uses the Service Discovery Protocol (SDP) to broadcast the availability of the services the server contains and listens for incoming connections. A client creates an outgoing RFCOMM connection to a server. Once the client and server connect to each other, they exchange data until the client or the server terminates the connection or the connection is lost.

    The LabVIEW Bluetooth functions are really just a wrapper around the Windows Bluetooth library functions. You can take a look and see how these protocols are handled at a lower level for a better understanding. I've also included the document from which I pulled the first two pieces of information.

    Using LabVIEW with wireless devices:

    http://zone.NI.com/reference/en-XX/help/371361J-01/lvconcepts/using_lv_with_wireless/
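As the answer above notes, the Bluetooth client/server pattern mirrors TCP. A minimal TCP sketch (Python, loopback only; real RFCOMM sockets would use AF_BLUETOOTH/BTPROTO_RFCOMM where the OS supports them) shows the same listen/connect/exchange/close flow:

```python
import socket
import threading

# TCP stand-in for the RFCOMM pattern described above:
# the server listens, the client connects, both exchange data, then close.

def serve_once(srv):
    conn, _ = srv.accept()            # wait for one incoming connection
    with conn:
        data = conn.recv(1024)        # read the client's command
        conn.sendall(b"ACK:" + data)  # reply, as the device would

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))            # ephemeral port (analogous to a channel)
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=serve_once, args=(srv,))
t.start()

with socket.create_connection(("127.0.0.1", port)) as cli:
    cli.sendall(b"STATUS?")
    reply = cli.recv(1024)
t.join()
srv.close()
print(reply)  # b'ACK:STATUS?'
```

The LabVIEW Bluetooth VIs follow the same shape, with the RFCOMM channel number playing the role of the port.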

  • Missing TestStand VIs with a new installation of TestStand/LabVIEW

    All,

    It is perhaps a silly question, but I installed LabVIEW and TestStand fresh on a new machine. When I run LabVIEW, all my TestStand VIs (LabVIEW VIs referencing TestStand) appear in block diagrams as question marks... and of course so does the palette menu entry. Does anyone remember how to fix this? I have done this installation before and never had this happen.  Thoughts?

    Hi SimpleJack,

    What was the order of your installation: LabVIEW first or TestStand first? If you installed TestStand first, then the picture you have shown is a possibility.

    Kind regards

    Perry S.

  • Processing XML data in LV

    Hi, BRAND SPANKING NEW GUY here, with what is probably an entirely newb question:

    Setup: LabVIEW and TestStand 2014

    In LabVIEW, I have a VI that extracts data from an XML file, processes it, and spits out another XML file.

    The result is:

    Name, Raw Score, Grade

    In the test, I want to run the VI, take the output file, and report:

    Name, Grade, Pass/Fail based on the raw score (< 60 = Fail)

    All I have been able to do successfully is create a report with all the elements of the output XML (1 item per line).

    Thus:

    1. Is there a way to create a report with more than 1 item per report line?

    [I have:

    Name: John

    Score: 100

    Grade: A

    Name: Fido

    Score: 50

    Grade: F

    ...

    wanted:

    John, A, Pass

    Fido, F, Fail

    ...]

    2. How do I read & USE the XML data to produce an item that was not in the XML file?

    Let's say you have LabVIEW code that creates 3 arrays for name, raw score, and grade by doing the parsing you already seem to be able to do.

    In TestStand, you use a For loop over each element of the 3 arrays.  Then you have a None adapter step that is a numeric limit test.  The data source for this step is the raw score, with a lower limit of 60.  In the post-expression, you can add the line Step.Name = Name[i] + ", " + Grade[i].  (i is the index of your loop.)

    I don't think I understand what you are asking in your second question.

    Pulido Technologies LLC
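The per-entry Pass/Fail formatting can also be prototyped outside TestStand. A sketch in Python (the XML tag names below are hypothetical; the real intermediate file from the VI may use a different layout):

```python
import xml.etree.ElementTree as ET

# Hypothetical layout for the intermediate XML file.
XML = """<results>
  <entry><name>John</name><score>100</score><grade>A</grade></entry>
  <entry><name>Fido</name><score>50</score><grade>F</grade></entry>
</results>"""

def report_lines(xml_text, passing=60):
    """One 'Name, Grade, Pass/Fail' line per entry, graded on the raw score."""
    root = ET.fromstring(xml_text)
    lines = []
    for e in root.findall("entry"):
        verdict = "Pass" if float(e.findtext("score")) >= passing else "Fail"
        lines.append(f"{e.findtext('name')}, {e.findtext('grade')}, {verdict}")
    return lines

print("\n".join(report_lines(XML)))
# John, A, Pass
# Fido, F, Fail
```

This is the same logic as the TestStand For loop above: one derived Pass/Fail item per record, with several fields joined onto one report line.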

  • NI-MAX for VISA Runtime 4.2

    Hello

    I posted this on the Instrument Control board, but maybe that's not the right place, since I didn't get any answers. Here's my question:

    I have been searching for more than an hour and have not found anything. Is there a runtime version of NI-MAX? I have deployed LabVIEW executables on many other PCs, and I am encountering a problem on one of them where the VISA Find Resource VI does not return the exact list of COM ports, even though the drop-down list of the VISA Configure Serial Port VI does. For example, the list box might show COM1, COM4, and COM10, but VISA Find Resource shows ASRL1::INSTR, ASRL4::INSTR, and ASRL12::INSTR (it should be ASRL10::INSTR). So I need a program such as NI-MAX to change the alias on the PC running the VISA 4.2 runtime engine so that it matches what the VISA Configure Serial Port VI shows, which, incidentally, matches what is shown in Device Manager.

    The PC in question does NOT have the LabVIEW development environment installed, so it has no NI-MAX. Is there a stand-alone NI-MAX exe I can install on other PCs without the LV IDE? Would manually changing visaconf.ini do the trick? If so, should I remove all aliases, uninstall all virtual (VCP) COM ports in Device Manager, restart the PC (or not), and start over, reconnecting the devices?

    Thank you for any help or advice.

    Ed

    James,

    Thank you for your response. Yes, I use the Application Builder, and no, I had chosen not to install NI-MAX, but I see it in my project file under "Additional Installers". I will look into the runtime installer from the NI website.

    After I posted this question, I experimented some more. I manually edited visaconf.ini so that the ASRL numbers agreed with the Windows Device Manager port assignments and, lo and behold, it worked! If this happens on the client's PC, installing MAX would probably be the best solution, however.

    Now I'm wondering whether the VISA Find Resource VI just reads this file? I guess it does.

    Thanks again,

    Ed
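For reference, the manual visaconf.ini fix described above could be scripted. This Python sketch uses a made-up INI layout (the real visaconf.ini format varies by VISA version, so treat the section and key names as placeholders, not the actual file schema):

```python
import configparser

# Illustrative only: the section/key names below are invented for the sketch;
# inspect your actual visaconf.ini before automating anything against it.
SAMPLE = """[Aliases]
COM10 = ASRL12::INSTR
COM4 = ASRL4::INSTR
"""

def fix_aliases(ini_text):
    """Rewrite each ASRLn resource so n matches the COM port number."""
    cfg = configparser.ConfigParser()
    cfg.optionxform = str                     # keep key case as-is
    cfg.read_string(ini_text)
    for com in list(cfg["Aliases"]):
        n = com.upper().removeprefix("COM")   # "COM10" -> "10"
        cfg["Aliases"][com] = f"ASRL{n}::INSTR"
    return dict(cfg["Aliases"])

print(fix_aliases(SAMPLE))
# {'COM10': 'ASRL10::INSTR', 'COM4': 'ASRL4::INSTR'}
```

This mirrors the manual fix: the mismatched ASRL12 entry for COM10 becomes ASRL10::INSTR, matching Device Manager.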

  • Threads

    Hi all

    I have a few questions about LabVIEW and Windows threads. We use the producer/consumer model for acquiring data.

    The producer loop acquires the data and pushes it into the queue.

    The producer loop is quite complex; it contains NI analog output and another two IO cards programmed through DLLs.

    I would like to optimize the speed of the producer loop; to do that, I proposed putting the producer loop in another thread.

    Now I'm wondering whether this is possible, and how, and whether I can be sure that the producer loop runs in a different Windows thread with real-time priority?

    I'm familiar with VI Server etc., but I have not found a way to be sure that it runs in a separate Windows thread. Is this possible at all?

    Michael.

    mishklyar wrote:
    [...] Windows thread with real-time priority? [...]

    Michael,

    That is a contradiction in terms. Windows and real time exclude each other.

    That being said, you can try to improve the performance of certain tasks/threads to achieve performance comparable to real time.

    First of all, you should not run any other software in the background. Disable non-critical services such as the screen saver, hard disk indexing, firewalls, and virus scanning tools.

    Second step: raise the priority of LabVIEW.exe to real-time using the Task Manager. You must do this because all threads created by LV have at most the priority of LabVIEW.exe as far as the operating system is concerned. Please note that from this moment on, a deadlock in LV will most likely hang the whole system.

    Third step: create different priorities for your LV "tasks". The easiest way is by using timed loops.

    Please note that this approach is strongly discouraged and can lead to an unstable system. If you need real-time behavior, it is recommended to use dedicated real-time platforms such as LV RT or FPGA.

    hope this helps,

    Norbert
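For comparison, the producer/consumer pattern under discussion looks like this in a conventional threading API (Python sketch; the queue decouples acquisition speed from processing speed, which is usually the first optimization to try before touching thread priorities):

```python
import queue
import threading

# Minimal producer/consumer skeleton: the producer only acquires and
# enqueues; all slow work lives in the consumer thread.
q = queue.Queue(maxsize=100)
results = []

def producer(n_samples):
    for i in range(n_samples):
        q.put(i)              # stand-in for one acquired sample
    q.put(None)               # sentinel: acquisition finished

def consumer():
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * 2)   # stand-in for slow processing

t_prod = threading.Thread(target=producer, args=(5,))
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(results)  # [0, 2, 4, 6, 8]
```

In LabVIEW the queue functions play the same role; keeping the producer loop lean matters more than which OS thread it lands on.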

  • How can I programmatically change the voltage range settings in a DAQ Assistant

    Hello

    First post here.

    I need to be able to change the voltage range properties of a DAQmx DAQ Assistant based on user input.  My hardware, an SCXI-1102C module, cannot change this property on a running task, so I would like to either set the analog input voltage range before the DAQ Assistant starts, or pause the DAQ Assistant immediately after it starts, set the values, and then resume.

    I don't know how to change the task ahead of time because the DAQ Assistant creates the task when it runs, and there is no task before that.

    In the attached photo, I have a conditional section configured to run only if the loop iteration is 0.  I take the task from the DAQ Assistant, send it to the Stop Task VI, set the property, and then restart the task with the Start Task VI. I can watch it run with lightweight debugging on, and everything seems to work properly, but on the second (and every subsequent) iteration of the loop, I read AI.Max and it seems that the DAQ Assistant has reset it to 5 V.  Can you see what's going wrong here?

    BTW, this is a continuous acquisition, and the code doesn't produce error messages when executing.

    I found a similar question someone posted here in 2006, but his question was specifically about a LabVIEW API (VB, I think) and not a real G solution.

    Attached are the actual VI in question and a PNG of the block diagram.

    Thank you!

    Ruby K

    First of all, if you want to start getting beyond the basics with DAQ hardware, you have to stop using the DAQ Assistant and do it with the lower-level DAQmx VIs.  There are hundreds of examples in the Example Finder.  You can even right-click on the DAQ Assistant and select Open Front Panel.  This will create a subVI you can open to see what is happening behind the scenes.  Do it.  I think you will find that the DAQ task is recreated on each iteration (although I'm not 100 percent sure how the parameters are established or maintained in each section of this subVI).

    The second problem is that you have a bit of a race condition on iteration 0.  Those two DAQ property nodes are running at the same time.  Thus, when you read AI.Max, this can happen before or after AI.Max is set in your case structure.

    Thirdly, make sure you wire your error wires through.

  • Wrap a VI with indicators into a stand-alone block

    I am a rookie in LabVIEW, so this may be a common rookie problem.  However, I have asked this question of more experienced LabVIEW users and come away without a solution.  This post is a little long and actually contains 2 small questions; apologies.  I hope it's really a rookie matter and not a limitation of the language.

    When I started working with LabVIEW, I imagined being able to create virtual instruments for my test equipment to automate my experiments.  I started with a BK1687B power supply with a USB interface but no manufacturer-supplied .vi.  The manufacturer provided a sufficient description of the commands on the PC's virtual serial port, and I was soon able to write a working .vi that put a photo of the front of the power supply on the LabVIEW front panel, overlaid with digital displays to mirror the readings of the real supply.  I created a while loop around the code to have it regularly poll the real instrument and update the LabVIEW virtual instrument to read the same as the real power supply.

    I had already created controls to change the voltage & current settings in a separate while loop outside the virtual supply's while loop (modeling the interface for driving it from another program).  Then I ran into my first problem: how to communicate the control settings from one while loop (the rest of my modeling program) into the while loop of the operating virtual power supply?  I managed (successfully) to make it work with local variables, but when I went to create a subVI, LabVIEW wanted to pull all the power supply's indicators into the subVI. What a mess!

    So instead I rewrote it with global variables for exchanging the control signals, and that worked and reduced the complexity, but using these globals is far from my vision of how I want this to work.  I want well-defined, easily identified I/O in my block diagram from a single block that represents the virtual BK1687B power supply (see attached figure).  I don't want the communication done with a bunch of invisible global variables; I want the signals grouped between my virtual PSU and the part of the control program that will use the supply. I want a wire for this function (I could make do with separate wires or a cluster).  The problem is that I want those signals to pass in and out of the main program loop and the power supply loop via internal cluster wires updated on each iteration of each loop.

    Question 1: How do you pass variables between two while loops in both directions, with the updates occurring at each iteration?

    Question 2: How do you wrap the complete power supply .vi, with all its indicators, controls, and decorative graphics, as one block (its own thread) that can be loaded and connected in another program?

    In general, I would like to be able to separate the operation of the virtual power supply instrument from the rest of the control program.  I want the supply to have its own front panel representation, including all its indicators, controls, and decorative graphics, pulled into the program when I load an icon representing the power supply into the block diagram.  I can't figure out how to achieve this complete encapsulation of the virtual supply.  Encapsulation is also partially blocked by not being able to wire the communication input and output signals into the virtual instrument.


  • Latch-until-released behavior

    The latch-until-released behavior of switches seems a little strange to me, and I was wondering if anyone could shed some light on a question I have about it.

    As a quick experiment, I created a very simple VI containing only a while loop with an event structure, some latch-until-released Boolean controls, and a numeric indicator.

    What I expected was that the display would show '1' and then '2' as one of these Boolean controls was pressed and then released. What I found was that it displays '2' after pressing the button, but stays that way after releasing it.

    Using the Event Inspector window, I verified that the event did fire both when I pressed the button and when I released it, but in both cases it seems to read the button as 'true'. I find that really odd; shouldn't the button read true in one case and false in the other?

    To get the expected behavior, I had to do something very ugly:

    Since the button seems to be in the same state in the two executions of the case, there is no way to distinguish between the press and release events except by keeping state external to the event structure. My question is basically: why did the LabVIEW developers choose to design it this way? What is the logic behind it?

    In your event structure, you have an OLD VALUE and a NEW VALUE provided to you.  You can do something meaningful with those if you need to further distinguish the transition.

    Consider simply using the SWITCH UNTIL RELEASED mechanical action,

    which generates a VALUE CHANGE event on the press and a VALUE CHANGE event on the release.  The NEW VALUE allows you to distinguish which is which.
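The old-value/new-value logic described above can be sketched as a tiny state handler (Python, illustrative only; in LabVIEW these values come from the event structure's OldVal/NewVal terminals):

```python
# Value-change events fire on both edges; the new value tells you which
# edge it was, so press and release can each advance the display.
def handle_value_change(old, new, counter):
    """Return the updated counter: +1 on a press edge, +1 on a release edge."""
    if new and not old:
        return counter + 1    # press edge (False -> True)
    if old and not new:
        return counter + 1    # release edge (True -> False)
    return counter            # no edge: leave the counter alone

c = 0
c = handle_value_change(False, True, c)   # press:   display shows 1
c = handle_value_change(True, False, c)   # release: display shows 2
print(c)  # 2
```

This is exactly the '1' then '2' behavior the original poster expected, without keeping any state outside the event handler beyond the counter itself.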

  • 3D plot displays incorrectly

    I opened the example VI '3D Surface Plot.vi', but the 3D graphical indicator on the front panel did not display correctly. What is the problem? Is this a LabVIEW bug, or is it related to my hardware?  And how can I solve this problem?

    Hello book,

    It would be extremely helpful if you could provide additional information about the LabVIEW version and OS you are using, and what hardware you noticed might actually be the problem.  Have you tested this on other computers that do not show the problem?

    Right now, all we have to go on is that you use a LabVIEW version that includes the 3D surface plot.

    That being said, it sounds like the issue mentioned here:

    LabVIEW 2011 and 2011 SP1 known issues: the 3D picture controls may not display correctly

    http://www.NI.com/white-paper/13164/en/#238713_by_Category

    It seems that there is a problem with some integrated graphics cards, and a known solution is to update the card's drivers.  Additional information is provided in the link above.

    Best regards
