Enqueue a TestStand cluster to LabVIEW, return a cluster by notification

Hi all

I am getting very, very frustrated that I can't find a way to send a TestStand cluster that mirrors a LabVIEW cluster to an asynchronous VI using a queue.

This works if you wire the TestStand cluster to the VI when there is an appropriate connector pane to link to. However, my VI has no connector pane; it expects a cluster in a queue, reads it from the PropertyObject with 'GetValVariant', does some magic, and is supposed to return another cluster to TestStand using a notification.

I can read individual elements of the queued cluster, but I cannot turn it into a LabVIEW cluster with, say, the good old 'Variant To Data', which indeed works great inside LabVIEW.

The same thing happens when I try to return a TestStand cluster using 'SetValVariant', even though both the TestStand cluster and the LabVIEW cluster have the same format and structure.

The TestStand 'Wait' notification step is linked to the right cluster property, but LabVIEW fails at 'SetValVariant' because the data is not of the right type, even though I have flattened the variant data.

I wonder whether I am doing something wrong, or whether TestStand and LabVIEW are simply not designed to work together seamlessly.

Here's the example I'm putting in place, but reading the cluster fails, while the 'False' alternative works (123.00).

Setting the notifier cluster fails too, and so does setting its elements individually, because LabVIEW does not know what 'Compliance' and 'Comments' are in the 'True' alternative, although I tried every path in the hierarchy tree.

As you might have figured out, I really, really need to get this working, and I can't understand where the fault lies (other than between the chair and the keyboard), because the principle works in LabVIEW.

David Koch

Okay guys, I am giving up all hope I had of getting this to work.

This is a WIP TestStand container / LabVIEW cluster relay engine.

TestStand 2013 -> TCS_main_2.0.4.seq (runs TCS_teststand_2.0.0.vi)
LabVIEW 2013 -> TCS_teststand_2.0.0.vi (runs TCS_instruments_2.0.0.vi)
LabVIEW 2013 -> TCS_instruments_2.0.0.vi (instrument interface)

The main idea was to catch queued TestStand containers, convert them to a variant, analyze their structure and match them with their LabVIEW cluster counterparts.

The engine is first initialized to 'learn' the LabVIEW data structures, storing the data type and the names of the clusters' various elements, as well as the cluster hierarchy.

A target data type format and an associated action are also stored for each identified cluster/container.

Then the engine waits for incoming TestStand containers on the queue, converts them to a variant, analyzes their structure and *TRIES* to match it against the stored, previously analyzed LabVIEW counterparts.

This is what I call the "footprints".

The three main problems were... are:

1 - different naming conventions (TCS_string_teststand_name_convert_2.0.0.vi)

2 - different data type formats (TCS_variant_tree_convert_2.0.0.vi)

3 - different data structures (TCS_propertyobject_tree_parse_2.0.0.vi)

I think I was very close to resolving these issues, but I'm short on time. As a contractor, I've spent more than a month of work on this alone, which is too much given my deadline.

I'm quite sad to leave this unfinished; it would have been great to get the relay engine up and running, having just to place a TestStand container in a queue to get something done automatically at the other end, and simply waiting for a notification sent back for analysis.

What's left to do is the following, as an exercise
left for the reader:

A - "Explode" the LabVIEW data structures to match TestStand's more verbose yet simpler data format.

This means, for example, that a LabVIEW waveform has to be transformed into a cluster of 3 elements. You can read the following document, but it is not very clear on how LabVIEW data type formats are converted to their corresponding TestStand counterparts:

http://zone.NI.com/reference/en-XX/help/370052N-01/tsref/infotopics/labview_data_types/

Beware, this involves a lot of trial and error: create a large LabVIEW cluster with all supported data types, import it into TestStand to create a custom data type, and then compare each converted data type.

B - Point A will improve the matching between TestStand containers and LabVIEW clusters.

With more closely mirrored data structures, it will indeed help 'TCS_fingerprint_search_2.0.0.vi' to compare the structure of the TestStand container with the corresponding LabVIEW cluster structure, the element names still following different conventions.

C - The data format conversion needs to be done; placeholders are ready to be filled.

There is still a lot of work to do in 'TCS_variant_tree_convert_2.0.0.vi'.

Remember that the 'FileGlobals.tcscluster' data type mirrors (TestStand-wise) the 'TCS_instruments_2.0.0.ctl' type definition.

This means that a string does not have to be transformed into a waveform; the data formats should already be very close to each other. That's why there is no need to focus too much on these "exotic" data format conversions.

Thanks again to all the people who have invested some time
in trying to help me solve this issue.

Thanks to nathand for his non-recursive cluster monitor.

David Koch

Tags: NI Software

Similar Questions

  • How to send a variable from TestStand to LabVIEW?

    Hello

    32-bit TestStand 2014

    32-bit LabVIEW 2015

    I'm not sure whether this is a LabVIEW or a TestStand question.

    I created a simple LabVIEW VI that has a pass/fail indicator.

    I created a simple TestStand sequence with a pass/fail test that uses the LabVIEW VI as its code module.

    I have a LabVIEW GUI that runs the TestStand sequence and indicates whether the whole sequence passed or failed as a popup message.

    I have disabled TestStand reporting.

    In the LabVIEW GUI, how can I show an indicator of whether my test passed or failed after the sequence step is over?

    If this cannot be done during the execution of the sequence, then how can I display an indicator of whether my test passed or failed after the entire sequence ends?

    For example:

    Say I run two digital tests in my sequence: a 24 VDC test and a 3.3 VDC test. After running my sequence, I only get a prompt at the end saying whether the two tests passed or failed. However, if a test fails, I am not able to tell which one. I want to be able to tell which test failed with an indicator in my LabVIEW GUI, not a TestStand-generated report.

    I'm not sure whether it's something I need to set up in TestStand with local variables or somewhere in LabVIEW. I don't want a whole test report; I just want the user to see an indicator that a particular functional test passed during the TestStand sequence.

    Any help would be greatly appreciated.

    Thank you


  • Convert a LabVIEW cluster name to a TestStand container name

    Hi all

    I realized today that there is no way to convert a LabVIEW cluster into a TestStand container and vice versa. So I'm currently working on a recursive cluster/container parser that would rebuild the mirrored structure in a variant/PropertyObject tree. For this I use the following table:

    http://zone.NI.com/reference/en-XX/help/370052N-01/tsref/infotopics/labview_data_types/

    The problem I am now facing is converting the name of each element so that it stays compatible with the mirrored structure: no spaces or special characters are allowed by the TestStand naming convention. You can see what happens when you create a TestStand custom data type from a LabVIEW cluster.

    So, is there a VI that does the trick, or am I still supposed to do everything by hand? Thanks for your comments, even the ones that will not make me happy.

    David Koch

    Abandoned project, topic closed:

    http://forums.NI.com/T5/NI-TestStand/enqueue-TestStand-cluster-to-LabVIEW-return-a-cluster-by/m-p/30...

    David Koch

  • Input value changes when passed from TestStand to LabVIEW

    Hi all

    I am trying to pass a hexadecimal value from TestStand to my LabVIEW VI. When I run the VI on its own, giving the value as 00000010, I get the expected results from the VI. But when the value is passed through a TestStand local variable, by the time it reaches the VI's input it has been converted to 0000000A, and so I'm getting the wrong output.

    I am not able to understand how the value changes on its way to the input parameter when passed from TestStand to LabVIEW.

    Any ideas?

    I found the error. I was entering the value in the wrong format in TestStand... instead of 0x10 I entered only 010, which was therefore converted to A.

  • Implement std::vector&lt;std::vector&lt;Point2i&gt;&gt; in a DLL wrapper for LabVIEW

    Hi, I'm writing a wrapper DLL that uses OpenCV functions. I had already successfully implemented std::vector&lt;Point2i&gt; by referring to "Sending an array of clusters to a C DLL".

    And now, I want to implement std::vector&lt;std::vector&lt;Point2i&gt;&gt;, which is a lot like a 2D array except that each row may have a different number of items.

    In LabVIEW, I build an array of clusters, each containing an array of clusters of 2 I32 elements; the structure is shown below:

    I think it has the same functionality as std::vector&lt;std::vector&lt;Point2i&gt;&gt; in C++.

    So I wire this structure to the "Call Library Function Node" and generate the C code shown below:

    /* Call Library source file */
    
    #include "extcode.h"
    
    /* lv_prolog.h and lv_epilog.h set up the correct alignment for LabVIEW data. */
    #include "lv_prolog.h"
    
    /* Typedefs */
    
    typedef struct {
        int32_t elt1;
        int32_t elt2;
        } TD4;
    
    typedef struct {
        int32_t dimSize;
        TD4 elt[1];
        } TD3;
    typedef TD3 **TD3Hdl;
    
    typedef struct {
        TD3Hdl elt1;
        } TD2;
    
    typedef struct {
        int32_t dimSize;
        TD2 elt[1];
        } TD1;
    typedef TD1 **TD1Hdl;
    
    #include "lv_epilog.h"
    
    void funcName(TD1Hdl arg1);
    
    void funcName(TD1Hdl arg1)
    {
    
        /* Insert code here */
    
    }
    

    Then, I write the code shown below in the DLL wrapper:

    void funcName(TD1Hdl Blobs)
    {
        std::vector<std::vector<cv::Point2i>> blobs;
    
        // Distribute contents of blobs to Blobs from LabVIEW
        MgErr err = mgNoErr;
        size_t arraySizeInBytes = Offset(TD1, elt) + sizeof(TD2)*blobs.size();     // Determine row size
        err = DSSetHSzClr(Blobs, arraySizeInBytes);
    
        if (err != mgNoErr)
            return;
    
        (*Blobs)->dimSize = blobs.size();
    
        for (size_t i = 0; i < blobs.size(); i++) {
            arraySizeInBytes = Offset(TD3, elt) + sizeof(TD4)*blobs[i].size();  // Determine col size of each row
            err = DSSetHSzClr((*Blobs)->elt[i].elt1, arraySizeInBytes);
    
            if (err != mgNoErr)
                return;
    
            /*......................*/
        }
    }
    

    When I call the DLL from LabVIEW, the program gets interrupted (i.e. shuts down) on the line where I determine the size of each row.

    Could someone give me some suggestions on this subject, or suggest another way to implement this requirement?
    Thank you very much.

    MgErr funcName(TD1Hdl Blobs)
    {
        std::vector<std::vector<cv::Point2i>> blobs;
    
        Labeling(image_binary, blobs);         // the prototype of this function is: Labeling(Mat &binary, vector<vector<Point2i>> &blobs)
    
        // Distribute contents of blobs to Blobs from LabVIEW
        MgErr err = mgNoErr;
        size_t arraySizeInBytes = Offset(TD1, elt) + sizeof(TD2)*blobs.size();      // Determine row size
    
        if (Blobs == NULL) {
            Blobs = (TD1Hdl)DSNewHClr(arraySizeInBytes);
            if (Blobs == NULL)
                err = mFullErr;
        } else {
            err = DSSetHSzClr(Blobs, arraySizeInBytes);
        }
        if (err != mgNoErr)
            return err;
    
        size_t i;
        for (i = 0; err == mgNoErr && i < blobs.size(); i++) {
            arraySizeInBytes = Offset(TD3, elt) + sizeof(TD4)*blobs[i].size();  // Determine col size of each row
    
            if ((*Blobs)->elt[i].elt1 == NULL) {
                (*Blobs)->elt[i].elt1 = (TD3Hdl)DSNewHClr(arraySizeInBytes);
                if ((*Blobs)->elt[i].elt1 == NULL)
                    err = mFullErr;
            } else {
                err = DSSetHSzClr((*Blobs)->elt[i].elt1, arraySizeInBytes);
            }
            if (err == mgNoErr) {
                (*((*Blobs)->elt[i].elt1))->dimSize = blobs[i].size();
    
                /*......................*/
            }
        }
        (*Blobs)->dimSize = i;
        return err;
    }
    

    Personally, this is how I've usually done it. Also, the return value of DSSetHSzClr() indicates if there was something wrong, and the handle cannot really be NULL when LabVIEW calls this function.

    To be entirely correct and safe, you must do more than that. But as long as you assume that the incoming array is always smaller than the outgoing array will be (usually it has 0 items when you enter this function, but if you reuse the same array in the diagram, by storing it in a shift register for example, this may no longer be true), this will be enough.

  • What is the best way to open, close, and pass instrument handles with the LabVIEW TestStand parallel model?

    I have a number of test systems that use a parallel model with LabVIEW. We have a number of (PXI) instruments.

    What is the preferred method to open, close, and pass instrument handles between LabVIEW and TestStand?

    Hello

    A few ways:

    1. Pass the session back as a U32 handle; there are a few TestStand VIs in the TestStand palette that you can use for the conversion.

    2. Through a LabVIEWIOControl. TestStand handles these.

    3. Do something fancy such as using LVGOOP and keeping the handle as an object property left in memory, i.e. do not pass it back at all.

    One thing: you'll have to watch for multiple executions trying to talk to the same instrument; use some lock/unlock or synchronization to avoid this.

    Regards,

    Ray Farmer

  • 2651A TSP script conversion for LabVIEW

    Hello

    I have a problem converting TSP scripts that contain functions and for/end loops. I'm new to TSP and trigger-model scripts. I used the Test Script Builder (TSB) tool and am able to run any TSP script and generate raw data, but I don't seem able to convert most of the code into LabVIEW VISA commands or the LV TSP loader and run it to generate data... I can't find any tutorial or examples of how to do it.

    Let's say we use the KE2651A_Fast_ADC_Usage.tsp (pulse) example and I'll just focus on the CapturePulseV(pulseLevel, pulseWidth, pulseLimit, numPulses) function portion. I have seen a few LV examples that open with loadscript myscript and close with endscript. I have tried a lot of different approaches, and I kept getting errors, in particular with the print function: I am not able to get data back into LV by reading the data buffer inside the instrument. With some approaches I got no errors but no data... with some approaches, I got error -285.

    The part of the TSP pulse code that works in TSB is here (I'm not including loadscript and endscript). What is the RIGHT way to modify the code for LabVIEW, run it, and obtain data? Thank you:

    function CapturePulseV(pulseLevel, pulseWidth, pulseLimit, numPulses)
        if (numPulses == nil) then numPulses = 1 end
    
        -- Configure the SMU
        reset()
        smua.reset()
        smua.source.func            = smua.OUTPUT_DCVOLTS
        smua.sense                  = smua.SENSE_REMOTE
        smua.source.rangev          = pulseLevel
        smua.source.levelv          = 0     -- The bias level
        smua.source.limiti          = 5     -- The DC Limit
        smua.measure.autozero       = smua.AUTOZERO_ONCE
    
        -- Use a measure range that is as large as the biggest
        -- possible pulse
        smua.measure.rangei         = pulseLimit
        smua.measure.rangev         = pulseLevel
    
        -- Select the fast ADC for measurements
        smua.measure.adc            = smua.ADC_FAST
    
        -- Set the time between measurements.  1us is the smallest
        smua.measure.interval       = 1e-6
    
        -- Set the measure count to be 1.25 times the width of the pulse
        -- to ensure we capture the entire pulse plus falling edge.
        smua.measure.count          =
                        (pulseWidth / smua.measure.interval) * 1.25
    
        -- Prepare the reading buffers
        smua.nvbuffer1.clear()
        smua.nvbuffer1.collecttimestamps    = 1
        smua.nvbuffer1.collectsourcevalues  = 0
        smua.nvbuffer2.clear()
        smua.nvbuffer2.collecttimestamps    = 1
        smua.nvbuffer2.collectsourcevalues  = 0
        -- Can't use source values with async measurements
    
        -- Configure the Pulsed Sweep setup
        -----------------------------------
        -- Timer 1 controls the pulse period
        trigger.timer[1].count          = numPulses - 1
        -- -- 1% Duty Cycle
        trigger.timer[1].delay          = pulseWidth / 0.01
        trigger.timer[1].passthrough    = true
        trigger.timer[1].stimulus       = smua.trigger.ARMED_EVENT_ID
    
        -- Timer 2 controls the pulse width
        trigger.timer[2].count          = 1
        trigger.timer[2].delay          = pulseWidth - 3e-6
        trigger.timer[2].passthrough    = false
        trigger.timer[2].stimulus       =
                    smua.trigger.SOURCE_COMPLETE_EVENT_ID
    
        -- Configure SMU Trigger Model for Sweep/Pulse Output
        -----------------------------------------------------
        -- Pulses will all be the same level so set start and stop to
        -- the same value and the number of points in the sweep to 2
        smua.trigger.source.linearv(pulseLevel, pulseLevel, 2)
        smua.trigger.source.limiti      = pulseLimit
        smua.trigger.measure.action     = smua.ASYNC
        -- We want to start the measurements before the source action takes
        -- place so we must configure the ADC to operate asynchronously of
        -- the rest of the SMU trigger model actions
    
        -- Measure I and V during the pulse
        smua.trigger.measure.iv(smua.nvbuffer1, smua.nvbuffer2)
    
        -- Return the output to the bias level at the end of the pulse/sweep
        smua.trigger.endpulse.action    = smua.SOURCE_IDLE
        smua.trigger.endsweep.action    = smua.SOURCE_IDLE
        smua.trigger.count              = numPulses
        smua.trigger.arm.stimulus       = 0
        smua.trigger.source.stimulus    = trigger.timer[1].EVENT_ID
        smua.trigger.measure.stimulus   = trigger.timer[1].EVENT_ID
        smua.trigger.endpulse.stimulus  = trigger.timer[2].EVENT_ID
        smua.trigger.source.action      = smua.ENABLE
    
        smua.source.output              = 1
        smua.trigger.initiate()
        waitcomplete()
        smua.source.output              = 0
    
        PrintPulseData()
    end
    
    function PrintPulseData()
        print("Timestamp\tVoltage\tCurrent")
        for i=1, smua.nvbuffer1.n do
            print(string.format("%g\t%g\t%g",
                                smua.nvbuffer1.timestamps[i],
                                smua.nvbuffer2[i],
                                smua.nvbuffer1[i]))
        end
    end
    

    I finally solved it myself! I first created the supporting shell according to the documents, but the problem was with the script functions; I solved it by using separate VISA writes for the function definition and THEN retrieving the data from the instrument directly with VISA Read. I made a TSP_Function Script Loader that lets you simply copy/paste TSP code (any *.tsp) from the TSB program into this function-and-loader VI, name the function (parameters) defined by the pasted script, and it will deliver the RAW data directly into an array of strings that can be split up or restructured into whatever you want, for graphs, etc.

    That's all I really needed to do; I can run TSP code from LV and get the data out of it easily via the defined function. Now this Loader.vi behaves the same way as the Keithley-made TSB program I use.

    Here I attach the TSP_Function Script Loader.vi (in LV 2011+).

  • Sharing a telnet session between LabVIEW calls from TestStand

    From TestStand, I call VIs from telnet.llb.

    In one TestStand step I open a telnet session to an IP address and get the telnet connection ID (U32). In the next TestStand step, I pass the telnet connection to a telnet write VI, but this error occurs:

    "Dequeue item to acquire Semaphore.vi:1 -> Telnet Write.vi:1 -> Telnet Write.vi.ProxyCaller"

    The telnet open, write, read, and close VIs work great if I keep the telnet session number within the same VI. But I need to keep the session open between TestStand calls, because one huge VI is not feasible.

    Thank you

    Josh

    Verify that your LabVIEW adapter has the VI reserved for execution. If it already does, I don't know; you may need to create a parallel thread that keeps the session active.

    CC

  • Data Dashboard for LabVIEW - Motorola RAZR

    When I view Data Dashboard for LabVIEW, the Android Market tells me, "your device is not compatible with this item."

    I have a Motorola RAZR. Any suggestions?

    Hello Duane,

    At this time, Data Dashboard for LabVIEW is only available for tablets. In addition to the Apple iPad and iPad 2, only Android tablets running 2.3 or later are supported. To support 7'' Android tablets, we had to make some design changes. Even more changes would be needed due to the small size of a phone. We might consider a version for phones based on the popularity of the application on each platform and the feedback received. For more information, see the product page or the LabVIEW Web Interface Builder and Data Dashboard discussion forums.

    Grant M.
    Senior Software Engineer | LabVIEW tablets | National Instruments

  • How to prepare a mid-2007 iMac, 4 GB RAM, running El Capitan for resale or return to factory settings...?

    Hello

    How do I prepare a mid-2007 20" iMac, 4 GB RAM, running El Capitan for resale or return it to factory settings...? (If possible)

    I now have a new 21.5" iMac and also want to sell the old one, but don't know how to do this.

    Thank you. I hope a simple question gets an equally simple answer!

    Angus.

    Angus,

    Read Niel's post in: iMac for gift preparation

  • VirtualBench support for LabVIEW 2016

    Can someone tell me what VirtualBench drivers will be available for LabVIEW 2016?

    The release date is... right now! NI-VirtualBench 16.0, with support for LabVIEW 2016, is available here:

    (My apologies; yesterday it did not work, but it takes a little while for download pages to go live.)

  • NI-DNET support for LabVIEW 2013

    We currently use NI-DNET 1.6.6 with LabVIEW 2011. I have now also installed LabVIEW 2013 on my computer and tried to synchronize all of the drivers with my LabVIEW 2011 installation.

    Well, it seems that NI-DNET does not support LabVIEW 2013, at least officially. The NI-DNET/LabVIEW version compatibility list indicates that NI-DNET 1.6.6 supports LabVIEW 2011 and NI-DNET 1.6.7 supports LabVIEW 2012.

    The NI system driver set of November 2013 contains the NI-DNET 1.6.7 driver. When I try to install it, there is no support for LabVIEW 2013.

    My question is whether there is a plan to include NI-DNET support for LabVIEW 2013 or later in the driver set.

    I copied the vi.lib\DeviceNet and vi.lib\nidnet directories from LabVIEW 2011 to 2013, and I can load my programs without any problems. I have not yet built an executable or run it on the test setup, but projects can be loaded in LabVIEW 2013 without any broken VIs. Should I expect any problems running LabVIEW 2013 with the NI-DNET 1.6.6 or 1.6.7 driver?

    Nick

    There should not be problems, but it is an old driver; it will not be updated for future versions of LabVIEW.

  • Is there an NXT Module for LabVIEW 2010?

    Is there an NXT Module for LabVIEW 2010? If not, are there plans to make one available, and if so, when?

    Hello

    I found a link to a download for the 2010 module here.

    I hope this helps.

  • How to convert a .lib file with VIs for LabVIEW to CVI?

    Hello

    I have the .lib file for my lock-in amplifier. The lib file is written for LabVIEW and already contains a visual interface (like ActiveX). The point is that I need this file for programming in CVI (8.5). How can I convert a LabVIEW .lib for CVI?

    Thank you.

    Denis.

    You said that you had a .lib file, but in fact you have an .llb file. There is a huge difference between an "i" (eye) and an "l" (ell).

    There is a LabVIEW Instrument Driver Export Wizard, but it's only for new project-style drivers and you need LabVIEW. There is no other way to convert a LabVIEW driver to a CVI driver. If the LabVIEW driver uses ActiveX methods and properties, you will need to write the same thing in CVI.

  • What USB protocols does LabVIEW support?

    Hello guys, I want to connect a device to my PC via a USB port, and I want to use LabVIEW to analyze the data.

    But first of all, I would like to know what USB protocols LabVIEW supports.

    USB RAW, USB CDC, USB TMC and others?

    I've read that LabVIEW recognizes a USB RAW device, but what is that? Are there VIs to read and send data, or do I have to write them myself? If these VIs exist, do they do the handshaking? Flow control?

    Thank you.

    Please read the nuggets and have a look at the USB specification (it is linked in the nuggets).

    TMC, CDC, and Mass Storage are all built on the basic USB protocols.

    LabVIEW has no built-in support for any of these device classes except Test and Measurement (TMC).

    Shane
