NI PCIe-1433 frame grabber in a project VI

I have a LabVIEW project where I can set up the camera that is connected to the NI PCIe-1433 card. But I have to close it and open NI MAX to view the video.

Is there an example of how to pull the video stream into a LabVIEW project? I know I've seen this done before with webcams and a cRIO.

Hi Jason,

It's good that you can see and acquire images from the camera in MAX. I advise starting from one of the shipping examples. You need to get the interface name from MAX, for example 'img0', and use it as the input to IMAQ Init.vi.

You can find the examples in LabVIEW by going to Help -> Find Examples -> Hardware Input and Output -> Vision Acquisition -> NI-IMAQ -> High Level -> HL Grab.vi
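If it helps, the same open / snap / close flow looks roughly like this in the NI-IMAQ C API (an illustrative sketch only; "img0" is whatever interface name MAX shows for your 1433, and the HL Grab example remains the better starting point inside LabVIEW itself):

```c
#include <stdio.h>
#include <niimaq.h>               /* NI-IMAQ C API header */

int main(void)
{
    INTERFACE_ID iid = 0;
    SESSION_ID   sid = 0;
    void        *buffer = NULL;   /* NULL asks imgSnap to allocate the frame buffer */

    /* "img0" is the interface name assigned in MAX -- yours may differ */
    if (imgInterfaceOpen("img0", &iid) == 0 &&
        imgSessionOpen(iid, &sid) == 0 &&
        imgSnap(sid, &buffer) == 0)
    {
        printf("Acquired one frame into buffer %p\n", buffer);
    }
    else
    {
        printf("Acquisition failed -- check the interface name in MAX\n");
    }

    /* freeResources = 1 releases the session/interface and the snap buffer */
    imgClose(sid, 1);
    imgClose(iid, 1);
    return 0;
}
```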

Tags: NI Software

Similar Questions

  • Basler spL2048-70km with PCIe-1433 card, low acquisition rate

    Dear LabVIEW community,

    I'm trying to run a Basler Sprint Mono spL2048-70km with an NI PCIe-1433 frame grabber. While the initial setup was plug-and-play, getting the camera to its specified maximum rate (300 kHz) is proving to be a challenge.

    Specifically, I use the camera with a trigger in burst mode (main trigger frequency 10 Hz + bursts of up to 300 kHz), but the maximum reproducible repetition rate within a burst is around 30 kHz. I have tried various options to increase the overall rate over the last few days but hit a wall.

    The system is a Win 7-based desktop:

    Win 7 Enterprise SP 1

    Intel Core i5-2500 @ 3.3 GHz

    8 GB RAM

    64-bit system

    I have attached the sample code I use below. It's the first time I have tried to set up high-rate frame acquisition, so I hope it's just a very obvious mistake that can be fixed quickly and elegantly.

    Thank you very much in advance

    Dear NIko,

    Thanks again for your involvement.

    I found yet another entry in the forum which helped me a lot.

    https://forums.NI.com/T5/machine-vision/acquisition-rate-is-half-of-trig-rate-why/TD-p/2029160/page/...

    Even though it concerns a different camera and originally addressed a different question, it solved my problem. It seems one must trigger the acquisition card in 'Action: trigger start of acquisition' mode and the camera itself in 'exposure control: triggered' mode in order to avoid long latencies and reach rates higher than 100 kHz.

    Brief overview:

    -Use the VI posted in the linked thread

    -Set the frame grabber to 'Action: trigger start of acquisition'

    -Set the camera to 'exposure control: triggered'

    -Use a 512-pixel AOI (via the Basler CCT+ tool)

    -In NI MAX, set the source of the camera's external control line (routing the trigger directly to the camera)

    --> Acquire in burst mode with intra-burst frequencies of up to 200 kHz (not sure whether going above 200 kHz creates problems)

    ---> be happy

    From the specification's point of view, the camera should still be capable of 300 kHz. Since my target frequency is 200 kHz, I consider this issue resolved.

    Thanks a lot for your help, and I hope this helps the rest of the community.

  • IMAQ Grab Acquire.vi error when using an external trigger on an NI PCIe-1433 card (sync problem?)

    Following my recent post on getting up and running with the NI PCIe-1433 Camera Link card, I have run into a bit of a snag.

    When using the internal trigger on the camera, everything works 100%. I can view all the data from the camera in MAX as well as in the LabVIEW project. However, as soon as I switch the camera to external trigger mode, things start to fall apart.

    What I have confirmed:

    -The camera is switching between internal and external triggering.

    -The NI PCIe-1433 Camera Link card is set up properly. While in external mode, I can trigger the camera using a function generator and verify the signal in MAX. Everything works fine.

    When the camera is in external mode, the IMAQ Grab Acquire.vi function in my VI returns an error. The error is:

    Code: -1074397150

    Possible reason(s): Timeout error.

    Now, I have it set up so that an error here will not stop the LabVIEW VI. Sometimes data of interest does make it through, however (roughly every 10 seconds or so). So what seems to be happening is that the external trigger signal is not lined up with the moment the grab is attempted. Is it possible to synchronize these? Can I somehow reference the external trigger signal in my LabVIEW project so that the grab is performed only when that trigger pulses?

    So I solved my problem. It was a timing issue. The external trigger I was using was simply too slow. I had initially chosen a 2 Hz trigger so I could watch the values changing on LabVIEW probes, but that did not push enough data over the camera cable to assemble a picture before the grab timed out. Moving up to 9 kHz solved the problem. No adjustment to the camera settings or LabVIEW code was necessary.

  • Exposure time with an externally triggered MC1362 and PCIe-1433

    Hello

    I'm capturing images with a Mikrotron EoSens CL MC1362 camera and an NI PCIe-1433 acquisition card. I have a question about exposure times - I don't know if it relates to the acquisition card, the camera, or the combination of the two, but I hope someone here can help.

    I'm running an external signal into the SMB connector on the acquisition card, and the camera is set to run in "Pulse Width mode", which allows the capture card to take control of the exposure.

    My question is:

    How can I determine the exposure time from the settings of the external signal? It is a square wave, and the duty cycle seems to control the exposure time.

    I guess the exposure time is E = D/f, where f is the frequency of the signal and D is the duty cycle? For example, for f = 1 kHz and D = 80%, the exposure would be E = 0.8 ms. Is this correct?
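    Written out (just restating the assumption above: the exposure equals the time the pulse stays high):

```latex
E = D \cdot \frac{1}{f} = \frac{D}{f},
\qquad \text{e.g. } f = 1\,\text{kHz},\ D = 0.8
\ \Rightarrow\ E = \frac{0.8}{1000\,\text{Hz}} = 0.8\,\text{ms}
```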

    The manual for the camera can be found here:

    http://G4.com.tw/Web/file/product/UserManual/995-EoSens%203CL-MC1361-manual.PDF

    I've included a screenshot of the relevant portions that seem to describe how pulse-width mode works. In particular, it says "exposure time is defined by the width of the external signal." Is that equivalent to my idea of E = D/f?

    Thanks in advance for any help.

    This looks good to me. The exposure remains on while the pulse is high. When you calculate E, you're just calculating the duration for which the pulse is on.

    Bruce

  • How do I get LV2009 to see my PCIe-1433?

    Hi all - I think this problem is just a matter of choosing the right download. I am modifying a LabVIEW application that was installed on a demo system provided by a vendor. Very kindly, they provided all their source code, but it is written in the 2009 version, and they did not include the development environment. I downloaded 2009 SP1 and installed it using my 2014 SP1 license. So far so good.

    When I open the VI, LabVIEW cannot find some of the IMAQ subVIs. Eventually, I realized that I needed to install the 2009 version of the Vision Acquisition software. Downloaded and installed, and that took care of most of the missing VIs. There are still some that require the Vision Development Module, but I do not have that license, so I removed those and tried to run the VI. I got errors saying it could not open the frame grabber. It turned out that I had not installed the driver for the PCIe-1433.

    So I downloaded the driver package and installed it, and the PCIe-1433 is still not recognized. Now I'm stuck. I know this system should work somehow, because the vendor's application worked until I started playing with it.

    Ideas, anyone? Should I install everything in a different order (the above is the actual order in which I installed things)?

    Thank you!

    Crazy

    It seems that the PCIe-1433 needs the NI-IMAQ 4.4 driver or later, which is included in Vision Acquisition Software 2010.03. You need this driver and 32-bit LabVIEW.

    Getting started with the NI PCIe-1433

    http://www.NI.com/PDF/manuals/374000a.PDF

    Which Versions of NI-IMAQ, NI-IMAQdx, and NI-IMAQ I/O Come with My Version of Vision Acquisition Software?

    http://digital.NI.com/public.nsf/allkb/6C42133468D66324862578BC00655CF8

    NI-IMAQ Compatibility with Different Versions of LabVIEW

    http://digital.NI.com/public.nsf/allkb/DB928F6D5E9D6B97862579A7006B2850

  • INSTALLATION OF PCIe-1433

    I am trying to install a PCIe-1433 and I am running into two problems: (1) I am not able to get an image, and (2) only a single camera (camera 0) shows as available.

    I installed NI Vision Acquisition Software from a CD dated September 2011 that includes NI-IMAQ 4.6.1, NI-IMAQdx 3.9.1, and NI-IMAQ I/O 2.5. I have two Imperx Bobcat 1610 cameras that I want to operate simultaneously. When I run the Bobcat Configurator, it only finds one camera, and it works as expected for that camera. To see if there is a problem with one of the cameras, I connected each one to the 1433, one at a time. The Bobcat Configurator found both cameras, but only when they were connected to the 1433 port farthest from the trigger input.

    When I open the NI-IMAQ devices tab in Measurement & Automation Explorer, I get the following:

    img0: NI PCIe-1433

    Channel 0: Basler A504K

    This is consistent with the Bobcat Configurator finding only a single camera.

    If I click on the Basler A504K, I get a video window with tabs for SNAP, GRAB, etc.

    If I click on the SNAP or GRAB tab, I get the error message UNABLE TO DETECT A RECOGNIZABLE VIDEO SOURCE.

    The 1433 does not appear to be visibly damaged. I don't know whether I have installed the appropriate software, whether I am using Measurement & Automation Explorer correctly, or whether I have a hardware problem.

    Thank you

    FETHIBELFODIL

    fethibelfodil,

    You should be looking at the PCIe-1430, because it is our only frame grabber that supports two Camera Link buses. However, be aware that both Camera Link cameras must use the Base configuration.

    Kind regards

    LVbum42

  • Why are characters getting dropped when using the PCIe-1433 card?

    Hello

    I have several PCIe-1433 frame grabbers in my possession, and I have a routine that sends 1 KB packets of calibration data to a camera, one packet at a time. Usually after a few packets, I get a time-out error because not enough data is received by the camera (the camera acknowledges receipt of each complete packet with a single character). The strange thing is that the same code works fine with the non-1433 frame grabber boards on this computer (PCI-1428 and PCIe-1429, although with those I'm not able to take advantage of the faster transfer speeds).

    I think the problem is the combination of the PCIe-1433 frame grabber AND the model of the computer. I tried the same code on two Dell Precision T3500s, and both allow only a few packets to be sent before a transmission error occurs. When I try the same code and the same 1433 board on a Dell Precision T3400, the code works fine and I am able to send 1000 packets to the camera without error.

    What could be the cause of this problem? I tried changing the Camera Link cables and had the same problem.

    Thank you

    Bruce


  • IMAQ under Win XP x64 with PCIe-1433

    I have an IMAQ application written with VS2008 (managed C++, using PInvoke). The 32-bit version works fine. But I want to build a 64-bit version.

    I have IMAQ 4.4.0 (March 2010 release) installed. My frame grabber is a PCIe-1433. I am running Windows XP Pro x64 on a BOXX Technologies machine.

    When I build an x64 version, I get error 0xBFF60150 from imgSetBufferElement(..., IMG_BUFF_ADDRESS, ...). imgShowError tells me that this means "the operation is not supported for 64-bit applications."

    Does this error mean I can't do an x64 build with IMAQ under XP x64?

    Or does it mean that one of the parameters to imgSetBufferElement(..., IMG_BUFF_ADDRESS, ...) is incorrect?

    Thanks in advance.

    Hi jhc2,

    If you look at the signature for imgSetBufferElement, you will see that the value parameter is defined as a uInt32. That is not compatible with a 64-bit application, where pointers are 64 bits wide. If you take a look at the C examples included with your version of IMAQ, you can see how imgSetBufferElement2 should serve as the replacement in new code (imgSetBufferElement should now be listed as "obsolete" in the header). The new function replaces the old one and is compatible with both 32-bit and 64-bit code. The syntax and parameters that are passed are otherwise identical.

    Eric
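    For illustration, here is a minimal C sketch of that substitution (the helper name and the bufList / bufAddr / bufSize variables are placeholders rather than anything from the original application; check niimaq.h on your installation for the exact constants):

```c
#include <niimaq.h>   /* NI-IMAQ C API header */

/* Hypothetical helper: assign an already-allocated frame buffer to one
   element of an existing buffer list. */
static Int32 assign_buffer(BUFLIST_ID bufList, uInt32 index,
                           void *bufAddr, uInt32 bufSize)
{
    Int32 err;

    /* Old call: the value parameter is a uInt32, so a 64-bit pointer cannot
       be passed and x64 builds fail with 0xBFF60150:
       err = imgSetBufferElement(bufList, index, IMG_BUFF_ADDRESS,
                                 (uInt32)bufAddr);                          */

    /* Replacement: imgSetBufferElement2 takes a pointer-sized value, so the
       same source builds for both 32-bit and 64-bit targets.               */
    err = imgSetBufferElement2(bufList, index, IMG_BUFF_ADDRESS, bufAddr);
    if (err == 0)
        err = imgSetBufferElement2(bufList, index, IMG_BUFF_SIZE, bufSize);
    return err;
}
```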

  • Encoder selection for use with an NI PCI-6221 in an inverted pendulum project

    Hi, I'm a mechanical engineering student in his final year. For my final project I am building an inverted pendulum system. The university has already provided me with a data acquisition card, an NI PCI-6221, and I have to source the other components (DC motor, encoders, cables, servo amplifiers, etc.).

    My concern is choosing the right encoders so I don't run into problems... I was told that the best way to measure the position of the carriage is to use an absolute encoder mounted on the motor shaft and incremental encoders for measuring the pendulum angles, but I don't know whether the 6221 can handle the data from an absolute encoder, and if so, what the main selection parameters are, such as the output format and the number of bits.

    If that doesn't work, I will have to go with incremental encoders for measuring both the position of the carriage and the angle of the pendulum. I believe the 6221 can handle quadrature encoder inputs - there are a lot of examples of this - but since the range of incremental encoder models is quite wide, there are some characteristics I worry about: the frequency response versus the sampling rate, and the output type.

    I found a catalogue that includes two types of digital incremental encoders; it says the models have a 300 kHz frequency response, the only differences being the output type and the power supply. Can the 6221 handle this frequency response? The card's sampling rate is 250 kS/s, so would there be any conflict?

    They offer two types of output: TTL/74LS04 and line driver, and if I go instead with the Hohner encoders, they have the following outputs/frequency responses: RS-422 (TTL compatible) / 300 kHz, differential push-pull / 200 kHz, NPN open collector / 100 kHz, and non-complementary push-pull / 200 kHz.

    Any help would be well received

    PS: I don't know whether a similar topic has already been posted; I'm new at this... I searched other posts but found nothing.


  • Simulating the PCI-6703 card for remote project development

    I am remotely developing an application that uses an NI PCI-6703 card connected to the computer in the lab. How do I choose the right port in DAQmx when I know the specific channel that the hardware is connected to?

    Thank you very much

    Vince

    You can create a simulated device: in MAX, under Devices and Interfaces, right-click NI-DAQmx Devices and choose to create a new NI-DAQmx device. Select the device under NI-DAQmx Simulated Devices, and that's all.
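    Once the simulated PCI-6703 exists in MAX, you address a channel with the same "device/channel" string you will later use on the real hardware in the lab. A rough sketch with the DAQmx C API (the device name "Dev1" and channel ao3 are assumptions; substitute whatever name MAX assigned and whichever channel the real wiring uses):

```c
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle task = 0;

    /* "Dev1/ao3" = device name assigned by MAX + physical AO channel.
       The PCI-6703 provides static +/-10 V voltage outputs.            */
    DAQmxCreateTask("", &task);
    DAQmxCreateAOVoltageChan(task, "Dev1/ao3", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxStartTask(task);

    /* autoStart = 0 (task already started), 10 s timeout, output 2.5 V */
    DAQmxWriteAnalogScalarF64(task, 0, 10.0, 2.5, NULL);

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}
```

    Error handling is omitted for brevity; the same channel string should work unchanged once the project is moved onto the real card.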

  • Cannot add FPGA target to the project

    I installed LabVIEW 2010 with the FPGA and Real-Time modules.

    In MAX under hardware, I see RIO0 under RIO devices.

    I also installed NI-RIO 3.0, which I can see in MAX.

    When I try to add my PCI-7831R FPGA target to my project, I don't see any FPGA targets.

    How can I add my PCI-7831R to my project?

    LabVIEW FPGA and Real-Time 2010 require NI-RIO 3.5.1. NI-RIO 3.0 does not provide the support LabVIEW needs to create targets in the project, even though the FPGA is detected in MAX. You can install NI-RIO 3.5.1 from http://joule.ni.com/nidu/cds/view/p/id/2144/lang/en.

  • Error 0xBFF6001F when using a Basler acA2040-180kc with an NI PCIe-1430

    Hi all

    I am using a Basler acA2040-180kc camera and an NI PCIe-1430 to acquire images. Right now I can't do a snap or a grab in MAX.

    1. I know I need to have the camera file under NI-IMAQ Devices. Initially, I went to NI-IMAQ Devices / img0: NI PCIe-1430 / Port 0, right-clicked, used the option to search ni.com for a camera file, and downloaded the camera file from the Industrial Camera Advisor web site. This is the URL: http://sine.ni.com/apps/utf8/nipc.product?pid=11013&asid=1102. But when I tried to open it, MAX gave me "error 0xBFF60108: the camera file does not support the current interface type."

    2. Then I found that I could use the Camera File Generator. I generated the file, and I could open it in MAX. But after connecting the camera, I still couldn't do a snap or a grab in MAX. It gave me "error 0xBFF6001F: could not detect a recognizable video source."

    I don't know what the problem is now. I am using a National Instruments 199745A-05 PoCL MDR-to-SDR 5 m cable. I plugged Camera Link port 0 of the PCIe-1430 into port "B" on the back of the camera (I guess that means 'Base'). The LED on the back of the camera did not turn on, so I guess it was not connected.

    Moreover, I also borrowed an NI PXIe-1435 board and two SDR cables to connect the camera. That worked. The Industrial Camera Advisor web site says the compatible NI hardware is the NI PCIe-1433 and NI PXIe-1435. So is that the reason? I have the PCIe-1430 on hand, so I would really like to use it.

    This is the first time I have posted a question. Thanks a lot, guys!

    PoCL is Power over Camera Link and allows the frame grabber to power the camera. (To my knowledge) PoCL cables are backward compatible for use in a non-PoCL situation. Since the 1430 does not support PoCL, you will need to power the camera externally. The connector between the Camera Link ports on the camera should let you supply power externally.

    The other problem is that this camera is a Full-configuration camera, while the 1430 supports only Base configuration. It might be possible to put the camera into a Base-configuration tap mode at reduced performance, but you will probably have to modify the camera files yourself.

    I think the easiest path would be to use the PCIe-1433 or PXIe-1435, which support the Full configuration (and also PoCL).

    Eric

  • My Basler acA2040-180km NE isn't visible in NI MAX

    Hello

    I am setting up my new area-scan camera (Basler acA2040-180km NE) over Camera Link. However, I don't see the camera in Measurement & Automation Explorer (version 5.4). I'm guessing this has something to do with the drivers. When I installed the Basler pylon Camera Software Suite, the acquisition card detected the camera as "cam0: Basler GenICam Source." Clicking on it gives the error message "error code 0x80004005 is unknown in imgShowError." And when using the pylon Viewer (64-bit), the camera is nowhere to be seen in the device window either.

    The NI web site says it supports the acA2040-180km (for the PCIe-1433 and PXIe-1435 cards), which I suppose differs only in having a slightly different CMOS sensor.

    The capture card I am using is the NI PCIe-1473R with an onboard FPGA. I am running 64-bit Windows 7 Professional with LabVIEW 2012 SP1, including the latest NI FPGA module with patches.

    So far, I have:

    • Updated the device drivers (February 2013)

    • Updated Vision Acquisition Software to version 5.3

    • Updated NI-IMAQ to the latest version

    Given that I still consider myself a novice when it comes to LabVIEW and LabVIEW programming, is there something simple I've forgotten? Thanks in advance.

    Best,

    Kari

    Hi Garzaa,

    I already answered you through our support channel, but I am copying the answer here so that everyone can benefit!

    The NI PCIe-1473R frame grabber contains a reconfigurable FPGA in the image path that enables onboard image processing. This means that all communication between the camera and the capture card goes through the FPGA. That is a big difference compared with a standard frame grabber without an FPGA.

    It also means that the camera will not show up in Measurement & Automation Explorer.

    With this capture card, you need to build and use an FPGA VI in order to use the camera.

    However, it is still possible to control the camera from the host through a third-party application (such as pylon, for example) if necessary. You can do this by routing the Camera Link serial lines through the FPGA to the host computer. Then, on the host computer, you need to run the serial server, which acts as a serial-to-TCP interface. Finally, you can use the CameraLink Serial Remote System Selector tool in MAX (in MAX, select the 'Tools' menu, then 'NI Vision', then 'Select CLSerNat Systems') to create a virtual serial port that you can use in any application to communicate with your camera. See some information here: http://digital.ni.com/public.nsf/allkb/A56C0DAD5FD5B23286257A61005DF16F?OpenDocument

    The FPGA VI and the serial server you need are already included with the Vision-RIO examples delivered with the NI 1473R. The best advice is to start from an example to see how it works.

    Kind regards

  • Saving images in an array

    Hi all

    I know this has been asked before, and I read a lot of solutions, so I added a bit of detail to clarify the question...

    LabVIEW 2013 SP1

    Vision Assistant 2013 SP1

    PCIe-1433 camera link Frame Grabber

    Camera link, 280 fps

    I had an application that read images from the PCIe-1433 using a single buffer.

    I called the IMAQ Copy Acquired Buffer VI to copy the image out of the acquisition buffer, and then I would write that image to disk.

    That worked well, but disk access would slow things down after a few seconds.

    So, with a little help from NI support, I moved to using multiple buffers (the HL Ring example).

    Now I no longer call the "IMAQ Copy Acquired Buffer" VI; I call the "IMAQ Extract Buffer" VI to get at the images in the acquisition buffers.

    But since I am not copying the buffer into an image I created myself (IMAQ Create), I now only have a reference to the image.

    Is there another VI, or some other way, to get the actual image data so that I can store it?

    At the moment we are going to capture thousands of images for further processing, so I don't really want to create thousands of buffers at setup time to hold all the images (as other examples have illustrated).

    I hope it's clear.

    Any thoughts?

    Thank you

    Jeff

    Hi all

    I discussed this with Earl at NI Technical Support, and it was very helpful.

    Apparently, when I extract an image from a buffer, I get a reference to the buffer location (I knew that).

    When I passed the image to the 'Write to Binary File' VI, that VI dereferenced the image (copying the pixels of the image to the binary file). (Apparently this is not documented anywhere I could find.)

    However, when I replaced the 'Write to Binary File' VI with the 'Insert Into Array' VI, LabVIEW did not dereference the pointer and stored only the reference.

    And there seems to be no 'dereference this pointer' tool.

    So, here is what we came up with...

    After extracting the image from the buffer, I pass the 'Image' (pointer) to the 'IMAQ ImageToArray' VI.

    It gives me a 2D array containing the actual pixels of the image, and NOT a reference.

    I can then save that image data into my array.

    After I capture all my images and the loop ends, I loop through the array, read each 2D array, and pass it to the 'IMAQ ArrayToImage' VI to rebuild the image.

    I then write the image to the binary file for safekeeping.

    In doing so, the existing utility applications that read the binary file should remain compatible with the file that is created.

    Thanks for all the ideas and thoughts.

    Jeff

  • Problems selecting the correct workstation

    Hello

    The project consists of a scanner with a 4096-pixel line-scan device operating at an acquisition rate of 40 kHz, for which we bought, among other products, an NI PCIe-1433 frame grabber. The web being inspected moves at 10 m/s and is 0.9 m wide.

    The main task of the scanner is particle analysis: finding the dimensions and coordinates of the detected particles. The first steps will be binarization, possibly binary morphology, particle filtering, and particle analysis. If all goes well, we will try to find other types of defects that require further filtering.

    We will develop our application using parallel loops and parallelize all possible tasks; we may also need to pipeline the tasks that have to run in series.

    For this application, we decided to use a Dell Precision workstation, but I am having trouble deciding on the right model.

    Mainly, the doubt is which processor is best for these applications. Dell Precision workstations come with Intel i3, i5 (dual-core), i7 (up to 4 cores), and Xeon (5600 series, up to 6 cores) processors. I don't know which one is better or how many cores will be necessary. I guess fewer, faster cores are better than many slower cores - am I right?
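    For rough scale (my own back-of-the-envelope figures, assuming 8-bit pixels), the numbers above imply:

```latex
4096\ \tfrac{\text{px}}{\text{line}} \times 40\,000\ \tfrac{\text{lines}}{\text{s}} \times 1\ \tfrac{\text{byte}}{\text{px}} \approx 164\ \text{MB/s},
\qquad
\frac{10\ \text{m/s}}{40\ \text{kHz}} = 0.25\ \text{mm per scan line}
```

    So sustained throughput (PCIe, memory bandwidth, and disk if raw data is logged) is likely to matter at least as much as the core count.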

    The operating system we intend to use is Win 7 64-bit.

    Also, how much RAM would you suggest?

    Any help will be welcome.

    Thank you.

    Hello

    In the end, we decided to buy a Dell Precision T3500, which has a quad-core 3.2 GHz Intel Xeon and 6 GB of ECC RAM.
