Hardware-timed AI/AO, CI/CO operation with a specific execution order

Hi all

I work with a USB-6361 NI data acquisition device and I'm writing a LabVIEW 2011 program that sends a signal to my output device and reads values from several measuring devices. I have a ramp voltage signal (in array format) and I need to sequentially send the elements of this array to my AO device with very precise (ideally hardware-timed) timing. As soon as a value is written to the AO device, I need my AI and CI tasks to record all available data, and my CO task to send a single pulse, before the next array element is sent to the AO device. All of this needs to be timed very precisely, such that if I perform a measurement on 1000 points in my ramp table with 1000 Hz timing, the program will take exactly 1 s to run (minus the overhead of initializing and closing the tasks) and produce a data table consisting of 1000 steps of AI and CI readings. I initially tried to achieve this using a sequence structure inside a software-timed loop, but once I brought in multiple AI and CI channels and data arrays, this proved to be excessively slow. The other risk I fear I could run into is the case where the LabVIEW program runs more slowly than the data acquisition process itself, causing data to pile up in the buffer and measurements to be lost.

I haven't spent a lot of time working with task timing and synchronization, so I apologize if this is a rather naïve question - I'm having trouble reconciling the inherent software restrictions on execution order with hardware timing. What is the best way to solve this problem? I can only think of two reasonable approaches:

(1) Should I try to trigger my AI and CI tasks to run on completion of the AO task (and if so, how)?

(2) Should I have the AI and CI tasks run on a separate clock with a slight initial start delay to offset these tasks so they reliably occur after each AO value is sent to my device?

Finally, I am building an identical program for a PXIe-8100 real-time embedded controller. Will the approach be different in that case?

Thanks in advance for your answer.

I can give you an overview of the steps you'll need, but be ready to spend a little time experimenting with and troubleshooting some of these individual pieces before you put them all together.

First of all, an X-series 63xx device gives you everything you need to accomplish very precise synchronization in hardware.  When you later switch to a real-time controller, you must include an X-series device in the chassis to retain this hardware capability.

Second, 1 kHz is not much of a challenge for LabVIEW to keep up with your hardware, even on a lower-powered RT controller.  You just need to program the DAQmx tasks to let the hardware & driver do most of the work.

1. I would dedicate one of the 6361's counters to generating a clock shared by the other DAQmx tasks.  I tend to create a clock with a 90-95% duty cycle for this kind of stimulus/response application, to maximize response and settling time.  Generate the stimulus on the leading edge of the clock and acquire the response data on the trailing edge - just before the next stimulus sample.  Call this the clock task.
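Since the thread is about LabVIEW, any text sketch is only illustrative; here is a minimal version of that clock task using the Python nidaqmx package (the device name "Dev1", the counter choice, and the 1000-point / 1 kHz figures are assumptions taken from the question):

import nidaqmx
from nidaqmx.constants import AcquisitionType

# Counter-generated sample clock: 1 kHz, 90% duty cycle. Stimulus goes
# out on the leading edge; responses are captured on the trailing edge.
clock = nidaqmx.Task("clock")
clock.co_channels.add_co_pulse_chan_freq("Dev1/ctr0", freq=1000.0,
                                         duty_cycle=0.9)
# One clock tick per point in the ramp table.
clock.timing.cfg_implicit_timing(sample_mode=AcquisitionType.FINITE,
                                 samps_per_chan=1000)
# Do not start it yet -- the slaved tasks in steps 2-4 start first.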

2. Set up a buffered AO task that uses the clock's output as its sample clock. Configure the polarity to be sensitive to the leading edge (usually, this will be a rising edge).   Start the AO task *before* starting the clock task.

3. Configure the AI and CI tasks to use the clock's output as their sample clocks.  Configure the polarity to be sensitive to the trailing edge.  Start the AI and CI tasks *before* starting the clock task.
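Steps 2 and 3 in the same illustrative Python/nidaqmx style, with the AO task on the rising edge, AI and CI on the falling edge, and all three started before the clock task (channel names are again assumptions):

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

N, RATE = 1000, 1000.0
CLK = "/Dev1/Ctr0InternalOutput"   # terminal driven by the clock task

# Step 2: buffered AO, clocked on the rising (leading) edge.
ao = nidaqmx.Task("ao")
ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")
ao.timing.cfg_samp_clk_timing(RATE, source=CLK, active_edge=Edge.RISING,
                              sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=N)
ao.write(np.linspace(0.0, 5.0, N), auto_start=False)  # preload the ramp table

# Step 3: AI and CI, clocked on the falling (trailing) edge.
ai = nidaqmx.Task("ai")
ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")      # several AI channels
ai.timing.cfg_samp_clk_timing(RATE, source=CLK, active_edge=Edge.FALLING,
                              sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=N)

ci = nidaqmx.Task("ci")
ci.ci_channels.add_ci_count_edges_chan("Dev1/ctr1")   # counter input
ci.timing.cfg_samp_clk_timing(RATE, source=CLK, active_edge=Edge.FALLING,
                              sample_mode=AcquisitionType.FINITE,
                              samps_per_chan=N)

for t in (ao, ai, ci):
    t.start()      # armed and waiting for clock edges
# ...and only now start the clock task from step 1: clock.start()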

4. The output pulse should be a counter output task, configured to generate a single retriggerable pulse.  Configure it to use the leading edge of the clock as its trigger signal.  Set the 'low time' and the 'initial delay' equal to the precise sub-millisecond delay you want, to make sure that you get the same timing on every trigger.  Be sure the high time allows the full pulse to fit within your clock period.
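A sketch of that pulse task in the same style; the counter and the pulse times are placeholders, and the key points are the retriggerable start trigger and making the low time equal the initial delay:

import nidaqmx
from nidaqmx.constants import Edge

pulse = nidaqmx.Task("pulse")
# Single pulse, retriggered by each leading clock edge. Keep
# initial_delay + high_time well inside the 1 ms clock period.
pulse.co_channels.add_co_pulse_chan_time("Dev1/ctr2",
                                         initial_delay=100e-6,
                                         low_time=100e-6,   # == initial delay
                                         high_time=200e-6)
pulse.triggers.start_trigger.cfg_dig_edge_start_trig(
    "/Dev1/Ctr0InternalOutput", trigger_edge=Edge.RISING)
pulse.triggers.start_trigger.retriggerable = True
pulse.start()    # armed; fires once per clock cycle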

5. Look into the "producer-consumer" pattern so that you can separate your DAQ data acquisition from your data processing.  You should stream to file continuously rather than accumulating data in large arrays.  File write speeds can be unpredictable, which is why file access should be in a separate loop that acts as a consumer of the data that the DAQ loop produces.  When you move to real-time, assign a lower priority to the file-writing process.
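The producer-consumer split itself is independent of DAQmx; here is a minimal illustration in plain Python, with a queue decoupling the acquisition loop from the file writer (the file name and data layout are made up for the example):

import queue
import threading

data_q = queue.Queue()   # hand-off between producer and consumer

def consumer(path):
    # Consumer loop: drain the queue and stream each chunk to disk.
    with open(path, "w") as f:
        while True:
            chunk = data_q.get()
            if chunk is None:        # sentinel: no more data
                return
            f.write(",".join(f"{x:.6f}" for x in chunk) + "\n")

writer = threading.Thread(target=consumer, args=("data.csv",))
writer.start()

# Producer loop: in the real program this is where ai.read()/ci.read()
# happen; it only enqueues and never waits on the file.
for step in range(1000):
    data_q.put([step * 0.001, step * 2.0])   # placeholder data

data_q.put(None)                             # shut the consumer down
writer.join()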

-Kevin P

Tags: NI Software

Similar Questions

  • Can't install the VIA Networking Technologies Cardbus PCI Wireless LAN Adapter hardware device on W7; if anyone has the driver I would appreciate it

    I can't install the VIA Networking Technologies Cardbus PCI Wireless LAN Adapter hardware device on W7; if anyone has the driver I would appreciate it.

    The driver worked well with Windows XP, but after upgrading to W7 it gives me problems. I visited the official site, but they don't have the driver for Windows 7.

    Please select your language from the drop-down menu at the bottom of the page to post your question in the language of your choice. The forum in which you've posted is for English only. If you can't find the desired language, support options for additional international sites are available by following the link below:


    http://support.Microsoft.com/common/international.aspx

  • PCIe-653x with Traditional DAQ

    Hello

    We want to upgrade some of our equipment using a PCIe-6536. Is it possible to run this card with Traditional NI-DAQ?

    Does anyone have experience with the PCIe-653x under Traditional DAQ, or is it only possible to run this card with DAQmx?

    Greetings

    Oliver

    Hi Oliver,

    PCIe cards are usually not supported by the old Traditional DAQ driver.

    In any case, you should consider updating your application to DAQmx, as Traditional DAQ support is quite limited (see also the official NI statement below):

    "Caveat ".  NOR-traditional DAQ
    (Legacy) is an older driver with the obsolete application programming
    interfaces
    (API) for the development of acquisition data, instrumentation and control
    requests for former of National Instruments DAQ devices. You must use
    NOR-DAQ traditional (old) that in some circumstances. Refer to the
    File Readme of NOR-DAQ for a glimpse of the two API NOR-DAQ, the
    benefits of
    NOR-DAQmx and more information about when to use NOR-traditional DAQ
    (Legacy)
    including a complete list of devices supported, operating systems,
    application
    versions of the software and the language versions. You can install the most recent
    version of
    NOR-DAQmx software, available at ni.com/downloads.
    "

    Kind regards
    Bernd

  • Dell Inspiron E1505 with Windows Vista: system restore to factory settings ended with an error. The operating system version is incompatible with the Startup Repair tool

    I tried to restore my Dell Inspiron E1505 laptop to the factory settings.  I pressed F8, then selected Repair, and then Dell Factory Image Restore.  It formatted the hard drive and then restored the system.  Suddenly, a small window popped up without any details, just saying "error".  I restarted the laptop, and it entered Startup Repair mode.  But it once again showed the error "operating system version is incompatible with startup repair".  I don't have my Windows CD/DVD, although the Windows sticker is still on the laptop.  Please advise me how to install the system on the laptop so I can use it.  Please note that I am in Gurgaon (India).

    Thank you

    Hello

    Because the computer is in a no-boot situation and you don't have the installation disc, it is best to contact the manufacturer for assistance.

  • Two operating systems; Windows 7 Professional with an option of Windows 10

    Two operating systems;  Windows 7 Professional with an option of Windows 10.

    As a Windows Neolithic, I'm comfortable with Win 7, as all my old XP applications have continued to work on Win 7.  Now I want to try out the current Win 10 operating system to see what Win 10 offers me in terms of practical use. I don't want to lose my existing Win 7 installation, which works well, so I'm thinking of a new PC.

    I need a route which can accommodate my learning needs.  I'm open to suggestions that might involve touchscreen options for the Win 10 OS while keeping mouse-driven navigation for Win 7 applications.  Clues on a postcard!

    To dual-boot Windows 10, you will need to buy a retail copy (it's only free if you use it to replace Windows 7, not to boot alongside it).

    So first, buy a Windows 10 installation CD/USB stick at your favorite tech store and install it as a dual-boot option.

    You will find that there is no "touch" vs. "mouse" dichotomy in using Windows 10.  You can use the mouse as you do today in Windows 7: there is literally not a single instance of a hidden gesture-operated menu that you need to know a secret movement to activate (unlike Windows 8; Windows 10 was designed so that a Windows 7 user could jump right in without any training).

  • Deeper debugging of "deployment operation failed on the agent with an error..."

    I tried to deploy a very simple plug-in, a couple of parameters collected with an SNMP Fetchlet. Nothing complex.

    ilint passes (although I'm not specifying a targets.xml file, as the examples and templates don't seem to correspond to what the DTD expects); here is the result:
    -bash-3.00# ./emctl ilint -d 0 -i sysman/emd/no_targets.xml -m sysman/admin/metadata/my_storage.xml -c sysman/admin/default_collection/my_storage.xml
    Oracle Enterprise Manager 10g Release 10.2.0.1.0
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    Not parsing target instance file (targets.xml)
    Validating target collection file sysman/admin/default_collection/my_storage.xml
    Validating target metadata file sysman/admin/metadata/my_storage.xml...
    Target metadata file sysman/admin/metadata/my_storage.xml successfully validated

    The plug-in imports correctly into Oracle EM Grid Control 10g R5.

    When I try to deploy, I get an "error". I go to the deployment errors and warnings screen, and it says:
    Deployment operation failed on the agent with an error state

    Where can I find some sort of log or deeper information about the error state?

    Paul

    Are the preferred credentials you set for the agent you're trying to deploy to the same credentials as the user who owns that agent install? During deployment, files placed on the agent receive the same permissions and owner as the other files in the agent install.

  • Hardware diag tool, AMD Phenom 9650: PCI 1394 Bus Reset test fails. WHAT to DO?

    Hi all

    I'm new to this forum and I am here because I ran into a problem with my HP diagnostics.  I have an HP Pavilion Elite m9402f I bought about 3-4 years ago.  Recently, my hardware diagnostic tool has been failing the

    PCI 1394 Bus Reset test

    AMD Quad Core Phenom 9650 Pro MB152-F1

    Can someone tell me if this is a common problem, and if there is a simple solution?  Whether it's possibly an error with the diagnostic software, or whether I am looking at a major hardware problem.  Whatever it is, I want to fix this ASAP.

    Any input would be great!

    Thanks for your time

    Jen

    Did the processor pass its test as well?  The reason I ask is that the PCI 1394 Bus Reset test is separate from the processor and belongs to the motherboard subcategory.

    1394 is usually a FireWire port.  Does the system have a FireWire port?

    According to the following picture, #4 is the FireWire port.

    Based on the test results, the FireWire has failed (or is about to fail).  If it is something that you do not use, then I wouldn't be too concerned about it.  The only solution I know of would be to replace the motherboard altogether, which may be more than it would be worth if the port is not used.

    If the BIOS has the ability to disable it (some systems do, some do not), then you can disable the port altogether, and the diagnostic may skip the device since the BIOS should report it as disabled.  Note that most systems do not have this option.

    Let me know if that answers your question.

  • Hardware timing on PXI-6259

    I'm having a problem selecting a clock for the PXI-6259. I've seen several posts on the forum about this issue, but the solution always seems to be "read the manual"; I haven't been able to work it out from that.

    I have a chassis with a 6723 static analog output card and a 6259 multifunction DAQ. I provide 4 sets of 2 digital outputs: a direction line and a pulse line. The user selects a number of pulses, a pulse width, and a destination for the pulse train. I use the number of pulses and the pulse width to set up a clocked digital waveform. I can write the static direction on the 6723 and the pulse train on the HW-timed DO of the 6259.

    My problem is selecting a clock for the pulse train. I tried using the AO sample clock and the AI sample clock to give me the ability to adjust the pulse width. I add an extra sample for the "Wait Until Done.vi", but it times out when either of these clocks is used. If I use the 100 kHz timebase, the VI works fine - but only for a 10 µs pulse. Using the timebase destroys my ability to change the pulse width.

    I have attached the VIs below; it only let me attach three.

    The parent VI - Tach Control, Full.vi

    The pulse task creation VI - Create Pulse Channels.vi

    The output VI - Generate Tach Pulse.vi

    The output selector is a typedef that selects which of the four outputs to send the direction signal and pulse train to.

    The Direction typedef is converted to a Boolean value to write to the direction line.

    I first create an array of tasks for the static DO lines on the 6723 card, one line each.

    Then I create an array of tasks for the pulse lines on the 6259 card, including setting up the clock. I do this by taking the desired number of pulses, doubling it (each pulse takes two samples), then adding one to allow the pulse to settle. I also add one more sample on top of that for the "Wait Until Done" VI. This is the VI where I select which clock to use.

    I then select a pulse and direction task, write the values into the buffer, and explicitly start the tasks. After Wait Until Done returns, I stop the tasks.

    Wait Until Done times out (I used up to 10 s) when the AO or AI sample clock is used. If I remove the Wait Until Done, I get a warning (the task may have stopped before all samples were written) with no output pulse train. It works fine with the 100 kHz timebase, but of course that's fixed and can't produce any sample rate other than 100 kHz/10 µs.

    I have three questions:

    1. The main problem - what am I doing wrong with the clocks?

    2. I put the program in a loop to allow sending to any of the 4 outputs. I can send reverse or forward pulses out of the same output any number of times, but the VI produces an error if I try to switch to another output (I've shut down, so I don't have the exact error code; it tells me that the task is reserved).

    3. Is it necessary to put the task back into the array when I'm done with it (for example, with Replace Array Subset)? What exactly is carried along the task wire?

    Had some extended time for testing today, and I found a solution. It is rather more complicated than the example shows, although it seems obvious once I write it down.

    It takes two tasks and two timing nodes to accomplish hardware-timed digital generation:

    One task and timing node for the digital task, which I had already put in place

    One task and timing node for an analog task to supply the sample clock. I needed an analog task with an unused channel AND a timing node. The analog timing node should have the same sample rate and number of samples as the digital timing node, but the source input should be left blank to use the default clock (the AI sample clock of a dummy analog input channel).
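    For anyone reading along without LabVIEW open, the same trick looks roughly like this in Python with the nidaqmx package (device and line names are assumptions; the original solution used DAQmx timing nodes in LabVIEW):

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE, N_SAMPS = 10_000, 21    # e.g. 10 pulses * 2 samples + 1 to settle

    with nidaqmx.Task() as ai, nidaqmx.Task() as do:
        # Dummy AI task: its only job is to make the AI sample clock run.
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(RATE, sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=N_SAMPS)

        # Digital output task clocked by that AI sample clock.
        do.do_channels.add_do_chan("Dev1/port0/line0")
        do.timing.cfg_samp_clk_timing(RATE, source="/Dev1/ai/SampleClock",
                                      sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=N_SAMPS)

        do.write([bool(i % 2) for i in range(N_SAMPS)], auto_start=False)
        do.start()    # DO arms and waits for clock edges...
        ai.start()    # ...which begin when the dummy AI task starts
        do.wait_until_done(timeout=10.0)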

    Sailing smooth after that. Thanks for the help and insight!

    -Nick

  • Hardware-timed interrupt

    Where can I find sample code for a TMR/CTR channel interrupt so I can create a hardware-timed interrupt?

    e.g. clock frequency = 1 MHz, loop period = 100 ms.

    1. Start the counter.

    2. Trigger the interrupt when the counter = 10000.

    3. Reset the counter.

    4. Run the interrupt code.

    5. Repeat steps 2 to 4.

    Thank you!
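
    As a rough illustration of the sequence above, here is a software emulation in Python; a true hardware-timed interrupt needs support from the board's driver or RTOS, and note that 10,000 counts of a 1 MHz clock gives a 10 ms period, while the stated 100 ms loop period would need 100,000 counts:

    import time

    CLOCK_HZ = 1_000_000      # emulated 1 MHz counter clock
    TERMINAL_COUNT = 10_000   # step 2: "interrupt" when the count reaches this

    def interrupt_handler(n):
        # Step 4: the code to run on each "interrupt".
        print(f"interrupt {n}")

    start = time.perf_counter()           # step 1: start the counter
    for n in range(10):                   # run ten loop periods, then stop
        deadline = start + (n + 1) * TERMINAL_COUNT / CLOCK_HZ
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)         # stands in for the hardware compare
        interrupt_handler(n)              # steps 3-5: reset is implicit; repeat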


  • IMAQ PCI-1424 compatibility with LabVIEW 8.6

    Hi, the computer that I used for image analysis had a broken motherboard, and I replaced it with a new one. I moved both the drives and the board from the old computer to the new one and then installed LabVIEW 8.6 with IMAQ, but LabVIEW 8.6 is not compatible with the software programmed in LabVIEW 5.0, so I uninstalled LabVIEW 8.6 and reinstalled LabVIEW 6.0 with image processing. I still have some problems running the analyzer and image processing software. After uninstalling the software or LabVIEW, the computer does not start correctly, popping up an error such as "check your hard drive to ensure that it is properly configured and terminated. Run CHKDSK /F to check for hard drive corruption, and then restart your computer."

    Some basic information on this computer and the image acquisition system (introduced in 2000) is as follows:

    • Windows 2000 operating system;
    • LabVIEW 5.0 base package;
    • IMAQ Vision for LabVIEW Advanced;
    • IMAQ PCI-1408, IMAQ-BNC-1 cables, and NI-IMAQ for Windows NT/95;
    • IMAQ-A2504-1 (1 m);
    • Sony XC-55 1/3" interline transfer progressive scan CCD, 659 (H) x 494 (V), C-mount with JB-77, SERVICES-12 p-02 and DKT503M;
    • Computar H1212FI 1/2" C-mount 12 mm w/focus and iris (10 inch FOV at 24 in. working dist.);
    • Computar VM100 extension tube kit;
    • GRAFTEK 11" high-frequency fluorescent light;

    Is the IMAQ hardware I use compatible with LabVIEW 6.0 or 8.6? Should I also install drivers for the IMAQ hardware when I remove the IMAQ card? Any solutions for my problems? Thank you very much in advance!

    Hi simpra,

    Here are two links to articles that list compatibility between Windows & LabVIEW, and between LabVIEW and IMAQ.   Why do you say that the image processing software was not compatible with LV 8.6?  Did a message appear?  Or did it simply not work?  Also, are you talking about the Vision Development Module, by chance?

    I'm afraid I can't help with hard drive corruption, if that's the case here.  Have you been able to get past that?

    My suggestion is to upgrade to the latest version possible.

    I hope this helps!

    Kristen H.

  • 2015 MacBook Pro 2.8 GHz > 16 GB RAM > 1 TB PCIe SSD > overheating with 27" Thunderbolt Display > constant MAX RPM fan speed

    Hello community,

    First of all, I want to congratulate Apple on building such a fine laptop. It is a luxury product that is clearly desired by most people, which I understand very well as a new owner. But let's get to the heart of the problem I face, beyond this machine being an engineering masterpiece on the outside.

    Professional background:

    I'm a computer engineer / scientist / programmer by profession. I am currently employed at Hewlett Packard Enterprise (HPE) as a Senior Consulting Engineer level IV for their BIG Data Software Division. Yes, I program (Java and C#), script (PowerShell, batch, and SQL), architect solutions, design, test, and debug software, implement large enterprise-level solutions ranging from a 2-server mirrored cluster to thousands of servers across multiple clusters (even, from a workstation perspective, deployment jobs from 200 KB up to 50 jobs worldwide), work with SQL databases on a daily basis, work with storage devices (multi-million dollar SAN, NAS, and DAS), provide infrastructure troubleshooting and problem resolution, hold 8 internationally recognized certifications and a Master of Science degree, and am involved with side gigs as well as building computers as a hobby. My computer knowledge is vast and my resume goes beyond what I have briefly described. It is important that I give this background to demonstrate that my level of computer knowledge is beyond the scope of this post.

    Laptop Usecase:

    Recently I bought a mid-2015 MacBook Pro 15" with a 2.8 GHz Intel quad-core i7, dedicated AMD GPU, 16 GB of RAM, and a 1 TB PCIe SSD from an authorized Apple dealer. To maximize the beauty of this gem, I also bought a 27" Apple Thunderbolt Display due to its ability to offer a dedicated gigabit NIC and additional USB connectors. Basically, the screen acts as a docking hub for the laptop and offers additional I/O that the laptop lacks (somewhat of a disappointment for a laptop called "PRO" - this can be discussed in another thread).

    I will mostly use the laptop for work-related tasks such as operating on dozens of remote servers via RDP, using Office 2011 and Lync 2016, Eclipse for Java programming, accessing the HPE intranet for work-related tasks, setting up conference calls, connecting with clients remotely via WebEx or TeamViewer, and typical everyday computing. For fun, I'll use the laptop to compose music in Logic Pro X, browse the web, do work with Photoshop/Lightroom and other Adobe products, email, YouTube, Netflix, coding, and other basic computing tasks.

    Question:

    I am primarily a Windows engineer at heart and don't work much with Unix/Linux, so forgive me if certain acronyms and terminology are slightly off when it comes to the OS.

    The issue I am currently facing is how HOT and LOUD the laptop gets during very light use. It seems that having the 27" Apple Thunderbolt Display connected causes the heating and the fans to be excessive and loud. To "somewhat" work around the issue, I unplugged the Thunderbolt Display's power adapter and kept the laptop on its own dedicated adapter. This "somewhat" reduces the amount of heat generated, which makes the fan speed a little quieter. However, the fan noise level and heat are not reduced enough to convince me!

    After checking the processes while the laptop was extremely loud and warm, I noticed the only busy process was "kernel_task". However, I debunked this theory by letting the unit return to a normal state, which required me to disconnect the Thunderbolt Display, and the process ran again without any of the negative heat and noise effects.

    So why would Apple create a laptop that indicates the external monitor is fully compatible, can support up to 2 external displays with no problems, and provides support for this configuration, if there are problems when connecting one?


    * Please do not respond that I received a bad laptop or a bad monitor. That appears not to be the case, because when I am doing absolutely nothing on the machine, heat is minimal and the fans are "fairly" quiet, even when connected to the Thunderbolt Display. However, even simple web browsing on the Thunderbolt Display maxes out the fans, and heat is generated quickly.

    I love this laptop. It's elegant, fun to work on, luxurious, and potentially meets all my needs. But I just can't justify spending more than $4K on a configuration that will cause problems and significantly reduce the lifetime of the laptop.

    1. Should I return the laptop and display, and wait for the Skylake edition, which will be released in the coming months (reduced energy consumption, increased power and performance, and possibly a different GPU that performs much better than the "crap" AMD GPU)?

    2. Or should I stick with it and see if a solution or resolution is offered for the issues I am experiencing?

    3. I see the same problems reported all over the internet. Does anyone else here have the same problems?

    4. Why would Apple put such a "cheap" GPU in such a powerful machine? AMD? Really? (less than $150)

    5. How can I host a Lync call with clients when I hear the laptop more than I can hear them?

    My thoughts:

    I spent $6,000.00 building an incredible machine at home. It has an Intel i7 5960X octa-core processor, an nVidia GTX 980, 64 GB of Corsair DDR4 RAM, 12 TB of 7200 RPM disk, a 512 GB SSD for the OS and temporary storage, a 1200 watt power supply, a 5-in-1 card reader, a blazing fast Blu-Ray optical drive, liquid cooling, a premium Asus sound card, the "best!" motherboard on the market (as I finished that statement the fans just went back to MAX RPM - maybe because I'm showing off my personal build on an Apple forum?), a mechanical keyboard, a 12K dpi laser mouse, and five 24" monitors.

    I was willing to spend that money on a luxury laptop that I can travel with, but is it really worth what I'm going through? Also, others on the internet said they exchanged their MacBook Pro for the same model without the dedicated GPU and no longer encounter the problem. BUT why would someone invest $3K in a machine without a good graphics card? IT MAKES NO SENSE TO ME.

    PLEASE, I BEG YOU! Can someone at APPLE help me with this question?

    If I am not convinced otherwise, then Apple has lost me as a customer forever; I will return to Windows and dispute the $4,500.00 I spent. If I am convinced to stay, then Apple will get $4K out of me every 3-4 years, plus whatever I invest in software.

    Also, if I could post a video showing you what I'm hearing, you would be in shock. Honestly, I'd like to keep this machine, but I need help understanding what is happening and what can be done to correct the problem.

    Thank you

    mjaestewart

    Anyone from Apple have any insight? I am pretty surprised not to have received a response.

    Please help me out here.

    Thank you

    mjaestewart

  • PCI-4070 FlexDMM with Windows 7

    The datasheet shows Vista/XP compatibility, and I was wondering whether an update or something else provides Windows 7 compatibility? Thank you.

    The latest NI-DMM driver is version 3.0.5, which works with the PXI/PCI-4070 and with Windows 7. It is available for download here:

    NI-DMM 3.0.5 - Windows Server 2003 R2 32-bit/XP 32-bit/Server 2008 R2 64-bit/Vista/7

    The supported hardware (DMM) devices and the software requirements are listed in the Readme on the driver download page.

  • Maximum read/write latency for PCI-DIO-32HS with NI-DAQmx?

    I was asked to evaluate using the PCI-DIO-32HS and NI-DAQmx on Mac OS X to essentially close a control loop.  My first concern is whether or not it can run fast enough.

    So my questions are: what is the maximum latency from the moment a read strobe signal is received from the external hardware, through reading a single 16-bit sample, to sending it up through NI-DAQ to a user-space application?  And similarly, what is the maximum latency of sending a single 16-bit value from a user-space application down through NI-DAQ, writing it to a port, and asserting a write strobe signal to the external hardware?

    Thank you.

    Hi AliasMe,

    Thanks for posting and welcome to the NI Forums!

    Do you need to process the data in software and generate a value based on it?  The DIO-32HS (a.k.a. PCI-6533) offers buffered input and output, but you'd have latency in your control system from relying on the OS to do the processing.  Since this time is system-dependent, I can't give a specification for how long it would take.  However, getting the data into memory, processing it, and writing back to the card would likely be on the order of several milliseconds.

    Generally, we recommend FPGA in control situations like this, so you can do all the processing in hardware.  However, if you provide more details on your application, we can look into all the available options.

    -John

  • Flat sequence structure inside a timed loop and execution order

    I have some problems trying to use a flat sequence structure with a timed loop in a cRIO target VI.

    I have tried with and without a while loop around the flat sequence structure, and I also tried replacing the "non-deterministic loop" with a timed loop.

    The problem is that the program seems to run only once, then gets stuck somewhere.

    I am writing a program that performs the following operations as fast as possible:
    1. Read the Pos_MC input on the FPGA
    2. Send the value of Pos_MC to the target VI (on the cRIO CPU)
    3. Calculate an output value based on Pos_MC with a PID block ("PID output")
    4. Send "PID output" to the FPGA
    5. Write "PID output" to the analog output "MOOG"

    In addition, I want the program to return the measured value "Pos_MC" to a host VI for data logging.

    So that the PID output is calculated and sent to the FPGA as quickly as possible, I placed a flat sequence structure to ensure this happens before the output is sent to the non-deterministic loop for data logging.

    Also, I want the digital input "Stop" to be able to stop the deterministic loop (the timed loop).

    I actually read many more inputs than this and use several PIDs and outputs, but I rewrote the code with a single input and output to make it easier to illustrate.

    Screenshots of the code are shown in "target code.png" and "fpga code.png".

    The VIs themselves are attached in the next post (cannot attach more than 3 files).

    Question 1:
    Any advice on how to get this running? Thank you!

    Question 2:
    Also, is my understanding correct that, using this structure, every 0.9 ms (FPGA loop time) the following happens:
    1. The input ("Pos_MOOG") is read on the FPGA
    2. The PID output is calculated on the cRIO with some computation delay (for example 0.1 ms)
    3. The PID output is then written to the analog output "MOOG", all in about 0.1-0.2 ms
    4. The FPGA program then waits until the 0.9 ms has elapsed and repeats the process

    As opposed to the following happening each time a loop execution starts on the FPGA:

    1. The FPGA reads the input and writes the output (the PID output from the previous loop execution)

    2. Then the input is sent to the cRIO, and the PID output is calculated and sent to the FPGA

    3. The new PID output is held until the next pass through the loop

    Thank you!

    PHG wrote:

    Thanks for the input guys, any advice as to how I could get the behavior in scenario 1?

    I still say the best route is just putting all the control logic in the FPGA.

    Other alternatives include 1) using DMA FIFOs to send data back and forth, or 2) using interrupts so that the FPGA code does not read the output value until the RT has written it.

    DMA FIFOs are usually quite limited in number, and I would not use them in this situation since I believe you said this code has to work for many outputs.

  • What operating system is now available with the new Macs?

    About to buy a new iMac. Need it very soon, as in ordered by next week. Prefer El Cap installed (would rather wait until Sierra has gone through the initial shakedown before putting a new operating system to work). Does anyone know which OS ships from the Apple Store now (since Sierra has not yet been released to the public)? And if El Cap is still available, when does anyone anticipate it will no longer be? (I want to configure what I buy, and places like MacMall, while large, are not as versatile.)

    Without doubt El Capitan, as Sierra is not released until September 20.
