FPGA data clock enable (5171R)

Hello!

I'm working with a PXIe-5171R reconfigurable oscilloscope.  It is installed and the sample program works well, but I run into trouble when I try to write my own program.  In addition to the usual 40 MHz clock, the 5171R has a 250 MHz clock that drives the A/D converters (125 MHz and 375 MHz clocks are also provided).  When I run my program, it behaves as if the data clock is not running.  I have attached the code I am running, which sends a value down to the FPGA via DMA FIFO, crosses it into the 125 MHz clock domain, then immediately sends it back into the 40 MHz domain and up to the host (again via DMA FIFO).  This program runs as expected when everything is in the 40 MHz domain, but when I run the second loop on the 125 MHz (data) clock, I get nothing back.  The 5171R also has 8 I/O lines, and I set pairs of them to toggle on the 250 MHz, 40 MHz, and 125 MHz clocks as a sanity check; only the 40 MHz clock appears to be active.

I spent some time looking through the instrument design libraries to see if there was some step required to enable the data clock or to set its source (it may be switchable between an internal and external source), but I found nothing.  The user base for these devices isn't large, but maybe someone has an idea of where to look to get this clock running?  At one hour per compile, my trial and error has been quite slow!

Thank you very much!

Ben

Hi again,

I kept digging on this and finally found the place in the IDLs where the clock is set.  It looks like I misunderstood the degree to which the IDLs are necessary to operate the device, so in a sense the premise of my original post was wrong.  If I may make a humble suggestion: it would be nice to have some documentation giving an overview of the libraries and how they map onto the physical capabilities of the device.  Or maybe it already exists and I just haven't found it?  I can't complain too much, since the libraries are completely open, but determining what they do by browsing each function is a rather painful process!

Tags: NI Software

Similar Questions

  • How to increase the FPGA data clock?

    I use LabVIEW 14 and NI-USRP 14.5.  When I plug in my new USRP-2940R-120 and run a program written for a USRP-2940R-40, the driver gives me a message that says "You are using a bitfile configured for a data clock which is slower than the rate required for the bandwidth of this device. Risk of aliasing in your signal." (error 1073383043).

    Fair enough.

    Then I looked at the data clock properties in the project and found I could not change the values there.  So... how can I change the FPGA data clock frequency, and what should I change it to?

    Thank you.

    Hi emonk,

    We have created a new FPGA target that samples at 200 MS/s to use the increased bandwidth. You need to recompile your application using the new target. My recommendation on the best way to proceed is as follows:

    1. Generate a new sample project. With NI-USRP 14.5, all the sample projects now have both targets (120 MS/s and 200 MS/s).

    2. Did you customize the FPGA in your application? If not, all you have to do is use the bitfile for the 200 MS/s target from the sample project in your host code. If you did, you will have to redo those customizations in the top-level FPGA VI of the new target. This can be trivial or difficult depending on your application.

    3. You will then need to recompile. Because of the faster clock rate, timing closure is more difficult, and your FPGA IP may need to be optimized if you work in the data clock domain. I suggest kicking off 5-10 compiles at first because of the variability between compilations. If all of them still fail, use the timing reports to study where optimizations are needed.

    Let me know if you have any questions or run into any problems.

  • Synchronize the FPGA clock to the RT clock?

    Hello

    I use an sbRIO-9612.  Data are acquired over several weeks, and the problem is that the RT clock drifts. I found a technical document on synchronizing the RT clock with an SNTP server:

    http://digital.NI.com/public.nsf/allkb/F2B057C72B537EA2862572D100646D43?OpenDocument

    But I cannot find anything on the FPGA clock. My data are acquired by the FPGA, so my question is: how can I synchronize my FPGA clock with my RT clock or the SNTP server? (This is probably a stupid question, but it states my problem clearly.) Is the sbRIO suitable for my needs? Should I abandon FPGA-based acquisition and use a different hardware architecture to achieve synchronized data acquisition?

    Thanks in advance for any help.

    Julien

    Hi Julien,

    Take a look at "RT master FPGA synchronization Example.vi" on the FPGA Timekeeper page.  There's a subVI which uses a timed write to periodically push the RT time down to the FPGA, so that the FPGA can keep a time domain synchronized with respect to RT. If you have questions about this example, try posting to the project's Discussion category.

    -Steve K

  • PCI-6534 receives data twice per clock

    I have a simple test set up to check the operation of the PCI-6534 card.  A test generator writes a differential (hi-lo) 16-bit word of zeros with a single one bit-shifted toward the left (MSB).  It is sent to a custom board that converts the differential signal to single-ended and then feeds it to the PCI-6534 card.

    Something like this:

    0000000000000001

    0000000000000010

    0000000000000100

    0000000000001000

    0000000000010000

    etc...

    I have two different computers, each with a PCI-6534 installed in exactly the same configuration except the driver.  Computer 1 has PCI-6534 driver version 1.12.0f0 (circa 2006).  Computer 1 shows the pattern above as expected.  Computer 2 has driver 2.0.0f0 (circa 2007).  Computer 2 shows the pattern below:

    0000000000000001

    0000000000000001

    0000000000000010

    0000000000000010

    0000000000000100

    0000000000000100

    0000000000001000

    0000000000001000

    0000000000010000

    0000000000010000

    So it looks like computer 2 gets the same data on the rising and falling edges of each clock cycle.

    I was about to compare the clock seen by the 6534 on computer 1 against computer 2 with the test data generator to show the difference.  However, in doing so, I discovered that computer 2 now shows:

    0000000000000001

    0000000000000010

    0000000000000100

    0000000000001000

    0000000000010000

    I can't reproduce the problem even as I ask the question!

    Any ideas?

    Thank you!

    We have two different vintages of differential boards.  It turns out that the newer boards have a slightly different flavor of chip than the older 26LS32.  When we swapped the chips on the newest board with those from the old one, we were able to solve the problem.

    So, with the non-functional chips installed, if we turn the clock down reeeaal slow, we can get multiple samples of data because the differential-line noise is strong enough that it gets picked up as clock edges.  If we turn it up to, say, 6 MHz, everything is fine, because at that point the clock runs at a higher frequency than the noise and out-clocks it.

    I have yet to understand which parameters of the new chips are really the root cause of the problem, but this works.

    -Shrew

  • How fast and precise is the FPGA clock reference?

    Maybe I need to build something like the below, but the required accuracy is about 10 ns. Can an FPGA help with this? I'm a newbie on FPGAs.

    - Triggered by an external pulse, start a counter counting a high-frequency clock signal (maybe 100 MHz).

    - Triggered by a different external channel, stop the counter and latch the result.

    - From the latched result, calculate the elapsed time between the first and second triggers.
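    The arithmetic of the scheme above is simple; as a sketch (the latched count below is a made-up example value):

```python
# Counter-based interval measurement: a counter clocked at 100 MHz gives
# 10 ns per tick, so the count latched at the second trigger converts
# directly into the elapsed time between the two triggers.
clock_hz = 100e6
tick_s = 1 / clock_hz                 # 10 ns resolution per tick
latched_count = 12_345                # hypothetical value latched at trigger 2
interval_s = latched_count * tick_s
print(f"tick = {tick_s * 1e9:.0f} ns, interval = {interval_s * 1e6:.2f} us")
```

    So the 10 ns target accuracy corresponds to exactly one tick of a 100 MHz counter; any additional error comes from trigger synchronization and clock stability.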

    Thank you

    Hey,

    The precision of our FPGA clocks is generally 250 ps, but from your post it seems you are talking about generating a signal with the specified precision.

    I suggest you contact your local NI office with a detailed description of your application to see what hardware suits it best.

    Christian

  • Using a C Series DIO module with an SCTL slower than the top-level FPGA clock

    Hey all,

    I've been researching this problem online without a lot of success.

    I have a cRIO-9030 chassis with integrated FPGA and a 40 MHz top-level clock.  I have an NI 9401 DIO C Series module plugged in and set to be managed by the FPGA target.  I need to count some linear encoders at exactly 10 MHz, no more, no less.  They are periodic and behave such that if I oversample or undersample, I get garbage.

    If I create an SCTL and assign it a derived clock source of 10 MHz, I get a code-generation error:

    "An FPGA I/O read node for DIO3 is used in a clock domain that it does not support.  Supported clock domains include: the top-level clock and clocks that have a rate that is a multiple of 40 MHz, for example 40 MHz, 80 MHz, 120 MHz and so on."
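    The rule in that error can be checked numerically; a minimal sketch (the function name is illustrative):

```python
# The 9401 I/O node only works in clock domains whose rate is a positive
# multiple of the 40 MHz top-level clock, so a 10 MHz SCTL is rejected.
def dio_domain_supported(rate_hz: float, base_hz: float = 40e6) -> bool:
    """True if an SCTL at rate_hz may contain the DIO I/O node."""
    return rate_hz >= base_hz and rate_hz % base_hz == 0

for mhz in (10, 40, 80, 120):
    print(f"{mhz} MHz supported: {dio_domain_supported(mhz * 1e6)}")
```

    A common workaround (not from this thread, just a standard technique) is to run the SCTL at 40 MHz and act only on every 4th tick, giving an effective 10 MHz sampling rate.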

    I have tried several ways to work around this.  First I tried just using a while loop with a loop timer set to 4 ticks, but then it takes 9 clock cycles to perform the count for some reason (even though the code compiles in the SCTL without any problem).  I then tried to use the SCTL with a constant 'true' as a hack for a 'timed sequence' structure, and that certainly did not work.

    Are there any strategies, techniques, or settings somewhere to work around this limitation so that I can sample the DIO at exactly 10 MHz?  I'd like to solve this in software and get things rolling as soon as possible.

    An image of the relevant section of the code is attached; I'm happy to provide more on request.

    Thank you very much!

    Maia Bageant

    Thanks for the reply!  The problem ended up being a hardware issue in how the encoders were connected.  Now that I've fixed it, they're perfectly happy being oversampled.

    I guess my question is still legitimate for other applications, but it is no longer necessary for these encoders.

  • "Aggressive Wi-Fi to Cellular handover" developer option enables/disables itself on restart

    Hi, I have experienced the behavior in the subject line since installing the Marshmallow firmware update. Is it possible to prevent this? Firmware version is 23.5.A.0.575.

    This is a feature that turns itself off on every reboot. It is not up to Sony to change this behavior but Google, and they will not do such a thing because it can drain the battery and cause high data usage.

  • HOW CAN I STOP MY DATE AND CLOCK FROM GOING BACK TO 01/01/2001

    SOMETIMES AT STARTUP MY PC WILL GO BACK TO 11/01/2001 AND I HAVE TO MANUALLY FIX THE INFO... THIS HAPPENS A LOT AND IS A NUISANCE. WHAT CAUSES THIS AND HOW CAN I STOP IT FROM HAPPENING?  THE PC WON'T LET ME DO ANYTHING UNTIL I CORRECT IT.

    POOPERDO

    This is usually caused by the CMOS battery.  If you have not changed yours in a few years, that is almost certainly the reason.  Google how to change it for your computer.

  • On-board and derived FPGA clocks

    Hello

    I use a PCIe-7852R FPGA card to collect data at 200 kHz on 5 channels. I was pretty confident that my acquisition frequency was correct, because my program was based on a derived 40 MHz clock. I had chosen to use a derived clock because the on-board clock runs at 40.02 MHz (instead of the specified 40 MHz, for some reason unknown to me).

    However, today a member of the LabVIEW support team told me that the FPGA clock rate may change during compilation and optimization (constraints). He stressed that if the FPGA cannot run that fast, an error is displayed (as you might expect). He also said that in some cases the on-board clock can run faster, and in that case no error is displayed. I have seen no errors in the past (which means the system had no problem running that fast); however, I'm not sure whether during one of my compilations my derived clock could have run faster than 40 MHz. Can that happen? It would affect the timing of all my previously acquired data. Unfortunately, I have not saved any of my compilation logs.
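    For scale, the effect of that 40.02 MHz oscillator on timestamps that assume exactly 40 MHz can be estimated (a sketch using the numbers from this post):

```python
# A clock that actually runs at 40.02 MHz while software assumes 40 MHz
# makes every "counted second" slightly shorter in real time.
nominal_hz = 40.00e6
actual_hz = 40.02e6
error_ppm = (actual_hz - nominal_hz) / nominal_hz * 1e6
true_per_nominal = nominal_hz / actual_hz     # real seconds per counted second
print(f"error = {error_ppm:.0f} ppm")
print(f"1 hour of counted time = {3600 * true_per_nominal:.2f} real seconds")
```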

    Thank you very much in advance.

    Kind regards

    Varun

    Hi VarunSreenivasan,

    I think our support engineer meant that compile logs will show that some pieces of logic could operate at higher speeds. During compilation, the compiler determines the fastest clock frequency at which your logic could run and ensures that the configured clock frequency is less than or equal to this value. If you had decided to try to run your code at 200 MHz and the logic could not run that fast, you would get an error. If you decided to run your logic at 200 kHz and it could run faster, the compilation logs will show you the maximum frequency that would work, but the logic will still run at 200 kHz. The code will run no faster than the frequency you specified.

  • FPGA card and DAQ synchronization

    Hi, we control and acquire data from several hardware devices (including photodetectors and translation stages). Until last week, we did all the control and acquisition using a PCIe-7852R FPGA board. However, we decided to move the acquisition part to a PCIe-6363 DAQ card to improve the voltage resolution. During testing, I found that the internal clocks of the FPGA and DAQ cards are slightly inconsistent (not just a phase delay, but a difference in period).

    I know because I generated a square wave (period = 20 µs) using the FPGA and acquired it using the DAQ card (at a rate of 200 kHz, i.e. 1 sample every 5 µs). I observed the acquired trace shift by 5 µs roughly every 5 seconds. Such a shift does not occur if the generation and acquisition are done on the same board. Therefore, the only explanation is that the DAQ and FPGA card clock frequencies are different. According to my calculations, the fractional difference between their clock periods must be about 5 µs / 5 s = 0.0001%.
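    That estimate can be reproduced with a few lines (a sketch; the numbers are those quoted above):

```python
# Estimate the fractional clock mismatch between the FPGA and DAQ boards
# from the observed drift: the trace slips one 5 us sample period about
# every 5 seconds.
sample_period_s = 5e-6      # DAQ sampling at 200 kHz
drift_s = 5e-6              # observed slip per observation window
window_s = 5.0              # the window over which the slip accumulates

fractional_mismatch = drift_s / window_s
print(f"fractional mismatch = {fractional_mismatch:.1e} "
      f"({fractional_mismatch * 100:.4f} %)")
# Offset this implies at a nominal 40 MHz:
print(f"offset at 40 MHz = {40e6 * fractional_mismatch:.0f} Hz")
```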

    Therefore, I wonder if there is any way to synchronize the clocks to each other. Or is it possible to drive the FPGA clock from the DAQ hardware, or vice versa? Also, please let me know if I am missing something trivial.

    Thank you very much.

    Kind regards

    Varun

    Hi Varun,

    My previous post was only one possible solution...

    Your DAQ card can take an external input to clock its sampling. In this mode, samples are taken on a rising edge of the external clock signal. As long as you stay within the limits of the DAQ hardware (100 MHz for your card), sampling works perfectly. There are even examples shipped with LabVIEW explaining how to program your DAQ card this way...

    In this mode you use your FPGA as the sample clock source for the DAQ. Both will then run on the FPGA clock, in sync. If the FPGA is a bit off from 40 MHz, it won't matter, because both devices are triggered by the same clock signal...

  • LabVIEW FPGA: IP Integration Node clock wrong

    Hello

    I'm having some difficulty understanding how the clock works in the IP Integration Node in LabVIEW FPGA and was hoping to get some advice.

    What I am trying to do is set up a digital logic circuit with a MUX feeding an 8-bit parallel shift register. I created the schematic for this in Xilinx ISE 12.4, set it up, and was able to import the HDL code into an IP Integration Node. When I run the VI, I am able to choose between the two inputs to the MUX, load the output into the shift register, clear the shift register, and activate the CE.

    My problem is that when I toggle the CE input, it should start shifting 1s (Boolean true, high, what-have-you) into the register, one per clock period. Unfortunately, it instantly makes all 8 bits 1s. I suspect it's a clock issue, and here are some of the things I've tried:

    - Specifying the input clock while going through the IP node configuration process.

    - Adding an FPGA Clock Constant as the timed loop's clock source.

    - Removing the timed loop and just specifying the clock input (I'm not able to run the VI; I get an error calling for a timed loop).

    - Not specifying the clock input in the IP node configuration and wiring an FPGA Clock Constant to the clock input (I can't, because the input is generated as a Boolean).

    - Reverting to an earlier version of the CE that had two inputs going into a gate in ISE.

    - Specifying the CE in the IP node configuration process.

    - Not specifying the CE in the IP node configuration and wiring it separately.

    - Various reconfigurations of the same things that I don't remember.

    I think I'm doing something wrong with the clock, and that's the problem. Previously, when I asked the board about importing ISE code into LabVIEW FPGA, a clock signal was not necessary and I was advised to just use a timed loop. Now I need to use one but am unable to find an explanation online, since this is an IP Integration Node.

    Any advice would be greatly appreciated; I'm working on a project that will require understanding how clocks operate in IP Integration Nodes.

    Thanks in advance,

    Yusif Nurizade

    P.S. I have attached my ISE schematic and the LabVIEW project with one of the incarnations of the VI. The site won't let me add a .vhd file as an attachment, but if it would help I could just paste the body of the VHDL code, so let me know.

    Hello Françoise,

    I spoke to the NI engineer about this topic, and it seems it was sufficient to verify that your code works by putting a 500 ms wait function in the while loop and checking that the registers load and clear. I'm glad it worked out!

  • GPS PPS-disciplined counter independent of the 40 MHz on-board FPGA clock

    Could you give me some advice or point me in the right direction?

    I need an accurate time/tick counter synchronized to a GPS PPS timing signal.

    I have an S.E.A. cRIO GPS module that delivers a PPS signal to the FPGA backplane. The PPS rising edge precisely marks the beginning of each second.

    When I measure the number of ticks of the embedded 40 MHz FPGA clock within one (PPS) second, the count is not 40,000,000 ticks but something like 39,999,800.

    I want to measure the time between rising/falling edges of digital signals with precision. Because the 40 MHz FPGA clock drifts, I can't use it directly.

    How could I set up my own tick counter that is disciplined by the GPS PPS signal?

    I don't expect that I can adjust the FPGA's on-board 40 MHz oscillator.

    Should I create my own counter which increments by 1.000005 (40,000,000 / 39,999,800) on each tick, running in a timed loop (clocked from the embedded 40 MHz FPGA base)? The increment value (1.000005) would be updated every second based on the PPS vs FPGA clock drift.
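    That correction scheme can be sketched as follows (illustrative names; a real FPGA implementation would use a fixed-point accumulator rather than a float):

```python
# PPS-disciplined counter: each second, the PPS interval is measured in
# FPGA ticks, and the per-tick increment is rescaled so the counter
# advances by exactly the nominal tick count per true (GPS) second.
NOMINAL_TICKS = 40_000_000            # ideal ticks per second at 40 MHz

def per_tick_increment(measured_ticks_per_pps: int) -> float:
    """Increment per FPGA tick so the counter gains NOMINAL_TICKS per PPS."""
    return NOMINAL_TICKS / measured_ticks_per_pps

inc = per_tick_increment(39_999_800)  # measured count from the post
print(f"increment per tick = {inc:.9f}")
# Over one PPS interval the disciplined counter accumulates:
print(f"counts per true second = {inc * 39_999_800:.0f}")
```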

    My configuration: cRIO-9025, cRIO-9116, S.E.A. GPS cRIO module, high-speed DI module NI 9402, LabVIEW 2011 RT/FPGA

    Looking forward to hearing from you.
    Peter

    Hi all

    This could help you.

    FPGA Timekeeper

    (I do not understand how this could have been released on 23/05/2012.)

  • NI 6561: data clocked at 36 MHz

    Hello

    I need to send/receive LVDS data to/from a product. Data must be clocked at 36 MHz, and the clock resides in the product.

    Can you confirm that if I use a PCI-6561:

    - I need a 36 MHz external clock connected to the CLK IN connector (the on-board clock can't be used?)

    - I have to export the sample clock on one of the CLK OUT ports.

    Thanks for your help,

    Anne

    Hi Anne,

    You're right on all points.  The 6561 can only generate frequencies of 200 MHz/N with its on-board clock.  You could get 40 MHz or 33.3 MHz, but not 36 MHz.  You can connect a 36 MHz clock source to CLK IN and use it as your sample clock source.  You can also export this clock with your data (if you use the DDC connector, data and clock will have a guaranteed phase relationship).

    Hope that helps,

    Keith Shapiro

    National Instruments R & D
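    The 200 MHz/N rule above is easy to check with a quick sketch:

```python
# The 6561's on-board clock generates only 200 MHz / N for integer N, so
# 36 MHz is unreachable; the nearest divisor rates bracket it.
target_hz = 36e6
freqs = [200e6 / n for n in range(1, 11)]
print(["%.3g MHz" % (f / 1e6) for f in freqs])
closest = min(freqs, key=lambda f: abs(f - target_hz))
print("closest = %.4g MHz" % (closest / 1e6))   # 33.33 MHz (N = 6)
```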

  • How to generate an 8.192 MHz clock on the LabVIEW FPGA 7854R series card?

    Hello

    I use the NI PXI-7854R card, which has a Xilinx Virtex-5 FPGA on it. I am deriving an 8.192 MHz clock from the 40 MHz on-board FPGA clock, but it ends up giving me a plain 8 MHz clock. Is it possible to access the PLL located on the Xilinx FPGA?

    Regards

    If you are familiar with VHDL you can make your own 'component-level IP' (aka CLIP) with a Xilinx PLL inside.  There are help topics and examples showing how to define a CLIP.  Here's one that shows how to instantiate a DCM (similar to a PLL) in a CLIP:

    http://zone.NI.com/reference/en-XX/help/371599F-01/lvfpgahelp/fpga_clip_clock_ex_code/
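    For context on why the derived clock rounds to 8 MHz: a derived clock is an integer ratio of the 40 MHz base, and the exact ratio that 8.192 MHz requires is far outside typical multiplier/divisor ranges (a sketch; the exact limits depend on the FPGA and tool version):

```python
# 8.192 MHz from 40 MHz requires the ratio 128/625; a denominator of 625
# is far beyond what a DCM/PLL divider accepts, so the tool falls back to
# a realizable ratio, here 1/5, i.e. 8 MHz (as observed in the post).
from fractions import Fraction

ratio = Fraction(8_192_000, 40_000_000)
print(f"exact ratio needed: {ratio}")               # 128/625
print(f"realizable ratio 1/5 -> {40e6 / 5 / 1e6} MHz")
```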

  • How can I change the FPGA clock for my entire code?

    Hi, I am new to LabVIEW and am using the 8.6 demo (FPGA and Real-Time modules).

    I have developed some code for practice and want to change the clock frequency from 40 MHz to 200 MHz for all of it.

    I mean: how do I use a derived clock for my VI?

    You create an FPGA derived clock from the '40 MHz Onboard Clock' in the project manager.

    Then in the properties for the FPGA target, you can select the 'Top-Level Clock'.  The top-level clock for the FPGA can be one of the following: 40 MHz, 80 MHz, 120 MHz, 160 MHz, 200 MHz.
