How to increase data FPGA clock?

I use LabVIEW 14 and NI-USRP 14.5.  When I plug in my new USRP-2940R-120 and run a program written for a USRP-2940R-40, the driver gives me a message that says "You are using a bitfile configured for a data clock which is slower than the rate required for the bandwidth of this device. Risk of aliasing in your signal." (error 1073383043).

Fair enough.

Then I looked at the properties of the data clock in the application project, and I could not change the values there.  So... how can I change the data FPGA clock frequency, and what do I change it to?

Thank you.

Hi emonk,

We have created a new FPGA target that samples at 200 MS/s to use the increased bandwidth. You need to recompile your application using the new target. My recommendation on the best way to proceed is as follows:

1. Generate a new project from the sample. With NI-USRP 14.5, all the sample projects now have both targets (120 MS/s and 200 MS/s).

2. Did you customize the FPGA in your application? If not, all you have to do is use the bitfile for the 200 MS/s target of the sample project in your host code. If you did, you will have to redo those customizations in the top-level FPGA VI of the new target. That can be trivial or difficult depending on your application.

3. You will then need to recompile. Because of the faster clock rate, timing closure is more difficult, and your FPGA IP may need to be optimized if you work in the data clock domain. I suggest kicking off 5-10 compiles at first because of the variability between compilations. If all of them fail, use the timing reports to study where optimizations are needed.

Let me know if you have any questions or encounter any problems.

Tags: NI Products

Similar Questions

  • data FPGA clock enable (5171r)

    Hello!

    I work with a PXIe-5171R reconfigurable oscilloscope.  It is installed and the sample program works well, but I'm in trouble when I try to make my own program.  As well as the usual 40 MHz clock, the 5171R has a 250 MHz clock that drives the A/D converters (125 MHz and 375 MHz clocks are also provided).  When I run my program, it behaves as if the data clock does not work.  I have attached the code that I am running, which sends a value down to the FPGA via DMA FIFO, passes it into the 125 MHz clock domain, then sends it immediately back to the 40 MHz domain and back to the computer (again via DMA FIFO).  This program runs as expected when everything is in the 40 MHz domain, but when I run the second loop on the 125 MHz clock (data clock), I get nothing back.  The 5171R also has 8 digital I/O lines, and I set pairs of them to toggle on the 250 MHz, 40 MHz and 125 MHz clocks as a sanity test, and I only see the 40 MHz clock active.

    I spent some time looking into the instrument design libraries to see if there was some step that had to be taken to enable the data clock or set the clock source (it can be derived from an internal or external source), but I found nothing.  The user base of these things isn't great, but maybe someone has an idea of where to go to get this clock running?  At 1 hour per compile cycle, my trial and error has been quite slow!

    Thank you very much!

    Ben

    Hi again,

    I kept digging on this and finally found the place in the IDLs where the clock is set.  It looks like I misunderstood the degree to which the IDLs are necessary to operate the device, so in a sense the premise of my original post was in fact wrong.  If I could make a humble suggestion: it would be nice to have some sort of documentation that gives an overview of the libraries and how they map to the physical capabilities of the device.  Or maybe it exists already and I just have not found it?  I can't complain too much, since the libraries are completely open, but determining what they do by browsing each function is a rather painful process!

  • Satellite L300 PSLB8E - how to increase the ram clock speed

    I have a laptop which is equipped with 2 slots for ram PC2-6400
    1 - Samsung module M4 70T5663QZ3-CF7 [800 MHz] 2GB
    2 - Samsung module M4 70T2864QZ3-CF7 [800 MHz] 1 GB
    GPU: Intel GM45
    Processor: Intel Core 2 Duo T5870 Mobile @ 2000 MHz
    According to the manufacturer, this kit works at a 667/800 MHz memory speed, but the problem is that benchmark programs show the actual frequency of the RAM as 399 MHz.

    Question No. 1:
    (a) Why does the memory clock run at only 399 MHz and not 800 MHz?
    (b) How do I change it to 800 MHz?

    Question No. 2:
    I'm going to buy a Penryn Core 2 Duo T9600 (1066 MHz FSB, 2.80 GHz) or T9400 (2.53 GHz) processor
    and install it in the computer,
    and increase the memory to 2 x 4 GB 800 MHz (8 GB).

    I have read that this possibility is provided for by the manufacturer, according to the Maintenance Manual for:
    Toshiba computer
    Satellite L300/L305
    Satellite Pro L300
    EQUIUM L300
    SATEGO L300
    (PSLB8x/PSLB9x)
    (PSLBAx / PSLBBx)

    Will this operation help to improve system performance?

    I don't know what programs you use, or whether their info is good or not; maybe it is, maybe not. The fact is that you don't have any influence on how the memory operates, and I doubt that you can change those values. Note that DDR2-800 transfers data on both clock edges, so the 800 MT/s rating corresponds to a 400 MHz I/O clock; a reported frequency of 399 MHz is therefore the expected reading.

    I don't have the exact specification for your laptop, but the Satellite L300 should have the Intel GM45 chipset, and I presume your motherboard's FSB is 667 MHz. You can upgrade to 8 GB, and a compatible 4 GB RAM module has part number PA3670U-1M4G.

    Upgrade the CPU on a Satellite L300? I don't know, man. A friend of mine has this laptop model, and I don't know if it's the right decision to spend money upgrading this old budget model.
    http://APS2.toshiba-tro.de/KB0/TSB9401AX0001R01.htm

  • How can I use the internal clock of the NI USB-6259 BNC for digital data acquisition in my own larger software?

    I want to integrate the ANSI C sample program ReadDigPort-ExtClk.c into my own larger package.

    I want to use the internal clock of the NI USB-6259 BNC (80 kHz .. 120 kHz).

    In the document:
    High-Speed M Series Multifunction DAQ for USB - 16-bit, up to 1.25 MS/s, built-in BNC connectivity,
    it is written:
    DI sample clock source: Any PFI, RTSI, AI Sample Clock, AI Convert Clock, AO Sample Clock, Ctr n Internal Output, and many other signals.
    The digital subsystem does not have its own dedicated internal timing engine. Therefore, a sample clock must be provided by another subsystem on the device or from an external source.

    How can I use the internal clock of the NI USB-6259 BNC for digital data acquisition in my own larger software?
    Which other subsystem on the device can generate a clock source, and how?

    It is possible to set up a clock on an internal counter (for example "Dev1/ctr0"). This creates a channel that generates digital pulses defined by freq and dutyCycle and adds it to the task specified by taskHandle:
    DAQmxCreateCOPulseChanFreq(taskHandle, "Dev1/ctr0", clockName, units, idleState,
        initialDelay, freq, dutyCycle); // works

    But it is not possible to route this internal clock to a terminal by naming the terminal as the counter (for example "/Dev1/PFI0"):
    DAQmxErrChk(DAQmxCreateCOPulseChanFreq(taskHandle, "/Dev1/PFI0", clockName, units, idleState,
        initialDelay, freq, dutyCycle)); // does not work. DAQmx Error: Measurements: Physical channel I/O type does not match the I/O type required for the virtual channel you are creating. Physical channel name: PFI0. Virtual channel name: clock

    The sample clock source can be an external terminal (for example "/Dev1/PFI0"). This sets the source of the sample clock, the sample clock rate, and the number of samples to acquire or generate:
    DAQmxCfgSampClkTiming(taskHandle, "/Dev1/PFI0", maximumExpectedSamplingRate, DAQmx_Val_Rising,
        DAQmx_Val_ContSamps, bufferSize); // works: acquires samples continuously until the task is stopped

    But it is not possible to use the internal counter's channel name as the clock source (for example "Dev1/ctr0"):
    DAQmxCfgSampClkTiming(taskHandle, "Dev1/ctr0", maximumExpectedSamplingRate, DAQmx_Val_Rising,
        DAQmx_Val_ContSamps, bufferSize); // does not work. DAQmx Error: Make sure the terminal name is valid for the specified device. See Measurement & Automation Explorer for valid terminal names. Property: DAQmx_SampClk_Src. Device: Dev1. Source terminal: Dev1/ctr0

    Hi datafriend,

    what the document you quoted says is correct:

    DI sample clock source: Any PFI, RTSI, AI Sample Clock, AI Convert Clock, AO Sample Clock, Ctr n Internal Output, and many other signals.
    The digital subsystem does not have its own dedicated internal timing engine. Therefore, a sample clock must be provided by another subsystem on the device or from an external source.

    This means that if you do not use an external signal as the clock, you can use the on-board AI sample clock or the internal output of a counter.

    There are also 2 ANSI C examples in this regard:

    http://zone.NI.com/DevZone/CDA/EPD/p/ID/4485

    http://zone.NI.com/DevZone/CDA/EPD/p/ID/4488

    So in both cases you have to use a dummy task that exists only to generate the internal clock (AI or CTR).
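    The dummy-counter approach described above can be sketched in ANSI C with NI-DAQmx. This is only a sketch, not a tested program: the names "Dev1", "Dev1/port0", the 100 kHz rate, and the terminal "/Dev1/Ctr0InternalOutput" are assumptions to verify in Measurement & Automation Explorer, and it needs the NI-DAQmx driver and hardware to run (error checking omitted for brevity).

    ```c
    /* Sketch: clock a DI acquisition from a dummy counter task on a device
     * whose digital subsystem has no timing engine (e.g. USB-6259 BNC).
     * Device and terminal names are assumptions - check yours in MAX. */
    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle clkTask = 0, diTask = 0;
        uInt8 data[8000];
        int32 read = 0, bytesPerSamp = 0;

        /* 1. Dummy counter task: a continuous 100 kHz pulse train on ctr0. */
        DAQmxCreateTask("clk", &clkTask);
        DAQmxCreateCOPulseChanFreq(clkTask, "Dev1/ctr0", "",
                                   DAQmx_Val_Hz, DAQmx_Val_Low, 0.0,
                                   100000.0, 0.5);
        DAQmxCfgImplicitTiming(clkTask, DAQmx_Val_ContSamps, 1000);

        /* 2. DI task clocked from the counter's *internal output terminal*,
         *    not from the counter channel name "Dev1/ctr0". */
        DAQmxCreateTask("di", &diTask);
        DAQmxCreateDIChan(diTask, "Dev1/port0", "", DAQmx_Val_ChanForAllLines);
        DAQmxCfgSampClkTiming(diTask, "/Dev1/Ctr0InternalOutput", 100000.0,
                              DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

        /* 3. Start the clock first, then the acquisition, then read. */
        DAQmxStartTask(clkTask);
        DAQmxStartTask(diTask);
        DAQmxReadDigitalLines(diTask, 1000, 10.0, DAQmx_Val_GroupByChannel,
                              data, sizeof data, &read, &bytesPerSamp, NULL);
        printf("read %ld samples per channel\n", (long)read);

        DAQmxStopTask(diTask);  DAQmxClearTask(diTask);
        DAQmxStopTask(clkTask); DAQmxClearTask(clkTask);
        return 0;
    }
    ```

    The key point, matching the error messages above, is that the clock consumer names the routed terminal ("/Dev1/Ctr0InternalOutput"), while only the counter task itself names the channel ("Dev1/ctr0").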

  • How can I change the FPGA clock for all of my code

    Hi, I am new to LabVIEW and am using the 8.6 demo (FPGA and Real-Time modules).

    I developed some code for practice and want to change the clock frequency from 40 MHz to 200 MHz for all of the code.

    I mean: how do I use a derived clock for my VIs?

    You create a derived FPGA clock from the "40 MHz Onboard Clock" in the Project Explorer.

    Then, in the properties for the FPGA target, you can select the "Top-Level Clock".  The top-level FPGA clock can be any of the following: 40 MHz, 80 MHz, 120 MHz, 160 MHz, 200 MHz.

  • How do you do the clock and always visible date in windows 8?

    Hello

    On earlier operating systems, the date and clock were always in the lower right.  Windows 8 only shows the date and clock for a moment.  I need the date and clock visible at all times when I'm on the computer, whether I'm on websites or elsewhere.  I have searched some topics, and there doesn't seem to be a solution.

    I must have this; if it isn't an option, then someone at Microsoft needs to figure it out.

    Thank you.

    Hi Carmen,

    Unfortunately, that option is not available; Windows 8 is designed this way.

    Hope this information helps.

  • Synchronize the FPGA clock to the RT clock?

    Hello

    I use an sbRIO-9612.  Data is acquired over several weeks, and the problem is that the RT clock drifts. I found a technical document on synchronizing the RT clock with an SNTP server:

    http://digital.NI.com/public.nsf/allkb/F2B057C72B537EA2862572D100646D43?OpenDocument

    But I cannot find anything on the FPGA clock. My data is acquired by the FPGA, so my question is: how can I synchronize my FPGA clock with my RT clock or with the SNTP server? (This is probably a stupid question, but it clearly explains my problem.) Is the sbRIO suitable for my needs? Should I give up FPGA-based acquisition and use a different hardware architecture to perform synchronized data acquisition?

    Thanks in advance for any help.

    Julien

    Hi Julien,

    Take a look at "RT master FPGA synchronization Example.vi" on the FPGA Timekeeper page.  It contains a subVI that periodically writes the current RT time to the FPGA, so that the FPGA can keep a time domain synchronized with RT. If you have questions about this example, try posting to the project's Discussion category.

    -Steve K

  • How to increase the sampling rate in this VI?

    Hi all

    I have recently inherited this mess of a VI and cannot figure out how to increase the sampling rate. I tried changing the "ms to wait" clock, but it does not add more data points. The main VI is attached, along with the subVI that contains a DAQ Assistant for a load cell and an LVDT. Any ideas on how to improve the sampling rate without a complete overhaul would be greatly appreciated!

    Thanks in advance!

    If you found this helpful, feel free to give kudos and mark the topic as resolved.

  • My internet speed is too low. How to increase internet speed?

    I use a broadband internet connection. I checked my internet speed, and it is too low. How can I increase it?


    Hello

    :D You cannot increase your internet speed yourself!
    It depends on your internet service provider, and only the ISP can increase it!
    If you have problems with data transfer speed, I recommend contacting your internet service provider!

    Good bye

  • How to save data in the text file of Spartan 3

    Hi all

    I would like to save a data table to a text file or a spreadsheet file from a VI, using a Spartan-3E FPGA as the target. Once I added the file-operation functions, I got an error saying that these functions are not supported by the target device.

    could you please help me with this

    Thank you

    Rania

    Hi David,

    Thank you for posting. Are you using LabVIEW? If so, what version of LabVIEW FPGA do you use? Are you using a host VI, or do you deploy any code to your target to run? The file I/O VIs probably won't compile for the target because they are intended to be used on your host computer. File resources and paths do not exist on the FPGA target, but rather on the host side. I have included a link below that describes how to transfer data between the FPGA and the host. I hope this helps!

    http://zone.NI.com/reference/en-XX/help/371599F-01/lvfpgaconcepts/pfi_data_transfer/

  • How to increase the UDP Write VI transmit size

    I get the following error trying to send a message using the UDP Write VI:

    Error 113 in UDP Write in dataServer.vi

    Possible reasons:

    LabVIEW: A message sent on a datagram socket was larger than the message buffer internal or other network limit, or the buffer used to receive a datagram was smaller than the datagram itself.

    Does anyone know how to increase the size of the internal buffer to allow larger datagrams?

    Hello Davida2000,

    You should be able to use 8192 bytes, which is 0x2000 in hexadecimal. The problem you are experiencing is due to Windows buffering, so the best solution may be to use a loop with a queue and extract the data from there.
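    For what it's worth, the limit behind error 113 is the OS-level socket send buffer, not LabVIEW itself. A minimal C sketch (standard BSD socket calls, shown here instead of LabVIEW) of inspecting and raising that buffer with SO_SNDBUF; the 8192-byte figure is the one from the answer above:

    ```c
    /* Sketch: request a larger UDP send buffer and read back what the
     * kernel actually granted. Standard POSIX socket API. */
    #include <stdio.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* Return the UDP send-buffer size after requesting at least `want` bytes. */
    int udp_sndbuf_after_request(int want)
    {
        int s = socket(AF_INET, SOCK_DGRAM, 0);
        if (s < 0)
            return -1;

        /* Ask the kernel for a larger send buffer... */
        setsockopt(s, SOL_SOCKET, SO_SNDBUF, &want, sizeof want);

        /* ...then read back the granted size (Linux, for example, doubles
         * the requested value to account for bookkeeping overhead). */
        int got = 0;
        socklen_t len = sizeof got;
        getsockopt(s, SOL_SOCKET, SO_SNDBUF, &got, &len);
        close(s);
        return got;
    }

    int main(void)
    {
        /* 8192 bytes = 0x2000, the size mentioned above. */
        printf("granted send buffer: %d bytes\n", udp_sndbuf_after_request(8192));
        return 0;
    }
    ```

    Datagrams larger than this buffer (or the network's limit) fail at the sendto() level, which is the condition LabVIEW surfaces as error 113.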

  • Urgent: How to get data analog sbRio 9632?

    Hi ~
    I am new to LabVIEW.
    I want to ask how to get analog data from the sbRIO-9632 with the FPGA.
    For example, I want to get the voltage/current of a solar panel,
    so I connect the unit to an analog input.
    But how do I write a program to read the data?
    Need urgent help ~ thanks ~

    Hi GTHao,

    I think you can consult this manual for the sbRIO-9632. It has a step-by-step guide on how to extract data from an sbRIO (it talks about RIO in general, which includes cRIO, but they are more or less the same as far as getting analog data is concerned).

    Please check the voltage and current limits that the sbRIO analog input pins can take before you connect anything.

    I hope it will be useful.

    Thank you

    Warm greetings,

    Lennard C

  • How to increase the Tablespace/datafile max_size

    Hello

    The query below shows the current size and max_size of the data file. Now I want to increase the size of the data file.

    SQL> select FILE_NAME, TABLESPACE_NAME, BYTES/1024/1024, MAXBYTES/1024/1024 from dba_data_files
      2  where TABLESPACE_NAME like '%TABLESPACE_ONE' order by TABLESPACE_NAME;

    FILE_NAME                            TABLESPACE_NAME   BYTES/1024/1024   MAXBYTES/1024/1024
    ------------------------------------ ----------------- ----------------- -------------------
    /xxx/yyyyy/ABCD/data/file_name.dbf   TABLESPACE_ONE    4988              5000

    I want to increase the max_size up to 6 GB.

    Please suggest how to increase the datafile max_size to 6 GB.

    Thank you and best regards,
    Vincent.
    Setting the maximum size:
    It only makes sense to set a maximum size for a datafile if autoextend is on; hence, the statement that sets the maximum size requires specifying autoextend on.
    alter database datafile '/xxx/yyyyy/ABCD/data/file_name.dbf' autoextend on maxsize 6G;
    
  • How to increase disk space without guest operating system reinstallation

    Hello

    I use VMware with a Linux host and a Vista 64-bit guest, but I have run into a problem: I am running out of disk space.

    During the installation of Vista, I chose 25 GB of disk space, as recommended by the installation guide.

    Now I need more space.

    Following the virtual disk manager documentation, I ran

    vmware-vdiskmanager -x 40GB mydisk.vmdk

    to increase the disk to 40 GB. It seemed to work, as I received the message:

    "Disk expansion completed successfully."

    Now the VM settings window says the HD is 40 GB. But internally, under Vista, the disk space is still 25 GB. In fact, the size of the disk file (viewed on the Linux side) is only 22 GB. I have no idea how to increase the disk space the guest sees. I could start over from installation, but that's a bit painful.

    Thank you.

    dw104

    If the guest is Vista or Windows 2008, then you should be able to extend the partition while the system is running.

    Take a look at this article: http://www.bleepingcomputer.com/tutorials/tutorial133.html

    Windows NT/2000/XP/2003 won't let you do that while the disk is online.  Vista and Windows 2008 changed things.

    For those pre-Vista/2008 versions of Windows, look for the Dell ExtPart.exe utility, as it will do the job for you.

    Cheers,

    Jase McCarty

    http://www.jasemccarty.com

    Co-author of VMware ESX Essentials in the virtual data center

    (ISBN:1420070274) Auerbach

    Please consider awarding points if this post was helpful or appropriate

  • How to increase the Buffer Cache Hit Ratio

    Hi all

    my database performance is low; how do I increase the buffer cache hit ratio?

    Database version: 8.1.7.0.0
    Database up since: 09:56:23, September 23, 2010
    Buffer Cache Hit Ratio: 81.6157
    Library Cache Miss Ratio: 0.03
    Dictionary Cache Miss Ratio: 6.6979

    [Shared pool usage] Exec time 0 seconds
    Total unused MB: 251.88
    Total used MB: 98.12
    Total MB: 350
    % pool used: 28.04

    The Buffer Cache Hit Ratio is a meaningless indicator of system performance.
    Are any users complaining? If there are no complaints, there is no problem.

    The best way to increase the buffer cache hit ratio is to run statspack, identify the offending SQL, and tune that SQL.

    ----------
    Sybrand Bakker
    Senior Oracle DBA
