/r/FPGA

A subreddit for programmable hardware, including topics such as:

  • FPGA
  • CPLD
  • Verilog
  • VHDL


Discord Server:

Related subreddits:

  • /r/ECE: general electrical and computer engineering discussion
  • /r/electronics/: general electronics discussion
  • /r/AskElectronics, /r/electronic_circuits: electronics help / discussion
  • /r/chipdesign: discussion on (hardware) chip design
  • Other FPGA related subreddits: /r/fpgagaming

    Links to tools to get started:

    Meme posts allowed on Fridays ONLY. Please make sure to flair.

    /r/FPGA

    63,889 Subscribers

    2

    FPGA share in Defense/Aerospace

    Looking at the current US defense job market, I see that there's very low demand for "software" jobs like MCU programming. Researching, I found that 10 years ago there were various DSPs, for example TI's C6678, built specifically for radar applications; now I hardly see them, and TI seems to be retiring from that niche too.

    However, it seems like FPGAs are doing really well in that field. So the question is: apart from highly parallel data processing (radar, etc.), what applications require the nanosecond performance edge that hardware design can provide?

    1 Comment
    2025/02/02
    11:05 UTC

    1

    25% Pay Cut for More Interesting Design Role?

    Hello,

    I am about to graduate in June with a MSEE. I have two job offers on hand but I’m having a really hard time deciding which one to take.

    The first job is higher paying ($125k base with up to 20% profit sharing, $15k sign on bonus, $12.5k relocation bonus). It is a post-Si validation role for a chip company in the Bay Area.

    The second job is lower paying ($110k with no profit sharing, no sign-on bonus, $5k relocation bonus), but it is a power electronics design role for a defense company in San Diego.

    Including the yearly bonus of 20%, I would be taking a 25% pay cut by taking the design role. However, hardware design is significantly more interesting to me than hardware-validation Python scripting. My thesis project is also focused on power electronics, and I've heard that the growth you experience as a design engineer is very valuable.

    In my early career, should I take the money, or the more interesting job?

    Will the money literally “pay off” in the long run over taking a more interesting job?

    3 Comments
    2025/02/02
    10:19 UTC

    0

    FPGA for NPU

    Is an FPGA a good route for an NPU device? If so, would anyone have any recommendations? I need it mostly for inference; the model will be doing object detection. I figure it will definitely need a good amount of RAM and, of course, the ability to run some version of Linux. I really appreciate any help with this, thanks!

    2 Comments
    2025/02/02
    03:31 UTC

    2

    AI and advancements in PnR (place and route)?

    I'm asking here even though the question applies to PnR for both FPGA and ASIC design flows.

    Have EDA companies gained any meaningful improvements in this stage of the design process using AI? I mean "real AI", not the current vibe where anything remotely software-related gets called "AI".

    I ask because I'm still skeptical of AI (LLMs specifically) churning out great front-end RTL or testbench components. They seem great for getting ideas and creating skeleton code, but nothing I'd actually put in production or use verbatim.

    Back-end design processes, however, seem a lot riper for AI advancements to have a huge impact, but that's my high-level view. Curious if anyone has in-depth opinions or has seen publicly available work in the industry that I could go research.

    0 Comments
    2025/02/02
    00:38 UTC

    17

    Hardware acceleration

    How hard would it be to use hardware acceleration with FPGA on a STM32 Nucleo board?

    I am developing a robot for a student robotics competition and learning digital design in the meantime.

    Among other things, the robot calculates its current position and the voltage output to the motors every 1 ms, and I thought maybe it would be good to accelerate this with an FPGA? Just an idea; I don't know how possible or practical it is.

    7 Comments
    2025/02/01
    22:23 UTC

    0

    Low power clock design for socs

    I'm using Vivado for the first time and I have a project submission in a day. I have all the code, but the only thing I know how to do is add the first clock divider .v and .tb files and simulate them. I have further code as well, but I don't know where to add it or how to proceed. At the end I have to show a comparison of the power analysis of both. Need help ASAP.

    5 Comments
    2025/02/01
    17:20 UTC

    3

    Clocking Blocks in SystemVerilog

    I'm trying to understand how clocking blocks work and I'm a little bit stuck. I have this example task here:

    (I spent 15 minutes trying to format a multi-line code block and include a waveform image in the same post)

    https://preview.redd.it/gyoiom823kge1.png?width=723&format=png&auto=webp&s=4a93f6eb5579ec3fc78623bb386c98d1711080b3

    My understanding from the LRM is that @(cb) is equivalent to @(posedge clk), since I've defined the clocking block's event to be the rising edge of clk. As I read the task, I would expect the address and read-enable bit to be driven for one and only one clock cycle. This does not happen, and I have not been able to find a sufficient explanation as to why. I'm sure the problem lies in my understanding of how clocking blocks work, but I am at a loss as to why. I should point out that I am not using program blocks, only modules.

    Now, in the simulator, I see this:

    So the signals are changing in the clocking block, but on the negative edge of the clock?! Can someone explain why I see what I see and how this is the expected behavior based on what I read in the LRM regarding clocking blocks?

    https://preview.redd.it/h4z1m73e2kge1.png?width=740&format=png&auto=webp&s=4a90f03bd4d5c9660d343b7feafc9e5c42937c8e

    4 Comments
    2025/02/01
    16:21 UTC

    0

    Programming FPGAs on MacOS: How-to

    31 Comments
    2025/02/01
    15:01 UTC

    3

    Please give criticism

    Hey all, a few days ago I posted asking for your biggest frustrations while developing. For the most part, it seemed like the biggest problems were slow build times, lack of collaboration, and having basically only two or three options of very mid development software. I put together this mock idea and wanted your thoughts. In short, it would be a VS Code-like IDE where synthesis and simulation go through the cloud and there are zero local installs. No installing Vivado, etc.; development goes through the browser (or not, if that's better), and teams have the ability to collaborate. As of right now I'm just using open-source platforms, but if you guys think this is a good idea I will try to get the big players in. Right now I mess around with FPGAs as a hobby, so I would appreciate any advice to make this a real idea, cause tbh I probably don't know my left from my right even though I feel like a quant:

    https://fabrix-cbaf6b.webflow.io/

    13 Comments
    2025/02/01
    10:30 UTC

    10

    Can my college turn my license off?

    In a lab class, we completed two files in VSCode. My laptop died, so I had to wait until the end of the day to return to my room and charge it. Whenever I tried to run the programs in ModelSim to get my wave graphs, I would get an LM_License_Error notification which I could not figure out how to fix. However, when I tried again the next morning it seemed to work fine. I tried again today in the evening and found the same error. The environment variable we all use is given by the college, so does anyone know if they are able to close our licenses in the evening or something? Do I just have a weird error, or is it my school?

    7 Comments
    2025/02/01
    01:22 UTC

    2

    FPGA to FPGA communication with Optical Transceivers

    I have two of these FPGA boards, and I would like to communicate the chip temp and chip voltage of one to the other. One will basically be used as a DAQ and not send any data itself. I tried to write my own IPs to do this and am having issues interfacing with the XADC. I wrote IPs that just send serial data to the tx pin of the transceivers and read the rx pins, based on a simple data structure: 1 byte start delimiter, 1 byte payload length, n bytes of payload, and 1 byte checksum. I have yet to test it because I cannot get the data out of the XADC properly. So my question is: can anyone help me interface with the XADC, or am I going about this all wrong, and is there a better way with the provided IPs? If that is the case, what would a good structure be for the IP block design?
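As a sanity check of the framing, the byte layout described above (start delimiter, payload length, payload, checksum) can be sketched host-side in Python. The 0x7E delimiter value and the modulo-256 sum checksum are assumptions for illustration; the post does not specify either.

```python
# Host-side sketch of the frame format from the post:
# [1B start delimiter][1B payload length][N bytes payload][1B checksum]
# Delimiter value (0x7E) and sum-mod-256 checksum are assumed, not from the post.

START = 0x7E

def checksum(payload: bytes) -> int:
    # simple additive checksum over the payload, truncated to one byte
    return sum(payload) & 0xFF

def encode(payload: bytes) -> bytes:
    if len(payload) > 255:
        raise ValueError("payload too long for a 1-byte length field")
    return bytes([START, len(payload)]) + payload + bytes([checksum(payload)])

def decode(frame: bytes) -> bytes:
    if frame[0] != START:
        raise ValueError("bad start delimiter")
    n = frame[1]
    payload = frame[2:2 + n]
    if frame[2 + n] != checksum(payload):
        raise ValueError("checksum mismatch")
    return payload
```

A frame carrying, say, a 16-bit temperature and a 16-bit voltage reading would then be a fixed 4-byte payload, which keeps the receiver-side state machine on the FPGA trivial.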

    7 Comments
    2025/01/31
    22:09 UTC

    1

    Help Needed, DE10-Lite Board Instantly Fails on Startup

    The programmer instantly fails when clicking start. The Altera USB-Blaster driver is up to date. I am at a loss as to what could be causing it. This program was working a couple of months ago; the only difference is I have a new (identical) DE10 board and a new USB-Blaster cable.

    https://preview.redd.it/nhvuqctshege1.png?width=794&format=png&auto=webp&s=9e7892e4927e2188bd335518a86b6c3633e55423

    1 Comment
    2025/01/31
    21:51 UTC

    1

    Vitis 2024.2 doesn't work

    Is Vitis 2024.2 not working for you too? I can't create a platform; I click the button and nothing happens.

    2 Comments
    2025/01/31
    19:28 UTC

    1

    A workaround to overcome the "full Vitis installation error" and start vitis --classic

    If you have a full Vitis installation and you're still seeing:

    Error: --classic option is only supported by full Vitis installation.
    If you wish to run the classic Vitis IDE, please download the AMD Unified Installer and perform a full Vitis installation.

    You can use this:

    awk '/Error: --classic/{skip=1; next} /exit/ && skip{skip=0; next} {print}' $(command -v vitis) > myvitis && chmod +x myvitis && ./myvitis --classic

    More info:

    In 2024.2, bin/vitis is a bash script that gates vitis --classic on two empty marker files in the Vitis install: .vitis_embedded and .vitis_for_hls.

    The "one-liner" above removes the exit. It's written so that it may work across versions. Alternatively, you can delete or rename those marker files.

    Here's the logic:

    vitisNewOption=1
    for arg in "$@"; do
     
    ...
     
      elif [[ "$arg" == "-classic" || "$arg" == "--classic" ]]; then
        vitisNewOption=0
      fi
    done
     
    ...
     
    if [ -e "$XILINX_VITIS/.vitis_embedded" ]; then
      export VITIS_EMBEDDED_INSTALL=true
    fi
    if [ -e "$XILINX_VITIS/.vitis_for_hls" ]; then
      export VITIS_HLS_INSTALL=true
    fi
     
    ...
     
    if [ $vitisNewOption == 0 ]; then
       if [[ "$VITIS_EMBEDDED_INSTALL" == "true" || "$VITIS_HLS_INSTALL" == "true" ]]; then
        echo "Error: --classic option is only supported by full Vitis installation."
        echo "If you wish to run the classic Vitis IDE, please download the AMD Unified Installer and perform a full Vitis installation."
        echo
        exit 1
       fi
       "$RDI_BINROOT"/loader -exec rdi_vitis "${@:2}"
       exit $?
    fi
    0 Comments
    2025/01/31
    15:13 UTC

    19

    undergrad tryna get an internship

    hey guys, im a CompE freshie who went down the rabbithole of FPGA design and coding. i love how different it is as compared to regular coding and am quite interested in learning more (i have gone through most of the resources on Nandland)

    i kinda want to try and get an internship so i can have more exposure and im looking for any advice to beef up my portfolio! im currently taking a digital design course a year in advance because i want to be able to take higher level modules on it HAHA

    do you guys have any advice / any recommendations for a solid project for my resume?

    thanks in advance for yalls guidance and support 🙏

    3 Comments
    2025/01/31
    11:43 UTC

    1

    Quartus Signal Tap: Display/show signal twice?

    https://preview.redd.it/mkdonqc4tage1.png?width=1357&format=png&auto=webp&s=0a18fde019eea9312aafe78e8799f42890de6b48

    Hey guys, does anybody know how to display/show the same signal twice in Signal Tap?
    The only workaround that I found is to create another instance that then runs in parallel with the first. Obviously, I could assign another signal and list it, but it's just cumbersome shit.
    Any help would be appreciated!

    0 Comments
    2025/01/31
    09:30 UTC

    23

    Veryl 0.13.5 release

    I released Veryl 0.13.5. Veryl is a modern hardware description language, designed as an alternative to SystemVerilog. This version includes the following features:

    • Support to override dependencies with local path
    • Introduce inst generic boundary

    Please see the release blog for the detailed information:

    https://veryl-lang.org/blog/annoucing-veryl-0-13-5/

    Website: https://veryl-lang.org/

    GitHub: https://github.com/veryl-lang/veryl

    6 Comments
    2025/01/31
    07:10 UTC

    2

    Tips for hardware algorithms in Verilog

    I've been learning Verilog and can implement basic algorithms like Vedic multiplication and Carry Lookahead Adders (CLA). However, when I try to tackle more complex ones like CORDIC or SRT division, I get overwhelmed. There are so many architectures and reference codes available, and I struggle to figure out which approach to follow.

    How do you break down and choose the right architecture when implementing these algorithms? Any tips on understanding existing reference code without getting lost in the details? Any help would be appreciated! Thank you for reading

    2 Comments
    2025/01/31
    03:45 UTC

    5

    Best approach to fast data streaming through FPGA or Zynq

    Hi everyone, apologies in advance for the beginner's question and the long post.

    For a university project I am looking into collecting data from multiple synchronized ADCs and streaming it to PC, I believe an FPGA or Zynq-based solution is needed due to the high data rate, and I'm looking for advice regarding the easiest path to implementation.

    Our custom PCB is wired to sample data simultaneously from 4x AD7771 ADCs, for a total of 32 channels, and each sample is 32 bits long. We are aiming for 50kHz sampling rate, but can't reach it with the microcontroller board we are currently using as digital interface (Teensy 4.1). Knowing that commercial devices with similar data rates use FPGAs and SoCs, I am thinking of using a Zynq-based board and do the following:

    • Parallelize readout from the ADCs using a custom FPGA core (now done serially with bit-read functions on the Teensy)
    • Store readout bytes in a memory location accessible by both the FPGA and the ARM core
    • Stream data to the PC using the ARM core

    This would be my first embedded project which involves something other than a microcontroller so I want to make sure I set off on the right foot. My questions are:

    - Am I right in thinking a Zynq is the best embedded solution for this project? Or would a pure FPGA with a soft core processor work too? My understanding is that soft processors would run at much slower speed than a physical ARM core.

    - The required data rate is 51.2 Mbps (32 ch x 32 b x 50 kHz). Am I right in thinking that I will need high-speed USB capability (480 Mb/s) for this project? Is there any existing devboard wired for high-speed USB?
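A quick sketch checking the arithmetic in the question above, plus the USB 2.0 speed names, which are easy to mix up: "full speed" is only 12 Mb/s, while 480 Mb/s is "high speed":

```python
# Required throughput for the ADC stream described above.
channels = 32            # 4x AD7771, 8 channels each
bits_per_sample = 32
sample_rate_hz = 50_000

rate_bps = channels * bits_per_sample * sample_rate_hz
print(rate_bps)          # 51200000, i.e. 51.2 Mb/s, matching the post

# USB 2.0 signaling rates: "full speed" = 12 Mb/s, "high speed" = 480 Mb/s.
# This stream therefore needs a high-speed (or faster) link; note that practical
# bulk throughput on high-speed USB is well below the 480 Mb/s line rate.
FULL_SPEED_BPS = 12_000_000
HIGH_SPEED_BPS = 480_000_000
print(rate_bps > FULL_SPEED_BPS)   # True: full speed cannot carry it
print(rate_bps < HIGH_SPEED_BPS)   # True: high speed has headroom
```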

    Many thanks!!

    13 Comments
    2025/01/31
    02:01 UTC

    13

    Avoiding bad practices

    Hello!

    I work as a software engineer for satellites, so I'm fairly familiar with what FPGAs can do. But, I thought it's about time to actually do some development for one just to be less ignorant.

    I've found some interesting resources which are maybe a good starting point, e.g., this for Verilog development, but I obviously don't know enough to know if it's good or not.

    I bought a Tang Nano 9K dev board because it was the cheapest option. I'm using Gowin as my EDA but I suspect it's not the best choice.

    I'd love to hear some resources and advice for getting up to speed with FPGA development. A steep learning curve is completely fine. I'd like to know of the professional/industry standard tools used, rather than just hobbyist things. Obviously my Nano 9K is hobbyist, but it's barely my foot into this.

    My boss heard that I'm getting into FPGAs and mentioned I could be put on an external course on behalf of the company so that they can get a proper FPGA engineer. What tools and information should I be aware of if I get put on a course like this?

    Many thanks in advance!

    5 Comments
    2025/01/30
    23:33 UTC

    19

    Is it viable to develop an AI accelerator by using an FPGA together with VRAM chips?

    I know it would probably be limited compared with a more general-purpose modern GPU on a small process node, and maybe not as energy efficient, but maybe by clustering multiple units something decent can be achieved that is price-competitive for a specific use case like LLM inference. What would be a good approach to developing something like this? Any chip recommendations or insights?

    8 Comments
    2025/01/30
    22:17 UTC

    1

    Trouble with SPI Slave on CMOD A7 & Zynq Z2

    Hi reddit,

    I'm working on a fun hobby project (GitHub link) involving a blinking LED using the CMOD A7 and/or Zynq Z2. The idea is to set the operation mode and PWM speed via SPI by writing to a "register" on the FPGA, then do my logic with that. I followed this guide for implementing an SPI slave in VHDL, but I'm having trouble getting it to output my RX_data.

    My Setup:

    Block Design: includes an SPI slave module (downloaded from the guide above), IO for SPI plus a button for `RX_req`, a memory module, and a constant X set to 1.

    Issue: The SPI slave isn't properly outputting the received data (RX_data), which means my memory module is useless too.

    BD

    Maybe issue

    I also found this, but I don't know how to solve it.

    Timing issues

    1 Comment
    2025/01/30
    19:58 UTC

    4

    Tips for verifying AXI UVM?

    Hello everyone, I'm a senior in EE at Purdue, currently working on AXI UVM. While there are some resources available, does anyone know of a good guide on verifying a bus protocol? I’d really appreciate any help ASAP, as I’m in Senior Design and this is my last semester. My current plan is to verify the AXI master using a BUS model.

    0 Comments
    2025/01/30
    18:15 UTC

    3

    FIFO IP (built-in FIFO, independent clocks)

    I had a lab today where we were supposed to implement a FIFO using the Vivado IP catalog, but after 3-4 hours no one was able to produce results in the testbench: dout just doesn't read at all. I tried reading the documentation; it talks about giving rst a pulse lasting at least 5 rd and wr clock cycles, but it still didn't work, to the point of giving random stuff in the testbench now. If someone has any experience using it, let me know what I'm doing wrong, or point me to any example that implements this exact IP.
    Here are the testbench simulation and code. rd_clk is 50 MHz while wr_clk is 100 MHz, with 8-bit width and 512 depth.

    `define wrclk_period 10
    `define rdclk_period 20

    module simulation_ffio();

    reg rst;
    reg wr_clk;
    reg rd_clk;
    reg [7:0] din;
    reg wr_en;
    reg rd_en;
    wire [7:0] dout;
    wire full;
    wire empty;

    // reset pulse
    initial begin
        rst = 0;
        #100;
        rst = 1;
        #100;
        rst = 0;
        #100;
    end

    datareadwritethroughIP FG0(rst, wr_clk, rd_clk, din, wr_en, rd_en, dout, full, empty);

    // 100 MHz write clock, 50 MHz read clock
    initial wr_clk = 1;
    always #5 wr_clk = ~wr_clk;

    initial rd_clk = 1;
    always #10 rd_clk = ~rd_clk;

    integer i;
    initial begin
        #300;
        wr_en = 1'b0;
        rd_en = 1'b0;
        // write six samples, then read them back
        for (i = 0; i <= 5; i = i + 1) begin
            wr_en = 1;
            din = i;
            #`wrclk_period;
        end
        wr_en = 0;
        #`wrclk_period;
        #`rdclk_period;
        rd_en = 1;
        #50;
        rd_en = 1'b0;
    end

    endmodule

    Edit1: https://pastebin.pl/view/d398706c (code uploaded)

    testbench fifo

    5 Comments
    2025/01/30
    17:44 UTC

    1

    Waveform wont simulate

    Every time I try to run my wvf simulation, this thing pops up for maybe a quarter or half a second, and then nothing happens. I happened to record my screen and took a screenshot of the dialogue box that appears and disappears so fast. After the dialogue box disappears, nothing happens: no "read only" window appears, as is supposed to happen when I do it at school.

    Any help will be nice, Thank you!

    https://preview.redd.it/jc460mf436ge1.png?width=638&format=png&auto=webp&s=a3df3476f0397cb544ca99158da2eaef7e5af00c

    0 Comments
    2025/01/30
    17:35 UTC

    7

    Improving Floating Point Precision in an Algorithm on FPGA

    Hi everyone,

    I’ve been working on implementing the Artificial Bee Colony (ABC) Algorithm in Verilog for an FPGA, targeting the minimization of a 10-dimensional sphere function. So far, the implementation works, but I’m running into a precision issue during fitness value calculations.

    The fitness value is calculated using the formula:
    Fitness = 1 / (1 + f(x))

    The problem is that the precision of the fitness value seems limited. For example, when the fitness value reaches 0.99999988079071, the next improvement jumps directly to 1.0. As a result, the function doesn’t get minimized beyond a magnitude of 10^-13.

    I’m currently using Vivado’s Floating Point IPs with 32-bit single precision for all arithmetic operations. The only solution I can think of is switching to double precision, but that would require significant changes to the design.

    So, are there any alternative ways to improve the precision that I might be missing?
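For what it's worth, the jump described above falls straight out of float32 rounding: once f(x) drops below about 6e-8, 1 + f(x) rounds to exactly 1.0, so the fitness snaps from 0.99999988 (i.e. 1/(1 + 2^-23)) straight to 1.0. A stdlib-only Python sketch of the effect, where f32() emulates 32-bit rounding and the 1e-8 residual is an arbitrary illustration value:

```python
# Emulate IEEE-754 single precision by round-tripping a double through 32 bits.
import struct

def f32(x: float) -> float:
    return struct.unpack('f', struct.pack('f', x))[0]

eps = 2.0 ** -23                  # float32 spacing just above 1.0 (~1.1920929e-07)
print(f32(1.0 / (1.0 + eps)))     # 0.9999998807907104: the last fitness step seen

f = 1e-8                          # a smaller residual of the sphere function
fitness = f32(1.0 / f32(1.0 + f))
print(fitness)                    # 1.0: in float32, 1 + 1e-8 already rounds to 1.0

# Ranking candidates by f(x) directly (or computing just the 1 + f step in double
# precision) keeps improvements below ~6e-8 visible without widening the datapath.
print(1.0 / (1.0 + f))            # double precision still resolves the difference
```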

    9 Comments
    2025/01/30
    14:55 UTC

    3

    How to make a bitstream persistent/NV ?

    I'm using a ZedBoard FPGA, and I have a very simple design that is fully RTL Verilog based, not using any MCU or block design. I'm trying to find a tutorial on how to load this bitstream into some non-volatile memory on the board so it is not erased when disconnecting the power.
    So far I only find tutorials on how to do it when using the SDK for block designs with a more complex setup. I would prefer to keep it simple, using only Vivado.

    12 Comments
    2025/01/30
    14:03 UTC

    32

    Noob question sorry

    Context: I am studying CS in uni

    Why are Quartus and ModelSim so fucking shit? Don't even ask me for clarification, don't you dare, you know what I mean. Was ModelSim made for Windows Vista or something? What is this unfriendly ass UI? Why is everything right-click menus everywhere? Who made this? WHY DOESN'T IT TELL ME THERE ARE ERRORS IN MY VHDL BEFORE COMPILING??? WHY DO THINGS COMPILE ON QUARTUS BUT THEN DON'T COMPILE ON MODELSIM??? Do people use other programs? I am so lost, everything is so easy except for navigating those pieces of shit 😭 It could just be because my uni uses an older version, but it's just from like 2020 afaik?

    23 Comments
    2025/01/30
    13:30 UTC

    Back To Top