
Deep Learning on a Xilinx FPGA with MATLAB Code

FPGA-based hardware is a good fit for deep learning inference on embedded devices because it delivers low latency and low power consumption. Early prototyping is essential to developing a deep learning network that can be efficiently deployed to an FPGA.

See how Deep Learning HDL Toolbox™ automates FPGA prototyping of deep learning networks directly from MATLAB®. With a few lines of MATLAB code, you can deploy a pretrained network to a Xilinx® ZCU102 FPGA board and run inference on it. This direct connection lets you run deep learning inference on the FPGA as part of your application in MATLAB, so you can converge more quickly on a network that meets your system requirements.
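As a minimal sketch of that workflow, the "few lines of MATLAB code" might look like the following. The pretrained network (resnet18, which requires its support package), the board IP address, and the sample image are illustrative assumptions, not part of the original post; the dlhdl.Target and dlhdl.Workflow calls are the Deep Learning HDL Toolbox interfaces referred to above.

% Sketch: deploy a pretrained network to a Xilinx ZCU102 and run inference.
% The network, IP address, and image below are assumptions for illustration.

net = resnet18;                                   % example pretrained network

hTarget = dlhdl.Target('Xilinx', ...
    'Interface', 'Ethernet', ...
    'IPAddress', '192.168.1.101');                % board address is an assumption

hW = dlhdl.Workflow( ...
    'Network',   net, ...
    'Bitstream', 'zcu102_single', ...             % prebuilt bitstream for the ZCU102
    'Target',    hTarget);

hW.compile;                                       % generate weights and instructions for the FPGA
hW.deploy;                                        % program the board and load the network

img = imresize(imread('peppers.png'), [224 224]); % sample image resized to the network input size
[scores, speed] = hW.predict(single(img), 'Profile', 'on');  % run inference on the FPGA
[~, idx] = max(scores);
disp(net.Layers(end).Classes(idx))                % display the predicted class

The 'Profile' option returns per-layer timing from the board, which is what makes it practical to iterate on the network in MATLAB until it meets your latency requirements.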


Join us on Telegram: https://t.me/matlabcastor
