Use MATLAB to Prototype Deep Learning on an Intel FPGA
FPGA-based hardware is a good fit for deep learning inferencing on embedded devices because it delivers low latency and low power consumption. Early prototyping is essential to developing a deep learning network that can be deployed efficiently to an FPGA.
See how Deep Learning HDL Toolbox™ automates FPGA prototyping of deep learning networks directly from MATLAB®. With a few lines of MATLAB code, you can deploy to and run inferencing on an Intel® Arria® 10 SoC board. This direct connection allows you to run deep learning inferencing on the FPGA from within MATLAB, so you can quickly evaluate whether the network meets your system requirements.
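As a rough illustration of what "a few lines of MATLAB code" looks like, the sketch below outlines a typical Deep Learning HDL Toolbox workflow. It assumes you already have a pretrained network in the workspace (here called net), an input image img sized for that network, and a JTAG connection to the Arria 10 SoC board; the bitstream name and interface settings may differ for your setup.

% Connect to the Intel Arria 10 SoC board (assumes a JTAG connection)
hTarget = dlhdl.Target('Intel', 'Interface', 'JTAG');

% Create a workflow object for a pretrained network 'net' using a
% single-data-type Arria 10 SoC bitstream (adjust for your board/setup)
hW = dlhdl.Workflow('Network', net, ...
    'Bitstream', 'arria10soc_single', ...
    'Target', hTarget);

% Compile the network into instructions and weights for the FPGA
hW.compile;

% Program the FPGA and load the compiled network
hW.deploy;

% Run inferencing on the board from MATLAB and profile the result
[prediction, speed] = hW.predict(img, 'Profile', 'on');

Because predict is called from MATLAB while the network executes on the FPGA, you can inspect predictions and per-layer profiling results in the same session and iterate on the network design before committing to a hardware implementation.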