Intel® Vision Accelerator Design with Intel® Arria® 10 FPGA
As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS but also greater processing power to optimize targeted workloads. The Mustang-F100 is a PCIe-based accelerator card built on the programmable Intel® Arria® 10 FPGA, delivering the performance and versatility of FPGA acceleration. It can be installed in a PC or a compatible QNAP NAS, making it a perfect choice for AI deep learning inference workloads.
Half-height, half-length, double-slot.
Supports the OpenVINO™ toolkit; AI edge computing-ready device.
FPGAs can be optimized for different deep learning tasks.
Intel® FPGAs support multiple floating-point precisions and inference workloads.
High-performance PCIe FPGA accelerator card with an Arria® 10 GX1150 FPGA, 8 GB DDR4-2400 memory, and a PCIe Gen3 x8 interface.
The OpenVINO™ toolkit is built around convolutional neural networks (CNNs); it extends workloads across Intel® hardware and maximizes performance.
It optimizes pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow into an intermediate representation (IR) binary, then executes the inference engine heterogeneously across Intel® hardware such as CPUs, GPUs, the Intel® Movidius™ Neural Compute Stick, and FPGAs.
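As an illustrative sketch of the workflow above (the install path, model file, and sample script names are assumptions based on a typical OpenVINO™ toolkit release of this era, not taken from this page), converting a pre-trained Caffe model to IR and running it on the FPGA might look like:

```shell
# Convert a pre-trained Caffe model into an optimized IR pair (.xml topology
# + .bin weights) with the OpenVINO Model Optimizer. Path and model name are
# assumed for illustration.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model squeezenet1.1.caffemodel \
    --data_type FP16 \
    --output_dir ir/

# Run inference heterogeneously: layers supported by the FPGA plugin execute
# on the Mustang-F100, and any unsupported layers fall back to the CPU.
python3 classification_sample.py \
    -m ir/squeezenet1.1.xml -i cat.jpg -d HETERO:FPGA,CPU
```

The `HETERO:FPGA,CPU` device string is what makes the execution heterogeneous: the Inference Engine assigns each layer to the first listed device that supports it. Running these commands requires an installed OpenVINO™ toolkit and the FPGA runtime.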
Get deep learning acceleration on Intel-based Server/PC
You can install the Mustang-F100 in a PC/workstation running Linux® (Ubuntu®) to gain computational acceleration for applications such as deep learning inference, video streaming, and data center workloads. As an ideal acceleration solution for real-time AI inference, the Mustang-F100 also works with the Intel® OpenVINO™ toolkit to optimize inference workloads for image classification and computer vision.
Ubuntu 16.04.3 LTS 64-bit, CentOS 7.4 64-bit (Windows 10 support planned for March 2019; more operating systems coming soon)
OpenVINO™ toolkit
Intel® Deep Learning Deployment Toolkit:
- Model Optimizer
- Inference Engine
Optimized computer vision libraries
Intel® Media SDK
*OpenCL™ graphics drivers and runtimes.
Currently supported topologies: AlexNet, GoogLeNet, Tiny YOLO, LeNet, SqueezeNet, VGG16, ResNet (more variants coming soon)
Intel® FPGA Deep Learning Acceleration Suite
High flexibility: the Mustang-F100-A10 is developed on the OpenVINO™ toolkit structure, which allows models trained in frameworks such as Caffe, TensorFlow, and MXNet to execute on it after conversion to an optimized IR.
*OpenCL™ is the trademark of Apple Inc. used by permission by Khronos
QNAP NAS as an Inference Server
OpenVINO™ toolkit extends workloads across Intel® hardware (including accelerators) and maximizes performance. When used with QNAP’s OpenVINO™ Workflow Consolidation Tool, the Intel®-based QNAP NAS presents an ideal Inference Server that assists organizations in quickly building an inference system. Providing a model optimizer and inference engine, the OpenVINO™ toolkit is easy to use and flexible for high-performance, low-latency computer vision that improves deep learning inference. AI developers can deploy trained models on a QNAP NAS for inference, and install the Mustang-F100 to achieve optimal performance for running inference.
1. QTS 4.4.0 (or later) and OWCT v1.0 are required for the QNAP NAS. (OWCT v1.0 coming soon)
2. Using FPGA card computing on the QNAP NAS disables the VM pass-through function. To avoid potential data loss, make sure that all ongoing NAS tasks are finished before restarting.
Intel® Arria® 10 GX1150 FPGA
PC: Ubuntu 16.04.3 LTS 64-bit, CentOS 7.4 64-bit (Windows 10 support planned for March 2019; more operating systems coming soon)
Voltage Regulator and Power Supply
Intel® Enpirion® Power Solutions
8 GB on-board DDR4
PCI Express x8
Compliant with PCI Express Specification V3.0
Power Consumption (W)
Operating Temperature & Relative Humidity
5°C ~ 60°C (ambient temperature), 5% ~ 90% relative humidity
Active fan: (50 x 50 x 10 mm) x 2
169.5 mm x 68.7 mm x 33.7 mm
*Reserved PCIe 6-pin 12 V external power connector
Dip Switch/LED indicator
Supports up to 8 cards. Manually assign a card ID number (0 to 7) to the Mustang-F100 using the rotary switch. The assigned card ID number is shown on the card's LED display after power-up.
*A standard PCIe slot provides 75 W of power; the external power connector is reserved for users with different system configurations.
Check Compatible NAS Models
TVS-872XT, TVS-872XU, TVS-872XU-RP
Please remove the Thunderbolt™ 3 PCIe card from the TVS-472XT, TVS-672XT, and TVS-872XT in order to install the Mustang-F100.
Compatibility for Mustang-F100:
TS-883XU, TS-883XU-RP, TVS-872XU, TVS-872XU-RP, TS-1283XU-RP, TS-h1283XU-RP, TVS-1272XU-RP, TVS-1282, TVS-1282T, TVS-1282T3, TS-1683XU-RP, TS-1685, TVS-1672XU-RP, TS-2483XU-RP, TVS-2472XU-RP, TS-2888X, TS-h686, TS-h886, TS-h1683XU-RP, TS-h2483XU-RP
| Stock code | IBQ-QAMF100-A10 |
| Brand | QNAP |
| Series | Accelerator |
| Model | Mustang-F100-A10-R10 |
| Type | PCIe v3 x8, Arria 10 GX1150 FPGA, 8 GB DDR4 |
| UPC | 842936100887 |
| RoHS | Y |
| Date added | 20/02/2019 |
Mustang-F100-A10-R10 Spec Information
| Form Factor / Dimensions | 169.5 x 68.7 x 33.7 mm |
| RAM/Cache | 8 GB |
| Guarantee | 2 yr |
| Processor | Intel® Arria® 10 GX1150 FPGA |
| Weight | 1 kg |
| Power Consumption | <60 W |
Why Do People Choose Us?
14 Day Return Guarantee
If you change your mind, you can return all or part of your order.
Next Day Delivery
Order before 4pm* and in-stock products will be with you the next day (or Saturday, or even the same day).
Wherever you are in the world, we can get your order to you at high speed.
SpanStor Build & Test
Our free service helps you choose the right NAS and Drives, then makes sure it is built & ready to use as soon as you get it.
Professional Live Support
We are here to assist you - before, during and after your purchase, by phone, email or live chat.