QNAP Mustang-V100 interface cards/adapter Internal
Stock code: MUSTANG-V100-MX8-R10 | Quick code: 112652

Mustang-V100, PCI Express x4, Intel Movidius Myriad X MA2485, 169.54 x 80.05 x 23.16 mm

QNAP Mustang-V100

£892.46 (ex. VAT) Shipping calculated at checkout

Out of stock

Custom Spanstor Build & Test

As part of our Spanstor Build & Test service, for any enclosure bought with drives we will carry out all firmware installation, physical drive installation, RAID/cache configuration and drive bad-sector checks, all at no additional cost. If you have any questions, please contact our friendly sales team by chat, phone or email.

Configure this product

As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), you not only need more storage space on your NAS, but also require the NAS to have greater power to optimize targeted workloads. The Mustang-V100 is a PCIe-based accelerator card using Intel® Movidius™ VPUs to drive the demanding workloads of modern computer vision and AI applications. It can be installed in a PC or compatible QNAP NAS to boost performance, making it an ideal choice for AI deep learning inference workloads.

OpenVINO™ toolkit
The OpenVINO™ toolkit is built around convolutional neural network (CNN) workloads; it extends those workloads across Intel® hardware and maximizes performance.

It can optimize pre-trained deep learning models from frameworks such as Caffe, MXNet, and TensorFlow into an Intermediate Representation (IR), then run the inference engine heterogeneously across Intel® hardware such as CPUs, GPUs, the Intel® Movidius™ Neural Compute Stick, and FPGAs.
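As a rough sketch of that workflow on a Linux host with the OpenVINO™ toolkit installed: convert a model to IR with the Model Optimizer, then target the card's VPUs through the inference-engine device plugin. The file names below are placeholders, the exact command name (`mo` vs. `mo.py`) and the appropriate device plugin depend on your OpenVINO version and setup; multi-VPU cards like the Mustang-V100 are typically driven through the HDDL plugin, whereas a single Neural Compute Stick uses MYRIAD.

```shell
# Convert a pre-trained TensorFlow model into OpenVINO IR (.xml + .bin).
# "frozen_model.pb" and the output directory are placeholder names.
mo --input_model frozen_model.pb --output_dir ir/

# Run inference on the accelerator by selecting its device plugin
# (HDDL for multi-VPU cards; MYRIAD for a single compute stick).
benchmark_app -m ir/frozen_model.xml -d HDDL
```

The same IR files can be re-run on a CPU or GPU simply by changing the `-d` argument, which is what the "heterogeneous execution" above refers to.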

Get deep learning acceleration on Intel-based Server/PC
You can insert the Mustang-V100 into a PC/workstation running Linux® (Ubuntu®) to accelerate applications such as deep learning inference, video streaming, and data center services. As an ideal acceleration solution for real-time AI inference, the Mustang-V100 can also work with the Intel® OpenVINO™ toolkit to optimize inference workloads for image classification and computer vision.

QNAP NAS as an Inference Server
The OpenVINO™ toolkit extends workloads across Intel® hardware (including accelerators) and maximizes performance. When used with QNAP's OpenVINO™ Workflow Consolidation Tool, an Intel®-based QNAP NAS makes an ideal inference server that helps organizations quickly build an inference system. Providing a model optimizer and an inference engine, the OpenVINO™ toolkit is easy to use and flexible for high-performance, low-latency computer vision that improves deep learning inference. AI developers can deploy trained models on a QNAP NAS for inference, and install the Mustang-V100 to achieve optimal inference performance.

Ports & interfaces
Internal Yes
Host interface PCIe
Expansion card standard PCIe 2.0
Technical details
Chipset Intel Movidius Myriad X MA2485
Design
Product colour Black
Cooling type Active
Number of fans 1
Operational conditions
Operating temperature (T-T) 5 – 55 °C
Operating relative humidity (H-H) 5 – 90%
Weight & dimensions
Width 169.5 mm
Depth 80 mm
Height 23.2 mm
Packaging data
Quantity 1

Why people choose us

  • 14 day return guarantee
  • Next day delivery
  • Worldwide shipping
  • Spanstor Build & Test
  • Professional live support

Got a question?

If you have any questions, concerns or would simply like to know more about the products we sell, please don’t hesitate to contact us by using the options below.

020 82 888 555