
NVIDIA RTX PRO™ 6000 Blackwell Max-Q Workstation Edition GPU

The NVIDIA RTX PRO™ 6000 Blackwell Max-Q Workstation Edition is the ultimate workstation GPU for multi-GPU workloads, delivering exceptional scalability and performance. With 96 GB of GDDR7 memory, the RTX PRO 6000 Max-Q enables professionals to tackle massive datasets, complex simulations, and AI-enhanced applications with precision. Combining multiple RTX PRO 6000 Max-Q GPUs multiplies compute power, memory capacity, and data throughput, making the card ideal for mission-critical applications that demand the latest advancements in AI, graphics, and compute. From data science and agentic AI to professional visualization and virtual production, the RTX PRO 6000 Max-Q delivers the scalability, reliability, and innovation needed to drive breakthroughs and push the boundaries of what's possible.
Product Specifications
| | NVIDIA RTX PRO™ 6000 Blackwell Max-Q Workstation Edition | NVIDIA RTX 6000 Ada |
| --- | --- | --- |
| CUDA parallel processing cores | 24,064 | 18,176 |
| NVIDIA Tensor Cores | 752 | 568 |
| NVIDIA RT Cores | 188 | 142 |
| Frame buffer memory | 96 GB GDDR7 with ECC | 48 GB GDDR6 with ECC |
| Memory bandwidth | 1,792 GB/s | 960 GB/s |
| Max power consumption | 300 W | 300 W |
| Graphics bus | PCI Express 5.0 x16 | PCI Express 4.0 x16 |
| Display connectors | DisplayPort 2.1 x4 | DisplayPort 1.4a x4 |
| Form factor | 4.4" H x 10.5" L, FHFL dual slot | 4.4" H x 10.5" L, dual slot |
| External power supply | 1x PCIe CEM5 16-pin | 1x PCIe CEM5 16-pin |
You can expect performance improvements not only in graphics but also in AI training and inference.
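As a rough illustration of the generational gains, the headline spec ratios can be computed directly from the table above (the figures are copied from the table; the small helper below is illustrative only):

```python
# Generational spec ratios: RTX PRO 6000 Blackwell Max-Q vs. RTX 6000 Ada.
# All figures are taken from the specification table above.
specs = {
    # metric: (Blackwell Max-Q, Ada)
    "CUDA cores": (24_064, 18_176),
    "Tensor Cores": (752, 568),
    "RT Cores": (188, 142),
    "Memory (GB)": (96, 48),
    "Bandwidth (GB/s)": (1_792, 960),
}

def ratio(new, old):
    """Return the new/old improvement factor, rounded to 2 decimals."""
    return round(new / old, 2)

for metric, (blackwell, ada) in specs.items():
    print(f"{metric}: {ratio(blackwell, ada)}x")
# Core counts scale by ~1.32x, memory capacity by 2.0x, bandwidth by ~1.87x.
```

Note that raw spec ratios are only a first-order indicator; realized speedups depend on the workload.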
Click here for inquiries
The NVIDIA RTX PRO™ 6000 Blackwell Max-Q Workstation Edition uses a PCIe CEM5 16-pin connector for external power. Your workstation may therefore lack a compatible connector, or its power capacity may be insufficient. We recommend using the card in a workstation or server that has been verified to work with it in advance.
We can also recommend verified servers, so if you are interested, please contact us via the inquiry link above.
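Since each card draws up to 300 W (per the spec table), a quick power-budget check for a multi-GPU build can be sketched as follows. The 400 W system overhead and 80% PSU headroom figures are illustrative assumptions, not NVIDIA or vendor guidance:

```python
import math

GPU_MAX_POWER_W = 300  # per-card maximum from the specification table

def required_psu_watts(num_gpus, system_overhead_w=400, headroom=0.8):
    """Estimate the PSU rating needed so that peak draw stays within
    `headroom` (e.g. 80%) of the supply's rated capacity.

    `system_overhead_w` (CPU, memory, storage, fans) and `headroom`
    are illustrative assumptions -- check your vendor's guidance.
    """
    total_draw = num_gpus * GPU_MAX_POWER_W + system_overhead_w
    return math.ceil(total_draw / headroom)

# e.g. a 4-GPU workstation: (4*300 + 400) / 0.8 = 2000 W
print(required_psu_watts(4))
```

A check like this is no substitute for validation on real hardware, which is why a pre-verified workstation or server is recommended.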