HP will be using Nvidia's A800 datacenter GPU in upcoming Z-branded workstations built for AI, as indicated by a GTC presentation description (via @agamsh on X). Originally developed to comply with U.S. sanctions on sales to China, the A800 lost its reason to exist when those sanctions expanded to ban it as well. The new Z AI workstation is seemingly one way Nvidia and HP have decided they can reuse the cut-down Ampere graphics card.
When the U.S. introduced its initial GPU export bans in October 2022, it barred Nvidia's flagship A100 and H100 GPUs from being sold to China. However, because those regulations were based on performance characteristics, Nvidia was able to launch the cut-down A800 and H800 as replacements; they performed much the same as the flagship cards but with significantly lower GPU-to-GPU bandwidth. That was enough to comply with U.S. sanctions, though it also made the A800 essentially a worse A100.
But a wrench was thrown into Nvidia's plans when the U.S. government revised its export rules in October 2023, and suddenly the A800 was no longer legal to sell to China. The card could still be sold in other markets, but its reduced GPU-to-GPU bandwidth undoubtedly dampened demand among customers who had no reason to accept the slower interconnect.
Nevertheless, Nvidia is trying to find a way to sell these A800s. Back in November, Nvidia effectively launched the A800 for Western markets, saying it would be "the ultimate workstation development platform," though HP's presentation is the first indication we've gotten that A800 workstations are finally happening. There's no indication of what CPU these Z workstations will use, but they will almost certainly run either Intel's Emerald Rapids Xeon chips (or perhaps Sapphire Rapids) or AMD's Threadripper 7000 series.
Nvidia's marketing materials for the A800 lean heavily on AI, and so does HP's GTC presentation description, so we can probably expect other A800-powered workstations to focus on AI as well. The demand for AI acceleration in servers and datacenters is self-evident given how popular (and expensive) the H100 is, but it's less clear how much demand there is for AI in workstations. At the very least, AI software developers would like local access to AI hardware, so that's one hypothetical use case.