I have always been fascinated: what were the reasons anyone would want a Unix workstation at that time over DOS/Windows? Can somebody come up with a few examples? I'm genuinely missing the knowledge, as I was using MS-DOS in the 90s.
Microsoft itself was a leading Unix vendor in the 1980s with Xenix.
Every Microsoft developer had a Xenix workstation for things like email, access to network disks, running a decent C compiler, and debugging.
DOS was practically a single-program environment with no memory protection and no networking. Unix offered much better productivity for software developers.
Engineering in general was a field that used Unix workstations heavily. Microsoft didn’t become competitive until Windows NT in 1993.
Memory protection was not possible on the 8086 and quite half-baked on the 80286: you could switch to protected mode, but then you lost access to the BIOS services that relied on real mode, and switching back to real mode required forcing a CPU reset (typically by triple-faulting the processor or pulsing its reset line through the keyboard controller), because there was no architectural support for it. The Intel 80386 was the first x86 CPU fully suited to running memory-protected OSes.
The reason was that Intel assumed no one would care about those legacy MS-DOS applications; everyone would rush to adopt OS/2 on the 286, hence no need to go back into real mode.
In hindsight, Microsoft seems to have missed two opportunities to already be at the forefront of UNIX: first by giving up on Xenix, then by not really embracing the POSIX subsystem on Windows NT.
Linux would never have taken off in such alternative realities.
Not that it matters that much now, with WSL and Azure Linux.
In addition to what the others have said, the specs were often a generation or two ahead of what was available in a generic Intel box. When it was introduced in 1993, the Indy, SGI's lowest-specced workstation, could handle 256 MB of RAM and was clocked at 100 MHz, which was way beyond anything you could get for a PC, and that's before even mentioning the dedicated graphics hardware for 3D and video workloads. If money was no/little object, then the workstation vendors were happy to take your hand. (And your wallet.)
As cutting-edge commodity hardware improved, the gap closed. Intel and AMD leapfrogged the smaller chip-design firms these companies relied on, and more and more vendors threw in the towel on custom hardware design, switching over to standard x86. By the early 2000s, distinctive OSes like Solaris and NEXTSTEP were just legacy GUIs that could be installed on commodity PCs, although many flavors were discontinued outright in favor of Linux, leaving these companies (several of which were swallowed by HP) without any moat or vendor lock-in. (Notably it happened to NEXTSTEP twice: once in 1995, and again a decade later when Mac OS X 10.4 was officially released for Intel CPUs.)
Same as we use them now, to be frank. The Unix workstation as an interaction model has persisted so long because it just works.
I was writing a lot of Unix software in that period: database apps, business logic, and so on. For me, using an MS-DOS-based system was a compromise, which I mitigated by using DESQview to get multitasking. It allowed multiple MS-DOS instances on a single machine, in which I ran terminal software, compilers (our apps were being ported to MS-DOS...), and database admin tasks, just like today.
What we have today in the form of macOS or Linux workstations is pretty much what we had back then, too. The power is inescapable.