Why are dummy plugs a thing? What can you do with them that you cannot do in software? (Asking as someone who has had no issues running 18 virtual displays with no dummies.)
One example: I use software called Looking Glass on my PC for interacting with a Windows virtual machine. I have two GPUs in my computer, an AMD one for the Linux host and an Nvidia one that gets passed through to the Windows guest. Looking Glass then captures the Nvidia GPU's output and displays it in a window on my desktop. This lets me use Windows software in the VM with acceptable performance (every Windows version after 7 basically requires graphics acceleration to run acceptably). The problem is that the Nvidia GPU will not do anything without a display connected. Nvidia Quadro GPUs support dumping a monitor's EDID and then mapping that file to an output (so the GPU always thinks that monitor is connected to that output), but their consumer-grade GPUs don't support this. That's where the dummy plug comes in.
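For contrast, on a Linux host with in-kernel (DRM) drivers the same idea can usually be done in software: dump a monitor's EDID once and tell the kernel to pretend it is always present on a given connector. A rough sketch, assuming a hypothetical connector name HDMI-A-1 and a dump saved as edid.bin:

```
# Dump the EDID of a currently connected monitor (connector name is an example)
cat /sys/class/drm/card0-HDMI-A-1/edid > edid.bin

# Install it where the kernel can find it as "firmware"
sudo install -D edid.bin /lib/firmware/edid/edid.bin

# Then add to the kernel command line, e.g. via GRUB:
#   drm.edid_firmware=HDMI-A-1:edid/edid.bin video=HDMI-A-1:e
```

The proprietary Nvidia driver doesn't (or at least historically didn't) go through this mechanism; it has its own ConnectedMonitor/CustomEDID xorg.conf options, which is part of why a hardware dummy ends up being the simpler answer for that GPU.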
A lot of OS / GPU / driver combinations don't actually let you set up virtual displays with arbitrary settings. And you might want that for streaming setups with OBS, or for game streaming via Steam / Parsec / etc.
Some years ago it kind of worked for me on Linux with Xorg and open-source drivers, and on Windows with Nvidia, but when it comes to macOS, or Windows with an AMD or Intel GPU, it simply doesn't work that well.
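For what it's worth, the Xorg-era trick looked roughly like this (output name and modeline are placeholders; whether a disconnected output actually comes up this way depends heavily on the driver):

```
# Ask cvt for a modeline at the resolution you want
cvt 1920 1080 60

# Register the mode and attach it to an output the driver exposes
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode HDMI-1 "1920x1080_60.00"
xrandr --output HDMI-1 --mode "1920x1080_60.00"
```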
We use them for testing binary embedded Linux distros, where tricking the OS into thinking there's a display connected introduces a new variable that is not present in the user's deployment - and it's a cheap hardware solution. Buying and installing them is probably more cost-effective than having an engineer write the `echo on > /sys/whatever` and the logic around it.
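For the curious, a sketch of what that sysfs write can look like on a DRM-based system (connector names vary per board, the ones below are examples; needs root):

```
# See which connectors the GPU exposes
ls /sys/class/drm/
# card0  card0-HDMI-A-1  card0-eDP-1  ...

# Force one to report as connected even with nothing plugged in
echo on > /sys/class/drm/card0-HDMI-A-1/status

# Restore normal hotplug detection
echo detect > /sys/class/drm/card0-HDMI-A-1/status
```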
Dummy plugs are a lot easier for most people. I added a fake 4K monitor to my desktop via software for remote game streaming, and it was a lot more complicated than I expected[^1].
I have a modded Chromebox (booting Windows and Linux) which refuses to boot without a video device attached to the HDMI port, so I had to use a dummy plug.
In addition to what's already been mentioned, I remember there being issues with Macs not unlocking the full abilities of the GPU if there was no display present. Maybe there is some software workaround, but an HDMI dummy is cheap and quick and won't disable itself on updates etc.
It seems that Linux doesn't support virtual displays. On Windows you can either install a dummy display or have Apollo do it automatically. No such thing on Linux.
Windows 7 support is one reason to stick with older Go releases. A project on Go 1.21.4 or earlier will work on every Windows release and any computer made since 2009, whereas a version bump to 1.21.5 means it will only work on more recent computers and Windows 10 and 11, for no benefit.
I think this is a reasonable take. Yes, people shouldn't be running Windows 7 as their daily driver. But if you can support it with basically no effort and without sacrifices, that is the right thing to do. Supporting more platforms is a good thing, even if that platform is an old Windows version instead of an Amiga.
The Go team isn't making new versions just for fun. Each version since 1.21 has had improvements. The fix/change to for-loop variables in 1.22, especially, is very nice to have and helps prevent bugs.
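For anyone who hasn't hit it: before 1.22 all iterations of a loop shared a single variable, so closures started in the loop all saw its final value; since 1.22 each iteration gets its own. A minimal sketch:

```go
package main

import "fmt"

func main() {
	var fns []func()
	for _, v := range []int{1, 2, 3} {
		// Before Go 1.22, every closure captured the same shared v
		// and this printed 3 3 3; with a go.mod declaring go 1.22+,
		// each iteration gets a fresh v and it prints 1 2 3.
		fns = append(fns, func() { fmt.Println(v) })
	}
	for _, f := range fns {
		f()
	}
}
```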
If there's a reasonable expectation that users will use outdated platforms, it makes sense to support them. If there is no such expectation at all, why would one forgo the improvements to the language and tooling?
Automatic toolchain switching does not trigger compilation of later Go toolchains; it only attempts to fetch a prebuilt toolchain (if available) and uses that to perform builds[1]. If a system supports Go 1.21 but not 1.23.1 (e.g. Windows 7, mentioned in a sibling comment), then this project will fail to build on that system. Likewise, if a user on Go 1.21 has disabled automatic toolchain switching, the Go CLI refuses to build the project at all.
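Concretely, on a machine stuck on an older toolchain with switching disabled, the build fails up front rather than fetching anything; the error looks roughly like this (versions are illustrative):

```
$ GOTOOLCHAIN=local go build ./...
go: go.mod requires go >= 1.23.1 (running go 1.21.4; GOTOOLCHAIN=local)
```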
Overall, I would say the minimum required Go version is indeed the Go version declared in the go.mod, since that is the declared minimum required toolchain.
Do the authors know what "UFO" stands for? Or is it just a clickbait title?
A pigeon in the dark can be a UFO. Or a bat. Or a satellite, an airplane, literally anything. If I throw a sock out of my window, it would be a UFO to my neighbours. Though the "F" in UFO would stand for "Falling" in the sock case.
Because, colloquially, a UFO has little green men inside, and the average person who isn't an expert won't know what a UAP is. Of course, when you post to HN, people who know more about it will nitpick.
> DeepSeek was seriously cool, but it started behaving similar to Google Gemini Pro
You should be able to use the version of DeepSeek that you prefer indefinitely if you host it yourself or choose that specific version with your preferred provider.
R1 takes more time to answer, but in the cases where I actually compared answers, I don't remember a single one where R1 was worse than pure V3.
And I don't even have to wait that long. If I watch the thinking, I can quickly spot that it misunderstood me and rephrase the question without even waiting for the full response.
They often publish "needle in a haystack" benchmarks that look very good, but my subjective experience with a large context is always bad. Maybe we need better benchmarks.