I once tried sending a 3 GB file through it, then kept wondering why my entire system became so sluggish. Turns out it loads the entire thing into RAM... The file didn't go through either.
You can also do this manually (and on older browsers) by creating a FileReader and only loading new chunks after old ones have been transferred. With async APIs like WebSockets and WebRTC, this typically requires implementing your own backpressure to avoid blowing up browser memory. See, for example, how omnistreams does it [0].
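A minimal sketch of that pattern, assuming a plain WebSocket and using its bufferedAmount property as the backpressure signal (the chunk size, threshold, and function names here are illustrative, not taken from omnistreams):

    // Manual chunking with backpressure over a WebSocket (sketch).
    const CHUNK_SIZE = 64 * 1024;         // read 64 KiB at a time
    const HIGH_WATER_MARK = 1024 * 1024;  // pause while >1 MiB is unsent

    function sendFile(ws, file) {
      let offset = 0;

      function readNextChunk() {
        if (offset >= file.size) return;  // all chunks sent

        const reader = new FileReader();
        reader.onload = () => {
          ws.send(reader.result);         // this chunk's ArrayBuffer only
          offset += CHUNK_SIZE;
          drainThenContinue();
        };
        // file.slice() is lazy; bytes are only pulled into memory when
        // readAsArrayBuffer runs on the slice.
        reader.readAsArrayBuffer(file.slice(offset, offset + CHUNK_SIZE));
      }

      function drainThenContinue() {
        // ws.send() never blocks; unsent bytes pile up in
        // ws.bufferedAmount. Only read the next chunk once the socket
        // has drained below the threshold.
        if (ws.bufferedAmount > HIGH_WATER_MARK) {
          setTimeout(drainThenContinue, 50);
        } else {
          readNextChunk();
        }
      }

      readNextChunk();
    }

On current browsers you could get a similar effect with file.stream() and the Streams API, but the FileReader version also works on older ones.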
Same experience here. Every website I tried so far died when sending large files because it apparently loaded the whole thing into RAM, and then some. That might be fine on a system with 32 GB of memory, but my laptop with 8 GB dies when trying to send a 7 GB file.
The JavaScript implementation creates a Blob with a URL pointing to it so the user can save the file. I might be wrong, but I think all the data is in browser memory before it is saved.
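For reference, the usual Blob-download pattern looks roughly like this (a sketch with illustrative names, not the actual project's code). Everything passed to the Blob constructor has to already be in memory, which is why this approach scales poorly for multi-GB files:

    // Assemble received chunks into a Blob and trigger a download (sketch).
    function saveReceivedFile(chunks, filename) {
      const blob = new Blob(chunks);          // chunks must all be in memory
      const url = URL.createObjectURL(blob);  // in-memory object URL

      const a = document.createElement('a');
      a.href = url;
      a.download = filename;                  // suggested filename
      a.click();                              // trigger the save dialog

      URL.revokeObjectURL(url);               // release the reference
    }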
Browsers are pretty restrictive about writing to the file system.
Do these other ones have native clients? WebWormHole has a Go client, so you can run it everywhere Go works! Unix/Windows/Mobile/Web covers a decent number of platforms :)