I don’t mean to be pedantic, but beyond the deliberate syntactic echoes, JavaScript and Java were the first two languages with (incompatible!) object-oriented data models enforced by the runtime to achieve widespread, lasting adoption (sorry, Smalltalk!).
Python was invented earlier, but didn’t see wide use until later.
And that they were both massively accelerated by the level of interest in the early WWW is undeniable. No other general-purpose language can say that, except perhaps Perl, which slowly burned out.
Is that really true, though? As I understood it, JavaScript was mainly adopted because Java was popular at the time. JavaScript originally shipped as LiveScript, and it was renamed to JavaScript later. Here is a nice quote on it from Brendan Eich:
“The name JavaScript was chosen when Java was hot, and we were doing LiveConnect to hook up JS to Java applets.”
Here is one from David Flanagan:
“JavaScript was originally developed under the name Mocha… It was renamed JavaScript in a co-marketing deal between Netscape and Sun Microsystems.”
The name change to JavaScript, via a trademark license from Sun to Netscape, was on Dec. 4, 1995 -- still within the Netscape 2.0 beta period. There was no stable release of Netscape launched with LiveScript but not JavaScript support.
Yes, the typing and semantic models are wildly different. The point is that in both languages the object model is a runtime primitive, something the other widespread alternative, C++, did not inherit from its Cfront heritage.
There are tons of libraries that use some kind of runtime-observable instance property as a tag to mimic nominal typing in JS.
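To make that concrete, here's a minimal sketch of the pattern (the BRAND constant and the helper names are hypothetical; React's $$typeof marker on elements is a real-world instance of the same idea):

    // Brand instances with a runtime-observable tag property,
    // then check the tag instead of the object's shape.
    const BRAND = '@@myLib/Result'; // hypothetical tag name

    function makeResult(value) {
      return { [BRAND]: true, value };
    }

    function isResult(obj) {
      // Nominal-style check: only objects we branded pass,
      // even if another object has the exact same shape.
      return obj != null && obj[BRAND] === true;
    }

    console.log(isResult(makeResult(42))); // true
    console.log(isResult({ value: 42 }));  // false: same shape, no tag

Using a Symbol instead of a string key makes the tag harder to collide with accidentally, at the cost of the tag no longer surviving JSON serialization.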
The same thing is also possible using prototype identity, if you either use the class keyword syntax sugar introduced in ES2015 (ES6) or manually do OOP with prototypes. But the latter is very uncommon.
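As a sketch of the prototype-identity variant (Point and OldPoint are made-up names), the class form and the manual prototype form produce the same runtime check:

    // class form: sugar over a constructor function plus a prototype
    class Point {
      constructor(x, y) { this.x = x; this.y = y; }
    }

    // manual form, no class sugar: rarely written by hand these days
    function OldPoint(x, y) { this.x = x; this.y = y; }
    OldPoint.prototype.norm = function () { return Math.hypot(this.x, this.y); };

    const p = new Point(3, 4);
    console.log(p instanceof Point);                           // true
    console.log(Object.getPrototypeOf(p) === Point.prototype); // true

    // A structurally identical object fails the identity check:
    console.log({ x: 3, y: 4 } instanceof Point);              // false

Note that instanceof walks the whole prototype chain, so it tests identity against any prototype in the chain, not just the immediate one.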
I'd donate.