Std::string is responsible for almost half of all allocations in the Chrome browser process (2014) [0]
The key difference is that web browsers support a highly arbitrary and mutable dataset. A game engine's assets exist in a mostly-static space. Anything allocated at runtime should have a known maximum, because going over that maximum will start to overrun latency targets. Many assets are streamed, but they still fit a certain size and bandwidth budget, and so are still "basically static" - some degree of compilation and configuration at runtime always takes place for rendering features. The genuinely mutable part of game state while playing is comparatively confined, and that allows a lot to be pushed to build time, where it's easier to validate and maintain.
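To make "known maximum" concrete, here is a rough sketch of the usual shape (in Go for illustration; the Particle type and the 4096 budget are made up): a fixed-capacity pool where running out of slots is a visible, handleable event rather than a hidden heap allocation.

```go
package main

import "fmt"

// Particle is a stand-in for any per-frame runtime object; the 4096 cap is a
// made-up budget decided up front, not discovered at runtime.
type Particle struct{ X, Y, VX, VY float32 }

// Pool hands out slots from a fixed array: no per-frame heap allocation, and
// blowing the budget is an explicit event you can degrade on, not a latency spike.
type Pool struct {
	slots [4096]Particle
	free  []int
}

func NewPool() *Pool {
	p := &Pool{}
	for i := range p.slots {
		p.free = append(p.free, i)
	}
	return p
}

// Acquire returns a slot index, or false when the budget is exhausted.
func (p *Pool) Acquire() (int, bool) {
	if len(p.free) == 0 {
		return 0, false // over budget: skip the effect rather than allocate
	}
	i := p.free[len(p.free)-1]
	p.free = p.free[:len(p.free)-1]
	return i, true
}

func (p *Pool) Release(i int) { p.free = append(p.free, i) }

func main() {
	pool := NewPool()
	if i, ok := pool.Acquire(); ok {
		pool.slots[i] = Particle{X: 1, Y: 2}
		fmt.Println("slot", i)
		pool.Release(i)
	}
}
```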
The features that change this picture are editors and arbitrary data imports. Web browsers are all about these two things. When you click a link the document may load thousands of gigantic images, and I might try to copy-paste the entire contents of Wikipedia into a text box. The engineering requirements are much broader as a consequence, and there are more reasons to need genuine "black box" interfaces supporting a complex protocol, as opposed to a static "calling function switches on a specified enum" approach, which is sufficient for almost every dynamic behavior encountered in game engines.
That's a data synchronization error across multiple related pieces of data, which isn't the same as a POD container like a hashtable corrupting itself.
The standard hammer you would apply to enforce the synchronization in all cases is relational integrity, which is too expensive for a game's runtime environment. You don't always want to synchronize everything all of the time if you want to hit a high framerate target, and a lot of performance features boil down to relaxations on when synchronization occurs. Much of the detailed design in writing a game main loop is in dealing with the many consequences of supporting that.
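A rough sketch of one common relaxation (names like Transform and Flush are purely illustrative): instead of keeping derived data consistent at every write, which is the relational-integrity hammer, you mark it dirty and reconcile once per frame at a known point.

```go
package main

import "fmt"

// Transform holds local data plus derived data that other systems read.
type Transform struct {
	X, Y           float64
	WorldX, WorldY float64 // derived (e.g. local -> world space)
	dirty          bool
}

// SetPosition does NOT update the derived fields immediately; it only records
// that they are stale. Eagerly synchronizing on every write is what gets too
// expensive at a high framerate target.
func (t *Transform) SetPosition(x, y float64) {
	t.X, t.Y = x, y
	t.dirty = true
}

// Flush reconciles all stale transforms once, at a single point in the frame.
func Flush(ts []*Transform, originX, originY float64) {
	for _, t := range ts {
		if !t.dirty {
			continue
		}
		t.WorldX, t.WorldY = t.X+originX, t.Y+originY
		t.dirty = false
	}
}

func main() {
	ts := []*Transform{{}, {}}
	ts[0].SetPosition(1, 2)
	ts[1].SetPosition(3, 4)

	// ... gameplay code may write positions many times per frame ...

	Flush(ts, 10, 10) // single synchronization point before rendering
	fmt.Println(ts[0].WorldX, ts[1].WorldY)
}
```

Between writes and the Flush call, the derived data is knowingly out of date; the detailed design work in a main loop is deciding where those windows of inconsistency are acceptable.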
That's why their recommendations on errors also refer to the earlier build process and Lua integration; by the time the data hits the inner loops of the engine, there shouldn't be a case where it's invalid, because if it is, then you can't have the optimized version either.
On the one hand, this is a predictable outcome if you are trying to shepherd a large codebase through a fast-moving language. Idiomatic Python 1.6 looks dramatically different to idiomatic Python 3.x.
On the other, Rust isn't the language I would want to write lots and lots of code in either. There are a few projects and organizations where it makes sense to do so (namely web browsers, databases, and other kinds of "deep backend, large surface area" projects), but most of the things it does well also act as a hindrance to feature development, compared with an idiomatic Java, C#, or Go equivalent.
The thing I had to do was find some themes that I am idealistic about and stick to those. The project is just a mode of exploring the theme, which means that each project, and my skillset, grows as needed to accommodate it. The projects you are describing are completely non-thematic and are just bundles of features, so of course there's no structure to them, no reason to keep going and see what's next. And you are probably not money-and-sales-motivated, which is the thing that drives a lot of obvious business ventures.
The first step in finding the theme is in "knowing thyself", of course - strengths, weaknesses, inclinations. Write and rewrite the set of things about yourself that is maximally coherent and self-reinforcing. Then drive down that road as far as you can go: What types of projects does that support? Gradually you'll hit on a common theme, and then you can really start building.
Another way to force this along is this art advice: "Draw the same thing every day." This is a rather crushing challenge to take on, for no matter the subject matter, you'll tire of it, but it quickly brings out your inclinations and therefore the themes you want to work with.
I'm working with Lua right now (gopherlua) as a scripting option for real-time gaming. I've done similar things to your story in the past with trying to make Lua the host for everything and I'm well aware of the downsides, but I have a requirement of maintaining readable, compatible source (as in PICO-8's model) - and Lua is excellent at that, as are other dynamic languages, to the point where it's hard to consider anything else unless I build and maintain the entire implementation. So my mitigation strategy is to do everything possible to keep the Lua code in the glue-code space, which means that I have to add a lot of libraries.
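A minimal sketch of what I mean by keeping Lua in glue-code space with gopher-lua (the double function and the one-line script are just illustrative): the heavy lifting stays in Go functions exposed to the script, and the script only wires them together.

```go
package main

import (
	"fmt"

	lua "github.com/yuin/gopher-lua"
)

func main() {
	L := lua.NewState()
	defer L.Close()

	// The engine-side work lives in Go; the script just calls it.
	L.SetGlobal("double", L.NewFunction(func(L *lua.LState) int {
		n := L.CheckNumber(1) // validate the argument at the boundary
		L.Push(lua.LNumber(n * 2))
		return 1 // number of Lua return values
	}))

	// Lua stays in "glue" territory: readable, short, no heavy lifting.
	if err := L.DoString(`print(double(21))`); err != nil {
		fmt.Println("lua error:", err)
	}
}
```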
I'm also planning to add support for tl, which should make things easier on the in-the-large engineering side of things - something dynamic languages are also pretty awful at.
You might still run into GC problems, but none of the Go-based Luas (built on Go, rather than binding to another Lua library) I am aware of have a JIT built in.
GP is talking about LuaJIT, you're talking about Lua. Lua has lower performance but should be completely predictable, GC aside (not sure what its GC scheme is), so it's a very different situation.
It's basically true of anything using physics. There are very good pinball simulations, and being reliable digital games they are actually a better, more fair competitive venue, but people who really play pinball still crave the real game because of all the analog parts of it, the nuance of pressing your weight down on the table and how that changes across different games, and how you adjust your play to the specific conditions as things wear down.
Or, in another word, "disposability". We have a lot of systems that aren't repairable, don't get debugged, don't have things fixed mid-flight.
And...it works, with respect to most existing challenges. Restarting and replacing is easy to scale up and produces clear interface boundaries.
One way in which it doesn't work, and at which we still fail, is security. Security doesn't appear in most systems as a legible crash or a data loss or corruption, but as an intangible loss of trust, of identity, of privacy, of service quality. We don't know who ultimately uses the data we create, and the business response generally is, "why should you care?" The premise of so many of these businesses, ever since we became highly connected, is to find profitable ways of ignoring and taking risks with security, and to foster platforms that unilaterally determine one's identity and privileges, ensuring them a position as ultimate gatekeepers.
That means: it's conceptually overloaded for doing quick-and-dirty conventional charts, where you just want to plug in a few parameters and have it "just work" - but excellent once you need to customize and make it work with your specific requirements.
That it has the concepts, and a clear notion of them, is the critical difference. Most libraries, most of the time, don't add new concepts, they just have a premade black box of features and functions. Sometimes you want a premade black box, but often you want to open up the box shortly afterwards, and that creates the inevitable trend towards either remaking it as your own box, or being one of hundreds of people who gradually grow it into a monstrosity that does everything.
But a library that is concept-focused doesn't have to get that much bigger: it's just another kind of interface, like a programming language or an operating system, and that puts it on a more sustainable track.
A representative for the union that represents more than 19,000 academic workers across the University of California system said she was surprised by the university's decision.
"We are shocked by UC's callousness, and by the violence that so many protesters experienced as they peacefully made the case for a cost of living increase," said Kavitha Iyengar, president of UAW Local 2865, in a statement. "Instead of firing TAs who are standing up for a decent standard of living for themselves, UC must sit down at the bargaining table and negotiate a cost of living increase."
Last week, the university filed an unfair labor practice charge against the union, claiming the union has failed to stop the wildcat strike by the graduate students as it is required to do by the collective bargaining agreement.
The union responded by filing its own unfair labor practice charge, alleging the university has refused to meet with the union to negotiate a cost of living adjustment.
Direct from the article. Care to back up your statement?
I thought the same thing as they. From the article:
"The strike, which is not authorized by the union that represents the graduate student employees, is in violation of the current bargaining agreement, the university said."
It's not clear that the union authorized it, despite what they're quoted as saying further down the article.
I don't know enough about unions to know whether they could authorize a strike after not explicitly authorizing it initially.
It reads like the union didn't authorize the strike, but was fine seeing what would happen/causing a reaction.
At the legal level, unions are obligated to prevent wildcat strikes. That's why people talk about a grand bargain between labor and capital happening mid century: unions would channel labor conflict into a bureaucratic process, and in return capital would agree in principle to negotiate with the union.
In practice, union bureaucrats really dislike wildcat strikes. They dissipate the negotiating power of the union; they basically play the union's most powerful card at a time the union views as suboptimal. And it's usually the threat of a strike, not the strike itself, that companies are more scared of: a company highly desires to avoid a disruption, but if one is already in progress, they don't have an incentive to negotiate unless the union has a really strong hand.
I wonder if the union seriously considered a strike or not. I read the article, but I missed anything that indicated deliberation over the strikers' cause on the part of the union. In the grand bargain, the union has to refrain from merging with management, or else it is illegitimate.
Probably not, because the strikers obviously had no leverage. The union officials knew exactly how this would go down, and they accurately believed that more subtle means of pressure would allow the workers to maximize their benefit.
Why would the union strike? The wages were exactly what they had bargained for, and what the majority had voted in favor of. It isn't like they were working without a contract.
The tradeoff in goto-vs-exception is that a goto needs an explicit label, while an exception allows the destination to be unnamed, constrained only by the callstack at the site where it's raised.
That makes exceptions fall more towards the "easy-to-write, hard-to-read" side of things; implied side-effects make your code slim in the present, treacherous as combinatorial elements increase. With error codes you pay a linear cost for every error, which implicitly discourages letting things get out of hand, but adds a hard restriction on flow. With goto, because so little is assumed, there are costs both ways: boilerplate to specify the destination, and unconstrained possibilities for flow.
Jumping backwards is the primary sin associated with goto, since it immediately makes the code's past and future behaviors interdependent. There are definitely cases where exceptions feel necessary, but I believe most uses could be replaced with a "only jump forward" restricted goto.
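For what it's worth, the forward-only shape looks roughly like this (sketched in Go, which permits goto as long as it doesn't jump over new declarations; the file-copy scenario is invented): every jump targets a label further down, so reading top to bottom tells you everything that can happen next.

```go
package main

import (
	"io"
	"os"
)

// copyFile uses only forward jumps: each goto targets a label later in the
// function, funneling error paths into explicit cleanup points.
func copyFile(dst, src string) (err error) {
	var in, out *os.File

	in, err = os.Open(src)
	if err != nil {
		goto done
	}
	out, err = os.Create(dst)
	if err != nil {
		goto closeIn
	}

	_, err = io.Copy(out, in)
	out.Close()

closeIn:
	in.Close()
done:
	return err
}

func main() {
	if err := copyFile("b.txt", "a.txt"); err != nil {
		os.Exit(1)
	}
}
```

Compared with an exception, the destination is named and visible at the jump site; compared with error codes alone, the cleanup isn't repeated at every early return.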
[0] https://news.ycombinator.com/item?id=8704318