I have a C2 OLED and it is a really nice TV. I've never connected it to the internet or tried to root it though. It behaves as a simple no-frills display.
> Arc Core is designed with MinIO as the primary storage backend
Noticing that all the benchmarking is being done with MinIO, which I presume is also running alongside/locally, so there is no network latency and it will be roughly as fast as whatever underlying disk it's operating from.
Are there any benchmarks for using actual S3 as the storage layer?
How does Arc decide what to keep hot and local? TTL-based? Access-frequency-based?
The benchmarks weren’t run on the same machine as MinIO, but on the same network, connected over a 1 Gbps switch, so there’s a bit of real network latency, though still close to local-disk performance.
We’ve also tried a true remote setup before (compute around ~160 ms away from AWS S3). I plan to rerun that scenario soon and publish the updated results for transparency.
Regarding “hot vs. cold” data, Arc doesn’t maintain separate tiers in the traditional sense. All data lives in the S3-compatible storage (MinIO or AWS S3), and we rely on caching for repeated query patterns instead of a separate local tier.
In practice, Arc performs better than ClickHouse when using S3 as the primary storage layer. ClickHouse can scan faster in pure analytical workloads, but Arc tends to outperform it on time-range–based queries (typical in observability and IoT).
I’ll post the new benchmark numbers in the next few days; they should give a clearer picture of the trade-offs.
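The "caching for repeated query patterns" idea above could be sketched roughly like this, assuming a hypothetical `query_range` function standing in for a scan against the S3-compatible store (the names and API here are illustrative, not Arc's actual implementation):

```python
from functools import lru_cache

# Hypothetical sketch: memoize time-range queries so a repeated identical
# query is served from cache instead of re-scanning object storage.
@lru_cache(maxsize=256)
def query_range(metric: str, start: int, end: int) -> tuple:
    # Stand-in for a scan against S3-compatible storage (MinIO / AWS S3).
    return tuple(range(start, end))  # pretend these are data points

query_range("cpu.usage", 0, 5)        # cold: would hit object storage
query_range("cpu.usage", 0, 5)        # hot: served from the cache
print(query_range.cache_info().hits)  # → 1
```

The point is just that hot/cold separation falls out of the cache keyed on the query shape, rather than being a distinct local storage tier.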
I am currently doing this! Working on an MMO game server implemented in Elixir. It works AMAZING and you get so much extra observability and reliability features for FREE.
I don't know why it's not more popular. Before I started the project, some people said that the BEAM VM would not cut it for performance. But this was not true. For many types of games, we are not doing expensive computation on each tick. Rather, it's just checking rules for interactions between clients and some quick AABB + visibility checks.
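For anyone unfamiliar, the per-tick check mentioned above is cheap. A minimal sketch of an axis-aligned bounding box (AABB) overlap test (coordinates here are illustrative):

```python
# AABB overlap: two axis-aligned boxes intersect iff their intervals
# overlap on every axis. This is a handful of comparisons per pair.
def aabb_overlap(a_min, a_max, b_min, b_max):
    return all(a_min[i] <= b_max[i] and b_min[i] <= a_max[i]
               for i in range(len(a_min)))

# Two 2D boxes that overlap on both axes:
print(aabb_overlap((0, 0), (2, 2), (1, 1), (3, 3)))  # → True
# Separated on the x axis:
print(aabb_overlap((0, 0), (1, 1), (2, 0), (3, 1)))  # → False
```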
I distinctly remembered that Eve Online was in Erlang, went to go find sources and found out I was 100% wrong. But I did find this thread about a game called "Vendetta Online" that has Erlang... involved, though the blog post with details seems to be gone. Anyway, enjoy! http://lambda-the-ultimate.org/node/2102
You'll never get a modern FPS gameserver with good performance written in a GC language. Erlang is also pretty slow; it's Python-like performance, very far from C#, Go, and Java.
The other reason is that the client and the server have to be written in the same language.
> The other reason is that the client and the server have to be written in the same language.
This isn't true at all.
Sure, it can help to have both client and server built using the same engine or framework, but it's not a hard requirement.
Heck, the fact that you can have browser-based games when the server is written in Python is proof enough that they don't need to be the same language.
Mobile users will hate you when your game drains their battery much faster than it should.
> I'm talking about AAA online games here, which 99% are built in c++ and the rest in c#.
It still doesn't apply. There's absolutely nothing stopping you from having a server written in Java with a game client written in C#, C++, or whatever.
I'm really curious why you think client and server must be written in the same language. A TCP socket is a TCP socket. It doesn't matter what language opens the connection. You can send bytes down from one side and decode them on the other. I mean, sure, if you're writing the server in Java and use the language's object serialization functions to encode them, you might have a hard time decoding them on the other side if the client is in C, but the answer then is to not use Java's object serialization functions. You'll roll your own method of sending updates between client and server.
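The "roll your own" approach described above can be sketched in a few lines: a fixed binary layout that any language can pack and unpack. The `(entity_id, x, y)` field layout here is purely illustrative:

```python
import struct

# A hand-rolled, language-agnostic wire format: network byte order,
# uint32 entity id followed by two float32 coordinates. A C, C++, or C#
# client can decode this with the mirror-image struct definition.
POS_UPDATE = struct.Struct("!Iff")

def encode_update(entity_id: int, x: float, y: float) -> bytes:
    return POS_UPDATE.pack(entity_id, x, y)

def decode_update(payload: bytes):
    return POS_UPDATE.unpack(payload)

wire = encode_update(42, 1.5, -3.25)
print(len(wire))            # → 12 (bytes on the wire)
print(decode_update(wire))  # → (42, 1.5, -3.25)
```

Values like 1.5 and -3.25 round-trip exactly because they are representable in float32; in general you'd budget for float32 precision loss on the client side.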
Because games are built with engines, and you're not going to re-implement all the simulation / systems in a different engine or language. Why would you? A gameserver is basically a game client stripped of rendering, with a bit more logic.
I'm talking real time games here, not an .io game over websocket/json.
I don't know about Erlang, but in other GC languages I've used, the GC only matters if you allocate; if you pre-allocate all of your buffers when the game is created, the GC doesn't matter. The other points remain true though.
I've been told that Erlang is somewhat popular for matchmaking servers. It ran the Call of Duty matchmaking at one point. Not the actual game servers though - those are almost certainly C++ for perf reasons.
Network connection, lobby, matchmaking, leaderboards, or even chat, yes. But the actual simulation, probably not for a fast-paced twitchy shooter.
Also not just for performance reasons (I wouldn't call the BEAM VM hard real-time) but also for code: your game server would usually be the client but headless (without rendering). Helps with reuse and architecture.
In the case of Call of Duty: Black Ops 1, the matchmaking + leaderboards system was implemented by DemonWare (3rd party) in Erlang.
Erlang actually has good enough performance for many types of multiplayer games. Though you are correct that it may not cut it for fast-paced twitch shooters. Well... I'm not exactly sure about that. You can offload lots of expensive physics computations to NIFs. In my game the most expensive computation is AI path-finding, though this never occurs on the main simulation tick. Other processes run this on their own time.
The biggest hurdle to a game server written entirely on the BEAM is the GC. GC pauses just take too much time, and when you need to get out (for example) 120 updates per second, you can't afford it. Even offloading stuff to C or C++ does not save you, because you either have to use the GC, do a copy, or both.
Game servers typically use very cheap memory allocation techniques like arenas and utilize DOD. It's not uncommon for a game server simulation to be just a bunch of arrays that you grow, never shrink, and then reset at the end of the game.
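The arena pattern described above, sketched in Python purely to illustrate the idea (in Python the stored objects are still heap-allocated; in C/C++ the slots would be raw preallocated memory):

```python
# Arena sketch: allocate the backing array once, bump an index during the
# match, and "free" everything at match end by resetting the index in O(1).
class Arena:
    def __init__(self, capacity: int):
        self.slots = [None] * capacity  # grown once, never shrunk
        self.used = 0

    def alloc(self, value):
        self.slots[self.used] = value   # bump allocation: no search, no free list
        self.used += 1
        return self.used - 1            # handle = index into the arena

    def reset(self):
        self.used = 0                   # end of match: reset, don't deallocate

arena = Arena(1024)
handle = arena.alloc({"hp": 100})
print(arena.used)  # → 1
arena.reset()
print(arena.used)  # → 0
```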
Good point. Yeah, I guess it wouldn't cut it for any fast-paced twitch shooter, especially with a 120-updates-per-second deadline. A non-deterministic GC pause could have disastrous effects, especially in a tense shootout. I don't know much about GC theory, but the GC in BEAM is per-process and heap-based? I'm not sure exactly what that entails, but can you not structure the main simulation process to take advantage of this fact?
I find myself interested in developing multiplayer simulations with more flexible deadlines. My MMO runs at 10 ticks per second, and it's not twitch-based. So the main simulation process can have pauses and it wouldn't have a big impact on gameplay. Though this has never occurred.
As long as (tick process time) + (send update to clients) + (GC pause) < 100 ms, everything is fine? (My assumption.)
Btw what does DOD mean? Is it Data on Demand? Since my game is persistent I can't reset arrays at some match end state. So I store things either in maps on the main server process or I store it in the dedicated client process state (can only be updated via server process).
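The budget assumption above can be written out as a simple per-tick check (numbers here are made up for illustration):

```python
# At 10 ticks per second, each tick has a 100 ms budget that simulation
# work + sending updates + any GC pause must fit inside.
TICK_RATE_HZ = 10
TICK_BUDGET_MS = 1000 / TICK_RATE_HZ  # → 100.0 ms

def within_budget(sim_ms: float, send_ms: float, gc_pause_ms: float) -> bool:
    return sim_ms + send_ms + gc_pause_ms < TICK_BUDGET_MS

print(within_budget(40.0, 10.0, 20.0))  # → True  (70 ms used)
print(within_budget(40.0, 10.0, 60.0))  # → False (110 ms blows the deadline)
```

At 120 Hz the same check runs against an ~8.3 ms budget, which is why a multi-millisecond GC pause is tolerable here but fatal there.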
This is government overreach. These are novelty/fun apps. They are not critical infrastructure or needed in any way. This would be like a court ordering a local bar to serve more than just beer and wine, to accommodate people who like sake and soju. You have free choice to use the social platforms that you want to use. Really don't understand this kind of action tbh.
Facebook is a massive part of social media, with billions of users. It is a part of society by its sheer size. A society decided “we want to make this better” and acted appropriately. I think it’s a noble pursuit for a society to attempt to reduce the clearly negative aspects of social media.
There is no real freedom of choice. The network effect cements big players' positions. Try telling an 80-year-old grandma with a 20-year-old laptop to use Mastodon. Likely no one she knows is on it.
Finally, individuals make essentially no difference when choosing not to use FB, whereas choosing not to go to a local bar might cost it 0.03% of its monthly revenue. The only actor that can reasonably bargain with huge organizations is other huge organizations.
I guess so, and that definitely is the healthier option! That’s what I’ve been doing for close to 15 years.
The only way you can exist in the modern world is to accept the TOS and crap that a small number of companies are in control of. Saying “just don’t use major tech services” makes people revert back to older methods like snail mail, which is just silly, not a real choice.
The consequences of “simply choosing no” matter. If that means you can’t interact in modern society, then that service is essential, and should be treated as such.
Actually, in many places there are regulations on these things. For example, in France, they are legally obligated to serve several non-alcoholic drinks if they serve beer and wine.
Many jurisdictions do in fact include what drinks are available as license conditions. E.g. a license for beer and wine can be easier to get than a license for liquor. Or breweries (beer only) get a special category with fewer restrictions (e.g. they aren't required to provide food), and so on.
This is true when there’s plenty of competition that matters. When everybody and every business is in one app, the network effect forces everyone to be there or be invisible. So the “essential infrastructure” label is kinda debatable. I suppose it’s essential for many businesses.
So your analogy should be more like: there’s one big shopping mall network in the city that basically everyone has to go to, because certain stores are only there, and the owners bought up any competitor that started becoming popular, so there’s no prospect of competition either.
FarmLogs (YC 12) did exactly this. We used sat imagery in the near-infrared spectrum to determine crop health remotely. Modern farming utilizes a practice called precision ag - where your machine essentially has a map of zones on the field for where treatments are or aren't needed and controllers that can turn spray nozzles on/off depending on boundaries. We used sat imagery as the base for an automated prescription system, too. So a farmer can reduce waste by only applying fertilizer or herbicide in specific areas that need it.
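The crop-health signal behind this kind of system is typically NDVI, the standard vegetation index computed per pixel from near-infrared and red reflectance. A minimal sketch (the 0.4 treatment threshold is illustrative, not FarmLogs' actual prescription logic):

```python
# NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation reflects strongly
# in near-infrared and absorbs red, so higher NDVI ≈ healthier crop.
def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

def needs_treatment(nir: float, red: float, threshold: float = 0.4) -> bool:
    # Low NDVI suggests stressed or sparse vegetation, so the zone becomes
    # a candidate for targeted fertilizer/herbicide in the prescription map.
    return ndvi(nir, red) < threshold

print(round(ndvi(0.50, 0.10), 3))   # healthy pixel → 0.667
print(needs_treatment(0.30, 0.25))  # → True (NDVI ≈ 0.09, stressed zone)
```

A prescription map is then basically this decision evaluated over every zone of the field, driving the sprayer's section control.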