Both those questions are answered clearly in the readme:
> compared to avante
> I think it's fairly similar. However, magenta.nvim is written in typescript and uses the sdks to implement streaming, which I think makes it more stable. I think the main advantage is the architecture is very clean so it should be easy to extend the functionality. Between typescript, sdks and the architecture, I think my velocity is pretty high. I haven't used avante in a while so I'm not sure how close I got feature-wise, but it should be fairly close, and only after a couple of weeks of development time.
And:
> Another thing that's probably glaringly missing is model selection and customization of keymappings, etc...
Without knowing exactly how createNewGroup and addFileToGroup are implemented it is hard to tell, but it looks like the code snippet has a bug where the last group created is never pushed to the groups variable.
I'm surprised this "senior developer AI reviewer" did not catch this bug...
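The bug pattern being described is a classic one: a loop that flushes the current group only when a *new* group starts, so the final group is never appended. A minimal sketch of that pattern (the function name, the grouping key, and the overall shape here are assumptions for illustration, not the actual snippet under review):

```rust
// Hypothetical reconstruction of the described bug pattern; the real
// createNewGroup/addFileToGroup snippet is not reproduced here.
fn group_files(files: &[(&str, u32)]) -> Vec<Vec<String>> {
    let mut groups: Vec<Vec<String>> = Vec::new();
    let mut current: Vec<String> = Vec::new();
    let mut current_key: Option<u32> = None;

    for &(name, key) in files {
        if current_key != Some(key) {
            // Flush the previous group before starting a new one.
            if !current.is_empty() {
                groups.push(std::mem::take(&mut current));
            }
            current_key = Some(key);
        }
        current.push(name.to_string());
    }
    // This final push is exactly what the buggy version omits:
    // without it, the last group built is silently dropped.
    if !current.is_empty() {
        groups.push(current);
    }
    groups
}
```

The fix is always the trailing flush after the loop, which is easy to forget precisely because every test case whose input ends mid-group will catch it, while "happy path" reviews often won't.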
Dynamic linking has always been cool for writing plugins.
It is kind of ironic that languages praised so much for going back to early linking models have to resort to much heavier OS IPC for similar capabilities.
IIUC, Go and Rust resort to OS-IPC-based plugin systems mainly because they refused to have a stable ABI.
On the other hand, at $DAYJOB we have a query engine written in C++ (which itself uses mostly static linking [1]) loading mostly static linked UDFs and ... it works.
[1] Without glibc, but with libstdc++ / libgcc etc.
Doesn’t Rust’s static linking also have to do with its strategy of aggressive monomorphization? IIRC, for instance, every concrete instantiation of a generic type gets its own compiled binary code, so it would basically be impossible for a dynamic library to provide this, since it wouldn’t know how it would be used, at least not without some major limitations or performance tradeoffs.
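The tradeoff described above can be made concrete: a generic function is monomorphized, i.e. a separate machine-code copy is generated per concrete type, in the caller's crate, so a precompiled dynamic library cannot ship copies for types it has never seen. The escape hatch (the "performance tradeoffs") is runtime dispatch via trait objects. A small sketch:

```rust
// Monomorphized: the compiler emits one copy of this function per
// concrete T it is called with. A precompiled .so could not do this
// for types it has never seen.
fn largest<T: PartialOrd + Copy>(items: &[T]) -> T {
    let mut max = items[0];
    for &x in &items[1..] {
        if x > max {
            max = x;
        }
    }
    max
}

// Trait object: dispatched through a vtable at runtime, at the cost of
// indirection, so a single compiled copy *can* live behind a
// dynamic-library boundary.
fn describe(x: &dyn std::fmt::Display) -> String {
    format!("value: {x}")
}

fn main() {
    // One monomorphized instance for i32, another would be made for f64.
    println!("{}", largest(&[3, 1, 4, 1, 5]));
    // One shared instance, chosen through the vtable.
    println!("{}", describe(&2.5f64));
}
```

This is why Rust plugin interfaces that do cross a dynamic boundary tend to be built on `dyn Trait` (or a C ABI), rather than on generics.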
Well if it loads code dynamically, it is no longer static linking.
Also, it isn't as if there is a stable ABI for C and C++ either, unless everything is compiled with the same compiler, or you use Windows-style dynamic libraries, or something like COM to work around the ABI limitations.
Hosting some services on a VPS provides far better availability of data than doing it at home, especially when you need it the most: for example, when you are moving to a new home and need to access documents, or when you are abroad, need a document, and there is a power outage at home.
Even if you have to trust a third party with your data, (1) you can minimize the privacy risk with encryption, and (2) VPS/cloud providers usually have different privacy guarantees than free Google Drive...
It is interesting, because whether SMART is a "niche jargon term" depends heavily on the audience. The author had no reason to assume the post would end up on HN, where the audience is probably larger than the originally intended one.
In many places NixOS is a niche jargon term.
Also, the first paragraph of the article has a direct link to smartd with a clear explanation of what SMART is, so there is no need to Google.
This explains why they comply with the EU regulations even in the USA, but not why they don't lobby harder in the EU to avoid those regulations in the first place.
Essential complexity can also be created and destroyed, though sometimes it happens earlier in the design process. Picking the problem you choose to solve is how you control essential complexity.
Essential complexity is inherent to the problem you have. The solution is layered between product design, technical design, and implementation. What is essential complexity for one layer can be accidental for the layer above.
It's often a matter of framing. When you abstract, refactor, or move complexity, it should serve to make the rest of the system/application easier to understand, or to make those adding features to the application(s) more productive as a whole. If it doesn't do one of those things, you probably shouldn't do it.
It's one thing to shift complexity to a different group/team that can specialize in one aspect of many applications/systems, such as managing microservice infrastructure and orchestration. It's very different to create those abstractions and separations when the same people will be doing the work on both sides, as this adds cognitive overhead to the existing team(s), especially if your application's performance isn't in imminent danger of suffering as a whole.
> But they should consider whether this matches the size/scope of the problem being solved
In professional software development projects, especially legacy projects, the complexity is often not justified by the problem. Technical debt keeps piling up, and eventually it starts getting in the way.