> I assume you think that research was also "gut instinct", rather than "thorough research".
My goodness no. "Gut instinct" is how the decisions are made, but the research you're talking about was made for a proposal paper. There are different incentives in play.
For the proposer, the incentive is to get something to show for the enormous effort expended in making a proposal - usually months, sometimes years, across dozens of meetings, discussions, and presentations. It's soul-destroying stuff. Ideally the sub-committees you're seeing approve your work and it moves on to another committee. More likely they have suggestions for how it could be altered to satisfy them, and after a few iterations that can result in approval of a revised document. Often they just have open-ended questions for you, which some future proposal document might satisfy by answering them somehow. Or they just aren't interested and you're told to go away.
A show of extensive research might make it easier to achieve your goal, so you have an incentive to make that research seem as comprehensive as possible.
But the people making the decision don't have that incentive. They could - in principle - spend hours on reading all the work you did, they could - in principle - replicate that work or even do their own research. In reality they are probably thinking about whether they can break early or move on to something they care about more. I would summarise their reasoning as gut instinct. Does this sound like something we should do? Maybe not. Straw poll question: Do we want this? Vote Against, nothing personal.
I mentioned JeanHyde before. JeanHyde has seen how this sausage is made; be sure to read his account and think about it carefully before believing any fairy tales you've heard or any imagined process. Remember, the essence of JeanHyde's proposal was just this: 1) It would sure be nice to use blobs of binary data in my programs. 2) The existing ways to achieve this are garbage - so we need a new one.
JeanHyde spent years defending basic, obvious stuff in front of people strongly motivated to believe he's wrong, since that's just easier than doing any work. At its most basic the question is: given a lot of bytes of data in a file, or the same bytes written as ASCII hexadecimal C literals, which can be processed more quickly? The committee was strongly motivated to insist the answer was the ASCII hex, even though JeanHyde had tables showing the raw data is much faster.
The committee hallucinated into existence rules, such as that JeanHyde's proposal can't be in the standard unless there are working implementations. If you're wondering why your C++ compiler didn't have a complete C++ 20 implementation in 2021 despite such a rule supposedly existing: that's because there is no such rule, it's an excuse.
Another hallucinated rule is very amusing to Rust programmers. WG21 would like to believe that C++ compilation doesn't result in executing code. So, if Bob makes a malicious C++ program, sure, running the program might be bad, but certainly compiling it is fine. This belief is laughable, but laughing at them won't get your proposal accepted, so you must try to navigate the fantasy world they live in, where their C++, which doesn't have this capability, can accept your proposal, without introducing the capability C++ already has. It's like you're playing Mornington Crescent with opponents who believe there are rules and they know what they are. Terrifying.
And so it isn't in C++ 23: the C++ 23 standard doesn't have JeanHyde's proposal. WG14 took #embed for C23, so C23 does have it, and of course in reality C++ programmers can expect to benefit from that - and that's the awful, miserable reality you're defending.
> how do you know the extra effort in a full Crater run of all crates in crates.io + GitHub is useful, rather than primarily FOMO-driven anxiety?
It periodically finds problems. And Rust is equipped to deal with those problems so the forewarning is practically useful.
In C++, if a syntax change breaks some fraction of programs, well, too bad. I guess it would be nice to know, but as you saw, the committee might (or might not) do it anyway. In Rust, that breakage can be handled via the Editions mechanism. But to do that you need to know about it before you ship the compiler with the syntax change, so as to mark it as applying only to the future edition you're adding it to.
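A minimal sketch of how the Editions mechanism is wired up (the crate name is hypothetical): each crate declares its edition in Cargo.toml, and edition-gated syntax changes apply only to crates that opt in. For example, `async` only became a reserved keyword in the 2018 edition, so 2015-edition code using it as an ordinary identifier keeps compiling on new compilers.

```toml
# Cargo.toml (sketch, hypothetical crate)
[package]
name = "example"
version = "0.1.0"
# Opting into the 2018 edition is what makes 2018-only syntax
# (and 2018-only breakage) apply to this crate.
edition = "2018"
```

A Crater run before the release is what tells the Rust team whether a planned change needs this edition gating at all, which is the forewarning being asked about.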