(1) Explain just how malloc() and free() work under the covers and the implications for multi-threading, memory leaks, virtual memory paging, etc.
Maybe also cover some means, algorithms, and code for reporting on the state, status, etc. of the memory use by malloc() and free().
By the way, I know and have known well for longer than most C programmers have lived JUST what the heap data structure, as used in "heap sort", is. But what is the meaning of "the heap" in C programming language documentation?
(2) Cover in overwhelmingly fine detail the "stack" and the chuckhole in the road, stack overflow.
(3) Where to get a reliable, reasonable package of code for handling character strings -- what I saw and worked with in C is not reasonable.
(4) From the C programming I did, it looks like a large C program for significant work involves some hundreds, maybe tens of thousands, of includes, inserts, whatever, and what a linkage editor would call external references. There must somewhere be some tools to help a programmer make sense of all those includes and references, the resulting memory maps, issues of locality of reference, word boundary alignment, etc.
(5) How can C exploit a processor with 64 bit addressing and main memory in the tens of gigabytes and maybe terabytes?
(6) How can C support, i.e., exploit, integers and IEEE floating point in 64 and/or 128 bit lengths?
(7) How to handle exceptional conditions with, say, non-local gotos and without danger of memory leaks?
(8) Sorry, but far and away my favorite programming language long has been and remains PL/I, especially for its scope of names rules, handling of aggregates with external scope, its data structures, and its exceptional condition handling with non-local gotos and freeing of automatic storage and, thus, avoiding memory leaks. Of course I can't use PL/I now, but the problems PL/I solved are still with us, including when writing C code. So, how to solve these problems with C code?
(9) For C++, please explain how that works under the covers. E.g., some years ago it appeared that C++ was defined only as a source code pre-processor to C. Is this still the case? If so, then explaining C++ under the covers should be feasible and valuable.
(1) There are several implementations; most are based on Knuth's "boundary tag" algorithms. As to "heap": a stack has only one accessible end, while a heap is essentially randomly accessible. It has nothing to do with the heap data structure used in heap sort.
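In case a concrete picture helps, here is a minimal sketch of the boundary-tag idea. It is illustrative only -- the struct name, field layout, and toy arena below are made up for the example and do not match glibc, musl, jemalloc, or any other real malloc().

```c
/* Illustrative sketch of the boundary-tag idea only -- not the layout
 * used by any real allocator. */
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

typedef struct {
    size_t size;     /* payload size in bytes                          */
    bool   in_use;   /* real allocators usually pack this into "size"  */
} boundary_tag;

/* Each block is laid out as [header][payload][footer], blocks sit back
 * to back, and the tag is duplicated at both ends.  That duplication is
 * the whole trick: free() can read the footer immediately before a
 * block's header to find the previous block, and the header immediately
 * after its footer to find the next one, so adjacent free blocks can be
 * coalesced in constant time. */
static boundary_tag *footer_of(boundary_tag *hdr)
{
    return (boundary_tag *)((char *)(hdr + 1) + hdr->size);
}

int main(void)
{
    /* A toy arena holding a single 32-byte block, just to exercise the layout. */
    static _Alignas(boundary_tag) unsigned char arena[2 * sizeof(boundary_tag) + 32];

    boundary_tag *hdr = (boundary_tag *)arena;
    hdr->size   = 32;
    hdr->in_use = true;
    *footer_of(hdr) = *hdr;                    /* mirror the tag at the far end */

    printf("32-byte block: header at offset 0, footer at offset %td\n",
           (unsigned char *)footer_of(hdr) - arena);
    return 0;
}
```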
(2) Stack overflow can occur even early within a program. I've campaigned for a requirement that such overflows be caught and integrated into a standard exception handler, to no avail.
(3) Why not code your own, so there won't be arguments about it?
(4) There are lots of tools for program development, but none of that is standardized by WG14.
(5) Use wider integer types.
(6) Use wider floating-point representations.
(7) Standard C doesn't specify such a facility, but it has occasionally been suggested; see the setjmp() sketch after this list for the usual manual workaround.
(8) There were a lot of books, e.g. on structured system analysis, during the 1970s trying to apply lessons learned. C isn't special in that regard, as many of the big problems don't involve syntax.
(9) C++ is now a big language, and it takes a lot of work to master its internals.
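On (7), the usual manual workaround mentioned above is setjmp()/longjmp() plus explicit cleanup: standard C gives you the non-local goto, but nothing is freed automatically, so the code has to release its own allocations before (or at) the jump. A minimal sketch, with the error codes and the worker() function invented for the example:

```c
#include <setjmp.h>
#include <stdio.h>
#include <stdlib.h>

static jmp_buf recovery_point;

static void worker(void)
{
    char *buf = malloc(1024);
    if (buf == NULL)
        longjmp(recovery_point, 1);      /* nothing to clean up yet */

    /* ... do some work that may fail ... */
    int failed = 1;                      /* pretend something went wrong */
    if (failed) {
        free(buf);                       /* release before jumping, or it leaks */
        longjmp(recovery_point, 2);
    }

    free(buf);
}

int main(void)
{
    int code = setjmp(recovery_point);   /* returns 0 on the initial call */
    if (code != 0) {
        fprintf(stderr, "recovered from error %d\n", code);
        return EXIT_FAILURE;
    }
    worker();
    return EXIT_SUCCESS;
}
```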
It has been many years since a C++-to-C preprocessor has been commonplace. There's just too much new stuff in recent C++ to map it all easily into straight C.
> Explain just how malloc() and free() work under the covers and the implications for multi-threading, memory leaks, virtual memory paging, etc.
>
> Maybe also cover some means, algorithms, and code for reporting on the state, status, etc. of the memory use by malloc() and free().
Strictly speaking, these are implementation details that the C standard leaves unspecified. If you want to know how the memory allocation functions work, or want methods for inspecting the state of the heap, you'll need to look at a specific implementation (e.g., glibc, musl, jemalloc), since the details can vary wildly between implementations.
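For example, if you happen to be on glibc, it ships a few non-standard reporting calls in <malloc.h>; a short sketch, assuming glibc 2.33 or later for mallinfo2() (other allocators such as musl or jemalloc have their own, different facilities):

```c
/* glibc-specific: these calls are not part of standard C. */
#include <malloc.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    void *p = malloc(1 << 20);           /* grab a megabyte so there is something to report */

    struct mallinfo2 mi = mallinfo2();   /* glibc >= 2.33 */
    printf("heap obtained from the OS: %zu bytes\n", mi.arena);
    printf("in use by the program:     %zu bytes\n", mi.uordblks);
    printf("free inside the heap:      %zu bytes\n", mi.fordblks);

    malloc_stats();                      /* fuller report, printed to stderr */

    free(p);
    return 0;
}
```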
> Cover in overwhelmingly fine detail the "stack" and the chuckhole in the road, stack overflow.
Neither of these is really specific to C, and there are plenty of resources that explain the concepts ([0], [1] for some general explanations). Did you have more specific questions in mind?
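If it helps to make the failure mode concrete, here is a deliberately broken toy: each call pushes another frame, nothing ever returns, and on a typical system with a fixed-size stack the process dies (often with a segmentation fault) once the limit is hit. The C standard itself just calls this undefined behaviour, so the exact symptom varies by platform.

```c
#include <stdio.h>

static unsigned long recurse(unsigned long depth)
{
    volatile char frame[1024];              /* make each frame visibly large          */
    frame[0] = (char)depth;                 /* touch it so it is not optimised away   */
    return recurse(depth + 1) + frame[0];   /* work after the call defeats tail-call
                                               elimination, so the stack must grow    */
}

int main(void)
{
    printf("never printed: %lu\n", recurse(0));   /* the stack is exhausted first */
    return 0;
}
```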
> How can C exploit a processor with 64 bit addressing and main memory in the tens of gigabytes and maybe terabytes?
> How can C support, i.e., exploit, integers and IEEE floating point in 64 and/or 128 bit lengths?
I think pointer/integer sizes are implementation details. C specifies pointer behavior and minimum integer sizes (and optional fixed-width types), but the precise widths are chosen by the implementation. For floating point, implementations that follow IEEE 754 (C's optional Annex F) provide the standard 32- and 64-bit formats, with wider formats available as extensions.
In other words, you don't really need to do anything special as long as you pick the appropriate types as defined by your implementation.
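A short sketch of the portable spellings, in case it helps; the widths printed will vary by implementation, and the 40 GiB figure is only there to show that byte counts of that size are representable once size_t and pointers are 64 bits (as on common LP64 platforms):

```c
#include <inttypes.h>   /* PRId64 and friends               */
#include <stdint.h>     /* int64_t, uint64_t, INT64_C, ...  */
#include <stdio.h>

int main(void)
{
    int64_t  big  = INT64_C(9000000000000000000);       /* ~9.0e18, needs 64 bits          */
    uint64_t huge = UINT64_C(40) * 1024 * 1024 * 1024;   /* 40 GiB expressed as a byte count */

    printf("int64_t     : %zu bits, value %" PRId64 "\n", sizeof big * 8, big);
    printf("uint64_t    : %zu bits, value %" PRIu64 "\n", sizeof huge * 8, huge);
    printf("void *      : %zu bits\n", sizeof(void *) * 8);
    printf("size_t      : %zu bits\n", sizeof(size_t) * 8);
    printf("double      : %zu bits\n", sizeof(double) * 8);
    printf("long double : %zu bits of storage\n", sizeof(long double) * 8);

    /* Wider still, where the implementation offers them: __int128 is a
     * GCC/Clang extension, and _Float128 comes from TS 18661-3 / C23 as
     * an optional feature; neither is required by the base standard. */
    return 0;
}
```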
> For C++, please explain how that works under the covers. E.g., some years ago it appeared the C++ was defined as only a source code pre-processor to C. Is this still the case?
As far as I know no (production-quality?) C++ compiler has been implemented as a source-level preprocessor for basically the entirety of C++'s existence [2]. The very first "compiler" for C++ was Cpre, back when C++ was still the C dialect "C with classes" (around October 1979), and that was indeed a preprocessor. It was replaced by the Cfront front end around 1982-1983, about when "C with classes" started gaining new features and got a new name. Cfront was a proper compiler front end that emitted C code, and I think from that point on C++ compilers used "standard" compiler tech.
On stack overflow, my understanding was that one could suddenly encounter that fatal condition from a too-deep call stack, that is, too many calls without a return. So, if the "stack" is, say, a finite resource, then the programmer should know in the code how much of that resource is being used and act accordingly.
For a preprocessor for C++, IIRC at one point the definition of C++ was in terms of a preprocessor -- I was just thinking of the definition, that is, hoping to get a more explicit definition of C++. I've always understood that C++ implementations have always, or nearly always, been ordinary compilers. The issue is that, at least at one time, it seemed difficult to be precise about C++ semantics, that is, what the code would do and how it would do it. Maybe now C++ is beautifully documented.
> So, if the "stack" is, say, a finite resource, then the programmer should know in the code how much of that resource is being used and act accordingly.
And this is true, but IIRC statically determining stack bounds for arbitrary programs is not an easy problem to solve, especially if you call into opaque third-party libraries.
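One crude way to "act accordingly" at run time -- sketched here on the assumption that you control the recursive code and can pick a conservative limit for your platform -- is to thread a depth budget through the recursion. It bounds call depth, not bytes of stack, and the walk() function and its numbers below are invented for the example:

```c
#include <stdio.h>

/* walk() is a made-up recursive traversal: it returns 0 on success and
 * -1 if the depth budget ran out before the recursion bottomed out. */
static int walk(long node, int depth_budget)
{
    if (depth_budget <= 0)
        return -1;                     /* bail out instead of risking overflow */

    /* ... process "node" here ... */

    if (node > 0)                      /* pretend each node has one child */
        return walk(node - 1, depth_budget - 1);
    return 0;
}

int main(void)
{
    if (walk(100000, 10000) != 0)
        fprintf(stderr, "structure too deep for the configured budget\n");
    return 0;
}
```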
> For a preprocessor for C++, IIRC at one point the definition of C++ was in terms of a preprocessor
I wouldn't know about defining C++ in terms of transformations to C, and searching for that is more difficult. I would guess that the abandonment of the preprocessor approach to compilation would also have meant the abandonment of defining C++ in terms of C, especially once C++ really started picking up features.
> The issue is that at least at one time it seemed difficult to be precise about C++ semantics, that is, what the code would do and how it would do it. Maybe now C++ is beautifully documented.
C++ has had an ISO standard (ISO/IEC 14882) since 1998, which might count as documentation for you.
If the Standard were to make recursion an optional feature, many programs' stack usage could be statically verified. Indeed, there are some not-quite-conforming compilers which can statically verify stack usage -- a feature which for many purposes would be far more useful than support for recursion.