by vmsp on 10/15/23, 1:40 PM with 178 comments
by maccard on 10/15/23, 2:54 PM
They're not supported in CMake (unless you set CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API to 2182bf5c-ef0d-489a-91da-49dbc3090d2a, and that UUID only works if you're on the same version of CMake as I am).
My experience is that no IDEs support them properly, so you either need to #ifdef them out for IntelliSense or go back to no tooling support.
The binary module format breaks with every compiler update, so you can't distribute precompiled modules.
Last but definitely not least, they're slower than headers.
So, we've got a feature that we started talking about 11 years ago, standardised 3 years ago, still not fully implemented, with no build-system support, and a downgrade from the existing solution in every real-world use case I've seen. What a mess.
by mathstuf on 10/15/23, 9:06 PM
It is my opinion that hand-coded Makefiles are unlikely to handle modules well. They require the same strategy as Fortran modules, where the (documented!) approach from Intel was to run `make` in parallel until it works (basically using the `make` scheduler to eventually stumble onto a working TU compilation order). Of course, there's no reliable way to know whether you picked up a stale module, or anything like that, if there's a cycle or a module rename leaving old artifacts around.
There is the `makedepf90` tool that could provide the "dynamic dependencies" with a fixed-level recursive Makefile to put the compilations in the right order (though I think you may be able to skip the recursive layer if you have no sources generated by tools built during the build).
Anyway, this strategy fails if you want implementation-only modules, modules with the same name (that don't end up in the same process space; debug and release builds in a single tree are the more common example), or modules from other projects (since you need to add rules to compile their module interfaces as part of your own build).
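For illustration (file and module names invented here), the ordering problem shows up with as few as two files: the interface unit has to be compiled first so its BMI exists before any importer is compiled, and a plain Makefile has no way to know that edge without scanning the sources.

    // math_utils.cppm -- module interface unit; must be compiled first so
    // that its BMI exists before any importer is compiled
    export module math_utils;
    export int add(int a, int b) { return a + b; }

    // consumer.cpp -- compiling this before math_utils.cppm either fails
    // outright or silently picks up a stale BMI from an earlier build
    import math_utils;
    int main() { return add(1, 2); }

Running `make -jN` until that order happens to be respected is exactly the Fortran-style strategy described above.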
by boris on 10/15/23, 5:19 PM
by humanrebar on 10/15/23, 5:22 PM
Instead, some sort of dependency graph amongst the .ccm files needs to be encoded in the Makefiles. There is an ISO paper describing how to query a compiler for this information (see https://wg21.link/p1689). Other papers sketch out how one might incorporate that into a dynamic Makefile graph, but so far nobody seems interested in going that far. Switching to a more actively maintained build system is probably the more interesting alternative.
If you want to learn more about how build systems can support C++ modules, see this blog post by the CMake maintainers. The principles translate to most (all?) nontrivial build systems.
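For a rough idea of what that query returns (a sketch with field names recalled from memory, not the exact P1689 schema), the compiler emits one record per TU listing what it provides and what it requires, and the build system turns those records into graph edges:

    {
      "version": 1,
      "rules": [
        { "primary-output": "app.o",
          "requires": [ { "logical-name": "mylib" } ] },
        { "primary-output": "mylib.o",
          "provides": [ { "logical-name": "mylib", "is-interface": true } ] }
      ]
    }

From records like these the build system knows mylib's interface has to be compiled before app.o, without anyone hand-maintaining that edge in a Makefile.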
by swingingFlyFish on 10/15/23, 3:57 PM
"As far as I know, header files came to be because when C was built disk space was at a prime and storing libraries with their full source code would be expensive. Header files included just the necessary to be able to call into the library while being considerably smaller."
The modules/packages built from C/C++ for Python (and maybe other scripting languages), for example, were all compiled against the .h files, for exactly the reason the OP states: efficiency. Header files should really only include declarations, and when you compile, you get tight code using only what you included.
Wouldn't modules include everything? Isn't that like saying:
import *
as opposed to say:
import module.package
I'm just confused about how this will help statically typed languages like C/C++ keep compiling down to very efficient machine code or executables.
Someone enlighten me.
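A minimal sketch (module name and contents invented) may make the question concrete: only names marked `export` are visible to importers, and `import` names one specific module, so it behaves much more like `import module.package` than like `import *`.

    // widgets.cppm -- only names marked 'export' are visible to importers
    export module widgets;
    export void draw() {}        // exported: importers can call this
    int cache_size = 8;          // not exported: stays internal to the module

    // app.cpp -- 'import' names one specific module; nothing wildcard about it
    import widgets;
    int main() { draw(); }       // OK; cache_size is not visible here

Codegen is unaffected: the importer still only uses what it references, the declarations just come from a compiled module interface instead of a textual header.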
by miki123211 on 10/15/23, 6:17 PM
As far as I understand it, the SQLite developers have a tool [1] that can parse all their (C) files, figure out what each file declares and what declarations it depends on, and then autogenerate a header for every file with just the declarations it requires.
It seems like a sensible approach, but nobody besides SQLite and Fossil (which comes from the same developers) seems to be using it, either in C or C++, which leads me to suspect that there are caveats I don't know about. Is there a reason why this is a bad idea?
[1] https://fossil-scm.org/home/doc/trunk/tools/makeheaders.html
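Roughly (a hand-written illustration, not actual makeheaders output), the idea is that from an ordinary source file the tool emits a header containing only the prototypes other files need:

    /* encoder.c -- ordinary source file, no hand-written header */
    int encode(const char *in, char *out) { out[0] = in[0]; return 1; }
    static int helper(void) { return 1; }   /* static: never exported */

    /* encoder.h -- generated: just the declarations callers need */
    int encode(const char *in, char *out);

Callers include the generated encoder.h rather than anything written by hand.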
by evrimoztamur on 10/15/23, 3:36 PM
You guys OK in C++ land?
by malkia on 10/15/23, 4:39 PM
by kazinator on 10/16/23, 6:20 AM
I programmed in Modula-2; that definitely didn't need make.
The module definitions need to be parsed and semantically analyzed, so the compiler has to be the build tool.
by jll29 on 10/15/23, 3:46 PM
Can modules be nested? (I'm thinking java.util.StringTokenizer.) If so, has a module hierarchy (as for Java) been defined for all standard library functionality?
Are there any compilers that implement C++20 in full in 2023?
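One thing worth noting (the module name below is invented): C++20 allows dots in module names, but they are purely part of the name; there is no Java-style hierarchy or nesting, and C++20 itself does not define named modules for the standard library (a single `import std;` only arrives with C++23).

    // text.tokenizer.cppm -- dots are allowed in a module name, but they are
    // just characters in the name; no hierarchy or nesting is implied
    export module text.tokenizer;
    export bool is_space(char c) { return c == ' '; }

    // user.cpp -- you import the full name; 'import text;' would refer to a
    // completely different (and here nonexistent) module
    import text.tokenizer;
    bool b = is_space(' ');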
by adjav on 10/15/23, 4:04 PM
by up2isomorphism on 10/15/23, 5:24 PM
by wly_cdgr on 10/15/23, 3:17 PM
by delta_p_delta_x on 10/15/23, 4:08 PM
No, please. If you're using C++20 modules, you also owe it to yourself to use a modern build system...
Edit: it seemed like I fired right into a Makefile-shaped hornet's nest.
by mgaunard on 10/15/23, 3:45 PM
The thing is that they're heavily implementation-specific. Two of the biggest compiler vendors pushed the model that worked best for their own implementation. The Microsoft model won the committee politics, which is why they're the only ones with an implementation. Too bad most C++ developers don't really use Microsoft software to begin with.
They should have just stuck with a simple PCH-but-you-don't-have-to-include-it-first model. That would have been straightforward and immediately useful to everybody.