xparq / nmake-jumpstart

Single-file MSVC/NMAKE "jumpstart" Makefile for building source trees without fuss or extra tooling
WARNING: '80s tech! Still in early development! ;) It aims to have the following features, though:

- [x] Single-file drop-in build to kickstart simple C/C++ projects
- [x] Flexible and scalable enough to allow reasonable growth without worrying about a real build system
- [x] Familiar config macros, with ready-to-use defaults for common setups
- [x] Basic facilities to configure external deps. (libs)
- [x] Doesn't impose any additional external deps. itself
- [x] Handle src. subdirs transparently (something NMAKE hates to do...)
- [x] Build a lib from (part of) the sources
- [x] Build executable(s), too, of course
- [ ] Basic (configurable) smoke-testing of build results
- [x] DEBUG/release build alternatives (DEBUG=1/0, default: 0)
- [x] Static/DLL CRT build alternatives (CRT=static/dll, default: static)
- [x] Separate target trees for the (incompatible) build alternatives
- [x] Cleanup tasks (either a build alternative only, or all outputs)
- [ ] Debug/release executables can be the same file (with overwrite) (may require "Auto-detect changes in the build command line")
- [ ] Header dependency auto-tracking
- [ ] (Some?) support for C++ modules
- [ ] Auto-detect changes in the build command line or in env. vars to trigger full rebuild (might be possible even without a thin runner script to capture the make command line)

(Note: the "src" and "include" dirs. in the repo are just examples/tests.)

DEV. NOTES:

* Due to NMAKE's inability to handle arbitrary subdirs flexibly in the source/target tree (namely, in inference rules), recursive dir traversal has to be delegated to shell commands (e.g. in a wrapper script, or some "driver" rule) instead. (While NMAKE can execute shell commands during preproc. time to gather files or subdirs recursively anywhere, the problem is that the results of those commands can't be assigned to macros. Also, !INCLUDE files could be created on-the-fly during preprocessing time, but only very simple ones, as the hostile "programming environment" of the !IF [...] directive combined with the CMD command line it provides makes a failed Mars mission feel like a dream holiday in comparison.)

* To avoid having to name the target modules one by one, especially across multiple directories, the only sane way to enumerate them (with most make tools) is

  1. to assume a 1-to-1 relationship between the (relative) paths of certain source file types (i.e. the C/C++ translation units) and their corresponding targets (i.e. object files),
  2. and map those source names to their matching target names to allow compiling -- via inference rules -- implicitly, identifying them by their types and locations, not their names.

  However, due to two other inabilities of NMAKE, i.e. that a) it can't do pattern-matching for paths in inference rules, and b) it can only match wildcards on dependency lines -- which is too late for using the results as target lists in other rules (note: not even its false-hope-inducing EXIST() function can deal with wildcards) --, a recursion trick is used, as described below (with its core idea having been borrowed from https://stackoverflow.com/a/65223598/1479945).

  (Note: GNU make does support path patterns in inference rules, and also has macro facilities to manipulate paths with wildcards during preproc. time, *and* lets you set the results as macro values, all of which make processing filesystem trees practically effortless there.)

  With NMAKE, where wildcard-expansion results can only propagate from dependency lines downward to command blocks, there's one more "secret" place, however, where those wildcard-matching results can still make their way to rule definitions (as targets): the make command line itself. So, we can call NMAKE again, now with the expanded target list, and just let it then apply the matching rules to build them. Well, almost...

  Because of its crippled path matching in inference rules, we also have to pass it the current dir path explicitly, so it can use that in those rules. This also means that we must process the source tree dir-by-dir; we can't just grab all the sources, `patsubst` them to their corresponding target name pairs, and pass the whole list to a new make subprocess in one go. Alas, every dir will need a new child NMAKE process. (But at least the dir traversal itself is not recursive...)

  (Of course, with the abomination NMAKE is, even this will entail a lot of convoluted workarounds, so this is easier said than done -- but at least it _can_ be done, somehow, eventually... What's still a long way away, though, is header auto-dependency tracking. GCC is, again, way ahead of MSVC in this regard, unfortunately. I.e. the `CL -showIncludes` option is an insulting joke.)
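The core of the recursion trick described above can be sketched like this (a sketch only: the macro/target names are illustrative, and objects are compiled next to the sources for simplicity, while the real Makefile maps them into a separate target tree):

```makefile
# Driver rule (parent process): the wildcard gets expanded on the dependency
# line, and the results are forwarded to a child NMAKE on its command line --
# the only other place where they can reach rule definitions (as targets).
objs: $(DIR)\*.cpp
	@$(MAKE) /nologo RECURSED_FOR_COMPILING=1 DIR=$(DIR) $(**:.cpp=.obj)

!IFDEF RECURSED_FOR_COMPILING
# Child process: a per-dir (batch-mode) inference rule; the dir is passed in
# explicitly, since NMAKE can't pattern-match paths in inference rules.
{$(DIR)}.cpp{$(DIR)}.obj::
	$(CC) /nologo /c /Fo$(DIR)\ $<
!ENDIF
```

The `$(**:.cpp=.obj)` substitution is what maps the expanded source names to the target names the child NMAKE gets asked to build.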
The usual `$(MY_BASE_DIR)/$(SUBDIR)` -> `/subdir` errors with an empty base. The (also usual) hack of replacing empty dirs with `.` just to avoid this is a) a weak, brittle protection (forget it just once!...), b) results in ugly paths, and c) can be avoided, too. The cost is that avoiding it requires even more cumbersome path-construction macro clutter, so it's questionable whether it's worth it at all. Especially when it involves deleting stuff...

`$(patsubst /%,%,$(path))` could be a way to tackle this.
But how can this be made sensible?! Any leftover files from an (incompatible) alternative would make it just fail to link, in the best case, or even fail silently later! A warning about the impending danger of mixing incompatible (object) code is probably not enough -- this would still be just stupid... The real use case here is to overwrite the top-level targets (only)! -- But even then: if the previous build was a different alternative, and more up-to-date than the sources... -> #29 (Autodetect build option changes to trigger full rebuild) -- which is still kinda ham-fisted for this, i.e. for the implied use case of "switch to DEBUG exe" etc...
An old doc. specifically said it will not (esp. when there's no command). However, the presence of this dependency line blocked (re)compiling `main.obj` if `$(HASH_INCLUDE_FILE)` didn't exist! :-/

```
$(obj_dir)\main.obj:: $(HASH_INCLUDE_FILE)

$(HASH_INCLUDE_FILE):
	$(ECHO) " Make last commit ID available for #including..."
	$(BB) sh tooling/git/_create_commit_hash_include_file.sh
```

Now, with a deleted `main.obj`, the build proc. went straight to linking, which failed... (Adding an `echo` to that dep. rule confirmed that it had indeed been picked up.)
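A possible arrangement to sidestep this (a sketch only, not verified against the quirk above; `build` and `compile_and_link` are hypothetical target names, not from the actual Makefile) is to force the include file into existence from a driver rule, so it already exists by the time the per-object dependency lines get evaluated:

```makefile
# Generate the commit-hash include up front, before anything could depend
# on it (e.g. before recursing into the compile stage):
build: $(HASH_INCLUDE_FILE) compile_and_link

$(HASH_INCLUDE_FILE):
	$(ECHO) " Make last commit ID available for #including..."
	$(BB) sh tooling/git/_create_commit_hash_include_file.sh
```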
Should be able to build the lib by just telling it that it's a lib, and setting the inc. path.
Wow, the entire Makefile gets parsed twice, for some reason! :-o
This is to ease updating the Makefile in client projects -- which is still a manual process of copy-pasting portions of the Makefile! (-> also: #33)
Adding them by hand is pretty much trivial, but `clean` would know nothing about them, and leave them around.
(Doing the same for extra libs is far less of a pressing issue.)
Something like renaming `finish: ...` and delegating it to the prj layout config, so e.g. sfw would have it like `finish: $(main_lib) $(main_exe) $(test_exe)`. Well, OK, using the target directly in the cfg. section is impossible, as all the macros that are typically used in those names would then get prematurely expanded (with incorrect/empty values)! So, another macro would be needed for that, to list the targets separately... :-/
`main_lib`/`main_exe` can just be left unset (commented out); there's the `build = ...` macro now.

clean:

```
!set objlist=!objlist! $(obj_dir)!_o_:.cpp=.obj!
```
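A minimal sketch of how such a list can be accumulated in CMD (hypothetical `src\`/`obj\` paths; assumes delayed expansion, which the `!var!` syntax above also relies on):

```bat
@echo off
setlocal EnableDelayedExpansion
rem Accumulate the target list for one dir, mapping each .cpp source
rem name to its .obj counterpart via delayed-expansion substitution:
set objlist=
for %%f in (src\*.cpp) do (
    set _o_=%%~nxf
    set objlist=!objlist! obj\!_o_:.cpp=.obj!
)
echo Targets:!objlist!
```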
UPDATE: Gradually getting there, though. v0.09 can build Thor in a dir named like `SFML-master (pre3)`. (Note that those parentheses might also spook out CMD et al.! Exclamation marks certainly do!)
It seems pretty robust against `\` and `/` at least:

```
all: sub\1.obj sub/2.obj

{sub/}.c{sub\}.obj::
	echo do $<
```

and also:

```
all: sub\1.obj sub/2.obj
	nmake $**

{sub/}.c{sub\}.obj::
	echo do $<
```
Would be more streamlined process-wise, but lose the sexy 1-file drop-in property (enabled by inline files, which are not available outside of rules/commands (AFAIK)). :-/
I guess it's because the .ifc is probably not listed as a dependency anywhere, so there's nothing triggering that inf. rule...
If the NMAKE path in `$(MAKE)` contains spaces, this command failed, _until I removed the quotes from the supposedly unrelated `DIR=` clause_!... :-o

```
rem ... !_make_! ... "DIR=!_dir_!" would FAIL if the NMAKE path has spaces! :-ooo
if exist %%i\*.%%x !_make_! /c compiling DIR=!_dir_! SRC_EXT_=%%x $(custom_build_options) || if errorlevel 1 exit -1
```

-> xparq/Out_of_Nothing#194, #5!

-> Also, I've seen it locally on a fresh Win10 test OON setup: the problem there was double-quoting `$(MAKE)`! (Fixed only there yet!) It seems NMAKE does it already for paths like `"C:\Program Files"`!
`/V` for inheriting macros, but (my current) v14 doesn't have that option any more. (But that proved to be a red herring: it meant inheriting all macros, in addition to those defined on the cmdline! Still very interesting why they abandoned that feature; I guess it may have introduced issues regarding what should override what.)

OK, and these are the (new) defaults:
```
# Output dir/file suffixes for build alternatives
# NOTE: Must differ from their "no ..." counterparts to avoid code (linking)
# mismatches!
buildmode_crtdll_dir_suffix=.crtdll
buildmode_crtdll_file_suffix=-crtdll
buildmode_debug_dir_suffix=.DEBUG
buildmode_debug_file_suffix=-d
```
Ideally, suffixes like `/crtdll`, `/debug` should also Just Work (i.e. `out.crtdll` -> `out/.crtdll`), but unfortunately the MSVC toolchain would stubbornly refuse to implicitly create any intermediate directories... :-/
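One way to work around that (a sketch, not from the actual Makefile; the `.dir-created` stamp file is a hypothetical name) is to pre-create the per-alternative output dir explicitly, since CMD's `mkdir` (with command extensions enabled, the default) does create intermediate dirs:

```makefile
# Pre-create the output dir tree, as CL itself won't create missing
# intermediate directories; the stamp file makes it a one-time step:
$(obj_dir)\.dir-created:
	@if not exist "$(obj_dir)" mkdir "$(obj_dir)"
	@echo.> "$(obj_dir)\.dir-created"
```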
```
objs: $(src_dir)/$(units_pattern).$(SRC_EXT_)
# Amazingly, using backslashes (instead of /) in the `patsubst` below would kill the inf. rule matching! :-o
# (Despite also changing it to backslash above, in the rule, of course.
# And despite inf. rules being quite robust against / vs \ otherwise.
# What am I missing here, on this late night hour?...)
	@$(MAKE_CMD) RECURSED_FOR_COMPILING=1 DIR=$(DIR) $(custom_build_options)\
		$(patsubst $(src_dir)/%,$(obj_dir)/%, $(subst .$(SRC_EXT_),.obj,$**))
```
```
objs: $(src_dir)\$(units_pattern).$(SRC_EXT_)
	@$(MAKE_CMD) RECURSED_FOR_COMPILING=1 DIR=$(DIR) $(custom_build_options)\
		$(patsubst $(src_dir)\\%,$(obj_dir)\\%, $(subst .$(SRC_EXT_),.obj,$**))
```
Not all are mandatory actually, but specifying which ones are would be even more problematic!
Matching false positives like `local.cfg` or `broken.cpp.off` etc.
The macro-def. precedence rules don't say that "take precedence" doesn't just mean the order they will be initialized, but also that they can only be set where they had been initialized! :-o (Actually, GNU make does it similarly, except they do provide the `override` keyword to control this!) So, this will be either able or unable to fix up inputs depending on where their initial (e.g. default) value comes from! :-o
```
!if "$(out_dir)" == "" || "$(out_dir)" == "\"
!message out_dir is empty, fixing up!...
out_dir = .$(out_dir)
!message out_dir = $(out_dir)
!endif
```

(E.g. `out_dir` will always remain empty with `nmake out_dir=`!...)
OK, well, even though capitalizing the cfg. macro names (fortunately NMAKE, too (like GNU make), is case-sensitive) + the normalizing step from `/` to `\` provides a perfect reconciliation point, bridging the "externs" into "locals"... it needs to be documented that they are still impossible to set on the command line, because the recursive NMAKE calls will lose them anyway! :-(

-> Well, added the comment: `# Note: the cfg. options can't be set on the NMAKE command line, only here!`

Well, now that support for custom build options is materializing (as of r12), this is even less of a concern.
```
Creating executable: out/example.exe...
link out/obj/main.obj out/example.lib -out:out/example.exe
Microsoft (R) Incremental Linker Version 14.35.32215.0
Copyright (C) Microsoft Corporation. All rights reserved.

LINK : fatal error LNK1104: cannot open file 'LIBCMT.lib'
```

When running the exact same (copy-pasted) `link` command manually, it just works fine.
Since lib creation is cheap (fast) and relatively simple, whereas the actual dep. checking with the current multi-stage tree-traversal (thanks, NMAKE) would be cumbersome and daunting, and, well, I haven't got 'round to it, currently it just does a blatant lib reassembly on every run...
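That blatant reassembly boils down to something like this (a sketch with illustrative names -- `mklib`, `objs`, `main_lib` -- not necessarily those of the actual Makefile):

```makefile
# Just recreate the lib from all the objects on every run, with no
# dependency checking -- lib creation is cheap, real dep. checking isn't.
# (mklib is a pseudotarget: no such file exists, so it always runs.)
mklib: $(objs)
	lib /nologo /out:$(main_lib) $(objs)
```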
```
#!!Would fail with fatal error U1037 dunno how to make *.ixx, if they don't happen to exist!
#!!objs:: $(src_dir)/$(units_pattern).ixx
#!!	@$(MAKE) -nologo RECURSED_FOR_COMPILING=1 DIR=$(DIR) $(patsubst $(src_dir)/%,$(obj_dir)/%,$(**:.ixx=.ifc))
```
To avoid the issue when a given source type -- used as a wildcard dependency -- doesn't exist in the given source dir, this hack won't work, because these rules would catch all the existing cases, too:

```
$(src_dir)/$(units_pattern).cpp:
#!! Well, no `%|eF` either: it "requires a dependent"! ;)
	@echo Bu! Silencing the "Dunno how to make *.cpp" error, when no such file exists.

$(src_dir)/$(units_pattern).cxx:
	@echo Bu! Silencing the "Dunno how to make *.cxx" error, when no such file exists.

$(src_dir)/$(units_pattern).c:
	@echo Bu! Silencing the "Dunno how to make *.c" error, when no such file exists.

$(src_dir)/$(units_pattern).ixx:
	@echo Bu! Silencing the "Dunno how to make *.ixx" error, when no such file exists.
```
To mitigate the stupid "can't build unmatched wildcard" error, the combined `objs: c*` rule can (coincidentally...) cover in one rule the cases where at least some .c* source exists in the given dir... At least (I guess) the .cpp inference rule did include it for compilation, for some reason!
-> #45!
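That combined rule might look like this (a sketch, reusing the macro names seen above):

```makefile
# One wildcard dependency covering .c, .cpp, .cxx etc. at once -- it
# matches as long as at least one .c* source exists in the given dir,
# so the "Dunno how to make" error only hits when there are none at all:
objs: $(src_dir)\$(units_pattern).c*
```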
It goes in 2 stages after deleting an .ifc: `Storage.obj` is listed in the link deps! :-o Anyway, I guess obj. files should depend on .ifcs (like on headers?), which is not yet represented. (I.e. start with an initial clean.)
The main use case is automating/simplifying the integration of libs into other (lib or app) projects.
I know right?... Wrong!
Given an existing `file.msvc`:

```
if exist *.msvc echo ok   -> ok
if exist *.ms echo ok     -> nothing
if exist *.msv echo ok    -> ok
```

(Presumably the 8.3 short names are matched too, where the extension gets truncated to 3 characters -- hence the `*.msv` hit.)
Like a proper install/uninstall cycle. So, one more reason to do no blanket `rm -rf`!
One reason is that the output dirs (like `obj_dir`) may be shared by a completely different build system; e.g. in my sfw project, the GCC build reuses the same dirs!
FWIW (also -> #21!):

```
nmake /f nonexisting || echo %ERRORLEVEL%   -> 0
dir nonexisting || echo %ERRORLEVEL%        -> 0
nonexisting || echo %ERRORLEVEL%            -> 0...
```

I can't ever get a non-0 with this idiom on the CMD command line. (Haven't tried in a batch file now, but it would be (kinda-sorta, "historic'ly") interesting to find out what the hell...)

This form still works (in the same context), though: `nmake /f nonexisting || if errorlevel 1 echo FAILED!` Still can't capture & forward the original exit code of the failed process. But at least this form also works with `if`: `if something nmake /f nonexisting || if errorlevel 1 echo FAILED!`
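The 0 is very likely because `%ERRORLEVEL%` is expanded when the whole command line is parsed, i.e. before the command on its left has even run, so it still shows the previous value; `if errorlevel 1`, by contrast, is evaluated dynamically, which is why that form works. With delayed expansion the live value can be captured too (a sketch, not from the Makefile):

```bat
rem !ERRORLEVEL! is re-read at execution time (delayed expansion),
rem so this echoes the actual exit code of the failed nmake run:
cmd /v:on /c "nmake /f nonexisting || echo !ERRORLEVEL!"
```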
This is to ease updating the Makefile in client projects -- which is still a manual process of copy-pasting portions of the Makefile! (-> also: #12)
OK, for custom build options there's the `custom_build_options` macro now, which, if set as documented (in its comment), will propagate the custom macros to the recursed NMAKE processes.