On 3/7/2017 5:55 AM, Niall Douglas via Boost wrote:
Given that preprocessor checks are often used for compiler workarounds, and pcpp is not a full C++ frontend, one would have to make sure pcpp defines the same set of predefined macros the compiler does. In the particular case of MSVC, that would make libraries like Boost.PP and Boost.VMD treat pcpp the same way they treat MSVC, which is probably suboptimal, if functional at all. I suspect that, for such a tandem to be workable, pcpp would have to define its own predefined macros, and PP and VMD would have to test for them before testing other compiler-specific macros.
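To make that suggestion concrete, here is a minimal sketch. It assumes pcpp's basic Python API (Preprocessor, define(), parse(), write(); the exact names may differ), and __PCPP__ is a hypothetical self-identification macro, not something pcpp is known to define. The guard simply tests for the pre-preprocessor before any compiler-specific macro:

    import io
    from pcpp import Preprocessor   # assumed import; the exact API may differ

    # Hypothetical: pcpp announces itself via its own predefined macro so that
    # Boost.PP / Boost.VMD can test for the pre-preprocessor before testing
    # compiler-specific macros such as _MSC_VER.
    source = r"""
    #if defined(__PCPP__)
    #define WORKAROUND "pre-preprocessor"
    #elif defined(_MSC_VER)
    #define WORKAROUND "msvc"
    #else
    #define WORKAROUND "generic"
    #endif
    WORKAROUND
    """

    p = Preprocessor()
    p.define("__PCPP__ 1")          # stand-in for a genuine predefined macro
    p.parse(source)
    out = io.StringIO()
    p.write(out)
    print(out.getvalue())           # WORKAROUND expands via the __PCPP__ branch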
You've struck exactly at the main use case for pcpp, and why I didn't start from Wave or an existing preprocessor but instead reinvented the wheel, in Python rather than C. I am specifically implementing a "partially executing" pre-preprocessor which can be programmatically and dynamically instructed to transform a set of input files containing preprocessing commands into other files, with some of those commands executed, expanded, or replaced, and some passed through.
pcpp acts as a straight preprocessor by default, but it can also be told to pass through #if logic it cannot fully execute due to unknowns, or to execute that logic only partially. It can be told to pass through #define and #undef yet also execute them (or not) on a per-macro basis. It would be easy enough, for example, to tell it not to execute any preprocessing commands except #include and include guards.
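As a rough illustration of that partial execution, assuming the same basic pcpp API as above (the pass-through options are deliberately not named here, since those may differ between versions): macros pcpp has been told about are evaluated, while everything else can either be treated as undefined, as an ordinary preprocessor would, or passed through untouched for the real compiler.

    import io
    from pcpp import Preprocessor   # assumed import; the exact API may differ

    # KNOWN_FEATURE is a macro we have told pcpp about; _MSC_VER is not.
    source = r"""
    #if KNOWN_FEATURE
    int known_path(void);
    #endif
    #if defined(_MSC_VER)
    int msvc_only_path(void);
    #endif
    """

    p = Preprocessor()
    p.define("KNOWN_FEATURE 1")     # this #if can be fully executed and removed
    p.parse(source)
    out = io.StringIO()
    p.write(out)
    print(out.getvalue())           # known_path() survives; the _MSC_VER block is
                                    # dropped because _MSC_VER counts as undefined

    # A pass-through mode would instead emit the unevaluable
    # #if defined(_MSC_VER) block verbatim for the real compiler to decide;
    # the option names are omitted here because they may differ by version.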
I'm sure you can see the big benefit of pregenerating canned preprocessed files so that #including a Boost library is much faster than before, because most of the preprocessing work has already been done. Right now, generating those files is tedious work involving hacky scripts run over source files with splicing metadata injected via comments and the like, all of which is brittle. pcpp will allow a FAR more robust solution which can safely be left to a CI to run per commit if desired.
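A sketch of what that pregeneration step could look like, again assuming the same basic pcpp API; the header name and include path below are placeholders rather than a real Boost layout:

    from pcpp import Preprocessor   # assumed import; the exact API may differ

    # Flatten a library's umbrella header into one "canned" preprocessed file
    # so that client code only pays for a single #include.  The header name
    # and include path are placeholders, not a real Boost layout.
    p = Preprocessor()
    p.add_path("include")           # where the library's headers live
    with open("include/mylib/all.hpp") as f:
        p.parse(f.read(), source="mylib/all.hpp")
    with open("mylib_preprocessed.hpp", "w") as f:
        p.write(f)

    # Left to a CI job to run per commit, this replaces the hacky
    # comment-spliced scripts with a single reproducible step.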
pcpp is basically Facebook's Warp (https://github.com/facebookarchive/warp) but far more flexible and useful (no offence intended to Warp's developers, but Warp isn't very useful outside a very limited use case).
My ideal end goal is for a Boost library's download page to provide a set of tick boxes and drop-down menus that let a user pre-preprocess the library into a custom "drop in and go" single-file edition for their particular use case, just as you can with, say, jQuery downloads. Again, choosing Python instead of C makes that safe and secure. I don't know whether I'll find the time to get there, but I'm an awful lot closer now than I was a month ago.
The practical problem with this is that source files with preprocessor directives often depend on the compiler being used, with its predefined macros, to generate the correct output.
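A short sketch of that dependence: the same input yields different output depending on which compiler's predefined macros the tool has been primed with (the _MSC_VER value below is only an example, and the same basic pcpp API is assumed as above):

    import io
    from pcpp import Preprocessor   # assumed import; the exact API may differ

    source = r"""
    #if defined(_MSC_VER)
    typedef __int64 int64_type;
    #else
    typedef long long int64_type;
    #endif
    """

    def preprocess(predefines):
        # Prime pcpp with a given compiler's predefined macros, then run it.
        p = Preprocessor()
        for d in predefines:
            p.define(d)
        p.parse(source)
        out = io.StringIO()
        p.write(out)
        return out.getvalue()

    print(preprocess([]))                  # no compiler macros: non-MSVC branch
    print(preprocess(["_MSC_VER 1900"]))   # emulating MSVC: the __int64 branch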
snip...
Niall