On Tue, Aug 14, 2007 at 09:35:49PM +0000, gast128 wrote:
> Jens Seidel writes:
> > On Tue, Aug 14, 2007 at 07:01:34PM +0000, gast128 wrote:
> > Especially as header files are included in all kinds of user programs, they need to be clean!
> I agree that they should be clean, but if they are not, what do you do?

Report it (best with a patch), as I did recently for SDL Pango:
http://bugs.debian.org/cgi-bin/pkgreport.cgi?ordering=normal;archive=0;dist=...

> Waiting until they are clean?

It depends. If there are real errors in it, I suggest working around them if
possible, to stay compatible (at least for some time). If a workaround is not
possible, change the header locally and hope that upstream reacts soon.

> Adapt the headers yourself, with the previously mentioned drawbacks? Or just
> treat these headers as foreign and lower the warning level, or disable
> warnings ONLY for those foreign headers?
As long as only warnings are affected I would take the obvious step: switch to
the development version (source code repository), where the problem should
already be fixed if the project is still alive (you sent the patch, remember?
:-), and otherwise just turn warnings off, as you suggested. But if such
warnings occur only rarely (because you do not change that particular piece of
your code often and do not need to recompile it), I would simply ignore them
(once they are reported, of course).
> > This is the way Open Source
> We use Boost as open source.
And I really think this is a very good idea :-) To be honest, I have only used
Boost.Test and Boost.Log (the latter has not yet become official). At least for
the second I have indeed already contacted the author. (And in Boost.Test I
never found any errors except the current one, which is the reason I subscribed
a short while ago.) It is not good that I always complain so much. On Thursday
I will get the current Boost code and send patches (for -Wall, and maybe later
also for -Wextra). Promised! Currently I have to fix a few errors in
hex-a-hop, a fun game made free only a few days ago (try it out!).
> > I often try as many compilers as I have access to, including experimental
> > ones ...
> > I have to confess that I do not understand why some people create
> > non-portable code ... There exist so many cross-platform libraries.
> Well, to make things even worse, I do create non-portable code, and detect
> this only when we go from one Visual Studio version to the next. We tried
> Visual Studio 2005; it took me a week to get rid of most errors (e.g. they
> had removed the non-conforming pow(int) overload). If I apply gcc it will
> cost me a month to get more conforming code, without gaining direct
> commercial value.
Exactly; that is why I suggest you test it with gcc as soon as possible. If
you learn to avoid writing non-portable code, you will profit in the future,
right? (But don't worry, I have made many such errors as well, it's normal :-))
Again: it costs you nothing (except time). At least look out for the major
pitfalls.
> My employer will not pay for that.
I often do such stuff in my spare time. Since I work with/on Open Source, it
is also a lot of fun!
> And then I have not even mentioned all the non-conforming compilers.
Yep. In the past I knew many compilers shipped with Unix workstations that
were not conforming. Since these compilers are often installed by default, I
support working around their issues as long as it makes sense. But I do not
understand people who install non-conforming proprietary compilers on systems
that do not ship a default one (I know of only one such system) (here we are
again at the start of our discussion :-).
> [...] appreciate the work invested in all the (non-trivial) workarounds. The
> code becomes quite unreadable because of those macros and #ifdefs, but then
> again there is no alternative, except sending the builders of non-conforming
> compilers to the galleys...
Right :-(

Jens