Derek Neighbors wrote:
> Well, you could choose flags that cause issues. For example, on the Alpha
> processor line it is common to use a flag that changes the way floating
> point works. This is often done because if you don't, the application
> won't compile. So you are forced either to fix the application's source
> code so it will compile, or to use this flag so you can get a binary.
> However, in doing so you have made the application more susceptible to
> buffer overflows, etc.
>
> Another example would be that enabling feature X at compile time in a
> program might be "handy" but extremely insecure. The package maintainers
> know this, so they don't compile with that option; but you as an individual
> may not know that, so you compile with that option for convenience, never
> knowing that you are opening a large security hole in the process.
>
> Note that compiling your own stuff doesn't mean you are insecure or asking
> for disaster. You could argue the other side of the coin and say the danger
> in dealing with binaries is that you can never trust what the packager
> really put in that binary (and you would be correct). I still think that
> compiling a production system seems errant from a risk analysis
> perspective. Maybe I'm getting too conservative in my old age. ;)

All of this is very true. It didn't occur to me, however, since my motivation
was to be _more_ restrictive when I recompile. Since security was my goal, I
hadn't considered compiling in _less_ secure options ... but this is of
course a risk.

But it also assumes that the package maintainer for the binary package has
taken these things into account. Compiling from source seems to give some
people a warm fuzzy, but I am happy enough just knowing that the sources are
available for the binary package I install ... since I have neither the time
nor the expertise to do a code audit anyway.

Austin