Quote:
the published code is what the community can review, not the binaries
The former leads inevitably to the latter. In the case of Linux users who download a source code package of one sort or another, the application is compiled locally, and it is not possible to compile a set of source files and have the result be anything other than the binaries derived from that source code.
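For anyone who hasn't done it, the classic build-from-source sequence looks something like this (the package name is just an example):

Code:
tar xzf example-1.0.tar.gz    # unpack the source tarball
cd example-1.0
./configure                   # inspect the system and generate a Makefile
make                          # compile the source into binaries
make install                  # install the binaries (usually run as root)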

The big distro makers pre-compile source packages into installable binaries, e.g. RPMs for the Red Hat-derived distros, DEBs for the Debian-derived ones, etc. This effectively separates the binaries from the compilation process that produced them, so a higher degree of trust is needed on the part of the end user. Most distros demonstrate their trustworthiness by digitally signing their binary packages with GPG or some other public-key scheme, making it easy to determine whether a binary package has been tampered with.
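Checking those signatures is straightforward. A quick sketch, assuming a Red Hat-style system (the key path and package name here are illustrative):

Code:
# import the distro's public key, then verify an RPM's signature and digests
rpm --import /etc/pki/rpm-gpg/RPM-GPG-KEY-example
rpm --checksig some-package.rpm

# on Debian-derived systems, APT checks the repository's signed Release
# file for you; a tampered repository shows up as a BADSIG error during:
apt-get update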

A relatively small number of entities, such as Adobe (Flash, Adobe AIR), CyberLink (PowerDVD for Linux) and some others I can't think of as I type this, only make binary versions of their software available. They are effectively saying to their end users, "We refuse to show you any evidence that this software is benign in terms of the security of your system and/or data. You'll just have to trust us".

Finally, the security model in Linux is diametrically opposed to that found in many versions of other widely used operating systems. The Linux way is that the default user account is always non-administrative, making accidental or deliberate tampering at system level more difficult. The other (OK, I'll say it, the Windows) way is that users by default have free rein over the majority of the operating system. It is this fundamental difference in approach which makes Windows-based malware relatively easy to write. The greater deployment footprint of Windows compared to Linux or MacOS ensures that malware can spread more easily too.
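You can see that model in action from any ordinary shell (the output below is illustrative):

Code:
$ id -u                        # a non-administrative account, not root (uid 0)
1000
$ touch /usr/bin/evil          # attempts to tamper with system files fail
touch: cannot touch '/usr/bin/evil': Permission denied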

I have spent almost 10 years deploying and maintaining Linux in ISP data centres for both infrastructure and managed/colocated hosting purposes. In my experience, the usual chain of events is that malware gets onto a server as source code, is compiled locally, exploits a vulnerability elsewhere in the operating system or the packages provided with it to gain root access, and then begins to do its dirty work. Particularly for web servers, having /tmp as a file system on its own partition, mounted with the noexec, nodev and nosuid flags set, and changing the permissions on the gcc binary to make it executable only by root, will greatly reduce your exposure to most of the more common Linux exploits currently out there.
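For reference, something along these lines will do it (the device name and paths will vary from system to system):

Code:
# /etc/fstab -- /tmp on its own partition, mounted with restrictive flags
/dev/sda7  /tmp  ext3  defaults,noexec,nodev,nosuid  0  2

# apply the new flags without a reboot
mount -o remount /tmp

# make the compiler executable by root only
chmod 700 /usr/bin/gcc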
__________________
Phil Edwards
Brighton, UK
 
