Are You Sure This Is the Source Code?
oever writes "Software freedom is an interesting concept, but being able to study the source code is useless unless you are certain that the binary you are running corresponds to the alleged source code. It should be possible to recreate the exact binary from the source code. A simple analysis shows that this is very hard in practice, severely limiting the whole point of running free software."
Bogus argument (Score:5, Insightful)
Re:Bogus argument (Score:5, Informative)
The guy who submitted that article is the person who wrote it. Awesome "work", editors.
Re:Bogus argument (Score:5, Insightful)
But to his credit, he did say a "simple analysis", although on reading TFA it appears he omitted the word "minded" from the middle of that phrase.
Virtually all of his findings trace back to differences in date and time stamps, chosen compiler settings, and compiler vintage.
Unless he can find large blocks of inserted code (not merely data-segment differences), he is complaining about nothing.
He is certainly free to compile all of his system from source, and that way he can be assured he is running exactly what the source says. But unless and until he reads AND UNDERSTANDS every line of the source, he is always going to have to trust somebody somewhere.
It's pretty easy to hide obfuscated functionality in a mountain of code (in fact, it seems far too many programmers pride themselves on their obfuscation skills). I would worry more about the mountain he missed while staring at the molehill his compile environment induced.
Re:Bogus argument (Score:5, Informative)
There are very talented people that can hide things in only a few lines of code. See http://ioccc.org/ [ioccc.org] for some examples that will make your skin crawl.
Re:Bogus argument (Score:5, Informative)
For true malice there's also The Underhanded C Contest [xcott.com].
From their home page: "The goal of the contest is to write code that is as readable, clear, innocent and straightforward as possible, and yet it must fail to perform at its apparent function. To be more specific, it should do something subtly evil."
Re: (Score:3)
Oh man. I sort of feel sorry for the runner-up of the 2009 contest. From the evaluation: "The bug is plausibly deniable as poor coding, and rests on your caffeine-addled inability to notice a ‘0’ instead of a ‘\0’ when testing for end-of-string. The comparison in safe_strcmp has unnecessary terms, which achieves two evil goals: first, it sets up a pattern that fools your eyes, and second, it looks just amateurish enough that the bug, if found, looks like a sophomoric mistake rather th..."
Re: (Score:3)
Not the GCC, but Ken Thompson's original C compiler. And it was Thompson who fessed up. The other thing that compiler (allegedly) did was insert a back door any time it compiled the "login" program.
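For readers who haven't seen it: the shape of the attack, as a toy sh sketch (not Thompson's actual code; the backdoor-splicing step is only hinted at by the echo):

    #!/bin/sh
    # Toy "trojaned compiler" wrapper: recognize one target program and
    # quietly alter it before handing off to the real compiler. Thompson's
    # version also recognized the compiler's own source, so the trap
    # reinstalled itself even when the compiler was rebuilt from clean source.
    case "$*" in
      *login.c*) echo "[would splice a magic-password backdoor in here]" >&2 ;;
    esac
    exec /usr/bin/cc "$@"

Nothing in login.c, and nothing in the compiler's published source, would show any of this.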
Re:Bogus argument (Score:4, Insightful)
There are very talented people that can hide things in only a few lines of code. See http://ioccc.org/ [ioccc.org] for some examples that will make your skin crawl.
True, but any programmer who works in a professional way should document their code so that it is maintainable. Programmers who think their code should be hard to read, because that is a good way of keeping their job, eventually come down to earth with a thud when their manager tells them, "The door is over there; please watch your fingers on the way out." Hard-to-read code is usually thrown out and a fresh start made, since that is sometimes so much quicker, especially if the system designer (not the programmer) has documented the concept properly. On a more serious note, companies that don't have a well-documented overview design and code are asking for trouble down the line.
Re:Bogus argument (Score:5, Informative)
Yeah. Unfortunately, the issues he presents here DO make it more difficult to prove that someone is providing a binary that could NOT have possibly originated from the provided source code.
As an example, the kernel source initially released for the Samsung GT-N8013 (USA Wifi Note 10.1) was not what was used to build the binaries in question.
The "difficult to prove but obvious" - Any kernel built from the provided source had a massively broken wifi driver that would completely stop functioning, usually within 5-10 minutes, requiring the module to be removed and reinserted. Pulling the wifi module source from a different Samsung tarball (such as a GT-I9300 release) would result in a working driver. But how do you prove the source provided is correct?
In the case of the N8013, we were lucky - Samsung changed a bunch of debug printk()s slightly in their released binary. Small stuff, not functionally relevant, such as typo fixes and capitalization differences in their touchscreen driver's debug printk()s - but at least provable to be different.
So we could prove that the kernels didn't match, but couldn't necessarily prove that the biggest functional problem was due to a source difference.
We asked Samsung to provide source that corresponded to the UEALGB build for that device, and their response was, "That build is a leak and hence we are not obligated to provide source for it." Effectively admitting that the provided source was not meeting the requirements imposed by the GPL for that build, and then claiming that the software build preinstalled on every device sold in the USA for the first 1-2 months after launch was a "leak" and thus they didn't have to provide source for it.
Needless to say, between that and other situations, that was my last Samsung device.
Re:Bogus argument (Score:5, Informative)
But unless and until he reads AND UNDERSTANDS every line of the source, he is always going to have to trust somebody somewhere.
Even if he reads and understands every line of the source, he's still trusting someone. He has to read and understand every line of the source code of the compiler he is using, and the compiler that compiled that compiler, and so on.
Reflections on Trusting Trust [bell-labs.com] is almost 30 years old now. It should be well known.
Re:Bogus argument (Score:4, Funny)
I wonder if next week I could get a story published that says, "I don't know if Microsoft is spying on you through your webcam. So it could be true."
Re:Bogus argument (Score:5, Insightful)
Re: (Score:3, Insightful)
To borrow from The Watchmen:
Who compiles the compiler?
Re:Bogus argument (Score:5, Informative)
To borrow from The Watchmen:
Who compiles the compiler?
Your attribution isn't just a little off, it's way off.
Try Iuvenalis (Juvenal), around 100 AD.
Re:Bogus argument (Score:5, Funny)
Re:Bogus argument (Score:5, Informative)
Who compiles the compiler?
I guess it's time to introduce another generation to the devious genius of Ken Thompson [bell-labs.com].
You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.)
Re: (Score:3)
Who compiles the compiler?
You hand-translate it to machine code to create the first binary. Then you apply it to itself. It's a time-proven technique.
Re: (Score:3)
Who compiles the compiler?
You hand-translate it to machine code to create the first binary. Then you apply it to itself. It's a time-proven technique.
Yeah, you could do that.
On the other hand, the early Unix compilers were actually translators that converted C to assembler. You could simply check the assembly code to ensure no surprises, then scan the object code to make sure it matched the assembly source. I spent enough time reading core dumps in my misspent youth that I could disassemble object modules in my head.
More recent compilers generate a generic assembler, which they then reduce, optimize, and generate code from, but the same tactics can be used.
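A sketch of that workflow with a modern toolchain (file names are examples):

    cc -S tiny.c               # translate C to assembly only, producing tiny.s
    less tiny.s                # eyeball the assembly for surprises
    cc -c tiny.s -o tiny.o     # assemble exactly the code you just read
    objdump -d tiny.o          # disassemble and compare against tiny.s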
Re:Bogus argument (Score:5, Insightful)
If you're worried about the lineage of a binary then you need to be able to build it yourself, or at least have it built by a trusted source... if you can't, then either there IS a problem with the source code you have, or you need to decide if the possible risk is worth the effort. If you can't get and review (or even rewrite) all the libraries and dependencies, then those components are always going to be black-boxes. Everyone has to decide if that's worth the risk or cost, and we could all benefit from an increase in transparency and a reduction in that risk -- I think that was the poster's original point.
The real problem is that there's quite a bit of recursion... can you trust the binaries even if you compiled them, if you used a compiler that came from binary (or Microsoft)? Very few people are going to have access to the complete ground-up builds required to be fully clean... you'd have to hand-write assembly "compilers" to build up tools until you get truly useful compilers then build all your software from that, using sources you can audit. Even then, you need to ensure firmware and hardware are "trusted" in some way, and unless you're actually producing hardware, none of these are likely options.
You COULD write a reverse compiler that's aware of the logic of the base compiler and ensure your code is written in such a way that you can compile it, then reverse it, and get something comparable in and out, but the headache there would be enormous. And there are so many other ways to earn trust or force compliance -- network and data guards, backups, cross validation, double-entry or a myriad of other things depending on your needs.
It's a balance between paranoia and trust, or risk and reward. Given the number of people using software X with no real issue, a binary from a semi-trusted source is normally enough for me.
Re:Bogus argument (Score:4, Insightful)
Re:Bogus argument (Score:5, Insightful)
And yet, it still means that you can fix it, or even rewrite it in something else, if you want. Not having the source code means this is between much-more-difficult and impossible. The lesson here should be that everything we use should be open source, including compilers and libraries, not "well in theory I might have problems, so screw that whole open source thing .. proprietary all the way!"
Re: (Score:3)
Bad choice of target (Score:5, Informative)
Bad choice of target - .Net does actually have multiple compilers available, including open source. But more to the point for this discussion, it has multiple DEcompilers available, including open source.
Want to know what that nasty MS compiler put in your .Net binary ? - run it through ILSpy.
Don't trust the ILSpy binary - decompile it with itself, or with a.n.other decompiler.
In fact, because .Net decompiles so well, the problem of this article (binaries don't compare) just doesn't occur. Want to check your .Net binary against the supposed source ? - easy (well, a hell of a lot easier than with C++). Build your binary from the source, decompile both binaries and compare the two sets of decompiled source. It works, it is consistent and reliable, and it is one hell of a lot more useful at showing up differences than comparing two binaries.
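Something like this, assuming the ilspycmd command-line front end to ILSpy (exact flags vary by version; file names are examples):

    ilspycmd vendor/MyLib.dll -o decompiled-vendor/   # decompile the shipped binary
    ilspycmd build/MyLib.dll -o decompiled-mine/      # decompile your own build
    diff -ru decompiled-vendor/ decompiled-mine/      # compare the two source trees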
Re: (Score:2)
You run a test suite.
Which is one reason why important open source programs make sure that the test suite and its sources are also available and up to date.
Or, you examine the source, and then compile it with a compiler from a different source.
Re: (Score:2)
"Exact binaries" is not the point of having the source code.
The use case is "we're using this binary in production, which we didn't build ourselves". That's how open source is generally used in practice, after all - you download the binaries for your platform, and you (maybe) archive the source away somewhere just in case.
Isn't that the strongest practical use case for Open Source in the business world? Sure, you don't plan on maintaining it yourself, but you could if you had to. The problem is, if the source doesn't match the object, you can't just fix a bug...
Re: (Score:3)
No. The strongest practical use case for Open Source in business is that the Open Source version is some combination of better/cheaper than alternate versions, with "better" including the fact that Open Source projects often get updated faster when security bugs (and sometimes other bugs) are found. The possibility of bringing development fully in-house is not a practical solution for 99.99% of businesses. (I'm exaggerating a little, but not much).
Re: (Score:3)
Re: (Score:2)
Re:Bogus argument (Score:5, Interesting)
"Exact binaries" is not the point of having the source code.
You are correct. However, it is a method to confirm that you have received the entire source code.
The point being made is that a binary could always contain functions that are malicious, buggy or infringe on copyright while the supplied source does not.
Case Study:
A software company (let's call them 'Macrosift') takes over project management of a GPL'd document conversion tool. Macrosift contributes quite a bit of code and the tool really takes off. Most users obtain this tool as a pre-compiled binary from either the Macrosift-controlled repository or a Macrosift partner-controlled repository. It can even convert all kinds of documents flawlessly into Macrosift's Orifice 2015 new extra-standard format, which no other tool seems to be able to do.
Newer versions of OpenOffice, LibreOffice, and JoeOffice come out, and this tool just doesn't seem to be doing the job. Sure, it converts perfectly from everything into MS .xsf, but it doesn't work so well the other way, and it won't work at all between some office suites. The project gets forked by the community to make it feature-complete. The project managers start by compiling the source and, to their surprise, the tool does not work as well as the binary did. After a year passes, the community realizes they've been had. By painstakingly decompiling the binary, they discover that the function that converts to MS's proprietary .xsf differs from the one in the source. Another hidden function is discovered in the binary that introduces errors and file bloat after a certain date if the tool is being used solely on non-MS documents.
How else can I ascertain whether you have supplied me with THE source code for THIS binary if I cannot produce said binary from the provided source code?
Re:Bogus argument (Score:5, Informative)
The latest alpha release [torproject.org] of the Tor Browser [torproject.org] uses a deterministic build process for exactly that reason: users of open source software (or the small minority of users with the necessary technical skills) should be able to check that the published binaries match the published source exactly - no malware, no easter eggs, no backdoors. If someone detects a mismatch, they can alert the rest of the community.
Mike Perry, who spent six weeks getting deterministic builds working for Tor, has some interesting thoughts [stanford.edu] on why this is an important issue for security tools, even if the users completely trust the developers.
I'd like to see more open source projects following Tor's lead. Gitian [gitian.org] is a deterministic build tool that might help - it enables multiple people to build a binary from the same source and check that they get identical results.
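In outline, the check such determinism enables is simple (a sketch; the repository URL, tag, and build target are examples, not any particular project's real procedure):

    git clone https://example.org/project.git && cd project
    git checkout v1.0                    # the exact source the release claims
    make release                         # the project's deterministic build target
    sha256sum dist/project-1.0.tar.gz    # compare with the digests other builders sign

If independent builders all arrive at the same digest, the published binary demonstrably came from the published source.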
Re:Bogus argument (Score:5, Informative)
"Exact binaries" is not the point of having the source code.
Uh, you must not have worked in a shop that does continuous-integration automated builds? Do you really think QA should be handed binaries that a developer compiled, and just trust them?
The problem is that GCC will always give you a different binary every time you compile from the same source. This makes it impossible to prove that the binary you received came from the source you claim was used. You can get around this by never accepting binaries from anywhere but the automated build machine, but it would still be useful to be able to verify that a build you received was built from the code you expect.
There were several reasons why Apple moved away from the GCC tool chain to LLVM and Clang but one of the abilities of the LLVM stack is that you can actually get identical binaries from the same source compiled on different machines at different times.
Re: (Score:3)
First off, [Citation Needed]. This is simply not true in my experience. I've done this many times with GCC and produced identical output (or so diff says). One caveat: make sure you start from a clean directory structure each time, because your Makefile might list dependencies in different orders for the linker if not everything is recompiled, and I think that can produce different results. But this is the build system presenting different input to the compiler, not the compiler itself producing different output...
Re: (Score:3, Informative)
I used to work on GCC, and the randomness you describe would have made it impossible to find bugs.
GCC is deterministic. If you feed it the same input and launch it with the same options, it generates the same output. GCC developers would never tolerate random behavior.
Is it possible that you have address randomization turned on in your OS? I used to use watchpoints and similar in the heap, and this would only work if randomization (ASLR/PaX) was disabled.
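This is easy to test for yourself (assuming foo.c doesn't use __DATE__ or __TIME__, which the preprocessor expands differently on every run):

    gcc -O2 -c foo.c -o foo1.o
    gcc -O2 -c foo.c -o foo2.o
    cmp foo1.o foo2.o && echo "bit-for-bit identical"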
Re:Bogus argument (Score:4, Insightful)
Exactly.
I've recompiled Vim because I wanted to fix Vim's broken design of being unable to distinguish between Tab and Ctrl-I, its lack of support for remapping CapsLock, and because I wanted a smaller executable that didn't need all the bells and whistles of the kitchen sink.
I've recompiled Notepad++ due to a bug (you couldn't select a font smaller than 8 pt because the array was hard-coded in two different places. WTF?)
If you want to be able to quickly tell the quality of an open source project, see how easy it is to follow the directions to even produce an executable. Most open source projects have shitty docs on how to even compile it.
Re: (Score:3)
Sometimes it's exactly the point of having the source code.
Take voting machines for example. I used to work for a company that certified same. This involved obtaining everything that the vendor didn't write (compilers, OS, libraries, etc) from the 3rd party vendors (Microsoft, etc) including Linux from Scratch for the linux-based systems, then compiling it all (thus creating a "trusted build") and comparing the binaries.
No exact match, no certification. (This was after the vendor's source code went through...)
Re: (Score:3)
*ducks*
Being able to is nice, but who has the time? (Score:4, Interesting)
Given the scale of most modern programs' codebase, good luck actually reviewing the code meaningfully in the first place. That said, if you're really that concerned about the code matching the source, run a source-based distro like Gentoo or Funtoo. For most practical purposes, though, users find binary distributions like Debian/Ubuntu or the various Red Hat-based systems to be more effective in regards to their time.
bugfixes, not paranoia (Score:2)
We frequently discover a bug and need to fix it without upversioning the whole package (which could result in other incompatibilities with the rest of the system).
So we track down the code for the version we're using, get it building from source with suitable config options, and then fix the bug. In the simple case the bugfix is present in a later version and we can just backport it. In the tricky case you need to get familiar enough with the code to fix it yourself (and hopefully in a way that the upstream maintainer...)
Re: (Score:2)
You can also get the source packages from debian/ubuntu and compile it yourself, all in one command:
apt-get -b source packagename
Source debs also have the good habit of putting the modifications to the upstream package in a separate diff.
The obvious thing is (Score:5, Insightful)
Re: (Score:2)
If you are that paranoid study the source code then recompile
Yeah, if he is bothering to read through it, he could quite easily be bothered to compile it as well; that's what he was going to do anyhow, to compare.
Also, you could clone the compile chain of popular Linux distros without fuss. It's not like they hide their build systems behind closed doors.
Re: (Score:2)
Re: (Score:2)
If you are truly paranoid you write it yourself.
Re: (Score:2)
If you're really going to be paranoid, how do you know your machine isn't compromised? I hope you're doing a bit-for-bit comparison on your hard drive twice a day to make sure there are no file changes you didn't approve, and that you've soldered the top off your CPU and put it under a high-power microscope to ensure the circuits haven't been changed.
touch o' hyperbole (Score:5, Insightful)
I'd suggest that "severely limiting the whole point of running free software" might be a touch of an exaggeration. A huge touch.
Re: (Score:3)
It is a big point anyway: independent auditing. That someone, somewhere, could discover that the binary my distribution gave me had a backdoor instead of the code they published (e.g., because they were forced by law to insert it and not disclose it), and that I could even check or rebuild it myself. With closed source you don't have that freedom; it is even against the law to try to find out. And in the current US-pushed cyberwar state of things (they are trying this kind of thing already [slashdot.org]), having the possibility of independent auditing of...
Re:touch o' hyperbole (Score:5, Interesting)
The issue the author is bringing up is that you have no way to easily determine that the published binary is, in fact, functionally identical to the published source code. Imagine you write an app that accesses private data and open source it, saying "check the source, the only thing we use the data for is X". And if you look at the source, that's certainly true. But there's no way to verify that the binary download was built from the published source; especially if the resulting binary is different every time you build it and different if you build it on different machines with different configurations. So, everyone who grabs the binary instead of building from source is taking it on trust, just like proprietary software, that the program does what it claims.
Incorrect suppositions. (Score:5, Insightful)
No it doesn't. The whole point of running free software is knowing that I can rebuild the binary (even if the end result isn't exactly the same) and, more importantly, freely modify it to suit my needs rather than being beholden to some vendor.
Re:Incorrect suppositions. (Score:5, Insightful)
No it doesn't. The whole point of running free software is knowing that I can rebuild the binary (even if the end result isn't exactly the same) and, more importantly, freely modify it to suit my needs rather than being beholden to some vendor.
There's another point too...which incidentally is the whole point of running a distro like Gentoo...that you can compile the binary exactly to your specifications, even sometimes optimizing it for your specific hardware. I don't get at all this idea he has about "reproducible builds;" if he builds the same way on the same hardware, he'll get the same binary. But what he's doing is comparing builds in distros with ones he did himself...and the odds that it's the same method used to create the binary are very low indeed.
If he's concerned about precompiled binaries having been tampered with, he's looking at the wrong protective measure. Hashes and/or signing are what is used to protect against that...not distributing the source code alongside the compiled binary files. If you look at the source code and just assume that a precompiled binary must somehow be the same code "just because," you're an idiot.
Re: (Score:3)
The whole point is that the distro build is supposed to be 100% reproducible, with the exception of things like timestamps and signatures. And it is with Debian, as he found out. But not the other distros he tried. And that is a real problem.
Why, naive people might ask? Because that is the only way to verify that a binary is what it claims to be. And it is the only way to reliably support and diagnose something. It is shocking how few people on Slashdot realize that.
Not a concern (Score:5, Insightful)
If you need to be sure, just compile it yourself. If you suspect foul play, you need to do a full analysis (assembler-level or at least decompiled) anyways.
The claim that this is a problem is completely bogus.
Re: (Score:3)
Diverse Double-Compiling by David A. Wheeler (Score:5, Interesting)
Re:Diverse Double-Compiling by David A. Wheeler (Score:4, Funny)
Re: (Score:3)
Nice! While I think the threat is mostly academic, it is nice that somebody competent looked into defeating it.
Re: (Score:2)
With the optimization going on at compile time, I do not see an assembler-level analysis necessarily giving you any more information than a comparison of the binaries.
Re: (Score:3)
That is not the point. The point is that comparing binaries will just give you a mismatch, unless you re-create exactly the same build environment. That is often infeasible.
This is my darkest fear... (Score:3)
It's a fair argument. If you are not compiling your binaries, how do you know what you have is compiled from the source you have available?
Truth? You don't. If you suspect something, you should investigate.
Re: (Score:2)
And on an open-source OS, you can.
Re: (Score:3)
Sorry to tell you, but Ken Thompson talked about how you pretty much have to trust someone back in 1984: http://cm.bell-labs.com/who/ken/trust.html [bell-labs.com]
If no one else, you have to trust the compiler author isn't pulling a fast one on you....
Re: (Score:2)
It's a fair argument. If you are not compiling your binaries, how do you know what you have is compiled from the source you have available?
Truth? You don't. If you suspect something, you should investigate.
You're right, of course. But that's not quite the (non) argument he was making, I think.
My understanding was that he wanted to check how easy it was to get the same result if he compiled the publicly available source and compared it to the objects.
Turns out that, due to datestamps etc., the results are slightly different, but no biggie.
Anyway, in a production environment you should be compiling from source, since - security concerns aside - that's the only way to be sure you've got the correct source for your objects.
Problems with verifying the binaries from source (Score:5, Funny)
Re: (Score:2)
I have recompiled all my software from the source code and verified that the binaries match
How many different compilers did you use? Did you try any cross-compilers, such as compilers on Linux/ARM that target Windows/x86 or vice versa?
How did Ken Thompson get into my system
See bunratty's comment [slashdot.org].
and how do I get rid of him?
See replies to bunratty's comment.
Re: (Score:2)
How did Ken Thompson get into my system
See bunratty's comment [slashdot.org].
I hope that wasn't a whooshing sound I just heard....
Are You Sure This Is the Source Code? (Score:3)
> Are You Sure This Is the Source Code?
Yes. Yes I am sure. I built it myself. It even includes a few of my own personal tweaks. It does a couple of things that the normal binary version doesn't do at all.
Re: (Score:2)
But given that the optimization phase of compiling/building can be significant, and there are lots of different optimization options, why would you not be better off just leaving that up to the code maintainers?
Some things wrong with TFA (Score:4, Informative)
Re: (Score:2)
I honestly don't understand the blog post; I'm not severely limited in any way. I somehow feel the author doesn't even know how to compile software and doesn't know anything about Open Source. It doesn't matter if the binary is the same; maybe his is compiled with different flags than mine, or maybe I added a patch.
This honestly smells of someone out to discourage usage of Open Source.
Re: (Score:2)
This honestly smells of someone out to discourage usage of Open Source.
Please run this statement through Hanlon's Razor.
(Or, to put it another way, I don't think you're deliberately misleading when you use the word "honestly". The alternative is much more likely.)
Trust (Score:5, Insightful)
I took a graduate-level security class from Alex Halderman (of Internet voting fame) and what I came away with is that security comes down to trust. To take an example, when I walk down the street, I want to stay safe and avoid being run over by a car. If I think that the world is full of crazy drivers, the only way to be safe is to lock myself inside. If I want to function in society, I have to trust that when I walk down the sidewalk that a driver will not veer off the road and hit me.
When you order a computer, you simply trust that it doesn't have a keylogger or "secret knock" CPU code installed at the factory. It's exactly the same with software binaries, of course. In the extreme case, even examining all the source code will not help [win.tue.nl]. You must trust!
Re: (Score:2)
So very true. In the end it all comes down to trust and as I posted above (before noticing yours) Thompson explained it extremely well.
Re: (Score:2)
Re: (Score:3)
You have an odd notion of trust. And of security, for that matter.
Blindly trust nothing except the laws of physics. Everything else is subject to investigation and verification. Just because verification is difficult or may fail is no excuse for not trying. By being vigilant, you can approach security, although you will never fully get there.
When I walk down the sidewalk, for example, I pay attention to the surroundings. How much attention is based on prior experience and knowledge of how likely drivers...
Re:Trust (Score:4, Interesting)
So your argument is that there will always be risk, so there's no point in managing or minimizing it? To continue your car analogy: even if I'm at a pedestrian crossing, I don't really trust cars to stop, and I always throw a glance to make sure they've noticed me. An uncle of mine witnessed a horrible accident: an old lady got run over in broad daylight in the middle of a well-marked crossing; the perpetrator was an old, half-blind fool who should have lost his license already, or had and didn't care. It doesn't help the old lady one bit, no matter how much they punish him. You always trust lots of people: you trust the factory that built the brakes on your car and the mechanic who serviced them; you trust the people who built the bridge that it won't collapse out from under you, but only because you lack any practical alternative.
With software you do have more and better choices; not perfect choices, but it's a helluva lot harder for the NSA to place a spy bug in Linux than in Windows, where they can just show up with a national security letter that is both instructions and gag order, and violating either can land you in jail. If there are reasonable ways to prove that these are the exact versions and compiler settings used to produce this binary, then that is much stronger than trust. Trust is something that can be betrayed, while reproducible steps are something you can verify. In science, if one scientist told you, "here are the steps of my experiment, feel free to reproduce my results," and the other said, "I can't show you the data but the results are correct, trust me," who would you trust?
Deterministic builds.. (Score:3, Interesting)
..are a bitch. The hoops that e.g. the Bitcoin developers jump through to prove they didn't mess with the build are considerable: running specific OS builds in emulators with fake system time, and whatnot. No easy task.
Logical Equivalency Checking (Score:3)
I do IC design. Logical equivalency checking (LEC) is a well-worn tool. You can futz about with the logic in a lot of different ways; LEC means we can do all sorts of optimization and still guarantee equivalent function. We can even move logic from cycle to cycle and have it checked that things are logically equivalent.
Run two compilers on the same source code and you won't get the same binary. Run two different versions of the same compiler and you won't get the same binary. Run the same compiler with different options and you won't get the same binary. They should, however, all be logically equivalent.
only if the code is 100% valid (Score:2)
Depending on compiler options, code that isn't completely valid (code with potential overflow/underflow and other undefined behavior) can end up logically completely different once you turn on optimization.
Compiler flags make this ridiculously nitpicky. (Score:2)
Unless I'm missing something pretty profound, even having the exact *source* won't always result in the exact binary. My understanding (and I could be wrong about this) is that you can take a well written program and plug it into multiple compilers. GCC may be one of the most popular options, but it's not the only one.
But compilers all optimize differently. GCC 3.x optimizes somewhat differently than GCC 4.x. You can tweak this behavior by manually setting compiler flags, or you can compile binaries that...
There is no problem; complete chain exists (Score:4)
This is a problem that doesn't exist. You establish a chain of evidence and authority for the binaries via signing and checksums, starting with the upstream. Upstream publishes the source, and the signed release announcement contains checksums. The package maintainer compiles the source. The generated package includes checksums. Your repo's packages are signed by the repo's key.
You can, at any point in time with most packaging systems, verify that every single one of your installed binaries' checksums match the checksums of the binaries generated by the package maintainer.
If you don't trust the maintainer to not insert something evil, download the distro source package and compile it yourself.
If you suspect the distro source package, all you have to do is run a checksum of the copy of the upstream tarball vs the tarball inside the source package, and then all you need to do is review the patches the distro is applying.
If you suspect the upstream, you download it and spend the next year going through it. Good luck...
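On a Debian-style system, spot-checking that chain looks roughly like this (assuming the debsums helper is installed; the package name is just an example):

    debsums openssl                # installed files vs. the package's recorded checksums
    apt-get source openssl         # fetch the distro source package (signature-checked)
    ls openssl-*/debian/patches/   # the distro's changes, kept apart from upstream
                                   # (layout varies with the source package format)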
Tah da (Score:3)
Finally, someone gets it. The backdoor is never where you're looking for it.
Required in some industries (Score:5, Interesting)
I work in the gaming (Gambling) industry.
Many states require us to submit both the source code and build tools required to make an exact (and I mean 'same md5sum') copy of the binary that is running on a slot machine on the floor.. to an extent that would blow you away.
They need to be able to go to the floor of a casino, rip out the drive or card containing the software, take it back to THEIR office, and build another exact image of the same drive or SD card.
md5sum from /dev/sda and /dev/sdb must match.
I can tell you the amount of effort that goes into this is monumental. There can be no dynamically generated symbols at compile time. The files must be built, compiled, and written to disk exactly the same way every time. The filesystem can't have modify or creation times, because those would change.
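The check itself is as blunt as it sounds, and the timestamp problem is typically attacked by pinning every file to a fixed date before the image is created (a sketch; the device paths are from the post above, the date is arbitrary):

    find rootfs/ -exec touch -t 200001010000 {} +   # pin every mtime before imaging
    # ...create and write out the filesystem image, then the regulator's check:
    md5sum /dev/sda /dev/sdb                        # the two digests must match exactly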
This is a silly idea for open source software, the only industry I've seen apply it is perhaps the least-open one in the world.
Philips multimedia devices and GPL (Score:4, Interesting)
Are you sure it matters? (Score:2)
What difference does it make?
Do you think you're smart enough to detect tampering by reading source code?
To detect tampering, run strings on the binary and pipe it to grep. If the string 1.3.6.1.4.1.981 appears, you are fucked.
And that's not all! (Score:3)
Not only is it limited in that way (which is itself an interesting fact), but it's limited in a lot of other ways also.
For one, source code is often bad, as in impenetrable. Just off the top of my head:
* Reams of private, non-API/SPI code which is effectively *how the program actually works*, and which is completely undocumented.
* Grotesque architectural errors made by (affordable) beginners which have nevertheless been cast in stone by exposing them publicly (God classes filled with global variables, etc.).
* Telegraphic and/or misleading method and variable names (e.g., VariablesWithMissingVowels, also known as Varwmvwls) which nevertheless often serve as the ONLY documentation for that variable or method.
* Unfortunate architectural decisions made early on by experienced programmers who may be proud of those decisions (tunneling package-private methods out to "friend classes", for instance, thus subverting the purpose of package-private scope and making the source code's scope modifiers an unreliable indicator of actual scope).
* 500-1000-line methods with some or all of the above characteristics.
* Just massive code bases. I am facing one with nearly half a million classes right now... that's right, almost 450,000 classes, in a code base that is deliberately architected to defy the built-in scoping rules of the language, so virtually anything can call anything...
And on and on.
All of these things will never be fixed, for reasons we all understand, I presume. But reflect on what this implies for open source: it implies that the much-vaunted idea that more developers will iteratively make the code base better over time is a fiction with respect to the actual quality of the code base itself.
No team is going to stop adding features and create more work for itself in the form of resolving conflicts for the sake of enabling their program to do what it already can do.
This doesn't even get into the whole ego thing.
Worse still, anything exposed as public in any way may have a million clients depending on it and change effectively becomes impossible, open source or not. All things public, or even more precisely all things reachable in the code base by "outsiders" through any device found in the host language whatsoever, intended or otherwise, are effectively unchangeable.
In lieu of a successful campaign to stop development and do a rewrite, only a fork will make any of the above better. Forks are becoming more common, but they fail to sustain their branching a high percentage of the time (57%), and in any case they presume the power TO fork, which on a large project is harder to achieve.
The net effect is that open source code bases fail to live up to one of the major promises of open source: iterative improvement of the code base.
It's true that some people may fix bugs that they are motivated for external reasons to correct, and it's helpful to look at the code base if you're writing a plugin through a public API. But the code itself is often awful, and this awfulness, often produced because of limited time and resources, has the ironic effect of driving away many times those resources, in the form of all the would-be developers who are simply turned off. For those who do partake, the existing code wastes many multiples of the time originally *saved*, as each new developer struggles to make sense of the impenetrable code base.
In my experience there is no easy fix, or even a pricey one. Original authors are quick to settle on the (self-serving) idea that whatever documentation exists *ought* to be enough, and anyone who still has questions must be an *idiot*. Wasting time incrementally slogging around this code becomes some sort of test that the dev is *serious* and *smart*, when the reality is more like: smart, serious devs came, saw, and left without saying a word.
Code quality is only subjective at the edges. Undocumented code should not exist...
What I want to confirm is ... (Score:3)
... not only is this the source code for the binary I am running, but also that the build system actually works. This is because not only might I want to make changes to the source to improve it, but I might want to do so in a hurry to fix a security hole. Since I might need to rebuild and run the built binary, I might as well test and make sure what the build system built really runs. So I just install the binary I built. Then I know for sure. Who needs the distributed binary? (It might have a rootkit in it.)
Re:What a problem (Score:4, Insightful)
...or just using a binary that you compiled from source yourself.
For a lot of projects, that's not nearly as hard as some people like to make it sound.
Re:What a problem (Score:5, Funny)
Hey now, you have to be pretty IT savvy to type ./configure, make and make install all in the same day. Some of us make good money doing that, don't just go suggesting everyone should be doing it.
more difficult in practice (Score:3)
./configure, make, make install assumes you're building on the target machine. Many times you want to build on one machine and deploy on another. Even now, there are a lot of packages that don't work properly when cross-compiling. So you end up hardcoding config files, overriding options, patching the source/Makefiles, etc.
Also, in our environment we need to isolate the build system from the host environment to avoid contamination from the host libraries, and we need to version-control the build system...
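For contrast, the cross-build shape looks like this (autotools; the target triplet and staging path are just examples), and it is exactly where many packages fall over:

    ./configure --host=arm-linux-gnueabihf --prefix=/usr
    make
    make DESTDIR="$PWD/staging" install   # stage for the target, don't touch the host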
Re: (Score:2)
Whoosh!
That was the joke passing over your head.
I of course agree with everything you said. I was merely being flippant for the sake of humor.
Re: (Score:2)
Only the hourly employees use that. Us salaried folks just leave early since we earned it by skipping that and increasing our productivity.
Re: (Score:2)
Has anybody thought about recompiling the source and seeing if you get the same binary?
The article says you can try, but you won't get the same binary.
Re: (Score:2)
Has anybody thought about recompiling the source and seeing if you get the same binary?
That doesn't necessarily work unless you have the exact same build environment (libraries, compilers, etc.), and compiler settings.
Re: (Score:2)
I thought so... the build environment does affect the final hash. However, thinking about this logically, in most cases you get the source code and the executable from the same place... and if the executable matches, how paranoid can you be?
If you're getting the alleged source code to Windows 9 from some guy in Nigeria though, set your expectations accordingly.
Re: (Score:3)
However, thinking about this logically, in most cases you get the source code and the executable from the same place... and if the executable matches, how paranoid can you be?
How paranoid do you want to be? Reflections on Trusting Trust - Ken Thompson [bell-labs.com]
Today, in day-to-day practice, you are on "reasonably safe ground" if you get the executable from either the authoritative source or an associated mirror, and it matches the published cryptographic checksum/hash value (md5, SHA, etc. [wikipedia.org]). Of course, if you can build from source, after checking the checksum of the source archive and of any libraries you need to add, you should be in good shape as well. (And it isn't necessarily...)
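In practice that day-to-day check is a command or two (file names are examples):

    sha256sum project-1.0.tar.gz       # compare against the published digest, or:
    gpg --verify project-1.0.tar.gz.asc project-1.0.tar.gz   # check the detached signature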
Re: (Score:2)
That is what the OP is talking about.
Suddenly it becomes obvious what the AC posting possibility is really about...
Re: (Score:2)
Differing library, linker, compiler versions, configurations, and parameters would all change the output. You'd have to use the exact same system for the two builds, or you are not guaranteed to get a byte-for-byte duplication.
Re:What a problem (Score:5, Insightful)
Most of the time, even that isn't enough. C compilers tend to embed build-time information as well. For Verilog, tools often use a random-number seed for the genetic algorithm that does place-and-route. Most compilers have a flag to set a specified value for these kinds of parameters, but you have to know what they were set to for the original run.
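GCC, for instance, exposes exactly such a flag; it only helps if you know what the original run used (a sketch):

    # -frandom-seed pins the seed GCC otherwise chooses per compilation
    # (used e.g. for generated symbol names); one of several knobs a
    # reproducible rebuild has to replay faithfully.
    gcc -O2 -frandom-seed=myfixedseed -c foo.c -o foo.o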
Of course, in this case you're solving a non-problem. If you don't trust the source or the binary, then don't run the code. If you trust the source but not the binary, build your own and run that.
Re:What a problem (Score:5, Insightful)
Has anybody thought about recompiling the source and seeing if you get the same binary?
Has anybody thought of reading the article before posting questions like this?
That said, this particular "article" isn't worth the waste of bytes it takes up. It's like seeing a 6 year old trying to explain a combustion engine.
Binaries will almost always differ, if nothing else because you need an environment exactly like the binary builder's. Not just the timestamps, compile paths, hostnames, and account names, which are the obvious ones.
If your compiler or linker is a minor version off what he used, the results can be very different, even if using the same compile options.
But that's not enough: If your hardware is different, randomization of functions in a library will be different.
To flesh out his article a bit more, the author could have done a test with two different Gentoo systems. Different but mostly compatible hardware, and a slight difference in the toolchain. That might have opened his eyes.
Then again, probably not.
Re: (Score:2)
Re: (Score:2)
-funroll-loops, the breakfast of champions.