Re: Installation of software, and security. . .
Burton Strauss wrote:
> At best the SCOTUS Grokster opinion bounces the case back to the lower
> courts saying "You erred in dismissing under Betamax".
>
> It's also pretty clear that the Supremes said "Oh, and from looking at the
> facts of this case in that light, MGM will probably win".
>
> So it's now clear that a non-infringing purpose isn't an absolute shield for
> a technological device. You also have to have clean hands regarding its use.
>
>
> "Thus, if you are a developer and you deploy software without giving serious
> thought to the things that you could do to make the entire process of
> software distribution and installation safer for everyone, then you are part
> of the problem." is a HUGE stretch.
>
> The mere fact that somebody COULD put a nasty script into SOME package? No
> way... Now if you sold/gave away your Wonka-Magic-Software-Installer saying
> "Oh, and BTW, Wink, Wink, if you were to add this script to your package,
> you will install 27 Trojan and SpyWare apps and we'll pay you $2/copy
> distributed". Well, then, you don't have clean hands...
>
>
> I guess that puts me in the camp that thinks Grokster isn't a bad decision.
>
> It might become one - after a couple more cycles through the courts make the
> new rules clear.
>
> But on its face? No... In fact, I'd love to see the same kind of rules
> applied to a couple of other industries/technologies - Guns for example (how
> many Deer do you find running around the woods wearing Kevlar?). Tobacco,
> Pharma and 'Nutritional' Supplements for others...
>
>
> OK, back to relevant stuff...
>
> How do we hit the middle ground - enough control over what is done so that
> there is:
>
> * At least a record of what was done.
> * Some attempt at obtaining informed consent.
>
> And, can we do this within (or just beyond) current packaging methods?
>
The problem is that when you run a program with full access, it has full
access. You may be able to read the executable code and see what decisions
it makes, but actually determining the system's eventual state would take
a hugely complex, Turing-complete emulator, which in the general case runs
into the halting problem.
The other way is to shield the system: let the installer do whatever it
wants against an isolated view of the system, so that nothing it does
takes real effect, and then merge the changes afterward (see the sketch
below). A mandatory access control policy can do this, though then you're
pushing the responsibility down to another level of the system
(constraining the package manager becomes a kernel duty). In the case of
Autopackage it's been agreed that this is a good course of action, but
Autopackage itself (the project) MUST define the policy, state that this
is the supported rule set, and make clear that packages which break
because they violate it are SOL.
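
Here's a minimal sketch of the shield-and-merge idea, in Python rather
than anything Autopackage actually ships. Everything in it is
hypothetical: relying on DESTDIR only contains cooperative scripts, so a
hostile installer would still need a real MAC policy or an overlay
filesystem underneath. It does show how Burton's two requirements, a
record of what was done and some attempt at informed consent, fall out
almost for free:

#!/usr/bin/env python3
# Hypothetical sketch, not Autopackage code: run a package's install
# script against a throwaway staging root, record what it did, ask the
# user for consent, then merge the results into the real filesystem.
import os
import shutil
import subprocess
import tempfile

def staged_install(install_script, real_root="/"):
    stage = tempfile.mkdtemp(prefix="pkg-stage-")
    # Let the script "do anything", but redirect the effects into the
    # staging root (works only for scripts that honor DESTDIR).
    subprocess.run(["/bin/sh", install_script],
                   env={**os.environ, "DESTDIR": stage}, check=True)
    # A record of what was done: every path the installer created.
    changes = []
    for dirpath, _, filenames in os.walk(stage):
        for name in filenames:
            full = os.path.join(dirpath, name)
            changes.append(os.path.relpath(full, stage))
    # Some attempt at obtaining informed consent.
    print("This package wants to install:")
    for path in changes:
        print("  /" + path)
    if input("Merge these changes? [y/N] ").strip().lower() == "y":
        for path in changes:
            dst = os.path.join(real_root, path)
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(os.path.join(stage, path), dst)
    shutil.rmtree(stage)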
>
> -----Burton
>
>
>
> -----Original Message-----
> From: Jason Coombs [mailto:jasonc@xxxxxxxxxxx]
> Sent: Tuesday, July 19, 2005 12:16 PM
> To: Tim Nelson
> Cc: John Richard Moser; Klaus Schwenk; bugtraq@xxxxxxxxxxxxxxxxx
> Subject: Re: Installation of software, and security. . .
>
> Tim Nelson wrote:
>
>>On Sun, 17 Jul 2005, John Richard Moser wrote:
>>
>>>Yes, you hit the nail on the head with a jackhammer. One discussion
>>>on autopackage was that the devs don't want to limit the API and thus
>>>want the prepare, install, and uninstall to be a bash script supplied
>>>by the package "so it can do anything." I hate this logic. Why does
>>>it need to be able to do "anything"?
>>
>> I think you're both right :). I agree that packages need to be
>>able to do anything, but it'd be nice if we could try to eliminate the
>>pre- and post-install scripts.
>
>
> Developers think that installers need to be able to do anything because
> they think of themselves as trustworthy. The code they write for an
> installer doesn't do anything harmful and can be trusted, so why shouldn't
> it have the ability to do anything the developer decides it needs to do?
>
> All malicious attacks originate from the hands and minds of other people,
> malicious people; therefore a typical developer cannot see any harm in their
> own way of thinking or in their own installer. Even those developers who
> perceive an unacceptable risk or intrinsic flaw in the way that these things
> get built and deployed have a very hard time seeing themselves as
> responsible for the harm caused by others.
>
> The truth is that people who expressly allow systems that are harmful to
> continue to exist can be held responsible for the damage that those systems
> cause, regardless of the fact that the malicious actor who initiates the
> specific harm in each instance is somebody else entirely.
>
> See: Metro-Goldwyn-Mayer Studios Inc. v. Grokster, Ltd.
> http://www.supremecourtus.gov/opinions/04pdf/04-480.pdf
>
> Thus, if you are a developer and you deploy software without giving serious
> thought to the things that you could do to make the entire process of
> software distribution and installation safer for everyone, then you are part
> of the problem.
>
> Hopefully everyone can now see that applying digital signatures to code is
> a pointless exercise in somebody else's arbitrary business strategy (that
> of VeriSign and other purveyors of so-called 'federated identity
> solutions') and is not being used today as a means of achieving improved
> information security. That is a sad state of affairs, given that signed
> code at least attempts to address these security issues during software
> installation and distribution, although today's implementations are, as a
> rule, very poorly conceived.
>
> We would all receive vastly improved installation security if every software
> vendor adopted a standard for code/data/installer authentication (one that
> does not require digital signatures but could optionally use them) based on
> a keyed hash algorithm and a low-cost specialized electronic device. The
> device would sit on the desktop or in the server room alongside the box to
> which software is deployed, verify hashes, and explain forensically what
> the installer intends to do to configure the box and deploy the code and
> data to it.
>
> Of course that's just the ideal improvement, which I personally believe the
> industry could even train end-users to understand and use, particularly if
> the proposed device were to generate an installation key that the user
> would be required to enter in order to install the software.
> (Sure, greedy people would try to use this to increase license revenue or
> improve controls over intellectual property and copyright; they will just
> have to be fought back by those who understand that the point is security,
> not personal enrichment.)
>
> Short of the ideal stand-alone embedded system, this concept could also be
> built as software-only. Does anyone care? Will anyone ever build it?
>
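
For what it's worth, the software-only variant is easy to prototype.
Below is a minimal sketch in Python; the manifest format, the shared
secret, and the eight-hex-digit installation key are my assumptions, not
anything specified above:

#!/usr/bin/env python3
# Hypothetical sketch of keyed-hash installer authentication: verify
# each file's HMAC-SHA-256 against a manifest, then derive a short
# "installation key" that a separate trusted device could display and
# the user would type in to confirm.
# Assumed manifest format: one "<hex mac>  <path>" entry per line.
import hashlib
import hmac

def file_mac(path, key):
    with open(path, "rb") as f:
        return hmac.new(key, f.read(), hashlib.sha256).hexdigest()

def verify_manifest(manifest_path, key):
    with open(manifest_path) as f:
        lines = f.read().splitlines()
    for line in lines:
        expected, path = line.split(None, 1)
        if not hmac.compare_digest(expected, file_mac(path, key)):
            raise ValueError("tampered or unexpected file: " + path)
    # The installation key is a truncated MAC over the whole manifest,
    # short enough for a person to copy by hand.
    return hmac.new(key, "\n".join(lines).encode(),
                    hashlib.sha256).hexdigest()[:8]

# The trusted device computes the same eight hex digits from its own
# copy of the manifest; the installer proceeds only when the key the
# user enters matches what verify_manifest() returns.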
> Regards,
>
> Jason Coombs
> jasonc@xxxxxxxxxxx
>
>
--
All content of all messages exchanged herein is left in the
Public Domain, unless otherwise explicitly stated.
Creative brains are a valuable, limited resource. They shouldn't be
wasted on re-inventing the wheel when there are so many fascinating
new problems waiting out there.
-- Eric Steven Raymond