
Correct context on Trust essay




Carl, Phill and list:

When Carl commented here on my essay "Towards a real-world model of trust"
he took out the items that were not relevant to the discussion he was
aiming at and ... he took out almost 90% of it. That's fine and good, but it
left the topics a wee bit out of context, because the essay was never
posted or referenced here before! For example, what may have looked like a
gratuitous critique of X.509 was actually part of a *self-criticism*, in
the paragraphs before and after.

(thanks for a private posting calling my attention to this)

The resulting discussion with Phill, Camillo, Marc and Ben, such as
Phill's opening remarks and Ben's questions on how to bootstrap trust,
was perhaps caused by the lack of text and context ...

So, I am posting the full essay here with my apologies for not having
perceived this in my replies. To err is human, my computer says ;-) 

[ Sorry. Those not interested on this thread can delete now. ]

The same essay is also available at http://www.mcg.org.br/trustdef.txt
and it could be interesting to read its follow-up at
http://www.mcg.org.br/reftrust.txt
Further discussion and references can be found at
http://www.mcg.org.br/cert.htm#13. Trust:
(the URL anchor includes the white space and the colon)

********************

This msg discusses a subject that summarizes and references some of the
points I have mentioned in this and in other forums. It is also a result
of discussions with several colleagues who either had the opportunity to
come here for a visit or whom I had the opportunity to visit during my
last trips. After all, December is a good month for that! This msg also
answers the last request in this forum, with a season's delay ;-).

This msg's first subject is "modelling of trust" and not "trust
modelling" -- the second being derived from the first. What I am saying,
and this is at the heart of the Meta-Certificate motivation, is that we
must first define and understand what trust is (and, possibly, isn't) in
the context of Internet communication before we go into cryptographic
algorithms and message protocols -- which can serve well either as a means
of conveying said understanding or as a means of obfuscating said ignorance!

Today's protocols, such as X.509, PGP and others, take a leap of ignorance
on what trust is and start by defining means to convey it. Such an attitude
is not even empirical; it is indeed arbitrary. To justify this leap of
ignorance, standards such as X.509 have statements to the effect that "...
such will be defined in the CPS, which is not a part of this document." --
as if assumptions could be defined after the theorems that use them.

When faced with the same problem 50 years ago, Shannon took a different
approach. Let's follow his steps, because it is not a coincidence that his
ground-breaking Information Theory [1] was also the basis for his
thought-breaking paper on Cryptography [2], which changed cryptography
from (not even) black art to science. (Earlier, he had also seen that an
obscure Boolean Algebra could very well represent electric circuits, so
circuitry could be built according to Boolean principles, allowing circuits
to test propositions in binary as well as calculate problems -- the dawn of
modern computers and Digital Electronics.)

As in [3], I cite:

"In Information Theory, information has nothing to do with knowledge or
meaning. In the context of Information Theory, information is simply that
which is transferred from a source to a destination, using a communication
channel. If, before transmission, the information is available at the
destination then the transfer is zero. Information received by a party is
that what the party does not expect -- as measured by the uncertainty of
the party as to what the message will be."

Shannon's contribution here goes far beyond the definition (and its derived
mathematical consequences) that "information is what you do not know". His
zeroth contribution (so to say, in my counting) was to actually recognize
that unless he arrived at a real-world model of information as used in the
electronic world, no logically useful information model could be set forth!
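
As a small aside of mine (not in [1] or [3]), the "uncertainty of the
party as to what the message will be" can be made concrete in a few lines
of Python computing Shannon's entropy, H = -sum p*log2(p); the names below
are my own:

    # Aside of mine, not from [1] or [3]: entropy measures the uncertainty
    # of the receiving party as to what the message will be.
    import math

    def entropy(probabilities):
        """Entropy, in bits, of a distribution over possible messages."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Four equally likely messages: 2 bits received per message.
    print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

    # The destination already knows the message: the transfer is zero.
    print(entropy([1.0]))                      # 0.0

A message that is fully expected at the destination carries zero
information, exactly as in the quotation above.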

Now, in the Internet world, we have come to a standoff: we must develop a
real-world model of trust, because we cannot continue to deal with limited
and fault-ridden trust models as the Internet expands from a parochial to a
planetary network for e-commerce, EDI, communication, etc.

And what would this "real-world model of trust" for the Internet world be?
Here, akin to Information Theory, trust has nothing to do with friendship,
acquaintances, employee-employer relationships, loyalty, betrayal and other
hard-to-define concepts. In the context of Generalized Certification
Theory, as set forth in [3] and [4], trust is simply "that which is
essential to a communication channel but which cannot be transferred from
a source to a destination using that channel".

Thus, **loosely speaking**, information is what you do not know and trust
is what you know.

This has several consequences, to be pursued elsewhere, but the ones we
need now are:

1. "trust depends on the observer" -- or, "there is no absolute trust". 
What you may know can differ from what I may know.

2. "trust only exists as self-trust". This means that only self-trust has
zero information content, so trust on others always have information
content (surprises or, unexpected behavior, either good or bad). 

3. "two different observers cannot equally trust any received
information". This is a direct consequence of (1) and (2).

If we accept the "real-world model of trust" for the Internet world as
defined by: 

trust: "trust is that which is essential to a communication channel but
which cannot be transferred from a source to a destination using that
channel" ,

then these three consequences are as mathematically unavoidable as
Shannon's Theorems and leave us in a severe predicament. How then and to
what measure can I acquire and transfer trust?

To answer this question, we must now look at the mathematical properties
of trust. This is also similar to Shannon's approach -- where the
logarithmic function was found very useful for representing information
content and allowed new insights. As in [5], trust has the following
main mathematical properties:

 - not transitive
 - not distributive
 - not symmetric

where the reader can see the first two properties exemplified on-line in
[5].  The last property is straightforward: the fact that a lion trusts
a lamb does not mean that the lamb trusts the lion. 
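
As a rough sketch of my own (not the formal treatment in [5], and with
names I made up), one can encode a trust relation as a set of ordered
(truster, trusted) pairs and check that nothing in it forces symmetry or
transitivity; distributivity needs more structure and is left to [5]:

    # Rough sketch of mine, not the formal treatment in [5]: a trust
    # relation as ordered (truster, trusted) pairs.
    trusts = {
        ("lion", "lamb"),        # the lion trusts the lamb ...
        ("lamb", "shepherd"),
        ("shepherd", "dog"),
    }

    def is_symmetric(rel):
        return all((b, a) in rel for (a, b) in rel)

    def is_transitive(rel):
        return all((a, c) in rel
                   for (a, b) in rel
                   for (b2, c) in rel if b2 == b)

    print(is_symmetric(trusts))    # False: the lamb does not trust the lion
    print(is_transitive(trusts))   # False: lamb->shepherd and shepherd->dog
                                   # do not give lamb->dog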

So, using the definition of trust just given, and moving towards an
understanding of it by example: when the lion communicates with the lamb,
the lion does not need to receive anything from the lamb besides that
which is communicated in the channel itself, whereas the lamb needs to
*know* whether the lion is hungry -- which is not information and which
cannot be transferred in the same channel. If such data were information,
then it would be new to the lamb (sorry, ex-lamb, now food). And if such
data were transferred in the same channel, how would the lamb know that
the lion was not lying?

What, then, is the solution? How, then, and to what measure can I acquire
and transfer trust?

Contrary to information, trust cannot come in as a type of add-on -- such
as modulation on a carrier. Why? Because when you modulate a carrier you
are encoding information onto that carrier and you assume that the carrier
is pre-existent -- so the carrier has a very low information content while
the modulating signal has a very high information content (ideally, 0% and
100%). On the other hand, according to our definition, trust must have
zero information content (trust is what you know).

So, trust cannot be thought of as a modulating wave -- it is the carrier!
This is the paradigm shift that MCs were based upon in the first place.
First acquisition, then recognition.

Neither can trust be thought of as a type of authorization loop, where
trust flows from the source to the destination and back to the source,
similar to a battery and electric current. [6]
                                                       
The solution is to mathematically model our definition of trust (i.e.,
this is not a play on words; we have to model our real-world model of
trust) as a mathematical operator on information, parametrized by
(t, d, s, ...), where t = transitivity, d = distributivity, s = symmetry,
plus other properties such as time (see [6]).

Such a trust model now allows us to answer the question, as a function of
cost and risk [7].

When (t=0, d=0, s=0, T=0, ...) we have "hard-trust" -- i.e., zero
information content (no surprises) and no risk. But it is also as isolated
as an island -- trust cannot be acquired or transferred.

When we allow the parameters (t, d, s, T, ...) to take non-zero values, we
have "soft-trust" -- i.e., non-zero information content (bad and good
surprises) and ... risk. Here, trust can be acquired and transferred, but
always tainted with information. Thus, trust must also be properly gauged
[8] as a function of risk/cost if it is to be properly used in the
soft-trust regime.
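
To fix ideas, here is a minimal sketch of my own of how such a
parametrized operator could be represented -- only an illustration with
names I invented, not the operator to be presented in [4] or in the MC
Standard:

    # Minimal sketch of mine, not the operator of [4] or the MC Standard.
    # t = transitivity, d = distributivity, s = symmetry, T = time
    # dependence (e.g. lack of synch). All-zero parameters give
    # "hard-trust"; any non-zero parameter gives "soft-trust".
    from dataclasses import dataclass

    @dataclass
    class TrustOperator:
        t: float = 0.0
        d: float = 0.0
        s: float = 0.0
        T: float = 0.0

        def regime(self):
            if (self.t, self.d, self.s, self.T) == (0, 0, 0, 0):
                return "hard-trust"   # no surprises, no risk, but isolated
            return "soft-trust"       # transferable, but with risk

    print(TrustOperator().regime())              # hard-trust
    print(TrustOperator(t=0.3, T=0.1).regime())  # soft-trust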

The above arguments already show several things:

- the basic inconsistencies of PGP [9], e.g. PGP enforces a model of
"hard-trust" with "trust is intransitive" to set up entries in the
web-of-trust, but uses "soft-trust" to maintain those entries, without
discussing its validity/gauge or allowing for time factors such as lack
of synchronization;

- the basic inconsistencies of X.509 [9], e.g. X.509 uses "soft-trust" to
impose a CA chain without discussing its validity/gauge (i.e., either you
accept trusting a CA you don't trust because that CA *was* trusted by a CA
you *once* trusted, or you are out).

Taking such a model of trust further, as will be presented in the
Meta-Certificate Standard, leads to what is called the "archetypical trust
model", as presented in [3] and in the MCG-FAQ. The concept of a "critical
radius of trust" is also derived from space and time considerations of
differently interacting agents, where the critical radius is the reach of
soft-trust at which risk and cost are equal.
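
As a purely hypothetical illustration of mine (the risk and cost curves
below are made up and are not from [3], [7] or the MCG-FAQ), if risk grows
with the reach r of soft-trust while the cost of extending that reach
falls, the critical radius is simply the crossover point:

    # Purely hypothetical illustration of mine; the curves are made up.
    def risk(r):
        return 0.1 * r ** 2        # assumed: risk grows with reach

    def cost(r):
        return 5.0 / (1.0 + r)     # assumed: cost falls with reach

    def critical_radius(lo=0.0, hi=100.0, eps=1e-6):
        """Bisection on risk - cost, assuming one crossover in [lo, hi]."""
        while hi - lo > eps:
            mid = (lo + hi) / 2
            if risk(mid) < cost(mid):
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    r_c = critical_radius()
    print(r_c, risk(r_c), cost(r_c))   # risk and cost meet at r_c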


Cheers,

Ed

References:

[1] Shannon, C., A Mathematical Theory of Communication, Bell System
    Technical Journal, vol. 27, pp. 379-423, July 1948.
[2] Shannon, C., Communication Theory of Secrecy Systems, Bell System
    Technical Journal, vol. 28, pp. 656-715, 1949.
[3] Gerck, E., Certification: Intrinsic, Extrinsic and Combined, MCG,
    http://mcg.org.br/cie.htm, 1997.
[4] Gerck, E., Generalized Certification Theory, to be published, 1998.
[5] Gerck, E., Trust Properties, MCG, http://mcg.org.br/trustprop.txt,
    1997.
[6] Gerck, E., Re: On the Nature of Trust, MCG,
    http://mcg.org.br/cgi-bin/lwg-mcg/MCG-TALK/archives/mcg/date/article-334.html,
    1997.
[7] Bohm, N., Authentication, Reliability and Risks, MCG,
    http://mcg.org.br/auth_b1.htm, 1997.
[8] 111229 Checking Validity, MCG, http://mcg.org.br/pub9x.txt, 1997.
[9] Gerck, E., Overview of Certification Systems: X.509, CA, PGP and SKIP,
    MCG, http://mcg.org.br/cert.htm, 1997.


******************

Cheers,

Ed

______________________________________________________________________
Dr.rer.nat. E. Gerck                     egerck@novaware.cps.softex.br
http://novaware.cps.softex.br
    --- Meta-Certificate Group member, http://www.mcg.org.br ---