
Re: Trust vs. accountability



On Tue, 17 Feb 1998, Bill Buffam wrote:

Bill:

I found that your msg touches a central point. I do not want to
monopolize the subject, but your question has a direct answer in the
context of the presented essay (its answer is also half mentioned there,
as you will see). 

-> This philosophical discussion is fascinating. But as someone interested
-> in building products, I'd like to see some discussion on the real world
-> application of these concepts. 

this is the motivation ;-)

-> It seems to me that the whole SPKI movement got started on the premise
-> that, for e-commerce, trust is pretty well irrelevant, and what you need
-> is accountability. Whereas in non-commerce environments like
-> interpersonal communication, and (the extreme example) warfare, what you
-> need is trust, and accountability is pretty well irrelevant.
-> 

The interplay between trust and accountability is sometimes difficult to 
delineate. But, here, logic can help.

Suppose you have the information that A is accountable on matters of X.
Can this information be trusted? If it cannot, then A's accountability is
of no use to you: before accountability can be relied upon, the
information that asserts it must first be trusted.

So, trust *is* the vehicle, the carrier, for accountability. In my example
to the list on the "abuse fee" I used it to show that a list user could be
made accountable even if not trusted but ... the example depended on the
list's moderator being trusted to do what was supposed to be done! Note
that the list moderator would have very low accountability in that case
(short of everyone deserting the list because ... he is not trusted). 

Bottom line here, as defined in the first posting on this thread, is that
trust is akin to a carrier of information -- which information can be: 
evidence, responsibility, validation, reliability, generalization,
uncertainty, consistency, truthfulness, accountability, legal reliance,
liabilities, warranties, ethics, monetary values, contract terms,
deals, a person's name, a person's DNA, fingerprints, a bank account
number, a public key, etc. 

So, even truthfulness depends on trust. So, trust is a basic property of
communication channels, like information. I could say, in a very broad
generalization, that "everything is information and rides on trust" ... 
which allows you to act or not on that information. So, this is a
second-order Information Theory -- in which you are no longer interested
only in *how much data* is being transferred over a channel, as measured
by the uncertainty of the party as to what the message will be. Rather, I
now focus on what is *essential* to that message but which cannot be
transferred using THAT channel -- which can be equally quantitative,
though subjective.
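
For concreteness, the first-order measure can be put into a few lines.
This is just an illustrative sketch (Python, with made-up probabilities):

    # Shannon's first-order measure: the receiver's average uncertainty,
    # in bits per symbol, about which message will arrive next.
    import math

    def entropy(probabilities):
        # H = -sum p*log2(p), in bits per symbol
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit per toss
    print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bit per toss

Note that nothing in this calculation says whether the received symbols
can be relied upon -- that is the part the second-order view is after.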

But, one may ask, can subjective measurements become objective? Let's see
the two cases:

For example, if I can define Information as "that which is unexpected" and
such a *subjective* definition makes sense (as it does) in the practical
and *objective* calculation of a microwave link, so that the channel
capacity can be correctly specified as a function of the bandwidth in MHz
and of signal and noise in dB, then I have managed to deal with what was
*important* at that level -- while successfully leaving such hard-to-define
and further subjective or inter-subjective concepts such as knowledge,
validity, evidence, responsibility, accountability, validation, legal
reliance, etc. all *OUT* of the definition of Information at THAT layer.
Even trust is out of that definition.
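
To put a number on it, here is a minimal sketch of that objective
calculation (Shannon-Hartley), again in Python and with hypothetical link
figures used only to show the arithmetic:

    # Channel capacity C = B * log2(1 + S/N), with bandwidth B in Hz and
    # S/N converted from dB to a linear ratio.
    import math

    def capacity_bps(bandwidth_hz, snr_db):
        snr_linear = 10 ** (snr_db / 10.0)
        return bandwidth_hz * math.log2(1 + snr_linear)

    # e.g., a 20 MHz channel at 30 dB signal-to-noise ratio:
    print(capacity_bps(20e6, 30) / 1e6)   # about 199 Mbit/s

The subjective definition of Information yields an objective engineering
figure -- and nothing of trust, validity, or legal reliance had to enter
it.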

The same reasoning is behind the definition of Trust. Such a *subjective*
definition allows mathematical rules to be *objectively* defined in order
to allow Information to be relied upon -- at the communication layer we
are working with. Of course, other components must be added if we want to
deal with someone in Burma, such as the laws in Burma, the present
commercial treaties, the current exchange rate, the dealer's
accountability, the bank's accountability, the insurance policy coverage,
etc. -- however, they are all *OUT* of the definition of Trust at the
communication layer.  Therefore, if we want to deal with Burma, then we
can add them, or Tanzania, or both, or whatever. 


-> Examples: in accepting a credit card payment, the merchant is not really
-> interested in considering whether he trusts the customer or not. He
-> cares that the credit card company makes good on the payment. 

Or, that he trusts the CC company to make good on the payment. The fact
that he does not have to trust the customer is simply a service provided
by the CC company, which he trusts to do it. 

-> (How many
-> sales clerks actually check your signature? How many would have the
-> moral courage to decline the sale if they smelled a rat?) The credit
-> card company also is less interested in "trust" than in quantifiable
-> risk: what is their expected profit from this customer, versus expected
-> risk. Does the risk profile for this customer fit the interest rate and
-> expected fraud model?

Trust and risk also have an interesting interplay. I copy from a msg on
the mcg-talk list:

"Higher trust means less perceived risk that some piece of data will turn
out to be different than expected, no? So... following the risk issue
touched upon by Bertus, let's see the issue from the point of view of risk
and check to see if we arrive at the same point by going around ..." 

and the example provides the same result. So, while risk is a counterpart
to trust, trust is not the same as risk, because: can you trust the
customer's risk profile to correctly represent the customer's perceived
risk? So, trust is a positive affirmation of capacity: semantic capacity,
while risk decreases semantic capacity.

So, it is perhaps clear: trust is akin to signal while risk is akin to
noise.  Trust is always tainted by risk, which is (bad) surprises or
information.  Signal is always tainted by noise, which is (bad, out of
phase) surprises or information.

[in-phase noise is not noise, in the same way that good surprises do not
enter into the calculation of risk -- see the small sketch after the list
below]

However, there are some differences (and that's why I wrote akin):

1. trust and risk are subjective while signal and noise are objective; 

2. trust and risk do not actually "exist" and are fully abstract concepts,
not something capable of precise empirical prediction or confirmation [1]. 
If they were empirically confirmed/determined, the prediction (for trust) 
or the hazard (for risk) would be a certainty, not trust or risk. In
general, trust and risk are defined in terms of a subjective forecast,
sometimes also inter-subjective, possibly aided or not by objective data
and mathematical tools, but always essentially "perceived" [1]. 

Thus, they are duals even in these aspects.
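
As for the bracketed remark above, here is a toy illustration (Python,
made-up numbers, not part of the essay's formalism): only the out-of-phase
part of a disturbance is counted as noise, just as only the bad surprises
are counted as risk.

    # A disturbance is split against the reference signal's phase: its
    # in-phase part adds to the signal (a "good surprise"), and only its
    # out-of-phase part degrades the signal-to-noise ratio.
    import cmath

    signal = 1.0 + 0.0j                      # reference signal, phase 0
    disturbance = 0.3 * cmath.exp(1j * 1.0)  # hypothetical disturbance, 1 rad

    in_phase = disturbance.real
    out_of_phase = disturbance.imag

    snr = (abs(signal) + in_phase) ** 2 / out_of_phase ** 2
    print(snr)   # about 21: only the out-of-phase part lowered it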

It is a trip into the looking glass ...

-> 
-> Extreme example: drug deals. No-one trusts anybody else. But plenty of
-> business gets done. Accountability is everything.
-> 

As above. You trust that accountability will be enforced. Trust is the
medium, accountability is the message. You trust the drug dealer to kill
you in case of fraud (because you saw it happen before) -- so, you know
(trust, loosely speaking, is what *you* know) that you are accountable.

BTW, that's why gangs go "trigger happy" with alarming frequency: to
show that they can be trusted by you (i.e., known to you) to enforce
accountability. 

-> Warfare: you make a binary decision on whether to trust the authenticity
-> of the message or not. It's all about trust. If the message is bogus,
-> and causes you to order the Charge of the Light Brigade, accountability
-> is irrelevant.
-> 

Again, trust is the "semantic vehicle" of useful information. So, bringing
trust to the forefront here is a natural thing. Accountability enters the
scene afterwards, on various levels -- including the psychological.

Thank you,

Ed

Reference:

[1] http://www.fplc.edu/RISK/vol1/fall/shraderF.htm

______________________________________________________________________
Dr.rer.nat. E. Gerck                     egerck@novaware.cps.softex.br
http://novaware.cps.softex.br
    --- Meta-Certificate Group member, http://www.mcg.org.br ---





