TRANSCRIPT
iTrust: Encounter-Based Trust Recommendation System
Udayan Kumar and Dr. Ahmed Helmy
2
Encounter-Based Trust: iTrust
• Questions/Motivation
– Although these devices are so powerful, they do not fully utilize peer-to-peer interactions. Why?
– What can we do if we start leveraging the power of P2P communication in mobile networks? Applications abound.
3
Need for Trust
• To break psychological barriers to using ad hoc and peer-to-peer mobile services. ("I don't know anything about this device/user. Should I communicate?")
• Will people communicate without any knowledge of the other device?
• To bootstrap recommendation-, reputation-, or credit-based systems. ("I believe in yellow credit and you believe in green. I think green is fraudulent, so why should I trust you?")
4
Promise of iTrust
• To be a unified, communication-oriented trust recommendation framework for mobile devices
• To capture socially relevant trust information using the social-science principle of homophily [Mcp01]
• To allow proximity-based interactions unavailable in wired networks: out-of-band communication (can be secured using key exchanges [Che08])
• To encourage interactions in mobile societies and the adoption of new mobile services.
5
Illustration: iTrust
iTrust (to A): "I recommend Device B due to its high trust score. It has encountered you at …"
A (to B): "Hey B. Can we hang out?"
B (to A): "Hey A. Yes, why not!"
A and B perform an out-of-band key exchange: "Let's exchange keys."
iTrust helped A and B discover each other. They may turn out to be lifelong friends.
6
Definitions in Literature
1. Makes cooperative endeavors happen (e.g., Arrow, 1974; Deutsch, 1973; Gambetta, 1988)
2. Cooperating or task coordinating (e.g., Solomon, 1960)
3. Placing resources or authority in the other party's hands (Coleman, 1990; Shapiro, 1987a)
4. Being influenced by the other (e.g., Bonoma, 1976)
5. Committing to a possible loss based on the other's actions (Anderson & Narus, 1990)
6. Providing open/honest information (e.g., Mishra, 1993)
7. Entering informal agreements (Currall & Judge, 1995)
8. Increasing one's vulnerability (e.g., Zand, 1972)
9. Reducing one's control over the other (Dobing, 1993)
10. Risk taking (e.g., Coleman, 1990; Mayer, Davis & Schoorman, 1995)
11. Increasing the scope of the other person's discretionary power (Baier, 1986)
12. Reducing the rules we place on the other's behavior (Fox, 1974)
13. Involving subordinates in decision making (Carnevale & Wechsler, 1992)
[Figure: actual human trust as a function of context, social interactions (online social networks, real-world and face-to-face interactions), and similarity (homophily).]
7
Goals
• Stability – in trust recommendations
• Distributed Operation – in calculations
• Privacy Preservation – minimize the need for data exchange
• Accuracy – when measuring similarity
• Resilience – to anomalies such as artificially induced encounters
• Energy Efficiency
8
Architecture Overview
[Architecture diagram: Trust Adviser filters producing trust scores.]
9
Trust Adviser Filters
• Frequency of Encounter (FE) – encounter count
• Duration of Encounter (DE) – encounter duration
Proposed:
• Profile Vector (PV) – location-based similarity using vectors
• Location Vector (LV) – location-based similarity using vectors, by count and duration (privacy preserving)
• Behavior Matrix (BM) – location-based similarity using a matrix, by count and duration [HSU08]
(A sketch of the FE/DE filters follows this list.)
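The two baseline filters reduce to simple per-peer accumulators over an encounter log. A minimal sketch in Python, assuming a hypothetical Encounter record and fe_score/de_score helpers (none of these names come from the slides):

```python
from dataclasses import dataclass

@dataclass
class Encounter:
    peer_id: str       # identifier of the encountered device (illustrative)
    duration_s: float  # how long the encounter lasted, in seconds

def fe_score(log, peer_id):
    """Frequency of Encounter: how many times we met this peer."""
    return sum(1 for e in log if e.peer_id == peer_id)

def de_score(log, peer_id):
    """Duration of Encounter: total time spent near this peer."""
    return sum(e.duration_s for e in log if e.peer_id == peer_id)

# Example: three encounters with B, one with C
log = [Encounter("B", 120), Encounter("B", 600), Encounter("C", 30), Encounter("B", 45)]
print(fe_score(log, "B"), de_score(log, "B"))  # 3 765.0
```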
10
Filters
Profile Vector (PV):
• Each device maintains a vector for itself.
• Each cell represents a location (such as dorm, office) and stores the count/duration at that location.
• Example vector: L1 = 4, L2 = 32, L3 = 15, …
• A and B exchange their profile vectors for similarity calculations.
Location Vector (LV):
• Each device maintains a vector for itself and also creates and manages a vector for every user it encounters.
• The vectors for other users are populated only with the information the device has itself witnessed.
• No exchange of vectors is needed, so the filter is privacy preserving!
(A sketch of both vector filters follows.)
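A minimal sketch of the two vector filters, assuming cosine similarity as the similarity measure (the slides leave the exact measure unspecified) and dict-based vectors keyed by location labels; all names here are illustrative:

```python
import math
from collections import defaultdict

def cosine(u, v):
    """Cosine similarity between two location vectors (dicts: location -> count/duration)."""
    locs = set(u) | set(v)
    dot = sum(u.get(l, 0) * v.get(l, 0) for l in locs)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Profile Vector (PV): each device keeps one vector for itself and
# exchanges it with the peer to compute similarity.
pv_A = {"L1": 4, "L2": 32, "L3": 15}
pv_B = {"L1": 6, "L2": 20, "L3": 2}
print(cosine(pv_A, pv_B))

# Location Vector (LV): B additionally keeps one vector per encountered user,
# filled only with what B itself has witnessed -- no exchange needed.
lv_B = defaultdict(lambda: defaultdict(int))

def witness(observer_table, peer_id, location, duration_s):
    observer_table[peer_id][location] += duration_s  # B's local view of the peer

witness(lv_B, "A", "L2", 300)
witness(lv_B, "A", "L2", 120)
print(cosine(pv_B, lv_B["A"]))  # B compares its own vector to its local view of A
```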
11
[Figure: Behavior Matrix, one row per day (Day 1, Day 2, …, Day N) and one column per location; each cell holds the count/duration at that location on that day.]
Behavior Matrix (BM):
• Each device maintains a matrix for itself; each cell stores the count/duration at that location.
• The matrix is summarized using SVD, and A and B exchange their matrix summaries to calculate similarity (a sketch follows).
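A sketch of how the SVD summary and comparison might work, assuming the summary is the top-k right singular vectors of the days x locations matrix (a simplified reading of the behavior-matrix technique of [HSU08]); numpy is used for the decomposition:

```python
import numpy as np

def bm_summary(matrix, k=1):
    """Summarize a days x locations behavior matrix by its top-k right
    singular vectors (the dominant location-association patterns)."""
    # full_matrices=False keeps the decomposition compact
    _, _, vt = np.linalg.svd(matrix, full_matrices=False)
    return vt[:k]  # k x locations summary, small enough to exchange

def bm_similarity(summary_a, summary_b):
    """Similarity of two users' summaries: magnitudes of the inner products
    of their dominant behavior vectors, averaged over the k components."""
    return float(np.mean(np.abs(summary_a @ summary_b.T)))

# Each row is a day, each column a location (count or duration)
bm_A = np.array([[4, 32, 15], [5, 30, 10], [3, 28, 12]], dtype=float)
bm_B = np.array([[6, 20, 2], [7, 25, 1], [5, 18, 3]], dtype=float)
print(bm_similarity(bm_summary(bm_A), bm_summary(bm_B)))
```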
12
Combined Filter (H)
• The combined filter merges the trust scores from all the filters into one unified trust score:
H(U_j) = Σ_{i=1}^{n} α_i F_i(U_j), where α_i is the weight for filter F_i and n is the total number of filters.
• Different people may prefer different weights (observed from user feedback on the implementation); eventually the weights can be made adaptive.
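As a minimal sketch, H is just a weighted sum over the per-filter scores; the equal weights and the assumption that each F_i(U_j) is normalized to [0, 1] are mine, not the slides':

```python
def combined_score(scores, weights):
    """H(U_j) = sum_i alpha_i * F_i(U_j): weighted sum of the n filter scores."""
    assert len(scores) == len(weights)
    return sum(a * f for a, f in zip(weights, scores))

# Scores for user U_j from the FE, DE, PV, LV, BM filters (assumed in [0, 1])
scores = [0.8, 0.6, 0.9, 0.7, 0.5]
weights = [0.2, 0.2, 0.2, 0.2, 0.2]  # equal weights; users may prefer others
print(combined_score(scores, weights))  # 0.70
```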