Potential problems with trust

Circles is very exciting. Some variant of the web of trust approach seems like the most promising way to protect against Sybil attacks. I'm worried, though, that there are a couple of potentially bad incentives in the system as written. Apologies if I'm misunderstanding how Circles works.

  1. It seems that if a user X belongs to a group G, then anyone holding X-coins can convert them to G-coins, not just X. (This is what I gleaned from the comments, though it seems like this is not fixed in stone yet.) This creates an incentive to make a new account for each group that you want to join, i.e., X1 for group G1, X2 for G2, etc. These accounts can all trust each other, so they are jointly liquid, and they can claim multiple incomes. The normal disincentive against trusting these accounts does not exist, because their coins can never be worthless; they're worth at least as much as their corresponding group coins.
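The scheme above can be sketched as a toy model. Everything here is an illustrative assumption (the `Account` class, the income amount, the trust representation), not the actual Circles implementation:

```python
# Toy model of the one-account-per-group scheme described above.
# UBI_PER_PERIOD and all class/variable names are assumptions for illustration.

UBI_PER_PERIOD = 100  # assumed income each account claims per period

class Account:
    def __init__(self, name):
        self.name = name
        self.balance = 0
        self.trusts = set()  # names of accounts whose coins this account accepts

    def claim_income(self):
        self.balance += UBI_PER_PERIOD

# One attacker controls one account per group they want to join.
groups = ["G1", "G2", "G3"]
sybils = [Account(f"X{i + 1}") for i in range(len(groups))]

# The sybil accounts all trust each other, so their coins are jointly liquid.
for a in sybils:
    for b in sybils:
        if a is not b:
            a.trusts.add(b.name)

# Each period the attacker claims one income per account instead of one total.
for a in sybils:
    a.claim_income()

total = sum(a.balance for a in sybils)
print(total)  # 300: three incomes where an honest user would claim 100
```

The point of the sketch is just the last line: the per-account cost of a trust link never bites, because each sybil's coins are backed by its group's coins.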

  2. Another way to dodge the problem of trusting random accounts: make your ‘real’ account completely illiquid and stash all of your value on a ‘bank’ account that is trusted by no one else. No one can extract the value from the bank, because the only trust link to it is from your illiquid account. There’s also no disincentive to trust random accounts in order to claim the trustee bonus; in the event that they turn out to be bad, they can’t hurt you.
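A minimal sketch of this hedge, with assumed names and numbers (this is not how Circles represents balances, just an illustration of the exposure argument):

```python
# Illustrative sketch of the 'bank account' hedge described above.
# Coins on an account can only be extracted by someone who trusts that
# account; here only `real` trusts `bank`, so no outsider can reach it.

balances = {"real": 5, "bank": 995}           # almost everything parked in the bank
trusts = {"real": {"bank"}, "bank": set()}    # the only trust link to `bank` is from `real`

# If `real` trusts a bad actor, the most that actor can drain is whatever
# sits in `real`'s liquid balance; `bank` is unreachable from outside.
at_risk = balances["real"]
sheltered = balances["bank"]
print(at_risk, sheltered)  # 5 at risk, 995 sheltered
```

Under this model, trusting random accounts to collect the bonus is nearly free: the downside is capped at the small float kept on the liquid account.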

Hey Peter, welcome to the forum! Two very good questions that indeed “hit” some critical points.

First - yes - currently I think everyone holding X-coins should be allowed to convert them to Y-coins as long as X is a member of Y. So I think it will be the responsibility of groups to only allow people to join with their main (and only) account. Otherwise there will be inflation of the group's money and it will become worthless. One strategy for groups would be to only admit accounts that have a strong enough (personal) connection to existing members of the group. One example would be: 3 group members trust the new account. Or 10% of the group members.
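One way to express that admission heuristic in code. The thresholds (3 members, 10%) come straight from the post; combining them by taking the stricter of the two is my assumption, as are the function and parameter names:

```python
import math

def may_join(members, trusters, abs_min=3, fraction=0.10):
    """Admit a new account only if enough existing members trust it.

    `members`: set of current group member names.
    `trusters`: set of account names that trust the applicant.
    Requires the stricter of: `abs_min` members, or `fraction` of the group.
    (Combining the two rules this way is an assumption; the post lists them
    as alternatives.)
    """
    required = max(abs_min, math.ceil(fraction * len(members)))
    return len(members & trusters) >= required

members = {f"m{i}" for i in range(50)}  # a 50-member group: 10% -> 5 trusters needed
print(may_join(members, {"m1", "m2", "m3"}))                    # False: only 3 of 5
print(may_join(members, {"m1", "m2", "m3", "m4", "m5"}))        # True
```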

Very good observation.

First on this: both sides have to agree on the trust relationship, so you can't grab the bonus without the other side accepting it. And if you are very new to the system, you should only connect to people (and therefore give them your bonus) whom you really trust.

The general problem of making your account illiquid and thereby "hiding" from the responsibility of your connections: I think this can maybe be solved on a social level. If there were an easy-to-read measurement of how often your account is illiquid, people would maybe disconnect from you, or at least others would not connect to you in the future.
This is the "pressure" approach. Maybe there is also an incentive approach that encourages people to stay liquid. One approach would be that you earn a small fee (somewhere between 0.1% and 2%) every time someone you are trusting exchanges their money against yours. I am not sure about the global consequences of this, but it could incentivize people to provide a liquid account that is often used by others as a "hop".
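The hop-fee idea above might look like this. The 0.1%-2% band is from the post; the specific rate, function name, and bookkeeping are assumptions:

```python
# Sketch of the hop-fee incentive: when someone exchanges their coins
# against yours (using you as a trusted 'hop'), you keep a small fee.

FEE_RATE = 0.01  # 1%, inside the 0.1%-2% band suggested above (assumed value)

def exchange_through_hop(amount, hop_earnings):
    """Exchange `amount` of the sender's coins via a trusting hop account.

    Returns (amount_received, updated_hop_earnings). The hop keeps the fee,
    which only accrues while the hop's account is liquid enough to be used.
    """
    fee = amount * FEE_RATE
    return amount - fee, hop_earnings + fee

received, hop_earnings = exchange_through_hop(100.0, 0.0)
print(received, hop_earnings)  # 99.0 received, the hop earned 1.0
```

The design intent, as I read the post: the fee turns liquidity from a pure risk into a revenue stream, since an illiquid account can't be used as a hop and so earns nothing.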

This is interesting. I hadn’t thought of the role of inflation as an incentive for groups to be careful with their membership. I don’t think it applies in this case, though, since the total number of members in any given group is the same whether they are honest or pursue this scheme (one account per group). It would certainly cause inflation in the system as a whole, but it doesn’t directly cause inflation in any particular group’s currency. (I think.)
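The arithmetic behind this claim can be checked with a toy calculation (all figures are illustrative assumptions, not Circles parameters):

```python
# Back-of-the-envelope check of the claim above: the one-account-per-group
# scheme leaves each group's member count (and so its issuance) unchanged,
# while inflating the system-wide money supply.

UBI = 100        # assumed income per account per period
n_groups = 3     # groups each person wants to join
people = 10      # real people, all members of every group

# Per-group supply: each group has `people` member accounts whether those
# are honest main accounts or one-per-group sybils.
honest_group_supply = people * UBI
sybil_group_supply = people * UBI   # same membership count, same issuance

# System-wide supply: honest people hold one account total; the scheme
# creates one account per group per person.
honest_system_supply = people * UBI
sybil_system_supply = people * n_groups * UBI

print(honest_group_supply == sybil_group_supply)    # True: no per-group inflation
print(sybil_system_supply / honest_system_supply)   # 3.0: system-wide inflation
```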

Verifying that Martin Koeppelmann is a real person and has not already joined our group, while it has its own challenges, seems much easier than verifying all of those things and also verifying that there is no other instance of Martin Koeppelmann in any other group. It seems like doing this sort of verification might require a lot of cooperation between groups.

Out of curiosity, how do you see the group distribution in the final state of the system playing out if it is successful? Personal connections are viable up to a certain group size; do you see most of the value held by individuals being in the form of small-group coins?

This is an interesting answer, and it’s a layer I had not considered. I wonder if there is a metric for this that is sufficiently hard to game.

This is interesting. A transaction fee is nice in that it encourages a more densely connected graph; in fact, just the presence of a fee might eliminate the need for the trustee bonus, since longer trust chains are more costly to traverse. But I’m not sure it provides a big incentive for individuals to stay liquid. If I understand correctly, the consequences of trusting a bad actor can include basically losing all of my liquid assets; that’s a very big risk to run if I can avoid it.


More generally, have you thought about trying to prove (mathematically) some properties of the proposed system? Having some provable guarantees out there would lower the fear that there is some clever way to cheat the system.