[HN Gopher] Fake_contacts: Android app to create fake phone cont...
___________________________________________________________________
 
Fake_contacts: Android app to create fake phone contacts, to do
data-poisoning
 
Author : karlzt
Score  : 322 points
Date   : 2021-02-27 17:08 UTC (5 hours ago)
 
web link (github.com)
w3m dump (github.com)
 
| vmception wrote:
| To everyone talking about Clubhouse: there isn't an Android
| version of Clubhouse, so this code isn't useful for that, as it
| is Android-only.
 
  | throwawei369 wrote:
  | TIL the vast userbase of HN is 95% Apple, 3% Android, 0.0005%
  | Pinephone. The remaining ~2% don't even make a digital
  | footprint, since they use old Nokia 3310-type phones.
 
| collaborative wrote:
| Phone numbers are too public. The reason they're used by
| messaging apps is that they are a goldmine to have. They actually
| make it harder to chat: ever tried using WhatsApp/Signal on a PC?
| You'll need to have it installed on your phone first (and have
| handed over your contacts).
| 
| That's why I chose to set (masked) emails as the primary id on
| groupsapp.online and even these can't be seen publicly unless you
| share a "group". Others will just see XXXX@gmail.com
 
| rasse wrote:
| This makes me wonder if anyone has set up canary emails or phone
| numbers in their phone contacts.
 
  | KirillPanov wrote:
  | The robocall epidemic has pretty much made the notion of
  | "canary phone numbers" useless.
 
  | praptak wrote:
  | What do you mean by "canary" in this context? How do you detect
  | that the canary is dead?
  | 
  | I assume that the "canary being dead" ~= "an adversary added
  | the contact to their watch list". But I don't think you can
  | detect that.
  | 
  | The best you could do is to add a random physical address
  | hoping that you can detect physical surveillance (which is
  | probably not realistic anyway).
 
    | rzzzt wrote:
    | It is like signing up with an e-mail +suffix for services, or
    | the non-existent streets on digital maps; if you come across
    | your fake contact elsewhere, you know that information has
    | been shared.
 
      | vmception wrote:
      | it is trivial to strip suffixes off of aliased email
      | addresses
 
        | rzzzt wrote:
        | What is the equivalent to that in the fake phone contacts
        | domain? I guess removing people with the +21 country code
        | would work for this particular approach, but
        | otherwise...?
 
        | vmception wrote:
        | Good question, hmm. I think it's just a different
        | strategy with phone contacts.
        | 
        | A data broker primarily wants the social graph, to build
        | a user profile keyed to a phone number and show ads later
        | on. Those people won't typically be texting or calling
        | you with spam and ads; they'll just match the number and
        | contacts up with information shared in other apps, so
        | that ads in your normal internet browser or in
        | ad-supported apps are more targeted.
        | 
        | So if an erroneous contact never logs in, that's of no
        | consequence to them; searching to exclude numbers would
        | be less interesting and less likely than just sanitizing
        | emails.
 
        | kogir wrote:
        | If you control your own email routing, by using your own
        | mail server, Google Workspace, Microsoft 365, etc, you
        | can choose whatever convention you want.
        | 
        | How would you know to strip everything after my first
        | name?
 
        | vmception wrote:
        | I wouldn't care about the people using their own mail
        | server
        | 
        | I would just strip everything after a + sign
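        | 
        | Something like this, roughly (a Python sketch, not any
        | broker's actual code):
        | 
        |       def strip_plus_alias(email):
        |           # user+tag@example.com -> user@example.com
        |           local, sep, domain = email.partition("@")
        |           if not sep:
        |               return email  # not an address, leave it alone
        |           return local.split("+", 1)[0] + "@" + domain
        | 
        |       assert strip_plus_alias("john+rsync@x.com") == "john@x.com"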
 
        | rsync wrote:
        | "it is trivial to strip suffixes off of aliased email
        | addresses ..."
        | 
        | This actually is not a bad point to make ... it would, in
        | fact, be simple to strip +aliases but ... economically I
        | don't think it makes any sense.
        | 
        | You'd have to have a high level decision maker dictating
        | an engineering fix in order to increase email
        | authenticity by ... .01% ?
        | 
        | ... and that assumes that the "engineers" down the chain
        | understand how '+' works in email to begin with and have
        | somehow communicated that back up to management.
 
        | vmception wrote:
        | My response here is that I think this discussion is
        | naive, as the data brokers themselves already do it.
        | 
        | So who cares about what some engineer at a random new
        | business thinks.
        | 
        | Aliasing isn't new. So this isn't a cat and mouse game
        | that just got started.
 
      | rasse wrote:
      | Exactly!
 
    | rasse wrote:
    | Detection would require a call/SMS/email. The idea would be
    | just to detect whether your leaked data has been acted upon.
 
  | rsync wrote:
  | "This makes me wonder if anyone has set up canary emails or
  | phone numbers in their phone contacts."
  | 
  | We (rsync.net) have a handful of dummy/fake users in our
  | database whose emails we monitor. The email addresses are
  | cryptic and random and use a different domain, etc.
  | 
  | We should never see an email sent to one of these "canary"
  | email addresses and, so far, we have not.
  | 
  | I am also aware that many of our customers sign up with
  | service-specific email addresses, using the '+' character ...
  | something like john+rsync@mydomain.com.
  | 
  | I personally have a rich and well developed pseudonym that I
  | use for all online non-governmental transactions but in some
  | rare cases I need to use my actual name and email - and in
  | those cases I create '+' aliases.
 
    | techsupporter wrote:
    | I've noticed a bunch of spammers starting to strip out
    | anything after the + and before the @. This is why I've long
    | used a catch-all e-mail domain (subdomain.example.net) where
    | I can put anything I want to the left of the @ sign and no
    | one is the wiser for my real e-mail address.
 
      | Answerawake wrote:
      | Is there some service where I can easily create unlimited
      | custom email addresses for a flat monthly fee? I want to
      | use a unique email for each new website/service. That would
      | go a long way to solving some data leak/privacy problems.
      | The problem with a custom domain is that I have to maintain
      | it, right? I want a service which I don't have to maintain.
      | I used to use new Yahoo accounts, but they are a hassle, and
      | recently they disabled free auto-forwarding.
 
        | wuuza wrote:
        | spamgourmet.com
        | 
        | I have been using this since 2002. You don't even have to
        | set anything up - just make up addresses on the fly. It's
        | pretty awesome.
 
        | [deleted]
 
        | rzzzt wrote:
        | Mozilla has such a service: https://relay.firefox.com/
        | 
        | I also remember seeing one Show HN recently that offered
        | similar functionality, but couldn't find it via search.
        | The problem is that if the e-mail alias provider becomes
        | popular enough, their subdomains are soon disqualified
        | from being used when registering to sites.
 
        | osamagirl69 wrote:
        | You can do this with any email provider that supports a
        | catchall. I personally use fastmail and have been very
        | happy with it. You don't need to 'create' the accounts,
        | you just set it up so that *@yourdomain.com goes to your
        | catchall. When signing up for a new service, you pick a
        | unique/random email. Then you know unambiguously where
        | each email in your inbox came from.
        | 
        | I personally use the website as the email (for example, if
        | HN required an email it would be hn@mydomain.com) to make
        | it easier to filter. But this can be gamed/guessed; to be
        | more secure it is better to generate an actual random
        | email for each site and store it in your password
        | manager.
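        | 
        | A tiny sketch of the idea (the domain and the per-site
        | record are just placeholders; any catch-all provider
        | works):
        | 
        |       import secrets
        | 
        |       def new_alias(site, domain="mydomain.com"):
        |           # random local part: can't be guessed from the site name
        |           return site, secrets.token_hex(6) + "@" + domain
        | 
        |       print(new_alias("news.ycombinator.com"))
        |       # e.g. ('news.ycombinator.com', '3f9c1a2b7d4e@mydomain.com')
        |       # -> store the pair in your password manager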
 
        | notfed wrote:
        | Protonmail allows wildcard emails from 1 custom domain if
        | you pay for the ~$5/mo plan. No maintaining a mail
        | server, just point your MX records to their servers.
 
        | blfr wrote:
        | Catch-all support starts at EUR8 at ProtonMail.
        | 
        | https://protonmail.com/pricing
 
  | arminiusreturns wrote:
  | I create a new email for most services I use (I run my own
  | email), but I hadn't thought of this! Thanks for the idea.
 
| _trampeltier wrote:
| I have no contacts at all on my phone; I created something
| myself. Now I think it would be funny to brute-force Android's
| contacts and just add every number from my country's phone
| providers :-)
 
| bschne wrote:
| The problem with this approach is twofold:
| 
| a) At the margin, a few people doing this does _nothing_ to mess
| with big companies' data collection & analysis. But opting out
| also has the same problem, obviously, so at least it's not doing
| worse.
| 
| b) In the absence of sandbox / selective sharing features like
| other commenters have mentioned, or you going so far as to _only_
| keep fake contacts in your phone, using this approach requires
| you to also share your actual contacts with the app, thus giving
| away PII of unconsenting third parties. Yes, I'd rather blame the
| app developers for collecting this data in the first place, but
| I'd still prefer not to give my contacts away whenever I can
| reasonably withhold them.
 
| tyingq wrote:
| BSD-style globbing is handy for this sort of thing. Like in Perl:
|       use File::Glob qw/bsd_glob/;
|       # brace expansion yields three strings here
|       my @list = bsd_glob('This is nested {{very,quite} deeply,deep}');
 
| crazygringo wrote:
| I don't see what the point is.
| 
| "Data poisoning" gives companies a bunch of fake contacts... on
| top of all your real ones?
| 
| Who cares? So they send some e-mails to addresses that don't
| exist or something? So it takes up an extra 1% of disk space in
| their database?
| 
| If you could share an empty address book then that would actually
| preserve the privacy of your contacts. But this doesn't do that.
| 
| I don't get it.
 
  | loveistheanswer wrote:
  | The vast majority of phone calls I receive are spam calls by
  | people/robocallers which I did not give my phone number to, but
  | apparently someone else did. I don't want people sharing my
  | phone number with random other people
 
    | crazygringo wrote:
    | Nobody had to give them your phone number.
    | 
    | They just dial numbers at random. Phone numbers aren't
    | sparsely distributed. There are entire area codes that are
    | essentially fully utilized.
 
  | [deleted]
 
  | shervinafshar wrote:
  | Not an expert on guerrilla cyber-warfare, but isn't that the
  | whole point of this sort of poisoning? If enough people do this,
  | the cost of those bouncing emails would become prohibitive. That's
  | my speculation. Would be great to know more from someone who
  | knows the domain better.
 
    | remram wrote:
    | Even if you make your contact list 99% bounces and 1% real
    | (and every user of the app does the same), I don't see how
    | this becomes a problem for the app's operator. Remove a
    | contact after 1-2 bounces and you're golden.
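    | 
    | The operator-side hygiene is roughly this (a sketch, all names
    | made up):
    | 
    |       from collections import Counter
    | 
    |       bounces = Counter()
    | 
    |       def cull_on_bounce(address, contact_db, limit=2):
    |           # drop the address once it has bounced `limit` times
    |           bounces[address] += 1
    |           if bounces[address] >= limit:
    |               contact_db.discard(address)  # contact_db: a set of addresses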
 
      | shervinafshar wrote:
      | Fair. Still golden if this needs to be done for all
      | contacts of all users?
 
        | remram wrote:
        | If they bounce they are extremely fast to cull.
 
    | dogman144 wrote:
    | Pretty nifty side point:
    | 
    | > If enough people do this the cost of those bouncing emails
    | would become prohibitive.
    | 
    | This idea got a ton of attention in the early days of tech and
    | led to what's known as proof of work: see Hashcash, and later
    | Bitcoin. The primitives of BTC show up in a lot of interesting
    | areas.
 
    | jmatthews wrote:
    | You don't bounce emails; you prebounce them and clean up your
    | list. This is part of any sensible data engineer's process.
 
      | jmatthews wrote:
      | More helpfully, salting their DB with real emails but fake
      | contact info requires a more durable hygiene process, and
      | that often isn't worth the effort for data-driven shops.
      | 
      | You serve a variety of email domains that validate as
      | deliverable, then you accept emails and report the sender,
      | which hurts their deliverability.
 
      | TedDoesntTalk wrote:
      | what is prebounce?
 
        | shervinafshar wrote:
        | My guess was they are referring to one of these services
        | that check the validity of any email address. A false
        | signal from one of these services prevented me from
        | signing up for some random website with a .name domain
        | the other day.
 
  | cyral wrote:
  | It will be interesting to see if these fake contacts show up in
  | a leak somewhere someday. Almost like how people use
  | myname+yourcompany@gmail.com, we could create similar fake
  | contacts to see who is selling or leaking data.
 
    | shervinafshar wrote:
    | *@myname.name
 
| [deleted]
 
| GekkePrutser wrote:
| I wonder if this works at all..
| 
| These companies simply use your contacts to do contact mapping to
| other users. Including fake ones will do nothing as they don't
| point anywhere. Big Data will just filter them out.
 
| antihacker_team wrote:
| Vulnerabilities research. OWASP, PTES, NIST SP800-115. You pay
| only for the found bug, depending on the criticality. Over than 6
| years of experience. email: info@antihacker.pw
 
| yalogin wrote:
| This is not achieving anything positive. I don't know which
| privacy threat it's fixing, other than adding a new app into the
| mix that could at some point in the future suck up the contacts
| itself :)
 
  | alcover wrote:
  | That is pretty insightful. Do you have an email?
 
| nvoid wrote:
| I was looking through my contacts the other day, deleting some
| people I don't speak to any more. It's interesting that with 5 or
| so unique-enough contacts I could be identified. If they were
| sufficiently unique, no one else in the world could possibly know
| all 5 of those people. Scary thought.
 
| 0df8dkdf wrote:
| That is why we should have a custom app for contacts, with custom
| encryption (like KeePass), to store our real contacts, so that no
| app, nor Apple or Google, has access to them.
| 
| For some people, like political or activist fundraisers, contact
| privacy is of the utmost importance. In fact, some of them still
| keep contacts on a Rolodex and will not put any of that into
| digital form. And as a software developer I actually support that
| tremendously.
 
| allurbase wrote:
| Take me to your leader.... I don't like thieves!
 
| bredren wrote:
| Clubhouse requires contact list in order to get invites, which
| are required to sign up right now.
| 
| I get why they are doing this, and it caused me to share my
| contacts with them.
| 
| However, I resented it and it put me immediately in a defensive
| posture with the product and company.
| 
| There is no possible way to trust a company with your contact
| list, and Apple should make it work the way Photos does now, where
| you can select which data to share. There are some folks I don't
| even want to possibly find in a social app.
 
  | paul7986 wrote:
  | I would never sign up for or use a service that has such an
  | invasive requirement. I only use my Google Voice number for any
  | kind of public transaction, even dating. Spam and robocall that
  | all you want; surprisingly, I have never received many such
  | calls.
 
  | [deleted]
 
  | lucb1e wrote:
  | They did something bad and yet here we are. I don't know what
  | Clubhouse is, but I'm somewhat tempted to look it up.
  | Marketing: successful. (I won't, in an attempt to counter that
  | effect of growing due to negative publicity, but I find it
  | noteworthy how well it works.)
 
  | antipaul wrote:
  | App Store guidelines forbid using the Contacts for anything
  | except the intended purpose:
  | https://appleinsider.com/articles/18/06/12/apple-disallows-d...
  | 
  | Do we give CH the benefit of the doubt =p ?
  | 
  | In any case, I also hope (and expect) Apple to implement better
  | controls for sharing contacts.
  | 
  | EDIT: Typo
 
    | koboll wrote:
    | Huh, so Clubhouse is explicitly breaking Apple's rules.
    | 
    | Surely Apple knows this, but is allowing it because... it's
    | mega-popular?
    | 
    | What's the point of having rules if clawing your way to
    | popularity by leveraging their violation is deemed
    | permissible?
 
      | jtsiskin wrote:
      | How are they breaking the rules? It seems like they are
      | using it for the same purpose they prompt permissions for.
 
        | csommers wrote:
        | They create ghost profiles for those contacts, just in
        | case that user ever signs up. That's fucking garbage and
        | they should be ashamed of doing that, let alone
        | immediately removed from the App Store.
 
        | pizza wrote:
        | This seems like the shady type of thing lawmakers should
        | pass laws against
 
        | bredren wrote:
        | I see this as a key problem of our times. Social
        | convention used to have a stronger impact on behavior.
        | Now it isn't enough for behavior to be disdained, it must
        | be flagrantly illegal.
 
        | lupire wrote:
        | When was the time when social convention had a stronger
        | impact?
 
        | withinboredom wrote:
        | Social convention has always had a strong impact. Where I
        | live, people will cut you in line if you leave a space
        | big enough for them to fit and it's perfectly ok. Where
        | I'm from, if someone did that, you'd end up with an angry
        | mob and probably a fist in your face.
        | 
        | Social conventions are always stronger than the law; at
        | least in person.
 
        | pizza wrote:
        | Growth by any means necessary... it seems like there are
        | tens of thousands of apps that each act like their own
        | data bureau, compiling dossiers on billions of people,
        | just because it makes money. Maybe a few percentage
        | points' value lost as a slap on the wrist every now and
        | then. I feel that in this scenario, rather than a better
        | carrot, we need a better stick...
 
        | lupire wrote:
        | It's the sort of thing Apple should ban to protect its
        | claim that it is more private than Android.
 
  | [deleted]
 
  | TedDoesntTalk wrote:
  | I mean, unless you're a newbie to the internet, how is this
  | possible?
 
  | hshshs2 wrote:
  | please reconsider doing this next time if you're able to
 
  | styfle wrote:
  | I was kinda confused at first to see the top suggestions were
  | all Doctor's Offices. Then I figured it out.
  | 
  | https://twitter.com/styfle/status/1358186671007760385
 
  | woadwarrior01 wrote:
  | I have an old iPhone with an empty address book for testing
  | dodgy apps that require contacts access, I use that for sending
  | Clubhouse invites. OTOH, Clubhouse seems to work fine on my
  | primary
  | phone, where I haven't given it contacts access.
 
    | Haemm0r wrote:
    | For Android I can recommend "Shelter"[1], which lets you set
    | up a work profile so you don't have to share your contacts,
    | files, etc. Downside: if you already have a work profile, it
    | does not work (Android allows only one work profile).
    | 
    | [1] https://f-droid.org/en/packages/net.typeblog.shelter/
 
      | IG_Semmelweiss wrote:
      | Nice find.
      | 
      | Is there a list of known existing "big brother" apps? Or is
      | it just as good to look at app permissions to figure this
      | out?
 
    | lupire wrote:
    | If your invitees don't also have a spare iPhone, what's the
    | point of inviting them? They'll have the same problem with no
    | workaround?
 
      | woadwarrior01 wrote:
      | You don't need to grant the Clubhouse app access to your
      | contacts to use it. ATM, that's only needed to invite
      | people.
 
  | _jal wrote:
  | Clubhouse can bite me.
  | 
  | I refuse to use tooling from shitbags who try to extort me into
  | compromising others' privacy for shiny toys.
  | 
  | I know other shops do it, as if that makes it OK.
 
    | Aerroon wrote:
    | I remember signing up for Facebook back in the day. They
    | tried to get me to share something about my email contacts
    | list. That just made me not use Facebook instead.
    | Unfortunately, everyone else didn't seem to have a problem
    | with it.
 
      | [deleted]
 
      | 0x0 wrote:
      | Facebook literally had a box on their web site asking for
      | your email address and _email account password_, so they
      | could _log in to your webmail_ and scrape your contacts.
 
      | bonoboTP wrote:
      | Normal people value their social standing and their
      | relationships, bragging rights etc. higher than abstract
      | principles. It's only loners who will resist. Popular
      | people will be on board because they manage their brand and
      | image instinctively. Wannabe popular too.
 
    | f430 wrote:
    | Server is in the People's Republic of China to boot. But I
    | know we have many wumaos and apologists here on HN because
    | they tasted blood money.
 
  | the-dude wrote:
  | What about dividing your contacts into _circles_ and only give
  | permission to a specific set?
 
    | mandelbrotwurst wrote:
    | Sure, as long as it's possible to create a circle containing
    | only one contact, the way giving permission to access photos
    | now works on iOS.
 
  | post_break wrote:
  | I mean this is why they do it. You knew it was wrong, you knew
  | they were going to take that data and mine it, and you still
  | said sure.
 
    | ganstyles wrote:
    | Correct. I've been a member for going on a year now and I
    | have scores of invites I don't appear to be able to send
    | because I won't share my contacts. Not that I care enough to
    | invite people, but it's a dark pattern to even require it.
    | 
    | I have heard there's a way to share invites without sharing
    | contacts, but I haven't cared enough to even do a cursory
    | search on that.
 
      | chipsa wrote:
      | unsync your contacts from whatever service provider you're
      | using, make sure they're gone, go ahead and share the
      | contacts (which are now empty) with Clubhouse, get the
      | invites, then revert everything back?
 
    | kzrdude wrote:
    | They do it because all the successful social apps need to
    | make contact discovery easy. The ones that don't use this
    | trick (the ethical ones) we don't hear so much about; maybe
    | they don't succeed.
 
      | forgotmypw17 wrote:
      | There are quite a few that have not done it. I don't think
      | it's necessary for success at all.
      | 
      | HN seems to be doing pretty well, and it's never done this
      | sort of thing, as far as I know.
      | 
      | Reddit never did it during their growth phase, instead they
      | provided their own seed content.
      | 
      | Metafilter has never done anything unethical to my
      | knowledge.
      | 
      | There are many, many successful social networks which have
      | not performed unethical contact harvesting and other shady
      | things.
 
        | skinnymuch wrote:
        | Clubhouse raised money at a billion dollar valuation.
        | Hacker News specifically and Metafilter aren't in the
        | same stratosphere
 
        | forgotmypw17 wrote:
        | What are you trying to say?
        | 
        | That because they have a lot of VC money riding on it,
        | they have to do "growth hacking" in order to justify the
        | funds and grow quickly enough to satisfy the investors?
        | 
        | Well, I guess I have to agree.
 
        | lupire wrote:
        | But still, why do they need to steal your address book?
        | They can _offer_ to spam your contacts without demanding
        | it. Is the profit contingent on selling the address book
        | data? To the point where they won't let you _invite more
        | people_ (help them grow!) without it?
 
    | arkitaip wrote:
    | Fomo is a helluva drug.
 
    | jancsika wrote:
    | Are you writing that to emphasize the urgency for the
    | government to pass legislation to rein in unregulated online
    | casinos as they continue refining their dark patterns? (I.e.,
    | without legislation, these companies will continue finding
    | more and more sophisticated ways to get the user to act
    | against their own interest.)
    | 
    | Or do you mean to imply that a practical approach to reining
    | in unregulated online casinos is to spread the message of "Just
    | Say No," in web forum comments to the ostensible addicts?
    | 
    | Or to be fair, something else entirely? My point is I can't
    | tell without context there whether you are sympathizing with
    | the user ("ah yes, something needs to be done because they've
    | found your weak spot"), or chastising them for not having the
    | force of will to resist dark patterns.
    | 
    | Edit: clarification
 
      | DevKoala wrote:
      | Not the poster you are replying to, but I stopped feeling
      | empathy for people who complain about lack of privacy, yet
      | willingly give up their data to non-essential services that
      | ask for it with all the proper disclosures.
      | 
      | If you agreed to share all your contacts to listen to
      | "musical tweets", I don't see why you'd be complaining.
      | You willingly made a trade-off.
 
        | WA wrote:
        | ... willingly give up other peoples' data.
 
        | bonoboTP wrote:
        | Social status is a hell of a drug. Clubhouse is a place
        | where people like Elon Musk and famous successful
        | scientists and businesspeople hang out so all the hustler
        | startup get-rich people want to be on board. It's
        | exclusive, it's just for fancy iPhone users. Finally an
        | elite place where you can only get in by invite, most
        | cannot resist. If they miss out on the bandwagon, how can
        | they call themselves an early adopter on the bleeding
        | edge? What will their friends think of them? Almost as if
        | they used Android or something.
 
    | toss1 wrote:
    | And this tells me that there is a need for another step up
    | for this app - to not only poison the contacts, but to
    | temporarily 1) backup => 2) delete => 2a) share poisoned list
    | => 3) restore contacts.
    | 
    | So we can share the list, but they'll never get our real
    | contacts, only trash data. If enough people use it, maybe
    | they'll stop.
 
      | a3n wrote:
      | But wouldn't this company have to periodically review your
      | contacts, to slurp up new ones?
 
        | toss1 wrote:
        | Yup, probably their next move would be to require
        | constant access to contacts list and check whenever the
        | app runs.
        | 
        | The next move on this side would be to keep contacts in a
        | separate app from the std Android/Apple app, and then
        | have to make calls, texts, etc. from there.
        | 
        | If only there weren't so many sociopaths running these
        | companies... sorry, wrong planet
 
    | bogwog wrote:
    | In my case, I don't even remember giving them permission to
    | use my contacts, yet I got accepted because one of my
    | contacts sent me an invite.
    | 
    | I might have given them permission without realizing it, but
    | what could've also happened is that they saw my phone number
    | in someone else's contact list, and assumed we were contacts.
 
      | evanmoran wrote:
      | You probably didn't share, as I didn't. I believe the
      | contacts permission is only required if you want to share
      | an invite, not to accept one.
 
    | tonylemesmer wrote:
    | That means that, more than likely, Clubhouse has our details
    | even if we have no desire to be part of it.
 
      | srockets wrote:
      | It'll be fun once they have an EU presence.
 
  | JumpCrisscross wrote:
  | > _Clubhouse requires contact list in order to get invites,
  | which are required to sign up right now_
  | 
  | How is this GDPR compliant?
 
    | pmontra wrote:
    | I see the point, but if I upload my contact list, is the non-
    | compliance mine (I didn't ask permission from each one of my
    | contacts) or Clubhouse's (they asked me to do it)?
 
      | avereveard wrote:
      | both, yours for sharing, clubhouse's for storing.
 
      | gnud wrote:
      | It should be glaringly obvious to Clubhouse that they don't
      | have the right to even store most of this data, let alone
      | use it for anything.
      | 
      | So even if you are at fault, I can't imagine that would
      | help them a lot, if some data protection authority looked
      | into this.
 
    | corty wrote:
    | > How is this GDPR compliant?
    | 
    | It isn't, really, but the question of whom to prosecute is
    | complicated. Clubhouse gets the contact list data from you,
    | the user. Usually, somewhere in the ToS, there is a little
    | thing where you confirm to have the right to share all the
    | data you share with Clubhouse. That means that first and
    | foremost, you as a user are responsible.
    | 
    | If you are a non-commercial user using Clubhouse from your
    | private phone, what you do with your private contacts isn't
    | covered by GDPR, private stuff is an exception. However, as
    | consumer, European legislation protects you from surprising
    | and unusual terms, which this might be. Legislation might
    | also protect all your contacts. However, this is a question
    | that still needs to be litigated in court, and I don't
    | remember any decisions around that problem (WhatsApp is
    | basically in the same situation).
    | 
    | If you are a commercial user, because this is your work phone
    | and your contacts are colleagues, business partners,
    | customers, things are quite different. You are, as a data
    | processor, responsible for how you pass on your contact list.
    | You better make sure that you are allowed to do that (because
    | you have a GDPR-compliant reason like legal obligation,
    | contractual obligation with your customer, consent or
    | legitimate interest) and that your contacts have been
    | informed about what you are doing beforehand. Also, you then
    | need a written contract with Clubhouse about the data being
    | passed along, about how it will be used and protected, etc.
    | Also, passing along the contacts to Clubhouse must be
    | necessary for a predetermined, well-defined reason that can
    | be considered more important than your contacts' right to
    | privacy.
    | 
    | So as a private person, you might get away with using
    | Clubhouse. As a company, employee, self-employed, state
    | official, whatever, you are probably in hot water, because
    | surely you didn't do all the required things. But for
    | Clubhouse this might not be a problem, because as current
    | case law stands (imho, iirc, ianal, ...) Clubhouse isn't the
    | party that did something wrong there.
 
      | GekkePrutser wrote:
      | On Android if you use Work Profile your work contacts are
      | in a separate partition and can only be accessed by
      | approved company apps. This works really well for GDPR
      | compliance with dual-use (company & personal) devices.
 
    | msla wrote:
    | Because it's a non-EU company, and non-EU citizens didn't
    | vote the GDPR into existence.
    | 
    | Europe doesn't get to impose its law on other lands.
    | Colonialism is over.
 
    | numpad0 wrote:
    | Why would you want to be GDPR compliant?
 
      | marban wrote:
      | https://www.jdsupra.com/legalnews/clubhouse-app-faces-
      | court-...
      | 
      | On a side note, Germans are obsessed with Clubhouse.
 
      | bdcravens wrote:
      | To avoid substantial financial risk.
 
        | calvinmorrison wrote:
        | Has the EU sued and won against any company who is not
        | located in the EU?
 
        | otterley wrote:
        | That's not a good test, because the law is still
        | relatively new, and it takes a while for litigation to
        | make its way through the system. We also don't
        | necessarily know who has settled out of court.
        | 
        | Would you like to be a test case for us?
 
      | drclau wrote:
      | Because in the European Union it is a regulation, and you
      | (as a company) are fined if you are not compliant.
      | 
      | I recommend having a look over the Wikipedia page on the
      | subject:
      | 
      | https://en.wikipedia.org/wiki/General_Data_Protection_Regul
      | a...
 
        | fiddlerwoaroof wrote:
        | If you're not subject to the EU (I.e. don't have any
        | offices, servers, etc. in the EU) I don't see how the
        | GDPR is relevant: non-EU citizens generally aren't
        | subject to the laws of the EU.
 
        | ekianjo wrote:
        | If some of your users are in the EU you need to be GDPR
        | compliant.
 
        | fiddlerwoaroof wrote:
        | This is what the law says, but I don't understand how
        | this is expected to work: without some kind of treaty
        | from the US government, the EU has no way to make US
        | companies comply.
 
        | anonymousab wrote:
        | There's a slew of individual things that can be done. EU
        | companies can be prevented from doing business with a
        | (willfully) noncompliant company. Wire transfers going
        | through the EU and other operations can be blocked. And,
        | of course, the service itself, its apps, its sites, its
        | traffic, can be blocked from accessing the EU internet
        | (or being accessed from it).
        | 
        | That's not even getting into international pressure
        | levers.
        | 
        | I don't know that we've seen any of those kinds of
        | actions yet, but they're clearly on the table if a
        | company breaking the rules became a real "problem".
        | 
        | The thing is, if you're just completely avoiding doing
        | any business with the EU, having any EU customers or
        | users, and just not touching the EU with a 1000 mile pole
        | and avoiding the GDPR in such a fashion - well, then
        | there's no reason to go after you. The legislation has
        | done its job.
 
        | philwelch wrote:
        | > And, of course, the service itself, its apps, its
        | sites, its traffic, can be blocked from accessing the EU
        | internet (or being accessed from it).
        | 
        | In other words, the EU can attempt to extend its internet
        | regulations over the rest of the world by implementing a
        | China-style firewall. Well, we'll see if that happens.
 
        | gabaix wrote:
        | It is more akin to the US Sanctions. You don't have to
        | abide. If you do trade with sanctioned countries, you
        | should not do any kind of business with the US, or pay a
        | hefty penalty.
        | 
        | Here's a case example, BNP Paribas dealings with
        | sanctioned countries. https://www.wsj.com/articles/bnp-
        | agrees-to-pay-over-8-8-bill...
 
        | mattmanser wrote:
        | Have you not heard of extradition treaties?
        | 
        | For example, that's what the US is using on Kim Dotcom.
 
        | sneak wrote:
        | The US and EU have a treaty specifically about enforcing
        | each other's laws. (More accurately, the nations that
        | comprise the EU are individual signatories to such
        | treaties.)
 
        | fiddlerwoaroof wrote:
        | Source? This lawyer seems to think that there's no
        | applicable treaty.
        | 
        | https://tinyletter.com/mbutterick/letters/you-re-not-the-
        | bos...
 
        | sneak wrote:
        | Here's the one between the US and the largest economy in
        | the EU:
        | 
        | https://www.congress.gov/treaty-document/108th-
        | congress/27
 
        | corty wrote:
        | There is no legal mechanism, because such mechanisms
        | exist mostly for criminal law and for civil and public
        | debt collection. So the EU maybe cannot use most of the
        | enforcement mechanisms,
        | except one: You can be fined some amount of money,
        | creating a public debt which can then be collected if
        | there is a treaty about such collections.
 
        | alvarlagerlof wrote:
        | If you're operating a business that interacts with
        | customers in the EU, GDPR applies.
 
        | fiddlerwoaroof wrote:
        | The EU says it applies but, AFAICT there's no legal
        | mechanism by which it applies.
        | 
        | Here's a lawyer's take on this:
        | https://tinyletter.com/mbutterick/letters/you-re-not-the-
        | bos...
 
        | TedDoesntTalk wrote:
        | I thought US companies had to agree to Privacy Shield if
        | they wanted to be considered GDPR-regulated.
        | 
        | https://www.privacyshield.gov/welcome
        | 
        | Why any US company would voluntarily agree to this is
        | beyond me, unless one of its EU customers insisted on it.
 
        | malka wrote:
        | Then you cannot have EU customers. Or make wire transfers
        | through the EU.
 
        | Moru wrote:
        | You can also forget vacation trips in EU.
 
        | numpad0 wrote:
        | If thoroughly enforced, which is currently not the case.
 
        | drclau wrote:
        | "The GDPR also applies to data controllers and processors
        | outside of the European Economic Area (EEA) if they are
        | engaged in the "offering of goods or services"
        | (regardless of whether a payment is required) to data
        | subjects within the EEA, or are monitoring the behaviour
        | of data subjects within the EEA (Article 3(2)). The
        | regulation applies regardless of where the processing
        | takes place. This has been interpreted as intentionally
        | giving GDPR extraterritorial jurisdiction for non-EU
        | establishments if they are doing business with people
        | located in the EU."
        | 
        | Source: https://en.wikipedia.org/wiki/General_Data_Protec
        | tion_Regula...
 
        | msla wrote:
        | Countries or groups of countries don't get to impose
        | their law on other countries.
        | 
        | That's called colonialism, and Europe is supposed to have
        | given it up.
 
        | drclau wrote:
        | I am not a lawyer, and I don't claim I understand the
        | legal mechanisms involved. I don't even claim GDPR is
        | perfect.
        | 
        | But, as I see it, EU is protecting its citizens. If you
        | want to do business with EU citizens you must abide by EU
        | regulations. It's that simple. I don't get how this came
        | to be all of a sudden about colonialism. Any business is
        | free to stay out of EU.
 
        | msla wrote:
        | > If you want to do business with EU citizens you must
        | abide by EU regulations.
        | 
        | No, no more than if I want to do business with Saudis I'm
        | liable for punishment if I drink a beer.
 
        | drclau wrote:
        | But that's not really a good analogy (not that analogies
        | are proof). A better analogy would be you selling beers
        | in Saudi Arabia.
        | 
        | I urge you to read this, it should clarify things:
        | 
        | Applicability outside of the European Union:
        | 
        | https://en.wikipedia.org/wiki/General_Data_Protection_Reg
        | ula...
 
        | cortesoft wrote:
        | And any EU citizen is free to not do business with a
        | company outside the EU.
        | 
        | Do you think the EU laws should apply to people selling
        | things to EU citizens while they are on vacation in other
        | parts of the world? If someone from Germany travels to
        | Brazil and buys something from a store, are they required
        | to abide by EU rules?
        | 
        | If someone from the EU leaves the EU digitally to buy
        | something in another country, it isn't up to the seller
        | to enforce EU rules.
        | 
        | Unless you have an entity (either yourself or your
        | business) under EU jurisdiction, you don't have to follow
        | their rules.
 
        | drclau wrote:
        | There's an asymmetry of information and power in the
        | relationship between a business and a citizen.
        | Governments, generally, attempt to mitigate this
        | asymmetry. Hence, we have consumer protection laws, GDPR
        | and the likes.
        | 
        | While these solutions may be incomplete, or imperfect,
        | having none is definitely worse.
        | 
        | > If someone from the EU leaves the EU digitally to buy
        | something in another country, it isn't up to the seller
        | to enforce EU rules.
        | 
        | > Unless you have an entity (either yourself or your
        | business) under EU jurisdiction, you don't have to follow
        | their rules.
        | 
        | Please _do_ read the link I already posted in a previous
        | comment [0]. It clarifies many things, but I don't want
        | to paste too much content here.
        | 
        | [0]: https://en.wikipedia.org/wiki/General_Data_Protectio
        | n_Regula...
 
        | fiddlerwoaroof wrote:
        | This article basically confirms my suspicion that this
        | provision is basically unenforceable:
        | 
        | http://slawsonandslawson.com/article-32-the-hole-in-the-
        | gdpr...
 
        | cortesoft wrote:
        | I am not sure what you are trying to argue here. I am not
        | making any moral claim about whether a GDPR-type
        | regulation is good or bad. I am simply saying that the EU
        | saying the law applies outside their borders doesn't make
        | it so.
        | 
        | If I am a US citizen living and working in the US, and
        | break the GDPR by storing data illegally from visitors to
        | my website from the EU, the EU can certainly try to fine
        | me or issue a summons or whatever they want to do.
        | 
        | However, there exists no extradition treaty for this law,
        | and there would be no way for the EU to enforce
        | judgement.
 
        | mellavora wrote:
        | I wonder when the USA will follow suit?
 
        | [deleted]
 
    | paulie_a wrote:
    | They are in California. They can give the finger to the GDPR.
    | It's irrelevant to most people in the world.
    | 
    | People tend to forget that it is not applicable. For instance,
    | nothing I build will ever comply with it, regardless of users
    | that might be in Europe.
    | 
    | Clubhouse has no duty to obey European law.
    | 
    | The question is: why do you think they need to be compliant?
 
      | GekkePrutser wrote:
      | This is not how it works. If you make it available to EU
      | users, you have to comply with GDPR (at least when it comes
      | to those users' data).
      | 
      | For the same reason WhatsApp's new T&Cs don't really change
      | anything for EU users.
      | 
      | However I don't think the collection of contacts is
      | actually illegal under GDPR, considering WhatsApp does
      | exactly this too. And it's huge in Europe, much bigger than
      | in the US. If they haven't gone after WhatsApp for this,
      | they will probably not do so for Clubhouse.
 
        | paulie_a wrote:
        | If they don't do business there, they don't have to
        | comply. Making it available doesn't count.
        | 
        | Just like I don't have to comply if I have EU users on a
        | service; I am in the United States. Europe cannot enforce
        | its laws here. It's just the same as if Saudi Arabia
        | tried to enforce its laws here. They carry no weight.
        | 
        | That is what makes the GDPR insignificant. It applies to
        | Europe, not the rest of the world. The cookie warnings
        | for the vast majority of the internet are stupid and
        | unnecessary.
        | 
        | So call it illegal in Europe, but who cares?
        | 
        | It honestly is maddening how many people care about the
        | GDPR that don't need to
 
        | GekkePrutser wrote:
        | There's many EU things that take effect with vendors
        | outside the EU. Like software sales: Try to buy a license
        | for a software package from the EU (or with an EU payment
        | card) and you will always be hit with VAT at the rate of
        | your country :( Even if the company is US based only.
        | With the exception of really small ones I guess. In the
        | above case it's annoying for us :) But in the case of
        | GDPR it's good IMO.
        | 
        | Anyway the EU says it applies but I agree they don't
        | really have much in the way of enforcement capability
        | with companies that have no presence here. Though they
        | could ask Apple/Google to remove it from the store I
        | suppose.
        | 
        | And of course most companies do have a presence here. All
        | multinationals do, and even the smaller ones. Even if
        | it's just a sales office.
 
        | paulie_a wrote:
        | Most American companies don't though. They can safely
        | ignore european laws
 
        | TT3351 wrote:
        | And they also choose not to operate in the nations whose
        | laws they are flouting, in most cases. EDIT: a few weeks
        | ago EU posters here were describing how ERCOT was
        | preventing access to the company's _public facing
        | website_, citing not wanting to comply with GDPR.
 
    | vmception wrote:
    | I think this is a wording issue if you haven't used
    | Clubhouse.
    | 
    | You don't need to share contacts in order to get invit _ed_,
    | like you don't have to do it to use the platform. You have to
    | do it to invite others (like your friend that you told about
    | Clubhouse) after you are already on the platform, so that is
    | not regulated by GDPR.
    | 
    | It is a shitty user experience and I also want Apple to
    | control this at the OS level. Let me select which contacts if
    | I want to do it at all.
 
  | satya71 wrote:
  | Here's how to get around Clubhouse uploading contacts. We
  | shouldn't have to do this, but here we are.
  | 
  | 1. Disable contacts for all your configured accounts.
  | 2. Add a dummy Gmail account; enable contacts for it.
  | 3. Add the invitee to the dummy account.
  | 4. Give contacts access to Clubhouse.
  | 5. Send the invite.
  | 6. Remove contacts access.
  | 7. Re-enable the contacts disabled in step 1.
 
    | lupire wrote:
    | 0. Don't use Clubhouse because it adds no value?
 
      | satya71 wrote:
      | When you run a business, you have to go where the people
      | are. If my customers are there, I have to be there.
 
        | jcims wrote:
        | I'd think that depends on the business. What is the
        | engagement like on clubhouse? Do you participate or just
        | have a presence?
 
    | [deleted]
 
  | gherkinnn wrote:
  | I did the same and I'm still annoyed at myself.
  | 
  | Clubhouse is pretty shit, really. So I sold my soul and got
  | nothing in return.
 
    | bredren wrote:
    | Thanks for sharing this.
    | 
    | I have similar feelings about the product, but am curious to
    | hear your reasons in detail first if you'll share them.
 
      | gherkinnn wrote:
      | The one thing that got me interested is them using a photo
      | as the app icon. Intriguing. Maybe there's some fun to be
      | had. The rest was of no real interest to me. Silly, but
      | here we are.
      | 
      | Trivialities aside, the content is not for me. It's either
      | some self-help thing or a get rich fast scheme. And I don't
      | care about either.
      | 
      | Worse though is the content delivery. They talk so much and
      | say so little. Horrible.
      | 
      | It really is this:
      | 
      | > Clubhouse is C tier people listening to B tier people
      | talk about A tier people
      | 
      | And here I am, a D tier person not wanting to be part of
      | this circlejerk.
 
  | sneak wrote:
  | When you leak your contacts, you harm others, not just
  | yourself.
  | 
  | This, among other reasons, is why I never give out the number
  | of my SIM card, or my residential address, et c, to anyone.
  | They're just going to click "allow" and give it to a thousand
  | shady companies, starting with Facebook.
  | 
  | I never give people data I don't want stored in my shadow
  | profile.
 
    | MaxBarraclough wrote:
    | > When you leak your contacts, you harm others, not just
    | yourself.
    | 
    | As Eben Moglen puts it, _privacy is ecological, not
    | transactional._
    | 
    | See http://snowdenandthefuture.info/PartIII.html
 
  | JMTQp8lwXL wrote:
  | It's disingenuous of them to say they "have to" do contact
  | upload. Why can't I type in a phone number to invite?
  | Completely hostile. Consequently, I have invited nobody.
 
    | vinay_ys wrote:
    | Same here. It also seems to burn through battery more quickly
    | than other apps.
 
      | 177tcca wrote:
      | An app that recreates party lines on POTS burning through
      | battery is unfortunately unsurprising!
 
  | dehrmann wrote:
  | First I have to keep a burner number with a real sim card for
  | things that require signup, now I have to keep a burner phone
  | with no contacts?
 
    | [deleted]
 
| adsharma wrote:
| I wonder if people have thought about another variant of this. An
| app that maintains two address books and switches between them
| based on context.
 
  | tanelpoder wrote:
  | Or just some form of "share only these contacts with app X"
  | option at the device system/OS level.
 
    | adsharma wrote:
    | Given the tracking cookie situation, apps could refuse to
    | install if that option is turned on. They can easily detect
    | if they see a small number of contacts relative to average.
    | 
    | With the two address book solution, they should have no way
    | of telling which one is the real address book.
 
| cyberlab wrote:
| Remember: some apps check for what apps are installed on the
| device, and if they see this installed they can deduce you're
| poisoning the well.
| 
| Also if you want to research obfuscation and how it thwarts
| surveillance, check these:
| 
| https://www.schneier.com/blog/archives/2019/11/obfuscation_a...
| 
| https://www.science20.com/news_articles/obfuscation_how_to_h...
| 
| https://www.theguardian.com/technology/2015/oct/24/obfuscati...
| 
| https://adnauseam.io/
| 
| https://bengrosser.com/projects/go-rando/
 
  | artwork159 wrote:
  | If they saw this app installed, what might they actually do
  | about me or my contact list?
 
    | sopromo wrote:
    | Remove all contacts whose first name and last name start with
    | Z.
    | 
    | Docs say that they prefix every first & last name with Z so
    | that would be a start.
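    | 
    | Roughly (a sketch of what a scraper could do, going by the
    | documented "Z" prefix; the field names are made up):
    | 
    |       def looks_fake(c):
    |           return (c["first_name"].startswith("Z")
    |                   and c["last_name"].startswith("Z"))
    | 
    |       scraped = [
    |           {"first_name": "Zanna", "last_name": "Zsmith"},  # culled
    |           {"first_name": "Ann", "last_name": "Lee"},       # kept
    |       ]
    |       kept = [c for c in scraped if not looks_fake(c)]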
 
      | cyberlab wrote:
      | Also: check for contacts with weird country-code prefixes
      | that don't match the country the user is based in
 
    | speedgoose wrote:
    | I guess they may decide to not sell your data. Which is
    | actually a good thing.
 
    | cyberlab wrote:
    | They could just flag you as someone who poisoned the well and
    | ignore you I suppose. Remember: bad actors go after low
    | hanging fruit and tend to ignore privacy-aware folk and those
    | doing anti-surveillance.
 
  | djrogers wrote:
  | >> some apps check for what apps are installed on the device
  | 
  | I can't believe that's allowed by the OS - seems like a
  | horrible policy.
 
    | TedDoesntTalk wrote:
    | Agreed. I'd like to see a source or reference for this.
 
      | throwawei369 wrote:
      | https://arstechnica.com/information-
      | technology/2020/03/4000-...
 
        | TedDoesntTalk wrote:
        | But the app in the original article doesn't even work on
        | Android. It is an ios app. The link you provide is about
        | android, right? (Still concerning , however)
 
        | throwawei369 wrote:
        | Seems you have too many HN tabs open at the same time..
        | But the article I have linked shows the study done on how
        | apps read this information. Goes both ways for Android
        | and Apple at the time, not sure if much has changed
 
        | TedDoesntTalk wrote:
        | Yeah sorry :)
 
| naebother wrote:
| How does this help me? Malicious apps are still going to scoop up
| my real contacts, right? What if one of the random phone numbers
| belongs to someone deemed a "terrorist" by one of the imperial
| powers and I'm judged guilty by association?
 
| aboringusername wrote:
| Can someone please explain to me how the collection of contact
| data is in any way legal under the GDPR and why Microsoft
| (Windows), Apple/Google haven't been required to make changes to
| prevent abuse of this permission (such as selecting specific
| contacts).
| 
| I'd also like to know why, if my contact data is shared, I am
| not informed of this. If my data is uploaded by Google to their
| servers, I should know. If somebody chooses to share my data with
| $app I should know, and be able to "opt out" of being included,
| perhaps (although it should be opt-in!).
| 
| Being able to mass collect what is often the most sensitive
| information means that consistent data is now a liability;
| keeping the same number/email can be useful for cross-
| referencing. Ideally you should rotate what data you can
| (physical address/location is obviously extremely difficult).
| Everything else is possible (browsers/IP addresses/emails/User
| Agent strings, phone numbers etc etc)
| 
| The best idea is to "troll" with your data; put insane items in
| your logged in basket (ebay/amazon etc), like sex toys. You can
| even make an order (and refund it) to further poison the well.
| Log in to Google and do some disgusting searches, and train
| algorithms to have the "wrong idea" about you. This is a reality
| we're now facing, as this data can (and will) be used against you
| at any opportunity.
 
  | JCDenton2052 wrote:
  | The best idea is to not use their services. Switch from Windows
  | to Linux, de-google and if you must use Android keep the data
  | on your phone to a minimum.
 
  | djrogers wrote:
  | > and why Microsoft (Windows), Apple/Google haven't been
  | required to make changes
  | 
  | I don't believe there's anything in the GDPR that gives it the
  | ability to regulate entities several steps removed from the
  | violations. If company A uses a posted letter to ask for PII
  | then stores it in violation of the GDPR, would you then
  | regulate the post office?
 
| nbzso wrote:
| All the shady data schemes and dark patterns in today's idea of a
| software business motivated me to regard my phone as an enemy
| and to use the web cautiously all the time. Actually, the idea of
| a hyperconnected future in which 24/7 monitoring of individuals
| will be normalised and mandatory makes me cringe. The Internet is
| turning from a force for good into a dystopian toolchain by the
| hour. And all because we as a society cannot find an effective
| way to limit the greed.
 
  | Klwohu wrote:
  | The Internet was designed to be dystopian before it was even
  | technically implemented.
 
  | throwawei369 wrote:
  | Wait until iot becomes mainstream. I foresee tiny chips
  | creating mass scale mutiny against their creators and
  | colonizing us (best case scenario)
 
    | shervinafshar wrote:
    | I wonder how dystopian sci-fi would read in such future? I
    | mean...what would be _their_ parable of The Matrix?
 
      | throwawei369 wrote:
      | You joke. But what if we're playing right into their
      | game and robot resistance is already underway. What if
      | there's more to the vaccines we're injecting into
      | ourselves? Is Bill Gates even a real person or just a
      | simulation?
 
  | wruza wrote:
| Because of some [ˈklʌbhaʊs], a shitty app promoted and used by
  | hype-flex-and-chill type of "people"? Just let them be and move
  | on, what do you think you miss there? If you see them as a
  | source of income, a second job-only phone is a must anyway.
 
  | federona wrote:
  | Society - current society, also called capitalism - is designed
  | not for greed but for constant growth. When your goal is not
  | satisfaction but constant growth, and you are already a
  | billion-dollar company, then it makes you look at all the shady
  | shit you can still do and get away with in order to grow. These
  | companies don't need to grow; if anything, they should be
  | growing smaller and more sustainable if we actually wanted to
  | engineer towards goodness rather than money. The fact that the
  | rich are getting richer while having absolutely no need for it
  | says to me that our priorities are wrong and our engineering of
  | business is wrong. A lot of the common laws, rules and norms
  | around which business is built are insane.
  | 
  | That is to say that if the economy is a mirror of nature, then
  | businesses should be engineered to die. Not to be a going
  | concern forever. After a certain amount of profit is extracted
  | and life is lived, into the grave they should go. Not just as a
  | result of competition, but as a result of system design.
  | 
  | This would then lead to a more evolutionary world and better
  | distribution of power and resources rather than continuous
  | monopolizing and consolidation. It would also foster a "you
  | can't take it with you to the grave" mentality rather than an
  | infinite one - a cyclical mindset about finite things, not
  | infinite things. Corporations want to be people, so engineer
  | them like people and less like machines.
 
| neilv wrote:
| > _The app is designed to be very simple and fail silently. If
| you deny permission to access contacts, the app will not
| complain, it just will not work._
| 
| I don't understand the reason behind "designed to...fail
| silently" in this way, in a privacy&security measure.
 
| annoyingnoob wrote:
| I'm of the opinion that personal data is not like a currency and
| should not be seen as a form of currency.
| 
| If you want to barter then I want to negotiate, no one sided
| contracts. Can't make a deal? Your loss then.
 
| ketamine__ wrote:
| Is there a limit on the number of contacts Clubhouse would sync?
 
  | CharlesW wrote:
  | It's incredibly unlikely. This kind of social graph information
  | is _gold_.
 
    | lanstin wrote:
    | I suspect it is less valuable than call logs. I have never
    | deleted contacts so I have over twenty years of entries with
    | pretty low value (e.g. call this number to find out about
    | this real estate offering; my old mechanic's phone number
    | from 2003) or accuracy. I only call about seven people, but
    | those are significant links.
 
| paulie_a wrote:
| Data poisoning needs to become a standard practice. Make the
| "valuable" ad data useless
 
  | tjpnz wrote:
  | From an economics perspective it seems like a more viable
  | approach. Most of the techniques considered state of the art
  | now are likely easily detectable by Google and other ad tech
  | companies - they have a very good idea of which data can be
  | safely discarded. Rather than blocking Google Analytics I
  | wonder what would happen if browsers started responding with
  | garbage.
 
  | throwawei369 wrote:
  | Couldn't agree more. It's a far better approach as a cloaking
  | technique. That's the reason I use the Privacy Possum add-on in
  | Firefox.
 
| jpmattia wrote:
| Not exactly on topic, but historical context maybe: Long ago
| (early 90s?) when it was guessed/assumed that intelligence
| agencies were scanning emails, emacs was still among the best
| ways to read and send email. So emacs provided a handy function
| to append a random list of "hot" words to each outgoing email in
| the signature, just to degrade the signal-to-noise of such
| surveillance.
| 
| It's still there today, and you can see the output via M-x spook.
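| 
| A rough approximation of the same idea in Python, as a sketch
| (the word list is a tiny made-up sample, not the spook.lines
| file that emacs actually ships):
| 
|     import random
|     
|     # Tiny illustrative sample; emacs' real list is much longer.
|     SPOOK_WORDS = [
|         "encryption", "wiretap", "covert", "satellite",
|         "asset", "cipher", "embassy", "surveillance",
|     ]
|     
|     def spook_signature(n: int = 8) -> str:
|         # A line of random "hot" words to append to outgoing mail.
|         return " ".join(random.sample(SPOOK_WORDS, n))
|     
|     print(spook_signature())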
 
  | ianmcgowan wrote:
  | That used to be the case on usenet too - people would put
  | attention-grabbing words in .signature as "NSA Food" - to
  | overwhelm the NSA data capture algos. It seemed like a futile
  | gesture even at the time, but particularly poignant looking
  | back from a post-Snowden world.
 
    | eternalban wrote:
    | The real poignancy is the shift in hacker political views.
    | Call it post-software-is-sexy world. Those usenet sigs were
    | by hackers who lived in a world where software engineer or
    | programmer were social reject code words. That world changed
    | after geeks came into money. Suddenly but soon thereafter,
    | paranoia about privacy was rewarded by tinfoil hats. (And
    | then yes, years later, came along this guy called Snowden.)
 
  | shervinafshar wrote:
  | Such an interesting context. Thanks for sharing this. I
  | appreciate the nostalgia poetics of this today.
 
| atum47 wrote:
| You can always use bash or Python to create vCards and import
| them into your phone.
| 
| I've used this technique once to generate a bunch of numbers to
| find a person's WhatsApp; it works just fine.
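| 
| A minimal sketch in Python (the filename and field values are
| just placeholders): write a few vCard 3.0 entries to a .vcf file
| and import that through the Contacts app.
| 
|     import random
|     
|     def make_vcard(name: str, number: str) -> str:
|         # Build one minimal vCard 3.0 entry.
|         return (
|             "BEGIN:VCARD\n"
|             "VERSION:3.0\n"
|             f"FN:{name}\n"
|             f"TEL;TYPE=CELL:{number}\n"
|             "END:VCARD\n"
|         )
|     
|     with open("fake_contacts.vcf", "w") as f:
|         for i in range(20):
|             name = f"Fake Contact {i:02d}"
|             # Random 10 digits; the "+1" prefix is arbitrary here.
|             digits = "".join(random.choice("0123456789") for _ in range(10))
|             f.write(make_vcard(name, "+1" + digits))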
 
| fsflover wrote:
| Or just stop using operating systems and apps which you don't
| trust and switch to GNU/Linux phones.
 
| Waterluvian wrote:
| Apps using contacts is a $#%$ing anxiety attack for me. The scum
| companies don't care. They just want more leads. But for me, it's
| this fear that they're going to spam my exes and old roommates
| and bosses and professors and landlords and everyone who ends up
| added to my contacts.
| 
| Signal did that to me last week. This person I'm not on speaking
| terms with got Signal and it added us and announced to each other
| we were on it and put our empty conversation onto my list of
| convos.
| 
| Phone contact lists are a complete $&^*ing disaster and Apple
| needs to make it far more clear what specific contacts I share
| access to.
 
  | tchalla wrote:
  | Does Signal share contacts the same way others like WhatsApp
  | do?
  | 
  | https://signal.org/blog/private-contact-discovery/
  | 
  | > Signal clients will be able to efficiently and scalably
  | determine whether the contacts in their address book are Signal
  | users without revealing the contacts in their address book to
  | the Signal service.
 
    | lucb1e wrote:
    | Note that this SGX thing is broken seven ways from Sunday,
    | but in principle, yep they have some security measures here.
    | We just have to trust them not to crack their SGX environment
    | as well as (regardless of SGX' security) Intel not to
    | generate an identical MRENCLAVE for anyone else but with
    | additional logging code running inside.
    | 
    | This is the best system I know of anyone running, by the way.
    | Threema, Wire, etc., nobody else has this (but then neither
    | requires a phone number, so...). I also don't know of a
    | better way to do phone number matching than having a trusted
    | third party that bakes their private key into chips and
    | verifies that you're really talking to the code you think
    | you're talking to. The upsides of DRM technology!
 
  | purpmint008 wrote:
  | About that Signal thing: Did that other person actually get a
  | conversation starter message of some sort?
 
  | carmen_sandiego wrote:
  | Not to be unkind but I suppose most people are not really
  | traumatised by merely seeing someone's name, even if they're
  | not on speaking terms with that person. It probably falls on
  | the side of convenience for the vast majority. For the Signal
  | org, it's possibly even an existential issue, since it helps
  | them counter network effects in the incumbents. It's hard to
  | expect them not to do it, then.
  | 
  | Having said that, I think it would be nice for Apple to
  | implement what you describe.
 
    | Waterluvian wrote:
    | Yep. I can't claim to know how everyone else responds to
    | these things.
    | 
    | The Signal example isn't the worst. It's a mutual connection.
    | It's not like they're emailing hundreds of people saying
    | "Waterluvian wants you to get on signal!"
    | 
    | What's to stop them from doing that when they get
    | sufficiently desperate? I don't even own my contact lists.
    | They seem to grow on their own with anyone I've ever emailed.
 
      | sneak wrote:
      | Signal does it for anyone in your address book, not just
      | mutuals.
      | 
      | Your "anyone I've emailed" example is a great reason not to
      | use the same service you use to host your email to host
      | your contacts.
      | 
      | Personally I would never in a million years sync my
      | contacts to Google, which I assume is what you mean here
      | (most people use gmail).
 
        | Waterluvian wrote:
        | Probably. Contacts have been confusing. I've had Gmail
        | list. My phone. What's in my Sim card. My Sony contact
        | list...
        | 
        | I had a really infuriating time trying to clean them all
        | up many years ago and I've just tapped out.
 
        | ficklepickle wrote:
        | Same here. I recently went to LineageOS and use fastmail
        | for email/contacts/calendar. It's been wonderful.
 
    | ficklepickle wrote:
    | I've got a dead friend that I'm reminded about every time I
    | open signal. "DeceasedFriend is on signal!". No, no he is
    | not.
    | 
    | I'm sure I could clear it, but I don't really want to yet.
    | 
    | On the whole, I still like the feature.
 
      | carmen_sandiego wrote:
      | I'm sorry about your friend. I've had similar experiences
      | with tech products, but I tend to think that unexpected
      | reminders (of any kind) are all part of the process of
      | dealing with loss. That hyper-avoidance seems an unhealthy
      | route, popular though it is in modern discussions about
      | emotionally difficult subjects.
 
    | myself248 wrote:
    | In my case it wasn't traumatic, exactly. More, targeting.
    | 
    | There was an individual that I kept in my contacts, you see,
    | for the sole purpose that if he ever called me, I'd know
    | to let it go to voicemail. We had been close long ago, but he
    | stopped living in consensus reality and wasn't interested in
    | treatment. I considered him disturbing but not immediately
    | dangerous, just someone I didn't want to reconnect with.
    | 
    | When I installed Signal, he got the notification that I had
    | done so, and immediately messaged me, along the lines of "Oh
    | hey, you still exist! And I guess by the timing of this
    | install, you must be at [security-focused event] this
    | weekend, yeah? Hey let me tell you about my latest
    | harebrained scheme..."
    | 
    | I understand that Signal needs to do that sort of connection
    | to work behind the scenes, but they don't need to generate an
    | alert on the guy's lock screen about me.
 
    | heavyset_go wrote:
    | > _Not to be unkind but I suppose most people are not really
    | traumatised by merely seeing someone's name, even if they're
    | not on speaking terms with that person._
    | 
    | Domestic abuse, harassment/sexual harassment, stalking etc
    | are all more common than they should be.
 
    | aboringusername wrote:
    | > but I suppose most people are not really traumatised by
    | merely seeing someone's name
    | 
    | I mean there are cases where that can be _devastating_.
    | 
    | "Ohai here's your old abusive ex, here's a chat box just for
    | good measure, good luck!".
    | 
    | There are people who I'd never ever want to be within a
    | textbox and tap away from accessing me, for any reason,
    | period.
    | 
    | You can get restraining orders in the physical world; the
    | digital world, however, has no boundaries when the apps
    | _themselves_ are too stupid and are defined by programming
    | code with no notion of real-world logic. I wouldn't expect an
    | app to understand a 'court order' - that's a real human
    | construct. How do we design against that in the digital
    | space, when you are so accessible that, if you have a crazy
    | dude following you, you're basically forced to retreat
    | because there are no effective measures/guards against this?
 
      | carmen_sandiego wrote:
      | Well, a couple of things:
      | 
      | (a) You can't take seeing their name, but you keep them in
      | your contacts? Don't you occasionally scroll past it with a
      | call button right there, which is just as easy to hit and
      | put you in touch with them? How is this any different?
      | Seems a bit silly.
      | 
      | (b) As far as I know, research suggests hyper-avoidance is
      | not a good way to resolve trauma. So I'm not convinced by
      | the idea that this is harmful, especially when you can
      | control it through (a).
 
        | Waterluvian wrote:
        | A contact list often operates as a database of what
        | number belongs to who, for guarding incoming calls. It
        | can be a security tool.
 
        | carmen_sandiego wrote:
        | You can generally block calls by number, without having
        | them as a named contact.
 
        | lucb1e wrote:
        | I do see Waterluvian's point though. You might still have
        | business with them yet you don't really want to deal with
        | them otherwise. Knowing who this SMS or call was from can
        | be helpful rather than blocking the number outright.
        | 
        | Then again, seeing their name when installing Signal and
        | figuring "oh hey they have signal too" seems no less
        | weird to me than seeing their name in my phone book and
        | thinking "oh hey they have a phone too". If that really
        | sets you off... that seems unlikely. So I don't really
        | get this subthread, even if I see the general point that
        | you might not want to be reminded of certain people on a
        | regular basis (for me, installing a phone number-based
        | social application is not a monthly occurrence).
 
        | nvr219 wrote:
        | In iOS and Android, incoming call blocks are in a
        | separate database and explicitly not the contacts
        | database.
 
        | the_local_host wrote:
        | Even if you don't keep them in your contacts, the
        | connection tracking can be problematic if they keep you
        | in their contacts.
        | 
        | "But what if you didn't give Clubhouse access to your
        | contacts, specifically because you didn't want all or any
        | of them to know you were there? I regret to inform you
        | that Clubhouse has made it possible for them to know
        | anyway, encourages them to follow you, and there isn't
        | much you can do about it... I got followers who weren't
        | in my contacts at all -- but I was in theirs."
        | 
        | https://www.vox.com/recode/22278601/clubhouse-invite-
        | privacy...
 
        | heavyset_go wrote:
        | > _You can't take seeing their name, but you keep them
        | in your contacts?_
        | 
        | If I start getting abusive calls or texts from a usual
        | suspect, I want to know who it is. My carrier-level
        | number blocking resets every couple of years, and I
        | cannot remember everyone's phone numbers.
 
        | musingsole wrote:
        | Why do you have the authority to dismiss many's
        | experience of a feature? Because you can think of a way
        | _you_ would handle it and you 've read some things?
 
        | carmen_sandiego wrote:
        | Because we're all here talking about how things should be
        | designed, which often inherently requires fulfilling some
        | needs at the expense of others? Not quite sure how you
        | expect those decisions to be made without people
        | gathering to discuss the relative merits of each
        | approach.
        | 
        | If you're about to tell me we should just implement every
        | user request that they claim is of 10/10 importance to
        | them personally, then I'm not even sure what to tell you.
        | Have you taken all of a few seconds to consider what
        | happens when two people make conflicting requests? Then
        | we're back to evaluating things and discussing them
        | again. How arrogant of us.
        | 
        | I appreciate the implied authority you've given yourself
        | to be the conversation police, though.
 
    | nathanfig wrote:
    | "Did this cause trauma" is not the bar we're trying to set
    | here, any level of anxiety caused by tech companies misusing
    | contacts is bad.
 
    | laurent92 wrote:
    | The problem I have with WhatsApp is even worse than with
    | Signal: not only do they prompt me to start a conversation
    | with that customer to whom I only wanted to appear super-
    | stern and rigorous, but they also send them my profile photo
    | and my name!
    | 
    | My business name is not my private name! At least let me
    | remain under my name in their address book; don't give them
    | my information.
 
| jp57 wrote:
| Can we get little Bobby Tables in there?
| 
| https://xkcd.com/327/
 
| championrunner wrote:
| Do you have a running APK?
 
| nom wrote:
| Hm can it be estimated / is there public information about how
| many phone numbers are taken? E.g. if I generate a valid number
| for one country or state, how likely is it that the number is in
| use or registered?
| 
| I once got a phone call from a university student doing a survey
| for their project, and they told me they generate the numbers
| randomly, which makes me really wonder: how likely is it?
 
| aasasd wrote:
| On Android, IIRC I've seen a dialer app that stores contacts in
| its own database instead of the system one. Seems to be a better
| approach than this--at least if other apps don't also write to
| the shared contacts.
| 
| (It was probably an open-source dialer on F-Droid, but don't
| remember exactly which one.)
| 
| Anyway, an even better approach of course is to tell data-
| slurping apps to bugger off.
| 
| Edit: come to think of it, maybe alternative Android ROMs could
| fence the contacts so that an app only sees its own unless the
| user specifically selects someone. I guess this is similar to
| Apple's trick with Photos.
 
| andix wrote:
| Just don't share your contacts with apps that steal them and use
| them for marketing purposes.
| 
| It is also illegal (under the GDPR) to do so if you don't have
| the permission of every single person in your contacts.
 
| ccleve wrote:
| This is a common technique in the mailing list industry. It's
| called "salting". You add fake names, but real email addresses,
| street addresses, or post office boxes. You then monitor what
| shows up in these places addressed to "Mr. Fake Name". It's how
| mailing list companies monitor who is using their lists and helps
| control misuse.
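| 
| As a rough sketch of the bookkeeping involved (the names,
| domains and CSV layout here are invented for illustration): give
| each list its own uniquely salted entries pointing at mailboxes
| you control, then watch which salted addresses receive mail
| later.
| 
|     import csv
|     import secrets
|     
|     def make_salt_record(list_name: str) -> dict:
|         # A unique fake name tied to a real, monitored mailbox.
|         token = secrets.token_hex(4)
|         return {
|             "name": f"Mr. Fake {token}",
|             "email": f"salt-{list_name}-{token}@example.com",
|             "list": list_name,
|         }
|     
|     with open("salts.csv", "w", newline="") as f:
|         writer = csv.DictWriter(f, fieldnames=["name", "email", "list"])
|         writer.writeheader()
|         for list_name in ["retail_2021", "newsletter_a"]:
|             writer.writerow(make_salt_record(list_name))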
 
  | bredren wrote:
  | Have you worked in this industry? Curious about more details of
  | tricks from various list makers/sellers.
 
| the_local_host wrote:
| I have to say the spirit of this fake_contacts app is very
| appealing. Why stop at defending your data, when you can attack?
 
  | throwawei369 wrote:
  | Offence is the best defence
 
| aww_dang wrote:
| Imagine if your fake contact's randomly created email or phone
| number is on a terror watch list.
 
  | praptak wrote:
  | I think that's exactly the point of this. I remember people on
  | Usenet posting random shit like "construct bomb kill president"
  | when the news about Echelon came out.
 
  | corentin88 wrote:
  | The documentation states that it uses a non-allocated country
  | code (+21). So it seems unlikely to happen.
 
    | dustymcp wrote:
    | Doesn't this defeat the purpose though, as it could be filtered?
 
      | 0x426577617265 wrote:
      | Yes, this data could be quickly mitigated.
 
      | o-__-o wrote:
      | The US government monitored all DC residents' personal
      | communications for over 2 years because they fat-fingered
      | the collection regex. The country code for Egypt is +20;
      | the DC area code is 202.
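      | 
      | A toy illustration of how such a selector can overmatch,
      | assuming numbers are stored as bare digit strings (the
      | pattern and numbers are hypothetical, not the actual ones):
      | 
      |     import re
      |     
      |     # Meant to catch Egyptian numbers (country code 20)
      |     selector = re.compile(r"^20")
      |     
      |     numbers = [
      |         "20226183000",  # Egyptian number, '+' stripped
      |         "2025551234",   # DC number in US national format
      |     ]
      |     
      |     for n in numbers:
      |         print(n, "collected" if selector.match(n) else "skipped")
      |     # Both are "collected": the DC number gets swept up too.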
 
        | IAmGraydon wrote:
        | You think that was a mistake, huh?
 
        | grandinj wrote:
        | That is a mistake that sounds suspiciously self serving,
        | given how many powerful people live and work there
 
    | toast0 wrote:
    | +21 isn't allocated, but:
    | 
    |     +211 South Sudan
    |     +212 Morocco
    |     +213 Algeria
    |     +216 Tunisia
    |     +218 Libya
    | 
    | Someone putting random numbers after +21 because it's
    | unallocated has a fundamental misunderstanding of
    | international phone numbers.
    | 
    | But also, the server side is likely to throw away invalid
    | numbers to start with. It's simple and easy to do, and it
    | reduces data storage by a lot (there's a lot of garbage in
    | people's address books).
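    | 
    | As a sketch of how cheap that filtering is, this is roughly
    | what it could look like with the python-phonenumbers port of
    | libphonenumber (the sample numbers are made up):
    | 
    |     import phonenumbers  # pip install phonenumbers
    |     
    |     samples = [
    |         "+14155552671",   # plausible US number
    |         "+21123456789",   # random digits after the "+21" prefix
    |         "+212612345678",  # Morocco really does start with +21
    |     ]
    |     
    |     for raw in samples:
    |         try:
    |             num = phonenumbers.parse(raw, None)
    |             keep = phonenumbers.is_valid_number(num)
    |         except phonenumbers.NumberParseException:
    |             keep = False
    |         print(raw, "keep" if keep else "discard")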
 
| ficklepickle wrote:
| Sad state of affairs. AOL couldn't kill the open web, but "apps"
| have.
| 
| The user agent should respect your wishes, but instead we are
| reduced to this insane work-around.
| 
| Surveillance capitalism needs to die in a fire. To anybody
| working on that shit: I hate you. Personally, as an individual, I
| wish you harm.
| 
| OK, that was hyperbole, but I do love the open web. RIP.
 
| otterley wrote:
| Recently Apple added a feature to iOS that lets you make only
| selected photos accessible to an app. The user can respond
| positively to an access request while the app sees only a subset
| (or none) of the actual photos.
| 
| It would be very useful for Apple to do the same for contacts:
| the app would think it's getting access to your contacts, but
| would only actually receive a subset of them, and be none the
| wiser. This would be a tremendous boon for privacy.
 
  | rsync wrote:
  | "Recently Apple added a feature to iOS that allows you only to
  | allow selected photos to be accessible by an app."
  | 
  | What we really need to see from Apple is a permissions index
  | _in the app store_ that allows me to inspect, and consider, the
  | permissions that an app will request _before installing that
  | app_.
  | 
  | I shouldn't have to install the app (or do laborious research
  | online) to discover what permissions it will attempt to utilize
  | and which of them are required to function.
  | 
  | It would be trivially easy to list that in the app store, for
  | each app.
 
    | behnamoh wrote:
    | They have added that, but it's written by the app developers
    | so you still can't trust what they claim they're gathering
    | from you.
 
    | aeternum wrote:
    | I'm not sure the permission index would be very useful.
    | 
    | Most iPhone chat apps for example work perfectly fine with
    | zero permissions granted yet provide the option to send
    | pictures, invite contacts, use mic/camera, send gps location,
    | etc if a user is so inclined. With a permissions index, you
    | would likely end up with the majority of apps listing all
    | permissions and users would simply ignore it.
 
      | NeutronStar wrote:
      | So? Just give me the possibility to see it.
 
    | l8rpeace wrote:
    | +1 and a filter you can use on related permissions when
    | searching for apps
 
    | lanstin wrote:
    | All these permission choices should be invisible to the app.
    | If I say no contacts, the call should succeed but with a
    | zero-length response. It shouldn't be possible for apps to
    | say "you have to agree to this or I won't run". I can run
    | the software, and as the root user I control what data the
    | software can use.
 
      | dheera wrote:
      | > If I say no contacts, the call should succeed but with a
      | zero-length response.
      | 
      | Actually I would take it further and say that I should be
      | able to define its response or have it render a random but
      | plausible template response. Otherwise a zero-length response
      | makes it too obvious that you didn't give it permissions.
 
      | lanstin wrote:
      | Or even fake data as a service - feed fake location data
      | and a fake contact list, full of 202-555-1234-type numbers.
      | I always put fake data into web forms, and it is a sign
      | that I don't truly own my phone that I can't do the same
      | for local software.
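      | 
      | A minimal sketch of what a "simulated contacts" response
      | from the OS might look like (all names, numbers and the
      | function layout here are invented):
      | 
      |     import random
      |     
      |     FIRST = ["Alex", "Sam", "Jordan", "Casey", "Riley"]
      |     LAST = ["Smith", "Lee", "Garcia", "Nguyen", "Brown"]
      |     
      |     def fake_contact() -> dict:
      |         # 555-01xx numbers are reserved for fiction in the North
      |         # American Numbering Plan, so they can never be real.
      |         return {
      |             "name": f"{random.choice(FIRST)} {random.choice(LAST)}",
      |             "phone": f"+1-202-555-{random.randint(100, 199):04d}",
      |         }
      |     
      |     def contacts_for_app(granted: bool, real: list) -> list:
      |         # The app sees a normal-looking list either way; it cannot
      |         # tell whether it got real data or the simulated kind.
      |         return real if granted else [fake_contact() for _ in range(25)]
      |     
      |     print(contacts_for_app(granted=False, real=[]))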
 
        | lanstin wrote:
        | Like, I want a pop-up: "This application is requesting
        | your location data. Shall we give the real data, no data,
        | or simulated data?" Same for contacts, photos, installed
        | apps, etc. Not saying that would solve all the problems,
        | but it would be user-centric in a way the privacy
        | conversation just isn't.
 
      | djrogers wrote:
      | > It shouldn't be possible for apps to say you have to
      | agree to this or I won't run.
      | 
      | It's not - that's a violation of the App Store TOS. That's
      | also not what's happening here - you can use Clubhouse
      | without allowing contacts access, but you can't invite
      | someone to the closed beta without allowing it.
 
        | lanstin wrote:
        | They must know that I have disallowed access in that
        | case.
 
        | danShumway wrote:
        | GP means that it shouldn't be technologically possible,
        | not just that it shouldn't be possible as a matter of
        | policy.
        | 
        | The policy solution clearly doesn't work in all scenarios
        | because Clubhouse is still on the store. But an on-the-
        | fly permission model that allowed the user to deny the
        | permission invisibly or share a subset of their contacts
        | would completely solve the problem regardless of whether
        | or not Apple was effective at moderating.
        | 
        | Apple could still do whatever moderation they wanted to
        | reduce annoyances for the end user, but the sandboxing
        | approach would catch any apps they missed or refused to
        | moderate.
        | 
        | This would also solve the problem where an app
        | legitimately needs some access to contacts to run, but
        | doesn't need access to the entire list. Clubhouse does
        | need access to some contacts to invite someone to the
        | beta, but it does not need access to the entire contacts
        | list, and there's no reason for it to have the ability to
        | tell whether or not a user is providing the full list.
 
    | djrogers wrote:
    | > and which of them are required to function.
    | 
    | On the iOS App Store, none of the optional permissions can be
    | required for an app to perform its basic functions - that's
    | a store policy, and it's generally well enforced. Obviously
    | if your app's function is mapping, GPS can be required to use
    | those features (but only at the user's discretion - ie while
    | running or all the time, granular or coarse), but the app
    | can't just refuse to launch without it.
 
    | andai wrote:
    | I didn't realize iOS doesn't have that. Google Play shows
    | each app's permissions on the listings page.
 
| parkingpete wrote:
| Hmmm, not good
 
| floatingatoll wrote:
| Is it possible to create a network of contacts that triggers
| worst-case memory and cpu scenarios when the network is
| reconstructed from contacts?
| 
| Or, put another way, can a collection of people doing this
| construct a set of synthetic contacts spread out in various ways
| across their devices, such that anyone doing contact analysis
| sees their analyses slow down, drain resources, or crash
| altogether due to network structure?
 
  | alcover wrote:
  | Wouldn't any worthy graph explorer handle cycles?
 
    | floatingatoll wrote:
    | If I had a nickel for every time an algorithm was found to
    | have an exploitable weakness due to unforeseen alignments of
    | input, I'd certainly have some nickels. We know what the
    | common screwups in crypto are, and we _could_ know what
    | common screwups in network graphs are. I'm just wondering if
    | anyone actually _does_ know of any of those.
 
| williesleg wrote:
| Give me your data now!
 
| heavyset_go wrote:
| This can be easily bypassed by cross referencing contact lists on
| the backend.
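| 
| A rough sketch of that kind of cross-check (the threshold and
| data shapes are invented): a number that is corroborated by
| several independently uploaded address books is kept, while a
| one-off entry - like a locally generated fake - is down-weighted
| or dropped.
| 
|     from collections import Counter
|     
|     # One set of phone numbers per user who synced their contacts.
|     uploaded_books = [
|         {"+15551230001", "+15551230002", "+21999999999"},
|         {"+15551230001", "+15551230003"},
|         {"+15551230001", "+15551230002"},
|     ]
|     
|     seen_in = Counter()
|     for book in uploaded_books:
|         for number in book:
|             seen_in[number] += 1
|     
|     MIN_BOOKS = 2  # require corroboration from at least two books
|     trusted = {n for n, c in seen_in.items() if c >= MIN_BOOKS}
|     print(trusted)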
 
| washadjeffmad wrote:
| I seem to remember CyanogenMod having a per-app sandbox feature
| around 2013 that returned blank info from a virtual root.
| 
| Like many point out, this isn't really data poisoning unless
| there are metric-breaking honeypots around the web seeding these
| services with enough noise to make these collection practices
| useless - and there are not.
| 
| A more effective alternative might be hashing real contacts to
| generate seeds of complete but false profile information. Apps
| thinking they got the mother lode wouldn't be able to assign
| confidence to any results they didn't have duplicates of, and
| slowly over time, groups who used this would become worthless.
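| 
| A very rough sketch of that seeding idea (the hashing scheme and
| field choices are my own invention): derive the fake profile
| deterministically from a hash of the real contact, so every user
| who knows that person emits the same false record and duplicates
| stop being a useful confidence signal.
| 
|     import hashlib
|     import random
|     
|     FIRST = ["Avery", "Morgan", "Quinn", "Rowan", "Skyler"]
|     LAST = ["Okafor", "Ivanov", "Tanaka", "Silva", "Khan"]
|     
|     def fake_profile_for(real_number: str) -> dict:
|         # Deterministic: the same real contact always yields the
|         # same false profile, no matter whose phone generates it.
|         digest = hashlib.sha256(real_number.encode()).digest()
|         rng = random.Random(int.from_bytes(digest[:8], "big"))
|         return {
|             "name": f"{rng.choice(FIRST)} {rng.choice(LAST)}",
|             "phone": f"+1202555{rng.randint(100, 199):04d}",
|             "email": f"user{rng.randrange(10**6):06d}@example.net",
|         }
|     
|     print(fake_profile_for("+14155552671"))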
 
| sanxiyn wrote:
| What a great idea. Let's do more of these.
 
___________________________________________________________________
(page generated 2021-02-27 23:00 UTC)