The Great Hack

Yes, the US data protection laws aren’t sufficiently rigorous, OK, what are you going to do about it?

We have different laws over here.

Yes, the Chinese data protection laws aren’t sufficiently rigorous, OK, but I don’t think we can do anything about it!

We have different laws over here.

I think you’ll find that we have precisely zero regulation “over here” as regards facial recognition. A little investigation will also reveal that at least one of our equivalent organisations has been indulging in exactly the same kind of activity.

I don’t welcome the aggressive response but your confidence is entirely misplaced.

I just meant that you should start writing to your politicians to get the law changed (the US Freedom of Information Act needs updating to cover modern interconnected data processing systems and interconnected organisations, including QUANGOs [in both the American and European usage]). Better still, work with one of the organisations in your country that seek to pressurise politicians into curtailing such activities.

Incidentally, I worked for a mutual organisation that was dealing with very sensitive data on behalf of our members, so I know rather more about the British implementation of the GDPR (i.e. our data protection regulations) than most people do; and it has clauses that define principles covering most (but not quite all) eventualities.

That said, I found your repeated implication that the computing and data processing industry in general is operating against the interests of the public rather offensive. I do apologise for my over-sensitivity and excessively strong response; it was not appropriate.

Ah, noted. I do a lot with GDPR too but in order for it to work you have to know what’s happening with your data in the first place. That’s the current gap and why I’m posting the links.

I think “computing industry” is probably too broad for what I’m trying to say, but I’d stand by the general assertion that there’s a major subset of very large (and other) IT organisations, government bodies and universities that are actively working against the public interest. That’s largely irrefutable, but it stays below the radar because too many vested interests prefer it that way.

I am an activist on this front, but in the current climate one needs to be circumspect about the extent to which that gets discussed publicly.

That’s the principle of the GDPR: any organisation has to be cognisant of how data are to be used by any organisation to which it sends those data. To send data to a third party without specifying the limitations of use under which those data were collected, and without full disclosure of how the third party is to use them, is an offence under the GDPR. Use of the data outside of the disclosed processing intent is also an offence. Export of sensitive data to any area not covered by equivalent GDPR arrangements is not permitted. Data that can, on their own, be considered non-sensitive can become sensitive depending on context, and data processing organisations also need to take this into account. This arrangement doesn’t leave much of a loophole, but there are still many organisations (within the EU, including in Britain) that are not fully compliant with these principles, and so are, in fact, breaking the law (e.g. Cambridge Analytica!).

It’s more a case that the enforcement needs to be tightened rather than the GDPR itself.
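
To make the purpose-limitation principle described above concrete, here is a minimal sketch in Python. It is not any real organisation’s system; the class, the purposes and the recipient name are all made up. The idea is simply that a controller records the purposes disclosed at collection time and refuses a third-party transfer whose intended uses fall outside them.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """Hypothetical record of the purposes disclosed to data subjects at collection time."""
    name: str
    disclosed_purposes: set = field(default_factory=set)

def transfer_allowed(record: DatasetRecord, recipient: str, intended_purposes: set) -> bool:
    """Purpose limitation: a transfer is only acceptable if every use the recipient
    intends was disclosed when the data were collected."""
    undisclosed = intended_purposes - record.disclosed_purposes
    if undisclosed:
        print(f"Refusing transfer to {recipient}: undisclosed purposes {sorted(undisclosed)}")
        return False
    return True

# Example: membership data collected only for service delivery and billing.
members = DatasetRecord("membership_data", {"service_delivery", "billing"})
transfer_allowed(members, "analytics-partner.example", {"service_delivery", "ad_profiling"})
# Refused, because "ad_profiling" was never disclosed at collection time.
```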

Totally agree. The issue in the UK is that privacy notices aren’t read, so people don’t know what’s being collected or what happens next. Combine that with the numerous examples of collection and sale without consent, and the ICO simply can’t keep up.

I think that is very true. Indeed, I do sometimes wonder if they (and user agreements etc.) are deliberately written as verbosely as possible to hide information, in the expectation that the chances are they won’t be read.

Judging by popular magazines (even quite serious ones), some modern non-fiction books and other media, the general public’s attention span these days is very short, and anything longer than perhaps 100 words or so simply doesn’t get read.

There are other issues around privacy notices, notably that they are required to be formatted for the audience: if you’re a charity representing people with a learning disability, for example, you need to produce an easy-read version. That’s almost never done.
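
On the readability point, one rough way to sanity-check whether a notice is anywhere near an “easy read” level is a standard readability formula. The sketch below uses the Flesch reading-ease formula with a crude vowel-group syllable counter; the sample sentence is invented boilerplate, and a real easy-read assessment would of course involve far more than a score.

```python
import re

def count_syllables(word: str) -> int:
    # Very crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores are easier to read; dense legal prose tends to score very low."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / len(sentences)) - 84.6 * (syllables / len(words))

# A made-up fragment of typical privacy-notice boilerplate.
notice = ("We may share your personal information with carefully selected third "
          "parties for the purposes of service improvement, analytics and marketing, "
          "subject to the applicable legal bases set out in the relevant legislation.")
print(round(flesch_reading_ease(notice), 1))
```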

I’m glad I’m not alone in spotting that. I reported it and wasn’t taken seriously by the desk sergeant. I said I was of a good mind to suspend his membership of the Flat Earth Society.

Do folk worrying about privacy still use the Google search engine?

The desk sergeant was one of them of course, so obviously he would deride your astute observation!

I do use the Google search engine, but I turn off suggestions and the advertising ID and send the ‘Do Not Track’ flag, as together they dramatically reduce the amount of information Google records. I also use an anonymising browser (which again reduces the amount of information about the source of the query).
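
For what it’s worth, the ‘Do Not Track’ flag is just an HTTP request header (DNT: 1), and honouring it is entirely voluntary on the server side, so many sites ignore it. A minimal sketch of what the browser sends on your behalf; the endpoint below is just an example service that echoes request headers back, and the user-agent string is made up.

```python
import urllib.request

# Build a request that carries the Do Not Track header, as a browser with the
# DNT option enabled would.  Whether the server respects it is another matter.
req = urllib.request.Request(
    "https://httpbin.org/headers",           # example endpoint that echoes request headers
    headers={"DNT": "1", "User-Agent": "privacy-demo/0.1"},
)

with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.read().decode())              # the echoed request headers should include the DNT flag
```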

We do.
And we’re currently working on deriving a minimal simplified form of the confidentiality statement that’s more easily understandable than the full legal statement (although the full statement will still be freely available).

Two links which really ought to start alarm bells ringing.

That is indeed suspicious given the ethics (or rather lack thereof) of the people concerned.

Furthermore, unless this has been stated as a purpose for collection and specified as a permitted use of these data as a condition of use of the services, it is almost certainly in contravention of the GDPR.

And then there’s this.

https://diginomica.com/gds-takes-govuk-open-source-code-and-makes-it-privatebut-why

This is both inevitable and sickening.
It shows why I use very few “connected” ‘apps’ on any of my computing devices, and those I do use either come from known sources or are not allowed access to sensitive data.

I check the ‘permissions’ quite carefully; a dead giveaway is when an app asks for more permissions than are needed for its function!
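
As a rough illustration of that check (the APK name and the “expected” list below are hypothetical, it assumes the Android build-tools’ aapt is on the PATH, and the exact output format of aapt varies between versions), you can dump an APK’s requested permissions and compare them against what the app plausibly needs:

```python
import re
import subprocess

# Hypothetical example: a simple torch/flashlight app should need almost nothing.
APK = "torch-app.apk"
EXPECTED = {"android.permission.CAMERA"}      # needed to drive the flash on older APIs

# 'aapt dump permissions' lists the uses-permission entries from the manifest.
out = subprocess.run(["aapt", "dump", "permissions", APK],
                     capture_output=True, text=True, check=True).stdout

requested = set(re.findall(r"uses-permission.*?name='([^']+)'", out))
suspicious = requested - EXPECTED

if suspicious:
    print("Dead giveaway - permissions beyond the app's obvious function:")
    for perm in sorted(suspicious):
        print("  ", perm)
```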

Many ‘apps’ are distributed and run commercially from jurisdictions that have very lax data privacy laws - hence, from a legal perspective, what people type into them becomes ‘fair game’ for anyone on the internet, and for any purpose to which they desire to put that information.

Background refresh is a key issue. You can’t assume that the permissions sought reflect the data requested.

Original article in the Washington Post I believe.

https://www.msn.com/en-gb/money/technology/its-the-middle-of-the-night-do-you-know-who-your-iphone-is-talking-to/ar-AAC3Rq2

Unless an Android ’phone is ‘rooted’, apps can only gather information from services on their permissions list. If the app can’t access the information, the tracker can’t report it.
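
And on an unrooted ’phone you can go a step further than just reading the list: adb can withdraw a runtime (“dangerous”) permission grant without uninstalling the app. The package name below is hypothetical, it assumes adb is installed with the device connected and USB debugging enabled, and revocation only applies to runtime permissions, not install-time ones.

```python
import subprocess

# Hypothetical package: a weather app that has no business reading fine location
# once you've typed in your town manually.
PACKAGE = "com.example.weather"
PERMISSION = "android.permission.ACCESS_FINE_LOCATION"

# 'pm revoke' withdraws a runtime ("dangerous") permission grant; the app keeps
# working but gets nothing back from that service until the grant is restored.
subprocess.run(["adb", "shell", "pm", "revoke", PACKAGE, PERMISSION], check=True)
print(f"Revoked {PERMISSION} from {PACKAGE}")
```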
