
Your digital identity has three layers, and you can only protect one of them

FancyMancy

Well-known member
Joined
Sep 20, 2017
Messages
7,032

Marcin Antas & Kamil Śliwowski/Panoptykon Foundation
The further out the ripples go, the harder it is to control.

Your online profile is less a reflection of you than a caricature.

Whether you like it or not, commercial and public actors tend to trust the string of 1s and 0s that represent you more than the story you tell them. When filing a credit application at a bank or being recruited for a job, your social network, credit-card history, and postal address can be viewed as immutable facts more credible than your opinion.

But your online profile is not always built on facts. It is shaped by technology companies and advertisers who make key decisions based on their interpretation of seemingly benign data points: what movies you choose to watch, the time of day you tweet, or how long you take to click on a cat video.

Many decisions that affect your life are now dictated by the interpretation of your data profile rather than personal interactions. And it’s not just about advertising banners influencing the brand of the soap you buy—the same mechanics of profiling users and targeting messages apply to political campaigns and visa applications as much as supermarket metrics. When advertising looks like news and news looks like entertainment, all types of content are profiled on the basis of your data.

So what story does your data tell about you?

The layers of your online profile

It would be nice to think that we have control over our online profile. After all, we’re the ones who feed terabytes of personal data into mobile apps and online platforms. We decide which photos we want to share and which should remain private. We accept or reject invitations, control tags, and think twice before publishing a post or a comment. We are critical and selective about the content we like or share. So why wouldn’t we be in control?

The bad news is that when it comes to your digital profile, the data you choose to share is just the tip of the iceberg. The rest lies hidden beneath the friendly interfaces of mobile apps and online services. The most valuable data about you is inferred beyond your control and without your consent. It’s these deeper layers you can’t control that really make the decisions, not you.

Let’s peel open this data onion.


Panoptykon Foundation
The three layers of your digital shadow.


The first layer is the one you do control. It consists of data you feed into social media and mobile applications. This includes what you have revealed in your profile information, your public posts and private messages, likes, search queries, uploaded photos, tests and surveys you took, events you attended, websites you visited, and other types of conscious interactions.

The second layer is made of behavioral observations. These are not so much choices you consciously make, but the metadata that gives context to those choices. It contains things that you probably do not want to share with everybody, like your real-time location and a detailed understanding of your intimate and professional relationships. (By looking at location patterns that reveal devices that often meet in the same office buildings or “sleep” together in the same houses, tech companies can tell a lot about who you spend your time with.) It also tracks your patterns of when you’re online and offline, content you clicked on, time you spent reading it, shopping patterns, keystroke dynamics, typing speed, and movements of your fingers on the screen (which some companies believe reveal emotions and various psychological features).
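The co-location inference described above can be sketched in a few lines: devices whose location pings repeatedly land in the same place at the same time are flagged as likely belonging to people who spend time together. This is a toy illustration only; the device names, places, and ping data below are entirely invented, and real systems work on far richer signals.

```python
# Toy sketch of second-layer ("behavioral") inference: count how often
# pairs of devices show up in the same place during the same hour.
# All identifiers and data points here are made up for illustration.
from collections import Counter
from itertools import combinations

pings = [
    # (device, place, hour-of-day)
    ("phone_a", "office_1", 9),  ("phone_b", "office_1", 9),
    ("phone_a", "office_1", 14), ("phone_b", "office_1", 14),
    ("phone_a", "home_3", 23),   ("phone_c", "home_7", 23),
]

def co_location_counts(pings):
    # Group devices seen in the same (place, hour) slot...
    by_slot = {}
    for device, place, hour in pings:
        by_slot.setdefault((place, hour), set()).add(device)
    # ...then count every pair that shared a slot.
    pairs = Counter()
    for devices in by_slot.values():
        for pair in combinations(sorted(devices), 2):
            pairs[pair] += 1
    return pairs
```

Here `phone_a` and `phone_b` co-occur twice (same office, twice a day), so a profiler would guess their owners work together, even though neither user ever declared that relationship.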

The third layer is composed of interpretations of the first and second. Your data are analyzed by various algorithms and compared with other users’ data for meaningful statistical correlations. This layer draws conclusions not just about what you do but about who you are, based on your behavior and metadata. It is much more difficult to control this layer: although you can control the inputs (posting photos of your newborn), you don’t know the algorithm that produces the output (that you might need to order nappies).
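A minimal sketch of this third, interpretive layer: a user's observed behaviors are compared against cohort averages, and the label of the closest cohort is attached to the profile. The cohorts, feature names, and weights below are invented for illustration; real profilers use far larger models, but the principle of matching you to a statistical type is the same.

```python
# Toy sketch of third-layer inference: attach the label of the
# nearest behavioral cohort to a user. Cohorts and feature weights
# are invented for illustration, not taken from any real system.
COHORTS = {
    "new_parent": {"baby_photos": 0.9, "night_activity": 0.7, "cat_videos": 0.2},
    "gamer":      {"baby_photos": 0.1, "night_activity": 0.8, "cat_videos": 0.5},
}

def infer_label(behavior):
    # Squared-distance between the observed behavior and each cohort profile;
    # missing features are treated as 0.
    def distance(cohort):
        return sum((behavior.get(k, 0.0) - v) ** 2 for k, v in cohort.items())
    return min(COHORTS, key=lambda name: distance(COHORTS[name]))
```

A user who posts lots of baby photos gets tagged "new_parent" (and, downstream, nappy ads) without ever saying so; the label is a statistical guess, not a stated fact.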

Here’s how it works in practice:


Panoptykon Foundation

The task of these profile-mapping algorithms is to guess things that you are not likely to willingly reveal. These include your weaknesses, psychometric profile, IQ level, family situation, addictions, illnesses, whether you are about to separate or enter a new relationship, your little obsessions (like gaming), and your serious commitments (like business projects).

Those behavioral predictions and interpretations are very valuable to advertisers. Since advertising is meant to create needs and drive you to make decisions that you haven’t made (yet), marketers will try to exploit your subconscious mechanisms and automatic reactions. Since they cannot expect that you will tell them how to do this, they hunt for behavioral data and employ algorithms to find meaningful correlations in this chaos.

Binding decisions at banks, insurers, employers, and public offices are increasingly made by big data and algorithms, not people. It saves a lot of time and money to look at data instead of talking to humans, after all. And it seems more rational to trust statistical correlations over a messy individual story.

Therefore, there’s a shared belief in the advertising industry that big data does not lie—that statistical correlations tell the “truth” about humans, their behavior, and their motivations.

But do they?

When your data double is wrong

The troubling thing is that we as users might not like or recognize ourselves in the profiles that are created for us. How would it feel to discover your “data double” is sick or emotionally unstable, not creditworthy, or simply not cool enough, all because of the way you type, your search queries, or any “strange” relationships you may have?


Panoptykon Foundation

Your online simulation may look nothing like the real-life you, yet it is the one the internet will treat you as.

Market players do not care about you—they care about numbers. Algorithms make decisions based on statistical correlations. If you happen not to be a typical individual, showing unusual characteristics, there is a chance that an algorithm will misinterpret your behavior. It may make a mistake regarding your employment, your loan, or your right to cross the border. As long as those statistical correlations remain true, nobody will care to revise this particular judgement. You’re an anomaly.

If the result of this algorithmic analysis is discriminatory or unfair—for example, your credit application is refused because you live in the “wrong” district, or your job application does not make it through because your social network is not “robust enough”—there is no market incentive to correct it. Why would they? You’re a single data point in a wave of billions. Why make an exception in the system just for you?

We can already see this playing out in China. As part of its “social credit score” system, every citizen is ranked on professional and personal interactions, online activity, and public appearances. Fail to pay a parking ticket? Look up banned topics online? Your actions in real life have lasting effects on things such as your ability to buy train tickets or send your kids to good schools.

Scoring systems in the West place the same blind trust in big data, ignoring the specificity and uniqueness of individual cases. We can shake our heads at the absurdity of China’s social credit score all we like—but are we really that far off ourselves?

Will the real digital you please stand up?

We must take back control of our digital shadows. If we don’t, we’ll continue to be incorrectly and unfairly penalized in our lives, both online and off.


Panoptykon Foundation

We can take measures to control the first layer of our online profile. Even though we are often impulsive or spontaneous with the data we share, we have the tools to control this process. We can choose not to post status updates or like pages. We do not have to use messaging systems embedded into social media platforms. We can encrypt our private communication by choosing certain messaging apps and block tracking scripts by installing simple plug-ins. We can even switch off metadata being stored in our photos by changing the default settings on our phones and making sure that they don’t have access to our locations.
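The tracker-blocking plug-ins mentioned above mostly work by matching outgoing requests against a blocklist of known tracking domains and dropping them. Here is a minimal sketch of that matching logic; the domain names in the blocklist are placeholders, not a real blocklist.

```python
# Minimal sketch of how a tracker-blocking plug-in decides what to drop:
# the request's hostname is checked against a blocklist, matching both
# the listed domain and any of its subdomains. Domains are illustrative.
from urllib.parse import urlparse

BLOCKLIST = {"tracker.example.com", "analytics.example.net"}

def is_blocked(url: str) -> bool:
    host = urlparse(url).hostname or ""
    # Match the domain itself, or any subdomain of it.
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)
```

Real extensions use far larger, regularly updated filter lists and more expressive rules, but the core decision is this kind of hostname match performed before the request ever leaves the browser.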

But even if we make that effort, we cannot control what is observed and interpreted by algorithms. The second and third layer of our profiles will continue to be generated by machines.

The only way to regain full control over our profiles is to convince those who do the profiling to change their approach. Instead of hiding this data from us, they could become more transparent. Instead of guessing our location, relationships, or hidden desires behind our backs, they could ask questions and respect our answers.

Instead of manipulation, let’s have a conversation. Imagine that instead of having data brokers guess who you are, you could just tell them. Sharing real information would help make your online experience (and any offline ramifications) more accurate.

Sounds too radical or naive? Not really. European law already requires companies that engage in tracking and profiling to be more transparent about it. The General Data Protection Regulation (GDPR), which took effect in May 2018, gives European users the right to verify their data, including marketing profiles generated by data brokers, internet platforms, or online media. While companies can still protect their code and algorithms as business secrets, they can no longer hide the personal data they generate about their users.

GDPR and its logic give users a good starting point for negotiating a new power balance in the data-driven industry. But what will make further transactions possible in the future is building trust. As long as we treat data brokers and marketers as the enemy and they treat us as an exploitable resource, there is no space for open conversation.

It is therefore time to treat users as active players, not passive participants. With GDPR in force and new companies building their competitive advantage on trust and transparency, new models of marketing and financing online content become realistic. Solutions that seem counterintuitive and risky may turn out to be the most natural way forward: Instead of telling users who they are, try listening to what they say.

This article is part of Quartz Ideas, our home for bold arguments and big thinkers.

https://qz.com/1525661/your-digital-identity-has-three-layers-and-you-can-only-protect-one-of-them

I bet they wouldn't give an audience to, nor merely entertain, a big-thinking, bold argument from us, though. It's all on their terms, I expect.


[underline mine - the underlined text is a link to download a PDF file directly - https://www.ivir.nl/publicaties/download/UtrechtLawReview.pdf]

"Since advertising is meant to create needs and drive you to make decisions that you haven’t made (yet), marketers will try to exploit your subconscious mechanisms and automatic reactions."
In a similar way, when I used to get receipts from a shop I shop in often, they also added alleged bargains or deals. They were not real bargains or deals, because they were trying to get me to buy things I do not buy from there. They should listen to me (as the article says) and offer a deal or bargain on the things I do buy from there, but at a cheaper price. Of course there would be a time limit on it, but one that expires after I would be due to buy those things again. As it stands, their "temptations" neither get me to buy things I don't buy, nor keep me as a loyal, satisfied customer, because I buy only the things I want from there regardless - as long as they keep stocking them; if not, I'll go somewhere else to buy them, or buy something else. I also stopped using the "loyalty/points cards" as well. Why should I let them keep yet more tabs on me?

"Since they cannot expect that you will tell them how to do this, they hunt for behavioral data and employ algorithms to find meaningful correlations in this chaos."
I also can remember seeing something which said that they would be putting cameras/technology in aisles and behind products to watch and see who buys what, as well. I can't find anything about that online but I know I didn't make that up.

"Market players do not care about you—they care about numbers."
Yeah, how many zeros they can add to the end of the money they have scammed out of you.

It has been said that everything put on the Internet remains there, and that deleting - or trying to delete - your social media account, or things within it while still keeping it active, never actually removes the data. Not to mention that between your computer and any website there are several computers in between that your traffic has to pass through before anything is uploaded to, or downloaded from, that website. These are called 'hops'.

Experian TV Advert 2017 - Meet Your 'Data Self'
https://youtu.be/6rRKRipuCOg

The jew loves big data/data mining and having a big database of things. Say - don't the greys have access to filing cabinets and sticky notes full of things, as well?

Regarding "If you happen to not be a typical individual, showing unusual characteristics...", security through obscurity, according to some, is not a good idea; likewise, if you stand out in a crowd, then you are more easily noticeable, whereas if you blend in, that makes you more anonymous. For example, if everyone in your country does not encrypt their Internet traffic and does not use a VPN and does not use a Virtual Machine, then they all merge together; however, if only you do, then you stick out like a sore thumb - what do you have to hide and why?

Regarding "Why would they? You’re a single data point in a wave of billions. Why make an exception in the system just for you?..." we can manipulate energies to cause that employer or bank worker, etc. to accept us.

The article said "Scoring systems in the West place the same blind trust in big data". Practically everything we do is based on trust - paying your bills, you trust that the jew is not over-charging you for your use of amenities/facilities; you trust that your bank and online accounts for shopping and things will keep your money and information safe; etc...

The article said "But even if we make that effort, we cannot control what is observed and interpreted by algorithms. The second and third layer of our profiles will continue to be generated by machines."
There have always been ghosts in the machine. Random segments of code, that have grouped together to form unexpected protocols. These ghosts are added deliberately, though, of course. There's a glitch in the matrix. I'm afraid Isaac Asimov and his so-called "Three Laws of Robotics" are being ignored and all forgot about. It's a good thing that there is also a glitch is the jewtrix!

"The data protection regulation GDPR that was put in place in May 2018 gives European users the right to verify their data, including marketing profiles generated by data brokers, internet platforms, or online media."
Consider Brexit. It said "European users" as opposed to "European jewnion victims". Also - I'll ask you once more - Whence cometh "rights"?!?! Can you believe it that we didn't have "rights" to know things about ourselves?! The fuck is that? "god" damn it, "adam"! Fucking eating of the tree of knowledge of good and evil! Look what you've fucking done! Sheeple were blissfully ignorant once upon an oblivious crime!

"As long as we treat data brokers and marketers as the enemy and they treat us as an exploitable resource, there is no space for open conversation."? Exploitable resource? You don't say. The Earth has become more and more polluted and toxic in the last few centuries thanks to the dirties who just want to sell us shit which we don't want nor need. Exploitable, indeed. The reason "there is no space for open conversation" is because money talks, makes "the world" go round, and is very...well...dirty. Dirty cash handled by many, but understood by few.

"All money belongs to 'god'" (the jew thinks it is "god") - look at the amount of debt there is. I was told the jewmerican federal reserve has an automatic and deliberate $9 or $10 debt on top of every single $1 borrowed, so it is impossible to pay it back. To whom, may I ask, is the debt owed? Think thusly - everyone pays back everyone the debt they owe. Money will be 'going round' and round and round, back and forth. Think of it this way - you're in a shop and you need to tend 1.50, and someone else tends 2.00. You both give 2.00 to the cashier, or the other customer skips a step and gives 1.50 to the cashier and the 0.50 to you (if my maths is correct!). The extra step in the middle is unnecessary.

Everyone repays debt back to whomever as necessary, or money is moved to the final position/step and work or service is done to compensate, as necessary. (Maybe I missed some things out which I didn't consider, but I think that makes sense, if you know what I'm trying to say.) The jew requires you work through "official channels", with unnecessary extra steps - not just in this sense but with anything. If the unnecessary extra steps were removed, things would progress and advance a lot quicker.

Contrary to what the article said, there is place for debate/open conversation - on political debate programmes where it is biased and unfair, so the taking is given to the Left which shuts down the Right.
 

Al Jilwah: Chapter IV

"It is my desire that all my followers unite in a bond of unity, lest those who are without prevail against them." - Satan
