I see TrustArc in use at quite a lot of sites - the fake delay, and the whole UX in general, is intensely irritating, and it feels like the darkest of dark patterns. It really gives me a bad feeling about sites that use it.
I find I want to use the internet to browse sites less and less.
Not just because of these dark patterns - usability in general is a mess. The web should be redesigned to enforce standards compliance, requiring websites to offer "no script" support where you can just get the information.
Cookies are not even remotely the largest problem on the fucked UX web we have today. It’s less about data delivery and ubiquity of the original WWW concept and more about “how do we force users to stay on our platform” or “how do we extract data on our users and sell it to the highest bidder.”
They also need to pass laws forcing companies who sign users up for services to offer a graceful way to cancel and delete their account, instead of these stupid cookie banners.
Take IMDB - in the early 90's it was a fun little database hosted at the University of Cardiff, obviously limited content-wise but responsiveness was certainly not an issue. Fast forward to today and the modern behemoth it has become is essentially unusable (at least on my tablet) for quick enquiries, what with all the ads/video whatnot clogging up the homepage plus the utterly borked predictive search textbox - eurgh.
> Subsets of IMDb data are available for access to customers for personal and non-commercial use. You can hold local copies of this data, and it is subject to our terms and conditions. Please refer to the Non-Commercial Licensing and copyright/license and verify compliance.
> The dataset files can be accessed and downloaded from https://datasets.imdbws.com/. The data is refreshed daily.
> Each dataset is contained in a gzipped, tab-separated-values (TSV) formatted file in the UTF-8 character set. The first line in each file contains headers that describe what is in each column. A ‘\N’ is used to denote that a particular field is missing or null for that title/name. The available datasets are as follows: [...]
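For anyone curious, the format quoted above is simple to work with once gunzipped. A minimal sketch (the tab separation and the '\N' null convention come from the quote; the sample column names are assumptions based on the ratings dump):

```typescript
// Parse one of the IMDb TSV dumps (already gunzipped) into records.
// The first line is a header row; '\N' denotes a missing/null field.
function parseImdbTsv(text: string): Record<string, string | null>[] {
  const [headerLine, ...rows] = text.trimEnd().split("\n");
  const headers = headerLine.split("\t");
  return rows.map((row) => {
    const record: Record<string, string | null> = {};
    row.split("\t").forEach((cell, i) => {
      // "\\N" is the literal two-character sequence backslash-N
      record[headers[i]] = cell === "\\N" ? null : cell;
    });
    return record;
  });
}

// Assumed sample data in the style of title.ratings.tsv:
const sample = "tconst\taverageRating\tnumVotes\ntt0000001\t5.7\t\\N\n";
console.log(parseImdbTsv(sample)); // one record, with numVotes parsed as null
```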
Good to know—I never would have found this though as I’d expect a web database to work as such (e.g. not require me to download and parse the database in Excel). On the usability front it doesn’t help.
I’d wager the license does not permit republishing the data on the web using better UX paradigms either—though I am too lazy to read through the fine print.
Ugh, yes, perfect example—every time I go there nowadays I can’t find anything. Often I try to find an actor’s or director’s filmography, but it takes a good 20 seconds to find the tiny links squeezed between the ads!
Not to mention that the movie ratings are derived from fake reviews, and you have to look at the distribution of individual user reviews to gauge if they are genuine and decide if the movie is worth watching. If it's lots of 1 and 10 ratings, the movie is probably crap. Many 6-8 ratings and you're good to go. The nice thing for bad movies is that there's always a low-rating review which is a great read that gives you a good laugh.
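That heuristic can be sketched as a tiny function over a rating histogram (the thresholds here are made up purely for illustration):

```typescript
// Toy version of the heuristic above. `histogram` holds vote counts
// per star: index 0 = 1-star votes, ..., index 9 = 10-star votes.
// Thresholds are invented for illustration, not measured.
function looksAstroturfed(histogram: number[]): boolean {
  const total = histogram.reduce((a, b) => a + b, 0);
  // "lots of 1 and 10 ratings" suggests vote brigading
  const extremes = (histogram[0] + histogram[9]) / total;
  // "many 6-8 ratings and you're good to go"
  const middle = (histogram[5] + histogram[6] + histogram[7]) / total;
  return extremes > 0.5 && middle < 0.25;
}

console.log(looksAstroturfed([40, 2, 2, 2, 2, 3, 3, 3, 3, 40])); // true
console.log(looksAstroturfed([2, 2, 3, 5, 8, 20, 25, 20, 10, 5])); // false
```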
You also get adverts before you watch trailers (but trailers are ads!) and you can't watch the trailers full screen because if you rotate your phone it keeps the video small and puts lots of distracting text next to it.
They must be trying to maximize some bounce rate metric. I had to make a bookmarklet just to do the simplest of tasks because I can never see the user reviews scrolling the title's page.
As a web developer I can only agree with all that you said. It's like some companies are actively trying to make visitors leave their websites and never come back. Or at least avoid them as much as possible.
Also as a web developer, agreed, but unfortunately it’s like spam. I’d never buy fake sunglasses or penis enlargement pills from some email that appeared in my inbox, but the fact that I still get these means it’s profitable. As long as dark patterns are profitable, they’ll be around.
Old people. I've helped a lot of old people with their PCs, and they call me about PayPal scams when they don't even have PayPal. They want to donate to someone because of a nice horoscope email they got.
Yea makes me so sad. When I still did computer repair the number of genuinely nice people being scammed in one way or another was depressingly high.
Unfortunately some cannot be helped as they just want to feel like they are helping/connecting with someone.
Sometimes, I swear, the people I'd be helping knew; they just didn't care, they were that desperate for human interaction.
It's a tale as old as time. I remember spending far too many hours as a kid convincing my grandmother not to send various people/fake businesses money (you owe the IRS! Your power is going to be shut off!) and trying to curb her purchasing on QVC (I ended up later having a roommate who worked for them - the horror stories I could tell!).
Partially true. If you make it too easy for the customer to get what they want, they won't have the time or inclination to engage with your true content: your brand and your ads.
So, increase the complexity. A certain number of savvy people will click off immediately, but they were likely too clever to bother with your game, so no love lost there. The ones who stick with it are your ideal targets. They'll wrestle through anything you throw at them to get what they want as long as you keep teasing that there's a light at the end of the tunnel.
However, you also can't make totally random, nonsensical design choices. They need to have a pattern that guides them in the darkness to the actions and outcomes you want most. It's your site after all, and no one can or will tell you how to run it as long as you show those retention and engagement numbers going up, so remember that.
Now that they're spinning in circles clicking on anything and everything trying to understand the labyrinth you've set out for them, you've got nothing to worry about. They've sunk their time and attention into solving this, and you're going to have them come back soon with a new strategy they're excited to try to beat the game. Once they feel like they've beaten you, you've won. They'll keep coming back just to prove to themselves that they've figured it out. They'll search on behalf of their friends and family, rising in their estimation.
Now they smirk and with just a hint of pride say whenever someone complains about how hard your site is to navigate, "Oh, it's not that hard. You've just got to know how it works."
starbucks.co.uk works fine for me without trustarc, newrelic, googletagmanager or cloudflareinsights. No point executing all of that extra JS as it's not for your benefit.
I believe the "AdGuard Annoyances" list in uBlock Origin's settings does this. It also blocks another offending cookie script that pops up every time if you opt out of non-essential cookies.
I don't see them anywhere other than as a light red rectangle in the uBlock popup - they're not in my whitelist so they end up blocked by default. Things tend to work just fine without it, the same goes for many other "trust"-related sites. Some sites won't work at all without them but, fortunately, the web is a large place full of choice.
I am not entirely sure that a fake delay here is a dark pattern ... Computers can be blazingly fast, and usability principles therefore allow the use of a "fake delay" to convey the perception that something is happening or has happened. (See https://stackoverflow.com/q/536300 )
That link doesn't contain anything that justifies adding prolonged delays to applications. It documents that people can perceive sub-second delays, but this delay is tens of seconds. It's also only occurring for specific choices. That rules out any reasonable argument that it's a usability aid.
> That link doesn't contain anything that justifies adding prolonged delays to applications.
It does in the context of usability:
> What I remember learning was that any latency of more than 1/10th of a second (100ms) for the appearance of letters after typing them begins to negatively impact productivity (you instinctively slow down, less sure you have typed correctly, for example), but that below that level of latency productivity is essentially flat ...
> That's for visual feedback that a specific input has been received. Then there'd be a standard of responsiveness in a requested operation. If you click on a form button, getting visual feedback of that click (eg. the button displays a "depressed" look) within 100ms is still ideal, but after that you expect something else to happen. If nothing happens within a second or two, as others have said, you really wonder if it took the click or ignored it, thus the standard of displaying some sort of "working..." indicator when an operation might take more than a second before showing a clear effect (eg. waiting for a new window to pop up).
> but this delay is tens of seconds.
Oh, I wasn't aware of that - in that case it's of course unjustified and definitely a "dark pattern".
IME this isn't unique to Starbucks, every single site that uses TrustArc does this.
Thankfully, I haven't had to deal with any of these stupid pop-ups since installing the 'I don't care about cookies' add-on. [1]
Related question: Does anyone have experience using 'Stardust Cookie Cutter'? [2] Is it better than 'I don't care about cookies' or does it do the same thing?
You should not use "I don't care about cookies", it is broken by design and chooses the wrong policy. Instead, use Consent-O-Matic. This kills consent banners but preserves privacy to its best ability.
You could also block the trackers with uBlock Origin. I use “I don’t care about cookies” because I don’t care about cookies. Not because I care about what settings are being set. I trust uBlock and other privacy protecting settings to actually protect privacy instead of the cookie prompts.
My issue with the I don’t care about cookies add-on is that it auto accepts all of the marketing and tracking cookies, doesn’t it? I would love something that auto declines everything.
Most of the notices are implemented in such a way that the tracking is enabled by default and clicking "decline" in the notice sets a cookie saying "opt out" to all the trackers (whose effectiveness is probably equivalent to the "evil bit" in IPv4).
Blocking the notice (or ignoring it) is technically equivalent to opting in. Of course, if you're using a competent ad blocker it's likely that the trackers themselves were also blocked, making this a non-issue.
* it's hard/impossible to prove - given how many factors go into ad targeting there isn't a conclusive way to prove whether an ad was targeted because of illicitly-collected data.
* as you see here, even blatant GDPR breaches and acting in bad faith don't actually land you in trouble, so although it's "illegal", the law isn't enforced. Experian (or Equifax) got caught by the ICO knowingly misusing people's credit data for marketing purposes and all they got was a warning, so clearly the message is that "breaching the GDPR does pay".
Ah, I didn't know that, but I run uBlock Origin anyway, which should block nearly all of that stuff.
It does appear (from their website) like the aforementioned Stardust can auto-decline everything, but I haven't tried it myself.
One problem you run into when declining cookies is that on many sites you won't be able to view embedded YouTube videos, tweets, etc. unless you go back in and allow social media cookies.
Keep in mind that data processing consent forms cover more than just cookies - providing consent and then deleting cookies still allows them to stalk you based on IP address, browser fingerprint, etc.
It is illegal, at least from the point of view of the GDPR which is what these pop ups are supposed to comply with.
You could argue that the artificial delay is implemented as a way to dissuade people from declining which would fail the idea that data processing consent should be freely given (you can’t force people to opt-in).
You could also argue that even if there was a legitimate technical reason for the delay then it wouldn't be compliant because it would prove that data processing is enabled by default before the user opts-in (otherwise the delay should be on opt-in and opt-out should be instant as it's essentially a no-op).
TrustArc essentially provides "breaching the GDPR as a service" and their continued existence proves the incompetence of the data/privacy regulators in all EU countries.
It may also be a sign of how much national governments care about privacy, compared to the EU parliament which voted for the e-Privacy Directive.
I suppose the counter-argument would be that passing legislation is cheap but enforcing it costs money, and governments have other priorities. In the UK, however, there can be fines of up to £500,000 for breaches of the e-Privacy Directive[0], which should be more than enough to cover the cost of an investigation.
Please stop doing this. Not everything you don't like needs to be illegal, and taking your business elsewhere has literally never been easier in the history of the world.
I don't want to live in a world where the criminalization of everything that ever happened that you didn't like means that I'm always breaking the law.
If HN is not the place to discuss web regulation, I don’t know what is. New things pose new and unexpected harms and nuisances. Regulation is the cure.
On the flip side, this particular dark pattern was caused by regulation. As usual, shades of gray.
> I don't want to live in a world where the criminalization of everything that ever happened that you didn't like means that I'm always breaking the law.
You'll be fine as long as you're not in the habit of doing things like this that are clearly outwardly hostile to everyone you come into contact with.
How many people not liking murder or theft did it take to make it into a law?
How many people not liking gaslighting personal-data-theft dark patterns will it take to make it into a law?
We're transitioning from purely physical beings to having a more virtual presence. Virtual crimes are much less visible and have much greater impact at scale than their physical counterparts - identity theft via an Equifax-style breach or a hack, vs. physical force or pickpocketing, for example.
No, they are noting that both were previously socially acceptable, until society demanded change, made both illegal, and provided services to enforce those laws.
The impact of violating privacy is neither increased nor decreased by the impact of theft and/or murder. Comparing theft and murder, theft in general is less impactful than murder: with theft I'm deprived of property and potentially physically injured, while with murder I am deprived of life itself.
That murder is generally more impactful doesn't make theft more acceptable/less bad; we should have laws for both.
"Cookie banners" are a misnomer. GDPR rules apply to all persistent personal identifiers, not just cookies. (And likewise, they do not apply to cookies which are not personal identifiers or are critical for site functionality)
I seem to recall a relation between loading time and visitor retention? A quick search gives dozens of statements along the lines of..
A 1 second delay in page response can result in a 7% reduction in conversions. [1]
47% of consumers expect a web page to load in 2 seconds or less. [2]
40% of people abandon a website that takes more than 3 seconds to load. [3]
...etc
Either those cookies make up for the lost business, these statements only hold for the initial page load, or these statements are factually incorrect. I suspect they only hold for the initial page load: the spinner and the slowly-but-surely updating fake counter hold visitors enthralled awaiting the final outcome.
Anyway, the path is clear: close that Starbucks tab after ~2 seconds of faked cookie setting time and get your caffeine kick elsewhere.
[1,2,3] just search for it - most results are commercial entities trying to sell some "marketing" or "website enhancement" service which I do not feel like boosting by linking to them. Much of the original research seems to come from Google and can be found in a report titled “The Need for Mobile Speed".
There’s a big ol’ “Cancel” button that stays live during the “processing time”. They’re trying to get you to click it in frustration, which will reset your cookies to maximum intrusion levels before you go about your business on the site.
The delay is not specific to Starbucks' implementation. Every TrustArc popup has such a delay.
Considering everything else about it also screams bad faith, I think it's a deliberate tactic to train people to click "accept" on these so they can then boast about how their "consent" management platform provides better conversion, which in turn somewhat justifies the salaries of the oxygen wasters in the marketing/advertising departments.
There is a lot of value in being able to identify users, and Starbucks is 100% constantly weighing the gain they get from identifying users against the losses they incur from an increased bounce rate. (I work in adtech and am part of similar tests for similar brands.)
Almost certainly these tests do not take into account long-term effects on users' opinions of the brand, etc.
Those numbers are for people stumbling onto your website. If people are forced through your WiFi because they’re in a foreign country with no data roaming and really need to connect to the internet, those statistics mean nothing.
Being privacy minded and traveling during covid has been a nightmare.
"In January 2006, Harvard economics researcher Benjamin Edelman published a study showing that sites with TRUSTe certification were 50 percent more likely to violate privacy policies than uncertified sites."
And perhaps ironically (if it was ever honestly true - maybe it never was the intention), TrustArc was nominally/purportedly started, as TRUSTe, to promote privacy. A lie, perhaps.
"TrustArc, was founded as a non-profit industry association called TRUSTe in 1997 by Lori Fena, then executive director of the Electronic Frontier Foundation, and Charles Jennings, a software entrepreneur, with the mission of fostering online commerce by helping businesses and other online organizations self-regulate privacy concerns."
Let's Encrypt was also started by the EFF. It's been doing some shady business with its authority and the trust it has accumulated since the internet's heyday. I wonder when it will betray the community.
How many companies seriously start revoking visa sponsorship the moment an employee pushes back on a Jira ticket?
Really the issue is being fired over it isn't it? The visa just makes being fired worse for employees requiring one.
I would hate to work at a company where a bit of debate on 'is this really a good idea' were a firable offence; sounds like the 'believe it or not - jail' scene from Parks & Recreation! That's satirising a visiting delegation from a developing country under military rule.
>I would hate to work at a company where a bit of debate on 'is this really a good idea' were a firable offence
Right, and you don't have to because your continued existence in a country isn't dependent on it. Companies with attitudes like that don't reveal it until it's too late.
I just think the existence/prevalence of such places is being wildly overstated... Especially without any sort of 'hey, stop pushing back on every issue, shut up and do your job or you won't have one' type warning.
But then I've never lived or worked in the land of the free, so what do I know.
It just seems even less likely to be such an issue to me than unavoidable things like personality clashes, or getting stuck on some work that someone you report to thinks should be quick and easy, etc.
Same for if you’re disabled, your partner has a medical condition… moving between jobs can be cost less for some, but changing jobs is not cost less universally.
Which means in a given developer pool, there’s usually at least one person who “won’t put up a fuss about implementing industry standard code”.
You absolutely do have a choice. You just have to decide where your line is.
If you are asked to commit a crime by your employer, do you go ahead and do it for the sake of keeping your job? What about something legal yet highly questionable on moral grounds? Going ahead with an annoying UI feature you don't agree with is probably justified if the alternative is getting deported, but there's a threshold somewhere and it's different for everyone.
Yes, because America, not China, has exacerbated the conditions in India to make it so you have no choice but to come to America. There won't be a timeline where the US relaxes its immigration laws to make something like this possible.
Maybe in the US, but not in Europe. Health care and the education of your children are largely not dependent on your job. Shades of grey of course - I don't think university education is free in the UK anymore (it was when I was at university), and private health care does exist.
So you literally do have a choice. There’s just a consequence (There are worse things than not living in America, hard as that might be for you to imagine.)
You're going to say no to writing timeout code and risk your entire family going back to your country?
What if you have kids who were born here? You're now going to take them back to a place they don't know? Or are you going to let them go to foster care here?
How about if you've bought a house here? You're going to have just a few months to settle everything before you can go.
Why would anyone contemplate leaving their children behind in foster care? That seems an absurdly hyperbolic suggestion. People move countries for work and take their children with them all the time. I mean, if the country was Syria and you literally risk death, then I could understand, but people in that situation are refugees, not on work visas.
Perhaps I’m finding this hard to understand since I have no desire to live in the US whatsoever and have turned down several offers from companies that wanted me to move there. Nice place to visit, but I can think of dozens of places I would rather live.
The kid is a US citizen and doesn't know any language other than English. Depending on the home country that might be a huge huge obstacle for the kid.
I was once that kid who had to go back to Iran without knowing Persian. It was fucking terrible.
Yes, moving to a country is hard when you don’t speak the language. Can I ask, why did your Iranian parents not ensure you learned Persian if moving back was a possibility?
If you're here from a country that you had to escape, then you're a refugee, and unlike H-1B immigrants, refugees don't have to leave just because they lost their job.
There's a ton of people who had to (or really wanted to, makes no difference in this case) escape a country that did it using their skills as a programmer.
But the vast majority moved because they could make more money in the US. They behave unethically because they don’t want to give up that income. It’s a purely mercenary calculation. And it does make a difference whether it’s a want rather than a need. I want lots of things, but if I behave in a shitty way to get them I should be condemned.
I’m sure there are some people in the unfortunate position you describe, and in their case it’s understandable. But it’s not the general case.
People who moved for the money likely went to FAANG instead.
Actually, I got interested in this and checked TrustArc's careers page, and it seems most technical positions are in the Philippines/Canada with a mention of a "global team", so I'm now convinced this whole thread is arguing about strawmen; in reality the product is being written by remote contractors in a third-world country who would be easily replaced by a million others if they refused.
The corporate leaders that make the decisions are the ones that should resign on principle. Not the theoretical H1B employee who would uproot their family, derail their career, distress-sell their house, leave the country, etc., over the setTimeout line.
Is being an H1B worker now an accusation? I was pointing out that our visa system leaves a huge chunk of our workforce without a strong ability to stand up to their employer.
What if 3 US citizens said "fuck no", and it was someone on much shakier ground who felt the pressure to say yes? Not a given at all, just makes it more likely.
It really seems horribly racist to just assume that it must be the “lesser moral” H1B employee who made this dark pattern happen. You have zero evidence, no indication that’s the case, and are speculating wildly.
It isn't even slightly racist. It isn't a comparison of morals between the H1B worker and U.S. citizen; the employer simply has more leverage over the H1B worker.
I'm not seeing any claim that an H1B employee is less moral, only that a person (regardless of visa status) can be coerced into doing something they would rather not do.
Also, there is no inherent racial component to an H1B or other status.
The example of an H1B person seems to have been provided only as a sample to further illustrate the point that "Just quit on principle, rather than implement this thing!" is often not an acceptable action due to other effects.
This whole thread has nods to moral high ground US citizens, versus the immoral scared H1B workers, who are /obviously/ the only ones who would implement such a dark pattern. Mind you we’re discussing an American company, working with American clients.
You seem to be one of those “assume good faith” people, who knows exactly what the others actually mean.
>Mind you we’re discussing an American company, working with American clients.
And everyone in this thread is discussing how that company could be using American laws to pressure workers. This thread is an indictment of an American system, no one is blaming the H1B workers.
I'm out of the job market myself (retired) but isn't every developer's job stability above average? Isn't everyone looking for developers now? If you're one to take a stand then now is the perfect time. Unless you're in a H1B situation as mentioned by the comment below.
Yep, but that is a dangerous path/metric. Beyond this specific "artificial" delay, think of the millions, billions, trillions of seconds humanity has lost - particularly the poorest - waiting for stupidly bloated sites to load on a slow connection (or even worse, a metered one) when the same content and message could have been delivered with 1/10th or 1/100th of the bandwidth ...
If you adopt this kind of metric/moral stance, any web programmer working in the last 20 years is guilty ...
There are things you can refuse to work on and there are things that you have to put in effort to make better. Improving a bloated site to load faster is not as trivial as refusing to put a dumb timeout to slow things down. I still think the OP is totally overreacting and even calling something as stupid as this a "dark pattern" belittles truly horrific things that are happening in the world including the cyber world.
We’re talking about “dark patterns” here, not crimes against humanity. I think this is unambiguously evil, but that doesn’t automatically make it as bad as murder.
Ethics should follow the same standard normal distribution model as everything else, which means that 50% of the population has less-than-average ethics.
Not everything follows a standard distribution model. In fact, since some psychological tests are designed to return a standard distribution result, if the traits do not occur in the population along a standard distribution, the psychological tests are designed in a way which will give inaccurate results.
Despite being called normal, not as much as you would think follows a normal distribution. But if your main point is that there is some mean value of ethicalness and 50% fall above and below that value, then I suppose there's not much to argue about there.
It's also common for more or less than 50% to fall below the median. The average M&M fun size package has 15 M&Ms (mode, median=15; mean=15.02). Only 25.6% have fewer than that. As fumeux_fume stated earlier, not everything follows a normal distribution.
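The M&M numbers above can be reproduced with made-up counts that match the quoted statistics (hypothetical data, purely illustrative):

```typescript
// Toy version of the M&M point: in a discrete distribution with a
// big spike at the median, far fewer than half the values can fall
// strictly below it. Counts are invented to match the quoted stats.
const packages: number[] = [
  ...Array(26).fill(14), // ~26% of packages have 14 M&Ms
  ...Array(60).fill(15), // most have exactly 15
  ...Array(14).fill(17), // a few larger packages pull the mean up
];

const mean = packages.reduce((a, b) => a + b, 0) / packages.length;
const sorted = [...packages].sort((a, b) => a - b);
const median = sorted[Math.floor(sorted.length / 2)];
const belowMedian =
  packages.filter((x) => x < median).length / packages.length;

console.log(mean);        // 15.02
console.log(median);      // 15
console.log(belowMedian); // 0.26, far fewer than half strictly below
```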
Since morality is socially mediated, I think it's reasonable to hypothesize it would tend to be N-modal.
This argument can justify anything on that basis, from fraud to murder or slavery. By withdrawing your services, you reduce supply, increasing the price and creating a financial penalty for those trying to enact it.
Am I the only one seeing the irony of it? You are asking the guy who already added a ton of JavaScript junk to the website to have concerns about one delay function?
There's plenty of people who would do whatever they're told, regardless of their own principles, as long as they're paid for it. I'm not one of them, sure, but as long as there's just ONE person like this, we can't have nice things in the long run.
It's one reason that software engineering should become a real Engineering profession and not just a title. If your employer asks you do something unethical, it would give you grounds for pushing back. Who would risk losing their license to practice because of a deceptive cookie notice?
Another poster mentioned H1B visa holders, and I'm sure that is a valid concern there, given how poorly H1B holders are treated. But as a citizen, I've heard this many times, been told it to my face during sit downs with my boss while refusing to do something shady, and it has never happened. On two occasions, the threat was idle. On two more, I quit and they never did find anyone to replace me.
But regardless, even if it were true, you still need to protect your own soul. Better to let someone else corrupt themselves.
Except it's not just Starbucks. It's all sites that use TrustArc. TrustArc is a scummy middleman that is extracting money in the name of privacy without providing any serious protection (except to the companies who pay their protection money). I worked with them when I whored my services to a list broker as a contractor for a brief time. They are a virtual money printer because their certifications are so incredibly expensive for what they actually provide.
So a dumb one-minute timeout is the thing that pushed you over the edge? How would you feel if you had to work on drones that kill people, or as a nuclear scientist on the Manhattan Project?
When I work on drones that kill people I choose to work on it from the start. There’s no doubt that they’ll be used to kill people since it’s an explicit design goal.
The same is not true for a ‘cookie selection dialog’, where the stated goal would be to allow people to easily select what cookies they want to allow.
The Docker homepage (https://www.docker.com/) is even worse. It takes minutes in my case when I press on "Essential Cookies only". And this is reproducible. It's like this for more than a year now.
Even more fun, when I actually dig down into the advanced settings I can't turn off some cookies, like:
"Bizible - Do Not Use
bizible.com
No Opt Out Mechanism
Bizible enables you to drill deep on settled and projected ROI of online advertising, so you can make data-driven budget decisions based on revenue."
That's also why I like the web, instantly anybody can pull the inspector and see right through their crap and dark patterns, good luck doing that on mobile.
Some newspaper sites start autoplaying a little video window, and if you click the "close" X, the player keeps playing for several more seconds with a phony subtitle saying "shutting down" or "closing".
BTW, why do so many sites do whatever they can to force a video to play when you click on them?
What about Hanlon’s razor[1]? Are we sure this is a fake delay and not just some bad engineering? E.g. a bad case of long polling. Sure, even in that case it would still be a dark pattern; I just want to make sure we’re not assuming too much.
You're right, it takes much longer to pick "no" or "customise" because of the way 99% of the tracking tools implement it: consent has to be opted out of, and to do the opt-out the site loads a pixel for each vendor that places a special opt-out cookie. Not saying this is a good thing, but it is a reality that loading so many opt-out pixels takes time.
So there is a big chance that this is a lot of outrage over something that isn't actually a dark pattern.
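The opt-out-pixel mechanism described above can be sketched roughly like this. This is a hypothetical model, not any real consent tool's code; the vendor names and latencies are invented. The point is that if each vendor's opt-out pixel is fetched sequentially, the waits add up, whereas fetching them in parallel would only cost the slowest vendor.

```typescript
// Hypothetical model: each vendor requires its own opt-out "pixel"
// to be fetched so it can set that vendor's opt-out cookie.
interface Vendor {
  name: string;
  optOutLatencyMs: number; // time to fetch that vendor's opt-out pixel
}

// Fetching pixels one after another: total wait is the sum of latencies.
function sequentialOptOutDelay(vendors: Vendor[]): number {
  return vendors.reduce((total, v) => total + v.optOutLatencyMs, 0);
}

// Fetching them in parallel: total wait is only the slowest vendor.
function parallelOptOutDelay(vendors: Vendor[]): number {
  return vendors.reduce((max, v) => Math.max(max, v.optOutLatencyMs), 0);
}

// Invented vendors for illustration.
const vendors: Vendor[] = [
  { name: "ad-network-a", optOutLatencyMs: 400 },
  { name: "analytics-b", optOutLatencyMs: 250 },
  { name: "retargeter-c", optOutLatencyMs: 600 },
];

console.log(sequentialOptOutDelay(vendors)); // 1250
console.log(parallelOptOutDelay(vendors));   // 600
```

Even granting the point that opt-out pixels take time, a tool that cared about the opt-out experience would fire them in parallel; a long sequential wait is a design choice.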
Except I guarantee that if choosing the "allow all cookies" option took 50 seconds, the very first thing that product team would prioritize is getting it down to sub-second.
Granted, it could still be some sort of a polling situation vs. just a deliberate fake "make this take a really long time", but it still doesn't matter - it's still a dark pattern because the site owner is deliberately OK with the "opt out" solution being so onerous that hardly anyone would wait that long.
I just want to add that with Firefox's Containers and the Temporary Containers add-on I usually just accept whatever, because the cookies are not shared between my tabs, and the temp container is deleted 15 minutes after closing the tab. So while I technically accept, it's not much of a privacy violation anyway.
They're trying to be clever and punish people who enjoy their rights of GDPR. The EU will not find these patterns cute or acceptable, and fortunately their fines are large enough to cause the offending business real pain. It's only a matter of time!
Though the practice is always deplorable, I don't get why Starbucks would want this kind of thing. Sure they might be able to sell your data for a little extra cash, but why would they make it harder to buy coffee on their site?
I am very tired of having to deal with cookie banners on just about every site I go to. Almost nobody wants tracking cookies. And on many sites, I've noticed, it's hard to tell whether the tracking cookies are enabled or disabled.
I don't remember the site, but one single time so far a website told me "we picked minimal cookie settings for you, since you sent a Do Not Track signal". Very nice.
I have seen this in the distant past as well. The one I remember best provided a link to view your opt-in choices. Clicking on it showed which "minimal" cookies actually affected how the page worked or fed other non-tracking "features", and which ones were not included, so that you could opt in to some things if you were interested in them (which I'm sure hardly anyone ever was).
The legislation should have required sites to honor DNT (or something similar), with your browser prompting you. Doing it site-by-site is a headache for both users and companies.
> Almost nobody wants tracking cookies.
It's complicated because a lot of people do want to stay logged into certain websites. Even if that's not "tracking," what about recommendations? Youtube does recommendations for logged-out users, and I suspect a lot of people find some value in that.
I tried the cookie settings on TrustArc's own site (https://trustarc.com) and they don't appear to have the timeouts. Though they do have a weird way to select "Essential Cookies Only". You have to say no to "Functional" and "Advertising" cookies, separately...2 clicks instead of 1.
Accepting cookies should be a client side action... browsers should quietly accept all the cookies, and once the tab/window is closed, delete them. There should be a separate button near the address bar to keep the cookies between restarts, and users should be prompted when logging in (as they are with "save password?" - 'yes' - 'no' - 'never').
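The proposal above could be sketched roughly as follows. This is purely illustrative; no browser exposes such an API, and `SessionCookieJar` and its methods are invented names. The idea: accept everything into a per-session jar, and on tab close discard all cookies except those for domains the user explicitly chose to keep.

```typescript
// Sketch of the proposed behavior: accept all cookies for the session,
// then on tab close keep only cookies for domains the user pinned.
class SessionCookieJar {
  private cookies = new Map<string, string[]>(); // domain -> cookie strings
  private pinned = new Set<string>();            // domains the user kept

  set(domain: string, cookie: string): void {
    const list = this.cookies.get(domain) ?? [];
    list.push(cookie);
    this.cookies.set(domain, list);
  }

  // The "keep cookies for this site" button near the address bar.
  pin(domain: string): void {
    this.pinned.add(domain);
  }

  // Called when the tab/window is closed: drop everything unpinned.
  onClose(): void {
    for (const domain of [...this.cookies.keys()]) {
      if (!this.pinned.has(domain)) {
        this.cookies.delete(domain);
      }
    }
  }

  get(domain: string): string[] {
    return this.cookies.get(domain) ?? [];
  }
}

const jar = new SessionCookieJar();
jar.set("shop.example", "session=abc");
jar.set("bank.example", "auth=xyz");
jar.pin("bank.example"); // user clicked "keep" when logging in
jar.onClose();
console.log(jar.get("shop.example").length); // 0 - discarded
console.log(jar.get("bank.example").length); // 1 - kept
```

With this model the consent banner becomes moot: tracking cookies evaporate with the session unless the user deliberately keeps them.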
1.5 seconds seems too low to me to be about user annoyance, especially since it comes after the point where the user could change their mind. I'd bet someone had to implement this and needed it to look like it's doing something.
Honestly, I actually don't understand how that can be beneficial.
First, we all know that increased loading time increases the bounce rate - so we all work really hard to minimize it.
If you add a fake loading time, you're effectively saying you don't want particular users. Why not just block the site outright if the cookie policy is not accepted? Does anyone actually accept cookies expecting the website to work faster? That sounds very counterintuitive to me.
I'm genuinely wondering what you can achieve by introducing a fake loading time, and how a company can benefit from it.
If turning off cookies is a time-wasting chore, you are more likely to accept them next time, which will allow them to benefit from tracking you. It's a net win as long as the benefit they extract from the people that get tired and accept tracking is bigger than the loss from the people that bounce.
The "beauty" of it is that so many websites in the internet are doing it, that even if it's your first time going to a website, when you see the cookie popup you already know the drill and are primed to just accept everything.
For me, any cookie setting is instant at Starbucks. Actually I've never experienced this delay with 'essential' cookies anywhere, despite hearing about it a lot. Possibly because I'm using Brave, and so those cookies are blocked by default anyway?
Wish regulators could hijack their domain and force visitors through a 15-second delay landing page explaining that they were found in violation of the GDPR and "you will be redirected to Starbucks shortly". On a second infringement, make it a two-minute delay.
One percent? Make it something significant like 50%. Better yet, figure out how much money they made with their surveillance capitalism, including any investments and profits derived from such capital. Fine them exactly that and then some for good measure.
Why? They’re literally adding friction to their purchasing process. Nothing they sell is critical. Nobody’s privacy is being violated. They aren’t lying. They’re just being annoying.
This is the most trivial non-issue one could possibly get hysterical over.
Does that mean the law doesn't apply to them? You do business in Europe, you follow European laws + the laws of the specific country you're doing business with, doesn't matter what the type of business is.
The ePD is notoriously ignored and unenforced [1]. It is also not clear what part of the law a simple delay would violate. (Most of the sparing enforcement has been around dropping cookies after someone opts out.)
No no no, it's blatantly and obviously in breach: it adds friction to the opt-out that doesn't exist for the opt-in, while the letter of the regulation says opting out must be as easy as opting in.
It doesn't get much clearer than that.
Of course it doesn't matter if the friction is 1.5 seconds artificial delay or if the friction is because you are forced to send an opt out in two copies via fax.
The only argument to why it wouldn't be in violation would be "it's too trivial" - but I don't think that's a very good argument.
It's a decent attempt at an argument, but far from convincing. One could argue it's to dissuade opting out. One could also argue it's being presented to show the opt out has teeth. (Non-technical people ascribe meaning to fantasy progress meters. A number of UI studies have shown that.)
As for opt out needing to be instant in comparison to opt in, the argument holds no water. If a legacy system were patched for GDPR, it's reasonable for the opt-out to involve more code, not less, as an extra routine undoes the defaults. That or making a record of the opt out is done tediously. (In this case, the argument is moot since the delay is fake.)
The toughest argument one could make from the ICO checklist [1] is that a one and a half second spinner delay constitutes a material "detriment" or penalization of withdrawal of consent. Those are technically true to a trivial degree, but immaterial. Far from meriting a 1% fine per the original comment.
These kinds of arguments hurt everyone working for privacy by trivializing it to a sympathetically mockable degree.
> One could also argue it's being presented to show the opt out has teeth. (Non-technical people ascribe meaning to fantasy progress meters. A number of UI studies have shown that.)
In this case, why isn't the same applied to the opt-in?
> If a legacy system were patched for GDPR, it's reasonable for the opt-out to involve more code, not less, as an extra routine undoes the defaults.
The GDPR mandates that no non-essential data processing happen unless the user opts in. Even if more code were involved in making a legacy system GDPR-compliant, that code would need to be run first (essentially applying the delay to the initial page load). Otherwise, since this consent form is overlaid on top of the existing webpage (as opposed to being on its own page with none of the trackers loaded), data is being processed until the slow opt-out process completes, which is itself a breach of the GDPR. In short, GDPR-compliant systems should work on the basis of opt-in, not opt-out. Having the delay on the opt-out proves the system assumes the user has opted in (and thus immediately processes data the user may not be willing to share) until told otherwise.
Also, regardless of the delay, the simple fact that the flow has a big prominent "agree and proceed" button which takes one click and then a less prominent "manage settings" which takes multiple clicks is enough for this to be in breach, at least according to the ICO's guidelines.
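The opt-in-first structure the GDPR requires can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; `ConsentGate` and its methods are invented names. The point is that when non-essential trackers load only on the accept path, rejecting costs nothing because there is nothing to undo.

```typescript
type Consent = "unknown" | "accepted" | "rejected";

// Minimal consent gate: trackers load only after an explicit opt-in.
class ConsentGate {
  private consent: Consent = "unknown";
  private loaded = false;

  // Opt-in: this is the only path that loads non-essential trackers.
  accept(loadTrackers: () => void): void {
    this.consent = "accepted";
    loadTrackers();
    this.loaded = true;
  }

  // Opt-out: nothing to undo, because nothing was loaded by default.
  reject(): void {
    this.consent = "rejected";
  }

  state(): Consent {
    return this.consent;
  }

  trackersLoaded(): boolean {
    return this.loaded;
  }
}

const gate = new ConsentGate();
gate.reject();
console.log(gate.trackersLoaded()); // false - no slow "undo" step needed
```

A system built this way has no reason for the opt-out to be slower than the opt-in; a delay on rejection suggests the defaults were backwards.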
wait, what? are you saying that a change in the way the site functions is not a change in functionality?
Seems your argument is that the change is trivial, thus safe to ignore; that there is some threshold below which changes to functionality don't matter. Do I read you right? i.e. "Important functionality is not changed one iota"
I see a cookie popup, I click Back. I'm usually visiting out of curiosity, and the content is nothing special. TrustArc is one of the worst - I can't back out quickly enough.
I don't know if this is the experience for European visitors, but as the Twitter thread states, this is in violation of both the spirit, and, importantly, the letter of GDPR. I really hope there are more than slap-on-the-wrist consequences for this blatant, deliberate attempt to side-step the requirements of GDPR.
I really don't understand Google. Why should I have to install some add-on to stop myself from being tracked? Shouldn't it be the other way around, i.e. let people install an add-on if they want to be tracked?
The fake "processing" delay is likely there because averaged across all visitors (not just averaged across HN commentors and voters) it increases visitor confidence that a change has occurred in site tracking activities as a result of clicking that button and hence it's there because it net increases both customer confidence and flow through to the rest of the site, as annoying as it may be to HN readers.
The problem with having three sigma or more excess knowledge about a problem domain is that solutions designed for the center of the bell curve likely won't work well for the many-sigma outlier population, and the fraction of the population out in that many-sigma part of the curve is too small for providers to justify expending significant resources there. It's not uncommon for businesses to optimize for the center of the bell curve and leave many sigma outliers poorly served, as is happening here.
I get what you're saying, but even if this design feature comes out of good intentions (which I honestly doubt), requiring the user to wait almost a minute so that it can "process" is rather excessive.
If they really needed this delay, surely it only needs to be a few seconds tops.