> People don’t want to run their own servers, and never will
This kind of gets at the reason why I think a lot of tech articles/blogs about what the future will be like are just terrible. The wants of someone who is driven enough to read and write about the bleeding edge of technology are very, very different from those of the general population. Like this author says, most people don't want to run their own web server, but I'd go even further and say most people don't really care about decentralization or even data privacy. Getting most people to care about privacy and decentralization is like getting a kid to eat vegetables: they know they should, but the alternative has more short-term benefits. I think most people care about ease of use over almost everything else.
People who write these articles need to be thinking about the middle-aged woman who still calls every video game system "a Nintendo". There will always be some users for technologies like web3, but until you can clearly demonstrate to that woman that this new technology has value and is easier to use than the status quo, you're never going to get mass adoption.
Connecting this back to web3, we're clearly not there yet. Almost anything being done on web3 is slower, more expensive, and more complicated than its web2 alternative. We may or may not get there one day, but until we do, I don't see web3 being anything more than a niche product.
It's refreshing to read an article that admits this:
> > Even nerds do not want to run their own servers at this point.
I actually enjoy building and running servers, but only for hobby purposes. When it comes to anything business-related or critical, I have zero desire to run and maintain it on my own. And I especially don't want to be responsible for securing large amounts of money that could disappear in an instant if I make one misstep.
For sure. I ran my own servers for many years. And I still enjoy playing with hardware at home. But a couple years back I shut down my last colocated physical server and I do not miss it. The background stress of knowing that at any point I might have to wake up, haul my ass down to a colo, and swap a motherboard just got to me.
Now all my must-stay-up stuff is built via Terraform in a public cloud. If there's a hardware failure, it's not my problem. It's such a relief!
> The background stress of knowing that at any point I might have to wake up, haul my ass down to a colo, and swap a motherboard just got to me.
I would miss mine terribly. I couldn't afford colo, so I hosted on a VPS for a while, but it just didn't cut it. The cloud is the same. It's kind of like having two monitors and downgrading to only one.
In all honesty, how often does that requirement come up? Did you not have failover? 2U is mandatory if you want to fully exercise colo; 4U is ideal.
> If there's a hardware failure, it's not my problem. It's such a relief!
Not for me. If AWS or Azure falls over, I'm at the mercy of their engineers to fix it, which could take hours just due to the processes involved in standing the cloud back up. And when those outages happen, they're normally fatal. If the same happens in colo, there are only three possible causes: the datacentre, the server, or a DDoS.
Granted, you can either live on the edge with no spare hardware and hope nothing dies, or have kit ready to ship and rack. My colo servers are eight hours from me, and I'm always happy to jump down to my rack to fix whatever.
But I do respect your opinion because I don't know the variables you live in. Colo forever with me.
Is hardware failure really such a common problem that it keeps you up at night? I don't get it. I've run several desktops and servers at home for decades. Other than my baby pulling keys out of keyboards, I've never had any hardware problem at all, and some of my computers are more than ten years old.
It's not a particularly common problem. But it's one I always had to plan for. I was, in effect, always on call. If I was going to be out of town, I had to have somebody to cover for me. Somebody who was on the list for physical access and knew what to do enough that I could talk them through it.
It's worth making a distinction between running a server and managing it. People don't want the hassle of managing all the complexity of server infrastructure, but they do appreciate the benefits of owning their data and the hardware it's stored on. It's just that right now, centralized solutions are the only ones available for web-scale applications.
However, that doesn't have to be the case. If you look at consumer appliances and mobile computing, you can build managed environments that are physically distributed but partially or fully managed, with the actual code and data as close to the user as possible.
IMO this diagnosis is still one level away from a more fundamental truism, which is that people don't want to pay anything for digital goods. Running servers can and has been massively simplified over the last couple decades, and I don't see any inherent technical barrier preventing it from being as simple as registering for an account on FB (i.e. anyone can do it). The deeper problem is the lack of willingness to pay (directly) for anything online.
The reason for this is complex, with lots of unclear cause and effect dynamics (e.g. did our unwillingness to pay push the ecosystem to gravitate towards ad-based revenue models, or the other way around?). The inevitable race to the bottom between competitors, under the massive incentive for platforms to centralize/consolidate (if you charged any amount for your service I can always under-price and out-compete you) is likely a major contributor. We do not exhibit such reservations against payment for anything physical, probably because of the innate sense we have that anything in physical reality should have a cost, yet not so in the digital world.
I’m not sure I agree with that. People wanna pay as little as possible, but they gladly pay for Netflix or whatever. People spend a lot of money on Amazon because Amazon makes it really easy to pay. One of the original promises of cryptocurrency was that it would make microtransactions easy and painless (with something to do about trust, but that goes in the opposite direction from what consumers would like, since it’s the provider that doesn’t have to trust the consumer, instead of the other way around as with credit cards, which allow you to charge things back).
The key is still making stuff easy to pay for. Low transaction fees. Low risk to the consumer. Low friction overall. Ideally we would want to enable that without enabling monopolies like Amazon. Because the low friction is Amazon’s real moat.
Netflix sets up a very obvious dollars-to-value relationship. "Subscribe" and watch "things you already want to watch" - easily.
Most types of online monetization fail that test: subscribe and then you'll use this website for 15 minutes, then the promise is it will do something later that will be worth $10 a month to you. They're the gym-membership of digital services.
They want you to pay to join, but you don't actually know what you're getting and you don't know if you're going to find it usable at even a minimal level. Netflix deals with this too: they sell you access to a movie catalogue, not a specific movie - built into the model is a hedge against local risk for a product which already has very broad appeal.
That's why micropayments are a neat idea. Sure, I'd pay a dime or a quarter to read your crappy news site. A quarter doesn't matter, as long as you don't bug me, I'm not subscribed to anything, and I just click. That's kind of what Bitcoin was promising... Of course, for several reasons, that doesn't actually work with Bitcoin.
Steam does amazingly well because it’s all so easy and well developed. Steam is also very conservative in its development and doesn’t add stuff for the sake of it, like so many other companies do (Norton Crypto, anyone?).
Also, we think we are there when it comes to UX, but I feel we haven’t even started to make good UX paradigms.
I am fervently anti-crypto, and I haven’t seen any argument that makes me move an inch, because all of the current alternatives are so much safer and easier. However, the idea of an internet wallet that’s distributed rather than centralized does appeal on some level. Crypto enthusiasts should focus on that more.
Agreed. There are significant audiences where cognitive load is a much bigger barrier than spending actual money. But people do want privacy, independence, and control, so I think non-centralized services could still work.
I think "virtual server" is the wrong abstraction here. It's like "radio with pictures" or "horseless carriage" in that it's telling us we haven't found the right new way to think about it.
> people don't want to pay anything for digital goods
Which brings up a different problem: Web3 assumes that everything you do online will cost money. Even assuming that fees go to zero, virtually nobody wants that. Web3 advocates will say that the money you earn will offset what you spend, but you only have to look at Patreon/Substack/OnlyFans earnings to see that it won't happen for most people.
It also strikes me that there’s an implicit requirement to “already have sufficient capital” to operate in the crypto space, even more so than in normal finance. I don’t see middle-to-low income people being willing to adopt this when any interaction will burn even more of a limited resource than normal mechanisms do.
If the majority of people can’t get in, or can’t afford to do anything in the space, is there any real chance this will actually take off?
Now I’m sure someone will respond along the lines of “crypto is an investment/asset not a currency, etc etc etc” in which case, why is it trying to do all these currency things?
Arguably, everything does cost money. You just either sell something at the same time, or someone else subsidises it for you. Neither of those approaches is forbidden in web3; it may at least be more explicit there.
More generally, though, "everything" there means state-changing operations. Read-only ones don't cost anything.
> Like this author says, most people don't want to run their own web server...
I know I certainly don't. I want to write my software and I want to be able to deploy it somewhere and manage the things I may care about for that specific software. As much as possible I don't want to have to care about hardware, or routing, or server administration, or user permissions, etc. Learning it once? Sure. Dealing with it every time I have a new project? No thanks.
So, I totally agree. Decentralization and privacy on their own are difficult to market, as they aren't nearly as in demand as convenience.
Amen. I don't even get why most companies have DevOps. For the price of one DevOps engineer you can get the most expensive plan from many providers. Running the most expensive Heroku plan (with a concierge service) is cheaper than an employee, plus office space, plus medical insurance, and so on.
And that's just the one provider I know.
I want to type git push master, and that's the end of my involvement in standing things up.
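That push-to-deploy flow can be mimicked locally with a bare git repository standing in for the PaaS remote. A rough sketch (all paths and names here are made up for illustration; on a real platform, a server-side receive hook does the build and deploy):

```shell
# minimal local stand-in for Heroku-style push-to-deploy:
# a bare repo plays the role of the "production" remote
git init --bare /tmp/production.git
mkdir -p /tmp/app
cd /tmp/app
git init -q .
echo "hello" > index.html
git add index.html
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial"
git remote add production /tmp/production.git
# on a real PaaS, a post-receive hook would take over after this push
git push -q production HEAD:master
```

The `HEAD:master` refspec sidesteps any difference between local default branch names, which is one less thing to think about in a deploy alias.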
It's like getting a kid to compost, sow seeds, tend the veggie patch, pull weeds, and ten weeks later cook and eat the vegetables.
I’m the sort of person who should be interested in web3 (I dreamt of this kind of stuff years ago, although I had no technical idea how it might work), but now that I’ve seen the culture of the space I have no interest.
"C-suite, or C-level, is widely-used vernacular describing a cluster of a corporation's most important senior executives. C-suite gets its name from the titles of top senior executives, which tend to start with the letter C, for "chief," as in chief executive officer (CEO), chief financial officer (CFO), chief operating officer (COO), and chief information officer (CIO). "
It also seems chronologically wrong since a woman who is middle-aged today would have been in the prime age group for the Nintendo Entertainment System.
I feel like it's a matter of OS improvement that will enable people to manage the software side of their own servers with as little (or less) effort than managing cloud platforms or even VPSes, ideally in a standard way. Why is learning Dropbox any easier than learning to copy a file to some other FTP-serving software? The clouds are just making money by supporting you, though that often turns on its head when they try to protect their interests. This conflict is why everything is shit right now, IMO.
If we are talking about the hardware... that might be a harder sell. But at the same time, I don't see why a company like Apple couldn't market a product like the HomePod as a personal server. It fits the privacy narrative and would be a way to drive more device sales by supporting faster local services.
Personally, I want my ISP to give me a static IP more easily so I can move in this direction without worrying about weird dynamic DNS issues. IPv6 should have enabled this years ago, but it remains an issue.
FTP is horrible. I'm glad that as a web dev I haven't touched it in 8 years, not since I worked for a hosting company in tech support. Git, or even rsync over ssh, is way better.
The upcoming generation... even the 'non-tech' people are tech-savvy, meaning most could probably get Arch Linux up and running, at least via a distro or by following the docs, where their parents would fail.
The problem is they need to create something with a big enough value proposition, but at the same time easy enough for the masses to assimilate and understand, and that serves enough utility to make it worth it.
Something like an actual currency with basic income dividends (taxed money goes to the lowest 50% who have a minimum utilization score), plus identity/fraud management, with a built-in tax and cap system so whales can't abuse it, and zero transaction fees; instead, fees are taxes on hodling and on a lack of utilization (fewer daily/weekly transactions lower your utilization score, so you might lose a couple of coins per day until you start spending more). Fraud/ID management comes in handy here so you can't just spend it to yourself or to other accounts you own. 1:1 only.
It'd need wide adoption to make utilization scores meaningful, and maybe the price could be pegged at or near the cost of a loaf of bread, somehow made global so it acts as a universal price-setter. Say a loaf of bread costs 1000 of currency x in country X and 100 of currency y in country Y; the c (coin) to y exchange rate would then be 100:1, and 1000:1 for x.
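A toy sketch of that bread-peg arithmetic, plus the "use it or lose it" fee described above. Everything here is hypothetical: the one-coin-per-loaf peg, the activity target, and the demurrage rate are invented just to make the comment's own numbers concrete:

```python
COINS_PER_LOAF = 1.0  # hypothetical peg: one coin buys one loaf anywhere

def local_units_per_coin(loaf_price_local: float) -> float:
    """Exchange rate implied by the peg: if a loaf costs P local units
    and COINS_PER_LOAF coins, one coin trades for P / COINS_PER_LOAF."""
    return loaf_price_local / COINS_PER_LOAF

def daily_demurrage(balance: float, weekly_txns: int,
                    target_txns: int = 7, rate: float = 0.001) -> float:
    """Hypothetical holding fee: coins lost per day, scaled by how far
    below the activity target the account sits; zero once active enough."""
    if weekly_txns >= target_txns:
        return 0.0
    shortfall = (target_txns - weekly_txns) / target_txns
    return balance * rate * shortfall

# the comment's example: 1000 x per loaf, 100 y per loaf
print(local_units_per_coin(1000))  # x-country rate, 1000:1
print(local_units_per_coin(100))   # y-country rate, 100:1
```

Note the peg only fixes each local exchange rate; the cross rate between x and y (10:1 here) falls out of the shared numeraire, which is the "universal price-setter" idea.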
I also feel that decentralization can be bad, while full democratization is good. DAOs would be good, assuming every member gets equal voting rights (to protect against whales), but sometimes, especially in the beginning, centralized aspects like identity verifiers could go a long way toward building something resilient and toward iterating on changes faster than blockchain tech allows. Then, when the tech is more sound in 10 years, move 100% decentralized, or only in part, if that's what the org votes on.
When this expands to include the metaverse, it becomes even more important to have liquid democracy at its core, to ensure fairness and that companies don't control everything.
> The upcoming generation... even the 'non-tech' people are tech-savvy, meaning most could probably get Arch Linux up and running, at least via a distro or by following the docs, where their parents would fail.
I don't know what members of the upcoming generation you're dealing with, but the ones I know are more computer illiterate than their parents.
Their parents played games on DOS and had to configure shit; the kids use their phones for everything and don't know how to use computers beyond a basic level.
Not a chance on that, mate. No way on Arch. Most of my neighbours can't change their wifi password. Heaps of the 20-somethings can't even run "ls" in a terminal.
> Something like an actual currency w/ basic income dividends, and identity/fraud management
Proof of Humanity is trying to do that with $UBI tokens and their method of proving who you are (basically a video of you with your wallet address saying a specific script, plus putting up collateral that can be lost if a court is shown evidence that you've signed up before). After you're signed up, you get one $UBI token every hour. $UBI tokens are currently worth $0.12 apiece, so that's roughly $1,050 USD per year (at least for the moment; it's inherently very inflationary and seems to rely on people like Vitalik Buterin, creator of Ethereum, buying a bunch of tokens and burning them).
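The accrual math is easy to sanity-check; the drip rate of one token per hour is the mechanism described above, and the price is just the spot quote from the comment, not a stable figure:

```python
TOKENS_PER_HOUR = 1    # $UBI drip rate described above
PRICE_USD = 0.12       # spot price quoted in the comment (volatile)

hours_per_year = 24 * 365                         # 8760
tokens_per_year = TOKENS_PER_HOUR * hours_per_year
usd_per_year = tokens_per_year * PRICE_USD
print(round(usd_per_year, 2))  # about 1051.2, i.e. roughly $1,050/yr
```

Of course, with an inflationary token the dollar figure is entirely hostage to the price holding up, which is the caveat above about relying on buy-and-burn.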
It does have a complex onramp, though, and will be difficult to get non-tech-savvy people onto it without some help, most likely.
We'll see if it continues to work. It's only been around since March 2021. It's an interesting idea, though.
I'm not sure I would call this "solved", since it's effectively just a replacement for DNS servers.
What I want is no additional dependencies, especially on dynamic and slow-to-propagate services. Not to mention that my current dynamic DNS (through TP-Link) seems to be filtered by a lot of firewalls or something.
ISPs providing a static IPv6 address would be a simple solution, and I could create my own DNS records for convenience. No external VPN or anything else required.
Regarding your last sentence, I think that's fine.
I know Moxie criticizes people for saying “It’s early days still”, but I really do think it's early days, and NFTs have driven crypto into the mainstream too quickly.
Crypto researchers are still chipping away at the math and computer science required to bring the web3 vision to life. What's unfortunate is that I've yet to see an article on Hacker News about this research; instead, we get articles about the hacked-together shit that is unfortunately the face of web3 at the moment.
If you're interested, I'd recommend people check out some of the following topics:
- Smart Wallets for better UX for your average user.
Does this not prove his point though? Because decentralization is harder to get right on a technical level, centralized alternatives will always outcompete more decentralized ones.
> The wants of someone who is driven enough to read and write about the bleeding edge of technology are very, very different from those of the general population.
This is very insightful. I wonder what else it applies to. I bet there are tons of media sectors writing to irrelevant but interested audiences.
> People who write these articles need to be thinking about the middle aged woman who still calls every video game system "a Nintendo". There will always be some users for technologies like web3, but until you can clearly demonstrate to that woman that this new technology has value and is easier to use than the status quo, you're never going to get mass adoption.
I don't get it. I thought this used to be common knowledge. I mean it's basically a TV trope, so why and how do industries "forget" this?
Being easy to use is not usually thought of as a feature. Just look at the reaction of telephone hardware vendors to the original iPhone: There's nothing new about this, there have been tons of devices with touchscreens, we already know the customer does not want those, yada yada. They did not even consider the possibility that the selling point was not a list item on the spec sheet, but the user experience.
You're correct about ease-of-use being key, I think.
It was easier for "us" (the industry) to build hosted web servers, and so that's the paradigm that has won out. It's a direct evolution from client-server computing in the mainframe-and-terminal era.
But the user doesn't need to care what a server is, or what running one involves; it's a bit of a red herring.
A winning platform wouldn't communicate to people that they're running a server at all; they'd upload their messages/profile/etc., and the user experience would be akin to that of any other application, with the difference that, at an implementation level, their data would be encrypted, replicated, and hosted across multiple devices. The platform provider then wins in competition because their hosting and bandwidth costs drop to near zero.
That of course conflicts with the second point: evolving the protocols for that is hard. I'd wager that a winning platform will get 98%+ of the protocol design and implementation correct up-front, because it would have to be based on simple, iterable, secure and near-correct fundamentals that stand the test of time.
> People who write these articles need to be thinking about the middle aged woman who still calls every video game system "a Nintendo".
In a world where the pool of capital allocated into crypto is hyperconcentrated in the hands of a tiny number of elite investors who employ teams of analysts to scour the web for opportunities to rapidly take advantage of, those people don’t matter.
This is also why no modern cryptocurrency investor can realistically be considered “early”, anymore. The only thing early about crypto is the general maturity levels of its technology, which arguably doesn’t matter to valuation based on the reality we see play out in the crypto markets on a daily basis.
> There will always be some users for technologies like web3, but until you can clearly demonstrate to that woman that this new technology has value and is easier to use than the status quo, you're never going to get mass adoption.
I think this isn't true. A large part of getting people to use something is often not ease of use, but momentum and popularity. Ease of use plays a large role but by itself, it doesn't explain the entire variance of why some technology reaches mass adoption or becomes the most popular.