The author is correct in that media DRM is tied to GPU vendors in the field right now.
But hardware-backed DRM can be so much more invasive beyond that. I have no doubt the long-term goal of MS is to have a Windows version of Play Integrity[0]: total control over everything that happens on your device. Just to give an example of what could happen if this becomes reality: https://en.m.wikipedia.org/wiki/Web_Environment_Integrity
This tech, extended to browsers, could easily mean that sites could refuse to serve you if your machine is running any bigcorp-unapproved software. An easy example of that would be adblockers.
Unless we get lucky with secure-world compromises like the Tegra X1 bootrom exploit[1], or get real good at passing legislation that forces companies to give you all the private keys to your own machine, the future of personal computing is looking grim.
> The author is correct in that media DRM is tied to GPU vendors on the field right now ... hardware backed DRM can be so much more invasive
I expect mjg59 to know what they're talking about but like you say, I wonder the same thing about the strength of (what you call) Media DRM v Hardware-backed DRM.
GPU vendors have quietly deployed [hardware-based DRM] ... [which] works just fine on [boards] that [don't] have a TPM and will continue to do so.
Work fine? Even if a section of GPU's vRAM is out of the reach of the OS (here, to implement DRM), wouldn't TPM / DICE be needed to establish trust / measure GPU's firmware?
No, the GPUs have their own hardware RoT that measures the firmware. Modern GPUs are basically parallel computers with their own RAM, bootup sequence, BIOS, operating systems (drivers and firmware together are basically an OS), compiler toolchains, debuggers, sub-drivers and so on.
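The "hardware RoT measures the firmware" step can be sketched as a toy verify-before-execute chain. This is an illustrative model only (real GPUs verify vendor signatures against keys fused into the silicon; the stage names and image bytes here are made up):

```python
import hashlib
import hmac

# Hypothetical boot stages of a GPU-like device. In real hardware the
# "trusted" values are a public key or hash fused into immutable ROM;
# here we simply pin SHA-256 digests of each stage.
boot_stage_images = {
    "bootloader": b"bootloader-image-v1",
    "firmware":   b"firmware-image-v1",
    "driver_os":  b"driver-os-image-v1",
}
trusted_digests = {
    name: hashlib.sha256(image).hexdigest()
    for name, image in boot_stage_images.items()
}

def measured_boot(images: dict[str, bytes]) -> bool:
    """Measure each stage and verify it before 'executing' it.

    Returns True only if every stage matches its pinned digest,
    mimicking a root of trust that refuses to run unapproved code."""
    for name, image in images.items():
        digest = hashlib.sha256(image).hexdigest()
        if not hmac.compare_digest(digest, trusted_digests[name]):
            return False  # halt boot: this stage was tampered with
    return True

assert measured_boot(boot_stage_images) is True

# Swap in a modified firmware image: the chain refuses to boot.
tampered = dict(boot_stage_images, firmware=b"firmware-image-v2")
assert measured_boot(tampered) is False
```

The same structure is why "just reflash the firmware" doesn't work against a hardware RoT: the check runs before any replaceable code does.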
One which needs to be opened to users/owners instead of locked away. A price-doubling 100% sales tax on Universal Machines which lock owners out, like video cards (and their firmware) do, should make products which are not fundamentally GNU-ideals friendly unaffordable to the average consumer (and therefore not economically viable anymore). Siemens can still sell their $5MM machine for $10MM to BASF or whatever, because BASF can afford to borrow double to pay the tax, but Cletus and Dorothy will not be buying Sony PlayStations and Apple iPhones because $2,000+ isn't worth it.
The GPU is a completely separate computer running proprietary software. "Operating systems" do not operate anything anymore. They are just some user app, to be sandboxed very far away from the real action.
> "Operating systems" do not operate anything anymore.
Not entirely fair. There is still a kernel and a privileged userspace layer. That hasn't changed. The OS implements a common API that abstracts over ISAs and other finicky hardware details that are under constant short-term churn.
It's just that peripherals themselves have become so incredibly complex that many of them now require their own embedded systems in order to operate. The hardware was always a black box; it's just that now it contains an entire embedded OS.
BS. Either we're privileged and can copy their precious content, or they're privileged and we cannot.
The current status quo is they sit above us in the truly privileged hardware modes while we are isolated, virtualized and sandboxed for their safety. It's not our computers anymore, they're just allowing us to use them.
Not what I meant. For example, on most mainstream linux distributions systemd fulfills the role of privileged userspace layer that I was referring to there.
> truly privileged hardware modes
The presence of a hypervisor doesn't imply paravirtualized hardware. Neither does the presence of an entire OS on modern GPUs imply a reduction in kernel responsibilities. Ring 0 is still ring 0. The OS is still managing and abstracting hardware in the same way that it always was.
That doesn't mean that these other things aren't concerning developments. Particularly having an entire unauditable shadow OS running on the CPU is an incredibly dystopian scenario that almost seems unbelievable. But technical accuracy is important when discussing these things.
> The OS is still managing and abstracting hardware in the same way that it always was.
Not at all. The OS is not "managing" anything. It has no direct access to the real hardware. Only the firmware does. The OS is just talking to the API the firmware presents.
They're not our devices anymore. They're intel's, nvidia's. They dictate how we use them. The hardware's just sitting there, waiting for the right electrical signals to come in. But the OS is not the one sending those signals. Their firmware's in charge of that. It's the middle man between the OS and the device we paid money for. If the firmware doesn't like the tune we're singing, it shuts us down.
There are completely separate computers inside these things. They don't run our code, they only run signed code. Whoever has the keys to the machine's code owns the machine itself. And it sure as hell ain't us.
"Managing" and "talking to an API" are not mutually exclusive though.
Yes, firmware has continuously become more complex. Yes, if you go back far enough (quite a long ways) there wasn't any.
Peripherals have always been a black box that increased in complexity over time. That increase in complexity does not imply a decrease in management complexity on the part of the kernel. Far from it! Modern device drivers are far from simple.
> They're not our devices anymore. They're intel's, nvidia's.
This is arguably true, but it is also a rather separate topic of discussion.
> They dictate how we use them.
That's largely only in theory. Now if you had said that Apple or Samsung were dictating how we use our phones I would have been inclined to agree. But I don't think gating certain features in the CPU or GPU for the purpose of market segmentation qualifies as dictating how I use my device. I don't like the practice, but I can't deny that I am able to use the APIs provided by the device in an arbitrary manner without it phoning home to the manufacturer or otherwise authorizing the specifics of their use.
> But the OS is not the one sending those signals.
Depending on how you define "sending those signals" and where you consider the boundary between sender and receiver to be you could reasonably argue that the OS never did that to begin with, or alternatively that it has always done so and still does. It's really quite arbitrary and depends entirely on where you consider the boundary of the device to lie.
I purchase a peripheral. It is a black box that implements some device or manufacturer specific API. The kernel has a device driver that abstracts over this and provides a generic userspace API that will (hopefully) remain relatively stable for multiple decades. That's the extent of the contract and that hasn't changed at all.
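That contract can be sketched as a stable generic interface over vendor-specific black boxes. A toy model; both "vendors" and their wire formats are invented for illustration:

```python
from abc import ABC, abstractmethod

class MouseDriver(ABC):
    """The stable, generic API the 'kernel' exposes to userspace."""
    @abstractmethod
    def read_position(self) -> tuple[int, int]: ...

class VendorAMouse(MouseDriver):
    """Speaks vendor A's (made-up) protocol: one byte per axis."""
    def read_position(self) -> tuple[int, int]:
        raw = b"\x03\x07"  # pretend this arrived off the wire
        return raw[0], raw[1]

class VendorBMouse(MouseDriver):
    """Vendor B (also made up) packs both axes into a single byte."""
    def read_position(self) -> tuple[int, int]:
        raw = 0x37
        return raw >> 4, raw & 0x0F

def userspace_app(mouse: MouseDriver) -> tuple[int, int]:
    # Userspace only ever sees the generic interface; the vendor
    # quirks are absorbed by the driver layer, as described above.
    return mouse.read_position()

assert userspace_app(VendorAMouse()) == (3, 7)
assert userspace_app(VendorBMouse()) == (3, 7)
```

The point of the sketch: applications written against the generic API keep working across decades of incompatible hardware, which is exactly the contract that hasn't changed.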
The device driver situation is already nearly unmanageable. Imagine how much worse it would be if the kernel needed to manage every minute hardware detail down to the model and even sub-model variants. For example, for every USB mouse and keyboard, past and present. And that's before we even consider things like the firmware for the USB controller on the mouse, which in all likelihood is its own modularized unit from an entirely different manufacturer. But we're going to need to account for every last detail of that ourselves if we fully commit to the "all opaque firmware bad" route. After all, for the kernel to "truly" be in control of the hardware I suppose it will need to manually manage every last pin that falls under software control.
Technical accuracy and nuance is really quite important here. There are many different nefarious things happening at once. Conflating them only serves to confuse the discussion and leads people to (wrongly) believe that there's no need to worry about those weirdos ranting and raving in the corner.
> That increase in complexity does not imply a decrease in management complexity on the part of the kernel.
Complexity is not the point. Control is. The operating system should be in complete control of the system, and it isn't.
Complexity is part of the reason for that. The actual hardware is exceedingly complex, so manufacturers simplify it with firmware that presents a more convenient API.
That's convenient but it means we are no longer in control of the hardware. We merely interface with the convenient abstraction presented to us. It's that abstraction which actually drives the hardware, not our "drivers".
And that obviously becomes a mechanism by which to control us. Access to perfectly good hardware could be denied by the firmware for unacceptable reasons such as market segmentation or copyright enforcement.
> But I don't think gating certain features in the CPU or GPU for the purpose of market segmentation qualifies as dictating how I use my device.
BS. I want to copy stuff. It's not letting me. It's that simple. Some nonsense about "protected video paths".
The hardware is working and able but a fundamental computer operation cannot be performed because the firmware doesn't want to. Computer says no.
> The device driver situation is already nearly unmanageable. Imagine how much worse it would be if the kernel needed to manage every minute hardware detail down to the model and even sub-model variants.
If that's the cost of maintaining control, we should pay it gladly. Better than growing comfortable with the manufacturer's convenient abstraction which also conveniently allows them to control what we do with "our" machines.
> There are many different nefarious things happening at once.
There is exactly one thing happening here: corporations usurping control of our devices to protect their interests and profits. The means by which they do so are far less important, they are merely details.
These details are irrelevant in the grand scheme of things. It's all about control, about giving you less of it, the minimum amount of it. The exact mechanism by which they do it is irrelevant.
It's always some abstraction, some indirection, a little bit of clever cryptography. Maybe there's an even more privileged hidden OS running on the CPU which can access everything while we can't. Maybe there's some signed firmware running in a completely separate computer in the hardware and that computer acts as a middleman and gatekeeper. It doesn't matter. Our goal should be to take over the functions those components are doing, whatever it is that they do. They should be running our code, doing our bidding.
> Conflating them only serves to
confuse the discussion and leads people to (wrongly) believe that there's no need to worry about those weirdos ranting and raving in the corner.
What else is new? Stallman has been warning everyone about exactly this for nearly half a century already and people still treat him like some lunatic religious zealot despite the cyberpunk reality we live in today. Even I made that mistake at some point in my life.
If they won't listen, they'll suffer the consequences. They'll end up living under the control of corporations. Might as well remove the word "hacker" from this website's name because everything it ever stood for is over.
In my opinion, Stallman's mistake is he's way too nice about it. Always speaking softly and being reasonable about everything. Always getting bogged down over precise wording and irrelevant details. GNU has an entire glossary page dedicated to precise wording.
Meanwhile, the entire industry has worked around his ideas by isolating his free software and maintaining control with firmware. To have a truly "freedom respecting" computer with no firmware blobs, you gotta get one from literally decades ago. Because these days everything has firmware which you do not control. If you're lucky. If you aren't, you get something that's literally locked down to the point you have no choice whatsoever. What good is free software if you can't run it? It's worthless. It's worse than worthless: one day you wake up and you realize you were working for free for the corporations who are now profiting off of you while denying you the control you wanted.
It's all very simple. Free computers are subversive weapons. They have the power to literally wipe out entire sections of the economy. They have the power to defeat judges, armies, nations. They are quite literally the most important invention of mankind.
Naturally, corporations and governments will do everything in their power to control what you can do with a computer. First, they reduced computers to toys which could run all programs, except the ones they didn't like. This sort of "computer" is what we are discussing right now. Computers where you can do everything except copy their precious content. They are currently in the process of reducing computers to toys which refuse to run all programs, except the ones they like. That's the mobile landscape. Does it matter that hardware remote attestation is the mechanism by which they're doing it? Not much.
I can barely find the words to describe how disgusted this status quo makes me feel. I know what they're doing and I know they're succeeding. It makes me sick. Like I'm witnessing something great be destroyed due to greed and fear. I feel sick.
If that makes me the weird fellow raving in the corner, so be it. I'll keep raving in every thread about the subject until the day I get banned by dang. There's no point to this site if they win anyway. What good is Hacker News if you can't hack?
I've always said there should be a hefty sales tax (50%? 100%? 200%?) on the final sale of any product containing even a single Universal Machine which has artificial designs/locks that prevent the owner from replacing any and all firmware/software with versions he has authored, and/or which lacks documentation of its design and interfaces complete enough to enable a knowledgeable and capable owner to author his own software/firmware. This should apply to PCs, phones, watches, microwaves, televisions, CPAP machines, automobiles, toasters... everything which contains a Universal Machine. Universal Machines uncontrolled by their owners are a national security concern which has the potential to turn grave at any moment.
A "tax" like this is essentially equivalent to a fine, and a fine is a price
Also, companies can just price the additional cost in, blame the government for the price increase, and mislead consumers about the tradeoff being made. A ban is harder to spin that way
Yep, and you can come in and make a fully open and compliant competitor product, because your closed and noncompliant incumbent is forced to charge a price which should give you enough margin to succeed.
I am admitting that yes closed beats open at money extraction/harvesting from customers, which is why you only ever see closed hardware. The whole idea is to kneecap business models which depend on handcuffing owners with digital locks. This is economic lawfare, I am not hiding that. We The People are not animals on a farm to harvest dollars from occasionally, as if they were milk and methane.
Yea but you know what's even better than a tax? Not being able to release those products onto the market. I really doubt products whose "value-add" is lock-in would be able to survive black-market dynamics. But Apple, Google, Microsoft, John Deere, what have you, can totally afford extra taxes, and they probably even have the market power to just foist those increases on their customer base without losing them, because they're effectively a captive market in the current regulatory regime. That's also the kind of market power that means you don't have to compete on price. I can consistently get cheaper laptops by seeking out ones that don't include Windows licenses, but this clearly hasn't affected the laptop market much
Also, let's say I'm going to undercut someone on price for electronic devices. Unless I'm starting from a place of great personal wealth and don't take any capital at all, this needs investment, which means that an obvious solution to any scenario in which I'm meaningfully harming the bottom line of one of these incumbents is to just buy me out, which is indeed how this generally plays out in the real world
If we are serious about regulating monopolies, we have to understand that remedies that rely on raising operating costs are simply always going to be ineffective
So the idea is to ban the practice for smaller players without the scale to eat the costs?
No thanks, an outright ban is necessary. This will not prevent manufacturers from doing business no matter how they may whine about it, and frankly if this does somehow kill their business it should
The idea is to make it almost completely commercially unviable to sell locked down DRM hardware to small players, and only somewhat harder for XYZ Healthcare to buy the multimillion dollar GE MRI machine unless the MRI is fully open and compliant (XYZ can borrow and amortize, but Joe can't do that to buy playstations and cars).
Wait so you're saying that it's important to allow predatory business models to continue in the industries that do the most harm through constant consolidation to support ballooning costs?
Like the healthcare system's consolidation and scale that allows it to deal with massive extra costs and the degree to which that system is beholden to predatory technological models is if anything a great motivating example for the potential benefits of a ban
I have trouble understanding your use of the term DRM. Media DRM makes sense: the copyright holders want to "manage" their rights digitally. How is that relevant to Play Integrity or WEI? Whose right is being protected or managed? If I have an Android without Play Integrity there are certain apps that will not run, but I don't see any rights being managed here: an app developer has the right to refuse service just like I have the right to refuse running an app.
In fact I see no relationship between DRM and Play Integrity other than a tenuous connection that both are about controlling what a user cannot do on their device. If this is what you mean, then you have made the same mistake as FSF by conflating unrelated technologies.
Ultimately, DRM is untenable without users also being locked out of their own devices.
Consequently pressure to support more effective DRM will always translate into pressure to restrict what users can do with their devices.
Furthermore, the only defense against this is large open device market share: once closed devices comprise most of the market, DRM proponents can announce they'll stop supporting open devices, creating a downward spiral that further decreases the availability of open devices.
This is an FSF level understanding. Android devices are fully open and you can reflash them to whatever OS you want. Some remote servers won't give you service if you do that, but nothing is locking you out of your device. As Android dominates the global market, you already live in that world where most devices are open.
>Some remote servers won't give you service if you do that
This is exactly my problem. Before ideas like this surfaced, the demarcation line between who controls what was purely based on ownership. The machine that I own acts only on my behalf and in my best interests, and the server that you own does so for you (or at least for PCs this has always been the case)
TPMs, attested bootchains and whatnot trample on this whole concept. It's like your very own hardware now comes with a built in Stasi agent that reports on your conduct whether you like it or not. It bothers me on a visceral level and I'm constantly wondering if it's just me.
It's not just you but what people who hate remote attestation tend to forget is that it's a sword that cuts in both directions. Servers can remotely attest to you, not just the other way around. Signal is an example of an app that demands a remote attestation from the server before uploading your sensitive data.
Attestation is just a tool. It can be used for all kinds of things and doesn't privilege one side or another. The average app developer doesn't truly care what device you use, they just want to cut out abuse and fraud, which are real problems that do require effective solutions.
Ultimately, trade requires some certainty that both sides will act as they promise to act. Attestation is more important for individuals attesting to companies because individuals have so many more ways to hold companies to account if they break their agreements than technology, like the legal system, which is largely ineffective at enforcing rules against individuals due to cost.
> Attestation is just a tool. It can be used for all kinds of things and doesn't privilege one side or another.
It privileges the side that designs and uses it. By and large that's going to be the corporations, not individuals or those acting to maximize their interests.
> The average app developer doesn't truly care what device you use, they just want to cut out abuse and fraud, which are real problems that do require effective solutions.
I don't doubt that. But the price of attestation, if it's not properly isolated from the hosting OS (like Microsoft's completely unrealistic attempts at bringing the whole OS into the trusted computing base, kernel and applications and all), would be a homogeneity of computing I don't think is necessarily worth the benefits.
The good news is that such proper isolation is not only possible but even desirable (it keeps the trusted computing base small), and if done well could actually replace annoying half-measures such as "root detection": Who cares if my phone is rooted, as long as my bank's secure transaction confirmation application is running in a trusted, isolated enclave, for example?
Fair points. I was aware of this anti fraud angle of WEI/attestations before.
From this point on this is more of an emotional argument than a technical one, but I feel like the negative effects far outweigh the positive ones. Giving MORE power (be it technical or political) to big tech companies just tips the scales in their favor so much that we will be even worse off than we already are.
But if you work in anti-fraud and are fixated on solving this problem as effectively as possible, I can imagine not caring about this too if I were you...
Fully agreed on attested bootchains. General-purpose level OS-wide attestation is indeed a blight on open computing: It's ineffective because it implies a gigantic trusted code base (what are the odds that the entire Windows kernel is completely free of vulnerabilities?), and conversely it does tie you to somebody else's more or less arbitrary kernel build.
Almost complete disagree on TPMs. A better comparison than a spy would probably be a consulate (ok, maybe an idealized one, located underground in a Faraday cage): Their staff doesn't get to spy on you, but if you ever do want to do business with companies in that country and need some letters notarized/certified, walking into their consulate in your capital sure beats sending trustworthy couriers around the world every single time.
To torture that analogy some more: Sure, the guest country could try to extend the consulate into a spy base if you're not careful, and some suspicion is very well warranted, but that possibility is not intrinsic to its function, only to its implementation.
By that same logic evil is not inherent to attested bootchains either. When used to verify that the computer loaded the OS that the end user expected it is a very powerful security tool. It is only bad when the keys aren't under the control of the device owner.
You're mixing up the authentication and attestation parts of secure boot here.
You can absolutely install Linux, run secure boot (e.g. to protect you against "evil maid attack"), use your TPM to store your SSH keys, and live a happy and attestation-free life.
You can also do other things, but if you don't want to, why would you?
Attested boot chains aren't normally being used to attest a whole general purpose OS. They attest up to a small hypervisor that allows partitioned worlds to be created and chain attested, and then sensitive computations are done inside that.
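The chained attestation described here can be sketched with PCR-style extend semantics, where each stage folds the next stage's measurement into an append-only register. A simplified model (real systems sign the final register value with a hardware-held key before sending it to a verifier; the stage names are made up):

```python
import hashlib

def extend(register: bytes, measurement: bytes) -> bytes:
    """PCR-style extend: new value = H(old value || H(measurement)).

    The register can only be folded forward, never set directly, so
    its final value commits to the entire ordered chain of measurements."""
    return hashlib.sha256(
        register + hashlib.sha256(measurement).digest()
    ).digest()

def boot_chain(stages: list[bytes]) -> bytes:
    register = b"\x00" * 32  # registers start zeroed at reset
    for stage in stages:
        register = extend(register, stage)
    return register

good_chain = [b"hypervisor-v1", b"partition-loader-v1", b"enclave-app-v1"]
expected = boot_chain(good_chain)  # the verifier's known-good value

# Comparing the final register value detects any change or
# reordering anywhere in the chain:
assert boot_chain(good_chain) == expected
assert boot_chain([b"hypervisor-v2"] + good_chain[1:]) != expected
assert boot_chain(list(reversed(good_chain))) != expected
```

Because the register is one-way, a later stage cannot retroactively erase the measurement of an earlier one, which is what makes attesting "up to a small hypervisor" and then chaining further attestations workable.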
> It bothers me on a visceral level and I'm constantly wondering if it's just me.
It's not just you.
It disgusts me so deeply I wish computers had never been invented. A wonderful technology with infinite potential, capable of reshaping the world. Reduced to this sorry state just to protect vested interests. They used to empower us. Now they are the tools of our oppression.
While I don't agree with the FSF on even close to everything regarding trusted computing, I think for a fair discussion you'd have to at least steelman their arguments here:
I think it's fair to assume that in a world in which almost every device supports attestation and makes it available to any service provider by default, without giving users an informed choice to say no or even informing them at all, service providers are much more likely to provide access exclusively to attestation-capable clients.
That, in turn, has obvious negative consequences for users with devices not supporting attestation (whether out of ideological choice, because it's a low cost device and the manufacturer can't afford the required audits and security guarantees etc.): Sure, these users will always be able to just refuse to transact with any service provider requiring attestation.
But think that through: We're not only talking about Netflix here. At what availability rates of attestation will decision makers at financial institutions decide that x% is good enough and exclude everybody else from online banking? What about e-signing contracts for doing business online? What about e-government services?
I am at the same time excited about the new possibilities attestation offers to users (in that they will be able to do things digitally that just weren't economically feasible for service providers before, since providers often have to cover the risks of doing so) and very wary of the negative externalities of a world in which attestation is just a bit too easy and ubiquitous.
In other words, the ideal amount of general purpose attestation availability is probably high, but significantly below 100% (or, put differently, the ideal amount of friction is non-zero). Heterogeneity of attestation providers can probably help a bit, but I'm wary of the inherent centralizing forces due to the technical and economical pragmatics of trusted computing.
The ideal amount of attestation on a general purpose computer which is owned by me is zero. Any nonzero amount implies that control of the device has not actually been turned over to me. It implies not only the slippery slope to which you refer but also things about back doors and opportunity for dystopian political regimes and much more.
When it comes to financial or legal matters (and this includes online banking) a small dedicated hardware element for signing fingerprints is all that's ever been required. Anything more is an overreach.
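That "small dedicated hardware element" pattern can be sketched as a signing oracle: the key is generated inside the element and never crosses its API boundary; the host only ever submits digests. A toy model using HMAC as a stand-in for the asymmetric signatures a real secure element would use (all names here are illustrative):

```python
import hashlib
import hmac
import secrets

class SecureElement:
    """Toy signing element: the key is created inside and never exported.

    A real element (smart card chip, bank dongle, TPM-resident key)
    would hold an asymmetric private key and publish only the public
    half; HMAC is used here purely to keep the sketch dependency-free."""

    def __init__(self) -> None:
        self.__key = secrets.token_bytes(32)  # no getter; never leaves

    def sign(self, digest: bytes) -> bytes:
        return hmac.new(self.__key, digest, hashlib.sha256).digest()

    def verify(self, digest: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(digest), signature)

element = SecureElement()

# Host software hashes the transaction and asks the element to sign.
txn = b"pay 100 EUR to account 1234"
digest = hashlib.sha256(txn).digest()
sig = element.sign(digest)

assert element.verify(digest, sig)
# Any tampering with the transaction invalidates the signature.
tampered = hashlib.sha256(b"pay 9999 EUR to account 5678").digest()
assert not element.verify(tampered, sig)
```

The point of the design: compromising the host gets an attacker signatures on what the element was asked to sign, but never the key itself, and the element needs no view into (or veto over) the rest of the system.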
> back doors and opportunity for dystopian political regimes
No, this is a misunderstanding of what a TPM is.
A TPM is a secure element inside your computer, similar to the chip running your credit and debit card. That's it. Without you using it (i.e. your OS or an application you installed asking it to do something), it's exactly as dangerous as a blank chip card in your house that you don't use and didn't open any account for.
If you don't want anybody to talk to it, don't install applications or OSes on your computer that do things you don't want. You have full control over that! Not running software that's not acting in your own best interests is generally good practice anyway, TPM or no TPM.
> [...] a small dedicated hardware element for signing fingerprints is all that's ever been required [...]
You might be happy to hear that that's exactly what a TPM is, then!
I am fully aware of what a TPM is. I was speaking about trusted computing - ie the "general purpose attestation capability" that you referred to above.
As you say, a TPM alone can't do much of anything and doesn't pose much of a threat. Of course expanding the acronym - Trusted Platform Module - is a bit of a giveaway. They were always fully intended to serve as the root of trust for much more nefarious things.
> the only thing I’ve ever seen TPMs used for is full disk encryption and user authentication.
Aren't all device attestation schemes underpinned by authenticated boot which itself is underpinned by a TPM? This is certainly the case for Android - AVB is implemented on top of secure boot on all the devices I've ever owned (and Play Integrity, if I had ever permitted it to run, on top of that). Do I have some misunderstanding about the stack?
> Conversely, DRM is alive and well on almost universally TPM-less devices.
You mean software DRM I assume? Because the only TPM free hardware backed DRM that comes to mind is GPU based encrypted streams where the GPU does the decoding and final compositing locally. And even then the TPM-equivalent exists, it just isn't accessible to the end user.
SGX can be used to do various interesting things without attesting the state of the broader system, but none of the examples that immediately come to mind feel much like DRM to me.
> comments in this thread end up dead
Thanks for letting me know. I guess I should email them?
You think there's no value in your laptop being able to attest its state to your phone in order to give you confidence it hasn't been tampered with? That's something that would be entirely under your control.
So don't normalise manufacturer locking. We're not going to prevent the bad thing from happening by arguing against the hardware that enables the bad thing - we're going to need to argue against the bad thing.
When this remote attestation business started, people tried to minimize its impact by saying only apps that really needed it would use it. Such an absurd argument. Everyone is going to use this technology. It will literally become the default.
Everyone loves cryptography and wants it working in their favor. Everyone. It's great for us when it protects our messages and browsing from surveillance capitalism and warrantless government espionage. It's extremely bad for us when it becomes the policy enforcement tool of corporations and governments.
Remote attestation means either we run the software which does their bidding and protects their interests and bottom line, or we don't participate in society or the economy. The only way it could get worse is if the government starts signing software as well. One day even the goddamn ISPs will refuse to link to our hardware if it fails attestation.
It's literally the end of free computing as we know it. Everything the word "hacker" ever stood for, it's over.
Meh. I didn't reflash my phone. I didn't root it. I didn't do anything to modify its system files whatsoever.
I just installed KDE Connect, and an open source keyboard. Banking apps refuse to run because of those (because my keyboard might see my keystrokes!!!). They don't even need a failed hardware attestation to refuse you service.
So even if you don't try to modify your device, your device might still end up like half a paperweight. I either can't do banking, or I can't use the functionality I want.
The ability for someone with a news article or a game to have you experience it only if you pay their fee or watch their ads -- preventing you from copying the content off your device or modifying it in some unauthorized way (removing ads or otherwise circumventing protection mechanisms) -- is pretty obviously the exact same idea, not some mere metaphor. It is a protection of the exact same "right", conferred by the exact same laws, as allowing someone with a movie to have you see it only if you pay their fee or watch their ads... I am honestly having a difficult time understanding your confusion here :/.
You are still talking about DRM in the context of copyright. If someone has a news article or a game, they have copyright on that article or game and they use DRM to protect their copyright. All these are applications of DRM.
Applications like Play Integrity could be quite different: say a bank can refuse to move money if your instructions to move money come from a device deemed untrustworthy by Play Integrity. That's like a bank refusing to let you into their branch if you are dressed in swimwear. A game can also deploy this tech for anti-cheating purposes; really no different from a real-world casino refusing a customer who is known to be good at card counting.
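To make the bank scenario concrete, here is a minimal sketch of how a backend might gate a transfer on an integrity verdict. The `deviceIntegrity.deviceRecognitionVerdict` field names follow Google's documented verdict payload, but the token decoding step (done via Google's servers) is elided and the handler logic is invented for illustration.

```python
# Hedged sketch: a bank backend gating a money transfer on a
# Play Integrity verdict that has already been decoded to a dict.
# Field names follow the documented verdict payload; everything
# else here is illustrative.

def device_is_trusted(verdict: dict) -> bool:
    labels = (verdict.get("deviceIntegrity", {})
                     .get("deviceRecognitionVerdict", []))
    # Require a hardware-attested, certified device; a rooted phone
    # or custom ROM typically fails MEETS_STRONG_INTEGRITY.
    return "MEETS_STRONG_INTEGRITY" in labels

def handle_transfer(verdict: dict, amount: int) -> str:
    if not device_is_trusted(verdict):
        return "REJECTED: untrusted device"
    return f"OK: moved {amount}"
```

The point is that the policy check is a one-liner on the server side: once the attestation plumbing exists, refusing service to unapproved devices is trivial to deploy.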
And this is the root issue you fail to understand: the idea of copyright contradicts the idea of information freedom. You should be able to make a copy for your own purposes, so that when you go back, the information is still the same and not manipulated, and you should be able to actually share this information when it's important. For example, a news story about corruption that has been taken down.
Also, why the hell do you believe that the same copyright rules that apply to a movie, which can take millions to make and stays relevant for years, should apply to a news article? It's madness.
Information freedom is merely an ideal, not a right. It is an ideal held by techno-optimists. But there is no legal basis for information to be free. Indeed, I agree with you that the idea of copyright contradicts the idea of information freedom. And guess what: copyright is in our constitution, and information freedom is not.
Furthermore, there is also no legal basis in differentiating copyright by the budget involved to produce the work.
> an app developer has the right to refuse service just like I have the right to refuse running an app.
In this case it feels like an app developer having the right to punch[0] you in the face just like you have the right to refuse being punched in the face :-P.
Not GP, and don’t have their patience anyway. But while I see them as real computers, they aren’t any that I enjoy using, so I care relatively little for them.
In most well designed systems the only keys that are useful are held in HSMs that won't export them to anyone, so you can't easily do that. You could at best sign a few things with the keys if you were able to compromise HSM credentials, but, once you were caught your access would be revoked along with anything you signed.
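The "revoke the signer and everything they signed" model can be sketched in a few lines. HMAC stands in here for the HSM's asymmetric signature, and the key IDs and revocation set are invented for illustration; the essential property is that revocation is checked at verify time, so a signature that was valid yesterday fails today.

```python
# Toy model of HSM-held keys plus revocation. HMAC is a stand-in
# for a real asymmetric signature; names are illustrative.
import hmac, hashlib

KEYS = {"hsm-key-1": b"secret-held-inside-hsm"}
REVOKED: set[str] = set()

def sign(key_id: str, blob: bytes) -> bytes:
    # In a real HSM the private key never leaves the device;
    # callers only ever get signatures back.
    return hmac.new(KEYS[key_id], blob, hashlib.sha256).digest()

def verify(key_id: str, blob: bytes, sig: bytes) -> bool:
    if key_id in REVOKED:  # revocation trumps a valid signature
        return False
    return hmac.compare_digest(sign(key_id, blob), sig)

sig = sign("hsm-key-1", b"firmware-v2")
ok_before = verify("hsm-key-1", b"firmware-v2", sig)   # valid today...
REVOKED.add("hsm-key-1")                               # ...abuse is caught
ok_after = verify("hsm-key-1", b"firmware-v2", sig)    # ...and it's dead
```

This is why compromising HSM credentials buys an attacker so little: the signatures are only as durable as the issuer's willingness to keep the key on the trusted list.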
Consider a benevolent cryptographer who is able to break modern asymmetric cryptography, but refuses to use it for petty personal gain, and who is fully aware of the dangers of publishing it (which is why this cryptographer put it in dead man's switches instead, with recipients randomized over nearly all power blocs, political groups, companies, ...)
The cryptographer never implemented it on daily compute devices.
Perhaps this cryptographer would be willing to risk a low communication round release of private keys corresponding to public keys in ROM or burnt in eFuses etc... but only if the public key dump is sufficiently large and encompassing.
From the perspective of the cryptographer we are all whining wankers, and we should just collect all the public keys as a wishlist.
The cryptographer cares naught about "liberating" hour-long advertisements for the militaries or intelligence agencies etc. The cryptographer does wish sovereign compute for fellow humans, a primordial prerequisite for effective democracy.
====
While I understand the average programmer would ascribe an incredibly low probability to the above, the absolute absence of such a comprehensive public key dump is not in proportion to the probability considered.
> the future for personal computing is looking grim
I don't know. They could lock up the hardware stack as much as they want, in the end it's pixels being pushed to arrays. It's extremely hard to prevent these pixels from being intercepted. You'll have pirate groups just going deep in the hardware (opening the monitors and soldering and hacking and whatnots) and eventually tap these.
As for personal usage: I've got hardware from the eighties still working fine.
Instead of:
movie2025-WEBRip1080p-x265.mp4
people shall download:
movie2025-WEBRip1080p-DRMfree-x265.mp4
And people shall just play that on their DRM-free hardware, either brand new or old.
For example, people can still buy brand new CRT (!) screens today. Not just CRT screens but also brand new CRT PCBs to drive either new or old CRTs. It's 2025 and you can still buy a brand new CRT. That's kinda rad.
And if worse comes to worst, if it's really impossible to "tap" into the pixels being sent to a DRMed monitor (which I don't buy for a second), there's still the analog hole. Pirates are just going to use old (non-DRMed) gear to rip DRMed content, analog style, and then process the result with some AI models to get it back to near perfection.
Heck, the day's probably not very far where I can use, say, two handcams from the 90s to film a movie at the movie theater and then use an AI model to give back a near pristine movie file (as in: one where it's impossible for the layman to discern from the original).
> This tech extended to browsers could easily mean that sites could refuse to serve you
That's already the case: some content is geo-blocked. People use a VPN or just fire up Frostwire or qbittorrent.
Even a Raspberry Pi 5 goes a long way: when are these going to play the DRM game and make the future look grim, instead of bright?
I don't doubt there are really deeply sick, evil people out there thinking about how they can ruin our collective future, but I also know that they'll encounter people who have systematically owned their sorry arses.
We're not concerned about DRM because it will (or won't) stop us from redistributing and playing content. The stated goal of DRM (blocking copyright infringement), and DRM's general failure to meet that goal, is the least interesting part of the story.
We're concerned about DRM because of what it does accomplish. DRM creates a vertically integrated market wherein every layer of the stack is authoritatively controlled by a colluding oligopoly of hardware+media corporations (Apple, Amazon, Facebook, Comcast, etc.)
The greatest problem with DRM is drivers. NVIDIA hardware only works well in Linux because it's important to NVIDIA's business. Even so, there are longstanding issues that would have been fixed decades ago if kernel devs were allowed to collaborate. Instead, DRM (and copyright in general) demands that the driver dev team be siloed away from the kernel devs. This way, NVIDIA can use the exclusivity of its CUDA implementation as an anticompetitive advantage in its hardware business.
Copyright is, fundamentally, a wall between would-be collaborators. DRM is an implementation of that wall, but instead of isolating people, it isolates software. The wall DRM provides is not used to monopolize the distribution of content: it is used to construct moats in our software ecosystem.
There's a reason I prefer the experience of torrenting a Netflix rip over streaming Netflix on my Roku: the entire hardware+software stack is superior. I can actually sort and navigate my library. I can decode&render with my faster GPU. I can adjust the audio delay. I can adjust subtitle placement & font. I can mix the audio so that dialogue is actually audible. I can do frame interpolation with SVP (again using a better GPU than whatever your "smart" TV has onboard). I can seek forward&backward quickly without changing bitrate. I can let the credits play without being interrupted by an ad. The list goes on...
I don't want a goddamn CRT. I want modern hardware. The more we let corporations abuse us with DRM, the less compatible that hardware will be with real software.
The issue isn't preventing piracy, it is defending GPU market segmentation. In the old days you could flash Quadro firmware to Geforce cards and unlock features or modify clocks. The common thread is artificial scarcity.
Yes, you can never "plug the analog hole" completely, but you can definitely lock stuff down to the point it's impractical for 95% of people.
For instance, imagine some sort of audio / video fingerprint system that resides in Intel and/or nVidia's GPU drivers. Content gets played through the on-GPU HEVC / h.264 decoders already. Doesn't seem like a huge stretch to add a fingerprint authentication system to that stage.
Have a list of content IDs that are protected, and require a valid license to play.
Yes, your source file is unprotected (video camera in front of monitor), but all of your devices are unable to play it.
Yes, your ancient, circa 2024 desktop PC will still play it, but your new 2030 model TV implements this fingerprint system as well so you can't just cast this file to your 100" display in your living room.
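The fingerprint-gating scheme sketched above is hypothetical, but its logic is simple enough to spell out. All names here are invented for illustration, and a plain hash stands in for a real perceptual fingerprint (which would have to survive re-encoding and the analog hole):

```python
# Hypothetical sketch of a driver-side content-ID check: fingerprint
# the decoded stream, look it up against protected titles, and demand
# a (content, device) license before rendering. Purely illustrative.
import hashlib

def fingerprint(frames: bytes) -> str:
    # Stand-in for a real perceptual hash robust to re-encoding.
    return hashlib.sha256(frames).hexdigest()[:8]

# Hypothetical databases of protected titles and granted licenses.
PROTECTED = {fingerprint(b"big-studio-movie")}
LICENSES = {(fingerprint(b"big-studio-movie"), "device-42")}

def may_render(frames: bytes, device_id: str) -> bool:
    fp = fingerprint(frames)
    if fp not in PROTECTED:
        return True                        # unrecognized content plays freely
    return (fp, device_id) in LICENSES     # protected content needs a license
```

Note what this buys the rights holder: it doesn't matter that your camcorder rip is "unprotected", because every compliant renderer recognizes the content itself and refuses to play it without a license.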
This is to say nothing of other forms of content (applications / games / web pages) that actually could require attestation / DRM HW / always-on internet to run.
I was thinking of someone hacking a capture device that sniffs the output matrix of a display in order to capture the video, with a line-in plugged into the drivers on the speaker. Way out of reach of most people, but only a very small number of people need to have the wherewithal to do it to keep the pirate scene going, especially if they live in countries that don't care about your DRM laws. The analog hole exists so long as people don't have DRM directly implanted into their eyeballs.
As I understand it, that's common now - cheap HDMI splitters do the HDCP negotiation on the first port, and then the unencrypted digital video and audio signals are cloned to both ports, ready to be captured.
Oh yeah, I had to buy one of these (I called it the HDCP defeater) because my receiver was otherwise unable to forward the negotiation between the Roku and the TV fast enough. I would turn on the TV and the screen would blink on and off for several minutes before the HDCP handshake managed to win the race. In theory those devices might be defeated with newer versions of the protocol, but the part that drives the matrix of pixels can never be encrypted, not until you have DRM built directly into your eyeballs.
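The splitter trick is easy to model: the device completes the handshake on its input side, obtains the session key, decrypts, and clones the cleartext to every output port. The sketch below uses XOR as a stand-in cipher; real HDCP is a far more involved key exchange, and all names here are invented.

```python
# Toy model of an HDCP-stripping splitter. XOR stands in for the
# real cipher; the actual handshake is a multi-step key exchange.

def xor_cipher(data: bytes, key: int) -> bytes:
    return bytes(b ^ key for b in data)

class Source:
    """An HDCP source: hands its session key to any 'valid' sink."""
    def __init__(self, frames: bytes):
        self.frames, self.key = frames, 0x5A

    def handshake(self) -> int:
        return self.key            # splitter presents itself as a valid sink

    def stream(self) -> bytes:
        return xor_cipher(self.frames, self.key)

def strip_and_split(source: Source, n_ports: int) -> list[bytes]:
    key = source.handshake()       # negotiate HDCP on the input side
    plaintext = xor_cipher(source.stream(), key)
    return [plaintext] * n_ports   # clone cleartext to every output
```

The weakness being exploited is structural, not cryptographic: once any link in the chain is trusted with the key, everything downstream of it is cleartext.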
But hardware backed DRM can be so much more invasive beyond that. I have no doubts the long term goal of MS is to have a Windows version of Play Integrity.[0] So total control over everything that happens on your device. Just to give an example of what could happen if this becomes reality: https://en.m.wikipedia.org/wiki/Web_Environment_Integrity
This tech extended to browsers could easily mean that sites could refuse to serve you if your machine is running any bigcorp unapproved software. An easy example of that would be adblockers.
Unless we get lucky with secure world compromises like the Tegra X1 bootrom exploit[1] or get real good at passing legislation that forces companies to give you all the private keys to your own machine, the future for personal computing is looking grim.
[0]: https://developer.android.com/google/play/integrity
[1]: https://github.com/fail0verflow/shofel2