RedComet's comments | Hacker News

If the owner of a device can't sign and install their own software, then your definition of PKI doesn't "work" at all.

The first party must be able to entirely decide who that "some third party" is for it to be anything more than an obfuscation of digital serfdom.


The difference between “PKI” and “just signing with a private key” is the trusted authority infrastructure. Without that you still get the benefit of signatures and some degree of verification: you can still validate what you install.
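To make that distinction concrete, here is a toy sketch of "just signing with a private key" with no authority involved: the verifier pins one expected key and checks the artifact against it. (A keyed hash stands in for a real asymmetric signature; all key and file names here are made up.)

```python
import hashlib

# Toy stand-in for a signature: a keyed hash. Real systems would use an
# asymmetric scheme (RSA, Ed25519, ...); the trust logic is the point here.
def sign(key: bytes, payload: bytes) -> bytes:
    return hashlib.sha256(key + payload).digest()

# Self-signing: the verifier pins a single key it already knows.
PINNED_KEY = b"my-release-key"       # hypothetical
artifact = b"firmware-v1.2.bin"      # hypothetical
signature = sign(PINNED_KEY, artifact)

# Verification needs no third party at all: just the pinned key.
verified = sign(PINNED_KEY, artifact) == signature
```

A PKI layers issuance and chain-of-trust machinery on top of this basic check; the check itself works fine without any authority.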

But in reality this trustworthiness check is handed over by the manufacturer to an infrastructure made up of these trusted parties in the owner’s name, and there’s nothing the owner can do about it. The owner may be able to validate software is signed with the expected key but still not be able to use it because the device wants PKI validation, not owner validation.

I’ve been self-signing stuff in my home and homelab for decades. Everything works just the same technically but step outside and my trustworthiness is 0 for everyone else who relies on PKI.


[flagged]


> My definition of PKI is the one we’re using for TLS, some random array of “trusted” third parties can issue keys

Maybe read the actual definition before assuming you're so much smarter than "HN". One doesn't need third parties to have PKI; it's a concept, you can roll your own.


“Read the actual definition”; stellar contribution there, mate. I checked, and sure enough it’s exactly in line with my comments.

I’ve been discussing the practical implementation of PKI as it exists in the real world, specifically in the context of bootloader verification and TLS certificate validation. You know, the actual systems people use every day.

But please, do enlighten me with whatever Wikipedia definition you’ve just skimmed that you think contradicts anything I’ve said. Because here’s the thing: whether you want to pedantically define PKI as “any infrastructure involving public keys” or specifically as “a hierarchical trust model with certificate authorities,” my point stands completely unchanged.

In the context that spawned this entire thread, LineageOS and bootloader signature verification, there is a chain of trust, there are designated trusted authorities, and signatures outside that chain are rejected. That’s PKI. That’s how it works. That’s what I described.

If your objection is that I should have been more precise about distinguishing between “Web PKI” and “PKI generally,” then congratulations on missing the forest for the trees whilst simultaneously contributing absolutely nothing of substance to the discussion.

But sure, I’m the one who needs to read definitions. Perhaps you’d care to actually articulate which part of my explanation was functionally incorrect for the use case being discussed, rather than posting a single snarky sentence that says precisely nothing?

EDIT: your edit is much more nuanced but still misses the point; https://imgur.com/a/n2VwltC


The snarky tone and sarcasm are not helping your case in this thread.

The tone matched the engagement I received. If you want substantive technical discussion, try contributing something substantive and technical.

I've explained the same point three different ways now. Not one person has actually demonstrated where the technical argument is wrong, just deflected to TOFU comparisons, philosophical ownership debates, and now tone policing.

If Aachen has an actual technical refutation, I'm all ears. But "read the definition" isn't one, and neither is complaining about snark whilst continuing to avoid the substance.


> I've explained the same point three different ways now.

But you're demonstrably wrong. The purpose of a PKI is to map keys to identities. There's no CA located across the network that gets queried by the Android boot process. Merely a local store of trusted signing keys. AVB has the same general shape as SecureBoot.

The point of secure boot isn't to involve a third party. It's to prevent tampering and possibly also hardware theft.

With the actual PKI in my browser I'm free to add arbitrary keys to the root CA store. With SecureBoot on my laptop I'm free to add arbitrary signing keys.

The issue has nothing to do with PKI or TOFU or whatever else. It's bootloaders that don't permit enrolling your own keys.


> The purpose of a PKI is to map keys to identities

No, the purpose is "can I trust this entity". The mapping is the mechanism, not the purpose.

> There's no CA located across the network that gets queried by the Android boot process

You think browser PKI queries CAs over the network? It doesn't. The certificate is validated against a local trust store, exactly as the bootloader does. If it's not signed by a trusted authority in that store, it's rejected. Same mechanism.
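A toy sketch of that shared mechanism (a keyed hash stands in for a real asymmetric signature; all key names are invented): the verifier accepts a signature only if its key is in the local trust store, so "signed by an untrusted key" and "unsigned" get rejected identically.

```python
import hashlib

# Toy signature: a keyed hash standing in for real asymmetric crypto.
def sign(key: bytes, payload: bytes) -> bytes:
    return hashlib.sha256(key + payload).digest()

def verify(trust_store: set, key: bytes, payload: bytes, sig: bytes) -> bool:
    # Both a browser and a locked bootloader make the same two checks:
    # 1. the signature is valid for the claimed key,
    # 2. the key is in a *local* trust store.
    return sign(key, payload) == sig and key in trust_store

trust_store = {b"vendor-key"}   # hypothetical preloaded trust anchor
image = b"boot.img"             # hypothetical payload

# Vendor-signed: accepted.
ok = verify(trust_store, b"vendor-key", image, sign(b"vendor-key", image))

# Self-signed with a key the device doesn't trust: rejected, which is
# indistinguishable (to the verifier) from being unsigned at all.
bad = verify(trust_store, b"my-own-key", image, sign(b"my-own-key", image))
```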

> The point of secure boot isn't to involve a third party

SecureBoot was designed by Microsoft, for Microsoft. That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

> The issue has nothing to do with PKI [...] It's bootloaders that don't permit enrolling your own keys

Right, so in the context of locked bootloaders (the actual discussion) "unsigned" and "signed by an untrusted key" produce identical results: rejection.

Where exactly am I "demonstrably wrong"?


Look I'm not even clear where you're trying to go with this. You honestly just come across as wanting to argue pointlessly.

You compared bootloader validation to TLS verification. The purpose of TLS CAs is to verify that the entity is who they claim to be. Nothing more, nothing less. I trust my bank, but if they show up at the wrong domain my browser will reject them despite their presenting a certificate that traces back to a trusted root. It isn't a matter of trust; it's a matter of identity.

Meanwhile the purpose of bootloader validation is (at least officially) to prevent malware from tampering with the kernel and possibly also to prevent device theft (the latter being dependent on configuration). Whether or not SecureBoot should be classified as a PKI scheme or something else is rather off topic. The underlying purpose is entirely different from that of TLS.

> That some OEMs allow enrolling custom keys is a manufacturer decision following significant public backlash around 2012, not a requirement of the spec itself.

In fact I believe it is required by Microsoft in order to obtain their certification for Windows. Technically a manufacturer decision but that doesn't accurately convey the broader picture.

Again, where are you going with this? It seems as though you're trying to score imaginary points.

> Where exactly am I "demonstrably wrong"?

You claimed that the point of SecureBoot is to involve a third party. It is not. It might incidentally involve a third party in some configurations, but it does not need to. The actual point of the thing is to prevent low-level malware.


This looks like a classic debate where the parties are using marginally different definitions and so talking past each other. You're obviously both right by certain definitions. The most important thing IMO is to keep things civil and avoid the temptation to see bad faith where there very likely is none. Keep this place special.

I said, from the point of view of the bootloader: signed with an untrusted certificate and unsigned are effectively the same thing.

Somehow this was controversial.


Good to know there are reply bots out there that copy content immediately. I rarely run into edit conflicts (where someone reads before I add in another thing) but it happens; maybe this is why. Sorry about that.

Besides the "what does pki mean" discussion, as for who "misses the point" here, consider that both sides in a discussion have a chance at having missed the original point of a reply (it's not always only about how the world is / what the signing keys are, but how the world should be / whose keys should control a device). But the previous post was already in such a tone that it really doesn't matter who's right, it's not a discussion worth having anymore


You misunderstood, it appears.

Or it’s collective ignorance; can’t be sure.

Public key infrastructure without CAs isn’t a thing as far as I can see. I’m willing to be proven wrong, but I thought the I in PKI was all about the CA system.

We have PGP, but that's not PKI; that's peer-based public-key cryptography.


A PKI is any scheme that involves third parties (i.e. infrastructure) to validate the mapping of key to identity. The US DoD runs a massive PKI. Web of trust (incl. PGP) is debatably a form of PKI. DID is a PKI specification. You can set up an internal PKI for use with ssh. The list goes on.
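A minimal sketch of "roll your own PKI" in that sense (keyed hashes stand in for real signatures; the CA and identity names are hypothetical): a CA signs an identity-to-key binding, and a verifier accepts any certificate issued under a CA key it has chosen to trust. Nothing here requires the web's root CA set.

```python
import hashlib
import json

# Toy signature: a keyed hash standing in for real asymmetric crypto.
def sign(signer_key: bytes, data: bytes) -> bytes:
    return hashlib.sha256(signer_key + data).digest()

def issue_cert(ca_key: bytes, identity: str, subject_key: bytes) -> dict:
    # The "certificate" is just a signed binding of identity to key.
    body = json.dumps({"id": identity, "key": subject_key.hex()}).encode()
    return {"body": body, "sig": sign(ca_key, body)}

def verify_cert(trusted_ca_key: bytes, cert: dict) -> bool:
    return sign(trusted_ca_key, cert["body"]) == cert["sig"]

# Anyone can run their own CA: the "infrastructure" is the issuance and
# validation machinery, not any particular set of root authorities.
my_ca = b"homelab-root"
cert = issue_cert(my_ca, "alice@lab", b"alice-key")

valid = verify_cert(my_ca, cert)            # issued by a CA I trust
forged = verify_cert(b"other-root", cert)   # not one of my trust anchors
```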

I don't know what's going on in this thread. Of course PKI needs some root of trust. That root HAS to be predefined. What do people think all the browsers are doing?

Lineage is signed, sure. It needs to be blessed with that root for it to work on that device.


They're assuming PKI is built on a fixed set of root CAs. That's not the case, as others have pointed out - only for major browsers. Subtle nuance, but their shitty, arrogant tone made me not want to elaborate.

"Subtle nuance" he says, after I've spent multiple comments explaining that bootloaders reject unsigned and untrusted-signed code identically, whilst he and others insist there's some meaningful technical distinction (which none of you have articulated).

Then you admit you actually understood this the entire time, but my tone put you off elaborating.

So you watched this thread pile on someone for being technically correct, said nothing of substance, and now reveal you knew they were right all along but simply chose not to contribute because you didn't like how they said it.

That's not you taking the high road, mate. That's you admitting you prioritised posturing over clarity, then got smug about it.

Brilliant contribution. Really moved the discourse forward there.


You seem angry. Perhaps some time away from the message boards would be beneficial.

Still not elaborating on that "subtle nuance," I see.

Strange claiming you "called it"; people were saying this far earlier than 8 days ago.


"once you set the watch up"

This does require an iPhone though, right?


Access to someone’s, once.

An achievable bar for people buying Dick Tracy computers.


If I remember right, to activate the security (Touch ID-like) feature you need the phone to be on your account. There are more restrictions than just pure activation.


Even for cellular models?

That seems…ripe for disruption.

Why is nobody selling a cellular standalone watch?

Not enough units to make manufacturing worth it?

Is this a chicken-and-egg problem?


They are signed, though. Just not by Google.


Signed with any non-authorised key is the same as unsigned from a security perspective.


“Running binaries signed either by yourself or by whoever wants to spy on you.”

That last part there is the problem.


Let's ignore all of the preinstalled programs, which are signed by Google and do a great deal of spying.

Do you think the 100 most popular F-Droid apps do more spying than the 100 most popular Play store apps?


No, that’s a straw man. The popular ones are not the concern.


A straw man in your favor, maybe. Shall we compare the 100 least popular of each store?

Those are more likely to be outright malware on Play.


The popularity in app stores has no bearing. Some problem apps can be on no store, just locally installed. This has been well covered in the past and you are playing catch up. It’s about abusive household members who spy on their grown children, siblings, roommates, girlfriends, parents, etc. with apps they install on their devices if given a route to do so.


The sideloading change doesn't protect against abusive household members, though. Simple lock screen hygiene and periodic reminders about invasive permissions (e.g. accessibility & location) would do more. And let us not even pretend that is the true motivation for the change. An incidental consequence that you find defensible is simply that.


It's an excuse. Give me the option to install the software I see fit. Period.


Is this not a meaningless differentiation if Google does not assume any responsibility for apps on the Play Store?


You cannot (generally) install and run apps that aren't (recently) notarized, though. They do owe the service inasmuch as they require it for installing and running apps.


Yeah, the OS preinstalled on the phone functions that way. But this is not in opposition to your ownership of the physical device. You can still do whatever you want with the phone. Grab a hot plate and pull off the NAND, chuck the whole thing in a blender, anything; knock yourself out.


By analogy, if food was sold with poison in it, "hey man, you bought it, just remove it if you don't like it. not a chemist? crack a book buddy". And now imagine you had no means of producing your own food and all food sold contained poison.

If unlocking an iPhone and running e.g. AOSP on it were feasible, people would be doing it. And you know that. Your argument is disingenuous.


Food with poison in it is both criminally and civilly illegal, and it puts people's lives in danger.

Equating something like this to closed source software is why some people don’t take FOSS seriously.

You might think I was being facetious, but I’m being completely serious: the only way for FOSS to compete is by producing good products and bringing them to market. If FOSS advocates keep trying to fight some software licensing culture war instead of producing good technology, they’re not going to change anyone’s mind. 99.999% of people do not give two shits about a software license, they just want to use a damn phone.


It was an analogy. You're moving the goalposts and ignored the latter point.

And I'm not a foss advocate, I just want to be able to run software of my choosing and without spyware, as has been the case since the advent of personal computing.

As a side note, legality seems irrelevant to your position. What if a world government mandated optional sideloading + unlocking? Wouldn't you then argue against that law?


I know it’s an analogy. I just think it was a bad one. The desire for nerds to run unusual software on their phone is not really a life or death situation. I think it’s important to remember that in context, the number of us who care about this issue rounds to about zero. Most people using a phone don’t care.

I also want to run the software of my choosing. But there’s not a single phone you can completely do that with. Some of this is due to design decisions, some of it is due to corporate lock-in, and some of it is due to regulatory requirements.

I wouldn’t be against a law requiring side loading and unlocking, I would be in favor of it. This only addresses part of the software on a phone, though. There’s a lot of software on a phone beyond user space applications.

But I do think it would be reasonable to put some hurdles to make it difficult to do. There are completely valid reasons to protect the average user from being scammed by malicious software.


It sounds like we largely agree, then, so I'm not sure what you were arguing in the first place. That because the companies are legally able to do this and that [hardware-based] jailbreaking is possible in theory, it can't be opposed?

To your other point, firmware is another battle entirely and currently has less practical value.


Yet it happens all the time. More than half of Android phones are infected. So again, a poor argument for security. If anything, by opening it up, we (the collective nerds) could help harden it. Protect it. Improve it.


Not very "carbon friendly". But the recycler they mention will probably kindly ship it to a foreign landfill.


The malware excuse is just a palatable false pretense. "We have to protect granny!" Of course, she is getting fleeced by plain scam calls, not somehow sideloading apks onto her idevice, but the truth doesn't help advance their narrative.


Granny can get scammed using Anydesk, available on Google Play.


Imagine that metaphorical granny that in an instant catches fire and turns into ash if the governments and large corporations don't have complete control over our lives.

What a lovely granny that totally exists.


I suspect it's not grandma getting scammed by APKs, but people installing cracked versions of spotify/youtube/paid games.


> cracked versions of spotify/youtube/paid games

This doesn't make much sense to me.

To put the strongest face on it, by "cracked" youtube, you mean a version that shows the cracker's ads and maybe somehow generates extra clicks (or whatever) so they can get money out of it?

Cracked spotify? In my mind that's just like YouTube, almost entirely server-side. I guess you're talking about hijacking ads here, too? I feel like a "real" crack of Spotify would let you listen to music for free, but that should be impossible (unless their SWEs are incompetent).


You are approaching this as if the malicious developer were trying to add useful features for the users.

But in practice, these “apps that look like popular apps” are not intended to just be adware-less versions of the popular apps. They are frequently “hide the ads, inject the malware with more permissions” Trojan horses.


I think there is likely a dual motive from Google where they both want to stop malware _and_ stop people blocking youtube ads. The malware problem is real though.


Yes, but using a real problem as a vehicle for increased control and permission is, in and of itself, a Trojan horse.

Google is doing the same thing the fake apps are doing. Real problem: bad ads. Solution: cracked app. Trojan: too many permissions, steals data.

Google: problem: bad apps. Solution: advanced Google DRM. Trojan: too many permissions, steals data.


They mean apps like SmartTube, Vanced, Instander, Spotify Premium Mod which block ads or grant other premium features for free.


no, cracked as in the ad-free premium versions, without paying for them


Those "cracked" versions often require extra permissions.

My favorite was a local "discover which of your contacts is on the leaked Covid quarantine list[1]" scam app. It claimed that the extra permission dialogs are just fearmongering by Google, who is in cahoots with big pharma and wants covid to spread to sell more medications.

[1] In fact, no such leak has ever taken place, its existence was just part of the setup for the scam.


My mother in law is constantly worried by some Google Ads in random apps that her phone is hacked...


Did she ever get anything sideloaded like that? I have downloaded malware by mistake before. Not once were they allowed to proceed with installation. The only way I got anything sideloaded was if I installed the first one (which is always F-Droid) deliberately via ADB after I enabled developer mode.


No, her phone is clean. The point is Google Ads are quite often of questionable quality, with bullshit scaring unaware people. But then, as a "solution", Google worries about grannies being tricked into installing APKs, so they turn into gatekeepers of sideloading for everyone. Absurd.


Cell carriers will just start requiring the attestation as well. And eventually, even an internet connection will - wifi routers will have to attest to ISP equipment, etc.

The final phase is "AI" monitoring everything you do on your devices. Eventually it won't just be passive, either, but likely active: able to change books you read and audio you listen to on-the-fly without your consent. It will be argued that this ok because the program is "objective".


At this point, I would stop using commercial cell carriers and ISP-provided equipment altogether, even if that means setting up mesh networks with an underground community. User control or bust.


Will you elaborate?


If all major platforms do so in concert, it actually is the same.

