A year ago I used Azure Trusted Signing to codesign FOSS software that I distribute for Windows. It was the cheapest way to give away free software on that platform.
A couple of months ago I needed to renew the certificate because it expired, and I ran into the same issue as the author here: verification failed, and they refused to accept any documentation I gave them. Very frustrating experience, especially since there is no human support available at all, for a product I was willing to pay for and use!
We ended up getting our certificate sourced from https://signpath.org and have been grateful to them ever since.
tsujamin 1 day ago [-]
For what it’s worth, Trusted Signing verification has been a moving target over the last 12 months. It was open to individuals, then it was closed to everyone except (iirc) US businesses with DUNS numbers, then it opened again to US-based individuals (and perhaps a few other countries).
My completely uninformed guess was that someone had done something naughty with Trusted Signing-issued code signing certificates.
Anyway, when I first saw the VeraCrypt thing this morning my initial reaction was “I wonder if this is them pushing developers onto trusted signing the hard way?”
michaelt 21 hours ago [-]
I don't know anything about Trusted Signing verification, but I do know from reports on 'mini umbrella company fraud' that if you're a fraudster, there are people in the Philippines who will happily sign their name to western countries' official paperwork in exchange for $2000 or so. Understandably, as that's more than the country's median annual income.
So I can see why offering trusted signing for individuals worldwide would come with certain challenges.
VadimPR 1 day ago [-]
I'm in Europe and ended up creating an organization since I have my own company, but they messed up the verification of one of the legitimate documents, and there was no way to reach them once they made that mistake. Frustrating, and definitely a lost customer for them.
dolmen 14 hours ago [-]
Anyway, when I first saw the VeraCrypt thing this morning my initial reaction was “I wonder if Iran uses VeraCrypt”
account42 13 hours ago [-]
It's absurd that anyone should pay Microsoft or their goons anything to provide free software for their platform. Code signing is a scam.
riedel 1 day ago [-]
I like the idea of a central signing authority for open source. While this might go against the spirit of open source, I think it would eventually create a critical mass and an outcry if Microsoft or Google played games with them. Foundations might also be a good way to protect against legal trouble when distributing OSS under different regulations. I am imagining e.g. an F-Droid that plays Google's game. With reproducible or at least audited builds, some trusted authorities could actually produce more trusted builds, especially in times of supply-chain attacks. However, I think such distribution authorities would need really good governance and a lot of funding.
AnthonyMouse 24 hours ago [-]
There is no real advantage of a central signing authority. If you use Debian the packages are signed by Debian, if you use Arch they're signed by Arch, etc. And then if one of them gets compromised, the scope of compromise is correspondingly limited.
You also have the verification happening in the right place. The person who maintains the Arch curl package knows where they got it and what changes they made to it. What does some central signing authority know? That the Arch guy sent them some code they don't have the resources to audit. But then you have two different ways to get pwned, because you get signed malicious code if a compromised maintainer sends it to the central authority to be signed, or if the central authority itself gets compromised and signs whatever it wants.
woodruffw 23 hours ago [-]
All PKI topologies have tradeoffs. The main benefit to a centralized certification/signing authority is that you don't have to delegate the complexity of trust to peers in the system: a peer knows that a signature is valid because it can chain it back to a pre-established root of trust, rather than having to establish a new degree of trust in a previously unknown party.
The downside to a centralized authority is that they're a single point of failure. PKIs like the Web PKI mitigate this by having multiple central authorities (each issuing CA) and forcing them to engage in cryptographically verifiable auditability schemes that keep them honest (certificate transparency).
It's worth noting that the kind of "small trusted keyring" topology used by Debian, Arch, etc. is a form of centralized signing. It's just an ad-hoc one.
AnthonyMouse 22 hours ago [-]
> a peer knows that a signature is valid because it can chain it back to a pre-established root of trust, rather than having to establish a new degree of trust in a previously unknown party.
So the apt binary on your system comes with the public keys of the Debian packagers and then verifies that packages are signed by them, or by someone else whose keys you've chosen to add for a third party repository. They are the pre-established root of trust. What is obtained by further centralization? It's just useless indirection; all they can do is certify the packages the Debian maintainers submit, which is the same thing that happens when they sign them directly and include their own keys with the package management system instead of the central authority's, except that now there isn't a central authority to compromise everyone at once or otherwise introduce additional complexity and attack surface.
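This per-repository trust is visible in apt's own configuration: a deb822 sources entry can pin a repository to exactly one signing key via `Signed-By` (the repository name and key path below are hypothetical):

```text
# /etc/apt/sources.list.d/example.sources (hypothetical third-party repo)
Types: deb
URIs: https://repo.example.org/debian
Suites: stable
Components: main
# Only packages whose Release file is signed by this key are trusted:
Signed-By: /usr/share/keyrings/example-archive-keyring.gpg
```

Compromising this repository's key affects only the packages installed from it; the distribution's own keyring is untouched.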
> PKIs like the Web PKI mediate this by having multiple central authorities (each issuing CA) and forcing them to engage in cryptographically verifiable audibility schemes that keep them honest (certificate transparency).
Web PKI is the worst of both worlds omnishambles. You have multiple independent single points of failure. Compromising any of them allows you to sign anything. Its only redeeming quality is that the CAs have to compete with each other and CAA records nominally allow you to exclude CAs you don't use from issuing certificates for your own domain, but end users can't exclude CAs they don't trust themselves, most domain owners don't even use CAA records, and a compromised CA could ignore the CAA record and issue a certificate for any domain regardless.
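For reference, a CAA record is a one-line DNS entry; a zone-file sketch (hypothetical domain):

```text
; Only Let's Encrypt may issue certificates for this domain
example.org.  3600  IN  CAA  0 issue "letsencrypt.org"
; Other CAs are expected to refuse issuance, but nothing
; cryptographically forces a compromised CA to honor this.
```

As the comment notes, CAA is a policy signal checked at issuance time by honest CAs, not a guarantee clients can verify.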
> It's worth noting that the kind of "small trusted keyring" topology used by Debian, Arch, etc. is a form of centralized signing. It's just an ad-hoc one.
Only it isn't really centralized at all. Each package manager uses its own independent root of trust. The user can not only choose a distribution (apt signed by Debian vs. apt signed by Ubuntu), they can use different package management systems on the same distribution (apt, flatpak, snap, etc.) and can add third party repositories with their own signing keys. One user can use the amdgpu driver which is signed by their distribution and not trust the ones distributed directly by AMD, another can add the vendor's third party repository to get the bleeding edge ones.
This works extremely well. There are plenty of large trustworthy repositories like the official ones of the major distributions for grandma to feel safe in using, but no one is required to trust any specific one nor are people who know what they're doing or have a higher risk tolerance inhibited from using alternate sources or experimental software.
woodruffw 18 hours ago [-]
> What is obtained by further centralization?
Nothing, I can’t think of a reason why you would want to centralize further. But that doesn’t mean it isn’t already centralized; the fact that every Debian ISO comes with the keyring baked into it demonstrates the value of centralization.
> Each package manager uses its own independent root of trust.
Yes, each is an independent PKI, each of which is independently centralized. Centralization doesn’t mean one authority; it’s just the way you distribute trust, and it’s the natural (and arguably only meaningful) way to distribute trust in a single-source packaging ecosystem like most Linux distros have.
M95D 9 hours ago [-]
> I like the idea of a central signing authority for open source.
It would be the most corrupt(ible) org ever involved in open source and it would promote locked-down computing, as that would be their main reason to exist. Be careful what you wish for!
VadimPR 1 day ago [-]
If someone is willing to put in the work in governance, FOSS projects would be willing to fund it - at least Mudlet would be. We get income from Patreon to cover the costs.
mschuster91 23 hours ago [-]
There is ossign.org, Certum offers a cheap certificate for FOSS [1], and Comodo offers relatively cheap (but still expensive) certs as well [2]. Not affiliated with any of these services, but these are the ones I remember from the last time I had to dig into this mess, so there might be even more services that I don't recall at the moment.
$300 / year for a code signing cert that won’t pass the SmartScreen filter is wild.
fl0id 23 hours ago [-]
isn't the issue more that this also needs to be included by default in Windows?
dns_snek 1 day ago [-]
This is precisely why we can't allow platform-owners to be the arbiters of what software is allowed to run on our devices. Any software signing that is deemed to be crucial for ensuring grandma-safety needs to be delegated to independent third parties without perverse incentives.
This is what the Digital Markets Act is supposed to protect developers against. Has there been any news regarding the EU's investigation into Apple? Last I remember, they were still reviewing Apple's signing & fee-collection scheme.
duped 1 day ago [-]
There is nothing stopping you from using third party certificates to sign Windows binaries. It's just expensive. You don't even need a MS toolchain or CLI tool for it.
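Indeed, the signing itself can be done entirely with FOSS tooling; a sketch using osslsigncode on Linux (certificate and file names are hypothetical):

```shell
# Sketch: signing a Windows binary on Linux with osslsigncode,
# using a certificate bought from any third-party CA.
# Certificate, password variable, and file names are hypothetical.
osslsigncode sign \
  -pkcs12 mycert.p12 -pass "$CERT_PASSWORD" \
  -n "MyApp" -i https://example.org \
  -t http://timestamp.digicert.com \
  -in MyApp.exe -out MyApp-signed.exe
```

No Windows machine or Microsoft SDK is involved; only the certificate itself costs money.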
dns_snek 1 day ago [-]
> “Users who have enabled system encryption with VeraCrypt may face boot issues after July 2026 because Microsoft will revoke the [certificate authority] that was used to sign the VeraCrypt bootloader,” Idrassi said. “A new Microsoft CA must be used for bootloaders to continue working.”
> Without access to the Microsoft account used for sending software updates, “I will not be able to apply the required new signature to VeraCrypt, making it impossible to boot.”
Capricorn2481 21 hours ago [-]
> It's just expensive
So yes there is.
duped 6 hours ago [-]
Having a fee that's trivial for serious software developers but too high for script kiddies shipping trash is a good thing.
billziss 1 day ago [-]
It is not just VeraCrypt that has been affected by this. There are a bunch of Windows driver developers who have been suddenly kicked out of the "Partner Center" without explanation.
5eplay.com has also been suspended, as well as my company.
valeriozen 23 hours ago [-]
We are seeing the dark side of "Security as a Service". When Microsoft simplifies the signing pipeline (like with Trusted Signing), they also centralize the point of failure. The fact that a FOSS pillar like VeraCrypt can be sidelined due to what looks like an automated account flagging issue with no path to human arbitration shows that the current system is too fragile for critical infrastructure. Secure Boot is a great security feature, but it shouldn't be used as a tool for vendor lock-in through administrative incompetence.
heh the same company that controls your secure boot chain just killed the signing account for the tool that encrypts your disk
ai5iq 20 hours ago [-]
This is the same pattern playing out everywhere. The platform giveth, the platform taketh away. If your software's distribution depends on one company's good graces, you don't really ship it; they do.
salawat 5 hours ago [-]
But nooooooo. All of us screaming bloody murder about UEFI Secure Boot impl's and code signing, and how they were the fundamental primitives to locking users out of general computation were the "paranoid" ones.
The entire Trusted Computing initiative had exactly one beneficiary: people looking to constrain what you did on your own machine. Y'all just set up your "End-of-Analysis" goalposts too early, and blinded yourselves to the maliciousness bundled in silver-tongued beneficent intentions.
We'd be better off as a society all recognizing the inherent risk of computation than lulling people into a habit of "trust us bro" espoused by platform providers. Anyone trying to sell Trust is someone you can't afford to be trusting of.
I'll live with the threat of rootkits if it means no one can pull this kind of shit.
romaniv 1 day ago [-]
I still hope that one of these days people in general will realize that executable signing and SecureBoot are specifically designed for controlling what a normal person can run, rather than for anything resembling real security. The premises of either of those "mitigations" make absolutely no sense for personal computers.
arcfour 1 day ago [-]
I strongly disagree on the Secure Boot front. It's necessary for FDE to have any sort of practical security, it reduces malicious/vulnerable driver abuse (making it nontrivial), bootkits are a security nightmare and would otherwise be much more common in malware typical users encounter, and ultimately the user can control their secure boot setup and enroll their own keys if they wish.
Does that mean that Microsoft doesn't also use it as a form of control? Of course not. But conflating "Secure Boot can be used for platform control" with "Secure Boot provides no security" is a non-sequitur.
whatevaa 1 day ago [-]
Full disk encryption protects against somebody yanking a hard drive from a running server (actually happens) or stealing a laptop. Calling it useless because it doesn't match your threat model... I hate today's security people; they can't threat model for shit.
AnthonyMouse 23 hours ago [-]
> Full disk encryption protects from somebody yanking a hard drive from running server (actually happens) or stealing a laptop.
Both of these are super easy to solve without secure boot: The device uses FDE and the key is provided over the network during boot, in the laptop case after the user provides a password. Doing it this way is significantly more secure than using a TPM because the network can stop providing the key as soon as the device is stolen and then the key was never in non-volatile storage anywhere on the device and can't be extracted from a powered off device even with physical access and specialized equipment.
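A minimal sketch of that revocation property (all names here are hypothetical; a real deployment would use an authenticated network channel, e.g. something like tang/clevis or a custom initrd hook):

```python
# Sketch: a network key-escrow service for FDE, where a stolen device
# can be denied its unlock key server-side. All names are hypothetical.
from typing import Optional


class KeyEscrow:
    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}   # device_id -> FDE key
        self._revoked: set[str] = set()     # devices reported stolen

    def register(self, device_id: str, fde_key: bytes) -> None:
        self._keys[device_id] = fde_key

    def report_stolen(self, device_id: str) -> None:
        # After this, the key is never served again. Since the key was
        # never in the device's non-volatile storage, a powered-off
        # stolen device holds nothing to extract.
        self._revoked.add(device_id)

    def request_key(self, device_id: str) -> Optional[bytes]:
        if device_id in self._revoked:
            return None
        return self._keys.get(device_id)


escrow = KeyEscrow()
escrow.register("laptop-1", b"\x00" * 32)
assert escrow.request_key("laptop-1") is not None  # normal boot can unlock
escrow.report_stolen("laptop-1")
assert escrow.request_key("laptop-1") is None      # stolen device cannot
```

The sketch only shows the revocation logic; authentication of the requesting device and transport security are the hard parts in practice.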
tremon 23 hours ago [-]
> The device uses FDE and the key is provided over the network during boot, in the laptop case after the user provides a password.
Sounds nice on paper, has issues in practice:
1. no internet (e.g. something like Iran)? Your device is effectively bricked.
2. heavily monitored internet (e.g. China, USA)? It's probably easy enough for the government to snoop your connection metadata and seize the physical server.
3. no security at all against hardware implants / base firmware modification. Secure Boot can cryptographically prove to the OS that your BIOS, your ACPI tables and your bootloader didn't get manipulated.
AnthonyMouse 23 hours ago [-]
> no internet (e.g. something like Iran)? Your device is effectively bricked.
If your threat model is Iran and you want the device to boot with no internet then you memorize the long passphrase.
> heavily monitored internet (e.g. China, USA)? It's probably easy enough for the government to snoop your connection metadata and seize the physical server.
The server doesn't have to be in their jurisdiction. It can also use FDE itself and then the key for that is stored offline in an undisclosed location.
> no security at all against hardware implants / base firmware modification. Secure Boot can cryptographically prove to the OS that your BIOS, your ACPI tables and your bootloader didn't get manipulated.
If your BIOS or bootloader is compromised then so is your OS.
mschuster91 21 hours ago [-]
> If your threat model is Iran
Well... they wouldn't be the first ones to black out the Internet either. And I'm not just talking about threats specific to oneself here because that is a much different threat model, but the effects of being collateral damage as well. Say, your country's leader says something that makes the US President cry - who's to say he doesn't order SpaceX to disable Starlink for your country? Or that Russia decides to invade yet another country and disables internet satellites [1]?
And it doesn't have to be politically related either, say that a natural disaster in your area takes out everything smarter than a toaster for days if not weeks [2].
> If your BIOS or bootloader is compromised then so is your OS.
well, that's the point of the TPM design and Secure Boot: that is no longer true. The OS can verify everything executed prior to its startup back to a trusted root. You'd need 0-day exploits; while these exist, including unpatchable hardware issues (iOS checkm8 [3]), they are incredibly rare and expensive.
> Say, your country's leader says something that makes the US President cry - who's to say he doesn't order SpaceX to disable Starlink for your country?
Then you tether to your phone or visit the local library or coffee shop and use the WiFi, or call into the system using an acoustic coupler on an analog phone line or find a radio or build a telegraph or stand on a tall hill and use flag semaphore in your country that has zero cell towers or libraries, because you only have to transfer a few hundred bytes of protocol overhead and 32 bytes of actual data.
At which point you could unlock your laptop, assuming it wasn't already on when you lost internet, but it still wouldn't have internet.
> The OS can verify everything being executed prior to its startup back to a trusted root.
Code that asks for the hashes and verifies them can do that, but that part of your OS was replaced with "return true;" by the attacker's compromised firmware.
Snoozus 15 hours ago [-]
The boot verification code wasn't replaced, because it sits in the encrypted partition.
AnthonyMouse 17 minutes ago [-]
You're assuming the attacker never had write access to the encrypted partition. If the system was compromised while it was running, they do. If the key is in a TPM and they can extract it using a TPM vulnerability or specialized equipment, they do. And keeping the key in the TPM instead of a remote system or external media also gives them a third one: Now the system can boot up and unlock the partition by itself by running the original signed boot chain, giving the attacker the opportunity to compromise the now-running OS using DMA attacks etc. Or they can stick it in a drawer without network access to receive updates until someone publishes a relevant vulnerability in the version of the OS that was on it when it was stolen.
Whereas if the FDE key was never stored on the device to begin with, they need the system to be running and unlocked before the victim realizes they have access, which is the one where you're screwed in either case.
Notice that if they can replace the device without you noticing then they can leave you one that displays the same unlock screen as the original but sends any credentials you enter to the attacker. Once they've had physical access to the device you can't trust it. The main advantage of FDE is that they can't read what was on a powered off device they steal, and then the last thing you want is for the FDE key to be somewhere on the device that they could potentially extract instead of on a remote system or removable media that they don't have access to.
amatecha 22 hours ago [-]
they said network, not internet :)
arcfour 21 hours ago [-]
I (the commenter you responded to) am a security engineer by trade and I'm arguing that SB is useful. I'm not sure if the parent commenter is or isn't a security person but my interactions with other people in the security field have given me the impression that most of them think it's good, too.
So I'm a little confused about the "can't threat model for shit" part; I think these sorts of attacks are definitely within most security folks' threat models, haha
account42 12 hours ago [-]
Security professionals wanting to have security solutions they can sell to people doesn't mean that those people actually need or benefit from those solutions. Security professionals tend to vastly overestimate the threat models relevant for regular people and have no concern for anything other than so-called security.
serf 1 day ago [-]
>It's necessary for FDE to have any sort of practical security
why? do you mean because evil maid attacks exist? anyone that cared enough about that specific vector just put their bootloader on a removable media. FDE wasn't somehow enabled by secure boot.
>bootkits are a security nightmare and would otherwise be much more common in malware
why weren't they more common before?
serious question. Back in the 90s viruses were huge business, BIOS was about as unprotected as it would ever possibly be, and lots of chips came with extra unused memory. We still barely ever saw those kind of malware.
arcfour 1 day ago [-]
> anyone that cared enough about that specific vector just put their bootloader on a removable media. FDE wasn't somehow enabled by secure boot.
Sure, but an attacker could still overwrite your kernel which your untouched bootloader would then happily run. With SB at least in theory you have a way to validate the entire boot chain.
> why weren't they more common before?
Because security of the rest of the system was not at the point where they made sense. CIH could wipe system firmware and physically brick your PC - why write a bootkit then? Malware then was also less financially motivated.
When malware moved from notoriety-driven to financially-driven in the 2000s, bootkits did become more common with things like Mebroot & TDL/Alureon. More recently, still before Secure Boot was widespread, we had things like the Classic Shell/Audacity trojan which overwrote your MBR: https://www.youtube.com/watch?v=DD9CvHVU7B4 and Petya ransomware. With SB this is an attack vector that has been largely rendered useless.
It's also a lot more difficult to write a malicious bootloader than it is to write a usermode app that runs itself at startup and pings a C2 or whatever.
AnthonyMouse 23 hours ago [-]
> Sure, but an attacker could still overwrite your kernel which your untouched bootloader would then happily run.
Except that it's on the encrypted partition and the attacker doesn't have the key to unlock it since that's on the removable media with the boot loader.
They could write garbage to it, but then it's just going to crash, and if all they want is to destroy the data they could just use a hammer.
arcfour 23 hours ago [-]
The attacker does this when the drive is already unlocked & the OS is running.
Backdooring your kernel is much, much more difficult to recover from than a typical user-mode malware infection.
AnthonyMouse 23 hours ago [-]
> The attacker does this when the drive is already unlocked & the OS is running.
But then you're screwed regardless. They could extract the FDE key from memory, re-encrypt the unlocked drive with a new one, disable secureboot and replace the kernel with one that doesn't care about it, copy all the data to another machine of the same model with compromised firmware, etc.
cyberax 1 day ago [-]
> serious question. Back in the 90s viruses were huge business,
No, they were not. They were toys written for fun and/or mischief. The virus authors did not receive any monetary reward from writing them, so they were not even a _business_. So they were the work of individuals, not large teams.
The turning point was Bitcoin. Suddenly it provided all those nice new business models that can be scaled up: mining, stealing cryptowallets, ransomware, etc.
jyrkesh 21 hours ago [-]
Malware was absolutely used to sell botnet access in the 90s, millions of Windows machines were used for DDoS and as anonymous proxies
otterley 18 hours ago [-]
The '90s was a bit too soon for that. Most people using the Internet then were still on dialup, to the extent they were connected at all. There weren't that many DDoSes yet. Even the Trin00 DDoS in 1999 only involved 114 machines.
cyberax 18 hours ago [-]
DDoS for sale were not a big thing until Bitcoin. You couldn't transfer meaningful amounts anonymously.
And no, lol. There were no million-machine botnets in the '90s. You could DDoS entire countries with a few dozen computers; Slammer did that accidentally to Korea.
jrm4 24 hours ago [-]
Secure Boot provides no useful security for an individual user on the machine they own, and as such should be disabled by default.
If you want to enable it for enterprise/business situations, that's fine, but one should be clear about that. Otherwise you get exactly the Microsoft situation you mentioned, and no one even knows about it.
arcfour 23 hours ago [-]
So everyday users should be vulnerable to bootkits and kernel-mode malware...why, exactly? That is useful security. The fact that people do not pursue this type of malware very frequently is an effect of SB proliferation. If it were not the default then these attacks would be more popular.
account42 12 hours ago [-]
Everyday users care most about the files in their home directory (or cloud services these days). The OS kernel and ring 0 aren't any more important to them than that.
jrm4 4 hours ago [-]
Ooh, I like this argument a lot. Right now I'm thinking a good analogy is, you live in a gated community, but the locks on your house and your ring camera are fine -- but your overly annoying gate system makes it hard for people or deliveries to get to you etc.
romaniv 7 hours ago [-]
This is a tiresome argument that is based on a pile of unstated and rather shaky assumptions, ignores the very concept of opportunity costs and does not consider alternative solutions to the problems you seem to consider so important.
For starters, UEFI Secure Boot is actually rather bad at protecting users from bootkits or kernel-mode malware or anything, really. You can search this very website for a giant list of bypasses and news about leaked vendor keys. Not to mention that the CrowdStrike Falcon incident clearly demonstrated that Microsoft is more than happy to sign utterly insecure garbage.
Also, the issues with boot malware and kernel verification could be solved in many other ways, many of which are much more sensible or elegant. For example, by storing the bootloader and its keys on a physically separate read-only medium.
The issues with UEFI Secure Boot are actually the main point of the system, just like the issues with Windows executable signing are the whole point of that system.
jrm4 22 hours ago [-]
Citation please.
There are so many vectors for malware, can't say I'm just going to accept this one on pure "because it's possible."
All of these attacks would be thwarted by SB (and in Petya's case, simply having UEFI enabled at all, since that was only for BIOS machines)
jrm4 4 hours ago [-]
No. What needs to be shown is the existence of actually dangerous bootkits, weighed against the usability cost of UEFI, the ease of prevention, the likelihood and magnitude of harm from said bootkits, and the adverse secondary problems when UEFI is used.
burnt-resistor 21 hours ago [-]
You're arguing for not wearing seatbelts because no evidence that anyone has ever been saved by wearing one has been presented. That's just stupid; it denies ubiquitously understood data and facts.
SecureBoot ensures a valid, signed OS is installed and that the boot process generally hasn't been completely compromised in a difficult-to-mitigate manner. It provides a specific guarantee rather than universal security. Talking about "many vectors" has nothing to do with SecureBoot or boot-time malware.
jrm4 4 hours ago [-]
Absolutely ridiculously terrible analogy.
Car accidents kill and injure LOTS OF PEOPLE EVERY DAY. Provably, and the burden to require the installation of, and -- more importantly -- to use one, is tiny.
Boot malware is nothing like this, either in magnitude of harm nor in ease of use by the user.
This is perhaps closer to preventing car theft by adding facial recognition software that renders the car unusable.
account42 12 hours ago [-]
No, you're arguing for wearing a spacesuit when riding your bicycle.
fsflover 24 hours ago [-]
Instead of proprietary Secure Boot controlled by megacorps, you can use a TPM with Heads, based entirely on FLOSS, with a hardware key like the Librem Key. Works for me and protects against the Evil Maid attack.
arcfour 21 hours ago [-]
You can also use SB with your own keys (or even just hashes)... just because Microsoft's cert is the default included with most commercially sold PCs (since most people run Windows) doesn't mean SB is controlled by them. You can remove their signing cert entirely if you want. I have done this and used my own.
Plus they signed the shim loader for Linux anyway, so they almost immediately gave up any "control" they might have had through SB.
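For the curious, one common route on Linux is the sbctl helper; a command sketch (requires root and the firmware in Setup Mode, and the kernel path is illustrative):

```shell
# Sketch: enrolling your own Secure Boot keys with sbctl.
# Run as root with the firmware switched to Setup Mode first.
sbctl status                       # confirm Setup Mode is active
sbctl create-keys                  # generate your own PK/KEK/db keys
sbctl enroll-keys -m               # enroll them (-m also keeps Microsoft's cert)
sbctl sign -s /boot/vmlinuz-linux  # sign and track your kernel (path illustrative)
sbctl verify                       # check everything in the boot path is signed
```

Dropping the `-m` flag removes Microsoft's certificate from the trust database entirely, as described above, at the cost of having to sign every boot component (including option ROMs on some hardware) yourself.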
kelseyfrog 1 day ago [-]
Anything that restricts user freedom is entirely bad, even if it's at the expense of security.
arcfour 1 day ago [-]
But...it doesn't restrict user freedom. If the user wishes to do so, they can disable SB.
CodesInChaos 1 day ago [-]
And will then be locked out of an increasing number of applications, media, and eventually even websites.
arcfour 1 day ago [-]
I run Linux with Secure Boot and I don't feel locked out of any media, applications, or websites.
My mom uses Secure Boot with Windows and doesn't know or care that it's enabled at all.
heavyset_go 19 hours ago [-]
The OP is describing the status quo on mobile phones and tablets. On mobile, Secure Boot and systems like it are used to lock out the user. If the boot path integrity is altered, some apps won't work or will provide degraded experiences.
What's happening in the article is what has already happened on mobile: mobile OSes require vendor signing to run anything, and the vendor locks 3rd-party drivers out of their OS entirely.
It's yet another step towards desktop computing converging with mobile when it comes to software/firmware/boot integrity attestation, app distribution and signing, and the ability to use your own bootloader and system drivers. When Secure Boot was first rolled out on laptops, Microsoft used it to lock users out of the boot process before it was adapted to let users register their own keys; it can always be returned to that original purpose, which is how it's currently used on mobile.
kelseyfrog 1 day ago [-]
They shouldn't _have_ to do anything. The point is that no demands should be placed upon users.
Same problem with age gating. It's fine, as long as zero additional demands are placed upon users.
robotresearcher 1 day ago [-]
Freedom from the consequences of malware is more valuable than the low cost of turning SecureBoot off if you don’t want it.
We shouldn’t need the hassle of locks on our home and car doors, but we understand they are probably worthwhile for most people.
thisislife2 1 day ago [-]
Do you lock your house or car and permanently handover the keys to some stranger, who you then have to depend on always to lock or unlock it for you?
dwattttt 1 day ago [-]
No? I have locks on my house and car that I have the keys for. That's an argument _for_ secure boot.
jrm4 24 hours ago [-]
It is absolutely not.
It's a decent one for "locks on an apartment building that someone else owns."
But no, purchasing a house ought not include by default "a set of locks that you must work around, permission-wise."
robotresearcher 23 hours ago [-]
Funnily enough, when you buy a house, the first task is to change all the locks.
Y’know, for security.
jrm4 22 hours ago [-]
Sure. Now, of the people who buy houses -- how many of them would find this a difficult or onerous task?
And then, do computers.
Apples and oranges here, for this point.
Spooky23 23 hours ago [-]
Sorry dwattttt, I’m unable to verify your identity and your keys are disabled. If you have an issue, please fax a copy of your DUNS number.
dwattttt 17 hours ago [-]
You don't have the ability to revoke my keys on this machine, that's the point. Not even MS could do that, because these are _my_ keys. The alternative proposed here is no keys at all.
aeternum 1 days ago [-]
What's the improved security argument for terminating VeraCrypt's account though? SB does have clear benefits but what is unclear is the motivation for the account termination.
What's the likelihood that this account ban provides zero security benefit to users and was instead a requirement from the government because VeraCrypt was too hard to crack/bypass?
UrMomsRobotLovr 1 days ago [-]
Are the demands that users become experts in providing their own security against more advanced actors not significantly worse? The control part is unfortunate, but the defaults should let users focus on sharing pictures of cats without fear or the need for advanced cybersecurity knowledge.
bigfatkitten 1 days ago [-]
Users who care enough to do so can enrol their own keys using the extremely well documented process to do that.
Users who don’t care about the runtime integrity of their machine can just turn it off.
Both options are so easy that you could’ve learned how to do them on your machine in the time that you spent posting misinformation in this thread.
brookst 1 days ago [-]
So like banks requiring you to have a PIN on your ATM card, even if you don’t want one… that’s bad? Seatbelt laws are bad?
astrobe_ 1 days ago [-]
I don't know about executable signing, but in the embedded world SecureBoot is also used to serve the customer; id est provide guarantees to the customer that the firmware of the device they receive has not been tampered with at some point in the supply chain.
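The guarantee described here can be sketched in a few lines. This is a toy model assuming a hash-locked boot: real parts typically verify an RSA or ECDSA signature against a public-key digest burned into eFUSEs, but the shape of the check is the same. All names and image contents below are illustrative.

```python
import hashlib

def burn_fuse(firmware: bytes) -> bytes:
    """Factory step: record the digest of the approved image in one-time fuses."""
    return hashlib.sha256(firmware).digest()

def boot_rom_check(fuse: bytes, firmware: bytes) -> bool:
    """Every boot: refuse to run an image whose digest doesn't match the fuses."""
    return hashlib.sha256(firmware).digest() == fuse

official = b"vendor firmware v1.0"   # hypothetical approved image
fuse = burn_fuse(official)

assert boot_rom_check(fuse, official)             # untampered image boots
assert not boot_rom_check(fuse, b"evil payload")  # supply-chain tamper is caught
```

The same mechanism serves the customer or locks the customer out depending entirely on who controls the fused digest, which is the crux of this whole thread.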
tosti 1 days ago [-]
Computers should abide by their owners. Any computer not doing that is broken.
ghighi7878 1 days ago [-]
It's a simple solution to enable in law: force manufacturers to allow the owners of a computer to put any signing key in the BIOS.
We need this law. Once we have it, consumers can get the maximum benefit of secure boot without losing control.
miki123211 1 days ago [-]
But that's how it already works.
If you install Windows first, Microsoft takes control (but it graciously allows Linux distros to use their key). If you install Linux first, you take control.
It's perfectly possible for you to maintain your own fully secure trust chain, including a TPM setup which, e.g., lets you keep a 4-digit PIN while keeping your system secure against brute-force attacks. You can't do that with the 1990s "encryption is all you need" style of system security.
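The "short PIN stays secure" claim rests on TPM measured boot: each boot stage's digest is folded into a Platform Configuration Register, and the disk key is sealed to the expected final PCR value (on Linux, tools such as systemd-cryptenroll automate the sealing). A minimal sketch of the PCR-extend rule, with illustrative stage names:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """TPM PCR extend: new = SHA256(old || SHA256(component))."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure(chain):
    pcr = bytes(32)  # PCRs start zeroed at power-on
    for stage in chain:
        pcr = extend(pcr, stage)
    return pcr

expected = measure([b"firmware", b"bootloader", b"kernel"])
tampered = measure([b"firmware", b"evil bootloader", b"kernel"])

# Any tampered stage changes the final PCR, so a key sealed to
# `expected` is never released and the PIN can't be brute-forced offline.
assert tampered != expected
```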
ray_v 19 hours ago [-]
It's funny, but I just encountered this for the first time the other day. It felt like I had to do a lot of digging to find out how to add my LUKS key to my TPM; it really took some doing on the HP all-in-one I was trying to put Debian on. Maybe because it was Debian being Debian.
201984 23 hours ago [-]
Most embedded processors sadly don't have a BIOS, and the signing key is permanently burned into the processor via eFUSEs.
astrobe_ 3 hours ago [-]
Yes, BIOS is really a PC-thing, AFAIK. Embedded processors have "bootloaders" which often serve a similar purpose of performing the minimal viable hardware initializations in order to load the OS kernel.
PunchyHamster 1 days ago [-]
> Its a simple solution in law to enable. Force manufacturers to allow owners of computer to put any signing key in the BIOS.
...it's already allowed. The problem is that this isn't the default, but an opt-in that takes quite a lot of knowledge to set up.
cferry 1 days ago [-]
I make the analogy with a company, because on that front ownership seems to matter a lot in the Western world. It's as if a business had to accept unfaithful management, appointed by another company it is a customer of, as a condition of using that company's products. Worse, said provider also supplies every other business, and its products are not interoperable. How long before courts jump in to prevent this and give control back to the business owner?
wat10000 1 days ago [-]
This gets tricky. If I click on a link intending to view a picture of a cat, but instead it installs ransomware, is that abiding by its owner or not? It did what I told it to do, but not at all what I wanted.
ghighi7878 1 days ago [-]
We don't need to get philosophical here. You (the admin) can require you (the user) to input a password to signify to you (the admin) that ransomware should be installed when a link is clicked. That way no control is lost.
wat10000 1 days ago [-]
What if the cat pictures are an app too? The computer can't require a password specifically for ransomware, just for software in general. The UI flow for cat pictures apps and ransomware will be identical.
Zak 1 days ago [-]
A computer that can run arbitrary programs can necessarily run malicious ones. Useful operations are often dangerous, and a completely safe computer isn't very useful.
Some sandboxing and a little friction to reduce mistakes is usually wise, but a general-purpose computer that can't be broken through sufficiently determined misuse by its owner is broken as designed.
tosti 23 hours ago [-]
If you connect your computer to the Internet, it can get hacked. If you leave it logged in unattended or don't use authentication, someone else can use it without your permission.
This isn't rocket science and it has nothing to do with artificially locking down a computer to serve the vendor instead of the owner.
Edit: I'd like to add that no amount of extra warranty from the vendors are going to cover the risk of a malware infection.
account42 11 hours ago [-]
The ransomware can encrypt the files in your home directory just as well with secure boot enabled.
This is just another example of how secure boot provides zero additional security for the threat models normal users face.
And what if that customer wants to run their own firmware, e.g. after the manufacturer goes out of business? "Security" in this case conveniently prevents that.
astrobe_ 3 hours ago [-]
Well, that's a different market. What I'm saying is that there are markets in which customers want to be sure that the firmware is from "us".
And those markets are certainly not IoT gizmos, which I suspect induce some knee-jerk reactions, and I understand that because I'm a consumer too.
But big/serious customers actually look at the financial health of the company they buy from, and would certainly not consider running their own firmware on someone else's product: they buy off-the-shelf products because software development (and/or whatever the device does) is not their domain of expertise, most of the time.
hhh 1 days ago [-]
you click the box to turn off secure boot
201984 23 hours ago [-]
And how do you do that on some locked down embedded device? Say, a thermostat for instance.
bakugo 1 days ago [-]
...and then some essential software you need to run detects that and refuses to run. See where the problem is here?
bigfatkitten 1 days ago [-]
It does no such thing if you enrol your own keys using the extremely well documented process to do that.
greycol 1 days ago [-]
It's fair to think of secure boot only in the PC context, but the model very much extends to phones. It seems ridiculous to me that to use a coupon for a Big Mac I have to compromise on what features my phone can run (either by turning on secure boot and limiting myself to the stock OS, or limiting myself to the features and pricing of the one or two phones that allow re-locking).
account42 11 hours ago [-]
And the PC situation is only a leftover due to historical circumstances that will be "corrected" in due time. Microsoft already tried this once with their ARM devices.
201984 23 hours ago [-]
Where is this "extremely well documented process" to enroll new signing keys on an embedded device? I don't see one for any of these embedded processors with secure boot.
Then that customer shouldn't buy a device that doesn't allow for their use case. Exercise some personal agency. Sheesh.
bakugo 1 days ago [-]
What happens when there are no more devices that allow for that use case? This is already pretty much the case for phones, it's only a matter of time until Microsoft catches up.
fsflover 24 hours ago [-]
There are still phones not obeying the megacorps. Sent from my Librem 5.
bakugo 24 hours ago [-]
Does your Librem 5 run banking apps, though?
fsflover 23 hours ago [-]
Waydroid allows running Android apps that don't require SafetyNet. If your bank forces you into the duopoly with no workaround, that's a good reason to switch.
account42 11 hours ago [-]
And you only have that option as long as people oppose that secure boot enabled dystopia.
gjsman-1000 1 days ago [-]
Tradeoffs. Which is more likely here?
1. A customer wants to run their own firmware, or
2. Someone malicious close to the customer, an angry ex, tampers with their device, and uses the lack of Secure Boot to modify the OS to hide all trace of a tracker's existence, or
3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition to ensure the malware loads before the OS, thereby permanently disabling all ability for the system to repair itself from within itself
Apple uses #2 and #3 in their own arguments. If your Mac gets hacked, that's bad. If your iPhone gets hacked, that's your life, and your precise location, at all times.
samlinnfer 1 days ago [-]
1. P(someone wants to run their own firmware)
2. P(someone wants to run their own firmware) * P(this person is malicious) * P(this person implants this firmware on someone else's computer)
3. The firmware doesn’t install itself
Yeah, I think 2 and 3 are vastly less likely, and strictly lower, than 1.
mikestew 1 days ago [-]
As an embedded programmer in my former life, the number of customers that had the capability of running their own firmware, let alone the number that actually would, rapidly approaches zero. Like it or not, what customers bought was an appliance, not a general purpose computer.
(Even if, in some cases, it was just a custom-built SBC running BusyBox, customers still aren't going to go digging through a custom network stack.)
account42 11 hours ago [-]
The customers don't have to install the firmware themselves, they can have a friend do it or pay a repair shop. You know, just like they can with non-computerized tools that they don't fully understand.
mikestew 6 hours ago [-]
I’m not talking about your buddy’s Android phone, the context was embedded systems with firmware you’re not going to find on xda developers. A “friend” isn’t going to know jack shit about installing firmware on an industrial control.
itsdesmond 1 days ago [-]
This guy thinks that if you rephrase an argument but put some symbols around it you’ve refuted it statistically.
P(robably not)
samlinnfer 1 days ago [-]
The argument is that P(customer wants to run their own firmware) cancels out and 2,3 are just the raw probability of you on the receiving end of an evil maid attack. If you think this is a high probability, a locked bootloader won’t save you.
FabHK 1 days ago [-]
Very neat, but 1) is not really P(customer wants to run their own firmware), but P(customer wants to run their own firmware on their own device).
So the first terms in 1) and 2) are NOT the same, and it is quite conceivable that the probability of 2) is indeed higher than that of 1) (which your pseudo-statistical argument aimed, unsuccessfully, to refute).
philistine 1 days ago [-]
As if the monetary gain of 2 and 3 never entered the picture. Malicious actors want 2 and 3 to make money off you! No one can make reasonable amounts of money off 1.
the__alchemist 1 days ago [-]
I encourage you to re-evaluate this. How many devices do you own (or have you owned) that have a microcontroller? (This includes all your appliances, your clocks, and many other things you own that use electricity.) How many of these have you reflashed with custom firmware?
Imagine any of your friends, family, or colleagues (including non-programmers/hackers/embedded engineers). What would their answers be?
account42 11 hours ago [-]
I would reflash almost all my appliances if I could do so easily since they all come with non-optimal behavior for me.
gjsman-1000 1 days ago [-]
On Android, according to the Coalition Against Stalkerware, there are over 1 million victims every year of spyware deliberately placed on an unlocked device by a malicious person close to the victim.
#2 is WAY more likely than #1. And that's on Android which still has some protections even with a sideloaded APK (deeply nested, but still detectable if you look at the right settings panels).
As for #3: the point is that it's a virus. You start with a WebKit bug and get into the kernel from there (it sometimes happens); but this time, instead of a software update fixing it, your device is owned forever. It literally cannot be trusted again without a full DFU wipe.
samlinnfer 1 days ago [-]
And where are the stats, for comparison, on people running their own firmware who are not running stalkerware? You don't need firmware access to install malware on Android, so how many stalkerware victims would actually have been saved by a locked bootloader?
gjsman-1000 1 days ago [-]
The entirety of GrapheneOS is about 200K downloads per update. Malicious use therefore outnumbers it roughly 5 to 1.
> You don’t need firmware access to install malware on Android, so how many of stalkerware victims actually would have been saved by a locked bootloader?
With a locked bootloader, the underlying OS is intact, meaning that the privileges of the spyware (if you look in the right settings panel) can easily be detected, revoked, and removed. If the OS could be tampered with, you bet your wallet the spyware would immediately patch the settings system, and the OS as a whole, to hide all traces.
kuschku 1 days ago [-]
LineageOS alone has around 4 million active users. So malicious use is at most 1:4, not 5:1.
samlinnfer 1 days ago [-]
Assuming we accept your premise that the most popular custom firmware for Android is stalkerware (I don't): this is, of course, firmware-level malware, which of course acts as a rootkit and is fully undetectable. How did the Coalition Against Stalkerware, pray tell, manage to detect such an undetectable firmware-level rootkit on over 1 million Android devices?
dns_snek 13 hours ago [-]
> The entirety of GrapheneOS is about 200K downloads per update. Malicious use therefore is roughly 5-1.
Can you stop this bad faith bullshit please? "Stalkerware" is an app, not an alternate operating system, according to your own source. You're comparing the number of malicious app installs to the number of installs of a single 3rd party Android OS which is rather niche to begin with.
You don't need to install an alternate operating system to stalk someone. And in fact that's nearly impossible to do without the owner noticing because the act of unlocking the bootloader has always wiped the device.
> The Coalition Against Stalkerware defines stalkerware as software, made available directly to individuals, that enables a remote user to monitor the activities on another user’s device without that user’s consent and without explicit, persistent notification to that user in a manner that may facilitate intimate partner surveillance, harassment, abuse, stalking, and/or violence. Note: we do not consider the device user has given consent when apps merely require physical access to the device, unlocking the device, or logging in with the username and password in order to install the app.
> Some people refer to stalkerware as ‘spouseware’ or ‘creepware’, while the term stalkerware is also sometimes used colloquially to refer to any app or program that does or is perceived to invade one’s privacy; we believe a clear and narrow definition is important given stalkerware’s use in situations of intimate partner abuse. We also note that legitimate apps and other kinds of technology can and often do play a role in such situations.
This assumes a high level of technical skill and effort on the part of the stalkerware author, and ignores the unlocked bootloader scare screen most devices display.
If someone brought me a device they suspected was compromised and it had an unlocked bootloader and they didn't know what an unlocked bootloader, custom ROM, or root was, I'd assume a high probability the OS is malicious.
account42 11 hours ago [-]
> And that's on Android which still has some protections even with a sideloaded APK (deeply nested, but still detectable if you look at the right settings panels).
Exactly, secure boot advocates once again completely miss that it doesn't protect against any real threat models.
lazide 1 days ago [-]
Clearly you’ve never met my ex’s (or a past employer). Not even being sarcastic this time.
wombatpm 1 days ago [-]
You expect that stuff to happen with 3-letter agencies.
lazide 1 days ago [-]
Sorry, I have no idea what you are trying to say.
account42 11 hours ago [-]
> 2. Someone malicious close to the customer, an angry ex, tampers with their device, and uses the lack of Secure Boot to modify the OS to hide all trace of a tracker's existence, or
Lol, security people are out of their minds if they think that's actually a relevant concern.
> 3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition to ensure the malware loads before the OS, thereby permanently disabling all ability for the system to repair itself from within itself
Oh no, so now the malware can only permanently encrypt all the user's files and permanently leak their secrets. But hey, at least the user can repair the operating system instead of having to reinstall it. And in practice they can't even be sure about that, because computers are simply too complex.
dns_snek 1 days ago [-]
#2 and #3 are fearmongering arguments and total horseshit, excuse the strong language.
Should either of those things happen the bootloader puts up a big bright flashing yellow warning screen saying "Someone hacked your device!"
I use a Pixel device and run GrapheneOS, the bootloader always pauses for ~5 seconds to warn me that the OS is not official.
root_axis 1 days ago [-]
Yes. They're making the point that your flashing yellow warning is a good thing, and that it's helpful to the customer that a mechanism is in place to prevent it from being disabled by an attacker.
dns_snek 1 days ago [-]
No, they've presented a nonsense argument which Apple uses to ban all unofficial software and firmware as if it had some merit.
burstmode 1 days ago [-]
I don't know about executable signing, but in the embedded world SecureBoot is also used to serve the PRODUCER; id est provide guarantees to the PRODUCER that the firmware of the device they SELL has not been tampered with at some point in the PROFIT chain.
astrobe_ 3 hours ago [-]
Frankly: that's stupid. In case you didn't figure it out, I work in the field and I can tell you that this was not the mindset at the places where I worked.
Galanwe 1 days ago [-]
> id est provide guarantees to the customer that the firmware of the device they receive has not been tampered with
The firmware of the device being a binary blob for the most part... Not like I trust it to begin with.
Whereas my open source Linux distribution requires me to disable SecureBoot.
What a world.
WhyNotHugo 1 days ago [-]
You can set up custom SecureBoot keys on your firmware and configure Linux to boot using it.
There's also plenty of folks combining this with TPM and boot measurements.
The ugly part of SecureBoot is that all hardware comes with MS's keys, and lots of software assumes that you'll want MS in charge of your hardware security, but SecureBoot _can_ be used to serve the user.
Obviously there's hardware that's the exception to this, and I totally share your dislike of it.
Galanwe 1 days ago [-]
> You can set up custom SecureBoot keys on your firmware and configure Linux to boot using it.
Right, but as engineers, we should resist the temptation to equate _possible_ with _practical_.
The mere fact that even the most business-oriented Linux distributions have issues playing along with SecureBoot is worrying. Essentially, SB has become a Windows-only technology.
The promise of what SB could be useful for is even muddier. I would argue that the chances of being a victim of firmware tampering are pretty thin compared to other attack vectors, yet somehow we all ended up with SB, and its most significant achievement is training people that disabling it is totally fine.
dwattttt 1 days ago [-]
[dead]
repelsteeltje 1 days ago [-]
+1
An unsigned hash is plenty of a guard against tampering. The supply chain and any secret sauce that went into that firmware is just trust: trust that the blob is well intentioned, trust that you downloaded it from the right URL and checked the right SHA, trust that the organization running the URL is sanctioned to do so by Microsoft...
Once all of that trust for every piece of software is concentrated in one organization, whether Microsoft, Apple or Google, it has become totally meaningless.
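The "checked the right SHA" step is the mechanical part; the trust question is who published the digest. A minimal sketch, with made-up contents standing in for a real release:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of a downloaded blob, for comparison against a published value."""
    return hashlib.sha256(data).hexdigest()

blob = b"firmware-blob-contents"   # stand-in for the downloaded binary
published = sha256_hex(blob)       # what the vendor's page would list

assert sha256_hex(blob) == published            # download intact
assert sha256_hex(blob + b"x") != published     # corruption/tamper detected
```

Note this only proves the download matches what was published; it says nothing about whether the publisher, or whoever vouches for them, is trustworthy, which is the comment's point.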
mort96 1 days ago [-]
It's to serve the regulators. The Radio Equipment Directive essentially requires the use of secure boot for new devices.
petcat 1 days ago [-]
I happen to like knowing that my mobile device did not have a ring 0 backdoor installed before it left the factory in Asia. SecureBoot gives me that confidence.
mort96 1 days ago [-]
No it doesn't? The factory programs in the secure boot public keys
petcat 1 days ago [-]
The public keys are provided by the developer. Google, or Apple, for example. It's how they know that nothing was tampered with before it left the factory.
mort96 14 hours ago [-]
This is true for phones but not for IoT in general.
realusername 1 days ago [-]
"Nothing has been tampered with" doesn't mean there's no factory backdoor; it just means "same as it left the factory", nothing more.
petcat 1 days ago [-]
Apple or Google know what the cryptographic signature of the boot should be. They provide the keys. It's how they know that "factory reset" does not include covert code installed by the factory. That's what we're talking about.
PunchyHamster 1 days ago [-]
well, unless the govt tells MS to tamper with it
pjmlp 1 days ago [-]
If only people didn't install Ask Jeeves toolbars all over the place and then ask their grandson during vacations to clean their computer.
account42 11 hours ago [-]
Hey I made some good money from that as a kid. And some of the malware that people ended up with was also fairly visually pleasing to a teenager.
prerok 1 days ago [-]
Geez, this brings back memories.
At one time at our university we had desktop dancers installed everywhere. It was kind of funny when one turned up just as a student was about to defend their work in a lab.
gloosx 16 hours ago [-]
Executable signing is also designed to make easy money from selling certificates
gusfoo 1 days ago [-]
> I still hope that one of these days people in general will realize that executable signing and SecureBoot are specifically designed for controlling what a normal person can run, rather than for anything resembling real security
For home/business users I'd agree. But in Embedded / money-handling then it's a life-saver and a really important technology.
account42 11 hours ago [-]
If by "really important technology" you mean it lets companies save a bit on fraud-related expenses then sure. But the world worked just fine with much simpler solutions because secure boot or not we have plenty of ways to discourage most people from committing crimes.
TitaRusell 24 hours ago [-]
Videogames are increasingly demanding secure boot.
asveikau 1 days ago [-]
Apple is also somewhat responsible for the attitude shift with the introduction of iOS. 20-25 years ago a locked down bootloader and only permitting signed code would have been seen by techies as dystopian. It's now quite normalized. They say it's about security but it's always been about control.
Stallman tried to warn us with "tivoization".
duped 1 days ago [-]
This is like saying you shouldn't vaccinate your kids because no one gets polio anymore
account42 11 hours ago [-]
We don't just pump our kids with every vaccine ever developed "just in case", either. Instead we weigh actual risk against possible side effects, a concept most security people seem unable to grasp.
MetroWind 5 hours ago [-]
Apple has trained people to believe their software needs to be signed by someone they don't know or trust.
saltamimi 1 days ago [-]
I'm confused why they can't just generate their own signing key and deploy it alongside the installer.
Using arbiter platforms like this sounds like a great way to footgun yourself.
Someone1234 1 days ago [-]
Because a bad guy can also generate their own signing key and deploy it alongside the installer.
See Notepad++ for how that winds up.
saltamimi 1 days ago [-]
Then you can publish the public Code Signing certificate for download/import or publish it through WinGet.
Using Azure Trusted Signing or any other certificate vendor does not guarantee that a binary is 100% trustworthy, it just means someone put their name on it.
blindriver 22 hours ago [-]
It's okay. I'm pretty sure that after 40+ years of using Microsoft products I'm going to switch fully to Linux and macOS. I'm tired of fighting against Microsoft, even though I am a long-time (and mostly happy) user of Windows. But whatever has been going on in the last few years, especially Recall, has made it dangerous, in my opinion, to keep using Windows. So as they become more and more draconian, it only makes my decision easier and easier. I've had Macs and MacBooks for a while now, but I bought the latest MacBook Pro and I'm very, very happy with it, despite Glass (I barely notice any differences from the previous version).
There's a good reason everyone calls them microslop these days. The sooner we're all able to ditch this crappy company, the better - they're actively holding back the tech industry at this point
mbix77 1 days ago [-]
Yea, I'm in the process of converting our complete ETL infrastructure from SSIS/SQL Server to Python/PostgreSQL. Next step is Office 365, which will be more difficult, but doable since we are a small company anyway.
aetherspawn 16 hours ago [-]
Apple suite (Pages, Numbers etc) works well, has good mobility to mobile, and is free.
Apple Mail and Apple Calendar are fine to replace Exchange, as is Thunderbird (see 1), but Mail is more turnkey (one-click pairing with MS Exchange).
You can downgrade your O365 licenses to Exchange Plan 1 and keep your email hosting at a tiny fraction of the price of full 365 suite.
(1): Beware, Thunderbird has an open and unsolved bug that randomly deletes all your emails, kind of like a one-in-a-million dice roll.
stvltvs 1 days ago [-]
Are you converting the SSIS automatically somehow or rewriting it?
tonyedgecombe 1 days ago [-]
They have been holding back the tech industry for decades now.
embedding-shape 1 days ago [-]
To be fair, the tech industry has been holding itself back for decades now too, since lots of people apparently have a fairly low price for going from being a FOSS evangelist to wearing a "Microsoft <3 Open Source" t-shirt.
trueno 1 days ago [-]
that's just a byproduct of "job creators" holding the keys to a comfortable life over everyones head.
i dont think its fair to conflate the tech industries self-owns with microsofts damages. microsoft has for decades poured untold resources and money into capturing everything they possibly could to sustain themselves with honestly what i call cultural and software vendor lock. we're only just now seeing the gaming industry take its first real footsteps towards non-windows targets, but for the most part the decades of evangelizing Microsoft apis and bankrolling schools and education systems to carry courses for their way of doing things makes that a particularly uphill battle thats going to take a lot more time. people have built entire careers out of the microsoft-way in multiple industries. pure microsoft houses are still everywhere at many orgs, so many of them don't even recognize that there is another path. there's plenty of infra/dbadmin/devops people who are just pure windows still. there's multiple points where microsoft did have the best in class solution for something, but these days you'd be hard pressed to not go another way if you were starting from scratch. problem is such a lift and shift is really hard to do for orgs that have spent decades being a microsoft shop.
in a roundabout way, this sort of translates to real long lasting impact/damage to me. microsoft has always been such a force over history that it caused a massive rift in computing. no matter how much they embrace linux and claim to not fight the uphill battle of open source anymore, that modus operandi of locking people into their suite of things still exists on so many fronts and is in some ways more in your face than it's ever been. there's no benefit of the doubt to give here, i just have a hard time choosing microsoft for... well anything.
ryandrake 1 days ago [-]
Microsoft has been trying to kill everything in computers that's not-Microsoft, for as long as I've been alive. Their actual power comes and goes, strengthens and weakens, but it's been a continuous background threat to personal computing since the first day something other than Microsoft tried to get traction in the industry.
BigTTYGothGF 1 days ago [-]
Looking at the rest of the tech industry in 2026 that might be a blessing.
p_ing 1 days ago [-]
What does this even mean? It's like throwing around the word 'bloat'.
BizarroLand 1 days ago [-]
We can explain it to you, but we can't understand it for you.
Explanation: Microslop is a power hungry, greedy and frankly evil corporation whose only goal is complete financial domination of the government, business, and personal tech industries. They actively promote making regressive software, increasing complexity, and hiding straightforward processes behind an information veil.
Example: Go to learn.microsoft.com and try to actually learn HOW to do anything. You'll read 35 pages of text talking about the concept of working with a specific microslop product but not 1 single explicit example of HOW to accomplish a specific task.
Example: Windows 11
Example: Copilot
The whole company is run by backassward tech hicks and digital yokels who can't think past a dime on the floor for a dollar in customer satisfaction, and somehow they run the majority of non-server space or personal device tech on the planet.
p_ing 23 hours ago [-]
Funny enough, I do read the Copilot Studio and Dynamics Customer Service and Power Platform documentation and understand it.
But reading documentation from any vendor is a skill. Don’t throw me in front of Google or Oracle documentation and expect me to understand it off the bat.
And of course companies in the US are wanting to make money/capture markets. They’re not a charity. None of that has any relation to holding back the industry. Unless you wish to explain how they hold back all FOSS projects.
You don’t need to be rude in your replies. This is HN, not reddit.
giancarlostoro 1 days ago [-]
Outside of work, I don't use Windows very often, if at all. I have a 2017 laptop that Microsoft made, and it is so damn sluggish for absolutely no reason; it's VERY VERY vanilla, mind you.
leptons 1 days ago [-]
Apple also holds back the tech industry in many ways. All companies seem willing to put profits before progress.
red-iron-pine 1 days ago [-]
active directory and excel run the world.
what is apple doing that is similar?
leptons 22 hours ago [-]
How is active directory and excel holding the tech industry back?
Apple is holding the tech industry back by forbidding any browser on iOS except Safari and then refusing to implement any APIs that would allow web applications to compete with their app store. Apple is choosing profit over progress.
trueno 1 days ago [-]
i remember years and years ago learning some posix/shell syntax and working in terminal. felt like my love for windows unraveled in real time. these days using windows... feel like i gotta take a shower after. like many i was just raised on windows it was the household operating system i had like 20 years of general computer usage under my belt on windows before i finally felt a mac trackpad for the first time. that hardware experience alone was the first pillar kicked out upholding my "windows is the best" philosophies. then i got into coding, then i tripped and fell out of hourly boeing slave labor into a sql job (lost 55% yearly income, no regrets yo). then i started discovering the open source world, and learned just how much computing goes on outside of the world of windows and how many insanely bright minds are out there contributing to... not microsoft. now i have linux and macos machines everywhere, i still haven't found the bottom but the last 6-7 years or so have been a really rich journey.
currently have a 32bit win xp env spun up in 86box just to compile a project in some omega old visual studio dotnet 7 and the service pack update at the time (don't ask). it is seriously _wild_ being in there, feels like stepping into a time machine. nostalgia aside, the OS is for the most part... quiet. doesn't bother you, everything is kind of exactly where you expect it to be, no noise in my start menu, there isnt some omega bing network callstack in my explorer, no prompts to o365 my life up.
it feels kinda sad, what an era that was. it's just more annoying to do any meaningful work in windows these days.
im currently working with c/cpp the idiot way (nothing about my story is ever conventional sigh), by picking a legacy project from like 22 years ago. this has forced me to step back into old redhat 7.1+icc5, old windows xp + dotnet7 like i explained above, and im definitely taking the most unpragmatic approach ever diving in here.. but there's one thing that absolutely sticks out to me: microsoft has always tried to capitalize on everything. tool? money. vendor lock. os? money. vendor lock. entire industries/education system capture? lotta money. lotta vendor lock. lotta generational knowledge lock.
they are lucky people are still using github. theyve tried to poke the bear a few times and theyre slowly but surely enshittifying the place, but im just kinda losing any reverence for microsoft altogether. microsoft has been big for a hot minute now, they have their eras. you can feel when things are driven by smart visionary engineers working behind the scenes, and you can tell when things are in pure slop mode microservice get rich or die trying mode. yea, microsoft has.. always been vendor-lock aggro and kinda hostile, but the current era microsoft is by far the grossest it's ever been. see: microsoft teams (inb4 "i use teams every day, i dont have a problem with it")
im aware people smarter than me can write diatribes on why windows is the best at x thing, but im only informed by my own experience of having to use all three (linux/macos/windows) for my professional work life: i grew up thinking windows was the best.. now im like mostly confident that windows is actually the worst lol. by a pretty damn decent margin. i was gaslit for ages
philistine 1 days ago [-]
> feel like i gotta take a shower after
I run Crossover and I feel like I gotta take a shower after. Just knowing there's a folder called drive_c on my Mac is the stuff of nightmares.
shevy-java 1 days ago [-]
Yeah. I felt in a similar manner when I moved to Linux. Microsoft seemed to make people dumber. I do actually use both Linux and Windows (Win10 only), largely for testing various things, including java-related software. But every time I use Windows, I am annoyed at how slow everything is compared to Linux. (I should mention that I compile almost everything from source on Linux, so most of the default Linux stack I don't use; many linux distributions also suck by default, so I have to uncripple the software stack. I also use versioned appdirs similar as to how GoboLinux does, but in a more free form.)
TheOtherHobbes 1 days ago [-]
Microsoft has spent most of its life as a corporate bureaucracy that produces sales-and-marketing content, some of which happens to moonlight as software.
msla 1 days ago [-]
With Windows, you get what you pay for.
In this case, that's an OS controlled by an unaccountable company that can take application software away from you.
Related: If you're the customer, you're the product.
subscribed 1 days ago [-]
Hmmm, so basically Google but you also pay for it?
kgwxd 1 days ago [-]
ChromeOS and Android are definitely comparable.
Already__Taken 1 days ago [-]
Windows actually isn't very cheap.
stronglikedan 1 days ago [-]
agree, because "free" can be neither "cheap" nor "expensive"
jonathanstrange 1 days ago [-]
It's not free at all. If you buy Windows through the official channels it's quite expensive. If you buy it on the grey market, it's dirt cheap, though.
BizarroLand 22 hours ago [-]
And even if you pay $1,000,000/day to use it, it still spies on you and sells your data to outsiders.
shiroiuma 18 hours ago [-]
Exactly, yet lots of people are happy to keep using it and paying the price to do so, despite the existence of free and far superior alternatives.
panzi 1 days ago [-]
I see what you did there.
dark-star 1 days ago [-]
you can always either disable secureboot and driver signature verification, or (the better solution) just enroll your own certificate in your UEFI firmware and sign the driver with that...
askonomm 1 days ago [-]
Ah, yes, the [insert super inconvenient and complex thing to do that most people don’t know, want or should do] will solve it! And when that fails, surely the user can just write their own OS, right? Bunch of skill-issued complainers we the users are.
falcor84 1 days ago [-]
Well, the hope was always that those of us inconvenienced by M$ would all collectively contribute to making Linux distros more convenient for everyone. But we can't ever seem to get inconvenienced enough to actually sufficiently mobilize and/or coordinate such an effort.
weaksauce 1 days ago [-]
It does seem like linux is having its moment right now. there's the money and effort Valve is putting into KDE, making the steamdeck and steammachine polished for their hardware, which helps all users of KDE. cachyos is making having a rolling distro really smooth and snappy on old hardware and making games work mostly ootb. stuff like winboat and wine will let you use the few windows apps you need. you are kinda stuck though if you want to use something like fusion360 or solidworks. freecad has improved quite a bit but it's still like gimp where it's slightly worse UX in a lot of ways.
SV_BubbleTime 1 days ago [-]
Valve is doing great work.
Now… maybe we could condense the 10,000 pointless distros down to a dozen? Oops, nope. Now 10,001, except this one has the menu bar in the middle of the screen and it moves around.
account42 11 hours ago [-]
The distros are not pointless. For every one of them there was a human being who wanted something to work differently, and the nature of open source let them do it. That should be celebrated, and the day we lose that flexibility would be a very sad day.
askonomm 8 hours ago [-]
This. Not to mention that for mainstream users there are mainstream distros that are largely the same as they have always been: Fedora, Ubuntu, Mint. So I never really understood the issue of having tons of distros out there for enthusiasts.
falcor84 10 minutes ago [-]
I think that both perspectives are right. We should celebrate diversity, but there's also power in consensus.
There needs to be some competition between ideas, but if every bit of disagreement about direction ends in "I'm going to build my own distro, with blackjack and hookers", then we as a community won't ever end up building something that can compete with the megacorps.
dark-star 1 days ago [-]
I mean, the super-easy option would be to just use BitLocker for FDE. No hassles, just works. But I figured since everyone here on HN hates MS I wouldn't even bring that up. Don't trust MS? Enroll your own keys.
rstat1 1 days ago [-]
Yes use Bitlocker, the thing that uploads the encryption key to OneDrive "for convenience" thereby negating the whole point of FDE in the first place
dark-star 6 hours ago [-]
by default, yes. Can be disabled with a single click. That's something that even your Grandma can do, as opposed to installing VeraCrypt (with dozens of options on what to encrypt, and how, and when, ...)
rstat1 2 hours ago [-]
Well no, actually, I do not think either of my grandmothers could have done that, nor would they have even known (or cared) what a Bitlocker even was.
malfist 1 days ago [-]
> or (the better solution) just enroll your own certificate in your UEFI firmware and sign the driver with that...
I'll tell Grandma that's what she needs to do.
pixel_popping 1 days ago [-]
Make sure that she sets up a PKI to manage certificate revocation as well; wouldn't want a bad grandson to mess with it.
p_ing 1 days ago [-]
Why would you put Grandma on VeraCrypt in the first place? It's the more 'difficult' option for FDE.
unethical_ban 1 days ago [-]
What's easier, and bitlocker doesn't count. I want my FDE to be based on a password or a keyfile, not simply by some code in the motherboard. I want it encrypted until I, the operator, provide some data to unlock.
In my limited experience with bitlocker, the disk is decryptable automatically as long as it's in the original motherboard.
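What the parent is asking for (an FDE key derived from a passphrase rather than released by hardware on the motherboard) is roughly what LUKS-style setups do: the passphrase goes through a deliberately slow KDF. A minimal sketch, using stdlib PBKDF2 purely because it is in the standard library (LUKS2 actually defaults to Argon2id, and real volume keys are wrapped by the derived key rather than used directly):

```python
import hashlib
import os

def derive_unlock_key(passphrase: str, salt: bytes, iterations: int = 600_000) -> bytes:
    # A slow KDF makes offline brute force expensive. Iteration count
    # is a tunable cost parameter stored alongside the salt.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)                     # stored in the volume header, not secret
key = derive_unlock_key("correct horse", salt)

assert len(key) == 32                     # 256-bit key
assert derive_unlock_key("correct horse", salt) == key   # deterministic
assert derive_unlock_key("wrong guess", salt) != key
```

The point is that nothing on the machine can release the key without the passphrase; the trade-off versus a TPM is that a weak passphrase can be brute-forced offline, which is why the KDF has to be slow.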
p_ing 1 days ago [-]
> and bitlocker doesn't count.
Wat? Bitlocker is the answer to your question.
> In my limited experience with bitlocker, the disk is decryptable automatically as long as it's in the original motherboard.
It's unlocked (not decrypted) when the OS boots, yes. You can optionally enforce (not on Home) other unlock methods, such as PIN before the OS boots.
> I want my FDE to be based on a password or a keyfile, not simply by some code in the motherboard.
That's less secure than TPM.
unethical_ban 1 days ago [-]
If someone steals my laptop, and there is no factor of decryption requiring something I possess or know, then the only use of that disk being encrypted is that I can throw it out more safely at end of life. Thieves/LEO have the data because they have the motherboard.
If bitlocker has a PIN/passphrase decrypt option, then I missed it.
p_ing 23 hours ago [-]
While a thief or LEO could boot the OS, just having the motherboard doesn’t give them access to the underlying data. They would need to have a valid user account.
dark-star 7 hours ago [-]
you should protect your account with a password of course. that will be used to decrypt your drive/data
unethical_ban 3 hours ago [-]
It was not made clear to me that my username/password was the decryption method! I was expecting something like Linux where a separate password is needed.
Furthermore it wasn't intuitive to me that my user account would decrypt more than just my home directory.
dark-star 1 days ago [-]
your grandma is probably fine with BitLocker....
ntoskrnl_exe 1 days ago [-]
And they say Linux is inconvenient because you have to open the terminal every once in a while.
shevy-java 1 days ago [-]
Microsoft wants to control computers. This is why they came up with InsecureBoot - or ad-hoc eliminating accounts willy-nilly style. Microsoft kind of acts like Google here. It is also interesting that the US government is doing absolutely nothing against this despicable behaviour.
red-iron-pine 1 days ago [-]
the US government is owned by corporate interests and has been in some capacity since inception. special mention to the Russians and Israelis and Saudis who also own a piece.
You also have the verification happening in the right place. The person who maintains the Arch curl package knows where they got it and what changes they made to it. Some central signing authority knows what? That the Arch guy sent them some code they don't have the resources to audit? But then you have two different ways to get pwned, because you get signed malicious code if a compromised maintainer sends it to the central authority to be signed, or if the central authority gets compromised and signs whatever it wants.
The downside to a centralized authority is that it's a single point of failure. PKIs like the Web PKI mitigate this by having multiple central authorities (each issuing CA) and forcing them to engage in cryptographically verifiable auditability schemes that keep them honest (certificate transparency).
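The certificate-transparency scheme mentioned above reduces to a Merkle tree: every issued certificate becomes a leaf in a public append-only log, and anyone can check a short audit path against the published root. A toy sketch, assuming a power-of-two number of leaves and using the RFC 6962 convention of prefixing leaf and interior hashes differently:

```python
import hashlib

def leaf_hash(data: bytes) -> bytes:
    # RFC 6962 prefixes leaves with 0x00 so a leaf can't be confused
    # with an interior node.
    return hashlib.sha256(b"\x00" + data).digest()

def node_hash(left: bytes, right: bytes) -> bytes:
    return hashlib.sha256(b"\x01" + left + right).digest()

def build_tree(leaves):
    """Return (root, audit path per leaf). Assumes len(leaves) is a power of two."""
    level = [leaf_hash(l) for l in leaves]
    paths = [[] for _ in leaves]
    idx = list(range(len(leaves)))
    while len(level) > 1:
        nxt = [node_hash(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        for j, pos in enumerate(idx):
            # Record whether our node is the right child, plus its sibling.
            paths[j].append((pos % 2, level[pos ^ 1]))
            idx[j] = pos // 2
        level = nxt
    return level[0], paths

def verify(root, leaf_data, path):
    h = leaf_hash(leaf_data)
    for is_right, sibling in path:
        h = node_hash(sibling, h) if is_right else node_hash(h, sibling)
    return h == root

root, paths = build_tree([b"certA", b"certB", b"certC", b"certD"])
assert verify(root, b"certA", paths[0])
assert not verify(root, b"forged", paths[0])
```

The audit path is logarithmic in the log size, which is why a browser or monitor can check inclusion cheaply; a CA that issues a certificate outside the log, or rewrites history, produces roots that no longer verify.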
It's worth noting that the kind of "small trusted keyring" topology used by Debian, Arch, etc. is a form of centralized signing. It's just an ad-hoc one.
So the apt binary on your system comes with the public keys of the Debian packagers and then verifies that packages are signed by them, or by someone else whose keys you've chosen to add for a third-party repository. They are the pre-established root of trust. What is gained by further centralization? It's just useless indirection: all a central authority can do is certify the packages the Debian maintainers submit, which is the same thing that happens when the maintainers sign packages directly and ship their own keys with the package management system instead of the central authority's, except that in the direct case there isn't a central authority that can compromise everyone at once or otherwise introduce additional complexity and attack surface.
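The pre-established-root-of-trust model described here can be sketched as a toy. The repo names and key fingerprints below are invented, and a fingerprint membership check stands in for the OpenPGP signature verification apt actually performs on the Release file (whose signed manifest then lists per-package hashes):

```python
import hashlib

# Per-repo pinned keyrings, as shipped with the package manager or
# added explicitly by the user. Fingerprints are hypothetical.
TRUSTED_KEYRINGS = {
    "debian": {"A1B2C3"},
    "vendor-gpu": {"D4E5F6"},   # third-party repo the user opted into
}

def verify_package(repo, signer_fpr, payload, expected_sha256):
    # 1. The signer must be in the keyring pinned for *this* repo;
    #    a key trusted for one repo grants nothing for another.
    if signer_fpr not in TRUSTED_KEYRINGS.get(repo, set()):
        return False
    # 2. The payload must match the hash listed in the signed manifest.
    return hashlib.sha256(payload).hexdigest() == expected_sha256

pkg = b"curl package contents"
digest = hashlib.sha256(pkg).hexdigest()
assert verify_package("debian", "A1B2C3", pkg, digest)       # pinned signer
assert not verify_package("debian", "D4E5F6", pkg, digest)   # wrong repo's key
```

Note there is no global authority in this model: compromising the vendor-gpu key affects only users who pinned it, which is the decentralization property the comment is arguing for.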
> PKIs like the Web PKI mitigate this by having multiple central authorities (each issuing CA) and forcing them to engage in cryptographically verifiable auditability schemes that keep them honest (certificate transparency).
Web PKI is the worst of both worlds omnishambles. You have multiple independent single points of failure. Compromising any of them allows you to sign anything. Its only redeeming quality is that the CAs have to compete with each other and CAA records nominally allow you to exclude CAs you don't use from issuing certificates for your own domain, but end users can't exclude CAs they don't trust themselves, most domain owners don't even use CAA records and a compromised CA could ignore the CAA record and issue a certificate for any domain regardless.
> It's worth noting that the kind of "small trusted keyring" topology used by Debian, Arch, etc. is a form of centralized signing. It's just an ad-hoc one.
Only it isn't really centralized at all. Each package manager uses its own independent root of trust. The user can not only choose a distribution (apt signed by Debian vs. apt signed by Ubuntu), they can use different package management systems on the same distribution (apt, flatpak, snap, etc.) and can add third party repositories with their own signing keys. One user can use the amdgpu driver which is signed by their distribution and not trust the ones distributed directly by AMD, another can add the vendor's third party repository to get the bleeding edge ones.
This works extremely well. There are plenty of large trustworthy repositories like the official ones of the major distributions for grandma to feel safe in using, but no one is required to trust any specific one nor are people who know what they're doing or have a higher risk tolerance inhibited from using alternate sources or experimental software.
Nothing, I can’t think of a reason why you would want to centralize further. But that doesn’t mean it isn’t already centralized; the fact that every Debian ISO comes with the keyring baked into it demonstrates the value of centralization.
> Each package manager uses its own independent root of trust.
Yes, each is an independent PKI, each of which is independently centralized. Centralization doesn’t mean one authority; it’s just the way you distribute trust, and it’s the natural (and arguably only meaningful) way to distribute trust in a single-source packaging ecosystem like most Linux distros have.
It would be the most corrupt(ible) org ever involved in open source and it would promote locked-down computing, as that would be their main reason to exist. Be careful what you wish for!
[1] https://shop.certum.eu/open-source-code-signing.html
[2] https://comodosslstore.com/code-signing/comodo-individual-co...
This is what the Digital Markets Act is supposed to protect developers against. Have there been any news regarding EU's investigation into Apple? Last I remember they were still reviewing their signing & fee-collection scheme.
> Without access to the Microsoft account used for sending software updates, “I will not be able to apply the required new signature to VeraCrypt, making it impossible to boot.”
So yes there is.
https://community.osr.com/t/locked-out-of-microsoft-partner-...
https://nitter.net/windscribecom/status/2041929519628443943
> According to a post on Hacker News, the popular VPN client WireGuard is facing the same issue.
Microsoft is building things on top of it:
https://learn.microsoft.com/en-us/azure/aks/container-networ...
The entire Trusted Computing initiative had exactly one beneficiary: people looking to constrain what you do on your own machine. Y'all just set up your "End-of-Analysis" goalposts too early, and blinded yourselves to the maliciousness bundled in silver-tongued beneficent intentions.
We'd be better off as a society all recognizing the inherent risk of computation than lulling people into a habit of "trust us bro" espoused by platform providers. Anyone trying to sell Trust is someone you can't afford to be trusting of.
I'll live with the threat of rootkits if it means no one can pull this kind of shit.
Does that mean that Microsoft doesn't also use it as a form of control? Of course not. But conflating "Secure Boot can be used for platform control" with "Secure Boot provides no security" is a non-sequitur.
Both of these are super easy to solve without secure boot: The device uses FDE and the key is provided over the network during boot, in the laptop case after the user provides a password. Doing it this way is significantly more secure than using a TPM because the network can stop providing the key as soon as the device is stolen and then the key was never in non-volatile storage anywhere on the device and can't be extracted from a powered off device even with physical access and specialized equipment.
An example of such an implementation, since well before TPMs were commonplace: https://www.recompile.se/mandos
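The Mandos-style scheme described above (the volume key lives only on the network, never in the device's non-volatile storage) can be illustrated with a toy client/server on loopback. The real system authenticates and encrypts the exchange (Mandos uses TLS with per-client certificates); this sketch omits all of that and only shows the topology:

```python
import socket
import threading

KEY = b"\xaa" * 32   # 32-byte volume key, held only by the key server

def key_server(listener):
    # Serves the key while the device is considered "not stolen";
    # revocation is simply refusing to answer.
    conn, _ = listener.accept()
    with conn:
        if conn.recv(64) == b"unlock-request\n":
            conn.sendall(KEY)

listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
threading.Thread(target=key_server, args=(listener,), daemon=True).start()

# Boot-time client: the key is fetched into RAM and never written to disk.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"unlock-request\n")
    volume_key = b""
    while len(volume_key) < 32:
        chunk = c.recv(32 - len(volume_key))
        if not chunk:
            break
        volume_key += chunk

assert volume_key == KEY
```

Because the key only ever exists in the running server and in the booting device's RAM, a powered-off stolen device contains nothing to extract, which is the property being argued for in the thread.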
Sounds nice on paper, has issues in practice:
1. no internet (e.g. something like Iran)? Your device is effectively bricked.
2. heavily monitored internet (e.g. China, USA)? It's probably easy enough for the government to snoop your connection metadata and seize the physical server.
3. no security at all against hardware implants / base firmware modification. Secure Boot can cryptographically prove to the OS that your BIOS, your ACPI tables and your bootloader didn't get manipulated.
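Point 3 above is really describing TPM measured boot rather than Secure Boot's signature checks: each boot stage is hashed into a PCR, and the extend operation makes the final value depend on every stage in order, so later software can detect a tampered BIOS or bootloader. A stdlib sketch of that hash chain:

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM PCR extend: new = H(old || H(component)). Order matters and
    # nothing can be "un-measured", so any tampered or reordered stage
    # changes the final value.
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

def measure_boot(stages):
    pcr = b"\x00" * 32                  # PCRs reset to zero at power-on
    for stage in stages:
        pcr = extend(pcr, stage)
    return pcr

good = measure_boot([b"firmware", b"bootloader", b"kernel"])
evil = measure_boot([b"firmware", b"evil-bootloader", b"kernel"])
assert good != evil                     # a swapped stage yields a different PCR
```

In the real system the TPM seals secrets (like a BitLocker key) to an expected PCR value, so the key is simply not released if the measured chain differs from the trusted one.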
If your threat model is Iran and you want the device to boot with no internet then you memorize the long passphrase.
> heavily monitored internet (e.g. China, USA)? It's probably easy enough for the government to snoop your connection metadata and seize the physical server.
The server doesn't have to be in their jurisdiction. It can also use FDE itself and then the key for that is stored offline in an undisclosed location.
> no security at all against hardware implants / base firmware modification. Secure Boot can cryptographically prove to the OS that your BIOS, your ACPI tables and your bootloader didn't get manipulated.
If your BIOS or bootloader is compromised then so is your OS.
Well... they wouldn't be the first ones to black out the Internet either. And I'm not just talking about threats specific to oneself here because that is a much different threat model, but the effects of being collateral damage as well. Say, your country's leader says something that makes the US President cry - who's to say he doesn't order SpaceX to disable Starlink for your country? Or that Russia decides to invade yet another country and disables internet satellites [1]?
And it doesn't have to be politically related either, say that a natural disaster in your area takes out everything smarter than a toaster for days if not weeks [2].
> If your BIOS or bootloader is compromised then so is your OS.
well, that's the point of the TPM design and Secure Boot: that is not true any more. The OS can verify everything being executed prior to its startup back to a trusted root. You'd need 0-day exploits - while these are available including unpatchable hardware issues (iOS checkm8 [3]), they are incredibly rare and expensive.
[1] https://en.wikipedia.org/wiki/Viasat_hack
[2] https://www.telekom.com/de/blog/netz/artikel/lost-place-und-...
[3] https://theapplewiki.com/wiki/Checkm8_Exploit
Then you tether to your phone or visit the local library or coffee shop and use the WiFi, or call into the system using an acoustic coupler on an analog phone line or find a radio or build a telegraph or stand on a tall hill and use flag semaphore in your country that has zero cell towers or libraries, because you only have to transfer a few hundred bytes of protocol overhead and 32 bytes of actual data.
At which point you could unlock your laptop, assuming it wasn't already on when you lost internet, but it still wouldn't have internet.
> The OS can verify everything being executed prior to its startup back to a trusted root.
Code that asks for the hashes and verifies them can do that, but that part of your OS was replaced with "return true;" by the attacker's compromised firmware.
Whereas if the FDE key was never stored on the device to begin with, they need the system to be running and unlocked before the victim realizes they have access; and that's the scenario where you're screwed in either case.
Notice that if they can replace the device without you noticing then they can leave you one that displays the same unlock screen as the original but sends any credentials you enter to the attacker. Once they've had physical access to the device you can't trust it. The main advantage of FDE is that they can't read what was on a powered off device they steal, and then the last thing you want is for the FDE key to be somewhere on the device that they could potentially extract instead of on a remote system or removable media that they don't have access to.
So I'm a little confused about the "can't threat model for shit part," I think these sorts of attacks are definitely within most security folks threat models, haha
why? do you mean because evil maid attacks exist? anyone that cared enough about that specific vector just put their bootloader on removable media. FDE wasn't somehow enabled by secure boot.
>bootkits are a security nightmare and would otherwise be much more common in malware
why weren't they more common before?
serious question. Back in the 90s viruses were huge business, BIOS was about as unprotected as it would ever possibly be, and lots of chips came with extra unused memory. We still barely ever saw those kind of malware.
Sure, but an attacker could still overwrite your kernel which your untouched bootloader would then happily run. With SB at least in theory you have a way to validate the entire boot chain.
> why weren't they more common before?
Because security of the rest of the system was not at the point where they made sense. CIH could wipe system firmware and physically brick your PC - why write a bootkit then? Malware then was also less financially motivated.
When malware moved from notoriety-driven to financially-driven in the 2000s, bootkits did become more common with things like Mebroot & TDL/Alureon. More recently, still before Secure Boot was widespread, we had things like the Classic Shell/Audacity trojan which overwrote your MBR: https://www.youtube.com/watch?v=DD9CvHVU7B4 and Petya ransomware. With SB this is an attack vector that has been largely rendered useless.
It's also a lot more difficult to write a malicious bootloader than it is to write a usermode app that runs itself at startup and pings a C2 or whatever.
Except that it's on the encrypted partition and the attacker doesn't have the key to unlock it since that's on the removable media with the boot loader.
They could write garbage to it, but then it's just going to crash, and if all they want is to destroy the data they could just use a hammer.
Backdooring your kernel is much, much more difficult to recover from than a typical user-mode malware infection.
But then you're screwed regardless. They could extract the FDE key from memory, re-encrypt the unlocked drive with a new one, disable secureboot and replace the kernel with one that doesn't care about it, copy all the data to another machine of the same model with compromised firmware, etc.
No, they were not. They were toys written for fun and/or mischief. The virus authors did not receive any monetary reward from writing them, so they were not even a _business_. So they were the work of individuals, not large teams.
The turning point was Bitcoin. Suddenly it provided all those nice new business models that can be scaled up: mining, stealing cryptowallets, ransomware, etc.
And no, lol. There were no million-machine botnets in the '90s. You could DDoS entire countries with a few dozen computers; Slammer did that accidentally to South Korea.
If you want to enable it for enterprise/business situations, that's fine, but one should be clear about that. Otherwise you get the exact Microsoft situation you mentioned and also no one knows about it.
For starters, UEFI Secure Boot is actually rather bad at protecting users from bootkits or kernel-mode malware or anything, really. You can search this very website to get a giant list of bypasses and news about leaked vendor keys. Not to mention the fact that the CrowdStrike Falcon incident clearly demonstrated that Microsoft is more than happy to sign utterly insecure garbage.
Also, the issues with boot malware and kernel verification could be solved in many other ways, many of which are much more sensible or elegant. For example, by storing the bootloader and its keys on a physically separate read-only medium.
The issues with UEFI Secure Boot are actually the main point of the system, just like the issues with Windows executable signing are the whole point of that system.
There are so many vectors for malware, can't say I'm just going to accept this one on pure "because it's possible."
Petya/NotPetya, Alureon, Carberp/Rovnix, Gapz, LoJax (firmware rootkit!).
All of these attacks would be thwarted by SB (and in Petya's case, simply having UEFI enabled at all, since that was only for BIOS machines)
SecureBoot ensures a valid, signed OS is installed and that the boot process generally hasn't been completely compromised in a difficult-to-mitigate manner. It provides a specific guarantee rather than universal security. Talking about "many vectors" has nothing to do with SecureBoot or boot-time malware.
Car accidents kill and injure LOTS OF PEOPLE EVERY DAY. Provably, and the burden to require the installation of, and -- more importantly -- to use one, is tiny.
Boot malware is nothing like this, either in magnitude of harm nor in ease of use by the user.
This is perhaps closer to preventing car theft by adding facial recognition software that renders the car unusable.
Plus they signed the shim loader for Linux anyways so they almost immediately gave up any "control" they might have had through SB.
My mom uses Secure Boot with Windows and doesn't know or care that it's enabled at all.
What's happening in the article is what has already happened on mobile: running anything on a mobile OS requires vendor signing, and the vendor locks third-party drivers out of their OS entirely.
It's yet another step towards desktop computing converging with mobile when it comes to software/firmware/boot integrity attestation, app distribution and signing, and the ability to use your own bootloader and system drivers. When Secure Boot was first rolled out on laptops, Microsoft used it to lock users out of the boot process before it was adapted to let users register their own keys. It can always be reverted to that original purpose, which is how it's currently used on mobile.
Same problem with age gating. It's fine, as long as zero additional demands are placed upon users.
We shouldn’t need the hassle of locks on our home and car doors, but we understand they are probably worthwhile for most people.
It's a decent one for "locks on an apartment building that someone else owns."
But no, purchasing a house ought not include by default "a set of locks that you must work around, permission-wise."
Y’know, for security.
And then, do computers.
Apples and oranges here, for this point.
What's the likelihood that this account ban provides zero security benefit to users and was instead a requirement from the gov because Veracrypt was too hard to crack/bypass.
Users who don’t care about the runtime integrity of their machine can just turn it off.
Both options are so easy that you could’ve learned how to do them on your machine in the time that you spent posting misinformation in this thread.
We need this law. Once we have this law, consumers can get the maximum benefit of secure boot without losing control.
If you install Windows first, Microsoft takes control (but it graciously allows Linux distros to use their key). If you install Linux first, you take control.
It's perfectly possible for you to maintain your own fully-secure trust chain, including a TPM setup which, e.g., lets you keep a 4-digit PIN while keeping your system secure against brute-force attacks. You can't do that with the 1990s "encryption is all you need" style of system security.
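The 4-digit-PIN claim rests on the TPM's dictionary-attack lockout: the chip itself refuses further unseal attempts after a few failures, so a short PIN can't be brute-forced, online or off. A toy sketch of that gating logic (real lockout windows are minutes to hours; 0.2 s here just keeps the demo fast, and the class name is invented):

```python
import time

class TpmPinGate:
    """Sketch of TPM dictionary-attack protection: after max_failures bad
    PINs, unseal attempts are refused until a lockout period passes."""

    def __init__(self, pin: str, max_failures: int = 3, lockout_s: float = 0.2):
        self._pin = pin
        self._failures = 0
        self._max = max_failures
        self._lockout_s = lockout_s
        self._lockout_until = 0.0

    def unseal(self, pin_attempt: str) -> bytes:
        if time.monotonic() < self._lockout_until:
            raise PermissionError("TPM in lockout")
        if pin_attempt != self._pin:
            self._failures += 1
            if self._failures >= self._max:
                self._lockout_until = time.monotonic() + self._lockout_s
            raise ValueError("bad PIN")
        self._failures = 0
        return b"sealed-volume-key"       # stand-in for the released FDE key

gate = TpmPinGate("4821")
assert gate.unseal("4821") == b"sealed-volume-key"
for guess in ("0000", "0001", "0002"):    # three wrong guesses trip the lockout
    try:
        gate.unseal(guess)
    except ValueError:
        pass
try:
    gate.unseal("4821")                   # correct PIN, but the gate is locked
    locked = False
except PermissionError:
    locked = True
assert locked
```

Contrast with passphrase-only FDE, where an attacker with the disk image can try guesses offline at whatever rate their hardware allows; here every attempt has to go through the chip enforcing the counter.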
...it's already allowed. The problem is that this isn't the default, but opt in that you need quite a lot of knowledge to set up
Some sandboxing and a little friction to reduce mistakes is usually wise, but a general-purpose computer that can't be broken through sufficiently determined misuse by its owner is broken as designed.
This isn't rocket science and it has nothing to do with artificially locking down a computer to serve the vendor instead of the owner.
Edit: I'd like to add that no amount of extra warranty from the vendors are going to cover the risk of a malware infection.
This is just another example of how secure boot provides zero additional security for the threat modes normal users face.
And those markets are certainly not IoT gizmos, which I suspect induce some knee-jerk reactions and I understand that cause I'm a consumer too.
But big/serious customers actually look at the financial health of the company they buy from, and would certainly consider running their own firmware on someone else's product; they buy off-the-shelf products because it's not their domain of expertise (software development and/or whatever the device does), most of the time.
https://pip-assets.raspberrypi.com/categories/1214-rp2350/do...
https://documentation.espressif.com/esp32_technical_referenc...
https://docs.amd.com/v/u/en-US/ug1085-zynq-ultrascale-trm
1. A customer wants to run their own firmware, or
2. Someone malicious close to the customer (an angry ex, say) tampers with their device, and uses the lack of Secure Boot to modify the OS to hide all trace of a tracker's existence, or
3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition to ensure the malware loads before the OS, thereby permanently disabling all ability for the system to repair itself from within itself
Apple uses #2 and #3 in their own arguments. If your Mac gets hacked, that's bad. If your iPhone gets hacked, that's your life, and your precise location, at all times.
2. P(someone wants to run their own firmware) * P(this person is malicious) * P(this person implants this firmware on someone else’s computer)
3. The firmware doesn’t install itself
Yeah, I think 2 and 3 are vastly less likely than 1, strictly lower in probability.
(Even if, in some cases, it was just a custom-built SBC running BusyBox, customers still aren't going to go digging through a custom network stack).
P(robably not)
So, the first terms in 1) and 2) are NOT the same, and it is quite conceivable that the probability of 2) is indeed higher than that of 1) (which your pseudo-statistical argument aimed to refute, unsuccessfully).
Imagine any of your friends, family, or colleagues. (Including some non-programmers/hackers/embedded-engineers) What would their answers be?
#2 is WAY more likely than #1. And that's on Android which still has some protections even with a sideloaded APK (deeply nested, but still detectable if you look at the right settings panels).
As for #3; the point is that it's a virus. You start with a webkit bug, you get into kernel from there (sometimes happens); but this time, instead of a software update fixing it, your device is owned forever. Literally cannot be trusted again without a full DFU wipe.
> You don’t need firmware access to install malware on Android, so how many of stalkerware victims actually would have been saved by a locked bootloader?
With a locked bootloader, the underlying OS is intact, meaning that the privileges of the spyware (if you look in the right settings panel) can easily be detected, revoked, and removed. If the OS could be tampered with, you bet your wallet the spyware would immediately patch the settings system, and the OS as a whole, to hide all traces.
Can you stop this bad faith bullshit please? "Stalkerware" is an app, not an alternate operating system, according to your own source. You're comparing the number of malicious app installs to the number of installs of a single 3rd party Android OS which is rather niche to begin with.
You don't need to install an alternate operating system to stalk someone. And in fact that's nearly impossible to do without the owner noticing because the act of unlocking the bootloader has always wiped the device.
> The Coalition Against Stalkerware defines stalkerware as software, made available directly to individuals, that enables a remote user to monitor the activities on another user’s device without that user’s consent and without explicit, persistent notification to that user in a manner that may facilitate intimate partner surveillance, harassment, abuse, stalking, and/or violence. Note: we do not consider the device user has given consent when apps merely require physical access to the device, unlocking the device, or logging in with the username and password in order to install the app.
> Some people refer to stalkerware as ‘spouseware’ or ‘creepware’, while the term stalkerware is also sometimes used colloquially to refer to any app or program that does or is perceived to invade one’s privacy; we believe a clear and narrow definition is important given stalkerware’s use in situations of intimate partner abuse. We also note that legitimate apps and other kinds of technology can and often do play a role in such situations.
- https://stopstalkerware.org/information-for-media/
If someone brought me a device they suspected was compromised and it had an unlocked bootloader and they didn't know what an unlocked bootloader, custom ROM, or root was, I'd assume a high probability the OS is malicious.
Exactly, secure boot advocates once again completely miss that it doesn't protect against any real threat models.
Lol security people are out of their mind if they think that's actually a relevant concern.
> 3. A malicious piece of firmware uses the lack of Secure Boot to modify the boot partition to ensure the malware loads before the OS, thereby permanently disabling all ability for the system to repair itself from within itself
Oh no so now the malware can only permanently encrypt all the users files and permanently leak their secrets. But hey at least the user can repair the operating system instead of having to reinstall it. And in practice they can't even be sure about that because computers are simply too complex.
Should either of those things happen the bootloader puts up a big bright flashing yellow warning screen saying "Someone hacked your device!"
I use a Pixel device and run GrapheneOS, the bootloader always pauses for ~5 seconds to warn me that the OS is not official.
The firmware of the device being a binary blob for the most part... Not like I trust it to begin with.
Whereas my open source Linux distribution requires me to disable SecureBoot.
What a world.
There's also plenty of folks combining this with TPM and boot measurements.
The ugly part of SecureBoot is that all hardware comes with MS's keys, and lots of software assume that you'll want MS in charge of your hardware security, but SecureBoot _can_ be used to serve the user.
Obviously there's hardware that's the exception to this, and I totally share your dislike of it.
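As a concrete illustration of SecureBoot serving the user, on Linux a user-owned key hierarchy can be set up with a community tool like sbctl — a sketch, assuming the firmware is in UEFI setup mode and that you want Microsoft's vendor certificates kept alongside your own for option-ROM compatibility:

```shell
# Generate your own Platform Key, KEK, and db signing key
sbctl create-keys

# Enroll them into the firmware; --microsoft also keeps
# Microsoft's certificates so GPU option ROMs etc. still load
sbctl enroll-keys --microsoft

# Sign your kernel with your own db key
# (-s records the path so future updates get re-signed)
sbctl sign -s /boot/vmlinuz-linux

sbctl status    # verify secure boot state and key ownership
```

After that, the machine only boots what you signed, and MS's keys are there because you chose to keep them, not because you had no say.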
Right, but as engineers, we should resist the temptation to equate _possible_ with _practical_.
The mere fact that even the most business-oriented Linux distributions have issues playing along with SecureBoot is worrying. Essentially, SB has become a Windows-only technology.
The promise of what SB could be useful for is even muddier. I would argue that the chances of being a victim of firmware tampering are pretty thin compared to other attack vectors, yet somehow we all ended up with SB, and its most significant achievement is training people that disabling it is totally fine.
An unsigned hash is plenty of a guard against tampering. The supply chain and any secret sauce that went into that firmware is just trust: trust that the blob is well intentioned, trust that you downloaded from the right URL and checked the right SHA, trust that the organization running the URL is sanctioned to do so by Microsoft...
Once all of that trust for every piece of software is concentrated in one organization, be it Microsoft, Apple or Google, it has become totally meaningless.
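To be concrete about what an unsigned hash does and doesn't buy you, the integrity check itself is trivial (a sketch; firmware.bin and SHA256SUMS stand in for a real blob and a vendor-published digest file):

```shell
# Create a stand-in blob and a digest file as a vendor would publish it
printf 'example firmware contents' > firmware.bin
sha256sum firmware.bin > SHA256SUMS

# The check proves the bytes match the published digest...
sha256sum -c SHA256SUMS    # prints "firmware.bin: OK"

# ...but everything else (right URL, honest vendor, honest digest file)
# is exactly the trust described above, with or without signatures.
```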
At one time at our university we had desktop dancers installed everywhere. It was kind of funny when one turned up just as a student wanted to defend their work in a lab.
For home/business users I'd agree. But in embedded / money-handling contexts it's a life-saver and a really important technology.
Stallman tried to warn us with "tivoization".
Using arbiter platforms like this sounds like a great way to footgun yourself.
See Notepad++ for how that winds up.
Using Azure Trusted Signing or any other certificate vendor does not guarantee that a binary is 100% trustworthy, it just means someone put their name on it.
Apple Mail and Apple Calendar are fine to replace Exchange, as is Thunderbird (see 1), but Mail is more turnkey (1-click pairing with MS Exchange).
You can downgrade your O365 licenses to Exchange Plan 1 and keep your email hosting at a tiny fraction of the price of full 365 suite.
(1): Beware, Thunderbird has an open and unsolved bug that randomly deletes all your emails, kind of like a one-in-a-million dice roll.
i dont think its fair to conflate the tech industries self-owns with microsofts damages. microsoft has for decades poured untold resources and money into capturing everything they possibly could to sustain themselves with honestly what i call cultural and software vendor lock. we're only just now seeing the gaming industry take its first real footsteps towards non-windows targets, but for the most part the decades of evangelizing Microsoft apis and bankrolling schools and education systems to carry courses for their way of doing things makes that a particularly uphill battle thats going to take a lot more time. people have built entire careers out of the microsoft-way in multiple industries. pure microsoft houses are still everywhere at many orgs, so many of them don't even recognize that there is another path. there's plenty of infra/dbadmin/devops people who are just pure windows still. there's multiple points where microsoft did have the best in class solution for something, but these days you'd be hard pressed to not go another way if you were starting from scratch. problem is such a lift and shift is really hard to do for orgs that have spent decades being a microsoft shop.
in a roundabout way, this sort of translates to real long lasting impact/damage to me. microsoft has always been such a force over history that it caused a massive rift in computing. no matter how much they embrace linux and claim to not fight the uphill battle of open source anymore, that modus operandi of locking people into their suite of things still exists on so many fronts and is in some ways more in your face than it's ever been. there's no benefit of the doubt to give here, i just have a hard time choosing microsoft for... well anything.
Explanation: Microslop is a power hungry, greedy and frankly evil corporation whose only goal is complete financial domination of the government, business, and personal tech industries. They actively promote making regressive software, increasing complexity, and hiding straightforward processes behind an information veil.
Example: Go to learn.microsoft.com and try to actually learn HOW to do anything. You'll read 35 pages of text talking about the concept of working with a specific microslop product but not 1 single explicit example of HOW to accomplish a specific task.
Example: Windows 11
Example: Copilot
The whole company is run by backassward tech hicks and digital yokels who can't think past a dime on the floor for a dollar in customer satisfaction, and somehow they run the majority of non-server space or personal device tech on the planet.
And of course companies in the US are wanting to make money/capture markets. They’re not a charity. None of that has any relation to holding back the industry. Unless you wish to explain how they hold back all FOSS projects.
You don’t need to be rude in your replies. This is HN, not reddit.
what is apple doing that is similar?
Apple is holding the tech industry back by forbidding any browser on iOS except Safari and then refusing to implement any APIs that would allow web applications to compete with their app store. Apple is choosing profit over progress.
currently have a 32bit win xp env spun up in 86box just to compile a project in some omega old visual studio dotnet 7 and the service pack update at the time (don't ask). it is seriously _wild_ being in there, feels like stepping into a time machine. nostalgia aside, the OS is for the most part... quiet. doesn't bother you, everything is kind of exactly where you expect it to be, no noise in my start menu, there isnt some omega bing network callstack in my explorer, no prompts to o365 my life up.
it feels kinda sad, what an era that was. it's just more annoying to do any meaningful work in windows these days.
im currently working with c/cpp the idiot way (nothing about my story is ever conventional sigh), by picking a legacy project from like 22 years ago. this has forced me to step back into old redhat 7.1+icc5, old windows xp + dotnet7 like i explained above, and im definitely taking the most unpragmatic approach ever diving in here.. but there's one thing that absolutely sticks out to me: microsoft has always tried to capitalize on everything. tool? money. vendor lock. os? money. vendor lock. entire industries/education system capture? lotta money. lotta vendor lock. lotta generational knowledge lock.
they are lucky people are still using github. theyve tried to poke the bear a few times and theyre slowly but surely enshittifying the place, but im just kinda losing any reverence for microsoft altogether. microsoft has been big for a hot minute now, they have their eras. you can feel when things are driven by smart visionary engineers working behind the scenes, and you can tell when things are in pure slop mode microservice get rich or die trying mode. yea, microsoft has.. always been vendor-lock aggro and kinda hostile, but the current era microsoft is by far the grossest it's ever been. see: microsoft teams (inb4 "i use teams every day, i dont have a problem with it")
im aware people smarter than me can write diatribes on why windows is the best at x thing, but im only informed by my own experience of having to use all three (linux/macos/windows) for my professional work life: i grew up thinking windows was the best.. now im like mostly confident that windows is actually the worst lol. by a pretty damn decent margin. i was gaslit for ages
I run Crossover and I feel like I gotta take a shower after. Just knowing there's a folder called drive_c on my Mac is the stuff of nightmares.
In this case, that's an OS controlled by an unaccountable company that can take application software away from you.
Related: If you're the customer, you're the product.
Now… maybe we could condense the 10,000 pointless distros down to a dozen? Oops, nope. Now 10,001, except this one has the menu bar in the middle of the screen and it moves around.
There needs to be some competition between ideas, but if every bit of disagreement about direction ends in "I'm going to build my own distro, with blackjack and hookers", then we as a community won't ever end up building something that can compete with the megacorps.
I'll tell Grandma that's what she needs to do.
In my limited experience with bitlocker, the disk is decryptable automatically as long as it's in the original motherboard.
Wat? Bitlocker is the answer to your question.
> In my limited experience with bitlocker, the disk is decryptable automatically as long as it's in the original motherboard.
It's unlocked (not decrypted) when the OS boots, yes. You can optionally enforce (not on Home) other unlock methods, such as PIN before the OS boots.
> I want my FDE to be based on a password or a keyfile, not simply by some code in the motherboard.
That's less secure than TPM.
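For anyone looking for the PIN option mentioned here: on Pro/Enterprise it can be added from an elevated PowerShell prompt roughly like this (a sketch; it assumes the "Require additional authentication at startup" group policy allows TPM+PIN, and that C: is the BitLocker volume):

```shell
# Add a TPM+PIN protector; manage-bde prompts for the PIN interactively
manage-bde -protectors -add C: -TPMAndPIN

# List the protectors now active on the volume to confirm
manage-bde -protectors -get C:
```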
If bitlocker has a PIN/passphrase decrypt option, then I missed it.
Furthermore it wasn't intuitive to me that my user account would decrypt more than just my home directory.