This article was painful to read because of all the misconceptions. A cpio archive is not a filesystem. Author uses initramfs, which is based on tmpfs. Linux can extract cpio to tmpfs. An archive of files and directories is in itself not a program.
Just because something looks similar doesn't mean it's equivalent. Binary programs are executed on the CPU, so if there's an interpreter involved it's hiding in the hardware environment. That's outside the scope of an OS kernel.
If you have a shell script in your filesystem and run it, you need to also provide the shell that interprets the script. Author omits this detail and confuses the kernel with the shell program.
Linux can easily be compiled without support for initramfs and ramdisk. It can still boot and run whatever userland sits in the filesystem.
"Linux initrd interpreter" hurts my brain. That's not how it works.
Edit: should've read further. Still a backwards way of explaining things imho.
astralbijection 22 hours ago [-]
> An archive of files and directories is in itself not a program.
Okay, but you can make the same argument to say that ELF files aren't programs in and of themselves either. In fact, some ELF files are dynamic libraries without an entrypoint, and therefore not actually executable in any meaningful way unless connected to yet another program.
If you can accept that some ELF files are executables and some aren't, then you can also accept that some CPIOs are executables and some aren't. What's the difference between ld.so unpacking an ELF file into RAM and running its entrypoint, and the Linux kernel unpacking an initramfs into RAM and running its entrypoint?
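To make that concrete: the initramfs image Linux consumes is the "newc" cpio format, a 110-byte ASCII header per member, everything 4-byte aligned, and the "entrypoint" is just the member named init that the kernel runs as PID 1. Here's a minimal writer/reader sketch of that format; this is my own illustration from memory of the header layout, not kernel code.

```python
def cpio_entry(name, data, mode=0o100755):
    # "newc" header: the magic "070701" followed by 13 fields,
    # each 8 ASCII hex digits (110 bytes in total).
    hdr = b"070701" + b"".join(b"%08X" % v for v in (
        0,              # inode
        mode,           # mode: regular file, rwxr-xr-x
        0, 0,           # uid, gid
        1,              # nlink
        0,              # mtime
        len(data),      # filesize
        0, 0, 0, 0,     # dev major/minor, rdev major/minor
        len(name) + 1,  # namesize, including trailing NUL
        0,              # checksum (unused when the magic is 070701)
    ))
    out = hdr + name.encode() + b"\0"
    out += b"\0" * (-len(out) % 4)                # header+name padded to 4
    return out + data + b"\0" * (-len(data) % 4)  # data padded to 4

def make_initramfs(files):
    # The kernel unpacks this archive into rootfs at boot and then
    # executes the member named "init"; that name is the whole
    # "entrypoint" convention.
    blob = b"".join(cpio_entry(n, d) for n, d in files.items())
    return blob + cpio_entry("TRAILER!!!", b"", mode=0)

def list_names(blob):
    # Minimal reader: walk entries until the trailer record.
    names, off = [], 0
    while off < len(blob):
        assert blob[off:off+6] == b"070701", "bad cpio magic"
        fsize = int(blob[off+54:off+62], 16)   # filesize field
        nsize = int(blob[off+94:off+102], 16)  # namesize field
        name = blob[off+110:off+110+nsize-1].decode()
        if name == "TRAILER!!!":
            break
        names.append(name)
        off += 110 + nsize
        off += -off % 4                        # re-align to 4 bytes
        off += fsize + (-fsize % 4)
    return names
```

In principle, gzipping the output of make_initramfs({"init": ...}) and handing it to qemu's -initrd should boot it, though I haven't tried with exactly this sketch.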
tosti 14 hours ago [-]
By that logic, everything is executable. Not entirely wrong, but mostly because of vulnerabilities. Not because of a highly contrived way of using a file format to run a program. You could do the same thing with json or xml.
saidnooneever 11 hours ago [-]
i agree, it's linkers and loaders which parse a file and extract meaning, which might be executed by yet another program that runs in the kernel and creates a 'user space' part to run the untrusted code in.
it's not as simple as 'executable file, blap, now it runs'. they made it look like that on the surface so people don't need to bother with the details.
a lot in this article is about the abstractions and maybe how they work. not really sure; like you, i found it hard to read. It doesn't mean it's all wrong though; maybe there are more ways to look at a system which has layered abstractions. Each layer can be a different view and be discussed independently in its design.
if you look at what the CPU and kernel code is doing it's a messy affair :D hard to talk about (fun tho :D and imho good to understand, as you pointed out)
saidnooneever 11 hours ago [-]
ELF files are an 'image' format, which is basically a format to store things in; 'image' being a legacy term.
an elf file is not executable, but depending on what you do, a linker and loader might cause the operating system to execute some or all of its contents at some point.
Hasslequest 21 hours ago [-]
It's the init in the cpio which is the interpreted program, and the rest of the cpio is memory for this interpreted program.
tremon 6 hours ago [-]
How is it interpreted? Something that you load into memory and then set the processor's Instruction Pointer at is not interpreted at all. And in case /init is a shell script, it's not the kernel doing the interpreting -- the interpreter would be /bin/sh, which would still be loaded into memory and executed by the processor. Claiming that machine code is "interpreted" because it still needs to be finalized by a loader is not a clever gotcha -- it's ignorant erasure of relevant distinctions.
jeffbee 20 hours ago [-]
Binary programs are executed on the CPU but the program file is an archive with sections, and only one of them is the program, usually, while the others are all metadata. The CPU isn't capable of understanding the program file at all. Linux has to establish the conditions under which the program runs, that means at a minimum establishing the address space in which the program counter lies then jumping to that address. The instructions for how to do that are in the metadata sections of the ELF executable.
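To put some bytes on that: the first metadata the kernel reads is the 64-byte ELF file header, and the address it eventually jumps to is the e_entry field at offset 24. A rough sketch below; the header values are invented just to exercise the parser, and the real loader in fs/binfmt_elf.c goes on to read program headers, which I skip.

```python
import struct

# 64-bit ELF file header: 16 identification bytes, then fixed fields.
ELF64_EHDR = "<16sHHIQQQIHHHHHH"   # 64 bytes in total

def entrypoint(header):
    fields = struct.unpack(ELF64_EHDR, header[:64])
    assert fields[0][:4] == b"\x7fELF", "not an ELF image"
    return fields[4]   # e_entry: where the program counter gets pointed

# A fabricated header: 64-bit little-endian (class 2, data 1),
# ET_EXEC (2), x86-64 (62), entry address 0x401000.
ident = b"\x7fELF" + bytes([2, 1, 1]) + b"\0" * 9
fake = struct.pack(ELF64_EHDR, ident,
                   2, 62, 1,       # e_type, e_machine, e_version
                   0x401000,       # e_entry
                   64, 0, 0,       # e_phoff, e_shoff, e_flags
                   64, 56, 1,      # e_ehsize, e_phentsize, e_phnum
                   64, 0, 0)       # e_shentsize, e_shnum, e_shstrndx
```

Everything else the CPU never sees: section tables, symbol names, relocations, all consumed by the loader side of things.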
saidnooneever 11 hours ago [-]
not too bad an explanation, though the 'usually' might be clarified: an ELF file 'can have sections marked as executable' (tho ofc i get not wanting to get into segment flags :p), and also a program is potentially cobbled together from many of these ELF files. in most cases the single file is useless. (most cases as in binaries provided by a standard linux distro, now 'producible binaries')
commandersaki 22 hours ago [-]
At least it isn't AI slop!
daveguy 22 hours ago [-]
I dunno, sure seems like "AI research" at least.
saidnooneever 11 hours ago [-]
might be, i wouldn't be sure, but i would encourage the author to dive a lil deeper based on some comments in the thread. There's obviously lots to explore. Anything that goes into this topic is likely to make omissions, have some 'wrong viewing angle' or such things.
it's not a phd paper so that's fine!
vatsachak 1 days ago [-]
Isn't every OS an interpreter for machine code with kernel privileges?
BenjiWiebe 21 hours ago [-]
No. The OS's software doesn't individually read each instruction and decide what to do with it.
It passes it off to the hardware (CPU) which runs the instructions.
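The contrast is easy to see if you write down what an actual interpreter does: a software loop that fetches and decodes every single instruction, like this toy stack machine (opcodes invented). Nothing in the kernel's path for running a native binary looks like this loop; it maps pages and jumps.

```python
def run(program):
    # The defining feature of an interpreter: software, not silicon,
    # fetches, decodes, and dispatches each instruction in turn.
    stack, pc = [], 0
    while True:
        op, arg = program[pc]
        pc += 1
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "halt":
            return stack.pop()
        else:
            raise ValueError("unknown opcode: %r" % (op,))

# 2 + 3, one decoded instruction at a time; result == 5.
result = run([("push", 2), ("push", 3), ("add", None), ("halt", None)])
```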
lolsowrong 20 hours ago [-]
Most of the time. But sometimes, no. See ATL thunk emulation (last I checked, still alive in the windows kernel) and ntvdm handling of the BOP pseudoinstruction.
See also: Jazelle DBX.
Hell, on modern x86 processors, many “native” instructions are actually a series of micro-ops for a mostly undocumented and mostly poorly understood microcode architecture that differs from the natively documented instruction set.
It’s turtles all the way down.
fc417fc802 17 hours ago [-]
Aren't all of them microcoded? Some years back, root was achieved on a line of Intel processors and new instructions were implemented as a proof of concept. There's an academic paper; citation not immediately to hand.
trynumber9 15 hours ago [-]
Some instructions are microcoded but others take the fast path and avoid the microcode sequencer. Can't patch the latter in microcode RAM.
lolsowrong 16 hours ago [-]
I saw the paper from Google last year and thought something in it aligned with not everything running through the microcode engine, though I could be wrong.
fc417fc802 16 hours ago [-]
Might well be the case. I don't think I'm familiar with the paper you're referring to; any chance of at least a vague description?
lolsowrong 14 hours ago [-]
Can’t find the pdf, but it’s all related to the zentool stuff: https://github.com/google/security-research/blob/master/pocs...
Tavis spells it out there pretty quickly:
“The simplest instructions (add, sub, mov, etc) are all implemented in hardware. The more complicated instructions like rdrand, fpatan and cmpxchg are microcoded. You can think of them as a bit like calling into a library of functions written in that RISC-like code.”
saagarjha 19 hours ago [-]
Jazelle and micro-ops are not interpreters, they are executed in hardware.
lolsowrong 16 hours ago [-]
I believe only some parts of Jazelle are handled in hardware, though I don’t know if anybody has got their hands on any of the bits of the software side. I do know there’s documentation on handling unimplemented instructions.
I don’t know how I feel about micro-ops being executed in hardware - I mostly agree, but also, microcode updates exist…
fc417fc802 15 hours ago [-]
An interpreter implemented in hardware is still an interpreter. Hot take, all machine instruction sets are scripting languages and LLVM is a transpiler.
saidnooneever 11 hours ago [-]
The OS is an interface that allows use of system resources. it is usually a collection of software and interfaces that do this these days, because system resources are complex, especially to use securely. the cpu interprets machine code; the OS might tell the CPU what to execute (might; depends on the design).
astralbijection 1 days ago [-]
This one is an interpreter for CPIO files.
Roshan_Roy 19 hours ago [-]
I think the article works better as a mental model than a literal claim. “Linux is an interpreter” feels wrong if you define interpretation strictly at the CPU instruction level, but it becomes more reasonable if you look at the kernel as something that interprets executable formats and environments (ELF, scripts with shebangs, initramfs, etc.). In that sense it’s less about instruction-by-instruction interpretation and more about orchestrating how different representations of programs become runnable. Maybe the confusion here is mixing those two meanings of “interpreter”.
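That looser sense does match how execve() is structured: the kernel walks a list of registered binary formats (binfmt_elf, binfmt_script, binfmt_misc), each of which probes the file's first bytes and either claims it or passes. A toy model of that dispatch; the magic bytes are real, but the handler strings are my own labels, not kernel code.

```python
def probe(header):
    # First registered format whose magic matches claims the file;
    # otherwise exec fails with ENOEXEC. The cpio entry is included
    # only to make a point: the kernel registers no such handler,
    # because an initramfs is unpacked at boot, never execve()d.
    handlers = [
        (b"\x7fELF", "binfmt_elf: map segments, jump to e_entry"),
        (b"#!",      "binfmt_script: re-exec the interpreter named on line 1"),
        (b"070701",  "nobody: a cpio is unpacked at boot, not execve()d"),
    ]
    for magic, action in handlers:
        if header.startswith(magic):
            return action
    return "ENOEXEC"
```

The real list is even dynamic: binfmt_misc lets you register new magic-to-interpreter mappings at runtime through /proc/sys/fs/binfmt_misc/register, which is about as literal as "the kernel interprets executable formats" gets.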
kevinbaiv 10 hours ago [-]
The interesting part isn’t whether the analogy is correct, but that it highlights how much “execution” depends on its environment.
ssyhape 9 hours ago [-]
I mean you can call anything an interpreter if you squint hard enough right? CPU interprets x86, microcode interprets that, turtles all the way down. The real question is whether the framing is useful, not whether it's "correct." And I actually kinda like the initramfs-as-program thing -- it maps the boot sequence onto a model I already understand: loader, runtime, dependencies. Last time I had to debug a busted initramfs I was basically thinking about it this way already without realizing it.
djmips 21 hours ago [-]
Everything is an interpreter?
tsoukase 12 hours ago [-]
Yes, except compilers.
tnwhitwell 12 hours ago [-]
Aren’t they the most interpretery of interpreters? Just the languages aren’t Spanish and English, but C and machine code
imtringued 11 hours ago [-]
They're ahead of time interpreters.
qubex 21 hours ago [-]
Turing’s Theta Combinator
ogghostjelly 5 hours ago [-]
How is Turing's theta combinator related to the article? I'm not very familiar with functional programming concepts.
qubex 5 hours ago [-]
Θ(f) = f(Θ(f))
TZubiri 1 days ago [-]
From earlier in the series.
"Okay, so the reason I initially did this was because I didn’t want to pay Contabo an extra $1.50/mo to have object storage just to be able to spawn VPSes from premade disk images."
I think there's a sweet spot between "I spent 50 hours to save 1.50$/mo" and "every engineer should be spending 250K$/mo in tokens".
Host employees still need to eat, if we can't afford 1.50$/mo, then we aren't really professionals and are just coasting on real infrastructure subsidized by professionals that pay for the pay-as-you-go infrastructure.
It's still possible to go even further to these extremes, there's thousands of developers that just coast by on github pages and vercel subdomains. So at least having a VPS puts you ahead of that mass competitively, but trying to save 1.50$/mo is a harsh place to be. At that point I don't think that the technical skills are the bottleneck, it's more likely that there's some social work that needs to be done, and that obsessing over running doom on curl is not a very productive use of one's time in a critical economic spot.
I write this because I am in that spot, but perhaps I'm reading a bit much into it.
PhilipRoman 1 days ago [-]
That sounds like something I would've done... When I was a kid, the 5€/month for a VPS was a massive expense, to the point where I occasionally had to download my 10GB rootfs to my mom's windows laptop, terminate the instance and then rebuild it once I had enough money. Eventually I got an old Kindle that was able to run an app called Terminal IDE which had a Linux shell with some basic programs like busybox, gcc. Spartacus Rex, if you're out there, thank you for making my entire career possible.
miki123211 23 hours ago [-]
And I think this point is heavily under-appreciated in the cloud Vs. on prem debate.
The cost for 1 hour of cloud CPU time is the same (barring discounts), no matter who you are. The cost for 1 hour of engineer time varies wildly. If you're a non-profit or a solo dev, you may even consider that cost to be "free."
If your engineer costs are far lower than what AWS assumes they are, going with AWS is a stupid decision, you're far better off using VPSes / dedicated servers and self-hosting all the services you need to run on top.
bityard 24 hours ago [-]
The author did write that, yes. But it's very obviously a joke. The real reasons are literally the very next paragraph:
> I thought it was a neat trick, a funny shitpost that riffs on the eternal curl | sh debate. I could write a blog post about it, I tell you about how you can do it yourself, one thousand words, I learn something, you learn something, I get internet points, win win.
cardanome 1 days ago [-]
> it's more likely that there's some social work that needs to be done, and that obsessing over running doom on curl is not a very productive use of one's time in a critical economic spot.
It can be a problem but it can be also just a human following their special interests that give them joy.
For me as an ADHD person, engaging with my special interests is a hard requirement to keep my mental health in check and therefore a very good use of my time.
jmalicki 1 days ago [-]
I like the term host employee, carrying the LLM parasite as it uses us to embody itself and reproduce into the singularity.
GranPC 1 days ago [-]
I think they meant employees from the hosting company. But that's a funny interpretation!
pwdisswordfishy 1 days ago [-]
> if we can't afford 1.50$/mo, then we aren't really professionals and are just coasting on real infrastructure subsidized by professionals
This is a strange claim.
Whether someone is getting paid or not to do something is what determines who is a professional, not whether or how much they're paying someone else. (And that's the only thing that matters, unlike the way that "professional" is used as a euphemism in Americans' bizarre discursive repertoire.)
TZubiri 24 hours ago [-]
I think the sense of the word professional here is not as a boolean professional/amateur, but the sense of professionalism, the characteristic of taking business seriously, not letting personal matters intervene, and in this case, investing into tools.
To give an example, suppose you hire a painter and they show up in non-work attire, with no ladder and no brush, and they ask you to buy a can of paint and a brush for them. Compare that to a contractor who bills you flat and brings their own ladder, has work clothing and shoes, a pneumatic spray painter, a breathing mask. Who is more professional?
It's part of a broader debate for sure, OP seems to have done it more for the experience than to actually save 1.50$.
bigyabai 23 hours ago [-]
It always depends on results. It can be unprofessional to design a system that takes an external variable like S3 for granted, especially if it's not needed. As long as the hack isn't worse than the official $1.50 happy-path, you might as well save the end-customer a monthly fee and reduce your attack surface.
I think hacks like these have a positive effect on the industry. It pushes back on meaningless, encroaching monetization and encourages Contabo to reevaluate their service offerings to ensure they justify the price.
pwdisswordfishy 18 hours ago [-]
Nope. There's no broader debate. "Professional" means "X is getting paid for this", not "X is paying something in order for X to be able to do this". It's that simple.
> To give an example, suppose you hire a painter and they show up in non-work attire, with no ladder and no brush, and they ask you to buy a can of paint and a brush for them. Compare that to a contractor who bills you flat and brings their own ladder, has work clothing and shoes, a pneumatic spray painter, a breathing mask. Who is more professional?
Literally meaningless. Are both getting paid? Yes? Then they are both professionals.
You can insist on using "professional" in a strained way to try to facilitate some attempt at being judgmental and gatekeepy, but "professional" means what it means. If you mean something else, then say what you mean and leave out the euphemisms.
fc417fc802 17 hours ago [-]
You can't just conveniently ignore that "professionalism" is a concept that exists and is pretty clearly what the original author meant based on context and content. Refusing to interpret things in the most plausible manner just wastes everyone's time.
For example the phrase "unprofessional professional" means a professional (ie getting paid) who is behaving unprofessionally (ie exhibiting a lack of professionalism).
>adj sense 1c: characterized by or conforming to the technical or ethical standards of a profession
>"did a competent, professional job"
>exhibiting a courteous, conscientious, and generally businesslike manner in the workplace
>"professional behavior/attire"
>"I thought the whole meeting was going to fall apart but you rescued it like a true professional!"
As you can see there's more than 1 sense for the word, I didn't just make it up, it's a well established use of the word.
The definition you refer to is the 2nd sense:
>a: participating for gain or livelihood in an activity or field of endeavor often engaged in by amateurs
> "a professional golfer/poker player"
>"Few think of Idaho as fertile ground for developing professional baseball players."
>b: having a particular profession as a permanent career
>"a professional soldier"
>c: engaged in by persons receiving financial return
> "plays professional football/sports/poker"
astralbijection 1 days ago [-]
... I think you're reading a bit much into it. It's less that I couldn't afford to pay that, and more that I didn't want to pay that, and iterating on the solution I used to dodge that led me down a giant rabbit hole of learning more about Linux while solving stupider and stupider problems posed for myself.
jcul 1 days ago [-]
That four part blog was one of the most entertaining things I've read this year, thanks.
Really in the spirit of "hacker" news IMO.
I get the motivation, it's less avoiding the 1.50 per month and more like a challenge to work around it!
hrmtst93837 1 days ago [-]
Calling cheap hacks unprofessional misses the point: some surprisingly portable tricks only show up when you stop paying for everything on autopilot.
TZubiri 24 hours ago [-]
I really get that, and I value these otherwise pointless hack articles as much as the next guy. But I think I was specifically getting at the fact that these might actually turn into an economically useful skill just by finding a sweet spot in the amount of money they can save.
1.5$/mo is still in the toy realm (and games can be very good for practicing before the real stuff), but using tricks like this to save 50$/mo or 500$/mo or 5k$/mo or 50k$/mo and so on can definitely cross the threshold into actually (massively) useful.
The biggest challenge in crossing that bridge is matching up clients with bad engineers but good budgets, with good engineers with no budget. There's probably thousands of engineers that are currently spinning 5$/mo into impressive architecture for their blog or their 2 user startup, and clients throwing buckets of cash into tokens and zapier/n8n. The world needs Cupids that match those together.
lstodd 22 hours ago [-]
man ld.so:
... (in which case no command-line options to the dynamic linker can be passed and, in the ELF case, the dynamic linker which is stored in the .interp section of the program is executed)
note how the ELF section is named.
shevy-java 1 days ago [-]
Well - Linux is kind of like a somewhat generic interface to have actionable, programmable tasks. One could use Windows for this too, but IMO Linux is in general better suited for that task.
The only area where I think Windows may be better is the graphical user interface. Now, the Windows interface annoys me to no end, but GNOME annoys me and KDE annoys me too. I have been using fluxbox or icewm more, sometimes xfce or mate-desktop when I feel fancy, but by and large I think my "hardcore desktop days" are over. I want things to be fast, efficient and simple. Most of the work I do I handle via the command line, plus a bit of web browsing and writing code/text in an editor (say, 95% of the activities).
chickensong 1 days ago [-]
> I want things to be fast and efficient and simple.
Sway + foot with keybinds to provision each workspace to your liking is pretty nice. No desktop, but really flies for your use case (mine also). Bind window focus to your most comfortable keys.
ux266478 1 days ago [-]
> The only area I think Windows may be better is the graphical user interface.
Nah. You're right about Gnome and KDE, but Windows is even worse because you can't exactly escape Microsoft's insane labyrinth or awful wm. Frankly, I'm not a fan of the Xerox bloodline of desktop interfaces in general. The mpx/mux heritage is the one I like: 9wm, cwm or dwm. Closer to Engelbart and just generally all-around better.