Is this the same compiler that famously spurred Richard Stallman to create GCC [1] when its author "responded derisively, stating that the university was free but the compiler was not"?
It seems to be free now anyway, since 2005 according to the git history, under a 3-clause BSD license.
" Shortly before beginning the GNU Project, I heard about the Free University Compiler Kit, also known as VUCK. (The Dutch word for “free” is written with a v.) This was a compiler designed to handle multiple languages, including C and Pascal, and to support multiple target machines. I wrote to its author asking if GNU could use it.
He responded derisively, stating that the university was free but the compiler was not. I therefore decided that my first program for the GNU Project would be a multilanguage, multiplatform compiler."
And not only was the university 'free' and the compiler not, neither was 'Minix', which was put out there through Prentice Hall in a series of books that you had to pay a fairly ridiculous amount of money for if you were a student there.
So the VU had the two main components of the free software world in their hand and botched them both because of simple greed.
I love it how RMS has both these quotes in the same text:
"Please don't fall into the practice of calling the whole system “Linux,” since that means attributing our work to someone else. Please give us equal mention."
"This makes it difficult to write free drivers so that Linux and XFree86 can support new hardware."
And there are only a few lines between those quotes.
rogerbinns 20 hours ago [-]
I was one of those students saving up the large sum for the book, when Linux was announced. There were other tensions at the time - the biggest was that Minix on 8086 was 16 bit real mode only. Someone had developed patches to run in 32 bit protected mode, but they were invasive and large, and the Minix maintainers would not integrate them as the increased complexity would not help the mission of Minix being easy to learn and tinker with. The filesystem code was also single threaded, essentially doing one request at a time. IIRC there were patches to address that too, also not integrated for the same reason. (Note that the books included print outs of the source so keeping it short did matter.)
This explains the final 2 sentences of the original Linux announcement:
> PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.
Tanenbaum made that deal. He collected royalties from the book (as was his right), but it clearly was a way to make money for him. Just another part of the textbook grift, because students were forced to work on Minix long after that made any sense at all.
Ironically, that single threaded nature of the FS made it a perfect match for my own little OS and I happily hacked it to pieces to bootstrap it using message passing into a FS executable. That trick probably saved me a year in bringing up the kernel to the point that the OS could compile itself, which greatly sped up development.
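The single-threaded FS design criticized above is easy to picture: the server sits in a receive loop and fully completes each request before taking the next, so one slow request stalls everything. A toy sketch (illustrative Python only, not Minix code; all names are invented):

```python
import queue
import threading

def fs_server(requests: "queue.Queue", replies: dict):
    """Toy single-threaded file server: one message at a time.

    Every request blocks all others until it completes, which is
    exactly the property the Minix 1/2 FS was criticized for.
    """
    files = {}  # our 'disk': filename -> contents
    while True:
        msg = requests.get()          # receive the next message
        if msg is None:               # shutdown sentinel
            break
        op, sender, name, data = msg
        if op == "write":
            files[name] = data
            replies[sender] = "ok"    # reply message
        elif op == "read":
            replies[sender] = files.get(name, "")
        requests.task_done()

# Usage: clients just enqueue messages; the server serializes them.
q, out = queue.Queue(), {}
t = threading.Thread(target=fs_server, args=(q, out))
t.start()
q.put(("write", "proc1", "motd", "hello"))
q.put(("read", "proc2", "motd", None))
q.put(None)
t.join()
print(out["proc2"])  # -> hello
```

This also hints at why such a server is so easy to hack out of one OS and into another, as described above: its entire interface is the message format.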
justin66 6 hours ago [-]
> students were forced to work on Minix long after that made any sense at all
Not to defend the textbook grift or the lack of vision here, but I strongly suspect an undergraduate minix course taught at VU would be very good. It’s not obvious to me that it would be inferior to the xv6-based course taught at MIT, for example.
jacquesm 5 hours ago [-]
That's fair, but a similar course based on Linux would be no less effective, and would give the graduate far more practical knowledge on top. Acquiring knowledge isn't free, and purposefully using a toy for commercial reasons when the real thing is freely available is just grift; AT and the VU were well aware of this.
Note that all I'm doing here is taking AT at his word that he developed Minix solely because the source to Unix wasn't free to universities to hack on. They could have adopted Linux from the day that it became available then, or at least the beginning of the next academic year.
justin66 54 minutes ago [-]
I believe in the present day, the premise motivating these undergrad books and courses based on alternatives (VU and Minix, MIT and xv6, Purdue and Xinu, God knows what else) is that Linux has become too complicated for an introductory course. I honestly don’t have any instinct as to whether this is correct pedagogically. I suspect the two main factors are how well the software facilitates getting students situated and in a position to do meaningful programming assignments quickly, and how motivated the students are to work on the software.
I remember taking a security-oriented class ages ago and hacking on an operating system that was already dead as a trilobite, and we were all smart enough to realize this was not a triumph we’d be bragging about to our future children (or recruiters). Bleh.
cryptonector 16 hours ago [-]
Terrible mistakes. People keep repeating these mistakes. Makes me think of Larry McVoy.
mqus 20 hours ago [-]
Re your last paragraphs: I think RMS really meant just the Linux kernel when he wrote that (the topic is drivers, after all), not GNU/Linux the OS or GNU/Linux "the system". So it can be argued that he isn't really contradicting himself.
yjftsjthsd-h 18 hours ago [-]
Agreed. As a practical example, Alpine Linux isn't a GNU/Linux OS, but it does use Linux+Xorg graphics drivers.
phicoh 11 hours ago [-]
Selling ACK meant money for research into distributed systems (Amoeba) and parallel programming languages. I can see that money for research is more attractive than open source.
For MINIX the situation was different and, I think, more unfortunate. AST wanted to make sure that everybody could obtain MINIX and made his publisher agree to distributing the MINIX sources and binaries on floppies. Not something the publisher really wanted; they wanted to sell AST's book. In return, the publisher got (as is usual for books) the exclusive right to distribute MINIX.
Right at the start that was fine, but when Usenet and the Internet took off it became quite painful, with people trying to maintain and distribute patch sets.
jacquesm 11 hours ago [-]
I disagreed strongly with that at the time and still do. The money we're talking about here was a pittance compared to the money already contributed by Dutch society to the university where these people were working. Besides that some of these royalty streams went into private pockets.
A friend of mine was studying under Andy and I had a chat with him about this at his Amstelveen residence prior to the release. He was dead set on doing it that way. As a non-student and relatively poor programmer I pointed out to him that his chosen strategy would make Minix effectively unaffordable to me in spite of his stated goal of 'unlocking unix'. So I ended up in Torvalds's camp when he released Linux as FOSS (I never contributed to either, but I figured as a user I should pick the one that would win the race, even if from a tech perspective I agreed more with Tanenbaum than with Torvalds).
Minix was (is?) flogged to students of the VU for much longer than was beneficial to those students; all that time and effort (many hundreds of man-years by now) could have gone into structurally improving Linux. But that would have required admitting a mistake.
phicoh 8 hours ago [-]
Universities get paid for teaching and research. Any software that is produced is a by-product. Producing production-quality software in a university is not easy, and the university has to find a way to fund it.
MINIX was originally a private project of ast. It worked very well for the goal of teaching students the basics of operating systems.
One thing that might have been a waste of time is making the MINIX utilities POSIX compliant. Then again, many students would like an opportunity to work on something like that. The ones that wanted to work on Linux could just do that. Students worked in their free time on lots of interesting projects that were unrelated to the university.
jacquesm 4 hours ago [-]
> The ones that wanted to work on Linux could just do that.
Sure, but time is a very finite quantity. Wasting a couple of years on Tanenbaum's pet project may have left some residual knowledge about how operating systems work in general, but looking at the developments they pursued, the bulk were such dead ends that even outside the VU there was relatively little adoption. The world had moved to Linux and the VU refused to move with it.
From being ahead they ended up being behind.
phicoh 2 hours ago [-]
I wonder who you are thinking of who 'wasted a couple of years'. Regular students do one course in operating systems. That is a series of lectures and some practical work. The practical work is a couple of weeks at most if you know what you are doing.
Some people spent a lot more time on MINIX, but that was either as a hobby or the PhD students who worked on MINIX3. But MINIX3 generated lots of papers with a best paper award, so that can hardly be seen as wasted from an academic point of view.
jacquesm 23 minutes ago [-]
I have some friends that went that route. They did not come away with anything that helped their careers later on and the 'academic point of view' in CS in NL hasn't been the best way to put food on your table since the days of Dijkstra.
pjmlp 6 hours ago [-]
He means Linux the kernel, getting new drivers.
Another interesting fact: until Linux came to be, GCC was mainly relevant because Sun had started the trend among UNIX vendors of splitting UNIX into user and developer SKUs, thus putting the whole development tooling behind an additional license.
spit2wind 13 hours ago [-]
> I love it how RMS has both these quotes in the same text:
>
> "Please don't fall into the practice of calling the whole system “Linux,” since that means attributing our work to someone else. Please give us equal mention."
>
> "This makes it difficult to write free drivers so that Linux and XFree86 can support new hardware."
>
> And there are only a few lines between those quotes.
I'll be honest, I don't understand your point here?
JJMcJ 6 hours ago [-]
RMS calls it Linux, not GNU/Linux in the second quote.
userbinator 19 hours ago [-]
> the Free University Compiler Kit, also known as VUCK. (The Dutch word for “free” is written with a v.)
I'm not sure if I'm reading satire or they are having some fun trolling.
throwaway81523 16 hours ago [-]
Of course RMS understood the overtone perfectly, but Vrije Universiteit (vu.nl) is the real name of the university. Its name translates to "liberated university". As I understand it, it's a free university in the sense that historically, students of all religions were eligible to attend, as opposed to e.g. the Katholieke Universiteit, which was Catholic.
The liberated part means free from government control. Until the VU, all Dutch universities belonged (indirectly) to the Dutch government.
JJMcJ 6 hours ago [-]
Some universities, especially in Latin America, use the term "autonomous". Is that the same thing as "free" in this context?
phicoh 5 hours ago [-]
Yes. Absence of direct control by the government. The VU was founded for religious reasons, so the main goal was to be able to teach theology according to the particular type of protestant Christianity that the founders of the VU believed in.
yellowapple 12 hours ago [-]
Sounds like Katholieke Universiteit ought to release their own Compiler Kit ;)
jacquesm 11 hours ago [-]
I think the part that he - and you - missed is that tuition at the time was entirely free, so it wasn't just 'free' in one sense of the word.
DonHopkins 10 hours ago [-]
Vrije as in "Not Catholic", not as in beer.
vrijen 16 hours ago [-]
The adjective meaning "free" is "vrij" or "vrije" in Dutch.
Amusingly, the Dutch verb "vrijen" does, in fact, mean to have sex.
andsoitis 8 hours ago [-]
I like the Afrikaans (evolved from Dutch) even better for its streamlined spelling and double-use depending on context:
Vry == "free" (noun) or "to court/kiss/have sex" (verb, contextual).
hbogert 14 hours ago [-]
You really just made an account now to make that point?
gjvc 13 hours ago [-]
his comment was more useful than yours
hbogert 3 hours ago [-]
He made an account named vrijen, which is "having sex" in Dutch, as he himself explained. Not sure if you noticed that part.
actionfromafar 10 hours ago [-]
But it’s correct. :)
Linux the kernel has the drivers.
DonHopkins 11 hours ago [-]
UniPress, RMS's arch enemy Evil Software Hoarder, sold a commercial version of the Amsterdam Compiler Kit as well as Gosling's Emacs.
>UniPress ported and sold a commercial version of the "Extended Amsterdam Compiler Kit" for Andrew Tanenbaum for many CPUs and versions of Unix (like they also ported and sold his Unix version of Emacs for James Gosling), so Emacs might have been compiled with ACK on the Cray, but I don't recall.
>During the late 80's and early 90's, UniPress's Enhanced ACK cost $9,995 for a full source license, $995 for an educational source license, with front ends for C, Pascal, BASIC, Modula-2, Occam, and Fortran, and backends for VAX, 68020, NS32000, Sparc, 80368, and others, on many contemporary versions of Unix.
>Rehmi Post at UniPress also made a back-end for ACK that compiled C to PostScript for the NeWS window system and PostScript printers, called "c2ps", which cost $2,995 for binaries or $14,995 for sources.
>Independently Arthur van Hoff wrote a different C to PostScript compiler called "PdB" at the Turing Institute, not related to c2ps. It was a much simpler, more powerful, more direct compiler written from scratch, and it supported object oriented PostScript programming in NeWS, subclassing PostScript from C or C from PostScript. I can't remember how much Turing sold it for, but I think it was less than c2ps.
My goodness, this is hard to imagine today, when open source has driven the price of software (the code itself) to nil. And those are prices from decades ago. While I'm glad I don't have to pay 15K for a C-to-PostScript compiler, as someone who might have written similar software had I lived back in those days, I can imagine an alternate timeline where I'd be getting paid to write such tools instead of doing it as a hobby project.
> NeScheme.txt
Nice rabbit hole about LispScript, what a cool idea. I've been re-studying Scheme recently, its history and variants like s7, and was appreciating its elegance and smallness as a language, how relevant it still is. One of the books I'm reading uses Scheme for algorithmic music composition. (Notes from the Metalevel: An Introduction to Computer Composition)
samemrecebi 22 hours ago [-]
This does not surprise me at all, if the other stories I heard are true.
anonzzzies 22 hours ago [-]
Go on...
samemrecebi 9 hours ago [-]
Nothing bad, but it just doesn't surprise me, given the reaction he gave to Stallman.
pjmlp 1 days ago [-]
One of the first widely used compiler toolkits with multiple frontends, an intermediate language between the phases, and a common backend.
Contrary to common understanding, LLVM wasn't the first one, and neither was ACK; there are others predating them when diving into the compiler literature.
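The architecture described here, several front ends emitting one intermediate code that shared back ends consume, is what turns an n-languages times m-targets problem into n + m translators. A toy sketch (Python, purely illustrative, unrelated to ACK's actual EM intermediate code):

```python
# Toy multi-frontend / common-backend pipeline, in the spirit of
# (but unrelated to) ACK's design: every frontend lowers its own
# surface syntax to the same stack-machine IR.

def c_like_frontend(src: str):
    # "return 2+3;"  ->  stack-machine IR
    expr = src.removeprefix("return ").removesuffix(";")
    a, b = expr.split("+")
    return [("push", int(a)), ("push", int(b)), ("add",), ("ret",)]

def pascal_like_frontend(src: str):
    # "result := 2 + 3"  ->  the very same IR
    expr = src.split(":=")[1]
    a, b = expr.split("+")
    return [("push", int(a)), ("push", int(b)), ("add",), ("ret",)]

def stack_backend(ir):
    """One 'backend' shared by every frontend: interpret the IR."""
    stack = []
    for op, *args in ir:
        if op == "push":
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "ret":
            return stack.pop()

# Two frontends, one backend: adding a language or a target means
# writing one translator, not a whole new compiler.
print(stack_backend(c_like_frontend("return 2+3;")))           # -> 5
print(stack_backend(pascal_like_frontend("result := 2 + 3")))  # -> 5
```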
unusual-name 1 days ago [-]
It's interesting that they have a Raspberry Pi GPU backend, but neither an ARM backend nor any modern ISA (such as x86-64, AArch64, etc.). Is there any example program that actually runs on the rpi gpu? I skimmed the website, but it is only mentioned in the release notes.
lproven 10 hours ago [-]
> Is there any example program that actually runs on the rpi gpu?
ThreadX is the RasPi firmware. The GPU is the primary processor of the Pi: the ARM cores are essentially just co-processors.
hmry 12 hours ago [-]
Presumably someone wanted to write an RPi bootloader, which runs on the GPU. Several universities have OS programming courses that use old 32-bit RPis. Not sure if anything was actually written, though.
lproven 10 hours ago [-]
> Not sure if anything was actually written though.
The librerpi project:

« librerpi is a FOSS boot firmware based on littlekernel for Raspberry Pi boards; it replaces the proprietary boot firmware normally used to boot. »
The requirements for "flex and yacc" seem to indicate that this is from a time and culture before recursive descent/precedence climbing became the norm.
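For contrast with the flex/yacc approach, a complete precedence-climbing expression evaluator fits in a few lines. This is a generic illustration of the technique, not code from the kit:

```python
import re

# Minimal precedence climbing over +, -, *, / and integers.
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def tokenize(s):
    return re.findall(r"\d+|[-+*/()]", s)

def parse(tokens, min_prec=1):
    tok = tokens.pop(0)
    if tok == "(":
        lhs = parse(tokens, 1)
        tokens.pop(0)            # consume ')'
    else:
        lhs = int(tok)
    # Climb: keep absorbing operators of sufficient precedence.
    while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
        op = tokens.pop(0)
        rhs = parse(tokens, PREC[op] + 1)   # +1 makes ops left-associative
        lhs = {"+": lhs + rhs, "-": lhs - rhs,
               "*": lhs * rhs, "/": lhs // rhs}[op]
    return lhs

print(parse(tokenize("2+3*4")))      # -> 14
print(parse(tokenize("(2+3)*4")))    # -> 20
```

No grammar file, no generated tables: the precedence table and one recursive function replace the whole lex/yacc toolchain for expressions, which is why the technique became the norm.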
barfiure 1 days ago [-]
I’m still making my way through the MINIX book. Love it.
AlexeyBrin 1 days ago [-]
Are you working through the 1st or 2nd edition of the book? I think these are the ones that used ACK.
rwmj 24 hours ago [-]
I remember after I read the 1st edition, bought MINIX ($150 !!), and then was very annoyed to find that the compiler source was not included. Luckily it was '89 or '90 and GCC sources were available.
barfiure 23 hours ago [-]
I’m working through the third edition which I believe is also ACK based as far as I can tell. MINIX version 3.1.0?
The surprise comes when you try to compile the minimal book version and find out that it is not as lean as presented in the book but actually depends on hundreds of assembler files (see https://github.com/rochus-keller/Minix3/tree/Minix3_Book_TCC).
barfiure 19 hours ago [-]
I’m a tad confused so maybe I’m not understanding the horror show.
Tanenbaum explicitly mentions multiple times that the book is a subset of the code because it would be too long to print with the library. So he covers mostly the main areas.
But the source code, in its entirety, is mounted under /usr/src. And it has all the assembly files in ACK files, mostly in lib I believe. You can compile it with a make command and it works as expected.
The author makes it seem like there’s some terrible thing. Am I missing some gory directory? Yes the ACK syntax would need to be ported over to something more modern like NASM or FASM if someone wants to move the whole kitchen sink, new linker scripts made as a result of exported symbols etc. It is painful but alas, so is the archaic K&R C.
I don’t know if that’s necessary though? It sounds like a waste of time to begin with.
I mean this book is ancient, and nobody really uses 32-bit protected mode. I’m mostly doing it out of curiosity even though I already stood up a small 64-bit long mode thinger.
Let me know what I’m missing!
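For a sense of what such a syntax port involves: much of it is mechanical mnemonic and operand-order rewriting that a script can rough out. As an analogy only, here is a crude AT&T-to-NASM style rewriter; ACK's own assembler syntax has its own quirks and would need its own rules:

```python
import re

def att_to_nasm(line: str) -> str:
    """Very rough AT&T -> NASM rewrite: drop the size suffix and
    the %/$ sigils, swap operand order. Illustrative only; a real
    assembler conversion (ACK, GAS, NASM) has many more cases."""
    m = re.match(r"\s*(\w+?)([bwl]?)\s+(\S+),\s*(\S+)", line)
    if not m:
        return line.strip()       # no two-operand pattern: pass through
    op, _size, src, dst = m.groups()
    clean = lambda s: s.lstrip("%$")
    return f"{op} {clean(dst)}, {clean(src)}"   # NASM: destination first

print(att_to_nasm("movl $1, %eax"))    # -> mov eax, 1
print(att_to_nasm("addl %ebx, %eax"))  # -> add eax, ebx
```

The tedious part, as noted above, is everything a regex can't see: directives, linker scripts, and symbol export conventions.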
Rochus 18 hours ago [-]
The author writes in the book explicitly "This is a modified version of config.h for compiling a small Minix system with only the options described in the text". This leaves no doubt that the book indeed describes a working microkernel of less than 15 kSLOC which can be built and run (even if the "small Minix" lacks a few features). I believed the author (like generations of other scholars) until I tried to actually build and run it myself.
phicoh 4 hours ago [-]
Converting between ACK and GCC assembler is a solved problem. Minix-vmd can be compiled with both ACK and GCC.
Unfortunately, when MINIX3 was started, it was copied directly from MINIX2 and a lot of interesting stuff was left out.
Rochus 2 hours ago [-]
> Converting between ACK and GCC assembler is a solved problem.
I assume you mean because the assembler was manually migrated in later Minix versions, not because there is a tool which can do so automatically. Or did I miss this?
> and a lot of interesting stuff was left out
Can you please give examples of what you mean specifically?
phicoh 2 hours ago [-]
Yes, there is a tool called asmconv that converts between the two syntaxes.
One example is the new compiler driver that can use ACK and GCC from a single compiler driver and that can automatically convert assembler and figure out which archiver to use to create libraries.
Another example is library support for filenames longer than 14 characters that was completely transparent. MINIX3 just broke all backward compatibility by increasing the size of directory entries to a new fixed size.
I'm sure there is more stuff, these are just a few I remember.
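The 14-character limit mentioned above comes from the classic Minix filesystem's fixed 16-byte directory entry: a 2-byte inode number plus a 14-byte NUL-padded name. Enlarging entries to a new fixed size is what broke the old on-disk layout. A sketch of the classic layout (based on the widely documented Minix v1 format, not on MINIX3 sources):

```python
import struct

# Classic Minix v1 directory entry: 16 bytes on disk.
#   uint16 inode number (little-endian) + 14-byte NUL-padded name.
ENTRY = struct.Struct("<H14s")

def pack_entry(ino: int, name: str) -> bytes:
    if len(name) > 14:
        raise ValueError("name longer than 14 chars needs a new layout")
    return ENTRY.pack(ino, name.encode("ascii"))  # struct NUL-pads '14s'

def unpack_entry(raw: bytes):
    ino, name = ENTRY.unpack(raw)
    return ino, name.rstrip(b"\0").decode("ascii")

e = pack_entry(2, "passwd")
print(len(e))            # -> 16
print(unpack_entry(e))   # -> (2, 'passwd')
```

Because every entry is exactly 16 bytes, any change to the entry size changes where every subsequent entry sits on disk, which is why a transparent library-level workaround was the compatible route.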
ramon156 1 days ago [-]
Looks cool, last post in 2022 though? Is it feature complete?
tl;dr: A kit for targeting several old or old-ish platforms, with code in some languages popular in the 1980s: C89 (ANSI C), Pascal, Modula-2, Basic. A 'kit' here means: frontend, codegen, support libraries and some tools. It is apparently known as the default toolchain for Minix 1 and 2.
But - the repository is not "everything you need"; it actually relies on a lot from an existing platform - GCC, Lua, Make, Python etc. So, you would typically use this to cross-compile it seems.
tgv 1 days ago [-]
It doesn't rely on gcc. Any C compiler will do. The rest is there to build it on "Linux, OSX, and Windows using MSYS2 and mingw32". Indeed for cross-compilation, as it won't run on CP/M.
consp 1 days ago [-]
> apparently known as being the default toolchain for Minix 1 and 2.
That is not very surprising, since Tanenbaum is a professor there and co-wrote the ACK and wrote Minix.
phicoh 1 days ago [-]
ACK used to be self-hosting. Of course, standard Unix utilities like sh and make are required. I still use one of those versions.
[1] https://www.gnu.org/gnu/thegnuproject.en.html
" Shortly before beginning the GNU Project, I heard about the Free University Compiler Kit, also known as VUCK. (The Dutch word for “free” is written with a v.) This was a compiler designed to handle multiple languages, including C and Pascal, and to support multiple target machines. I wrote to its author asking if GNU could use it.
He responded derisively, stating that the university was free but the compiler was not. I therefore decided that my first program for the GNU Project would be a multilanguage, multiplatform compiler."
And not only was the university 'free' and the compiler not, neither was 'Minix', which was put out there through Prentice Hall in a series of books that you had to pay a fairly ridiculous amount of money for if you were a student there.
So the VU had the two main components of the free software world in their hand and botched them both because of simple greed.
I love it how RMS has both these quotes in the same text:
"Please don't fall into the practice of calling the whole system “Linux,” since that means attributing our work to someone else. Please give us equal mention."
"This makes it difficult to write free drivers so that Linux and XFree86 can support new hardware."
And there are only a few lines between those quotes.
This explains the final 2 sentences of the original Linux announcement:
> PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.
The book publisher is blamed for preventing Minix from being freely distributed: https://en.wikipedia.org/wiki/Minix#Licensing
https://en.wikipedia.org/wiki/Vrije_Universiteit_Amsterdam
https://compilers.iecc.com/comparch/article/92-04-041
UniPress made a PostScript back-end for ACK that they marketed with the NeWS version Emacs, whose slogan was "C for yourself: PostScript for NeWS!"
https://news.ycombinator.com/item?id=42838736
https://donhopkins.com/home/archive/NeWS/NeScheme.txt
ThreadX.
https://en.wikipedia.org/wiki/ThreadX#Products_using_ThreadX
This RTOS, later rebranded Microsoft Azure RTOS, later still made FOSS:
https://www.theregister.com/2023/11/28/microsoft_opens_sourc...
https://librerpi.github.io/
2025: https://news.ycombinator.com/item?id=42833638
2020: https://news.ycombinator.com/item?id=22310987 and https://news.ycombinator.com/item?id=22612420