Windows NT/OS2 Design Workbook (computernewb.com)
kh9000 21 hours ago [-]
The section coding.pdf has their code style guidelines, colloquially known as Cutler Normal Form, CNF for short. I'm conflicted on it. Definitely overly verbose, but you can't argue with the results of the NT team. Such a rigid style guide almost feels like the technical version of a dress code. And there's an idea called "enclothed cognition" which is like, if you wear a business suit to work, it exerts a subconscious influence that results in you taking the work more seriously, focusing your attention, etc: https://en.wikipedia.org/wiki/Enclothed_cognition
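
(If you've never seen it, here is a rough sketch of what that style looks like, modeled on publicly visible NT/WDK driver samples rather than on coding.pdf itself; the routine, its types, and the status values are made up for illustration:)

    #include <stddef.h>
    #include <stdio.h>

    typedef long NTSTATUS;                                  /* illustrative stand-in */
    #define STATUS_SUCCESS           ((NTSTATUS)0x00000000L)
    #define STATUS_INVALID_PARAMETER ((NTSTATUS)0xC000000DL)

    NTSTATUS
    WbSumValues (
        const long *Values,
        size_t Count,
        long *Sum
        )

    /*++

    Routine Description:

        This routine sums an array of values.  The block-comment layout
        ("Routine Description" / "Arguments" / "Return Value") and the
        one-parameter-per-line declarations are the hallmarks of the style.

    Arguments:

        Values - Supplies the array of values to be summed.

        Count - Supplies the number of elements in Values.

        Sum - Receives the computed total.

    Return Value:

        STATUS_SUCCESS, or STATUS_INVALID_PARAMETER for bad arguments.

    --*/

    {
        size_t Index;
        long Total;

        if (Values == NULL || Sum == NULL) {
            return STATUS_INVALID_PARAMETER;
        }

        Total = 0;
        for (Index = 0; Index < Count; Index += 1) {
            Total += Values[Index];
        }

        *Sum = Total;
        return STATUS_SUCCESS;
    }

    /* Small driver, just to make the sketch self-contained. */
    int main(void)
    {
        long values[3] = { 1, 2, 3 };
        long sum = 0;
        if (WbSumValues(values, 3, &sum) == STATUS_SUCCESS) {
            printf("sum = %ld\n", sum);
        }
        return 0;
    }
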
bombcar 18 hours ago [-]
It's also important to remember that a ton of things we take for granted now simply didn't exist (source code control was in its infancy, merging was shit, syntax highlighting was minimal at best, compiling took time, etc).
jen20 10 hours ago [-]
Source control was in its infancy compared to today, but still 15 years old (SCCS) when Windows NT development started!
anthk 10 hours ago [-]
Not with Borland products. Even XEmacs and Emacs had these features (source control was with CVS or something close).
bombcar 5 hours ago [-]
CVS and RCS and friends were infants; barely more than copying directories or zip files around.

Complex merging as we're used to with git was unheard of.

nxobject 14 hours ago [-]
At least there was this...

> Note that the NT OS/2 system does not use the Hungarian naming convention used in some of the other Microsoft products.

markus_zhang 20 hours ago [-]
I agree that in general people are influenced by their perception of themselves. For example I always pretend I'm a kernel programmer when hacking on small kernels. This did lend me a lot of patience to debug and figure things out, which I do not have for my work.
jdboyd 4 days ago [-]
For those unclear on what the large pile of .doc and .pdf files are, it appears to be some revision of the design documents for "NT OS/2", which then turned into just NT. This appears to be the Smithsonian description of their physical copy: https://www.si.edu/object/microsoft-windows-nt-os2-design-wo...
markus_zhang 1 days ago [-]
Yeah I think this is the original design doc for NT.
Lammy 24 hours ago [-]
My favorite part of NT is the Local Procedure Call (now obsoleted by ALPC): https://computernewb.com/~lily/files/Documents/NTDesignWorkb...

Very cool to be able to read the original design instead of just reverse-engineered ones. Thanks for posting!

cpeterso 1 days ago [-]
> The first release of NT is planned as a workstation product that will provide a strong competitor to UN*X based workstations.

UN*X spelling for trademark reasons or a joke that UNIX is verboten at Microsoft?

kryptiskt 24 hours ago [-]
That was back when there was "real" UNIX around, as well as a number of clones, including Microsoft's own Xenix (maybe they had offloaded that to SCO by then). So UN*X was one way to indicate that it meant UNIX-like OSes.
kryptiskt 24 hours ago [-]
Turns out SCO bought Xenix in 1987, but Microsoft was just a couple of years removed from being the biggest Unix vendor around at this point.
Angostura 23 hours ago [-]
I guess that's probably Apple now.
skissane 21 hours ago [-]
Or IBM?

z/OS is officially a Unix

s1gsegv 16 hours ago [-]
There are an awful lot of iOS and macOS devices out there
selkin 14 hours ago [-]
macOS is a certified UNIX[0], but iOS isn’t.

[0] https://www.opengroup.org/openbrand/register/

rbanffy 12 hours ago [-]
I still think Macs outnumber IBM mainframe and POWER machines by a couple orders of magnitude.
skissane 8 hours ago [-]
By unit count, no question.

By revenue? Genuinely uncertain.

fredoralive 1 days ago [-]
IIRC Microsoft's internal email still ran on Xenix at the time (until Exchange betas got good enough for internal use c. 1995?), so it was perhaps more about trademarks than some sort of absolute hatred of Unix. Also note that one of the two APIs that NT OS/2 was initially going to support was POSIX, albeit perhaps more because the US government wanted it than out of a true love of UNIX. Although the design rationale document (ntdesrtl) does lament that existing POSIX test suites tend to also test "...UNIX folklore that happens to be permissible under an interpretation of the POSIX spec".
nucleative 19 hours ago [-]
Did Microsoft never run Microsoft Mail internally?

It was an email system that ran on top of the file system. If I recall, mail clients connected over a networked drive to access the mailboxes. So it was never regarded as being very scalable.

canucker2016 15 hours ago [-]
Yes, MS Mail for PC Networks used a shared file system for email.

The Workgroup Apps (WGA) division ran MS Mail for PC Networks since they produced MS Mail. Gotta dogfood your product. The WGA email system used a Xenix gateway to connect with the rest of Microsoft.

The rest of Microsoft ran MS Mail for Windows with a Xenix email backend and address book, since MS was already using Xenix before MS Mail for PC Networks existed.

Windows for Workgroups 3.11 contained a one-postoffice version of MS Mail (which could be upgraded to the full version).

Some more Microsoft email-related history at https://en.wikipedia.org/wiki/History_of_Microsoft_Exchange_...

AshamedCaptain 1 days ago [-]
It is a generic way to refer to Unix and Unix-like systems. It is still in use today, e.g. to indicate Linux as part of the set. For this document it most likely refers to Xenix (MS's Unix).
moron4hire 19 hours ago [-]
Eunuchs is already a plural word.
vikingerik 21 hours ago [-]
As others are saying, the * is meant as a wildcard, not as censorship. It's meant to also cover the likes of Linux or Xenix etc, although there isn't actually any other name that would strictly fit the pattern of "UN*X".
cmiles74 1 days ago [-]
I think trademark, I remember Bill Gates referring to Windows NT as "a better UNIX than UNIX".
p_ing 1 days ago [-]
anthk 12 hours ago [-]
That's Plan 9 and 9front. I'd say NT is a better VMS: VMS++ in the alphabet ;), as you get V->W, M->N, S->T, so "WNT".
Eisenstein 18 hours ago [-]
I particularly like this bit in ntdesrtl.pdf:

"6.1 OS/2 Standards

Our initial OS/2 API set centers around the evolving 32-bit Cruiser, or OS/2 2.0 API set.

(The design of Cruiser APIs is being done in parallel with the NT OS/2 design.) In some respects, this standard is harder to deal with than the POSIX standards. OS/2 is tied to the Intel x86 architecture and these dependencies show up in a number of APIs. Given the nature of OS/2 design (the joint development agreement), we have had little success in influencing the design of the 2.0 APIs so that they are portable and reasonable to implement on non-x86 systems. In addition, the issue of binary compatibility with OS/2 arises when the system is back-ported to an 80386 platform.

This may involve 16-bit as well as 32-bit binary compatibility."

Very "professional" coded writing that expresses a frustration with the need to collaborate with IBM that could have been more succinctly written if they had the option to use a few choice four letter words.

JdeBP 4 hours ago [-]
And also not entirely correct. The 16-bit 1.x API was definitely x86-specific. But the 32-bit 2.x API was not, as evidenced by OS/2 for PowerPC actually existing at one point.
markus_zhang 5 days ago [-]
I think this is the only place that hosts a (hopefully) full electronic copy of the book.
anthk 12 hours ago [-]
The layout is close to /sys/doc for 9front/Plan 9, with guides for the Acme and Sam editors (structural regular expressions get both a tutorial and a reference), the Rc shell, the kernel, the ACID debugger, the file systems (several: fossil/venti, cwfs and now GeFS), security, the compilers, the Rio window system, namespaces, processes and channels, plumbing (something like xdg-open but far easier and more powerful), ... much more than a closed-source and proprietary NT implementation, as you get the source for free too, under libre licenses. That makes debugging the system a far easier task. And it's far simpler than Unix itself too.
hulitu 8 hours ago [-]
Windows NT had a design? It all looks like a bunch of stuff thrown at a wall in the hope that it will stick.
markus_zhang 7 hours ago [-]
The core NT team came from DEC and was considered the "adults" in the whole organization. It was definitely much more organized than DOS/Win 3.x back then.
jaen 5 hours ago [-]
I'm very curious about what you consider a real "design" then?!

The Windows NT kernel had a better design than most contemporary and succeeding operating systems (including the various Unixes, Linux, BSD, plan9 etc.).

Modern kernel designs like Google's Fuchsia have more in common with NT than POSIX/Unix/Linux.

In particular, the NT object manager approach subsumes the Unix "everything is a file, well, not quite, oh... uhh.. let us slowly fix that by adding stuff like signalfd, memfd, pidfd etc. ahh hmm, these still do not exactly fit into a FS mold... ah crap, still missing a proper ptrace FS analogue" design approach that eg. Linux has taken in the last two decades.

It also had powerful primitives like eg. APCs that could be used to build higher-level kernel features elegantly.
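
(A minimal Win32-level sketch, mine rather than the workbook's, of what the object-manager model buys you: very different kernel object types all come back as handles, so one generic wait primitive covers them all, where Unix later needed eventfd/pidfd/etc. to get comparable descriptors:)

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* An event object and a process object: unrelated types, both HANDLEs. */
        HANDLE ev = CreateEventW(NULL, TRUE, FALSE, NULL);

        STARTUPINFOW si = { sizeof(si) };
        PROCESS_INFORMATION pi;
        if (ev != NULL &&
            CreateProcessW(L"C:\\Windows\\System32\\cmd.exe", NULL, NULL, NULL,
                           FALSE, CREATE_NO_WINDOW, NULL, NULL, &si, &pi))
        {
            /* One generic wait works across object types. */
            HANDLE objects[2] = { ev, pi.hProcess };
            DWORD r = WaitForMultipleObjects(2, objects, FALSE, 5000);
            printf("wait returned %lu\n", (unsigned long)r);

            CloseHandle(pi.hThread);
            CloseHandle(pi.hProcess);
        }
        if (ev != NULL) CloseHandle(ev);
        return 0;
    }
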

phendrenad2 1 days ago [-]
This is giving me flashbacks to the times when I had to implement systems based on big, verbose specification documents like this. Horrible.

If you really want to understand Windows, skip this and check out Windows 2000 Internals by Solomon and Russinovich (Win2k is a good middle-ground where Windows had matured a bit).

delta_p_delta_x 24 hours ago [-]
For more modern Windows, Windows Internals Parts 1 and 2[1][2], and Windows 10 System Programming Parts 1 and 2[3]. Basically anything by Yosifovich, Russinovich, Ionescu et al.

[1]: https://www.microsoftpressstore.com/store/windows-internals-...

[2]: https://www.microsoftpressstore.com/store/windows-internals-...

[3]: https://leanpub.com/b/windows10systemprogrammingP1and2

markus_zhang 1 days ago [-]
Yeah, this is definitely a hell of a read, which probably has more historical value than practical value. The book "Inside Windows NT" also gives a good overview of Windows NT 3.1. There is another one called "Understanding Windows NT File System" which talks about the file system, I think.

The biggest problem is that NT is not open source, and while there are leaked copies posted online, there is no "official" build guide, so people have to try their luck.

RattlesnakeJake 23 hours ago [-]
It's giving me flashbacks to a few months ago.

(*cries in X12 270/271*)

themafia 1 days ago [-]
So.. you had _all_ this.. and for some reason just didn't want to turn it into a useful set of "man" pages in your OS?

If they had their eye on the actual ball they wouldn't need to write Halloween memos and rant about developers on stage.

pdw 2 hours ago [-]
In the early 90s Microsoft distributed the full Win32 API documentation as a WinHelp file. It felt very much like hyperlinked man pages. Super fast too, even on machines of the time. WIN32.HLP can still be easily found, but modern Windows versions no longer ship WinHelp :(
fredoralive 1 days ago [-]
This mostly describes stuff to do with the [Windows] NT [OS][/2] (delete as appropriate) kernel layer, which mere mortals aren't supposed to interact with. You're supposed to use stuff like the Win32 KERNEL32.DLL rather than the more direct NTDLL.DLL. Of course, true hackers scorn such abstractions.
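
(The layering is visible even from user mode. A hedged sketch, assuming the Windows SDK and a link against ntdll.lib, and not something Microsoft encourages:)

    /* cl demo.c ntdll.lib */
    #include <windows.h>
    #include <winternl.h>
    #include <stdio.h>

    int main(void)
    {
        HANDLE e1 = CreateEventW(NULL, FALSE, FALSE, NULL);
        HANDLE e2 = CreateEventW(NULL, FALSE, FALSE, NULL);

        /* The blessed route: the documented Win32 call in KERNEL32. */
        CloseHandle(e1);

        /* The direct route: the native NTDLL export underneath it.
           It works, but now your contract is with the NT API rather
           than with documented Win32. */
        LONG status = NtClose(e2);   /* NTSTATUS */
        printf("NtClose returned 0x%08lx\n", (unsigned long)status);
        return 0;
    }
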
markus_zhang 24 hours ago [-]
I think Windows and DOS do have good documentation. I actually think they had way better docs than Linux at the time. But I could be wrong.

For reference:

https://jacobfilipp.com/msj-index/

And also MSDN.

bombcar 22 hours ago [-]
Windows programming guides provided by Microsoft were simply amazing for the time; the documentation available was excellent.

Part of the reason they did so well was that companies could easily implement software using the new APIs.

Of course, they also had secret and undocumented APIs that people found and wanted to use ...

Borg3 9 hours ago [-]
This, and great backward compatibility. I can still make an app targeting Win2000 and it will run on Win2000 onwards (Win10 and Win11 included). Unfortunately, it's starting to fall apart...
markus_zhang 8 hours ago [-]
I guess they finally think they captured enough "value" with Windows, so there is no need to keep every subsystem maintained. It must be very expensive to keep a 20+ year veteran developer sitting in a basement room writing code for some feature that does not generate much revenue. Sad truth. TBH I'd love to learn those subsystems and do it for free.
themafia 22 hours ago [-]
Was MSDN always free? I remember their access prices for the services in the late 90s and early 00s were eyewatering.
rerdavies 5 hours ago [-]
Eye-watering perhaps, but not unreasonable for what you got. Although the top-tier Enterprise editions truly were eye-wateringly expensive, and provided nothing compellingly useful that wasn't available in entry-level or mid-tier MSDN subscriptions.

My preference was for a mid-tier subscription that provided a complete set of licensed-for-development-use Windows Server, and Microsoft Office and SQL Server ISOs, for ~$495/year.

Of course, you had no way of knowing that the truly eye-watering top-tier $2000/yr MSDN subscriptions were jam-packed with junky "Enterprise" tools that you would never actually want or need, without actually purchasing one. Nor would you know that the priority support credits bundled with top-tier MSDN support did not materially improve upon the "shouting into the void" nature of unpaid support, which in turn was not materially better than the No Support At All for U option that we currently have. Although my one experience with a paid priority support case did actually produce an actual fix three years later. So there is that, I suppose. :-/

saratogacx 16 hours ago [-]
If I recall, MSDN was super expensive but that is because it included non-commercial licenses for just about everything Microsoft shipped as well as getting hard copies of the documentation and a bunch of other stuff.

If you just wanted to do C++ Windows programming you could get Visual Studio which, I believe, did come with Win32 documentation (especially as CD-ROMs became a common distribution method).

The C++ software development kit itself (just libraries, documentation, and samples, no tooling) wasn't too expensive; it was mainly material costs.

markus_zhang 10 hours ago [-]
I always drool over those paper documents. Imagining myself getting into some underground facility due to WW3 and hacking on some old computers reading those manuals is one of my comforts.
markus_zhang 9 hours ago [-]
Eh, just found this by a random search:

https://download.microsoft.com/download/1/6/d/16d24ada-5317-...

5,000+ pages...I think this is the FULL documentation for everything related to VB 2005.

wvenable 14 hours ago [-]
When I was a student, they gave away this stuff like candy.
markus_zhang 20 hours ago [-]
I think it was included with the development tools, like VS Professional. So maybe not free until much later.
delta_p_delta_x 24 hours ago [-]
I have a hot take: man pages are really bad for noob examples, or if you actually want to learn how to use something. They are great references if you already know 95% of the tool, but for the most common use cases, they completely lack any sort of examples.

In this sense, LLMs (as much as I am sceptical about them) are much more useful.

asveikau 21 hours ago [-]
This depends a lot on what manpage you're looking at.

When I learned C more than 20 years ago, I found libc manpages a pretty good way to learn. For many functions in section 3, you can read the manpage and make an intelligent guess on how it's implemented, and write your own implementation. I did this as an exercise back in the day.
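
(For instance, and this is just an illustration rather than the exercise asveikau describes, something like strdup(3) can be rebuilt from nothing more than its man page description: duplicate the string into malloc'd memory, return NULL on failure:)

    #include <stdlib.h>
    #include <string.h>
    #include <stdio.h>

    /* strdup(3), reimplemented from the man page description alone. */
    static char *my_strdup(const char *s)
    {
        size_t len = strlen(s) + 1;      /* include the terminating NUL */
        char *copy = malloc(len);
        if (copy == NULL)
            return NULL;                 /* man page: NULL when out of memory */
        memcpy(copy, s, len);
        return copy;
    }

    int main(void)
    {
        char *p = my_strdup("hello, section 3");
        if (p != NULL) {
            puts(p);
            free(p);
        }
        return 0;
    }
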

markus_zhang 23 hours ago [-]
Yeah I agree, TBF I rarely found man pages to be useful to me, while so far LLMs are pretty good at bash scripting, at least at the level that I need. But of course I still want to learn this stuff in depth.
p_ing 1 days ago [-]
Microsoft's eye wasn't on open sourcing their OS and describing the deep internals. They still don't want you to develop against the NT API, even though developers certainly do (and Microsoft makes compatibility shims for applications which do, when required).
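
(One well-worn illustration, not necessarily what the parent had in mind: GetVersionEx is subject to the application-compatibility machinery and may report an older Windows, while RtlGetVersion, resolved straight out of ntdll, reports the real one. A hedged sketch; the function-pointer typedef is an assumption, since the export isn't in the Win32 headers:)

    #include <windows.h>
    #include <stdio.h>

    /* Assumed typedef for the ntdll export; not part of the Win32 headers. */
    typedef LONG (WINAPI *RtlGetVersionFn)(OSVERSIONINFOW *);

    int main(void)
    {
        HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
        if (ntdll == NULL) return 1;

        RtlGetVersionFn pRtlGetVersion =
            (RtlGetVersionFn)GetProcAddress(ntdll, "RtlGetVersion");
        if (pRtlGetVersion != NULL) {
            OSVERSIONINFOW ver = { sizeof(ver) };
            if (pRtlGetVersion(&ver) == 0) {   /* STATUS_SUCCESS */
                printf("real version (ntdll): %lu.%lu build %lu\n",
                       ver.dwMajorVersion, ver.dwMinorVersion,
                       ver.dwBuildNumber);
            }
        }
        return 0;
    }
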
torginus 12 hours ago [-]
Tbh, I think Windows' stable ABI/API at all levels + excellent docs is something the open source community could learn from.

Software that is poorly documented, has no stable API, and has no way of accepting extensions other than modifying the source is not very 'free', in the sense that if I have a good idea, I'll have crazy amounts of trouble getting the change to users even if I manage to decipher and modify the source appropriately.

This approach describes many software projects. If I wanted to change GCC in a way the original authors didn't approve of, I'd have to fork the entire project and lobby the maintainers to include my versions.

If I wanted to change most MS stuff, like Office, I could grab the COM object out of a registry, and override it with mine, just to list one example. Then I could just put the binary on my website, and others would be able to use it.
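
(A hedged sketch of the mechanism, with a made-up CLSID and DLL path: per-user entries under HKCU\Software\Classes are merged into HKEY_CLASSES_ROOT ahead of the machine-wide ones, so a non-elevated client that CoCreateInstance's this CLSID would load the replacement DLL:)

    #include <windows.h>
    #include <wchar.h>
    #include <stdio.h>

    int main(void)
    {
        /* Placeholder CLSID and DLL path -- not a real Office coclass. */
        const wchar_t *key =
            L"Software\\Classes\\CLSID\\{00000000-1111-2222-3333-444444444444}"
            L"\\InprocServer32";
        const wchar_t *dll = L"C:\\override\\my_replacement.dll";

        HKEY hkey;
        LONG rc = RegCreateKeyExW(HKEY_CURRENT_USER, key, 0, NULL, 0,
                                  KEY_SET_VALUE, NULL, &hkey, NULL);
        if (rc != ERROR_SUCCESS) {
            fprintf(stderr, "RegCreateKeyExW failed: %ld\n", rc);
            return 1;
        }

        /* The default value of InprocServer32 is the path COM loads. */
        rc = RegSetValueExW(hkey, NULL, 0, REG_SZ, (const BYTE *)dll,
                            (DWORD)((wcslen(dll) + 1) * sizeof(wchar_t)));
        RegCloseKey(hkey);
        return rc == ERROR_SUCCESS ? 0 : 1;
    }
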

As for MS not publishing these particular docs - it's not like MS doesn't publish tomes on everything, including the kernel. If you're interested in Windows internals, I recommend the Windows Internals book series by Pavel Yosifovich, or really anything by him or Mark Russinovich.

It's also a testament to the solidity of Windows' design that tons of stuff written decades ago is still relevant today.

lproven 9 hours ago [-]
> If I wanted to change most MS stuff, like Office, I could grab the COM object out of a registry, and override it with mine

This goes back a very long time -- at least to the Windows 3.0 timeframe.

The IBM-only 32-bit OS/2 2.0 came out around the same time as Windows 3.1.

OS/2 2 could run Windows 3 inside what was effectively a VM (a decade before true VMs came to the x86 platform), running the Microsoft code on top of OS/2's built-in DOS emulator.

I remember an IBM person objecting to a journalist saying "so you have access to the Windows source code, and you patch it to run under OS/2?"

Reportedly, the IBM engineer looked a bit pained and said "we don't patch it -- we superclass it in memory to make it a good citizen, well-behaved inside OS/2's protected mode."

(It is over 30Y ago so forgive me if I am not verbatim.)

This was subsequently demonstrated to be true rather than a marketing claim. OS/2 2.0 and 2.1 included a "WinOS2" environment. OS/2 Warp 3 made this an option: it was sold in 2 versions, one with a blue box which contained a Windows 3.1 environment, and one with a red box which did not contain WinOS2 but could be installed on top of an existing Windows 3.1 system and then took over the entire Windows environment, complete with all your installed apps, and ran that inside OS/2.

So you kept all your installed 16-bit apps and settings but got a new 32-bit OS with full memory protection and pre-emptive multitasking as well.

Bear in mind that Windows had no activation mechanism then, so you could copy a complete Windows 3.x installation onto a new PC, change some drivers, and it just worked without complaint.

So you could buy a new high-end 486 PC, copy Windows off your old 386, install OS/2 Warp over the top and have a whole new OS with all your apps and their files still running.

This was amazingly radical stuff in the first half of the 1990s.

markus_zhang 8 hours ago [-]
They invested a huge amount of resources to make sure NT is backward compatible, over David Cutler's objections. There is an NTVDM extended with WOW for 16-bit Windows, AFAIK. I have a copy of the leaked source code of NT 3.5 but I'm not good enough to understand the code. Also, modern emulators such as DOSBox probably do a better job emulating 16-bit stuff.