This was so much fun to read! Very neat solutions using spherical coordinates and logarithms.
How did you get the actual idea to do this in the first place?
otterstack 3 hours ago [-]
Thanks!
It's fuzzy, but I think it was because I was learning GB assembly while working on shaders in Houdini or something (I'm a tech artist). The two worlds collided in my head, saw that there's no native multiplication on the GB, and figured it'd be a fun problem.
I'm also looking into simplifying it a bit more with environment maps, which I shared on my Bsky: https://bsky.app/profile/dannyspencer.bsky.social/post/3mecu...
Agentlien 60 minutes ago [-]
As someone with a similar background (graphics programmer) it sure seems like a fun problem.
I honestly found the lack of multiplication instruction quite surprising. I did not know that!
glouwbug 1 day ago [-]
It’s nice getting real hacker material on hackernews
andix 1 day ago [-]
It wasn't just a prompt to an AI? How did they do it? ;)
FeteCommuniste 24 hours ago [-]
The lost, dark art of using one's brain to implement something line by line.
jama211 10 hours ago [-]
I mean it’s cool and all but it’s like making a painting entirely out of tiny dots with your hands tied behind your back. I’m happy for their achievement and it looks cool but it shouldn’t throw any shade on those of us who just like to use a paint brush instead.
jama211 10 hours ago [-]
Genuinely curious, what’s your goal here? Disparage those who use LLMs? Or just express your unhappiness at the amount of ai content on the HN front page? Or just want to throw shade on LLM use in general?
This is impressive and cool but I don’t understand the bitterness here.
andix 10 hours ago [-]
Sarcasm, in response to HN being mostly about AI now.
I’m very interested in the AI content, but it’s also a bit sad how much it became the main topic.
zamadatix 8 hours ago [-]
I've only been here 8 years, but it seems like in any given era there has always been some such topic sucking the air out of the room.
This inevitably results in even the completely unrelated topics constantly becoming a reference to that conversation.
That has its own wake of someone discussing how it's brought into every conversation by those who either love or hate it, further making it suck even more air out of the room.
At this point the ink catches up with itself, while folks like Danny Spencer occasionally deliver us the quick doomscrolling hit we were all really here for.
andix 7 hours ago [-]
Just admit that you don't understand sarcasm.
jama211 4 hours ago [-]
Aww man you were doing so well
zamadatix 5 hours ago [-]
? I'm not any of the previous people talking about why you might have commented. I'm replying to your note above, about sarcastic comments bringing AI into a post that wasn't previously about AI. That said, sure - I'm probably not the best sarcasm detector myself anyhow :).
I.e. AI is such the main topic here that we still have some type of comment (sarcastic or not) bringing it up in the few posts unrelated to it. It's truly sadly inescapable on more than one level, as will be whatever the next hot topic is in a few years.
andix 4 hours ago [-]
Sorry
zamadatix 2 hours ago [-]
Hey no worries, sorry I was unclear! Have a good one.
jama211 4 hours ago [-]
Ah, so number 2. Thanks for answering!
speps 1 day ago [-]
Awesome looking results. As far as I understand it's a "3D" shader in the sense that it looks 3D but it's a prerendered 2D normal map which is then lit using the resulting world space normal.
Here are the frames: https://github.com/nukep/gbshader/tree/main/sequences/gbspin...
It's not that different from "real 3d" renderers. Especially in deferred rendering pipelines, the rasterizer creates a bunch of buffers for the depth map, normal map, color, etc., but the main shaders run on those 2d buffers. That's the beauty of it: the parts operating on 3d triangles are kept simple, and the expensive lighting shaders run once on flat 2d images with zero overdraw. The shaders don't care whether the normal map buffer came from 3d geometry that was rasterized just now, prerendered some time ago, or a mix of the two. And even in forward rendering pipelines, the fragment shader is operating on implicit 2d pixels created by the vertex shaders and rasterizer from "real 3d" data.
The way I look at it, if the input and the math in the shader are working with 3d vectors, it's a 3d shader. Whether there is also a 3d rasterizer is a separate question.
Modern 3d games exploit this in many different ways. Prerendering a 3D model from multiple views might sound like cheating, but the use of imposters is a real technique used by proper 3d engines.
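The shading step being described here — lighting a baked 2d normal buffer — can be sketched in a few lines. This is an illustrative Python sketch of the general N·L idea, not the post's actual implementation:

```python
# Illustrative sketch: diffuse-lighting one prerendered normal-map texel.
# Not the post's actual code -- just the N.L shading idea described above.

def shade_texel(normal, light_dir):
    """Lambert shading: brightness = max(0, N . L), assuming unit vectors."""
    nx, ny, nz = normal
    lx, ly, lz = light_dir
    return max(0.0, nx * lx + ny * ly + nz * lz)

# A baked normal pointing straight at the viewer, lit head-on:
print(shade_texel((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))   # 1.0
# The same texel lit from behind goes fully dark:
print(shade_texel((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))  # 0.0
```

The shader never needs to know where the normals came from — rasterized geometry and prerendered frames are indistinguishable at this point, which is the argument above.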
araes 1 day ago [-]
There's a GBDK demo that actually does something similar (spinning 2D imposters). Does not handle the lighting though, which is quite impressive.
https://github.com/gbdk-2020/gbdk-2020/tree/develop/gbdk-lib...
Unfortunately, the 2D imposter mode has pretty significant difficulties with arbitrarily rotated 3D. The GBDK imposter rotation demo needs a 256k cart just to handle 64 rotation frames in a circle for a single object. Expanding that out to fully 3D views and rotations gets quite prohibitive.
Haven't tried downloading RGBDS to compile this yet. However, I suspect the final file is probably similar, pushing the upper limits on GB cart sizes.
anthk 16 hours ago [-]
Well, Cannon Fodder for the GBC is 1 MB, and the rest, such as Metal Gear and Alone in the Dark, are pretty sizable for the hardware.
Someone 14 hours ago [-]
It’s a shader, not a renderer. The images are pre-rendered, but the shading is done in real time.
⇒ I think they’re correct in calling this a 3D shader.
antidamage 22 hours ago [-]
It's not that different from how some creative Mac games were doing 3d lighting on 2d textures prior to 3d accelerated hardware being available. The neat part here is that it runs on a Gameboy Colour.
bulbar 16 hours ago [-]
On a device that apparently supports neither floating-point operations nor multiplication. Super cool.
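For what it's worth, the standard workaround for a missing FPU is fixed-point arithmetic: store value × 2^k in an integer and track the shift by hand. A hypothetical 8.8 sketch in Python — the post's actual formats may well differ:

```python
# Hypothetical 8.8 fixed-point sketch: integers carry value * 2^8.
FRAC = 8  # number of fractional bits

def to_fix(x):
    return round(x * (1 << FRAC))   # e.g. 0.5 -> 128

def from_fix(f):
    return f / (1 << FRAC)

def fix_mul(a, b):
    # the raw product has 2*FRAC fractional bits; shift back down
    return (a * b) >> FRAC

print(from_fix(fix_mul(to_fix(0.5), to_fix(3.0))))  # 1.5
```

All the "floating point" math then reduces to integer adds and shifts, which the hardware does have.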
wasmainiac 1 day ago [-]
> An overall failed attempt at using AI
> I attempted to use AI to try out the process, mostly because 1) the industry won't shut up about AI, and 2) I wanted a grounded opinion of it for novel projects, so I have a concrete and personal reference point when talking about it in the wild. At the end of the day, this is still a hobbyist project, so AI really isn't the point! But still...
> I believe in disclosing all attempts or actual uses of generative AI output, because I think it's unethical to deceive people about the process of your work. Not doing so undermines trust, and amounts to disinformation or plagiarism. Disclosure also invites people who have disagreements to engage with the work, which they should be able to. I'm open to feedback, btw.
Thank you for your honesty! Also tremendous project.
otterstack 1 day ago [-]
The funny thing is the phrasing used to be more neutral, but I changed the tone to be slightly more skeptical because people thought I was just glazing AI in my post. Another guy on Reddit seemed annoyed that I didn't love AI enough.
I just wanted to document the process for this type of project. shrug
ekipan 24 hours ago [-]
It seems to me that AI is mostly optimized for tricking suits into thinking they don't need people to do actual work. If I hear "you're absolutely right!" one more time my eyes might roll all the way back into my head.
Still, even though they suck at specific artifacts or copy, I've had success asking an LLM to poke for holes in my documentation. Things that need concrete examples, knowledge assumptions I didn't realize I was making, that sort of thing.
Sweet Gameboy shader!
VoidWhisperer 22 hours ago [-]
You're absolutely right! (sorry, I couldn't resist)
wileydragonfly 1 day ago [-]
Just… ignore Reddit.
jama211 10 hours ago [-]
I dunno about the need for disclosure in this way. In my working life I’ve copied a lot of code from stack overflow, or a forum or something when I’ve been stuck. I’ve understood it (or at least tried to) when implementing it, but I didn’t technically write it. It was never a problem though because everybody did this to some degree and no one would demand others disclose such a thing at least in hobby projects or low stakes professional work (obviously it’s different if you’re making like, autopilot software for a passenger plane or something mission critical, that’s notwithstanding).
If it’s the norm to use LLMs, which I honestly believe is the case now or at least very soon, why disclose the obvious? I’d do it the other way around: if you made it by hand, disclose that it was entirely handmade, without any AI or Stack Overflow or anything, and we can treat it with respect and ooh and ahh accordingly. But otherwise it’s totally reasonable to assume LLM usage; at the end of the day the developer is still responsible for the final result and how it functions, just like a company is responsible for its products even if it contracted out their development. Or how a filmmaker is responsible for how a scene looks even if they used Adobe After Effects to content-aware remove an object.
otterstack 4 hours ago [-]
I disclosed AI because I think it's important to disclose it. I also take pride in the process. Mind you, I also cite Stack Overflow answers in my code if I use one. Usually with a comment like:
// Source: https://stackoverflow.com/q/11828270
With any AI code I use, I adopted this style (at least for now):
// Note: This was generated by Claude 4.5 Sonnet (AI).
// Prompt: Do something real cool.
spacebacon 1 day ago [-]
This GBC shader reveals a key truth: all computation is approximation under constraint. Multiplication becomes table lookups plus addition, while precision yields to what the eye actually sees.
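That "table lookups plus addition" trick is concrete: store a log table and an antilog table, and a multiply becomes two lookups and one add, since log a + log b = log ab. A toy Python sketch — the table sizes and fixed-point scale here are invented for illustration, not the post's actual values:

```python
import math

# Toy multiply-via-logarithms: two table lookups plus one addition.
# SCALE and table sizes are illustrative choices, not the post's values.
SCALE = 32  # fixed-point scale for the log domain

# log table: LOG[i] ~ log2(i) * SCALE, for i in 1..255 (LOG[0] is unused)
LOG = [0] + [round(math.log2(i) * SCALE) for i in range(1, 256)]

# antilog table: EXP[i] ~ 2 ** (i / SCALE), sized for any LOG[a] + LOG[b]
EXP = [round(2 ** (i / SCALE)) for i in range(LOG[255] * 2 + 1)]

def mul_approx(a, b):
    """Approximate a*b (a, b in 0..255) without a multiply instruction."""
    if a == 0 or b == 0:
        return 0  # log(0) is undefined; handle zero explicitly
    return EXP[LOG[a] + LOG[b]]

print(mul_approx(12, 13), "vs exact", 12 * 13)
```

On real hardware these would be byte tables in ROM, and accuracy degrades as values grow — which is where "precision yields to what the eye actually sees" comes in.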
VimEscapeArtist 14 hours ago [-]
I bow before the master. Genuinely outstanding work.
Since you're already doing what's essentially demoscene-grade hacking, have you thought about putting together a short demo and entering it at a demoparty? There's a list of events at demoparty.net - this kind of thing would absolutely shine there.
Waterluvian 1 day ago [-]
I’m incredibly impressed by this, largely because it actually is running on a CGB. What I often see are hacks where the Game Boy is just being used as a terminal and the cartridge has been packed with far more powerful hardware.
giancarlostoro 1 day ago [-]
I lowkey wish Nintendo would rerelease the GBC or GBA; I would buy one. They could bake some games into a few cartridges and make it 100% worth the buy, too.
gkhartman 1 day ago [-]
You can pick a used one up for pretty cheap. Add a flash cartridge and you're done. I think the cheap android handhelds of the same form factor are a better option though.
I've still got my Gameboy collection, but rarely use it. It's just so much easier to fire up an emulator these days.
giancarlostoro 21 hours ago [-]
I still have my 90s one but would love a modern brand new one, similar to how they did the SNES Mini
vel0city 3 hours ago [-]
https://www.analogue.co/pocket
This is pricey but pretty awesome. Very well built, high quality. Hit up a local used game store and have a more modern hardware experience with legit copies of the actual games.
I went with getting a GBA SP and replacing the screen with a more modern panel. The kids love it.
VimEscapeArtist 1 day ago [-]
You can buy the ModRetro Chromatic from the Oculus VR creator. It's better than anything Nintendo could ever produce.
giancarlostoro 21 hours ago [-]
I've seen those, but I don't like the aesthetic; my GBC from the 90s is dirty but sturdy as heck despite my carelessness through 28-plus years.
wileydragonfly 1 day ago [-]
Doesn’t he use your money to blow people up or something?
VanTheBrand 21 hours ago [-]
No the US government does that. He just takes the money taxpayers gave the government to blow people up with, and unlike the other defense contractors, also indirectly finances the production of gameboys with some of the money. The idea that giving money to ModRetro finances arms is essentially backwards from how money actually flows.
vscode-rest 16 hours ago [-]
It would be quite hilarious if the game boy knockoff proceeds were funding the defense contract executions.
fwystup 21 hours ago [-]
Really interesting project, thanks for sharing. It reminded me a bit of my time coding assembly on the C64 (yeah, I'm old). For 3D (wire-frame) graphics we also needed to find creative ways around hardware limitations, especially the lack of a multiply instruction.
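For comparison with the post's log-table approach, the other classic workaround on multiply-less CPUs like the 6502 (or the GB's SM83) is the shift-and-add loop, one multiplier bit per pass — sketched here in Python rather than assembly:

```python
# Shift-and-add multiplication: what the classic assembly loop computes,
# one bit of the multiplier per pass. A Python sketch, not real 6502 code.

def mul_shift_add(a, b):
    result = 0
    while b:
        if b & 1:          # lowest multiplier bit set?
            result += a    # ...then add the shifted multiplicand
        a <<= 1            # multiplicand doubles each pass
        b >>= 1            # consume one multiplier bit
    return result

print(mul_shift_add(12, 13))  # 156
```

It's exact but costs a loop per multiply, which is why table-based tricks are attractive when ROM is cheaper than cycles.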
villgax 1 day ago [-]
This is why HN exists, almost gives me the same joy as flipping through tech magazines of yester-decades.
HeckFeck 1 day ago [-]
This is the coolest thing I've seen in months. License it as beerware, then I'm obliged to owe you one.
Sharlin 1 day ago [-]
The "Making it work" section seems to abruptly end at the following?
By modifying the instruction operand!
2A ld a, [hl+]
D6 08 sub a, 8
otterstack 1 day ago [-]
Aah, you're right. That was from my vomit draft and I forgot to tidy it up. I'll update the post soon.
Sharlin 1 day ago [-]
Thanks!
jnpnj 1 day ago [-]
Always loved using old hardware with recent understandings.
fallat 1 day ago [-]
I don't think there's anything recent here; they are just pre-computing a normal map, which doubles as already "baking in" a 3D-looking image.
jnpnj 5 hours ago [-]
Was there anything of that sort made during the GBC era on this hardware? I thought nobody had ever attempted it before.
fallat 5 hours ago [-]
Not exactly this, but many "3D games" were pre-computed scenes. The normal map is the novel bit of this demo.
jnpnj 2 hours ago [-]
Ok, and at the time, was anybody even thinking about computing normal maps this way on such hardware? That was my original thought: "maybe" this is the result of applying more recent ideas to hardware that wasn't made to support it. But maybe I'm wrong and people did try.
a_t48 1 day ago [-]
Nice, I’ll have to give this a try on my Analogue Pocket
unicorn_cowboy 22 hours ago [-]
This author is a psycho. In a good way.
ndgold 1 day ago [-]
I can’t believe it
iLoveOncall 1 day ago [-]
Isn't it a bug that when spinning the object the light also spins?
araes 1 day ago [-]
It's the equivalent of spinning the view camera around in the scene. Up / Down spins the light coordinates, Left / Right spins the camera viewpoint.
Probably could have been written that way though, since it is spinning the camera view rather than the object.
patrick4urcloud 21 hours ago [-]
nice job!