Full Spectrum and Infrared Photography (timstr.website)
lytfyre 1 day ago [-]
>Whenever shooting a subject with a mixture of visible and infrared light, it becomes readily apparent that infrared light focuses differently from visible light. For many subjects, this can mean having to choose between crisp visible contours and an odd pink glow, or blurred edges with some unusual pink features inside. Some things never look sharp no matter where you move the focus.

The extent of this effect is very lens dependent. It also occurs between different colours of visible light, depending on how well the lens design accounts for it. Optically, the term is "chromatic aberration" - lens designers try to account for it in the visible spectrum with optical design and lens coatings, and modern designs are generally extremely well corrected there. _Usually_ designers aren't worried about correctly handling convergence into IR and UV, so how well a design focuses those to the same point as the visible spectrum is hit or miss. There are specialist lenses out there designed specifically for wide-spectrum apochromatism, but they tend to be special purpose and very expensive - especially if they handle UV.

The author mentions it at the bottom of the post as something they're interested in trying out, but I've found it very fun to play with dual-bandpass filters - they pass part of the visible spectrum plus IR, which creates some interesting options in editing for visual display. There's an example in this set I shot with different filters - https://www.reddit.com/r/infraredphotography/comments/1dnki0...

shagie 23 hours ago [-]
> it becomes readily apparent that infrared light focuses differently from visible light.

On old-school manual-focus lenses you'll note a small dot (often red, when colors were used to indicate f-stops) to the left of the focus indication line.

https://commons.wikimedia.org/wiki/File:AiS_Nikkor_85mm-2.0_...

On more modern lenses, it's simply a dot. https://www.mir.com.my/rb/photography/companies/nikon/nikkor...

This was the offset for IR photography. You'd focus normally, make note of the focus distance, and then line that distance up with the red dot for the IR offset.

---

UV photography was often done with other glass, since the glass used in most lenses does an OK job of filtering out UV light.

The 105mm UV lens for example - https://www.mir.com.my/rb/photography//hardwares/speciallens...

It's an oddball enough lens, one that others don't often make, that it keeps getting special runs.

https://www.nikon.com/business/industrial-lenses/lineup/uv/

Coastal Optics did a run of the lens too - https://diglloyd.com/prem/s/DAP/Coastal60f4/Coastal60f4.html...

---

One of the photographers I stumbled across from days of old who did UV nature photography (what do bees see?): http://www.naturfotograf.com/uvstart.html

alter_igel 16 hours ago [-]
Author here. Great point about the lens-dependent aberration, I mentioned this briefly in my earlier write-up about doing the full spectrum mod [1] but forgot to mention it by name here. I've been trying to get by without spending a lot of money on lenses and have gotten a lot of mileage out of a cheap used 50mm lens that _feels_ like it's just one or two solid glass elements. Fortunately the old camera mount I'm using means all the lenses for it are used, old, and super cheap secondhand. I'm about to try my luck with a 300mm lens. IR should be fun but we'll see if I can squeeze any UV at all through that.

Beautiful shots you have with your own full spectrum camera. Originally I somewhat dismissed the Kolari IR Chrome filter because the suggested combination with a channel swap and custom LUT felt a little too heavily edited for me and I prefer to stay close to the dry camera signal. The shot with the Tiffen Deep Yellow filter is gorgeous, how does that one look on the camera LCD without the channel swap?

[1] https://timstr.website/blog/diyfullspectrummod.html

lytfyre 14 hours ago [-]
Thanks!

The IR Chrome does not need a channel swap, just setting the white balance in-camera is enough to get a usable image.

The deep yellow looks mostly like a purple and yellow mess straight out of camera. The intent is for the yellow filter to block blue light (paired with a UV cut filter), leaving the blue channel with almost solely IR; you then subtract that from the other channels, leaving "clean" red, green, and IR as the three channels you can swap around. Probably the least dry camera signal approach of the bunch, unfortunately.
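For display, that subtraction is plain channel arithmetic. Here's a minimal numpy sketch of the idea; the leakage factors `k_r`/`k_g` are illustrative placeholders that would need calibrating for a specific camera and filter stack:

```python
import numpy as np

def split_channels(raw, k_r=1.0, k_g=1.0):
    """Separate a deep-yellow + UV-cut shot into clean R, G, and IR.

    raw: float H x W x 3 array. With blue light blocked by the yellow
    filter, the blue channel carries almost solely IR; k_r and k_g are
    per-camera factors for how much IR also leaks into the red and green
    channels (placeholders here, not measured values).
    """
    r, g, ir = raw[..., 0], raw[..., 1], raw[..., 2]
    clean_r = np.clip(r - k_r * ir, 0.0, None)
    clean_g = np.clip(g - k_g * ir, 0.0, None)
    # Any permutation of these three channels gives a different
    # false-colour rendering (e.g. the classic IR->R, R->G, G->B swap).
    return np.stack([clean_r, clean_g, ir], axis=-1)
```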

miladyincontrol 23 hours ago [-]
Superachromat is the term you're looking for for lenses corrected from UV through IR. They're not actually that expensive if you know where to look; I got my Zeiss 250mm superachromat for about $500 from a Japanese seller. Works a treat for full spectrum film work, especially on a system that lets you swap between film backs, if not for the woeful cost of film these days.
s0rce 18 hours ago [-]
You could use a reflective long focal length lens and then everything would be in focus.
fraywing 1 day ago [-]
This is really cool -- pedantically, I've always thought "full spectrum" is actually misleading from a traditional photographic sense. Like IR + visible light + UV != full spectrum. I'd love to see post-processed imagery of every-day life through an extended view of broader EM energy (similar to astrophotography)... like what does a city scene look like with x-rays and microwaves included?

Side note: have always loved this image https://imgur.com/NZjWfWT of rainbows with UV and IR visible.

alter_igel 16 hours ago [-]
Author here. I agree with you, "full spectrum" is a generous marketing phrase for what might more accurately be called _extended_ spectrum.

People way smarter than me have been able to achieve DIY spatial imaging with x-rays via compressed sensing [1] and with microwaves via phased arrays [2].

Optical wavelengths seem to be at a sweet spot of good angular resolution, varied natural sources, and harmlessness to humans.

[1] https://www.youtube.com/watch?v=EuVgGrun1V0

[2] https://www.youtube.com/watch?v=sXwDrcd1t-E

_Microft 23 hours ago [-]
By this measure, there is no "full spectrum" photography ever.
adrian_b 15 hours ago [-]
If you specify the source used for lighting, e.g. solar light, you can define precisely what "full spectrum" photography is, i.e. recording a bandwidth large enough so that any lighting energy that falls outside that range is negligible.
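That definition can be made concrete with a quick blackbody estimate: treat the sun as a ~5778 K blackbody, integrate Planck's law over a candidate band, and compare to the total. A rough sketch, not a proper radiometric calculation (it ignores atmospheric absorption entirely):

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)
T_SUN = 5778.0                           # approximate solar surface temperature, K

def planck(wl):
    """Blackbody spectral radiance at wavelength wl (metres) for T_SUN."""
    return (2 * H * C**2 / wl**5) / np.expm1(H * C / (wl * K * T_SUN))

def _integrate(wl):
    # Trapezoid rule over the sampled wavelengths
    y = planck(wl)
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(wl))

def band_fraction(lo_nm, hi_nm):
    """Fraction of total solar blackbody emission between lo_nm and hi_nm."""
    total = _integrate(np.linspace(10e-9, 100e-6, 200_000))  # ~whole emission range
    band = _integrate(np.linspace(lo_nm * 1e-9, hi_nm * 1e-9, 20_000))
    return band / total
```

By this measure a full-spectrum camera covering roughly 300-1100nm already captures the large majority of the solar radiant energy, and extending the band further gives diminishing returns.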
lsaferite 7 hours ago [-]
Playing with a hyperspectral imager makes you rethink how we see things. I've talked about this before, but human vision is essentially "low resolution" in its spectral bands. Using an HSI that "sees" in 4nm spectral slices from 350-1000nm is really interesting (Cubert Ultris X20 Plus). There's so much spectral information that we just totally miss. I really wish the equipment for capturing images at these higher spectral resolutions weren't so expensive, so we could see people experimenting with them on a large scale. The one I got to work with cost more than a nice SUV, and that's cheap in this space. The ones we looked at from Headwall that did something like 400-2500nm started at $250k. Those weren't even full-frame imagers like the Ultris; they were line-scanning imagers. That massive cost jump comes down to the fact that from ~1050nm up you need much different hardware to capture spectral data.

If anyone is interested in some technical aspects of the "full frame" HSI I worked with, it's quite interesting. It had a 20MP monochromatic sensor capturing single-band 12-bit data behind an array of lenses that split the incoming spectral range (350-1000nm) into 164 individual 4nm-wide bands of light, each hitting a 410x410px square on the sensor. The sensor can capture from 350-1100nm, but the QE drops off really fast past about 850nm, and the product limited the upper range to 1000nm. I'm sure I munged something there, but you should get the general idea. I highly recommend researching the HSI space, it's fascinating.

Last thing to point out: when working with an HSI like this, one thing you can do is capture a "spectral fingerprint". Since you've gone from three bands of spectral intensity information to, in our case, 164 bands, you can turn the high-density spectral data for each pixel into essentially a line graph. Using that information you can match against a database of known spectral fingerprints and identify materials and material properties really well. In the multispectral world you'll see this capability used to identify crop health. In the hyperspectral world you can identify so much more. For instance, it can see skin anomalies that aren't visible to the human eye. You can identify specific minerals in a picture of a bunch of rocks (you need up into the 2500nm range for this, though). You can easily spot foreign objects on a conveyor of food items. Overall it's a long list of capabilities, and I'm certain there are many more uses we could discover if the imagers were cheaper. And if you're into the wider ML world (not just LLMs, I mean), you'll see ML classification models being trained on these spectral fingerprints as well.
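One common way to do that fingerprint matching is the spectral angle mapper (SAM), which compares the shape of a pixel's spectrum to library spectra while ignoring overall brightness. A minimal sketch; the library entries here are made-up illustrative spectra, not real material signatures:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between two spectra.

    A small angle means a similar material. Because only the direction
    of the spectrum-vector matters, the result is largely insensitive to
    illumination intensity.
    """
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference)
    )
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def classify(pixel, library):
    """Return the library key whose reference spectrum is closest in angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))
```

With 164 bands per pixel instead of 3, the angle between spectra becomes a surprisingly discriminative material signature, which is also why these fingerprints make good inputs for ML classifiers.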

Anyway, the "full-spectrum" is fascinating, especially when you are able to slice it thin.

IAmBroom 1 day ago [-]
You'd obviously have to use false color, as most modern astronomy pictures do (even the ones that use visible light tend to pump the saturation UP!).

However, the amount of light from the sun drops off exponentially away from the peak at green-blue (yellow-green, after atmospheric filtering). You'd also have to really fake the dynamic range a lot to get it to look any different from IR+Vis+NUV. (If there were 0.001% as much x-ray light as there is, say, red light, DNA could only exist in the lightless depths of the ocean.)

So, it would look like an IR+Vis photo (light falls off pretty fast in the UV, too), except the ones you've seen oversell the IR.

So it would look like a Vis-light photo, with slightly shinier objects in it.

Sorry.

mncharity 24 hours ago [-]
I like distinguishing "light" (physical world) from "color" (species-specific biology). Sunbeam blue light is already less intense than NIR-I, but human bio juices the blue. Most humans are bright-light trichromats and low-light monochromats. Rod sensitivity is 3 orders of magnitude up, with single-ish photon sensitivity. Some amphibians have an extra rod type, for low-light bichromaticity. Some deep-sea fish are bright mono and dark lotschromats (12+ rod opsins). So why not imagine seeing the world with a triple (or more) of short-wavelength super-rods, a few orders of magnitude more sensitive still, with whatever curves seem fun? Perhaps curves naturally selected for by "makes intriguing images of the world for social media"...
avidiax 1 day ago [-]
One thing I've wondered about is IR fluorescence photography.

I've seen some examples in document forensics where a page that looks blank (or at least has unrecognizably smudged ink) because of water exposure is completely legible in an infrared photo taken under UV illumination.

I suspect there must be a hidden world only visible in IR and UV (and long-wave IR, e.g. "thermal").

JKCalhoun 23 hours ago [-]
I like the idea of using the IR, where it shows a greater degree of contrast, to act as a "contrast mask" on the visible light image.

I'm thinking of the beautiful cloud detail in the one IR shot where the visible light photo had lost all of that. Seems like with some compositing (sort of like HDR) you could pull in the best of both worlds.

mcdeltat 22 hours ago [-]
For contrast specifically, you can get much of that effect with a red filter. Just removing the majority blue/green components already changes the lighting massively. You can add a polarising filter too for an even more extreme effect on the sky.

E.g. this photo (looks quite HDR'd but it's not, it's barely edited): https://rjones.photos/gallery/photo/20251207-img9638

NoiseBert69 1 day ago [-]
You’re considering whether it would be possible - and perhaps quite elegant - to use an XY‑scanner to raster‑scan the end of an optical fiber across a prism, disperse the light, and then capture the resulting spectrum with a CCD line sensor.

With that setup, each pixel on the line sensor would effectively record the full spectral content of the light at that scanned position, all in a single acquisition.

tomtom1337 1 day ago [-]
You could probably use just an X-scanner, and instead of a CCD line sensor, use a regular 2D image sensor if you used a "1 pixel wide" slit aperture to crop the image perpendicularly to the direction that the prism disperses the light. So instead of a single pixel being dispersed, you disperse a line.

You would reduce the time required by a factor of the square root of the number of pixels you want (assuming a square image).

(This is what we do in momentum-resolved electron energy loss spectroscopy. In that situation we have electromagnetic lenses that focus the electrons that have been dispersed, so we don't have as bad a chromatic aberration problem as the other response mentions).

I would love to see e.g. a butterfly image with a slider that I could drag to choose the wavelength shown!!

mncharity 23 hours ago [-]
> I would love to see e.g. a butterfly image with a slider that I could drag to choose the wavelength shown!!

Here[1] are some 31-band hyperspectral images of butterflies. Numpy/pillow can unpack the .mat files into normal images. Then perhaps vibecode a slider, or just browse the band images?

[1] http://www.ok.sc.e.titech.ac.jp/res/MSI/MSIdata31.html (includes 8 butterfly 31-band hyperspectral visible-light images). These butterflies are also in their VIS-SNIR dataset, and others.

I knew of the site from having explored the question: "First-tier physical-sciences graduate students are often deeply confused about color. Color is commonly taught, starting in K... very very poorly. So can we create K-3 interactive content centered around spectra, and give an actionable understanding of color?"

NoiseBert69 23 hours ago [-]
Very nice idea! That makes it much easier!
asdff 1 day ago [-]
A problem for multispectral imagery (even within visible RGB) is that the wavelengths of light are different, so the lens cannot be in focus for the whole spectrum at once. I have tested this out with a few of my SLR lenses. If you have the blue channel perfectly in focus, red isn't just a little out of focus, it is noticeably way out.
tomtom1337 1 day ago [-]
This is called chromatic aberration, for those who are intrigued.

Given that regular phone cameras have sensors that detect RGB, I wonder if one could get noticeably sharper images with three camera lenses side by side (each with a single-color sensor), with a color filter for R, G, and B respectively, so the camera could focus perfectly for each wavelength.

asdff 23 hours ago [-]
The next issue would be the perspective distortion in the merged image.
lytfyre 1 day ago [-]
There are lenses out there designed for apochromatic performance across the UV-Vis-IR band, but they tend to be really pricey.

The Coastal Optical 60mm is a frequently cited one. UV in particular is challenging, because glass that works well in the visible range can transmit UV quite poorly. Quartz is better, but drives up the cost a lot and comes with other tradeoffs.

dheera 1 day ago [-]
I've had this problem as well, but it's just due to optical properties of the lens and extremely consistent from image to image, so you can calibrate and correct for it as long as you focus each wavelength and collect data separately.
asdff 23 hours ago [-]
I don't think you can properly calibrate for it unless you also move the camera to compensate for focus breathing. I'm not sure that would fully account for it either. That said, these things are only really noticeable when pixel peeping.
dheera 23 hours ago [-]
Focus breathing can be compensated for. The "breathing" only changes the effective focal length, not the location of the camera, so you can map the pixels to match where they should be and bilinear/bicubic interpolate appropriately.

Shoot a checkerboard at both wavelengths each focused properly and then compute the mapping.

If you're shooting macro stuff then maybe you are changing the effective location of the camera slightly depending on the exact mechanics of the lens and whether the aperture slides with the focusing, but the couple of mm shift in camera location won't matter for landscapes.

Alternatively, use cine lenses which are engineered not to breathe, but they are typically more expensive for that reason.
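The checkerboard step amounts to fitting a transform between matched corner coordinates (found with a corner detector of your choice, e.g. OpenCV's findChessboardCorners) and then resampling one wavelength's image through it. A least-squares affine fit in numpy, as a hedged sketch of the mapping dheera describes:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2x3 affine transform mapping src points to dst.

    src, dst: (N, 2) arrays of matching checkerboard corner positions,
    e.g. from the IR-focused and visible-focused shots respectively.
    Focus breathing mostly changes scale about the optical axis, which
    an affine transform captures along with any small shift/rotation.
    """
    X = np.hstack([src, np.ones((len(src), 1))])   # (N, 3) homogeneous coords
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)    # (3, 2) solution
    return A.T                                     # (2, 3) transform matrix

def apply_affine(A, pts):
    """Map (N, 2) points through the fitted transform. To warp a whole
    image, sample it at these mapped coordinates with bilinear/bicubic
    interpolation."""
    return pts @ A[:, :2].T + A[:, 2]
```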

regus 23 hours ago [-]
Looks like grainydays on YouTube can finally stop chugging flaming hot Mountain Dew every day in the hopes that Kodak will bring back Aerochrome.

https://www.youtube.com/watch?v=v5KBQd_DkQw

dheera 1 day ago [-]
Related project: I shot a lot of landscapes in Iceland using a thermal (long wave IR) camera to show geologic phenomena in action. This involved stitching together a lot of (narrow FOV) thermal images and overlaying it on top of visible camera images for context.

https://petapixel.com/2019/07/13/shooting-high-res-thermal-p...

alter_igel 16 hours ago [-]
Author here. Amazing work! The visible and thermal compositing is done really well and gives back so much detail and context that is lost in purely thermal images.

Does your IR camera give you access to raw temperature data? I've briefly played with a cheap thermal camera and it seemed to assign colours differently depending on the dynamic range of temperatures in view.

dheera 4 hours ago [-]
Yes, the Seek RevealPro gives raw temperature data in radiometric TIFF files: temperatures in Celsius as float32 values on the second page of each .tiff. GUI image editors can have trouble with these multi-page float TIFFs, but Python's PIL package reads them just fine.

Here's a basic script that converts them to greyscale uint8 with a fixed linear mapping, making them compatible with GUI panoramic stitching software.

https://github.com/dheera/iceland-thermal/blob/master/script...
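For reference, the core of that conversion looks roughly like this (the default temperature range here is an arbitrary example; pick one spanning your scene so the same temperature always maps to the same gray level across frames, which is what stitching software needs):

```python
import numpy as np
from PIL import Image

def radiometric_to_gray(path, t_min=-20.0, t_max=120.0):
    """Convert a two-page radiometric TIFF (page 2 = float32 Celsius)
    to uint8 grayscale with a fixed linear temperature-to-gray mapping.
    """
    with Image.open(path) as im:
        im.seek(1)                           # page 2 holds the temperature data
        temps = np.array(im, dtype=np.float32)
    scaled = (temps - t_min) / (t_max - t_min)
    return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)
```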

vjvjvjvjghv 22 hours ago [-]
Very cool!
mcdeltat 22 hours ago [-]
This is super cool, I want one now