Show HN: A physically-based GPU ray tracer written in Julia (makie.org)
zamalek 1 days ago [-]
> Cross-vendor GPU support: A single codebase runs on AMD, NVIDIA, and CPU via KernelAbstractions.jl

This is why I wish Julia were the language for ML and sci comp in general, but Python is sucking all of the air out of the room.

jampekka 1 days ago [-]
Maybe because Python can reasonably be used to make actual applications instead of just notebooks or REPL sessions.
pjmlp 14 hours ago [-]
https://juliahub.com/case-studies

Most "Python" applications are actually bindings to C, C++ and Fortran code doing the real work.

yellowapple 1 days ago [-]
What's stopping Julia from being reasonably usable to make actual applications? It's been a while since I've touched it, but I ain't seeing a whole lot in the way of obstacles there — just less inertia.
zem 2 hours ago [-]
I was excited about julia as an application development language when it first came out, but the language and ecosystem seem to be targeting long-running processes. there was just a ton of latency in build time and startup time for things like scripts and applications, so I moved on.
fc417fc802 19 hours ago [-]
Presumably inertia and ecosystem size (but that's a follow on of inertia). When Julia came out Python already had traction for ~most things.

Keep in mind that it went with 1 based indexes to make the switch easy for Matlab types. I'm not sure if that was a good or bad move for the long term. I'm sure it got some people to move who otherwise wouldn't have but conversely there are also people like me who rejected it outright as a result (after suffering at the hands of 1 based indexing in Matlab I will never touch those again if I have any say in the matter).

I've considered switching to it a few times since seeing that they added variable indexes but Python works well enough. Honestly if I were going to the trouble of switching I'd much rather use Common Lisp or R5RS. The nearest miss for me is probably Chicken, where you can seamlessly inline C code but (fatally) not C++ templates.

If I ever encounter "Chicken, except Rust" I will probably switch to that for most things.

zem 2 hours ago [-]
I've always thought it sad that lush died; in many ways it was a spiritual predecessor to julia. here's a nice blog post about it: https://scottlocklin.wordpress.com/2024/11/19/lush-my-favori...
seertaak 11 hours ago [-]
That's part of the answer, but there's a bit more to it IMO.

The syntax is a bit weird; python, swift, rust, and zig feel more parsimonious.

I absolutely love multimethods, but I think the language would have been better served by non-symmetric multimethods (rather than the symmetric multimethods it uses). The reason is that symmetric multimethods require a PhD-level compiler implementation. That, in turn, means a developer can't easily picture what the compiler is doing in any given situation. By contrast, had the language designers used asymmetric multimethods (where argument position affects type checking), compilation would become trivial -- in particular, easily allowing separate compilation. You already know how: it's the draw-shapes trick, i.e., double dispatch. In that case, it's trivial to keep what the compiler is "doing" in your head. (Of course, the compiler is free to use clever tricks, such as dispatch tables, to speed things up.)
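For readers who haven't seen the trick: a minimal sketch of the contrast, in Julia syntax with made-up names. Symmetric dispatch selects a method on both argument types at once; the double-dispatch encoding only ever dispatches on one position per call.

    abstract type Shape end
    struct Circle <: Shape end
    struct Square <: Shape end

    # Symmetric multiple dispatch: the method is chosen on both argument types at once.
    overlap(::Circle, ::Square) = "circle meets square"
    overlap(::Square, ::Circle) = "square meets circle"
    overlap(::T, ::T) where {T<:Shape} = "two of the same shape"

    # The double-dispatch encoding: each call dispatches on a single position.
    # The first call fixes the type of `a`, then a helper dispatches on `b`.
    overlap2(a::Circle, b::Shape) = overlap2_with_circle(b)
    overlap2(a::Square, b::Shape) = overlap2_with_square(b)
    overlap2_with_circle(b::Square) = "circle meets square"
    overlap2_with_circle(b::Circle) = "two of the same shape"
    overlap2_with_square(b::Circle) = "square meets circle"
    overlap2_with_square(b::Square) = "two of the same shape"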

This dispatch machinery also interacts sensitively with JIT compilation, with the net outcome that it's reportedly difficult to predict the performance of a snippet of Julia code.

zamalek 1 days ago [-]
It's actually better suited IMO, being a compiled language. I'm not sure how anyone could look at the current train wreck of just getting Python code to run and consider it fit for "actual applications." uv is great and all, but many of these "actual applications" don't use it.
mathisfun123 16 hours ago [-]
https://yuri.is/not-julia/

> My conclusion after using Julia for many years is that there are too many correctness and composability bugs throughout the ecosystem to justify using it in just about any context where correctness matters.

mathisfun123 16 hours ago [-]
i hope you realize this is purely because julia uses LLVM and LLVM has backends for those targets (noticeably absent are GPUs which do not have LLVM backends). any other language which uses LLVM could do the exact same thing (and would be hampered in the exact same way).
majoe 6 hours ago [-]
Probably true, but one unique thing about Julia is that it exposes almost all stages of the compilation to the user. From typed IR to native code generation, you can customise the compilation in many ways. Together with the power of LISP's metaprogramming features, that's a really fine basis for powerful and performant DSLs and code transformations.
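For example, the standard reflection macros let you inspect each stage for a concrete call (this much needs nothing beyond the stdlib):

    using InteractiveUtils   # loaded automatically in the REPL

    f(x) = 2x + 1

    @code_lowered f(3)   # lowered (untyped) IR
    @code_typed f(3)     # IR after type inference
    @code_llvm f(3)      # optimized LLVM IR
    @code_native f(3)    # machine code for the host CPU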

All those GPU targets are powered by libraries that are not part of Julia itself (GPUCompiler.jl). The same goes for automatic differentiation. That's remarkable in my opinion.

So you're right that many programming languages could do it, but it's no wonder that other languages are lacking in this regard compared to Julia.

krastanov 1 days ago [-]
As an aside, it is really interesting to see a computational package that, while supporting multiple GPU vendors, was first vetted on AMD, not NVIDIA. It is encouraging to see ROCm finally shaking off its reputation for poor support.
mr_octopus 24 hours ago [-]
The vendor-agnostic GPU approach via KernelAbstractions is great to see. The Vulkan compute path is underrated for this — it runs on AMD, NVIDIA, and Intel without needing ROCm or CUDA, just whatever driver ships with the GPU.
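For anyone who hasn't seen KernelAbstractions.jl, a minimal sketch of what the vendor-agnostic style looks like (the kernel is a made-up example; the point is that only the backend object changes per vendor):

    using KernelAbstractions

    @kernel function saxpy!(y, a, @Const(x))
        i = @index(Global)
        @inbounds y[i] = a * x[i] + y[i]
    end

    backend = CPU()   # or CUDABackend() / ROCBackend() with the vendor package loaded
    x = rand(Float32, 1024)
    y = rand(Float32, 1024)
    saxpy!(backend)(y, 2f0, x; ndrange = length(y))
    KernelAbstractions.synchronize(backend)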

Re: the compilation latency discussion — it's a real tension. JIT gives you expressiveness but kills startup. AOT gives you instant start but limits flexibility. Interesting that most GPU languages went JIT when the GPU itself runs pre-compiled SPIR-V/PTX anyway.

simondanisch 1 days ago [-]
well, I do hate vendor lock-in with a passion ;) But yeah, a lot did happen; this likely wouldn't have been possible one or two years ago!
amelius 1 days ago [-]
Is the material description part of the language the same as in PBRT?

I'm asking because I had a lot of trouble trying to describe interfaces between materials, only to find out that what I wanted to do was not possible in PBRT without modifying the code. Apparently, in PBRT a material can only have one other material touching it. So, for example, rendering a glass filled with water and ice is not possible without hacks. From a user's point of view this is a bit of a let-down, of course.

Context: https://news.ycombinator.com/item?id=45668543

simondanisch 1 days ago [-]
Nope, we made a complete high-level Julia interface, and I plan to have the Makie API be the main user-facing scene description, which can be more descriptive than pbrt, I think!
amelius 1 days ago [-]
Ok. Did you see this:

https://blog.yiningkarlli.com/2019/05/nested-dielectrics.htm...

And I'm curious how you solve it.

simondanisch 1 days ago [-]
Sorry, I was on my phone. This doesn't seem to be a problem of the description language, but rather of how the integrator and materials work internally, so this works the same way in Julia currently. I do think, though, that it's more approachable to add experimental features like this in the Julia version. Would certainly be an interesting project! I do want to move further away from the pbrt-v4 architecture over time and get to something much more modular and easy to extend. I feel like the overlap resolution should happen at scene creation time, to avoid an expensive priority stack at raytracing time - then it would just be a matter of better tracking the media at boundary crossings. But I haven't really thought this through, of course ;)
amelius 1 days ago [-]
I think it was a problem with the language as well as how they handle it internally. It was basically the algorithm that dictates how the language works, and consequently there was no way to have one material touch more than one other material. But I might misremember.

Anyway, I'm looking at this from the user's perspective. I wanted to do some physics-based ray-tracing with lenses and pbrt is what I ended up trying. As such, I really needed the multi-material aspect to work correctly. Also, it would be nice to be able to describe surfaces using a z=f(x,y) kind of formulation, or a way to place a hook in the renderer.

simondanisch 1 days ago [-]
It's definitely an architectural problem as well. I do wonder if we could extend that, though, without too much trouble for the general architecture - after all, the material does not necessarily need to represent all the outside materials; instead, the ray only needs to be able to go from one medium to another. I'm happy to chat about possible extensions in that direction, although to be fair I won't have much time in the next weeks to sit down with anything like this. But I do really hope that this can become a playground for ray tracing experiments in general!
amelius 1 days ago [-]
I think maybe the easiest way to tackle the problem is to have the language describe surfaces instead of solid objects, and let every surface have a normal and two materials. This might be the most natural representation for a ray tracer.
simondanisch 1 days ago [-]
We are working on surface support in Makie to some degree: https://github.com/MakieOrg/Makie.jl/pull/5516 If we get funding, we may also support stuff like NURBS. Obviously, once that gets merged, we do want to also add Raytracing support for it ;)
NoboruWataya 1 days ago [-]
I don't hear nearly as much about Julia as I used to. A few years ago the view was that it was about to replace Python as the language of choice for data science. Seems like that didn't happen?
simondanisch 1 days ago [-]
I think the hype has slowed down, but all growth statistics haven't. Personally, I think Julia is the only language where I can implement something like Makie without running into a maintenance nightmare, and with Julia GPU programming is actually fun, high level, and composes well, which I miss in most other languages. So I don't really care about it replacing Python or not. I do think that to replace Python, Julia will need to solve compilation latency, ship AOT binaries, and maybe interpret more of the glue code, which currently introduces quite a lot of compilation overhead without much gain in terms of performance.
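As an aside for package authors: the current stopgap for latency is to bake a precompile workload into the package image. A minimal sketch with PrecompileTools.jl (the package and workload here are placeholders):

    module MyPackage

    using PrecompileTools

    heavy_transform(v) = map(x -> 2x + 1, v)

    @setup_workload begin
        data = rand(Float64, 16)
        @compile_workload begin
            # native code compiled here is cached in the pkgimage at precompile time
            heavy_transform(data)
        end
    end

    end # module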
electroly 1 days ago [-]
I don't know about everyone else, but slow Julia compilation continues to cause me ongoing suffering to this day. I don't think they're ever going to "fix" this. On a standard GitHub Actions Windows worker, installing the public Julia packages I use, precompiling, and compiling the sysimage takes over an hour. That's not an exaggeration. I had to juice the worker up to a custom 4x sized worker to get the wall clock time to something reasonable.

It took me days to get that build to work; doing this compilation once in CI so you don't have to do it on every machine is trickier than it sounds in Julia. The "obvious" way (install packages in Docker, run container on target machine) does not work because Julia wants to see exactly the same machine that it was precompiled on. It ends up precompiling again every time you run the container on other machines. I nearly shed a tear the first time I got Julia not to precompile everything again on a new machine.

R and Python are done in five minutes on the standard worker and it was easy; it's just the amount of time it takes to download and extract the prebuilt binaries. Do that inside a Docker container and it's portable as expected. I maintain Linux and Windows environments for the three languages and Julia causes me the most headaches, by far. I absolutely do not care about the tiny improvement in performance from compiling for my particular microarch; I would opt into prebuilt x86_64 generic binaries if Julia had them. I'm very happy to take R's and Python's prebuilt binaries.

vchuravy 1 days ago [-]
I am very interested in improving the user-experience around precompilation and performance, may I ask why you are creating a sysimage from scratch?

> I would opt into prebuilt x86_64 generic binaries if Julia had them

The environment variable JULIA_CPU_TARGET [1] is what you are looking for; it controls which micro-architectures Julia emits code for, and it supports multi-versioning.

As an example, Julia is built with [2]: generic;sandybridge,-xsaveopt,clone_all;haswell,-rdrnd,base(1)

[1] https://docs.julialang.org/en/v1/manual/environment-variable...

[2] https://github.com/JuliaCI/julia-buildkite/blob/9c9f7d324c94...

electroly 1 days ago [-]
I have a monorepo full of Julia analysis scripts written by different people. I want to run them in a Docker container on ephemeral Linux EC2 instances and on user Windows workstations. I don't want to sit through precompilation of all dependencies whenever a new machine runs a particular version of the Julia project for the first time because it takes a truly remarkable amount of time. For the ephemeral Linux instances running Julia in Docker, that happens on every run. Precompiling at Docker build time doesn't help you; it precompiles everything again when you run that container on a different host computer. R and Python don't work like this; if you install everything during the Docker image build, they will not suddenly trigger a lengthy recompilation when run on a different host machine.

I am intimately familiar with JULIA_CPU_TARGET; it's part of configuring PackageCompiler and I had to spend a fair amount of time figuring it out. Mine is [0]. It's not related to what I was discussing there. I am looking for Julia to operate a package manager service like R's CRAN/Posit PPM or Python's PyPI/Conda that distributes compiled binaries for supported platforms. JuliaHub only distributes source code.

[0] generic;skylake-avx512,clone_all;cascadelake,clone_all;icelake-server,clone_all;sapphirerapids,clone_all;znver4,clone_all;znver2,clone_all

vchuravy 23 hours ago [-]
My point is that if you set JULIA_CPU_TARGET during the docker build process, you will get relocatable binaries that are multi-versioned and will work on other micro-architectures. It's not just for PackageCompiler, but also for Julia's native code cache.
electroly 60 minutes ago [-]
It worked! I was able to drop the Windows install on a standard GitHub Actions worker from 1 hour to 27 minutes. Here's what worked:

    ARG JULIA_CPU_TARGET="generic;skylake-avx512,clone_all;cascadelake,clone_all;icelake-server,clone_all;sapphirerapids,clone_all;znver4,clone_all;znver2,clone_all"
    ARG JULIA_PROJECT=[...]
    ENV JULIA_PROJECT=[...]
    RUN julia -e "using Pkg; Pkg.activate(\"[...]\"); Pkg.instantiate(); Pkg.precompile();"
What I got wrong the first time: I failed to actually export JULIA_CPU_TARGET so it would take effect in the "Pkg.precompile()" command. In reality, I hadn't correctly tested with that environment variable set at all. I was only correctly setting it when running PackageCompiler.

Thank you so much for this! It's too late for me to edit my original post, but cutting the install time in half is a major win for me. Now it only needs to precompile, not also compile a sysimage.

electroly 21 hours ago [-]
That was the very first thing I tried, and I couldn't get it to work, but I'm sure I am doing something wrong. Everything seemed great at build time, and then it just precompiles again at runtime, without anything saying why it decided to do that. I'll give it another shot if you say it should be working. The PackageCompiler step is the longest part; if that can be removed, it'll make a big difference. I'd rather be wrong and have this working than the other way around :) I'll report back with what I find.
JanisErdmanis 1 days ago [-]
> It took me days to get that build to work; doing this compilation once in CI so you don't have to do it on every machine is trickier than it sounds in Julia

You may be interested in looking into AppBundler. Apart from full application packaging, it also offers the ability to make Julia image bundles. While it offers a sysimage compilation option, it also enables bundling an application via compiled pkgimages, which requires less RAM and is much faster to compile.

badlibrarian 1 days ago [-]
Versus Python, it seems to fork into the "thinkers" vs "doers" camps. Julia provides a level of abstraction that some people find comforting. I thought I could use it as a sort of open-source Matlab for a lot of thinky, 1-based-index code I had lying around. It didn't meet my needs. And "spend half an hour waiting for a Jupyter notebook to boot up" is real. Great for some, but it's not compatible with the way I work.

Elsewhere someone used the term "janky" and perhaps it's the fact that there are so many incredibly smart people around it that makes it so janky. By way of example, somebody needed to check disk space and the architect told him to shell out to Python.

Remember when LLVM first came out and it got kudos for the quality of its error messages? Well, if you miss the old-school 1980s GCC experience, the nonsense that eventually comes out of the Julia compiler after an hour will relight that flame.

Want to use greek letters and other symbols that don't appear on your keyboard as variable names? You've found your people.

galleywest200 22 hours ago [-]
You should try Pluto.jl over Jupyter for Julia notebooks. It runs all sorts of compilations in the background and handles state much better for Julia.
Staross 11 hours ago [-]
I think there's quite a bit of "quiet work" going on that isn't very visible. Personally I've been happily using Julia for work every day for years. When the language was younger there were "big" updates that were newsworthy; now that has slowed down, but it seems there's a decent number of people using the language for serious work that is just a bit too specialised for general interest. E.g. among recently registered packages there's a simulation of Earth, a method to analyse EEG recordings, and a method to measure loudness.

https://github.com/NumericalEarth/NumericalEarth.jl

https://github.com/Marco-Congedo/Xloreta.jl

https://github.com/slink/ZwickerLoudness.jl

bobajeff 1 days ago [-]
As someone who currently dabbles in both: that prediction seems a bit unrealistic. Julia is a fantastic language, but it has some trade-offs that need to be considered. Probably the most well known is `time to first x`. Julia, like Python, is used comfortably in notebooks, but loading libraries can take a minute, compared to Python where it happens right away. That may lead you to not reach for it when you want to do quick testing of something, especially plotting. You can mitigate this somewhat by loading all the libraries you'll ever need at startup (preferably long before you are ready to experiment), but that assumes you already know which libraries you'll need for what you're wanting to try.
simondanisch 1 days ago [-]
What prediction? Maybe I need to rephrase what I said: my prediction is that if Julia ever wants a shot at replacing Python, it absolutely has to solve the time-to-first-x problem! That's what I mean by shipping fully ahead-of-time compiled binaries and interpreting more glue code - both have the potential to solve the time-to-first-x problem.
bobajeff 1 days ago [-]
The prediction I was referring to was the one in the parent comment. (The one I was commenting under)
simondanisch 1 days ago [-]
Ah sorry :D
Rijanhastwoears 1 days ago [-]
Julia is great ... if you are willing to work with the Goldilocks zone it provides.

I think what happened is this: Julia got advertised as "Python syntax, C speed" but in practice it turns out to really be "Python syntax, 50% of C speed if you're willing to avoid some semi-well-documented gotchas, where avoiding said gotchas takes some non-trivial effort". Again, great if you are willing to work with it.

I am not saying that the Julia people are responsible for the "Python syntax, C speed" perception, so much as that this is what the prevalent perception became.

I have talked to people in computational biology who tried Julia, and they said something similar to "It just wasn't performant enough for me to give up Python." If you really dig in, what actually happened is that new people tried Julia with old mental models and walked away thinking, "Heh, more MIT hypeware."

simondanisch 1 days ago [-]
well, I've been reaching 100% of C speed most of the time, which feels like an easy effort... I guess it depends on the problem a bit, and on how used you are to writing optimized, clean Julia code
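The kind of quick check I mean, roughly (a sketch with BenchmarkTools.jl; the kernel and size are arbitrary):

    using BenchmarkTools

    function sumsq(v::Vector{Float64})
        s = 0.0
        @inbounds @simd for x in v
            s += x * x
        end
        return s
    end

    v = rand(10^6)
    @btime sumsq($v)   # typically on par with the equivalent -O3 C loop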
leephillips 1 days ago [-]
Polyglot Jet Finding:

https://arxiv.org/abs/2309.17309

This paper in experimental high-energy physics is a good example of why Julia is popular for scientific calculations.

It shows that #julialang is over 100 times faster than Python and even faster than C++.

Rijanhastwoears 1 days ago [-]
So, my original comment really boils down to the idea that "public perception has nothing to do with objective stats". To which your response is ... citing a paper at me.

To reiterate, citing studies that show that smoking causes cancer in chain smokers does ... nothing. You are citing studies, but I am not the chain smoker; I am just the guy talking about chain smokers.

One more time, I wish we lived in a world where public perception was swayed by objective studies, but we don't.

Julia is fast, yes, but when a university sys-admin rolls their eyes at hearing its name, you have well and truly lost the battle.

jakobnissen 13 hours ago [-]
Nope. Funnily enough no one can agree on why - if you ask five people, you get six answers.

My own take is that Julia didn't solve the two-language problem so much as it was defeated by it.

Julia didn’t attract the high-level Python data science crowd because of Julia’s latency issue, lack of package ecosystem, and the inconveniences that a high performance compiled language incurs, such as having parametric containers.

The research software engineer crowd didn't buy in because Julia has no interfaces or automatically checkable behavior, poor static tooling, imprecise semantics that are hard to build abstractions on, and a complex performance model that makes it hard to ensure speed; it is also hard to deploy.

So, where they tried to make a language that can span the gap, they succeeded in making a language that works for neither, and which no-one wants.

I like the language. But after having used it for eight years, I find it increasingly hard to argue against the point that it’s better to choose Rust for software engineering and Python for scripting.

Edit: I should say: I used it for eight years because it IS fine for my specific niche: High performance research software engineering. Where I care neither about the convenience of Python, nor need to write truly robust and maintainable code. Where my choice of language was personal and I didn’t need to convince a team of coworkers.

wolvesechoes 10 hours ago [-]
I like your comment.

I made many critical comments about Julia here, on this website, but they mostly boil down to clumsiness of Julia.

Does it solve the "two language problem"? Kinda, but through this it is less convenient to use than Python, and it is harder to reason about the performance of a particular piece of code than in C. Yes, there is a big chance that idiomatic, straightforward Julia code will run pretty fast, but there is also a big chance that it will run unexpectedly slow, and you need to know a fair bit to be able to debug this... so, kinda like going from Python to C?
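The canonical example of "unexpectedly slow" is a type instability, which you only find if you know to look for it (a minimal sketch; @code_warntype ships with Julia's stdlib):

    using InteractiveUtils   # for @code_warntype outside the REPL

    # `x` has a different type on each branch, so the body is inferred
    # as Union{Int64, Float64} and quietly deoptimizes.
    function unstable(flag::Bool)
        x = flag ? 1 : 1.0
        return x + 1
    end

    @code_warntype unstable(true)   # prints the inferred IR, flagging the Union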

Is dynamism and interactivity useful? Immensely, but Julia pays for it with poor AOT support (yes, even with juliac and the still-experimental --trim option).

There is also stuff that I consider unacceptable. The debugger being a separate package you need to download, and it cannot even debug compiled code, so it needs its own interpreter? The piece of crap that is Base.@enum, so anyone wanting proper enums needs to install EnumX.jl? And why the hell does StaticArrays.jl even exist as a separate package if Julia puts so much focus on numerical applications?

And then we come to tooling and IDE support. Oh boy - the Julia VS Code extension is such a miserable experience, and there is not much else out there.

Julia is incredibly fun to play and tinker with solo, and some stuff from SciML is straight-up awesome, but overall it is wasted potential, killed by a thousand cuts.

markkitti 8 hours ago [-]
There is now a Julia plugin for JetBrains IDEs: https://plugins.jetbrains.com/plugin/29356-flexible-julia
markkitti 8 hours ago [-]
> it’s better to choose Rust for software engineering and Python for scripting.

Rust and Python work in some scenarios but not all. Unless the native components are all developed already, you will need a Rust programmer on your team. Rust expertise may be available to organizations of a sufficient size and a narrow focus. In my experience, this sort of arrangement requires a team with dedicated resources.

What I encountered more frequently are attempts to use Python as a full stack language. People are not just using it to implement the scripting frontend interface but also trying to implement the backend that does the heavy processing and the numerical computation. As a developer this is a terrible experience. I'm in the middle of replacing for loops in Python with vectorized numpy code but most of my efficiency gains are erased because I still need Python tuples in the end. Yesterday, I had to consider whether exceptions I throw in Cython code can be caught properly in Python.

Research software engineering is one field where you really do need a full-stack language. That kind of software engineering requires high performance with limited resources. Julia, with some initial patience, does deliver a better developer experience in this scenario than the equivalent Python experience for me, partly because I do not need to play vectorization games, and I can do everything in one integrated environment.

While, yes, interfaces and static tooling could be better, I do think the situation has gotten better over time. There are interface schemes available, and additional static tooling exists.

Julia could deliver a better user experience though. Admittedly, the Python tooling to deploy complex solutions has significantly improved lately via tools such as pixi. Julia tools could deliver generic binary code with packages to ease the initial user experience. Advanced users could re-precompile packages to optimize.

The most promising success I have had with Julia is putting notebook interfaces in front of scientists or deploying web apps. My observation in this scenario is that Julia code can be presented as simple enough for a non-professional programmer to modify themselves. Previously, I have only seen that work well with MATLAB and Python. I have not seen this happen with Rust, and I do not expect that to change.

The other observation is that users appreciate the speed of Julia in two ways. 1. Advanced data visualizations are responsive to changes to data. In Python, I would typically need to precompute or prerender these visualizations. With Julia, this can be done dynamically on the fly. 2. Julia user interfaces respond quickly to user input.

While I think Julia has plenty of room to improve, I do think those improvements are possible. I have also greatly appreciated how much Julia has improved in the past five years.

ssivark 1 days ago [-]
Ugh, this almost feels like flame-bait. This question invariably leads to a lot of bike-shedding around comments from people who feel strongly about some choices in the Julia language (1-based indexing and what not), and the fact that Julia is still not as polished as some other languages in certain aspects of developer experience.

"Data science" is an extremely broad term, so YMMV. That said, since you asked, Julia has absolutely replaced Python for me. I don't have anything new to add on the benefits of Julia; it's all been said before elsewhere. It's just a question of exactly what kind of stuff you want to do. Most of my recent work is math/algorithms flavored, and Python would be annoyingly verbose/inexpressive while also being substantially slower. Julia also tends to have many more high-quality packages of this kind that I can quickly use / build on.

IshKebab 1 days ago [-]
IMO it just had too many rough edges. Very slow compilation, correctness issues (https://yuri.is/not-julia/), kinda janky tooling (not nearly as bad as pip tbf). Even basic language mistakes like implicit variable declaration and 1-based indexing (in 2012??).

Yes, 1-based indexing is a mistake. It leads to significantly less elegant code - especially for generic code - and 0-based is no harder to understand than 1-based for people capable of programming. Fight me.

bouchard 1 days ago [-]
> Yes, 1-based indexing is a mistake. It leads to significantly less elegant code - especially for generic code - and 0-based is no harder to understand than 1-based for people capable of programming.

Some would argue that 0-based indexing is significantly less elegant for numerical/scientific code, but that depends on whether they come from a MATLAB/Fortran or Python/C(++) background.

A decision was made to target the MATLAB/Fortran (and unhappy? Python/C++) crowd first, thus the choice of 1-based indexing and column-major order, but at the end of the day it's a matter of personal preference.

0-based indexing would have made it easier to reach a larger audience, however.

> and 0-based is no harder to understand than 1-based for people capable of programming.

The same could be said the other way around ;-)

markkitti 20 hours ago [-]
The 0- or 1-based indexing question is actually a very superficial debate for people not very familiar with Julia. Note that 1-based indexing is a standard library feature, not inherent to the Julia language itself.

The real indexing issue is whether arbitrary-base abstraction is too easily available.

    # Correct, Vector is 1-based
    function mysum(v::Vector{T}) where {T <: Integer}
        s = zero(T)
        for i in 1:length(v)
            s += v[i]
        end
        return s
    end

    # Incorrect, AbstractVector is not necessarily one-based
    function mysum(v::AbstractVector{T}) where {T <: Integer}
        s = zero(T)
        for i in 1:length(v)
            s += v[i]
        end
        return s
    end

    # Correct
    function mysum(v::AbstractVector{T}) where {T <: Integer}
        s = zero(T)
        for e in v
            s += e
        end
        return s
    end
Basically, the concrete `Vector` type is 1-based. However, an `AbstractVector` could have an arbitrary first index. OffsetArrays.jl is a non-standard package that provides the ability to create arrays with indexes that start at an arbitrary point, including 0.
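The index-agnostic fix is to ask the array for its own indices; a small sketch (using OffsetArrays.jl to build a 0-based vector):

    using OffsetArrays

    function mysum(v::AbstractVector{T}) where {T <: Integer}
        s = zero(T)
        for i in eachindex(v)   # yields whatever index range `v` actually has
            s += v[i]
        end
        return s
    end

    v0 = OffsetVector([1, 2, 3], 0:2)   # indices 0, 1, 2
    mysum(v0)                           # 6; a 1:length(v) loop would throw a BoundsError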
qsi 1 days ago [-]
Heh. I grew up writing C code and had real trouble adapting to Matlab's 1-based indexing. Much later I tried Python and was constantly confused by 0-based indexing.

I don't think one is better than the other but my mind is currently wired to see indexing with base 1.

Then there's Option Base 1 in VBA if you don't like the default behavior. Perfect for creating subtle off-by-one bugs.

leephillips 1 days ago [-]
Aside from the fact that 1-based indexing is better for scientific code (see Fortran), I don’t think that it matters very often. I don’t think that any Julia program I’ve ever written would need to change if Julia adopted 0-based tomorrow. You don’t typically write C-style loops in Julia; you use array functions and operators, and if you need to iterate you write `for i in array ...`. If you really need the first or last element you write `a[begin]` or `a[end]`.
fc417fc802 18 hours ago [-]
> Aside from the fact that 1-based indexing is better for scientific code

I find it to be substantially worse. It's fine as long as you don't manipulate the indices. But as soon as you start doing math on them, 1-based becomes a headache (at least IME).

Meanwhile, all you get in exchange (at least as far as I can tell) is ease of speaking about them in natural language. But I'm not usually conversing about indices.

Concise range notations are a mixed bag. There's pros and cons to either scheme there as far as the syntax goes.

IshKebab 1 days ago [-]
> the fact that 1-based indexing is better for scientific code (see Fortran)

It really isn't. "Scientific code" isn't some separate thing.

The only way it can help is if you're trying to write code that matches equations in a paper that uses 1-based indexing. But that very minor advantage is far outweighed by the disadvantages. Lean doesn't make this silly mistake.

> If you really need the first or last element

What if you need the Nth block of M elements? The number of times I've written arr[(n-1)m+1:nm] in MATLAB... I do not know how anyone can prefer that nonsense to e.g. nm..<(n+1)m

Certhas 1 days ago [-]
What if I want the nth element up to the mth element? arr[n:m]. And if I want to split the array into two parts, one up to the mth element and the other from the (m+1)st: arr[1:m] and arr[(m+1):end]. Julia matches how people speak about arrays, including C programmers in their comments. Arrays are (conceptually) not pointer arithmetic. Also, for your use case you would typically just use a 2D array and write a[n,:].
IshKebab 1 days ago [-]
> arr[n:m]

arr[n..=m]

> arr[1:m] and arr[(m+1):end]

arr[0..m], arr[m..]

Much nicer.

> Arrays are (conceptually) not pointer arithmetic.

Look at a ruler. Does it start at 1?

minihoster 23 hours ago [-]
> arr[n..=m]

so you just need to overload the syntax of intervals even more to make it work

> arr[0..m], arr[m..]

now `m` refers to different things depending on which side of the interval it's on. fewer characters doesn't mean nicer

I get it though, I was skeptical about 1-based indexing when I started Julia. By the nature of indices vs length there will always be an off-by-one problem: either you have elements [n, m - 1] with length (m - n) or [n, m] with length (m - n + 1). Unless you're doing a bunch of pointer-arithmetic type stuff, I find the symmetry of an inclusive-inclusive interval to be a better default.

As a final rebuttal I offer: range(n - 1, -1, -1)

Certhas 14 hours ago [-]
Your second point is the main argument for me personally. Numbers in brackets always mean the same thing: the ordinal number of the referenced object in an ordered collection. In 0-based indexing you can think of the number as referring to the space between the referenced objects. But that is simply an additional mental image on top of the original one.

As a neat bonus, in Julia 1:5 is just the iterator for the numbers 1 to 5. So slicing is typically not some special syntax either. It all works rather nicely.
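Concretely:

    r = 1:5                 # a UnitRange{Int}, itself a lazy AbstractVector
    collect(r)              # [1, 2, 3, 4, 5]

    a = [10, 20, 30, 40, 50]
    a[2:3]                  # [20, 30] -- slicing is just indexing with a range
    sum(i^2 for i in 1:5)   # 55 -- the same range works as a plain iterator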

Certhas 13 hours ago [-]
So if I have a row of 5 apples, I can say "take the second and third apple" or I can say "take the apples between one apple length and three apple lengths from the start".

Which is more natural? The ruler is exactly the right mental image if an array to you is a partitioned region of memory starting at a specific pointer location. If an array to you is an ordered collection of objects, you would never invent 0-based indexing or inclusive-exclusive slicing.

Either way, it's not a big deal. I have lived in both worlds, and I have come to think Julia is a bit more natural and easier to teach. But it is really the silliest bike-shedding complaint, given that the language has considerable real trade-offs.

mbauman 1 days ago [-]
This is such a classic example of online discourse in general. There are two options, and folks tribally cling to one or the other without realizing that both are legitimate and well-suited for different situations.

Yes, of course distances are measured starting from 0. But we count discrete things starting at 1. You can do mental gymnastics to enumerate from zero and many programmers are (unfortunately IMO) taught to do so. It's a hard thing to learn that way, so for the folks that have done so, it often becomes a point of pride and a shibboleth.

As a classic example, a four story building has four floors. But you only need to go up three flights to get to the top. You can legitimately call the top floor either 3 or 4, and folks are similarly tribal about their own cultural norms around this one, too.

Certhas 14 hours ago [-]
Fully agreed. I first struggled when switching from python to Julia, then ended up finding it slightly better for my use cases (which includes teaching scientists who are not programmers). But it's simply not a big deal either way. I am also reminded of the significant whitespace objections to python in the old days, before python took over everything...
IshKebab 24 hours ago [-]
> There are two options, and folks tribally cling to one or the other without realizing that both are legitimate and well-suited for different situations.

No I disagree entirely. One is simply better.

> It's a hard thing to learn that way, so for the folks that have done so, it often becomes a point of pride and a shibboleth.

It is not hard. It's not better because it's hard-won knowledge. It's better because it leads to simpler, more elegant code. Simple as.

mbauman 23 hours ago [-]
Thanks for proving my point perfectly.
banku_brougham 19 hours ago [-]
>It really isn't.

The way people reveal themselves is a pattern worth taking note of.

simondanisch 1 days ago [-]
lol. There's not much to fight about, since it's a very personal matter how you want to write code. It's evident that all the capable programmers in the Julia community have found satisfactory ways to get around it, so if you haven't yet, I don't see how that's a Julia problem ;) I can only say I haven't had a single problem with one-based indexing in 12 years of developing Julia code. I also haven't run into many correctness issues compared to other languages I've been using. I think Yuri had also been using lots of packages which weren't very mature. How on earth can you compare a 10-year-old library with lots of maintainers to packages created in one year by one person? That's at least what Yuri's critique boils down to for me.
Certhas 1 days ago [-]
I disagree. Julia has correctness issues because it chose maximum composability over specifying interfaces explicitly. And those are not just in immature packages but also in complex packages. Compared to other languages, Julia has no facilities to help structure large, complex code bases. And this also leads to bad error messages and bad documentation.

Recently we got the public keyword, but even the PR there says:

"NOTE: This PR is not a complete solution to the "public interfaces are typically not well specified in Julia" problem. We would need to implement much than this to get to that point. Work on that problem is ongoing in Base and packages and contributions are welcome."

TimorousBestie 1 days ago [-]
Analogous to “time to first plot”, Julia metacommentary now has a time to first “Why I no longer...” repost.
postflopclarity 1 days ago [-]
I'm even sympathetic to some of the concerns. I say that as someone deeply embedded in the Julia community. but seeing this same repost over and over for years honestly starts to get pretty frustrating.
the__alchemist 1 days ago [-]
The molecule and MD trajectory renders look great, and it's an easy API! I have been doing this in Rust, but it's a full program vs. something scriptable like this. The images and animations on this page also look a hell of a lot better than what I cobbled together in WGPU.
bobajeff 1 days ago [-]
It says:

>the reference implementation from Physically Based Rendering (Pharr, Jakob, Humphreys)

I'd like to know a little about the process you went through for the port. That book* sounds like an excellent resource to start from, but what was it like using it and the code?

* https://pbrt.org/

simondanisch 1 days ago [-]
I've done lots of manual refactoring of the initial prototype in Trace.jl (by Anton Smirnov, who I think ported an earlier version of the pbrt book). This helped me familiarize myself with the math, the infrastructure, and the general problems a raytracer faces, and it laid the groundwork for the overall architecture and for what to pay attention to for fast GPU execution. One key insight was that it's possible to avoid an UberMaterial and instead use a MultiTypeSet for storing different materials and lights, which allows fast and concretely typed iteration.
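Roughly the idea, as a hypothetical sketch (type names made up, not the actual code): instead of one Vector of an abstract material type, which forces dynamic dispatch per element, keep one concretely typed vector per material type:

    abstract type Material end
    struct Glass <: Material
        ior::Float64
    end
    struct Diffuse <: Material
        albedo::Float64
    end

    scatter(m::Glass) = "refract with ior $(m.ior)"
    scatter(m::Diffuse) = "diffuse bounce, albedo $(m.albedo)"

    # One concretely typed vector per material type, bundled in a tuple.
    # Each tuple element has a concrete type, so each inner loop is
    # concretely typed -- no per-element dynamic lookup.
    materials = (Glass[Glass(1.5)], Diffuse[Diffuse(0.8), Diffuse(0.3)])

    for vec in materials
        for m in vec
            println(scatter(m))
        end
    end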

Then I found that pbrt moved away from the initial design, and I used claude code to port large parts of the new C++ code to Julia. This led to a pretty bad port, and I had lots of back and forth to fix bugs, improve the GPU acceleration, make the code more concise and "Julian", and correct the AI's mistakes and bogus design decisions ;) This polish isn't really over yet, but it works well enough and is fast enough for a beta release!

tylermw 3 hours ago [-]
Nice work! It's always fun to see a new renderer in the wild.
blueaquilae 1 days ago [-]
That's an impressive accomplishment and a fantastic tool to explore.
juleiie 22 hours ago [-]
Interesting name. It's a strange feeling to use a language/tech named the same as you, and this one isn't even a niche or quirky name but like the second or third most popular.

It’s like calling a framework Mike

Twisol 21 hours ago [-]
I have a friend who named their custom-built languages "Monica" and "Joe". It's surprisingly common for homegrown languages, I think.
the_harpia_io 1 days ago [-]
honestly the AMD-first bit surprised me - usually ROCm support is an afterthought or just broken outright.

curious about BVH traversal specifically. dynamic dispatch patterns across GPU backends can get weird fast. did KernelAbstractions hold up there or were there vendor-specific fallbacks needed for the heavier acceleration structure work?

simondanisch 1 days ago [-]
Well, I'm a bit of an AMD "fanboy" and really dislike NVIDIA's vendor lock-in. I'm not sure what you mean by dynamic dispatch across GPU backends - nothing should be dynamic there, and most simpler primitives map quite nicely between vendors (e.g. local memory, work groups, etc.). To be honest, the BVH/TLAS has been pretty simple compared to the wavefront infrastructure. We haven't done anything fancy yet, but the performance is still really good. I'm sure there are still lots of things we can do to improve performance, but right now I've concentrated on getting something usable out. We're mostly matching pbrt-v4 performance, but I couldn't compare to their NVIDIA-only GPU acceleration without an NVIDIA GPU. I can just say that the performance is MUCH better than what I initially aimed for, and it feels as usable as some of the state-of-the-art renderers I've been using. A 1:1 comparison is still missing though, since it's not easy to do one without comparing apples to oranges (already mapping materials and light types from one renderer to another is not trivial).
the_harpia_io 1 days ago [-]
pbrt-v4 parity is a solid baseline - that codebase already leans hard on NVIDIA so a fair comparison was always going to be messy. surprised wavefront was the harder bit though, i'd have expected BVH tuning to be the nightmare.
simondanisch 1 days ago [-]
To be fair, I was surprised too. But I made a relatively simple straight port from the AMD Rays SDK, plus some input from the pbrt-v4 CPU BVH code, and it just worked relatively well out of the box... This is the main intersection function, which is quite simple: https://github.com/JuliaGeometry/Raycore.jl/blob/sd/multityp... I'm not even using local memory, since it was already fast enough ;) But I think we can still do quite a lot; large parts of the construction code are still very messy, and I want to polish and modularize the code over time.
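For a flavor of why it can stay simple - the heart of BVH traversal is just a slab test like this (an illustrative sketch, not the actual Raycore code):

    # Ray vs. axis-aligned bounding box "slab" test, the workhorse of BVH traversal.
    # `inv_dir` is 1 ./ direction, precomputed once per ray.
    function hit_aabb(origin, inv_dir, bb_min, bb_max, tmax)
        t1 = (bb_min .- origin) .* inv_dir
        t2 = (bb_max .- origin) .* inv_dir
        tnear = maximum(min.(t1, t2))   # entry point along the ray
        tfar  = minimum(max.(t1, t2))   # exit point along the ray
        return tnear <= min(tfar, tmax) && tfar >= 0
    end

    # Example: a ray along +z from (0, 0, -5) against the unit box (tuples here;
    # in real kernels these would be static vectors).
    hit_aabb((0.0, 0.0, -5.0), (Inf, Inf, 1.0), (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0), Inf)  # true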
the_harpia_io 1 days ago [-]
makes sense honestly - straight port from a solid SDK beats reimplementing everything from scratch. local memory optimization is one of those rabbit holes anyway. construction code being messy is just that stage of the project
FacelessJim 14 hours ago [-]
That post is 10 years old, stale, with all issues resolved and more.

Waving around an outdated blogpost as if it would automatically invalidate everything is just silly at this point.

LoganDark 1 days ago [-]
On iOS Safari the videos are fullscreening themselves as I scroll. I've seen this on other blogs before but I don't know what causes it. Super annoying
simondanisch 1 days ago [-]
Ugh, yeah I had some super weird bugs like this in safari, still haven't found the source :(
embedding-shape 1 days ago [-]
Don't quote me on this, but I think there is a "playsinline" / "webkit-playsinline" attribute for the video element you need to add to avoid that, + if it's autoplay you need to set "muted" too. I've also had this happen and I think both/either of those solved it last time.