A good technical project, but honestly useless in 90% of scenarios.
You want to use an Nvidia GPU for LLMs? Just buy a basic second-hand PC (the GPU is the primary cost anyway). You want a Mac for a good amount of VRAM? Buy a Mac.
With this proposed solution you get a half-baked system: on one hand, the GPU is limited by the Thunderbolt port and you don't have access to all of Nvidia's tools and libraries; on the other, you have a system that lacks the integration of native solutions like MLX and risks breakage with future macOS updates.
afavour 1 day ago [-]
Chicken/egg. Nvidia tooling is surely lacking in part because the hardware wasn't usable on macOS until now. Now that it's usable, that might change.
frollogaston 19 hours ago [-]
Nvidia GPUs were usable on Intel Macs, but compatibility got worse over time, and Apple stopped making a Mac Pro with regular PCIe slots in 2013. People then got hopeful about eGPUs, but they have their own caveats on top of macOS only fully working with AMD cards. So I've gotten numb to any news about Mac + GPU. The answer was always to just get a non-Apple PC with PCIe slots instead of giving yourself hoops to jump through.
zoky 17 hours ago [-]
The 2019 Intel Mac Pro had PCIe slots. The Apple Silicon Mac Pro still has them as well, but they’re pretty much useless.
fg137 16 hours ago [-]
Until there is official support for Mac coming from nvidia, I don't think anything will happen.
> the hardware wasn't usable on macOS
This eGPU thing is from a third-party if I understand correctly. I don't see why nvidia would get excited about that. If they cared about the platform, they would have released something already.
The software stack has been ready for Apple Silicon for more than half a decade.
tensor-fusion 16 hours ago [-]
There's a third option that might fit some of the "I'm on a Mac but need CUDA" cases: network-mounting an Nvidia GPU from another machine on the same LAN. The GPU stays wherever it lives (office server, lab machine, a roommate's PC), your Mac runs the CUDA workload locally without any code changes — same PyTorch/CUDA calls, just intercepted by a stub library that forwards them over the local network.
The tradeoff vs. a physical eGPU: no Thunderbolt bandwidth ceiling or cabling, but you do need to be on the same LAN and there's ~4% overhead vs. native. Doesn't help if you need the GPU while traveling, and won't fix the physical macOS driver situation for native GPU access.
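The stub-forwarding idea described above is easy to sketch. Below is a toy, stdlib-only illustration of the pattern; all names are hypothetical, and a real stub like GPU Go's intercepts actual CUDA/driver calls rather than a JSON matmul, but the shape is the same: serialize the call, run it on the remote host, return the result to code that thinks it ran locally.

```python
# Toy sketch of the "stub that forwards calls over the LAN" pattern.
# Hypothetical names throughout; not any product's real protocol.
import json
import socket
import threading

def gpu_host(server_sock):
    """Stands in for the machine that actually owns the GPU."""
    conn, _ = server_sock.accept()
    with conn:
        req = json.loads(conn.makefile().readline())
        if req["op"] == "matmul":  # dispatch table: one entry per forwarded call
            a, b = req["args"]
            rows, inner, cols = len(a), len(b), len(b[0])
            out = [[sum(a[i][k] * b[k][j] for k in range(inner))
                    for j in range(cols)] for i in range(rows)]
        conn.sendall((json.dumps(out) + "\n").encode())

def remote_matmul(a, b, addr):
    """Client-side stub: same signature as a local matmul, but forwarded."""
    with socket.create_connection(addr) as conn:
        conn.sendall((json.dumps({"op": "matmul", "args": [a, b]}) + "\n").encode())
        return json.loads(conn.makefile().readline())

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port; a real setup targets the LAN host
server.listen(1)
threading.Thread(target=gpu_host, args=(server,), daemon=True).start()

result = remote_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]], server.getsockname())
print(result)  # [[19, 22], [43, 50]]
```

The engineering difficulty is in batching these round trips; per-call latency is what any quoted overhead figure has to amortize.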
Disclosure: I work on GPU Go (tensor-fusion.ai/products/gpu-go), so I'm obviously biased toward this approach — but it genuinely is a different point in the design space from eGPU.
bigyabai 16 hours ago [-]
> same PyTorch/CUDA calls, just intercepted by a stub library that forwards them over the local network.
At that point you're making more work for yourself than debugging over SSH.
tensor-fusion 11 hours ago [-]
[dead]
dapperdrake 8 hours ago [-]
Thank you for opening my mind to a viewpoint I didn’t even know existed.
Yes, for many scenarios this is "not even an academic exercise".
For a very select few applications this is Gold. Finally serious linear algebra crunch for the taking. (Without custom GPU tapeout.)
MIA_Alive 10 hours ago [-]
Even when running ML experiments, you'd mostly want to run them on rented clusters anyway.
petters 12 hours ago [-]
> the GPU is limited by the Thunderbolt port
Not everything is limited by the transfer speed to/from the GPU. LLM inference, for example.
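Some back-of-envelope arithmetic supports this (all figures below are illustrative assumptions, not benchmarks): for LLM inference, the weights cross the link once, and per-token traffic afterwards is tiny.

```python
# Why Thunderbolt bandwidth barely matters for LLM inference.
# Illustrative assumptions, not measurements.
LINK_GBPS = 4.0                 # ~32 Gbit/s of PCIe tunneled over Thunderbolt 4
weights_gb = 7e9 * 0.5 / 1e9    # 7B parameters at 4-bit quantization = 3.5 GB

load_seconds = weights_gb / LINK_GBPS   # one-time cost to ship weights to the GPU
per_token_bytes = 32_000 * 2            # e.g. fp16 logits for a 32k-entry vocab
link_tokens_per_sec = LINK_GBPS * 1e9 / per_token_bytes

print(f"one-time weight upload: {load_seconds:.2f} s")
print(f"link alone could sustain ~{link_tokens_per_sec:,.0f} tokens/s")
```

The model itself generates tens of tokens per second, so the link is nowhere near the bottleneck once the weights are resident.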
the_arun 22 hours ago [-]
I mistook eGPU for virtual GPU, but I was wrong: it means external GPU.
throawayonthe 8 hours ago [-]
the tooling is just the standard linux tooling inside the container, no? and thunderbolt is not a real limitation
nailer 6 hours ago [-]
> GPU is limited by the Thunderbolt port
I thought Thunderbolt was like pluggable PCI? The whole point was not to limit peripherals.
zamadatix 4 hours ago [-]
There's more to peripheral limits than the protocol used. Thunderbolt connections add latency and cap bandwidth. Both, either, or neither of those things may be much of an actual problem (depending on the use case), but they are examples of limits relative to native PCIe.
syntaxing 24 hours ago [-]
From what I understand, this only works with tinygrad. That's better than nothing, but CUDA or Vulkan on PyTorch isn't going to work with this.
I don't know how Apple has evaded regulatory scrutiny for their refusal to sign Nvidia's eGPU drivers since 2018.
mrpippy 1 day ago [-]
Evidence that NVIDIA has even been trying? My understanding is that Apple didn’t allow 3rd parties to write graphics drivers past 10.13, but they could’ve done a non-graphics driver like this.
trueno 4 hours ago [-]
i emailed jensen huang at the very tail end of the maxwell era and p much begged for maxwell support on macos. i didnt expect a reply, especially since i guessed his email based on some "how to find ceo emails" google search result.
he actually did reply weeks later and said "i didnt realize people wanted this, my team has added them. go check now". pretty sure that was the last time nvidia drivers came to macos.
there's a lot of assumptions made with this topic, particularly the assumption that apple is blocking them. at least in my experience the opposite was true, nvidia just flat out wasn't making them. however i don't doubt the truth lies somewhere in between: nvidia and apple have a pretty much nonexistent relationship now. i dont know whats required here but i also don't doubt apple makes this experience suck butt for any interested parties.
MBCook 1 day ago [-]
The government doesn’t care? They’re a minority of the market? The vast majority of their computers didn’t have slots to put Nvidia GPUs in, and now none of them do?
hgoel 1 day ago [-]
They said eGPU
the_arun 22 hours ago [-]
Yeh external GPU
frollogaston 19 hours ago [-]
eGPUs are kind of a joke. People would be way more likely to use dedicated GPUs with Macs if they had PCIe slots.
MBCook 17 hours ago [-]
The last Mac with good PCIe slots was the 2019-2023 Mac Pro. So that's more than half a decade ago, and it sold poorly. It never got a real refresh.
Before that was the pre-trash can Mac Pro in 2006-2012. So that was canceled most of a decade before the 2019 model.
High bandwidth PCIe hasn’t been a thing in Apple world for most of 15 years.
frollogaston 4 hours ago [-]
Yeah I want to know who exactly was their target audience with the 2013 and 2019 models. It made sense before that. I loved my 2009 model (4,1).
15155 19 hours ago [-]
What exactly is the difference between an internal PCIe slot and Oculink or Thunderbolt from an electrical and functional perspective?
wpm 18 hours ago [-]
An internal PCIe slot can be had in up to 16x 5.0, whereas Thunderbolt 5 maxes out at 4x of 4.0.
Plus you have another Thunderbolt controller in between the CPU and the hardware, and it takes more energy to push that many bits 1m over a cable vs a few dozen cm over traces.
Also Thunderbolt is trivially disconnected, which in many critical workflows is not a positive, but an opportunity for ill-timed interruptions. Plus I don't have to buy a fucking dongle/dock for a real goddamn slot, make room for external power supplies, etc.
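For concreteness, a rough comparison using nominal per-lane rates (post-encoding figures; real-world throughput is lower for both):

```python
# Nominal peak bandwidth: internal PCIe 5.0 x16 slot vs. PCIe 4.0 x4
# tunneled over Thunderbolt 5. Per-lane GB/s after link encoding:
GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

pcie5_x16 = GBPS_PER_LANE["5.0"] * 16   # internal slot
tb5_tunnel = GBPS_PER_LANE["4.0"] * 4   # Thunderbolt 5's PCIe tunnel

print(f"PCIe 5.0 x16: {pcie5_x16:.1f} GB/s")
print(f"TB5 (PCIe 4.0 x4): {tb5_tunnel:.1f} GB/s, "
      f"~{pcie5_x16 / tb5_tunnel:.0f}x less")
```

And this ignores the extra controller hop and latency, which the raw lane math doesn't capture.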
mulderc 1 day ago [-]
Apple doesn’t have a monopoly in any market they are in.
TheDong 1 day ago [-]
It depends how you define the market. In the 2001 microsoft case [0], the courts ruled Microsoft had a monopoly over the "Intel-based personal computer market".
Apple has a monopoly over the "M-chip" personal computer market. They have a monopoly over the iOS market with the app store. They have a monopoly over the driver market on macOS.
Like, Microsoft was found guilty of exploiting its monopoly for installing IE by default while still allowing other browser engines. On iOS, apple bundles safari by default and doesn't allow other browser engines.
If we apply the same standard that found MS a monopoly in the past, then Apple is obviously a monopoly, so at the very least I think it's fair to say that reasonable people can disagree about whether Apple is a monopoly or not.
I wouldn’t say it is obvious. Apple does not have the monopoly of ARM based PCs. Labeling it as a monopoly of M chips is not fair or accurate when comparing to MS on Intel. It’s also probably relevant that MS was not selling PCs or their own hardware. They had a monopoly on a market where you effectively had to use their software to use the hardware you bought from a different company. Because Apple is selling their own hardware and software as a single product, the consumer is not forced into restricting the hardware they bought by a second company’s policies.
AnthonyMouse 22 hours ago [-]
> Labeling it as a monopoly of M chips is not fair or accurate when comparing to MS on Intel.
The relevant thing here isn't the chips, it's tying things to the chips, because those would otherwise be separate markets. If you could feasibly buy an iPhone and install Android or Lineage OS on it or use Google Play or F-Droid on iOS then no one would be saying that Apple has a monopoly on operating systems or app stores for iOS since there would actually be alternatives to theirs.
The fake alternative is that you could use a different store by buying a different phone, but this is like saying that if Toyota is the only one who can change the brake pads on a Toyota and Ford is the only one who can change the brake pads on a Ford then there is competition for "brake pads" because when your Toyota needs new brake pads you can just buy a Ford vehicle. It's obvious why this is different than anyone being able to buy third party brake pads for your Toyota from Autozone, right?
> It’s also probably relevant that MS was not selling PCs or their own hardware.
This is the thing that unambiguously should never be relevant. It can't be a real thing that you can avoid being a monopoly by owning more of the supply chain. It's like saying that Microsoft could have avoided being a monopoly by buying Intel and AMD, or buying one of them and then exterminating the other by refusing to put Windows on it. That's a preposterous perverse incentive.
Affric 21 hours ago [-]
> It can't be a real thing that you can avoid being a monopoly by owning more of the supply chain.
Move the most important aspects of your software to hardware. Hard for macOS, but for a Chromebook-style thing you could write the browser into its own piece of wafer.
Google should pay me to be this evil.
AnthonyMouse 19 hours ago [-]
> Move the most important aspects of your software to hardware.
So now you have a piece of silicon with a two year old version of Chrome with seventeen CVEs hard-coded into it, and still have all the same antitrust problems because the device still also has an ordinary general purpose CPU that you're still anti-competitively impeding people from using to run Firefox or Ladybird.
SvenL 23 hours ago [-]
Well “had to use” is a strong phrase here. Linux was already around and you could have used it too with your hardware. I think you can always bend an argument to fit your point.
detourdog 23 hours ago [-]
The PC manufacturers had to pay MS for a license no matter what operating system was installed.
SvenL 13 hours ago [-]
Didn’t knew that, but only if they also sold windows pc? Like, if a company would only sold blank PCs without any offering associated to MS they wouldn’t need to pay MS anything.
detourdog 8 hours ago [-]
That was what the trial was about. If you wanted to contract with MS you had to pay for a license on every box shipped. Dell, Compaq, Gateway, HP, IBM, Acer, and others had to sign the contract or ship only alternate OSes.
If one sold a computer with OS/2, they also paid for a Windows license.
LocalH 22 hours ago [-]
Indeed. Pepperidge Farm remembers Microsoft's campaign against "naked PCs"
Underphil 1 day ago [-]
I don't think any of what you're describing are legal "monopolies". I don't have a single Apple product in my life but I'm fairly sure there's nothing I'm prevented from doing because of that.
TheDong 1 day ago [-]
And back in the "Microsoft has a monopoly on IE6" ruling's days, I did not use Windows or Internet Explorer, and I was not prevented from doing anything because of that. Netscape Navigator on Linux worked fine. Sure, I occasionally hit sites that were broken and only worked in IE, but I also right now frequently hit apps that are "macOS only" (like when Claude Cowork released, or a ton of other YC company's apps).
Microsoft was found guilty, so clearly the bar is not what you're trying to claim.
selectodude 1 day ago [-]
Microsoft was found guilty of using their market power to do product bundling, which is illegal. The fact that they had dominance in the market is not what they got popped for, nor is it illegal.
longislandguido 23 hours ago [-]
Let me know how I can unbundle Safari from macOS or iOS.
Go ahead, I'll wait.
chongli 23 hours ago [-]
It's possible on the Mac, but it's not easy. Apple uses an immutable system volume on macOS, so you can't just delete the Safari app like you would a user-installed app. To actually delete Safari you need to disable System Integrity Protection and reboot.
There are plenty of Linux distributions that use immutable root volumes. They protect the user in a huge number of ways by preventing the system from getting hosed (either by accident or by malicious unauthorized users / malware). Apple made the decision to do this for their users, and it has prevented a HUGE amount of tech support calls, as well as led to millions of happy users with trouble-free computers.
It also hasn't stopped users from installing Chrome and/or Firefox on their Macs, and millions of ordinary users have.
AnthonyMouse 13 hours ago [-]
> It also hasn't stopped users from installing Chrome and/or Firefox on their Macs, and millions of ordinary users have.
You seem to be ignoring the part where you can't install the Chrome and/or Firefox browser engines on iOS, and the apps with those names on that platform are just skins over Safari. Notice in particular that the iOS version of "Firefox" can't support extensions.
raw_anon_1111 5 hours ago [-]
And that has nothing to do with the Mac…
sroussey 24 hours ago [-]
You just described Apple.
selectodude 24 hours ago [-]
Apple has not, to my knowledge, required OEMs to bundle Safari with macOS alongside threats to withhold macOS if they don’t comply expressly to put Firefox out of business.
But hey, maybe some weird shit happened during the clone years that I’m not privy to.
SvenL 23 hours ago [-]
Apple requires Developers to use AppStore with their App alongside threats to withhold their App if they don’t comply.
Just an example… and yes, I know the EU ruling but it’s still fitting.
pdpi 19 hours ago [-]
The crucially important subtlety here is that Apple requiring developers to use the App Store doesn't leverage an existing monopoly (like what Microsoft had with Windows).
Compare the games console market. Nintendo is allowed to say you have to go through them to sell games for the Switch, ditto Microsoft with the Xbox. Sony doing the same thing with the Playstation is exactly equivalent, but they're approaching the sort of market dominance where it might soon be illegal for them (and them alone) to do that in some markets.
AnthonyMouse 13 hours ago [-]
> The crucially important subtlety here is that Apple requiring developers to use the App Store doesn't leverage an existing monopoly (like what Microsoft had with Windows).
Copyright (e.g. over iOS) and patents (e.g. over iPhone hardware) are explicitly government-granted monopolies. Having that monopoly is allowed on purpose, but that isn't the same as it not existing, and having a government-granted monopoly and leveraging it into another market are two quite distinct things.
> Compare the games console market.
Okay, all of the consoles that require you to sell through their stores shouldn't be able to do that either.
> but they're approaching the sort of market dominance where it might soon be illegal for them (and them alone) to do that in some markets.
Wait, your theory is that a console with ~50% market share has market dominance but Apple with ~60% of US phones doesn't?
scott_w 7 hours ago [-]
There’s no such thing as “having a monopoly on iPhone” in law. You have to have a monopoly in a market, of which iPhone is part of the “smartphone” market. It is not a monopoly in the smartphone market, to the best of my knowledge.
Underphil 1 day ago [-]
Yes, but that was coupled with other factors, like them strongarming vendors, already being hugely dominant on desktops, and abusing that position. I don't see this as being the same. Maybe my bar here is wrong, but it doesn't change whether they are a monopoly or not.
pdpi 19 hours ago [-]
The issue was never "Microsoft has a monopoly on IE6". That's obviously nonsense.
The monopoly that Microsoft held was the home computer operating system market, first through DOS, then later through Windows. Holding a monopoly like that isn't illegal unto itself. What they were actually found guilty of was unfairly leveraging their monopoly on the OS market to gain the upper hand in a different market (the browser market). The subsequent range of issues we had with IE6 (compatibility, security, etc) was a result of Microsoft succeeding in achieving a monopoly on the browser market through illicit means.
Likewise, "Apple has a monopoly on the App Store" is just the same amount of nonsense. What you could argue is that Apple has a monopoly on the home computer market, or the mobile phone market, and that the way they integrate the App Store should be considered illegal leveraging of that monopoly, but that argument simply doesn't hold water — Microsoft's monopoly on the OS market at the time was pretty much incontrovertible, you simply couldn't walk into a shop and buy a computer running something else (except maybe a Mac at a more specialised place). Today, just about any shop you walk into that sells computers will probably have devices for sale running three different OSes (macOS, Windows, ChromeOS). Any phone place will have iPhones and Android devices, and probably a few more niche options. Actual market share percentage is nowhere near the high 90s that Microsoft saw in its heyday. At most, Apple is the biggest individual competitor in the market, but I don't think it holds an outright majority in any specific product class.
Mind you, I think that there is a good argument to be made that the Apple/Google duopoly on mobile devices does deserve scrutiny, but that's a very different kettle of fish.
jonhohle 1 day ago [-]
You were not prevented from doing anything, but that doesn’t mean others weren’t. For example, OEMs were not allowed to offer any other preinstalled OS as a default option. That effectively killed Be and I’m sure hindered RedHat.
raw_anon_1111 1 day ago [-]
That’s not how monopoly definitions work. That makes about as much sense as saying Nintendo has a monopoly on Nintendo consoles or Ford has a monopoly on Mustangs
SllX 22 hours ago [-]
Yes. If you define the market in a ridiculous manner and convince a court to go along with it, anybody can be a monopoly.
But the M series are an Apple product line designed by Apple with a ARM license and produced on contract by TSMC for use in other Apple products.
Don’t assume the facts from another case automatically apply in other cases.
Or as Justice Jackson once put it: “Other cases presenting different allegations and different records may lead to different conclusions”
JumpCrisscross 24 hours ago [-]
> Apple has a monopoly over the "M-chip" personal computer market. They have a monopoly over the iOS market with the app store
When a company is deemed an illegal monopoly, the DoJ basically becomes part of management. Antitrust settlements focus on germane elements, e.g. spin offs. But they also frequently include random terms of political convenience.
I don’t think we want a precedent where companies having a product means they have an automatic monopoly on said product.
ascagnel_ 21 hours ago [-]
More to the point: having a monopoly isn't de facto illegal (just look up natural monopolies), it's using the monopoly power in an anti-competitive way that's illegal. Microsoft wasn't charged with having a monopoly, they were charged because they used that monopoly to exclude Netscape Navigator and force bundling of IE.
trueno 3 hours ago [-]
> Apple has a monopoly over the "M-chip" personal computer market
lmao what? the "M-chip" is literally their chip that they designed, built relationships with TSMC over, and bankrolled into production to put in their products. literally hardware by apple for apple. this was a decade-plus thing in the making, this is the risk/gamble apple took and invested heavily into. that is apple's innovation. any other manuf is free to go do this themselves for their own devices, they just didn't and for the most part still don't. that just isn't a monopoly at all, i'm amused you even got to that point in the first place. it seems to carry some broad misunderstandings of what the M-series chips are, or an assumption that cpus are supposed to be shared with any interested parties just because that was intel's business model. intel was historically slacking & their one-size-fits-most approach wasn't meeting the engineering requirements apple was after generation after generation, so apple took cpu destiny into their own hands and made their own. if you feel like non-apple laptop chips aren't living up to that kind of perf/ppu... well yeah, you'd be right. but that's not really apple's fault. that's not a monopoly thing, like at all. either laptop manufs need to go make their own chip (unlikely) or intel/qualcomm/etc need to catch up.
scott_w 7 hours ago [-]
There’s no such thing as “monopoly on Apple-produced processors” because that’s absurd. The monopoly for MacBook would be “consumer laptops” most likely. Apple does not have a monopoly in consumer laptops to the best of my knowledge.
brookst 15 hours ago [-]
Reductionism is so cringe.
Intel sold chips to anyone. Anyone could make Intel computers.
Apple does not sell chips to anyone. Nobody else can make m-series computers.
Your argument is basically that Ford has a monopoly on selling Mustangs because Standard Oil had a monopoly on selling oil.
thisislife2 1 day ago [-]
It isn't just about monopoly or unfair competition. This can also be covered under consumer rights - the Right to Repair. No OS provider should be allowed to dictate what software you can or cannot run on your own device and/or OS you have paid for.
ssl-3 1 day ago [-]
> It isn't just about monopoly or unfair competition. This can also be covered under consumer rights - the Right to Repair.
If we have a right to repair (we broadly do not, AFAICT), then that doesn't necessarily mean that we have a right to modify and/or add new functionality.
When I repair a widget that has become broken, I merely return it to its previous non-broken state. I might also decide to upgrade it in some capacity as part of this repair process, but the act of repairing doesn't imply upgrades. At all.
> No OS provider should be allowed to dictate what software you can or cannot run on your own device and/or OS you have paid for.
I agree completely, but here we are anyway. We've been here for quite some time.
satvikpendem 1 day ago [-]
Courts have already ruled it does in the iOS app store market. You can disagree of course but then you'd be disagreeing with legal experts who know more about anti-trust law than you do.
afavour 1 day ago [-]
But Apple’s share of the desktop/laptop market is very different than their share of the mobile one.
satvikpendem 1 day ago [-]
Yes, however the parent's claim was that Apple does not have a monopoly in any market they're in which is legally demonstrably false.
hilsdev 1 day ago [-]
Credentialism to prevent discussion of political and government entities is incredibly dangerous
satvikpendem 24 hours ago [-]
You can, but that doesn't mean your opinion is as valid as those who study the subject. Otherwise we might as well follow the sovereign citizen believers.
latexr 10 hours ago [-]
What’s that got to do with anything? Having a monopoly isn’t the only reason to be regulated.
pjmlp 3 hours ago [-]
They aren't a monopoly, hence why.
GeekyBear 1 day ago [-]
The same way Google evaded regulatory scrutiny for refusing to allow a YouTube client for Windows Phone?
1 day ago [-]
bigyabai 1 day ago [-]
Internet Explorer Mobile is a YouTube client. You're describing a client-server disagreement when the user is talking about an entirely client-based conflict.
realusername 1 day ago [-]
Google deployed custom code to actively block the clients so it went beyond just a disagreement
bigyabai 1 day ago [-]
That's normal behavior when your server is being reverse-engineered or abused. Video bandwidth is not free.
Apple's decision is not constrained by server logic or ballooning costs, it is entirely a client-based policy to not sign CUDA drivers.
GeekyBear 1 day ago [-]
> That's normal behavior when your server is being reverse-engineered or abused. Video bandwidth is not free.
Microsoft rewrote their Windows Phone native client to pass through Google's ads. Google still blocked it.
Was it normal behavior when Google blocked Amazon Fire devices from connecting to YouTube with a web browser during the Google/Amazon corporate spat?
To be fair, Google did back down almost immediately when the tech press picked up on it.
Not allowing a native client for your monopoly market share video service on Amazon devices while also blocking Amazon's web browser on those devices is making things a bit too obvious.
bigyabai 23 hours ago [-]
Again - servers are always offered at-will. If the service provider wants to boot you out, their TOS usually won't give you the right to renegotiate service.
Clients are not offered at-will, they either work or they don't. Nvidia ships AArch64 UNIX drivers, Apple is the one that neglects their UNIX clients.
GeekyBear 22 hours ago [-]
Using your monopoly market share video service as a weapon against companies offering platforms that compete with your own is textbook antitrust behavior.
Google used YouTube as a weapon against both Windows Phone and devices running Amazon's Fire fork of Android.
bigyabai 22 hours ago [-]
> monopoly market share video service
A "monopoly" "service"? What have they monopolized, laziness? It's not the App Store, you can go replace it with DailyMotion at your earliest convenience.
You're still retreading why your original comment was not at all relevant to the critique being made. We have precedent for prosecuting monopolistic behavior in America, but it doesn't encompass services even when they're mandatory to use the client. It does have a precedent for arbitrarily preventing competitors from shipping a runtime that competes with the default OS, incidentally.
GeekyBear 17 hours ago [-]
When your product has a monopoly market share, you don't get to use it as a weapon against competitors in other markets, even if you claim there is some imaginary exception to antitrust law involving servers.
That conduct is clearly illegal.
raw_anon_1111 5 hours ago [-]
The entire idea of capitalism is to use your advantages over competitors.
But exactly how does Apple have a monopoly in computers with less than 10-15% market share?
GeekyBear 2 hours ago [-]
The entire point of antitrust law is to place limits on what capitalists are allowed to do.
Apple doesn't have what American law sees as a monopoly market share in any market.
Be aware that other jurisdictions, like the EU, start placing restrictions on the behavior of companies with lower market share than American antitrust law requires.
raw_anon_1111 2 hours ago [-]
Instead of passing laws, maybe the EU could foster an economic environment where Europeans wouldn’t have to depend on American tech companies and actually has tech products that people wanted to buy…
bigyabai 16 hours ago [-]
You don't get to demand that the server support your endpoint, period. There is no precedent for that ever happening in US antitrust law, because it's not anticompetitive.
If you think otherwise, make your case to Google's lawyers instead of spinning hypothetical case law.
realusername 23 hours ago [-]
There hasn't been any abuse in this story as far as I know, it's not like mass downloads of videos happened with their client.
bigyabai 23 hours ago [-]
That's beside the point: you don't own the server. You cannot expect the server to work forever, or demand a right to access it.
You do own the client though. In the example upstream, the failure to support macOS clients can't be blamed on Nvidia because they already wrote AArch64 UNIX support.
GeekyBear 22 hours ago [-]
You cannot use a monopoly market share product like YouTube as a weapon against companies who compete with you in other areas.
This is as basic as antitrust law gets.
realusername 13 hours ago [-]
When you have a monopoly like YouTube, yes, you can expect to have access to the platform if withholding it prevents competition. It's textbook antitrust law.
bigyabai 3 hours ago [-]
You're going to need to cite legal precedent for that. I could not go sue HBO for monopolizing Game of Thrones and refusing to stream it in 8K to my Linux PC. There are no damages.
GeekyBear 1 hour ago [-]
You're going to need to cite this imaginary "server" exemption to American antitrust law that you claim to believe exists.
realusername 2 hours ago [-]
Google is above the law in the US, there's a reason why the CEO sits at the presidential inauguration. It's not about laws but power.
And your example is pretty poor, HBO doesn't have a 10th of the power of YouTube.
GeekyBear 1 hour ago [-]
It's worth remembering that the career officials in the Obama FTC wanted to launch an antitrust lawsuit against Google back in the day.
The political appointees (of both parties) shut that lawsuit down.
jtbayly 1 day ago [-]
Isn't all you have to do disable SIP?
frollogaston 19 hours ago [-]
Yeah I'm pretty sure Nvidia just doesn't care to make Mac drivers. For years there was no SIP, Apple sold the Mac Pro which could take Nvidia GPUs, but you basically couldn't use Nvidia because of how bad and outdated the drivers were. I had a GTX 650 in my Mac Pro for a while, it was borderline unusable.
tensor-fusion 1 day ago [-]
As more people carry ARM laptops and keep the GPU somewhere else, I think the interesting UX question becomes whether the GPU can "follow" the local workflow instead of forcing the whole workflow to move to the GPU host. That's the problem we've been looking at with GPUGo / TensorFusion: local-first dev flow, remote GPU access when needed. Curious whether people here mostly want true attached-eGPU semantics, or just the lowest-friction way to access remote compute from a Mac without turning everything into a remote desktop / VM workflow.
mort96 1 day ago [-]
I mean when it comes time to output the image from the GPU, I don't want to add a hundred milliseconds of network latency...
whalesalad 1 day ago [-]
This is re: GPU for compute, not graphics.
mattnewton 1 day ago [-]
Still undesirable latency for a lot of compute use cases, like image or video editing; it’s really only negligible for LLMs.
Since that’s definitely a big enough use case all on its own, I wonder if such a product should really just double down on LLMs.
serf 23 hours ago [-]
remote GPU compute payloads have been around a lot longer than LLMs; they're just few and far between.
folding@home and other such asynchronous "get this packet of work done and get back to me" style operations rarely care much about latency.
Remote transcoding efforts can usually adjust whatever buffer is needed to cover huge latency gaps, and a lot of sim and render suites can do remote work regardless of machine-to-machine latency.
I just sort of figure the industry will trend more async when latency becomes a bigger issue than compute. Won't work in some places, but I think we tend to avoid thinking that way right now due to a lack of real need to do so; but latency is one of those numbers that trends down slowly.
mort96 1 days ago [-]
Oh. Weird use for a graphics unit.
nkrisc 24 hours ago [-]
Using GPU for compute is nothing new or unusual these days, not for quite a while.
userbinator 12 hours ago [-]
I've heard it phrased thus: The "G" in "GPU" stands for "general-purpose".
mort96 12 hours ago [-]
No, but its primary purpose remains graphics
nkrisc 2 hours ago [-]
Arguably that’s no longer true.
lostlogin 1 days ago [-]
It’s what’s driven nearly the entire AI boom.
tensor-fusion 11 hours ago [-]
[dead]
ajdegol 11 hours ago [-]
I think Metal doesn't support double precision, so that limits some serious physics simming; but if you're doing that I guess you just rent a GPU somewhere.
I would definitely be into this if adding an eGPU were supported first-class.
Keyframe 1 days ago [-]
Such a shame both companies are too big on vanity to make great things happen together. Imagine if you could run Mac hardware with Nvidia on Linux. It's all there, and closed walls are what's not allowing it to happen. That's what we as customers lose when we forego control of what we purchase to those that sold us the goods.
deepsun 1 days ago [-]
Don't purchase? I don't own any Apple devices, everything works fine.
TheDong 1 days ago [-]
Unfortunately, Apple still won't release iMessage for Android or Linux (unlike every other messenger platform, like whatsapp, telegram, wechat, microsoft teams, etc, which are all cross-platform).
Because of that, you need an apple device around to be able to deal with iMessage users.
pezezin 7 hours ago [-]
Good thing that iMessage is only popular in the US. I have never seen anybody using it, I don't even know how it looks, and if someone told me to use it I would laugh at them.
deepsun 20 hours ago [-]
Then it would be more correct to say that we "lose when we forego control" when our friends push the iMessage on us.
In my bubble literally no one uses iMessage. The more tech-savvy use Signal/GroupMe, the less tech-savvy use SMS/email. My family uses Signal to chat with me, as I can steer my own family a little.
Also I sometimes open web-interface of Facebook, but any attempts to offer WhatsApp I answer "sorry no Facebook apps on my phone, no Instagram/Messenger either". Never had any issues with that. Although I heard some countries are very dependent on Facebook, so might be hard there.
By the way, I noticed it's not hard to use multiple messengers actually, sometimes it's even faster to find a contact as you always remember what app to look at in recents.
UPDATE: My point is that you can also influence your life and how people communicate with you. Up to a point of course, but it's not like you can do nothing with it.
nozzlegear 18 hours ago [-]
> In my bubble
Your green bubble? =P
My social circle is the complete opposite. We're all on iMessage (except for one group of extended family on Messenger), and we like it that way. I was the last holdout for years while I went from Android -> Windows Phone -> Android -> iPhone.
troad 20 hours ago [-]
But you don't need an Apple device to contact iMessage users. Every iMessage ID is a phone number (SMS/RCS) or email.
You've listed a whole bunch of alternatives available to you, but for some reason you demand that Apple change its unique offering into just another one of those for you. Why? Is that not a completely enforced monoculture?
Apple has always been off to the side, doing their own thing, and for some reason that fact utterly enrages people. They demand that Apple become just like everyone else. But we already have everyone else! And in every single field Apple is in, there is more of everyone else than there is of Apple.
Have you considered people like Apple products precisely because they're not like everything else? That making Apple indistinguishable from Facebook or Google is no victory, but a significant loss for customer choice?
sunnybeetroot 1 days ago [-]
That is no longer true. https://bluebubbles.app/
Well… it’s not exactly no longer true, you do need an Apple VM but it doesn’t have to be the end device.
simulator5g 21 hours ago [-]
This is against ToS and they will be shut down eventually
raw_anon_1111 12 hours ago [-]
No you don’t. You can “deal” with iMessage users by using SMS and RCS
kllrnohj 1 days ago [-]
Why? Just make iMessage users put up with green bubbles if they want to talk to you?
Thanks to Apple co-opting phone numbers, there's literally no need to ever have iMessage for anyone
Underphil 1 days ago [-]
[dead]
aljgz 1 days ago [-]
I don't understand the logic for downvotes.
We vote with our wallets.
When I could not upgrade the RAM on my personal Dell machine, I asked for a Framework at my new job. As my Intel-based FW at work had thermal throttling problems, for my next personal purchase I got an AMD one. As Ubuntu had shady practices, I installed Fedora; as GNOME forced UX choices I did not want, I used KDE. As I wanted my machine to be even more stable, I use an immutable spin.
The machine I'm using now represents my choices and matches what matters to me, and works closer to perfectly than all my machines in the past
And yes, I have worked with macs, and no, the UX and the entire tyranny in the Apple ecosystem was not something I could live with
And yes, this machine is fast, predictable, a joy to work with and is a tool I control, not a tool to control me. If something happens to it, I can order the part with the same price that goes into a new machine, and keep using my laptop
TheDong 1 days ago [-]
"We vote with our wallet, so don't complain" is a bad take in my opinion.
Like, for phones, I want a phone which runs Linux, has NFC support, and also has iMessage so my friend who only communicates with blue-bubbles and will never message a green-bubble will still talk to me. I also want it to have regulatory approval in the country I live in so I can legally use it to make calls.
Because apple has closed the iMessage ecosystem such that a linux phone can't use it, such a device is impossible. I cannot vote for it.
As such, I will complain about every phone I own for the foreseeable future.
yjftsjthsd-h 23 hours ago [-]
> Like, for phones, I want a phone which runs Linux, has NFC support, and also has iMessage so my friend who only communicates with blue-bubbles and will never message a green-bubble will still talk to me. I also want it to have regulatory approval in the country I live in so I can legally use it to make calls.
I actually agree with you, but I also suggest getting better friends.
namtab00 24 hours ago [-]
if that's what you call a "friend"...
Underphil 21 hours ago [-]
What is the blue and green bubble thing? I've never used an iPhone so don't understand the term. Does it classify messages as iMessage and non-iMessage?
TheDong 20 hours ago [-]
iOS has two built-in messaging apps. Like all phones, they have SMS built in, and hardly anyone uses it for anything except SMS 2FA codes.
And then they have iMessage, aka blue bubbles, which are kinda like Signal or Whatsapp or Telegram. Everyone in Europe uses whatsapp, and a lot of people in the US use iMessage. If you don't use whatsapp in europe, you'll have a rough time communicating with some social groups, and the same thing for iMessage in the US.
However, unlike every other messenger app I can think of, iMessage isn't cross platform.
Also unlike every other messenger I can think of, it comes installed by default and for some reason uses the same app as the SMS app, and also claims encryption but randomly switches to SMS and breaks encryption making it obviously the least secure of all the apps (and also backs up your keys to iCloud in a way apple can access them by default, neither here nor there).
Blue bubbles are when iMessage is acting as the iMessage app, and has encryption and can use features like sending high resolution photos, location, invites, and a bunch of other apple-specific features.
Green bubbles are when the iMessage app has converted itself into the SMS and RCS app, and has a reduced feature set, like being unable to remove people from group chats.
It's frankly a quite confusing decision to have two quite different apps built into the same app and indicate which feature-set is active based on the color of a UI element. I think everyone would prefer if apple split it into the 'Messages' app (SMS + RCS) and an optional 'iMessage' app which doesn't come installed by default, but you can download on the app store from Apple. I'm frankly surprised the EU hasn't forced apple to show a prompt for "default messenger app" on startup with the options being "Whatsapp", "iMessage", etc etc, like they do for default browser.
nozzlegear 17 hours ago [-]
> I think everyone would prefer if apple split it into the 'Messages' app (SMS + RCS) and an optional 'iMessage' app which doesn't come installed by default, but you can download on the app store from Apple.
No, I don't think anyone would prefer that. People on iOS like iMessage, not SMS + RCS. Nobody is confused by it; they all know that green bubbles mean you're texting someone who doesn't have an iPhone. It works seamlessly, it's just annoying when you want to have a long conversation with a friend on Android, because it doesn't have any of the nice iMessage extras available. That's why people don't like green bubbles.
robertoandred 17 hours ago [-]
No, Apple has one built-in messaging app: Messages. It switches between SMS, RCS, and iMessage automatically depending on the capabilities of the devices.
EagnaIonat 9 hours ago [-]
> If you have a Thunderbolt or USB4 eGPU and a Mac, today is the day you've been waiting for!
I got an eGPU back in 2018 and could never get it to work. To the point that it soured me from doing it again.
These days for heavy duty work I just offload to the cloud. This all feels like NVidia trying to be relevant versus ARM.
embedding-shape 9 hours ago [-]
> This all feels like NVidia trying to be relevant versus ARM.
Except it's done by a third group, tinygrad, so it's more non-Nvidia people wanting to use Nvidia hardware on Apple hardware than "Nvidia trying to be relevant".
ffsm8 8 hours ago [-]
Yeh, Nvidia couldn't give less of a fuck about consumers.
And egpu is inherently only consumer targeted.
bigyabai 4 hours ago [-]
FWIW Nvidia already supports UNIX OSes and AArch64 with their drivers. CUDA and CUDNN could be working overnight if Apple signed the drivers.
EagnaIonat 9 hours ago [-]
Thanks for the correction. I guess my PTSD from trying to get this running before is biasing my response.
arjie 1 days ago [-]
Woah, this is exciting. I'm traveling but I have a 5090 lying around at home. I'm eager to give it a go. Docs are here: https://docs.tinygrad.org/tinygpu/
I hope it'll work on an M4 Mac Mini. Does anyone know what hardware to get? You'll need a full ATX PSU to supply power, right? And then tinygrad can do LLM inference on it?
999900000999 1 days ago [-]
You can buy a cheap GPU enclosure for about $100 off AliExpress.
Takes a standard PSU. However, Mac Minis don't have OCuLink, so you might be a bit limited by whatever USB-C can do.
Now if Intel can get their Arc drivers in order we'll see some real budget fun.
Those $100 ones don't come with a cage. If you do want a cage, you'll end up with $180 in total, with zero warranty.
Article mentions: "Apple finally approved our driver for both AMD and NVIDIA"
Does not mention Intel (GPUs). Select AMD GPUs work on macOS, but...
Macs (both Intel and ARM) support TB, but eGPUs only work on Intel Macs, and basically only with AMD.
Good news is for medium end gaming choices are solid, and CUDA works on AMD these days.
999900000999 1 days ago [-]
Fortune favors the bold my friend.
I own one of these, the cage is just a piece of plastic. Anyway, I don't think $80 is that big of a difference here. I can't really afford a $4K Nvidia GPU. Intel is my only hope.
Fnoord 1 days ago [-]
Almost twice the price and simply more accurate info regarding price and features.
Brand is TH3P4G3. Egpu.io has decent eGPU comparisons.
I wouldn't want all that dust in my GPU fans, prefer that near my case fans. I also don't like it given I got cats and want to store/box hw. I do use the eGPU in the fuse box. If I had a larger house, I'd use a server rack.
I was recently in the market for an eGPU but for a different niche (not eGPU/eNPU/eTPU, but getting an HBA via TB to connect an LTO-6 drive via SAS). I went for a Sonnet instead, very low profile and small. I also bought an Asus one; slightly bigger, came with more fans, but TB4 instead of the TB3 on the Sonnet. The cages are aluminium. Those eGPUs were second hand (also without warranty, but quicker S&H than Chinese New Year) but came with PSUs, whereas otherwise you've also got to buy a PSU. For me no biggie, as I got a decent PSU lying around.
jasomill 23 hours ago [-]
I've been using a Sonnet eGPU box with Nvidia GPUs (1070/3070) on an Intel NUC for about 5 years, and it works great.
One nice thing about the Sonnet eGPU boxes is that they use standard SFX PSUs that are inexpensive to replace if they fail.
For LTO, I'm cheap, and iSCSI over a dedicated 2.5 Gbps Ethernet link is fast enough for my aging FC LTO-5 drives and spinning rust backup disks.
serf 23 hours ago [-]
I used Sonnet egpu box on a similarly equipped Dell XPS and it had so many little issues that it sold me off of eGPUs over Thunderbolt entirely.
Sleep broke across all OSes; if sleep didn't break, the GPU wouldn't get powered on with the laptop. If one side lost power during an outage (the GPU side; the laptop has a battery), it would require an elaborate voodoo ritual of cycling both of them on and off until they 'caught' each other. It would cause the rest of the USB ports on the laptop to reset and drop comms with peripherals once or twice a week, necessitating a rain-dance restart.
when Oculink first started showing up I gave up all together and just said "fuck it i'll try it again in a few years.".
It worked fine when it worked fine, but the patches in between were not worth my time.
I blame Dell and their thunderbolt controllers entirely for the issue, but it left such a bad taste in my mouth that I would have a really tough time buying the newest Sonnet box to try it out. Now I have a desktop machine and don't fall into that market.
I ended up throwing that card (an rtx 3xxx) into a dell rackmount and have been happy with that card ever since.
To your point though: the non-proprietary PSU was a nice feature, but in reality the expansion card for PCI->Thunderbolt or whichever interface you're using can be bought on Alibaba for like 20-30 bucks, and the PSU, a generic white-label 650W, is worth another 30-40 bucks. I think if I did it over I'd just do that and make an enclosure, but the Sonnet boxes aren't too bad a value by the numbers.
manmal 1 days ago [-]
Maybe I’m lacking imagination. But how will a GPU with small-ish but fast VRAM and great compute, augment a Mac with large but slow VRAM and weak compute? The interconnect isn’t powerful enough to change layers on the GPU rapidly, I guess?
zozbot234 1 days ago [-]
> But how will a GPU with small-ish but fast VRAM and great compute, augment a Mac with large but slow VRAM and weak compute?
It would work just like a discrete GPU when doing CPU+GPU inference: you'd run a few shared layers on the discrete GPU and place the rest in unified memory. You'd want to minimize CPU/GPU transfers even more than usual, since a Thunderbolt connection only gives you equivalent throughput to PCIe 4.0 x4.
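To make that split concrete, here's a rough back-of-envelope sketch. All numbers are illustrative assumptions (~5 GB/s usable Thunderbolt throughput, roughly PCIe 4.0 x4; a 7B fp16 model with ~14 GB of weights across 32 layers; hidden size 4096), showing why you pin layers on the dGPU rather than swap them per token:

```python
# Illustrative cost of a CPU+GPU split over a Thunderbolt link.
# Assumed numbers: ~5 GB/s usable throughput (roughly PCIe 4.0 x4),
# a 7B fp16 model (~14 GB of weights, 32 layers), hidden size 4096.

tb_bw = 5e9              # usable link bandwidth, bytes/s (assumption)
hidden = 4096            # activation width at the split point
act_bytes = hidden * 2   # fp16 activations crossing the link per token

act_ms = act_bytes / tb_bw * 1e3      # shipping activations once per token
layer_bytes = 14e9 / 32               # one transformer layer's weights
layer_ms = layer_bytes / tb_bw * 1e3  # re-shipping a layer's weights instead

print(f"activations per token: {act_ms:.4f} ms")  # microseconds: negligible
print(f"one layer of weights:  {layer_ms:.1f} ms")  # far too slow per token
```

The asymmetry is the whole story: per-token activation traffic across the link is microseconds, while moving even one layer's weights per token costs tens of milliseconds, so the layer placement has to be static.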
manmal 1 days ago [-]
But isn’t the Mac Mini the weak link in that scenario?
zozbot234 1 days ago [-]
It has way more unified memory than your typical dGPU.
manmal 23 hours ago [-]
Yes obviously. That VRAM is also slower and has weak compute attached. Loading to the external GPU will slow things down too much.
arjie 1 days ago [-]
My Mini is actually the smallest model so it actually has "small but slow VRAM" (haha!) so the reason I want the GPU for are the smaller Gemmas or Qwens. Realistically, I'll probably run on an RTX 6000 Pro but this might be fun for home.
GeekyBear 1 days ago [-]
We've seen many recent projects to stream models direct from SSD to a discrete GPU's limited VRAM on PCs.
How big a bottleneck is Thunderbolt 5 compared to an SSD? Is the 120 Gbps mode only available when linked to a monitor?
manmal 1 days ago [-]
That’s what, 14GB/s? The GPU‘s VRAM can do 100x that.
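That "100x" is easy to sanity-check with assumed spec-sheet figures (Thunderbolt 5's 120 Gbps asymmetric mode; ~1792 GB/s GDDR7 bandwidth on an RTX 5090):

```python
# Compare TB5's best-case link rate against a top-end card's VRAM bandwidth.
# Both figures are assumptions taken from public spec sheets.

tb5_gbs = 120 / 8   # 120 Gbps -> 15 GB/s, before protocol overhead
vram_gbs = 1792     # RTX 5090 GDDR7 memory bandwidth, GB/s

ratio = vram_gbs / tb5_gbs
print(f"TB5 best case ~{tb5_gbs:.0f} GB/s; VRAM is ~{ratio:.0f}x faster")
```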
GeekyBear 1 days ago [-]
A discrete consumer GPU card doesn't have enough fast RAM to run a very large model that hasn't been quantized to hell.
That's why all the projects streaming models into the GPU from an SSD popped up recently.
manmal 23 hours ago [-]
Yes. There’s just no way to get above 1t/s that way with a large model.
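The ceiling follows from simple arithmetic: when weights stream over a link, every generated token must touch all active weights, so tokens/s can't exceed link bandwidth divided by active bytes per token. A sketch with hypothetical numbers (14 GB/s link, a 100 GB dense model, and an MoE variant with ~10% of weights active per token):

```python
# Upper bound on decode speed when weights stream across the link per token:
#   tokens/s <= link_bandwidth / active_weight_bytes
# All numbers below are hypothetical.

link_bw = 14.0         # GB/s over the link
dense_active = 100.0   # GB touched per token, dense 100 GB model
moe_active = 10.0      # GB touched per token, MoE with ~10% experts active

dense_tps = link_bw / dense_active
moe_tps = link_bw / moe_active
print(f"dense ceiling: {dense_tps:.2f} t/s")  # well under 1 t/s
print(f"MoE ceiling:   {moe_tps:.1f} t/s")    # sparsity buys ~10x
```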
lowbloodsugar 1 days ago [-]
“Lying around”. I’ve got an unopened 5090 in a box that I know will suffer the same fate, so I’m sending it back. So privileged to have the money to impulse buy a 5090 and yet no time to actually do anything with it.
c-c-c-c-c 11 hours ago [-]
You should see his ferrari.
mlfreeman 1 days ago [-]
I followed the instructions link and read the scripts...although the TinyGPU app is not in source form on GitHub, this looks to me like the GPU is passed into the Linux VM underneath to use the real driver and then somehow passed back out to the Mac (which might be what the TinyGrad team actually got approved).
Or I could have totally misunderstood the role of Docker in this.
My read of everything is that they are using Docker for NVIDIA GPUs for the sake of "how do you compile code to target the GPU"; for AMD they're just compiling their own LLVM with the appropriate target on macOS.
MeetRickAI 20 hours ago [-]
[dead]
eoskx 1 days ago [-]
Interesting, but cannot run CUDA or more to the point `nvidia-smi`.
embedding-shape 1 days ago [-]
Well, to be fair, the whole shebang is from a completely different company, one that has its own ML library and such, so that isn't that surprising. Although I agree that some CUDA shim or similar would be a lot more interesting, still getting to the place of running inference and training with your very own library is pretty dope already.
wmf 1 days ago [-]
Pretty misleading. This driver is only for compute not graphics.
polotics 1 days ago [-]
As a sizable share of the market is going to want to use this for local LLMs, I do not think this is that misleading.
bigyabai 16 hours ago [-]
Most people I know are not using TinyGrad for inference, but CUDA or Vulkan (neither of which are provided here).
comboy 1 days ago [-]
GPUs can do graphics too?
aobdev 22 hours ago [-]
I can’t tell if you’re making a joke about the current state of AI and GPUs or refuting the purpose of this driver
manmal 1 days ago [-]
Graphics was not what came to mind when I saw the headline.
mort96 1 days ago [-]
Graphics is typically what comes to my mind when people talk about graphics processing units
manmal 12 hours ago [-]
The latest MacBook Pros don’t even need external GPUs to run AAA games.
mort96 12 hours ago [-]
What do you mean when you say "run"? Low graphics 45 FPS at 720p? Or ultra graphics 120 FPS at 4k? My assumption is that a fairly large part of that space is inaccessible with the integrated GPU.
Fnoord 1 days ago [-]
The term eGPU gives it away, but is inaccurate.
Something like eNPU or eTPU seems more appropriate here.
vondur 1 days ago [-]
If you could get Nvidia driver support on Macs, I bet Apple would have sold more Mac Pros.
ProllyInfamous 19 hours ago [-]
If unfamiliar: it is a big deal that AAPL & NVDA again have an official relationship.
For well over a decade, Apple has not allowed newer Nvidia GPUs (by not allowing drivers).
A seven-year-old GPU (e.g. Vega 64, GTX 1080 Ti) can still process more tokens/second than most Apple Silicon (particularly the lower-end chips).
As discussed elsewhere, Apple Max/Ultra processors are best suited for huge models (but are not as fast as e.g. an RTX 5090).
bigyabai 16 hours ago [-]
This is not an official relationship, this is a third-party effort by tiny corp with no Nvidia involvement.
amelius 11 hours ago [-]
These tinyboxes are so expensive (starting at $12,000), why don't they just put a CPU inside and allow users to ssh into them?
the__alchemist 1 days ago [-]
I'm writing scientific software that has components (molecular dynamics) that are much faster on GPU. I'm using CUDA only, as it's the easiest to code for. I'd assumed this meant no-go on ARM Macs. Does this news make that false?
wmf 1 days ago [-]
This driver doesn't support CUDA.
ksec 1 days ago [-]
This comment should be pinned at the top.
brcmthrowaway 23 hours ago [-]
Isn't MLX a CUDA translation layer?
ykl 22 hours ago [-]
No, MLX is nothing like a Cuda translation layer at all. It’d be more accurate to describe MLX as a NumPy translation layer; it lets you write high level code dealing with NumPy style arrays and under the hood will use a Metal GPU or CUDA GPU for execution. It doesn’t translate existing CUDA code to run on non-CUDA devices.
superb_dev 22 hours ago [-]
My understanding is that MLX is Apple’s CUDA, so a CUDA translation layer would target MLX
ykl 21 hours ago [-]
No, it’s not. MLX is Apple’s NumPy more or less.
wmf 22 hours ago [-]
Does tinygrad support MLX?
frankc 1 days ago [-]
My main thought is: would this allow me to speed up prompt processing for large MoE models? That is the real bottleneck for the M3 Ultra. The tokens per second is pretty good.
embedding-shape 1 days ago [-]
tinygrad does have pretty neat support for sharding things across various devices relatively easily, so that'd help. I'm guessing you'd hit the bandwidth ceiling transferring stuff back and forth, though.
dd_xplore 1 days ago [-]
Why does Apple need to make the drivers in a walled garden? At least they should support major device categories with official drivers.
wtallis 1 days ago [-]
Doesn't Apple support the major standard device categories: NVMe, XHCI, AHCI, and such, like most operating systems do? The challenges are all for hardware that needs a vendor-specific driver instead of conforming to a standard driver interface (which doesn't always exist). Lots of those can be supported with userspace drivers, which can be supplied by third parties instead of needing to be written by Apple.
bigyabai 20 hours ago [-]
> NVMe
Using proprietary connectors.
> XHCI
Not on Lightning.
> AHCI
How exactly would Apple not support AHCI?
wtallis 15 hours ago [-]
>> NVMe
> Using proprietary connectors.
Not for the past decade; it's been no connectors for most products, but standard PCIe connectors for the Mac Pro, and NVMe over Thunderbolt works fine.
>> XHCI
> Not on Lightning.
Again, not relevant to any recent products. And I'm pretty sure you're misunderstanding what XHCI is if you think anything with a Lightning connector is relevant here (XHCI is not USB 3.0). You can connect a Thunderbolt dock that includes an XHCI USB host controller and it works out of the box with no further driver or software support. I assume you can do the same with a USB controller card in a Mac Pro.
>> AHCI
> How exactly would Apple not support AHCI?
This might be another case of you not understanding what you're talking about and are lost in an entirely different layer of the protocol stack. Not supporting AHCI would be easy, since they're no longer selling any products that use SATA, and PCIe SSDs that use AHCI instead of NVMe died out a decade ago. But as far as I know, a SATA controller card at the far end of a Thunderbolt link or in a Mac Pro PCIe slot should still work, if the SATA controller uses AHCI instead of something proprietary as is typical for SAS controllers.
GeekyBear 1 days ago [-]
> Why does Apple need to make the drivers in a walled garden?
For the same reason that Microsoft requires Windows driver signing?
Drivers run with root permissions.
embedding-shape 1 days ago [-]
> Why does Apple need to make the drivers in a walled garden?
Isn't that the whole point of the walled garden, that they approve things? How could they aim and realize a walled garden without making things like that have to pass through them?
curt15 21 hours ago [-]
I think the OP is asking why Apple is enclosing macs in a walled garden when that concept is generally associated with iPhones, not general-purpose computers.
mschuster91 1 days ago [-]
> Why does Apple need to make the drivers in a walled garden?
Because third party drivers usually are utter dogshit. That's how Apple managed to get double the battery life time even in the Intel era over comparable Windows based offerings.
MrArthegor 1 days ago [-]
Macs and PCs are fundamentally different. Their architectures have always been distinct, though the Intel Mac era somewhat blurred the line.
The modern Mac is a Macintosh descendant, while the PC is an IBM PC descendant (technically a "PC clone", but since the IBM PC doesn't exist anymore, the clone part has been dropped).
And with Apple Silicon Macs the two are again very different: for example, Macs don't use NVMe, they use bare NAND (the controller is integrated into the SoC), and they don't use UEFI or a BIOS but a combination of Boot ROM, LLB, and iBoot.
ErenalpCet 4 hours ago [-]
yes good report
direwolf20 49 minutes ago [-]
Isn't it sad that we've ended up in a situation where we are talking about "Apple approves" rather than "someone creates"? Fuck Apple.
brcmthrowaway 1 days ago [-]
What are the limitations of USB4/Thunderbolt compared with a regular PCIe slot?
embedding-shape 1 days ago [-]
Well, for starters, PCIe 5.0 x16 would do something like 60 GB/s each way, while Thunderbolt 4 does 4 GB/s each way and TB5 does 8 GB/s each way. If you don't actually hit the bandwidth limits, it obviously matters less. Whether you'd notice a large difference depends heavily on the type of workload.
No, it does 80 Gb/s. With encoding loss it’s closer to 8GB/s
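The figures in this subthread can be cross-checked from per-lane PCIe rates (assumed round numbers after encoding overhead: ~1 GB/s per gen3 lane, ~2 per gen4, ~4 per gen5):

```python
# Approximate usable bandwidth per direction for the links discussed here.
# Per-lane figures are rounded assumptions, post-encoding.

per_lane = {"gen3": 1, "gen4": 2, "gen5": 4}  # GB/s per PCIe lane

links = {
    "PCIe 5.0 x16 (desktop slot)": per_lane["gen5"] * 16,
    "Thunderbolt 4 (PCIe 3.0 x4 tunnel)": per_lane["gen3"] * 4,
    "Thunderbolt 5 (PCIe 4.0 x4 tunnel)": per_lane["gen4"] * 4,
}
for name, gbs in links.items():
    print(f"{name}: ~{gbs} GB/s each way")
```

The desktop slot comes out roughly an order of magnitude ahead of either Thunderbolt generation, which matches the numbers quoted above.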
yonatan8070 1 days ago [-]
I can speak to my own experience, YMMV
I hooked up a Radeon RX 9060 XT to my Fedora KDE laptop (Yoga Pro 7 14ASP9) using a Razer Core X Chroma (40 Gbps), and the performance when using the eGPU was very similar to using the Radeon 880M built into the laptop's Ryzen 9 365 APU.
So at least with my setup, performance is not great at all.
On paper, TB4 is capable of pushing 5GB/s, which is somewhere between 4x and 8x of PCIe 3.0, while a 16x PCIe 4.0 link can do ~31.5GB/s.
Edit to add: the performance I measured is in gaming workloads, not compute
jasomill 21 hours ago [-]
For gaming, lots of things can affect Thunderbolt eGPU performance.
First, you need to connect the display directly to the eGPU rather than to the laptop.
Second, you need to make sure you have enough VRAM to minimize texture streaming during gameplay.
Third, you'll typically see better performance in terms of higher settings/resolutions vs higher framerates at lower settings/resolutions.
Fourth, depending on your system, you may be bottlenecked by other peripherals sharing PCH lanes with the Thunderbolt connection.
Finally, depending on the Thunderbolt version, PCIe bandwidth can be significantly lower than the advertised bandwidth of the Thunderbolt link. For example, while Thunderbolt 3 advertises 40 Gbps, and typically connects via x4 PCIe 3.0 (~32 Gbps), for whatever reason it imposes a 22 Gbps cap on PCIe data over the Thunderbolt link.
Even taking all this into account, you'll still see a significant performance drop on a current-gen GPU when running over Thunderbolt, though I'd still expect a useful performance improvement over integrated graphics in most cases (though not necessarily worth the cost of the eGPU enclosure vs just buying a cheap used minitower PC on eBay and gaming on that instead of a laptop).
justincormack 1 days ago [-]
It carries pcie, but only at x4. Thunderbolt 4 is pcie gen 3 and Thunderbolt 5 is pcie gen 4.
brcmthrowaway 22 hours ago [-]
That's poor. It's just copper, why can't it be as fast as a PCIe slot?
wtallis 5 hours ago [-]
Thunderbolt is its own protocol, electrically incompatible with PCIe. Its purpose is to encapsulate data traffic from multiple other protocols (PCIe, DisplayPort, USB) and multiplex that over the same wires. It cannot function exactly like an external PCIe port because it's solving a bigger, more complicated problem.
lowbloodsugar 19 hours ago [-]
Can I do prefill on the eGPU and the decode on the Mac?
Probably because you can't actually read anything more than the initial post without getting a login-wall: "Join X now to read replies on this post." (Not to mention "X" is a trash site now)
collabs 20 hours ago [-]
I don't go there unless I'm looking for something specific but I've found adding cancel at the end helps
It’s often a secondary/tertiary source unless you’re looking for official statements.
vegabook 23 hours ago [-]
[flagged]
yjftsjthsd-h 23 hours ago [-]
They... do? Or rather, they built a system where they don't need to; macs happily run Linux on bare metal or VMs. (Whether Linux supports Apple hardware well is another matter)
userbinator 24 hours ago [-]
[flagged]
amelius 23 hours ago [-]
You only own the hardware if you can use it as advertised even after breaking all ties with the vendor. Otherwise you bought a service not a product.
mrits 24 hours ago [-]
You aren't restricted at a hardware level.
ddtaylor 23 hours ago [-]
Apple has hardware level DRM in some of their products.
llm_nerd 23 hours ago [-]
So you're just replying to the headline, not the actual article. Useful.
Apple, just like Microsoft, has a driver signing process because drivers have basically system-wide access to a system. There is no evidence that nvidia has tried to get eGPU drivers signed for years, but now someone did and Apple signed it. So?
And you could always, precisely as the article states in the very first paragraph, disable System Integrity Protection if you want to run drivers that aren't signed.
u_fucking_dork 23 hours ago [-]
[flagged]
bigyabai 1 days ago [-]
The opportunity cost of Apple refusing to sign Nvidia's OEM AArch64 drivers is probably reaching the trillion-dollar mark, now that Nvidia and ARM have their own server hardware.
chuckadams 1 days ago [-]
Apple got out of the server game long before they adopted aarch64, so that's a trillion worth of server hardware they never would have sold anyway. And probably not actually a trillion.
bigyabai 1 days ago [-]
Apple was the only one stopping themselves from getting back in. It's not like the Mac is a trillion-dollar market segment to begin with.
QuantumNomad_ 1 days ago [-]
Almost everyone including myself had MacBook Pros at my last place of work.
If Apple was in the high-end server market, I see no reason why the company I was working for would not be running macOS on Apple hardware as servers, instead of the fleet of Linux based servers they had.
bigyabai 23 hours ago [-]
Why wait? You can go run macOS as a server right now. It will take you a few hours to get Docker working, and disable mdworker_shared() and turn off SIP, and then install a package manager/XCode utilities, and finally configure macOS to run as a headless UNIX box, but it's attainable.
Despite how easy Apple makes it, nobody is really using Macs as a server in production. Apple[0] is not using them as a server in production. They would need a radically different strategy to replace Linux, because their efforts on macOS still haven't replaced Windows.
The tradeoff vs. a physical eGPU: no Thunderbolt bandwidth ceiling or cabling, but you do need to be on the same LAN and there's ~4% overhead vs. native. Doesn't help if you need the GPU while traveling, and won't fix the physical macOS driver situation for native GPU access.
Disclosure: I work on GPU Go (tensor-fusion.ai/products/gpu-go), so I'm obviously biased toward this approach — but it genuinely is a different point in the design space from eGPU.
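The interception idea is easy to sketch: a stub receives the call on the client and forwards the name and arguments to a server that owns the GPU. Here's a toy pure-Python version of that shape — none of this is GPU Go's actual protocol, and real remote-CUDA systems interpose on the CUDA driver API at the dynamic-linker level rather than pickling Python calls:

```python
import pickle
import socket
import threading

# Toy sketch of call forwarding: the "stub" sends (name, args) over TCP,
# the server executes the named handler and returns the pickled result.

def serve_forever(sock, handlers):
    while True:
        conn, _ = sock.accept()
        with conn:
            name, args = pickle.loads(conn.recv(65536))
            conn.sendall(pickle.dumps(handlers[name](*args)))

class RemoteStub:
    """Forwards named calls to the 'GPU host' on the LAN."""
    def __init__(self, addr):
        self.addr = addr

    def call(self, name, *args):
        with socket.create_connection(self.addr) as s:
            s.sendall(pickle.dumps((name, args)))
            return pickle.loads(s.recv(65536))

# Stand-in for the remote GPU: a plain-Python matmul handler.
handlers = {"matmul": lambda a, b: [[sum(x * y for x, y in zip(row, col))
                                     for col in zip(*b)] for row in a]}

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen()
threading.Thread(target=serve_forever, args=(srv, handlers),
                 daemon=True).start()

stub = RemoteStub(srv.getsockname())
print(stub.call("matmul", [[1, 2]], [[3], [4]]))  # [[11]]
```

Pickle over raw sockets is of course not something you'd ship; the point is only that the client calls `stub.call(...)` exactly as if the work were local.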
At that point you're making more work for yourself than debugging over SSH.
Yes, for many scenarios this is "not even an academic exercise".
For a very select few applications this is Gold. Finally serious linear algebra crunch for the taking. (Without custom GPU tapeout.)
Not everything is limited by the transfer speed to/from the GPU. LLM inference, for example.
I thought Thunderbolt was like pluggable PCI? The whole point was not to limit peripherals.
[1] https://docs.tinygrad.org/tinygpu/
he actually did reply weeks later and said "i didnt realize people wanted this, my team has added them. go check now". pretty sure that was the last time nvidia drivers came to macos.
there's a lot of assumptions made with this topic, particularly the assumption that apple is blocking them. at least in my experience the opposite was true, nvidia just flat out wasn't making them. however i don't doubt the truth lies somewhere in between: nvidia and apple have a pretty much nonexistent relationship now. i dont know whats required here but i also don't doubt apple makes this experience suck butt for any interested parties.
Before that was the pre-trash can Mac Pro in 2006-2012. So that was canceled most of a decade before the 2019 model.
High bandwidth PCIe hasn’t been a thing in Apple world for most of 15 years.
Also Thunderbolt is trivially disconnected, which in many critical workflows is not a positive, but an opportunity for ill-timed interruptions. Plus I don't have to buy a fucking dongle/dock for a real goddamn slot, make room for external power supplies, etc.
Apple has a monopoly over the "M-chip" personal computer market. They have a monopoly over the iOS market with the app store. They have a monopoly over the driver market on macOS.
Like, Microsoft was found guilty of exploiting its monopoly for installing IE by default while still allowing other browser engines. On iOS, apple bundles safari by default and doesn't allow other browser engines.
If we apply the same standard that found MS a monopoly in the past, then Apple is obviously a monopoly, so at the very least I think it's fair to say that reasonable people can disagree about whether Apple is a monopoly or not.
[0]: https://en.wikipedia.org/wiki/United_States_v._Microsoft_Cor....
The relevant thing here isn't the chips, it's tying things to the chips, because those would otherwise be separate markets. If you could feasibly buy an iPhone and install Android or Lineage OS on it or use Google Play or F-Droid on iOS then no one would be saying that Apple has a monopoly on operating systems or app stores for iOS since there would actually be alternatives to theirs.
The fake alternative is that you could use a different store by buying a different phone, but this is like saying that if Toyota is the only one who can change the brake pads on a Toyota and Ford is the only one who can change the brake pads on a Ford then there is competition for "brake pads" because when your Toyota needs new brake pads you can just buy a Ford vehicle. It's obvious why this is different than anyone being able to buy third party brake pads for your Toyota from Autozone, right?
> It’s also probably relevant that MS was not selling PCs or their own hardware.
This is the thing that unambiguously should never be relevant. It can't be a real thing that you can avoid being a monopoly by owning more of the supply chain. It's like saying that Microsoft could have avoided being a monopoly by buying Intel and AMD, or buying one of them and then exterminating the other by refusing to put Windows on it. That's a preposterous perverse incentive.
Move the most important aspects of your software to hardware. Hard for MacOS, but for a Chromebook-style thing you could write the browser into its own piece of the wafer.
Google should pay me to be this evil.
So now you have a piece of silicon with a two year old version of Chrome with seventeen CVEs hard-coded into it, and still have all the same antitrust problems because the device still also has an ordinary general purpose CPU that you're still anti-competitively impeding people from using to run Firefox or Ladybird.
Microsoft was found guilty, so clearly the bar is not what you're trying to claim.
Go ahead, I'll wait.
There are plenty of Linux distributions that use immutable root volumes. They protect the user in a huge number of ways by preventing the system from getting hosed (either by accident or by malicious unauthorized users / malware). Apple made the decision to do this for their users, and it has prevented a HUGE amount of tech support calls, as well as led to millions of happy users with trouble-free computers.
It also hasn't stopped users from installing Chrome and/or Firefox on their Macs, and millions of ordinary users have.
You seem to be ignoring the part where you can't install the Chrome and/or Firefox browser engines on iOS and the apps with those names on that platform are just skins over Safari. Notice in particular that the iOS version of "Firefox" can't support extensions.
But hey, maybe some weird shit happened during the clone years that I’m not privy to.
Just an example… and yes, I know the EU ruling but it’s still fitting.
Compare the games console market. Nintendo is allowed to say you have to go through them to sell games for the Switch, ditto Microsoft with the Xbox. Sony doing the same thing with the Playstation is exactly equivalent, but they're approaching the sort of market dominance where it might soon be illegal for them (and them alone) to do that in some markets.
Copyright (e.g. over iOS) and patent (e.g. over iPhone hardware) are explicitly government-granted monopolies. Having that monopoly is allowed on purpose, but that isn't the same as it not existing, and having a government-granted monopoly and leveraging into another market are two quite distinct things.
> Compare the games console market.
Okay, all of the consoles that require you to sell through their stores shouldn't be able to do that either.
> but they're approaching the sort of market dominance where it might soon be illegal for them (and them alone) to do that in some markets.
Wait, your theory is that a console with ~50% market share has market dominance but Apple with ~60% of US phones doesn't?
The monopoly that Microsoft held was the home computer operating system market, first through DOS, then later through Windows. Holding a monopoly like that isn't illegal unto itself. What they were actually found guilty of was unfairly leveraging their monopoly on the OS market to gain the upper hand in a different market (the browser market). The subsequent range of issues we had with IE6 (compatibility, security, etc) was a result of Microsoft succeeding in achieving a monopoly on the browser market through illicit means.
Likewise, "Apple has a monopoly on the App Store" is just the same amount of nonsense. What you could argue is that Apple has a monopoly on the home computer market, or the mobile phone market, and that the way they integrate the App Store should be considered illegal leveraging of that monopoly, but that argument simply doesn't hold water — Microsoft's monopoly on the OS market at the time was pretty much incontrovertible, you simply couldn't walk into a shop and buy a computer running something else (except maybe a Mac at a more specialised place). Today, just about any shop you walk into that sells computers will probably have devices for sale running three different OSes (macOS, Windows, ChromeOS). Any phone place will have iPhones and Android devices, and probably a few more niche options. Actual market share percentage is nowhere near the high 90s that Microsoft saw in its heyday. At most, Apple is the biggest individual competitor in the market, but I don't think it holds an outright majority in any specific product class.
Mind you, I think that there is a good argument to be made that the Apple/Google duopoly on mobile devices does deserve scrutiny, but that's a very different kettle of fish.
But the M series are an Apple product line designed by Apple with an ARM license and produced on contract by TSMC for use in other Apple products.
Don’t assume the facts from another case automatically apply in other cases.
Or as Justice Jackson once put it: “Other cases presenting different allegations and different records may lead to different conclusions”
When a company is deemed an illegal monopoly, the DoJ basically becomes part of management. Antitrust settlements focus on germane elements, e.g. spin offs. But they also frequently include random terms of political convenience.
I don’t think we want a precedent where companies having a product means they have an automatic monopoly on said product.
lmao what ? the "M-chip" is literally their chip that they designed, built relationships with TSMC over and bankrolled into production to put in their products. literally hardware by apple for apple. this was a decade plus long thing in the making, this is the risk/gamble apple took and invested heavily into. that is apples innovation.

any other manuf is free to go do this themselves for their own devices, they just didn't and for the most part still don't. that just like isn't a monopoly at all, i'm amused you even got to that point in the first place. seems to carry some broad misunderstandings of what the M-series chips are or carries an assumption that cpus are supposed to be shared to any interested parties just because that was intels business model.

intel was historically slacking & their one-size-fits-most approach wasn't meeting the engineering requirements apple was after generation after generation, so apple took the cpu destiny into their own hands and made their own. if you feel like non-apple laptop chips aren't living up to that kind of perf/ppu.... well yeah you'd be right. but that's not really apples fault. that's not a monopoly thing, like at all. either laptop manufs need to go make their own chip (unlikely) or intel/qualcomm/etc need to catch up.
Intel sold chips to anyone. Anyone could make Intel computers.
Apple does not sell chips to anyone. Nobody else can make m-series computers.
Your argument is basically that Ford has a monopoly on selling mustangs because standard oil had a monopoly on selling oil.
If we have a right to repair (we broadly do not, AFAICT), then that doesn't necessarily mean that we have a right to modify and/or add new functionality.
When I repair a widget that has become broken, I merely return it to its previous non-broken state. I might also decide to upgrade it in some capacity as part of this repair process, but the act of repairing doesn't imply upgrades. At all.
> No OS provider should be allowed to dictate what software you can or not run on your own device and / or OS you have paid for.
I agree completely, but here we are anyway. We've been here for quite some time.
Apple's decision is not constrained by server logic or ballooning costs, it is entirely a client-based policy to not sign CUDA drivers.
Microsoft rewrote their Windows Phone native client to pass through Google's ads. Google still blocked it.
Was it normal behavior when Google blocked Amazon Fire devices from connecting to YouTube with a web browser during the Google/Amazon corporate spat?
To be fair, Google did back down almost immediately when the tech press picked up on it.
Not allowing a native client for your monopoly market share video service on Amazon devices while also blocking Amazon's web browser on those devices is making things a bit too obvious.
Clients are not offered at-will, they either work or they don't. Nvidia ships AArch64 UNIX drivers, Apple is the one that neglects their UNIX clients.
Google used YouTube as a weapon against both Windows Phone and devices running Amazon's Fire fork of Android.
A "monopoly" "service"? What have they monopolized, laziness? It's not the App Store, you can go replace it with DailyMotion at your earliest convenience.
You're still retreading why your original comment was not at all relevant to the critique being made. We have precedent for prosecuting monopolistic behavior in America, but it doesn't encompass services even when they're mandatory to use the client. It does have a precedent for arbitrarily preventing competitors from shipping a runtime that competes with the default OS, incidentally.
https://www.ftc.gov/advice-guidance/competition-guidance/gui...
have a product with a monopoly market share
AND
use that product as a weapon against competitors in other markets
That conduct is clearly illegal.
But exactly how does Apple have a monopoly in computers with less than 10-15% market share?
Apple doesn't have what American law sees as a monopoly market share in any market.
Be aware that other jurisdictions, like the EU, start placing restrictions on the behavior of companies with lower market share than American antitrust law requires.
If you think otherwise, make your case to Google's lawyers instead of spinning hypothetical case law.
You do own the client though. In the example upstream, the failure to support macOS clients can't be blamed on Nvidia because they already wrote AArch64 UNIX support.
This is as basic as antitrust law gets.
And your example is pretty poor, HBO doesn't have a 10th of the power of YouTube.
The political appointees (of both parties) shut that lawsuit down.
Since that’s definitely a big enough use case all on its own, I wonder if such a product should really just double down on LLMs.
folding@home and other such asynchronous "get this packet of work done and get back to me" style of operations rarely care much about latency.
Remote transcoding efforts can usually adjust whatever buffer is needed to cover huge latency gaps, and a lot of sim and render suites can do remote work regardless of machine-to-machine latency.
I just sort of figure the industry will trend more async when latency becomes a bigger issue than compute. Won't work in some places, but I think we tend to avoid thinking that way right now due to a lack of real need to do so; but latency is one of those numbers that trends down slowly.
I would definitely be into this if adding an egpu was first class supported.
Because of that, you need an apple device around to be able to deal with iMessage users.
In my bubble literally no one uses iMessage. More tech savvy use Signal/GroupMe, less tech savvy use SMS/Email. Family use Signal to chat with me, as I can steer my own family a little.
Also I sometimes open web-interface of Facebook, but any attempts to offer WhatsApp I answer "sorry no Facebook apps on my phone, no Instagram/Messenger either". Never had any issues with that. Although I heard some countries are very dependent on Facebook, so might be hard there.
By the way, I noticed it's not hard to use multiple messengers actually, sometimes it's even faster to find a contact as you always remember what app to look at in recents.
UPDATE: My point is that you can also influence your life and how people communicate with you. Up to a point of course, but it's not like you can do nothing with it.
Your green bubble? =P
My social circle is the complete opposite. We're all on iMessage (except for one group of extended family on Messenger), and we like it that way. I was the last holdout for years while I went from Android -> Windows Phone -> Android -> iPhone.
You've listed a whole bunch of alternatives available to you, but for some reason you demand that Apple change its unique offering into just another one of those for you. Why? Is that not a completely enforced monoculture?
Apple has always been off to the side, doing their own thing, and for some reason that fact utterly enrages people. They demand that Apple become just like everyone else. But we already have everyone else! And in every single field Apple is in, there is more of everyone else than there is of Apple.
Have you considered people like Apple products precisely because they're not like everything else? That making Apple indistinguishable from Facebook or Google is no victory, but a significant loss for customer choice?
Thanks to Apple co-opting phone numbers, there's literally no need to ever have iMessage for anyone
The machine I'm using now represents my choices and matches what matters to me, and works closer to perfectly than all my machines in the past
And yes, I have worked with macs, and no, the UX and the entire tyranny in the Apple ecosystem was not something I could live with
And yes, this machine is fast, predictable, a joy to work with and is a tool I control, not a tool to control me. If something happens to it, I can order the part with the same price that goes into a new machine, and keep using my laptop
Like, for phones, I want a phone which runs Linux, has NFC support, and also has iMessage so my friend who only communicates with blue-bubbles and will never message a green-bubble will still talk to me. I also want it to have regulatory approval in the country I live in so I can legally use it to make calls.
Because apple has closed the iMessage ecosystem such that a linux phone can't use it, such a device is impossible. I cannot vote for it.
As such, I will complain about every phone I own for the foreseeable future.
I actually agree with you, but I also suggest getting better friends.
And then they have iMessage, aka blue bubbles, which are kinda like Signal or Whatsapp or Telegram. Everyone in Europe uses whatsapp, and a lot of people in the US use iMessage. If you don't use whatsapp in europe, you'll have a rough time communicating with some social groups, and the same thing for iMessage in the US.
However, unlike every other messenger app I can think of, iMessage isn't cross platform.
Also unlike every other messenger I can think of, it comes installed by default and for some reason uses the same app as the SMS app, and also claims encryption but randomly switches to SMS and breaks encryption making it obviously the least secure of all the apps (and also backs up your keys to iCloud in a way apple can access them by default, neither here nor there).
Blue bubbles are when iMessage is acting as the iMessage app, and has encryption and can use features like sending high resolution photos, location, invites, and a bunch of other apple-specific features.
Green bubbles are when the iMessage app has converted itself into the SMS and RCS app, and has a reduced feature set, like being unable to remove people from group chats.
It's frankly a quite confusing decision to have two quite different apps built into the same app and indicate which feature-set is active based on the color of a UI element. I think everyone would prefer if apple split it into the 'Messages' app (SMS + RCS) and an optional 'iMessage' app which doesn't come installed by default, but you can download on the app store from Apple. I'm frankly surprised the EU hasn't forced apple to show a prompt for "default messenger app" on startup with the options being "Whatsapp", "iMessage", etc etc, like they do for default browser.
No, I don't think anyone would prefer that. People on iOS like iMessage, not SMS + RCS. Nobody is confused by it, they all know that green bubbles means you're texting someone who doesn't have an iPhone. It works seamlessly, it's just annoying when you want to have a long conversation with a friend on Android because it doesn't have any nice iMessage extras available – that's why people don't like green bubbles.
I got an eGPU back in 2018 and could never get it to work. To the point that it soured me from doing it again.
These days for heavy duty work I just offload to the cloud. This all feels like NVidia trying to be relevant versus ARM.
Except it's done by a third group, tinygrad, so it's more non-nvidia people wanting to use nvidia hardware on Apple hardware, than "nvidia trying to be relevant".
I hope it'll work on an M4 Mac Mini. Does anyone know what hardware to get? You'll need a full ATX PSU to supply power, right? And then tinygrad can do LLM inference on it?
Takes a standard PSU. However, Mac Minis don't have OCuLink. So you might be a bit limited by whatever USB-C can do.
Now if Intel can get their Arc drivers in order we'll see some real budget fun.
https://www.newegg.com/intel-arc-pro-b70-32gb-graphics-card/...
32 GB of VRAM for $1,000. Plus a $500 Mac Mini.
Article mentions: "Apple finally approved our driver for both AMD and NVIDIA"
Does not mention Intel (GPUs). Select AMD GPUs work on macOS, but...
Macs (both Intel and ARM) support Thunderbolt, but eGPUs only work on Intel Macs, and basically only with AMD cards.
Good news is that for medium-end gaming the choices are solid, and CUDA works on AMD these days.
I own one of these, the cage is just a piece of plastic. Anyway, I don't think $80 is that big of a difference here. I can't really afford a 4k Nvidia GPU. Intel is my only hope.
Brand is TH3P4G3. Egpu.io has decent eGPU comparisons.
I wouldn't want all that dust in my GPU fans, prefer that near my case fans. I also don't like it given I got cats and want to store/box hw. I do use the eGPU in the fuse box. If I had a larger house, I'd use a server rack.
I was recently in the market for an eGPU but for a different niche (not eGPU/eNPU/eTPU but getting an HBA via TB to connect an LTO-6 drive via SAS). I went for a Sonnet instead, very low profile and small. I also bought an Asus one. Slightly bigger, came with more fans, but TB4 instead of the Sonnet's TB3. The cages are aluminium. Those eGPUs were second hand (also without warranty, but quicker S&H than Chinese New Year) and came with a PSU, which matters because you otherwise have to buy a PSU separately. For me no biggie, as I got a decent PSU lying around.
One nice thing about the Sonnet eGPU boxes is that they use standard SFX PSUs that are inexpensive to replace if they fail.
For LTO, I'm cheap, and iSCSI over a dedicated 2.5 Gbps Ethernet link is fast enough for my aging FC LTO-5 drives and spinning rust backup disks.
Sleep broke across all OSes; when sleep didn't break, the GPU wouldn't get powered on with the laptop. If one side lost power during an outage (the GPU side — the laptop has a battery), it would require an elaborate voodoo ritual of cycling both of them on and off until they 'caught' each other. It would cause the rest of the USB ports on the laptop to reset and drop comms with peripherals once or twice a week, necessitating a rain-dance restart.
When OCuLink first started showing up I gave up altogether and just said "fuck it, I'll try it again in a few years."
It worked fine when it worked fine, but the patches in between were not worth my time.
I blame Dell and their thunderbolt controllers entirely for the issue, but it left such a bad taste in my mouth that I would have a really tough time buying the newest Sonnet box to try it out. Now I have a desktop machine and don't fall into that market.
I ended up throwing that card (an rtx 3xxx) into a dell rackmount and have been happy with that card ever since.
to your point though: the non proprietary PSU was a nice feature, but in reality the expansion card for PCI->Thunderbolt or whichever interface you're using can be bought on alibaba for like 20-30 bucks and the PSU is worth another 30-40 bucks , a generic white-label 650w. I think if I did it over i'd just do that and make an enclosure, but the Sonnet boxes aren't too bad a value by the numbers.
It would work just like a discrete GPU when doing CPU+GPU inference: you'd run a few shared layers on the discrete GPU and place the rest in unified memory. You'd want to minimize CPU/GPU transfers even more than usual, since a Thunderbolt connection only gives you equivalent throughput to PCIe 4.0 x4.
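A toy way to see the transfer-minimization point: count how many activation copies a given layer placement forces across the link. This is a sketch with made-up layer counts, not any framework's real scheduler:

```python
# Each adjacent pair of layers on different devices costs one activation
# transfer over the Thunderbolt link per token/batch.
def boundary_transfers(placement):
    return sum(1 for a, b in zip(placement, placement[1:]) if a != b)

contiguous = ["gpu"] * 8 + ["cpu"] * 24   # first 8 layers offloaded
interleaved = ["gpu", "cpu"] * 16          # pathological alternation

print(boundary_transfers(contiguous))   # 1
print(boundary_transfers(interleaved))  # 31
```

Contiguous placement pays for one crossing per pass regardless of how many layers are offloaded, which is why it tolerates the narrower Thunderbolt link.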
How big a bottleneck is Thunderbolt 5 compared to an SSD? Is the 120 Gbps mode only available when linked to a monitor?
That's why all the projects streaming models into the GPU from an SSD popped up recently.
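Rough numbers for the question above, raw line rate only — and as I read the TB5 spec, the asymmetric 120/40 Gbps "Bandwidth Boost" mode is aimed at display bandwidth, so I wouldn't count on it for PCIe data:

```python
# Thunderbolt line rates in GB/s (8 bits/byte; ignores protocol overhead).
def gbps_to_gb_s(gbps):
    return gbps / 8

print(gbps_to_gb_s(80))   # TB5 symmetric: 10.0 GB/s each direction
print(gbps_to_gb_s(120))  # TB5 asymmetric boost: 15.0 GB/s one way
print(gbps_to_gb_s(40))   # TB3/TB4 for comparison: 5.0 GB/s
# Fast PCIe 5.0 x4 SSDs read at roughly 14 GB/s, so on paper TB5 is in
# the same ballpark as fast local storage.
```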
Or I could have totally misunderstood the role of Docker in this.
My read of everything is that they are using Docker for NVIDIA GPUs for the sake of "how do you compile code to target the GPU"; for AMD they're just compiling their own LLVM with the appropriate target on macOS.
Something like eNPU or eTPU seems more appropriate here.
For well over a decade, Apple has not allowed newer Nvidia GPUs (by not signing drivers).
A seven-year-old GPU (e.g. Vega 64, GTX 1080 Ti) can still process more tokens/second than most Apple Silicon (particularly the lower-end parts).
As discussed elsewhere, Apple Max/Ultra processors are best suited for huge models (but are not as fast as e.g. an RTX 5090).
Using proprietary connectors.
> XHCI
Not on Lightning.
> AHCI
How exactly would Apple not support AHCI?
> Using proprietary connectors.
Not for the past decade; it's been no connectors for most products, but standard PCIe connectors for the Mac Pro, and NVMe over Thunderbolt works fine.
>> XHCI
> Not on Lightning.
Again, not relevant to any recent products. And I'm pretty sure you're misunderstanding what XHCI is if you think anything with a Lightning connector is relevant here (XHCI is not USB 3.0). You can connect a Thunderbolt dock that includes an XHCI USB host controller and it works out of the box with no further driver or software support. I assume you can do the same with a USB controller card in a Mac Pro.
>> AHCI
> How exactly would Apple not support AHCI?
This might be another case of you not understanding what you're talking about and are lost in an entirely different layer of the protocol stack. Not supporting AHCI would be easy, since they're no longer selling any products that use SATA, and PCIe SSDs that use AHCI instead of NVMe died out a decade ago. But as far as I know, a SATA controller card at the far end of a Thunderbolt link or in a Mac Pro PCIe slot should still work, if the SATA controller uses AHCI instead of something proprietary as is typical for SAS controllers.
For the same reason that Microsoft requires Windows driver signing?
Drivers run with root permissions.
Isn't that the whole point of the walled garden, that they approve things? How could they aim and realize a walled garden without making things like that have to pass through them?
Because third party drivers usually are utter dogshit. That's how Apple managed to get double the battery life even in the Intel era over comparable Windows based offerings.
Modern Macs are Macintosh descendants, and by contrast PCs are IBM PC descendants (their real name is technically "PC clone", but because the IBM PC doesn't exist anymore the clone part has been dropped).
And with Apple Silicon Macs the two are again very different: for example, Macs don't use NVMe drives, they use bare NAND (the controller is integrated into the SoC), and they don't use UEFI or a BIOS but a combination of Boot ROM, LLB and iBoot.
https://www.convertunits.com/from/Gbps/to/GB/s
I hooked up a Radeon RX 9060 XT to my Fedora KDE laptop (Yoga Pro 7 14ASP9) using a Razer Core X Chroma (40Gbps), and the performance when using the eGPU was very similar to using the Radeon 880M built into the laptop's Ryzen 9 365 APU.
So at least with my setup, performance is not great at all.
On paper, TB4 is capable of pushing 5GB/s, which is somewhere between 4x and 8x of PCIe 3.0, while a 16x PCIe 4.0 link can do ~31.5GB/s.
For numbers about all PCIe generations and lane counts, see the "History and revisions" section here: https://en.wikipedia.org/wiki/PCI_Express
Edit to add: the performance I measured is in gaming workloads, not compute
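Pulling the relevant per-lane figures from that table (usable throughput after line coding; treat the numbers as approximations):

```python
# Approximate usable PCIe throughput per lane, in GB/s, by generation.
PER_LANE_GB_S = {3: 0.985, 4: 1.969, 5: 3.938}

def link_gb_s(gen, lanes):
    return PER_LANE_GB_S[gen] * lanes

print(round(link_gb_s(4, 16), 1))  # x16 PCIe 4.0: ~31.5 GB/s
print(round(link_gb_s(3, 4), 2))   # x4 PCIe 3.0 (typical TB tunnel): ~3.94 GB/s
```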
First, you need to connect the display directly to the eGPU rather than to the laptop.
Second, you need to make sure you have enough VRAM to minimize texture streaming during gameplay.
Third, you'll typically see better performance in terms of higher settings/resolutions vs higher framerates at lower settings/resolutions.
Fourth, depending on your system, you may be bottlenecked by other peripherals sharing PCH lanes with the Thunderbolt connection.
Finally, depending on the Thunderbolt version, PCIe bandwidth can be significantly lower than the advertised bandwidth of the Thunderbolt link. For example, while Thunderbolt 3 advertises 40 Gbps, and typically connects via x4 PCIe 3.0 (~32 Gbps), for whatever reason it imposes a 22 Gbps cap on PCIe data over the Thunderbolt link.
Even taking all this into account, you'll still see a significant performance drop on a current-gen GPU when running over Thunderbolt, though I'd still expect a useful performance improvement over integrated graphics in most cases (though not necessarily worth the cost of the eGPU enclosure vs just buying a cheap used minitower PC on eBay and gaming on that instead of a laptop).
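Putting that Thunderbolt 3 cap in bytes (just arithmetic on the figures given above):

```python
# TB3's ~22 Gbps cap on tunneled PCIe data vs. a native x4 PCIe 3.0 link.
tb3_pcie_cap_gbps = 22
native_x4_gen3_gbps = 32

print(tb3_pcie_cap_gbps / 8)                    # 2.75 GB/s usable over TB3
print(tb3_pcie_cap_gbps / native_x4_gen3_gbps)  # 0.6875, i.e. ~69% of native
```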
https://xcancel.com/__tinygrad__/status/2039213719155310736
Redirect: https://x.com/*
to: https://xcancel.com/$1
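That rule, expressed as a regex substitution — the same pattern/capture shape browser redirect extensions use (the function name here is just for illustration):

```python
import re

def to_xcancel(url):
    # x.com/* -> xcancel.com/$1: capture everything after the host.
    return re.sub(r"^https://x\.com/(.*)$", r"https://xcancel.com/\1", url)

print(to_xcancel("https://x.com/__tinygrad__/status/2039213719155310736"))
# https://xcancel.com/__tinygrad__/status/2039213719155310736
```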
Apple, just like Microsoft, has a driver signing process because drivers have basically system-wide access to a system. There is no evidence that nvidia has tried to get eGPU drivers signed for years, but now someone did and Apple signed it. So?
And you could always, precisely as the article states in the very first paragraph, disable System Integrity Protection if you want to run drivers that aren't signed.
[0] https://9to5mac.com/2026/03/02/some-apple-ai-servers-are-rep...