The thing that has been bothering me for a while is that the USB spec allows for software detection of capabilities. You can read the eMarker data and see the supported protocols, speeds, voltages, etc.
But there is no standard for USB controllers to present this data to the OS, so it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
Apple seems best able to do this since they control the hardware and OS, yet they aren’t doing it either. Users are just left to be confused about why things are slow.
avian 1 days ago [-]
> In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
I'm pretty sure my old Dell XPS laptop with Windows 10 had pop-ups just like this.
"This device can run faster" or something.
Vogtinator 1 days ago [-]
AFAIK that's just when plugging in a USB 3 device into a USB 2 port or using a USB 2 cable.
mrandish 1 days ago [-]
> that's just when plugging in a USB 3 device into a USB 2 port
Dell XPS laptops (and some others) can also warn if the charger isn't providing the full wattage the laptop is rated for. This warning is an option that can be turned off in the BIOS settings.
I usually turn it off because I sometimes intentionally do day trips with a smaller/lighter portable charger which delivers 45w to my laptop which can need up to 65w due to having a discrete GPU. However, 45w is more than sufficient to charge the laptop during normal use on the Balanced power plan with iGPU. I only need more than 45w when gaming with the discrete GPU active.
sokoloff 1 days ago [-]
Just this morning, my old Latitude failed to boot with a “this charger is only giving 20W and that’s not enough to boot this laptop” error. (I was testing a new USB-C charger that’s obviously going back.)
Weirdest part was that it was 100% charged, so it could have booted with 0 watts from the charger, but it decided not to boot with 20 watts more.
mrandish 1 days ago [-]
Oh, refusing to boot at all is evil. I've never seen that.
Sure, you or I would just unplug the charger and run on battery but bad UX decisions like that generate a support call to me from my 95 yr old mom. It should not only warn and continue to boot, it should use whatever power is on offer to reduce the rate of battery drain.
koyote 22 hours ago [-]
Interesting that it refused to boot.
If I have a lower-wattage charger connected on boot, it shows me that information, but I can just press Enter to continue. It's just a warning.
Maybe it's a BIOS setting?
Workaround is of course to boot without a charger connected and then connect it later :)
mleo 1 days ago [-]
My wife's work laptop gives this stupid warning anytime any USB-C charger other than the Dell brick is plugged in. So even a dock delivering 100 W gets a complaint. The Dell brick offers non-standard charging at 140 W, so it can't be replaced by standards-compliant, smaller chargers.
imglorp 1 days ago [-]
I wonder if it's possible for a regular machine with two high speed ports to do a cable test by itself. Maybe it can't test all the attributes but could it at least verify speed claims in software?
mrandish 1 days ago [-]
Apparently the USB driver stack doesn't report the cable's eMarker chip data back to the OS. However, benchmarking actual transfer throughput is the ultimate test for data connections (vs. charging use cases). Unfortunately, TFA doesn't really go into this aspect of cable testing, as the tester seems to only report eMarker data, which pins are connected, and copper resistance.
Since a >$1,000 automated lab cable throughput tester is overkill, my thumbnail test for high-speed USB-C data cables is to run a disk speed benchmark to a very fast, well-characterized external NVMe enclosure with a known-fast NVMe drive. I know what the throughput should be based on prior tests with an $80 active 1M Thunderbolt cable made for high-end USB-C docks and confirmed by online benchmark reviews from credible sources.
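For anyone who wants to script that thumbnail test, here's a minimal sketch. The function name, target path, and transfer sizes are assumptions; point it at your external NVMe enclosure's mount:

```python
import os
import time

def write_throughput_mb_s(path, total_mb=256, chunk_mb=8):
    """Time sequential writes to `path` and return throughput in MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # push data to the drive, not the page cache
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

# Hypothetical mount point for the external enclosure:
# print(write_throughput_mb_s("/Volumes/FastNVMe/bench.bin"))
```

Compare the number against a baseline measured with a known-good cable: a 10 Gbps link with a fast NVMe enclosure should sustain somewhere near 900 MB/s, so a cable that silently negotiated USB 2.0 shows up immediately at roughly 40 MB/s.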
Gigachad 1 days ago [-]
There would be too many factors involved for a proper test. Many laptop USB controllers would probably not even have the capacity to run two ports at full speed simultaneously.
LoganDark 1 days ago [-]
Even Apple now has one of those, when you plug something into the USB 2 port on the MacBook Neo.
Gigachad 1 days ago [-]
There’s still nothing when you plug a usb3 device in using a usb2 cable.
colechristensen 1 days ago [-]
I strongly suspected my old xps had nonstandard things going on with its USB C charger
Barbing 1 days ago [-]
Perhaps someday it will earn the same level of importance as charging; iOS 26 calls out slow chargers on their iPhones, so you can run to the Apple Store and buy a fast one!
They probably have to weigh potential new hardware sales against added complexity. I have counterpoints too but: I believe they try to protect users’ mental models of their ecosystem (which perhaps I appreciate when I don’t notice, and can’t stand when something is uncustomizable). Like there are enough variables they don’t trust us with as it is.
tredre3 1 days ago [-]
> iOS 26 calls out slow chargers on their iPhones, so you can run to the Apple Store and buy a fast one!
You jest, but that notification (it's been a thing on Android for at least 8 years, and on ThinkPads for at least 10) has been very helpful to me. Sometimes the negotiation just fails, and being told is helpful. Sometimes the charger lies about its specs, and once again it's helpful to have a hint rather than expect everybody to systematically have USB testers on hand.
s3p 24 hours ago [-]
This one is pretty simple to do. It requests a voltage and then starts pulling current and monitors the voltage as it increases its current draw. If the voltage goes down, alert the user.
With data speed I think it could be a little more complicated. Like OP was saying, it would need access to some level of hardware information where it can see which pins are used by the cable, since the connection 'speed' is still variable even when you DO have a supported cable.
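The charger check described above boils down to watching for voltage sag as the current draw ramps up. A minimal sketch of that heuristic (the 5% tolerance and the (amps, volts) sample format are assumptions, not taken from any real PD stack):

```python
def sagging(samples, tolerance=0.95):
    """samples: (current_draw_amps, measured_volts) pairs taken while the
    sink ramps up its draw. Flags a sag when the voltage at the heaviest
    draw drops below `tolerance` of the voltage at the lightest draw."""
    ordered = sorted(samples)  # ascending by current draw
    v_idle, v_loaded = ordered[0][1], ordered[-1][1]
    return v_loaded < tolerance * v_idle

sagging([(0.5, 9.0), (1.5, 8.95), (3.0, 8.9)])  # holds 9 V under load: False
sagging([(0.5, 9.0), (1.5, 8.4), (3.0, 7.6)])   # droops under load: True
```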
Barbing 23 hours ago [-]
I do jest, it’s a great feature. I never considered charger negotiation failure!
EnnEmmEss 13 hours ago [-]
It's fun to see the Treedix tester come up on HN. I got one a few months back and have quite enjoyed using it. One thing which I did find interesting was that one of the cables had eMarker data that lied. IIRC, the eMarker data suggested it supported much higher speeds and wattage than it did. Fortunately, the other testing screens successfully detected it only had USB 2.0 wires even though it claimed to support 40 Gb/s.
vladvasiliu 1 days ago [-]
> But there is no standard for USB controllers to present this data to the OS, so it's stuck in the low-level firmware and never passed up. In theory we could have a popup box that tells you that both your computer and other device support higher speeds/more power, but your cable is limiting it.
There is. I used to use a KVM with USB 2 ports connected to my PC's USB 3 port, to which I connected a monitor with integrated USB 3 hub to drive my keyboard and mouse. Windows would show a popup every time telling me that I should use a faster cable.
There are also popups telling me that my laptop is connected to a "slow" usb-c charger.
bdavbdav 1 days ago [-]
That’s quite a simplistic one unfortunately - USB 2 and 3 use different controllers in the PC, which it can indeed detect. The sub-flavours of 3/4 less so.
wholinator2 1 days ago [-]
I've used all manner of archaic USB cables for data transfer when in a pinch and Windows has never shown me anything at all. Could it be the external device you were connecting to triggering the Windows notification?
Forgeties79 1 days ago [-]
I have seen these kinds of notifications on occasion but they are far from the norm.
prism56 14 hours ago [-]
Android must have this in some form. My Pixel phone with a third-party app can show the voltage and current mode selected via the PD protocols.
Using DevCheck might show 2.2A/9V as an example.
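The displayed mode maps directly to the negotiated power. A tiny sketch of the arithmetic (the 'A/V' string format just mirrors what DevCheck shows; the parser itself is hypothetical):

```python
def pd_watts(mode: str) -> float:
    """Convert a 'current/voltage' string like '2.2A/9V' to watts."""
    amps_s, volts_s = mode.split("/")
    return float(amps_s.rstrip("aA")) * float(volts_s.rstrip("vV"))

pd_watts("2.2A/9V")  # ≈ 19.8 W, a typical mid-tier PD mode
pd_watts("3A/20V")   # 60.0 W, the ceiling for cables without an eMarker
```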
QuantumNomad_ 1 days ago [-]
On iPhone, when connecting an external MIDI device via USB, the phone told me that the device was drawing too much power and would be disabled.
I don’t know if they check that via USB protocol, or if they are measuring the actual power draw on the USB port.
In order to use the device, I had to connect it via an externally powered USB hub.
graemep 1 days ago [-]
I suspect most users do not even realise things are slow.
TeMPOraL 1 days ago [-]
Oh, they very much do. But like with everything in technology, they can do fuck all about it, so they resign and maybe complain to you occasionally if you're the designated (in)voluntary tech support person for your family and friends.
Regular people hate technology, both for how magical and how badly broken it is, but they've long learned they're powerless to change it - nobody listens to their complaints, and the whole market is supply-driven, i.e. you get to choose from what vendors graciously put on the market, not from what the space of possible devices.
ChrisMarshallNY 1 days ago [-]
They also tend to hate technology, because us nerds are often unbearable.
They hate having to go through people that get them upset, in order to use their kit.
Not just tech (although it’s more prevalent). People who are “handy” can also be that way (but, for some reason, techies tend to be more abrasive).
I’ve learned the utility of being patient, and not showing the exasperation that is often boiling inside of me.
tomcam 1 days ago [-]
Amen. I couldn’t have said it better.
In general for the 40+ years I’ve been a programmer I have detested the practice of not surfacing diagnostic information to users when technology makes it possible to do so in a clear and unambiguous way.
graemep 1 days ago [-]
Most users tend to ignore diagnostic information.
"What did the error message say"
"I don't know."
Telaneo 1 days ago [-]
This is because error messages have historically been bad, unintelligible, un-actionable, and hard to separate from soft errors that don't actually matter.
'Segmentation fault. Core dumped.'
'Non-fatal error detected. Contact support.'
'An error occurred.'
'An illegal operation was performed.'
'Error 92: Insufficient marmalade.'
'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?'
'Saving as .docx is not recommended because blah-blah-blah never gonna give you up nor let you down.'
I can't blame any normal user for either not understanding or not giving a shit about any of these. If we'd given users actionable information from day 1, we'd be in a very different world. Even just 'Error 852: Couldn't reach the network. Check your connection to the internet.' does help those who haven't turned off their brains entirely yet.
ben_w 24 hours ago [-]
30 or so years back, one of the Mac magazines had a customer support quote along these lines:
"I don't understand, it says 'System Error Type 11', and no matter how many times I type 11, nothing happens!"
Telaneo 23 hours ago [-]
Now imagine if that error said 'Error 11: A memory error occurred. Your program may be faulty or misbehaving. Contact your software vendor.' That's miles better than what most things provide.
That one's a good example of why these things are hard. The user could have been running 5 different programs, any one of which caused this error, and MacOS can't point the finger at anyone. Not to mention that the problem could be MacOS itself, or the user being a dunce who misconfigured something. I'm not sure if that error can occur without 3rd party software being involved, but if it can, then that error message might need to be even more vague, helping the user even less. Not to mention it could just be faulty hardware.
A paper manual offering troubleshooting steps for each error would be really helpful. Just 'Error 11. Consult your manual.' and the manual actually telling you what the problem could be is also miles better than what we usually get.
TeMPOraL 19 hours ago [-]
> The user could have been running 5 different programs, any one of which caused this error, and MacOS can't point the finger at anyone.
It's still an example of why it's worth giving your users a fighting chance. MacOS may not know enough to point the finger at anyone, but the user knows what they were doing at that moment, and even if they were not paying attention, they might start now. They'll realize if something is off. Or, after the 10th time they get this error, they'll connect the dots and realize it's always happening when application X is running and they try to launch Y.
Or maybe sometimes they won't. Maybe they'll form a story and maybe it'll be all bullshit, or maybe good enough. Either way, the important part is, the user retains agency in the process. Giving people information is how they can become self-sufficient users and trust technology more.
ben_w 23 hours ago [-]
This was 30 years ago, it was Mac OS classic with co-operative multitasking and zero inter-process memory protection, when the error comes up the only option was "restart" (the computer, not the task).
Telaneo 23 hours ago [-]
I know.
stoneman24 23 hours ago [-]
The author Terry Pratchett had some of the best error messages in his Discworld novels. The Hex computer could produce the following:
++?????++ Out of Cheese Error. Redo From Start.
+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
+++Whoops! Here comes the cheese! +++
bsder 20 hours ago [-]
Don't blame this one on programming techies. This one is ALL the fault of shitty UI designers abusing modal dialog boxes.
A modal dialog is supposed to be for something damn near irreversible--like about to blow away your application because of error. You are supposed to STOP and go get the guru or you are about to lose, badly.
Unfortunately, UI designers throw them up for everything and people get used to simply clicking "OK" to make them go away so that they can get back to doing their task. So, when the user gets an actual error, they've already blown away the dialog box with information.
Your 'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?' line is a horrifically excellent example. That is a standard "Save As..." response, and it should NEVER have been. That should have always been under "Export..." as saving should never throw away information and it would be perfectly fine to regenerate a JPG as long as you have the full information still available in the original file.
This is the stuff that infuriates me about the UI designers. Your job is about interactions, first, and pixels, second.
pseudohadamard 16 hours ago [-]
> This is because error messages have historically been bad, unintelligible, un-actionable, and hard to separate from soft errors that don't actually matter.
And they've only gotten worse: "Something went wrong". Well no shit, Sherlock, I can tell something went wrong because the thing I tried to do didn't work. Possibly the single most useless error message ever created, and it's everywhere. Most of the worst-case error messages in the quoted response are still better than this one.
If you ever run into a developer who thinks "something went wrong" is an appropriate error message, have them killed. Then kill their entire family and pets, burn their house down, and plough salt into the ground where it stood. Finally, put up a sign that says "The person who used to live here thought 'something went wrong' is an appropriate error message to display when something goes wrong. Take note of their current situation when you next add an error message to your software".
dijit 1 days ago [-]
I had a programmer pushing multi-gig packages to a Meta Quest 3; and it was taking around a minute. He didn’t even think that it could be faster because he assumed the Quest or software was slow and didn’t check.
I implored him to try a different cable (after checking cables with the Treedix mentioned in TFA), and the copy went from taking over a minute to about 13s.
It's not just normal people who are confused.
bdavbdav 1 days ago [-]
I find some programmers (and this is presumably true of any industry) very narrow in their expertise within technology.
lpcvoid 24 hours ago [-]
Yeah, most programmers are not curious hackers anymore. They are 9-5 white collar workers with hobbies far outside of programming, systems, hardware, etc. It shows very much as soon as you meet one of them. But, like you said, this is true of any industry.
Oh, and pointy jab: these folks are also, in my opinion/experience, the most eager to vibecode shit. Make of that what you will.
ben_w 24 hours ago [-]
"anymore"? Over a decade ago, a coworker had a path for updating some app's files to a database, and it was taking something like 10 minutes on certain test inputs.
Swore blind it couldn't be improved.
By next morning's stand-up, I'd found it was doing something pointless, confirmed with the CTO that the thing it was doing was genuinely pointless and I'd not missed anything surprising, removed the pointless thing, and gotten the 10 minutes down to 200 milliseconds.
I'm not sure if you're right or wrong about the correlation with vibe-coding here, but I will say that co-workers's code was significantly worse than Claude on the one hand, and that on the other I have managed to convince Codex to recompute an Isochrone map of Berlin at 13 fps in a web browser.
lpcvoid 24 hours ago [-]
I do feel like the industry has taken a nosedive quality wise over covid in particular. Lots of new people only in tech for the money, no deep idea about computers.
But I know stories like yours from a decade past as well. A tale old as time, but compounding in recent years - IMHO.
I blame it on "software eating the world" (in general) - at some point, about two decades ago, it started to become obvious to everyone that programming is the golden ticket to life - an easy desk job paying stupid amounts of money, with no barriers to entry. So very quickly the pool of students, and then employees, became dominated by people who joined in for the pay, not because of interest in technology itself.
I think you are right, but I think what I said is also true.
People will notice some things. For example, with USB if they are using it for local backup they might notice, but with a lot of devices they will not. When they do notice, they will feel powerless.
Even if we had a wider choice, they are not well placed to pick products. There is no way they will know about details of things such as USB issues (a cable is slow, the device will not tell you if it is) at the time of purchase.
Forgeties79 1 days ago [-]
I think any of us just have to look at how many people ask us for recommendations on basic things like docks and adaptors to see how common this is. On top of that you can’t even trust what’s on the tin sometimes.
Gigachad 1 days ago [-]
This is true of basically everything. Even with trivial home maintenance, people will just put up with things being broken most of the time rather than learning how to fix them.
encom 24 hours ago [-]
I've lived in this apartment for about a year and a half. It took me until last week to put up lights over the stairs. I've been walking up the stairs in the dark, sometimes using my phone as a light.
I'm an electrician.
Telaneo 23 hours ago [-]
Physician, heal thyself. The cobbler's children have no shoes.
pseudohadamard 16 hours ago [-]
Does it matter, for anyone other than hardcore geeks? All the OS would care about is how much power can it deliver and what data speed it provides, not whether the exception handling on page 4,096, section 4(a)2.1, paragraph 4 of the spec, has been implemented.
dijit 1 days ago [-]
I actually purchased one of these when this article surfaced before.
It's well worth the hype. I used it to audit all my cables (both for home and work), and it's amazing how many thick and unwieldy cables are actually terrible for data.
For example, I purchased a pair of B&W Px8 S2 noise-cancelling headphones, which boast a built-in DAC if you connect via USB-C directly. The cable they came with was thick but only rated for USB 2.0 speeds. These headphones cost more than AirPods Max, which are themselves considered overpriced, and include comforts like nappa leather; so shipping with a chunky cable that doesn't even carry decent data feels like a bizarre oversight. Apple's own USB-C cables manage the same power delivery at less than half the thickness with a woven shell. You'd assume a premium product would at least match that.
Honourable mention to the USB-C cables that ship with Dell Ultrasharp monitors (both pre-USB4 and post). Those support basically everything except Thunderbolt 4 despite being unmarked.
Aurornis 1 days ago [-]
> so shipping with a chunky cable that doesn’t even carry decent data feels like a bizarre oversight.
USB 2.0 can support up to 480 Mbps. It’s more than fast enough for any audio stream you can send to a DAC.
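Back-of-the-envelope numbers bear this out; even absurdly over-specified uncompressed PCM doesn't come close to USB 2.0's signaling rate (the parameter choices below are illustrative):

```python
def pcm_bitrate_mbps(channels=2, bit_depth=24, sample_rate_hz=96_000):
    """Raw bitrate of an uncompressed PCM audio stream, in Mbps."""
    return channels * bit_depth * sample_rate_hz / 1e6

pcm_bitrate_mbps()                # 4.608 Mbps for 24-bit/96 kHz stereo
pcm_bitrate_mbps(2, 32, 384_000)  # 24.576 Mbps, still ~5% of 480 Mbps
```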
Your headphones don’t need USB 3.0 5 Gbps speeds. USB 3 requires extra wires with different properties that need to be controlled more tightly, which can impact cable flexibility. If your headphones used USB 3 when they didn’t need it that would be one more thing to break and more failure modes for the cable.
A USB 2 cable with fewer conductors was the right choice for this product. The fact that you only got miffed about it when plugging the cable into a tester, not from actually using the product or cable, is good evidence that a USB 3 cable wasn’t needed.
dijit 1 days ago [-]
Nobody said the headphones needed USB 3. The point is that the cable is physically thick and rigid (like something you'd expect to carry serious data) but doesn't. Meanwhile Apple ships a thinner, more flexible cable that supports the same USB 2.0 speeds and equivalent power delivery. The cable B&W chose is worse ergonomically for no functional benefit. That's the kind of mismatch the Treedix exposes.
goodmythical 1 days ago [-]
For headphones? I'll take the more robust cable any day. The thinner it is the sooner I'm going to have to replace it.
Aurornis 1 days ago [-]
USB 3 doesn’t just mean “higher quality wire”. It requires additional data pairs that take up space and add cost.
Apple’s iPhone cables are not known for their durability. They serve a mostly stationary purpose, unlike headphones you wear on your head.
lozenge 1 days ago [-]
A thick cable tangles less easily.
joshvm 1 days ago [-]
I started buying Belkin TB5 cables, which are around $50 a pop. They can easily power a laptop at full load and can stream video at any reasonable resolution/framerate I might need. I've yet to find a need for an NVMe faster than 20 Gbps, never mind finding USB4 enclosures or using the cable's full 80 Gbps rating. They're also not nearly as chunky as the Dell cables, which are good but seem to have very rigid shielding.
I keep a few converters for older devices and servers that don't have (m)any C ports, but as far as a consumer "forever cable" goes, TB5 feels close. Certainly the cable's bandwidth is beyond what most people need, unless you're editing 8k video or continually shuffling hundreds of GBs between external disks.
dijit 1 days ago [-]
I do something largely similar.
It alleviates the anxiety of not knowing which cable does what.
I use Apple's Thunderbolt 4 or USB-C cables exclusively: if it's white it's for charging and low data; if it's black it's for high data.
I've been doing this for a few years, but it's really costly as those Apple Thunderbolt cables are crazy expensive.
Barbing 20 hours ago [-]
I wish the USB spec had mandated labeling. There must be a lovely label printer they make for cables or something that shouldn’t be too expensive these days. Label a few cables a day, finish the whole house in a jiffy.
LTT or another big YouTuber made a cable and made sure to get it labeled. Also complained how difficult it was to find a supplier willing to make a better cable than usual.
vladvasiliu 23 hours ago [-]
> Honourable mention to the USB-C cables that ship with Dell Ultrasharp monitors (both pre-USB4 and post). Those support basically everything except Thunderbolt 4 despite being unmarked.
I have one of those. They are thick and unwieldy af. Since I borked the USB connection on my monitor with a static discharge, I no longer use it, and I figured I'd repurpose it for my digital camera, for which I used to have a short cable that was sometimes annoying. This cable is so freaking thick and stiff that it'll move my (admittedly somewhat light) camera on the table.
fwip 1 days ago [-]
Is there any audio you might play that doesn't fit in 400Mbps?
dijit 1 days ago [-]
The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.
It’s rigid and thick, like a Thunderbolt 3 cable, yet only supports USB 2.0 speeds and fast charging for a device that doesn’t need fast charging.
Compare that to Apple’s iPhone USB-C cable which is thin, flexible, and supports the same features.
That matters because someone might grab that cable assuming it’s a “better cable”: it came with a £629 product, it’s thick and feels serious, so surely it’s capable. But it isn’t. And there’s nothing marked on it to tell you otherwise.
The whole system ends up relying on presumption, which is exactly the problem the device in the article is solving.
Aurornis 1 days ago [-]
> The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.
The purpose of the heavy construction is to make it durable, not to carry 5 Gbps data streams to your headphones.
Unlike most USB peripherals like your printer and keyboard that get plugged in and then don’t move around, headphone cables go to your head and move around constantly. They can get pinched in drawers or snagged on corners.
Hence the more durable construction.
dijit 1 days ago [-]
Apple's woven USB-C cable gets dragged around with iPhones, iPads and laptops daily and manages durability at half the thickness. Durability doesn't require rigidity... in fact for a headphone cable, rigidity is the opposite of what you want. Stiff cables tug on the headphones and transmit mechanical noise.
Aurornis 1 days ago [-]
You don’t wear your iPhone or iPad on your head with the cable plugged in all day like you do with headphones.
Apple’s USB iPhone cables wearing out prematurely is so common it’s a meme.
goodmythical 1 days ago [-]
Not sure why you're being downvoted.
Maybe Apple's changed their cables recently, but the fragility is the reason I avoid Apple cables.
Especially in headphones. The number of times those broke during a bike ride or run was way too high for me to keep wasting money on them, knowing full well they weren't going to last more than a few months, just like every other Apple headphone I've ever had.
It’s common to add weights to headphones to make them feel premium which is bizarre since actually premium headphones tend to try very hard to reduce weight as the weight makes them more uncomfortable.
I don’t know how to fix the market especially when consumers keep rewarding these practices, and I think the effectiveness of TikTok style influencer marketing will make it worse.
dijit 1 days ago [-]
I don’t think that’s what’s happening here. B&W actually reduced the weight on the Px8 S2 compared to the original, and the headphones themselves are genuinely lightweight for what they are. The cable isn’t thick to “feel premium” (it feels kinda bad); it’s thick because it’s rated for 65W+ power delivery that the headphones don’t need.
The problem is the opposite of what you’re describing, it’s not a cynical design choice, it’s a lazy one. They probably just purchased a cable for capabilities irrelevant to the product and the result is worse ergonomics and misleading physical cues about what the cable can actually do.
cjbgkagh 1 days ago [-]
“I don’t think..” Ok, you’ve made a number of assumptions and we don’t share the same priors so I’m unable to follow you to your conclusion.
I think you are underestimating the importance of perceived premium combined with the pressures of cost accounting, but I do think that is pretty normal for ‘audiophiles’ which is their target market.
dijit 1 days ago [-]
Which assumptions? The weight reduction on the S2 is documented and the cable’s 65W rating is what the tester confirmed.
If the argument is that B&W deliberately chose a thick cable to seem premium, it doesn’t square with them actively slimming down the headphones. B&W are primarily a speaker company, their USB-C product range is basically just a few headphones and earbuds.
More likely they just sourced a generic cable that happened to support high wattage and didn’t think about the mismatch.
Either way, we’re deep in the weeds on B&W’s cable procurement now. The root point is that USB-C is a mess. You can’t tell what a cable supports by looking at it, and even premium manufacturers are shipping cables that don’t do what you’d reasonably expect.
That’s exactly the problem the Treedix from the article solves.
cjbgkagh 1 days ago [-]
My point on weight was that in that market it is common, which is probably a stronger statement than needed. I should have made the weaker argument and said the market exists, which only needs one example. The company Beats can serve as that example: this company sells the majority of premium headphones, but I don't actually know what percentage have weights placed in them. I am assuming a non-trivial percentage.
You are using circular reasoning in your logic, you assume the premise is true and from there you derive your evidence.
I would contend that someone thought about it and decided to go with the cheaper option because they could get away with it. I would consider my assumption to have more grounding given my experience with manufacturing and cost accounting.
dijit 1 days ago [-]
You’ve gone from “companies add weight to feel premium” to “they went with the cheaper option because they could get away with it.” Those are opposite explanations. But either way, the cable doesn’t do what its physical presence suggests, nothing on it tells you otherwise, and that’s the entire point of the device in the article.
cjbgkagh 1 days ago [-]
My position is entirely consistent: it is cheaper to signal premium quality than to actually deliver it. The point I am making is that there is immense commercial pressure to do this in a highly competitive market when selling to consumers who don't know better.
My example of weights is that steel weights are cheaper than the alternative of using heavier drivers; by adding weight they are signaling premium without delivering it. Similarly with the USB cable: consumers assume such cables are thick because of thicker wires and better shielding, and it's cheaper to make a thick cable without those features, once again signaling premium without actually providing it.
dijit 1 days ago [-]
That's a more coherent version of your argument, but it's still speculative. You're attributing a deliberate strategy to what is more easily explained by indifference. B&W make about four products with USB-C cables. This isn't a company with a cable strategy, cynical or otherwise.
cjbgkagh 1 days ago [-]
Fourth time's the charm. You've provided no evidence for indifference. My point remains: given industry standards, indifference would be highly unusual and not at all a safe assumption.
The vast majority of high-volume consumer manufacturers use cost accounting practices which would absolutely be tracking and attributing the USB cable costs, and the whole point of that accounting practice is to constantly be thinking about minimizing the costs of even the smallest inputs, all the way down to the individual screws used. Yes, they're thinking about how to save 1/100th of a cent on each screw.
windowsrookie 1 days ago [-]
The reason it is thick is that it supports 65W charging. Apple did the same with the USB-C cables that shipped with the pre-MagSafe MacBooks: a thicker cable that supported 100W charging but was only USB 2.0.
dijit 1 days ago [-]
Can you help me understand why that would be a reason to compromise the comfort of the cable that is supplied for a device which charges at 5w?
Or, why Apple manages the same in half the footprint?
Or, why someone would expect that a cable that came with a pair of headphones actually charges things at over 65w?
Retr0id 1 days ago [-]
Like most things in the audiophile world, it's more about aesthetics than anything else. A big cable looks like it means business.
dijit 1 days ago [-]
I think that's being a bit uncharitable to B&W specifically; they're one of the few headphone companies where the engineering does back up the price. The cable is the odd one out.
Retr0id 1 days ago [-]
I don't have an informed opinion of B&W either way, but are you sure it's not an instance of Gell-Mann amnesia?
ssl-3 1 days ago [-]
The headphones have equivalent performance whether a USB 2 cable is connected, or a USB 3 cable is connected. The headphones themselves are not USB 3 devices; the addition of USB 3 cabling instead of USB 2 cabling would change absolutely nothing about how they work.
So, no: I wouldn't expect the cable for a pair of headphones (of any price) to support USB 3. That represents extra complexity (literally more wires inside) that is totally irrelevant for the product the cable was sold with. (The cables included with >$1k iPhones don't support USB 3, either.)
Meanwhile: Fast charging. All correctly-made USB C cables support at least 3 amps worth of 20 volts, or 60 Watts. This isn't an added-cost feature; it's just what the bare minimum no-emarker-inside specification requires. A 25-cent USB C-to-C cable from Temu either supports 60W of USB PD, or it is broken and defiant of USB-IF's specifications.
---
Now, of course: The cable could be thinner and more flexible and do these same things. That'd probably be preferred, even: Traditional analog headphones often used very deliberately thin cables with interesting construction (like using Litz wire to reduce the amount of internal plastic insulation) to improve the user's freedom of movement, and help prevent mechanical noise from the cables dragging across clothes and such from being telegraphed to the user's ears.
Using practical cabling was something that headphone makers strived to be good at doing. I'm a little bit annoyed to learn that a once-prestigious company like B&W is shipping cables with headphones that are the antithesis of what practical headphone cables should be.
---
But yeah, both USB C cables and the ports on devices could be better marked so we know WTF they do, to limit the amount of presumption required in the real world. So that a person can tell -- at a glance! -- what charging modes a device accepts or provides, or whether it supports video, or whether it is USB 2 or USB 3, or [...].
Prior to USB C, someone familiar with the tech could look at a device or a cable and generally succeed at visually discerning its function, but that's broadly gone with USB C. What we have instead is just an oblong hole that looks like all of the other oblong holes do.
After complaining about this occasionally since the appearance of USB C a decade or so ago, I've come to realize that most people just don't care about this -- at all. Not even a little bit. Even though these things get used by common people every day, the details are completely out of the scope of their thought processes.
It doesn't have to be this way, but it's not going to change: Unmarked ports are connected together with unmarked cables and thus unknown common capabilities are just how we roll.
dijit 22 hours ago [-]
The Litz wire point is pretty spot on, traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.
Your last paragraph is depressingly accurate though. I think that's exactly why devices like the Treedix exist: the standards bodies and manufacturers clearly aren't going to fix the marking problem, so now we need test equipment to figure out what our own cables do.
ssl-3 10 hours ago [-]
> The Litz wire point is pretty spot on, traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.
"I heard what you guys are planning and I talked to my financial guy. He said I have enough to put a manufactured home on some land in some desolate place like the Dakotas or central Wisconsin, as long as I keep a bit of supplemental income and live a little lower. So I'm going to do that, and take my chances on growing artisanal rutabaga to sell at farmers markets.
I've already packed up the Prius. I just stopped by to wish you kids luck with your new headphone project and tell you that I won't be back."
encom 24 hours ago [-]
No. CD audio is 1.4 Mbit/s. Even increasing the temporal and spatial resolution beyond that, which is audiophile nonsense, will never even approach USB 2 speeds.
pseudohadamard 17 hours ago [-]
There are a bunch of similar testers around that do more or less the same thing, e.g. the ChargerLab Power-Z range or any number of dodgy third-party Amazon/AliExpress clones. The one thing that definitely doesn't exist, outside of $1,000-and-up USB diagnostic devices, is something to report on which of the 800 different ways the downstream device has screwed things up, including failing a basic cut-and-paste of pull-up resistors from the spec coughRaspberryPicough.
After the publicity a few years ago of bad USB-C cables they've been mostly fixed, but what hasn't been fixed is the infinite number of broken downstream USB-C implementations. So your charging problems aren't due to the cable, which is most likely fine by now, but because the downstream device is telling the upstream one that it can't take more than 5V 1A. One sure way to tell the vendor has screwed up is when your USB-C device comes with an A-to-C cable to charge it.
pseudohadamard 15 hours ago [-]
Just to clarify the above, I'm talking about USB-C PD, not data throughput, on re-reading it the text is a bit unclear.
seanalltogether 1 days ago [-]
I wasn't surprised to learn that when Linus Tech Tips released those new USB-C cables, they all sold out almost instantly. They put their entire reputation on the line to claim (and label) the exact capabilities of their USB cables. Isn't that all we really want?
Of course they are advertising their own new USB cable, but as someone who didn't know much about USB cables I find it quite interesting.
AceJohnny2 1 days ago [-]
They lost me at "our conductors are coax!". USB is designed around differential signaling, which is what twisted pair excels at.
Dylan16807 20 hours ago [-]
Twisted pair is good but it only gets your losses so low at these speeds. Keep in mind that USB cables have a very small budget for signal loss, and at 40Gbps they're carrying frequencies 25x higher than 10gig ethernet.
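A rough back-of-envelope sketch of where that 25x figure comes from, under two simplifying assumptions: that a USB4 lane carries NRZ at 20 Gbps (fundamental = bitrate/2), and that 10GBASE-T occupies roughly 400 MHz of bandwidth thanks to its multi-level coding:

```python
def nyquist_fundamental_hz(bitrate_bps: float) -> float:
    """Highest fundamental frequency of an NRZ bit stream: bitrate / 2."""
    return bitrate_bps / 2.0

# 40 Gbps USB-C runs as two 20 Gbps lanes; one lane's fundamental:
usb4_lane_hz = nyquist_fundamental_hz(20e9)  # 10 GHz
# 10GBASE-T squeezes 10 Gbps into roughly 400 MHz via dense line coding:
tengig_ethernet_hz = 400e6

print(usb4_lane_hz / tengig_ethernet_hz)  # -> 25.0
```

Both numbers are approximations, but they show why cable loss budgets that were comfortable for Ethernet become brutal at USB4 speeds.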
MaxikCZ 1 days ago [-]
[flagged]
argsnd 1 days ago [-]
LTT are fine. I would strongly consider their products if they had any warehousing in Europe to make shipping here cheaper.
AngryData 1 days ago [-]
Yeah I don't see any problems with their stuff. It ain't exactly cheap, but a large part of that is their work in making sure you aren't just getting some random 2 dollar trash.
amelius 1 days ago [-]
I want one that sends a pseudorandom data stream and tells me the bit error rate.
the_biot 1 days ago [-]
Yup, that's the sort of thing that's typically missing from cable testers. I have a USB cable that normally works fine, but introduces errors when doing full blast USB 2.0 bulk transfers. I keep it around just in case I ever come across a tester that can show me this in hard numbers.
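A crude way to get hard numbers without special hardware is a write/read-back test against a drive on the far end of the cable. A sketch (the path is hypothetical; note this only reveals corruption that USB's own CRC/retransmit layer failed to mask, so a flaky cable may instead show up as reduced throughput from retries):

```python
import os
import random

def bit_errors(sent: bytes, received: bytes) -> int:
    """Count differing bits between two equal-length buffers."""
    return sum(bin(a ^ b).count("1") for a, b in zip(sent, received))

def crude_ber_test(path: str, size: int = 16 * 1024 * 1024) -> float:
    """Write a pseudorandom stream to `path` (a file on the drive behind
    the cable under test), read it back, and return the bit error rate."""
    data = random.Random(0).randbytes(size)
    with open(path, "wb") as f:
        f.write(data)
        f.flush()
        os.fsync(f.fileno())  # force the data out over the bus
    with open(path, "rb") as f:
        readback = f.read()
    return bit_errors(data, readback) / (size * 8)
```

A healthy setup should return 0.0; to make the read actually traverse the cable rather than hit the OS page cache, drop caches or unplug and replug the drive between the write and the read.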
HDBaseT 21 hours ago [-]
Products that do exactly that do exist, although are prohibitively expensive. [0]
It is difficult to send 10Gbps, 20Gbps, 40Gbps or even 80Gbps, then receive it, then validate it.
I'm sure it's hard to do a detailed eye analysis at those speeds. But come on, I can get a hub that does 10Gbps per lane (so 20Gbps equivalent) for $13, 40Gbps SSD enclosures for $40, and 80Gbps SSD enclosures for $150. Making that signal do a loop, maybe adding some artificial attenuation, and checking how many bits get corrupted shouldn't need fancier hardware than that.
specialist 8 hours ago [-]
Yes and: If a cable has any faults, of any kind.
I can't believe I'm still swapping cables and ports, trial and error, trying to determine if the cable, the port (eg iPhone's charging port), or both, are failing.
Not knowing the USB specs, I assume each head of a cable can do a loopback test, determine if it can even see its other head, etc.
bArray 1 days ago [-]
What I'm looking for is a differential signal tester, where you can breakout any arbitrary cable or traces and test the properties of the wire with different frequencies. It should be able to measure interesting properties such as resistance, capacitance, inductance, phase/length difference, wire length, etc.
One of these devices for approximately $100 would sell all day long.
Eisenstein 1 days ago [-]
You can do that with a nanoVNA, except for the differential part. Less than $100.
ChrisMarshallNY 1 days ago [-]
This isn't a Beagle. When I first read the headline, I was hoping that it would be more than a smart continuity tester.
It seems to be a more comprehensive "Make sure the lines go where they are supposed to" tester. Looks pretty good.
But the devices that test things like transmission speed, are a lot more expensive.
I think that many of the issues this device tests for can be mitigated by simply buying cables from reputable sources.
Liftyee 1 days ago [-]
Once in the realm of signal integrity, it's true that the price goes straight into lab grade levels.
Even in my reputable cables, there are a couple with suspicious continuity issues. I wonder if this could find them.
You could probably build a data transfer tester using an FPGA and some signal processing.
mwexler 1 days ago [-]
Wasn't aware of this level of testing. $1300 for a Beagle is a big step up.
ChrisMarshallNY 1 days ago [-]
That's a cheap one. Some of the analyzers can go into 5 figures.
Don't forget the speeds at which modern serial interfaces go. Being able to look at the data, at that speed, requires some serious kit.
SAI_Peregrinus 21 hours ago [-]
And the Beagles only do digital-domain stuff, they won't get you an eye diagram or a bit error rate. Even for USB 2.0 that's quite expensive, the test fixture from Keysight¹ is about $800, the differential probe they recommend is the $11,000 N2752A², the oscilloscope in the minimum configuration that meets the >1.5GHz bandwidth requirement they sell is the DSOX6002A³ for $24,000, and you'll also need a software option license for the eye diagram testing option. Not to mention a PC to run the USB-IF compliance test software, but that's pocket change at this point in the spending.
Keep in mind that's just for the 480 Mbit/s USB 2.0. If you want to test USB 5Gbps, USB 10Gbps, USB 20Gbps, USB 40Gbps, or USB 80Gbps you'll be spending much more money. The USB4 V2 GEN4 Electrical Compliance Test Specification⁴ requires a 25GHz bandwidth realtime oscilloscope with ≥80Gs/s sample rate or better. Not to mention a ≥25.6Gbps pattern generator/BERT, 20GHz network analyzer, & RF signal generator, all of which are "call for quote" but likely to be in the 6-digit price range. You're easily looking at the better part of half a million dollars to test the 80Gbps cables fully.
I like the price point and portability of the base Treedix testers.
As for a near-perfect wishlist:
Some standard hardware with fail-safe power connections;
a set of fully wired ports, like in the display-less Treedix USB board tester version.
On the other side, powered and extended diagnostics over USB-C could then be done from any smartphone or PC providing the display and an updatable extended software layer.
Bonus for connectivity to Brother label printers.
marcosscriven 22 hours ago [-]
I haven’t yet worked out why cheap 100W PD cables partially fail after about 6 months.
They can still provide some power, but between a MacBook and official Apple power supply, they’ll keep flicking between charging and not charging.
superxpro12 6 hours ago [-]
My personal favorite is the usb-c cable that works when you rotate the connector upside-down.
bennysaurus 21 hours ago [-]
Likely the e-marker chip in the cable is reporting higher power capabilities than the cable can support long-term. It'll work initially but then can't maintain that after resistance in the wires increases with wear.
bean469 1 days ago [-]
Sucks that there's no USB-B support. Plenty of monitors still use it and many printers do as well
superxpro12 6 hours ago [-]
would an adapter fill the gap until support is added?
pseudohadamard 12 hours ago [-]
For older cables there's a guy who's posted a large list of test stats somewhere... ah, here we go, I remember the "long thin resistor" comment: https://www.cs.auckland.ac.nz/~pgut001/pubs/usb_cable.html. tl;dr, you can't go wrong with Anker. Also, there's some absolute garbage out there.
trinsic2 1 days ago [-]
Man. I wonder if my cables are the reason why I can't get reliable transfer speeds above USB 3.0 speeds on a new USB-C dock I purchased.
I didn't know there were cable testers like this, thank you.
tetromino_ 1 days ago [-]
Attempting to access treedix.com (the advertised product) gives me
> Access Denied
> Sorry, you do not currently have the necessary permissions to access this site, or this site may not be available in your region.
Are they geoblocking the USA from even viewing their site for some reason?
dceddia 22 hours ago [-]
I was connected to a VPN and saw the same. Went away once I disconnected.
(Must say I'm not a fan of how, increasingly, taking any steps to preserve privacy is seen as deviant, or justified because bots)
LeoPanthera 1 days ago [-]
It's available on Amazon in the US.
rationalist 16 hours ago [-]
Except that I can't even load the page to see the model name or whatever to see what specific device I should search for on Amazon. I guess I could just search the brand and wade through all the different listings.
atoav 1 days ago [-]
One thing to realize is that, especially for high-resolution video cables, these cheap testers can't really deliver. The way to test them is an eye diagram (see: https://incompliancemag.com/eye-diagram-part2/ ) and testers with that capability cost upwards of 10,000 euros.
SAI_Peregrinus 21 hours ago [-]
Keysight are nice enough to provide prices on their web site for all their cheaper equipment. I priced out a setup for USB 2.0 eye diagram compliance testing on their site in this comment¹ and it's more like $40,000 than $10,000.
I said 10k because I think I once saw a specialized product just for HDMI starting at roundabout that price bracket mentioned in a YouTube video; can't really remember the brand, but it was relatively unknown in terms of test equipment. More generic (and flexible) equipment like the one mentioned will indeed tear a deeper hole in your pocket, but comes with its own perks.
superxpro12 6 hours ago [-]
The "certified" lab-grade stuff is so expensive because you're essentially paying for someone else to certify the function for you.
There are cheaper generic VNAs that don't claim "yup, this is HDMI Ultra High Speed" that might work in the right hands.
jmalicki 1 days ago [-]
So you're saying there is something to audiophile grade HDMI cables?
scq 1 days ago [-]
No. What it can affect though is the bandwidth of the cable, meaning e.g. for HDMI cables, they might not support higher resolutions or framerates. If it's on the border you might see random disconnects or screen blanks.
The quality degrading is not something you will see, as it's a digital protocol.
"Audiophile grade" HDMI cables are likely to just be a Shenzhen bargain-bin special with some fancy looking sheathing and connectors. I would trust them less than an Amazon Basics cable.
fmajid 1 days ago [-]
Indeed. If I want super high quality cables, I get them from Blue Jeans Cables, who tell you exactly what Belsen or Can are cable stock and what connectors, as well as the assembly methodology.
fmajid 1 days ago [-]
Belden or Canare. Pesky autocorrect.
HPsquared 1 days ago [-]
With digital signals and ECC, the cable need only be "good enough" to get perfect data transfer through the system.
jmalicki 1 days ago [-]
His link made very clear the issues of jitter and flickering.
tom_alexander 1 days ago [-]
These two statements aren't mutually exclusive. The link is looking at the analog signal through an oscilloscope. The person you replied to is pointing out that after decoding and applying error correction, you can still end up with the same digital signal output. So the eye diagram charts are useful for detecting the quality of the cable, but as long as the quality is past a certain threshold, it does not matter.
SAI_Peregrinus 21 hours ago [-]
And that threshold is "baked in" to the eye mask pattern you load into the tester. If the eye stays out of the masked areas, it passes, if it goes into the masked areas it fails. Oscilloscopes capable of eye diagram testing can trigger on failure, so if it passes an eye test it'll reconstruct correctly with proper timing.
atoav 1 days ago [-]
Correct. But especially if you're using long cables, a cable with more "headroom" in the eye diagram will perform more reliably than one that is just at the edge of breakup.
For home use that doesn't matter usually, but I for example run events where I need the cable to work also after 10 people stepped on it and then this can become a significant thing.
Not in terms of quality, but reliability.
atoav 1 days ago [-]
No. What I am saying is that it is hard to test the quality of an 8K 240Hz 4:4:4:4 video cable without having a device that can send and receive this or even higher.
If you send bits across a line fast enough you're getting into the territory of RF electronics; with the wrong connector or conductor geometry you will get echoes on the line and all kinds of signal loss. A good digital protocol should keep this at bay with error correction and similar mechanisms, but if you want to know which is the good cable on a better-than-binary scale of works/doesn't, you need to look at these things.
jmalicki 1 days ago [-]
I just need to make a cable with better eye diagrams so I can market it to AV enthusiasts with "golden eyes"!
TeMPOraL 1 days ago [-]
Our cables are so good that their eye diagrams look like a photograph of a cross-section a gold analog AV connector. That is not a coincidence!
atoav 1 days ago [-]
Well the thing is better doesn't mean better quality here. Better means you can use a longer cable or abuse the cable for longer till it dies.
This is a big part of what makes any pro gear expensive: reliability. If you just connect your home hifi to your speakers in an acoustically untreated space, you could also just use a bunch of steel wire coathangers and get an indistinguishable result. Even a el-cheapo store brand music shop cable will do the trick for years if you don't habitually change your setup four times a week (most people don't).
But if you need reliability and predictability in a studio or live context giving a damn about cable quality is mandatory since a broken cable in the wrong place can ruin your day and reputation. But it is an absolute myth that they will affect the sound in any meaningful way.
Exception: guitar cables. The capacitance of guitar cables can shift the resonant frequency of the pickup up or down, leading to audibly different results. But that is no magic either; you could just take a low-capacitance cable and add an arbitrary capacitor for 10 cents as needed.
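That capacitance effect is just the pickup's inductance resonating with the total parallel capacitance, f = 1/(2π√(LC)). A quick sketch with made-up but plausible values (2.5 H of pickup inductance, 470 pF vs. 940 pF of total cable capacitance):

```python
import math

def resonant_frequency_hz(inductance_h: float, capacitance_f: float) -> float:
    """Resonant frequency of the pickup inductance with the total
    parallel capacitance: f = 1 / (2 * pi * sqrt(L * C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

PICKUP_L = 2.5  # henries (hypothetical humbucker-ish value)
low_c = resonant_frequency_hz(PICKUP_L, 470e-12)   # short/low-capacitance cable
high_c = resonant_frequency_hz(PICKUP_L, 940e-12)  # long/high-capacitance cable
# Doubling the capacitance drags the resonant peak down by a factor of
# sqrt(2) -- from roughly 4.6 kHz to roughly 3.3 kHz here, well inside
# the audible treble range.
```

This is why adding a small capacitor to a low-capacitance cable can mimic a "darker" sounding cable for pennies.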
jmalicki 1 days ago [-]
I have seen shielding and gauge make quite a difference for cables carrying analog signals!
atoav 20 hours ago [-]
Sure, shielding is important, especially for high-impedance signals or very small signals (e.g. microphone cables). But the difference in shielding between a 10€ cable and a 100€ cable is not going to be audible in anything but edge cases.
Gauge is only important if the driving or receiving sides suck or you're using ridiculously small gauges or driving high currents (e.g. speaker signals) for long distances. In my lab tests I drove line signals over a spool of hair thin wire without audible difference. A famous experiment that gave me a good chuckle came to similar results: https://www.diyaudio.com/community/threads/copper-wire-vs-ba...
You could literally use CAT 6 ethernet cables and call it a day. There have been tests running balanced audio signals over a kilometer of ethernet cable without audible loss. And since you got 4 twisted pairs you could run 4 channels over one cable (commercially available here for example: https://store.monkeywrenchpro.com/thomann-5fc-cat-snake-spli...). I use this in combination with an ethernet patchbay to route signals around our event spaces and it works well even for unamplified mic level signals.
jmalicki 20 hours ago [-]
I have seen cables for powered computer speakers make cat5, or coat hangers, look amazing
Raed667 1 days ago [-]
As someone who really doesn't care about learning the details, and just want one USB-C cable that does it ALL to put in my backpack what should I buy ?
Gigachad 1 days ago [-]
You don’t really want that. A thunderbolt cable is both stiff and expensive. They only really make sense to leave attached to the back of a monitor or dock.
What would work better is a flexible 100w+ usb3 cable. You can’t do thunderbolt on it but it’s a tiny fraction of the cost and does everything you’d actually need on the go.
So much this. I have a few different categories of "known good" USB-C cables because one type doesn't fit all my use cases. Sometimes the trait I need is >100w PD charging at 1M. Sometimes I need 80 Gbps dual 4k video at 3M. Other times I need 40 Gbps .5M to a portable NVMe enclosure. USB-C cables I regularly rely on range from $5 to $100 and weight/size varies >3x.
And in my tiny 'go bike bag' for day trips I need one 2M cable that's thin, coils into a tight ball and weighs nothing yet will charge up to 45w and reliably xfer data at up to 5Gbps (USB 3.1) for quick uploads with optional USB-A and Micro-USB adapters at either end (because I still know people with Micro-USB (though it obviously drops to USB2 speeds)).
johnwalkr 1 days ago [-]
At my workplace someone always orders what they perceive to be the "best" cables. They aren't Thunderbolt, they're just oversized with thick braiding. They are all so stiff and heavy you can barely handle a phone while charging without the cable pulling itself out.
cmiles74 1 days ago [-]
I’m sure it’s overkill but I wanted to know how much power my laptop or whatever was actually drawing.
Love this! I got a USB C multimeter and used it to test the output of two dozen chargers. Wanted to see if they supplied the voltage that was advertised. Funny enough, AOHI was the only brand whose chargers actually increased their voltage as my current draw went up. It was like the engineers knew about the resistance in the wire and decided to compensate by upping the voltage slightly.
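That behavior is consistent with simple I·R-drop compensation: the device sees the charger's output minus current times cable resistance, so a charger can bump its output voltage as current rises. A sketch with hypothetical numbers (9 V target, 0.2 Ω round-trip cable resistance):

```python
def compensated_voltage(v_target: float, current_a: float,
                        cable_resistance_ohm: float) -> float:
    """Output voltage a charger must supply so the device still sees
    v_target after the V = I * R drop across the cable."""
    return v_target + current_a * cable_resistance_ohm

# Light load: barely any compensation needed.
print(compensated_voltage(9.0, 1.0, 0.2))
# Heavy load: the charger bumps its output to about 9.6 V so the
# device-side voltage holds at 9 V.
print(compensated_voltage(9.0, 3.0, 0.2))
```

USB PD's PPS mode exposes fine-grained voltage steps partly for this kind of adjustment, though a charger can also do it silently within the allowed tolerance band.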
tom_alexander 1 days ago [-]
As an alternative, you could get a stand-alone USB-C power meter which can be used with any cable. That way, when the cable breaks, you don't have to buy a new power meter. Here is an example of one such product (though I've never used this model): https://www.amazon.com/Adapter-Voltage-Current-Extension-Con...
qingcharles 1 days ago [-]
LTT did a lot of work to prove their cables do everything they say:
(these are what I would buy from a sea of cables, not the cheapest, but far from the most expensive)
publicmail 1 days ago [-]
A thunderbolt cable
mystifyingpoi 1 days ago [-]
This is the solution, but it is 1) expensive and 2) Thunderbolt cables are quite short compared to regular USB-C.
Gigachad 1 days ago [-]
Apple sells a 3m one, It’s just $250AUD.
I imagine at that length and speed, signal integrity becomes difficult.
izacus 23 hours ago [-]
Well they do want it all so they can pay for all, right? :)
bombcar 1 days ago [-]
And to be precise, a nice, high quality thunderbolt cable from a reputable manufacturer like Apple or OWC. Protect the cable as it will have been expensive, but it will work very well.
Raed667 1 days ago [-]
Would it work with USB-C screens and projectors ?
Eisenstein 1 days ago [-]
Hopefully they used connectors with a high mating cycle rating.
Onavo 1 days ago [-]
I just want one that tells me the maximum voltage and current supported by a USB C cable.
fmajid 1 days ago [-]
The Treedix will tell you that, as it is a feature of the eMarker chip (no chip means 60W).
SAI_Peregrinus 21 hours ago [-]
Assuming the chip isn't fraudulently added. Like in the article, some manufacturers are shady & will sell cables with e-marker chips for capabilities the cable can't actually support.
There are several; one that is moderately priced and which I'm considering buying myself is the JOY-IT UM120.
Onavo 1 days ago [-]
Thanks, why do you prefer that particular model?
wolfi1 1 days ago [-]
I wanted a model which tells me which modes are supported and which is actually selected, for a reasonable price, and which I can order from a reasonable retailer. This model seems to do the trick.
mmastrac 1 days ago [-]
Can you rewrite the emarker chips?
Liftyee 1 days ago [-]
Brilliant little device. I will be picking one up ASAP! Didn't know that lying cables were a thing, but I have a ton of charge-only cables?!
I speculate USB B wasn't included because there are only really two types, 2.0 (regular size) and 3.0 (has an obvious extension on the connector). There also don't tend to be power-only A-to-B cables because they are usually found on printers, Arduinos, ... and not for charging devices.
Fun fact: A Xiaomi fast charge cable (with orange plugs) has an extra contact on the A end to support USB C PD out of a USB A charger.
Modified3019 1 days ago [-]
I’ve had one for a while as well. I don’t use it often, but frankly I couldn’t sort my cables without it.
kotaKat 1 days ago [-]
Similarly: Is there a USB-C power delivery adapter to force directionality? I needed to siphon off power from small batteries into a larger pack (that could supply more power out than the small packs) in a power outage. I absolutely could not force my larger power station to accept a charge; it kept pushing power back the wrong direction regardless of which end of the cable I plugged in first.
That cable has one power input (that is only an input), and two outputs (that are only outputs), and a brainbox in the middle to direct the circus.
If we label the connectors as A, B, and C, then it works like this: A charges B and/or C, and other charging directions are no-op.
The less-complex way is to use a USB A to C cable, if that's appropriate. With these, the A side is always the source and the C side is always the sink.
---
And yeah, it's annoying. I got a cheap lithium car jump starter several years ago with some neat power bank features (like 60W USB PD in/out, on one port). So I plugged it into my phone with USB C at my desk, and discovered that they'd charge each other seemingly randomly. While changing nothing, I'd look over and sometimes the jump starter would be charging the phone, and sometimes the phone would be charging the jump starter. The conglomeration formed a heater, with more steps.
(Back and forth with the same poop, forever.)
kotaKat 1 days ago [-]
Ah, yeah, I remember those. That miiiiight work for my use case...
---
(I remember. The poop.)
throwpoaster 22 hours ago [-]
> The only downside of the USB cable tester is that I would love to support more plugs on the B side: USB-A (for my Frankencables) and USB B (which everyone but me seems to think extinct).
Music production peripherals, until the recent USB C transition, commonly used USB B for some reason. Anyone know why?
pugchat 1 days ago [-]
[dead]
Sure, you or I would just unplug the charger and run on battery, but bad UX decisions like that generate a support call to me from my 95-year-old mom. It should not only warn but continue to boot, and it should use whatever power is on offer to reduce the rate of battery drain.
If I have a lower-wattage charger connected on boot it shows me that information, but I can just press enter to continue. It's just a warning.
Maybe it's a bios setting?
Workaround is of course to boot without a charger connected and then connect it later :)
Since a >$1,000 automated lab cable throughput tester is overkill, my thumbnail test for high-speed USB-C data cables is to run a disk speed benchmark to a very fast, well-characterized external NVMe enclosure with a known-fast NVMe drive. I know what the throughput should be based on prior tests with an $80 active 1M Thunderbolt cable made for high-end USB-C docks and confirmed by online benchmark reviews from credible sources.
They probably have to weigh potential new hardware sales against added complexity. I have counterpoints too but: I believe they try to protect users’ mental models of their ecosystem (which perhaps I appreciate when I don’t notice, and can’t stand when something is uncustomizable). Like there are enough variables they don’t trust us with as it is.
You jest but that notification (it's been a thing on Android for at least 8 years, and on thinkpads for at least 10) has been very helpful to me. Sometimes the negotiation just fails and being told is helpful. Sometimes the charger lies about its specs and once again it's helpful to have a hint, rather than expect everybody to systematically have usb testers on hand.
With data speed I think it could be a little more complicated. Like OP was saying it would need access to some level of hardware information where it can see which pins are used by the cable. Since the connection 'speed' is still variable even when you DO have a supported cable.
There is. I used to use a KVM with USB 2 ports connected to my PC's USB 3 port, to which I connected a monitor with integrated USB 3 hub to drive my keyboard and mouse. Windows would show a popup every time telling me that I should use a faster cable.
There are also popups telling me that my laptop is connected to a "slow" usb-c charger.
Using DevCheck might show 2.2A/9V as an example.
I don’t know if they check that via USB protocol, or if they are measuring the actual power draw on the USB port.
In order to use the device, I had to connect it via an externally powered USB hub.
Regular people hate technology, both for how magical and how badly broken it is, but they've long learned they're powerless to change it - nobody listens to their complaints, and the whole market is supply-driven, i.e. you get to choose from what vendors graciously put on the market, not from what the space of possible devices.
They hate having to go through people that get them upset, in order to use their kit.
Not just tech (although it’s more prevalent). People who are “handy” can also be that way (but, for some reason, techies tend to be more abrasive).
I’ve learned the utility of being patient, and not showing the exasperation that is often boiling inside of me.
In general for the 40+ years I’ve been a programmer I have detested the practice of not surfacing diagnostic information to users when technology makes it possible to do so in a clear and unambiguous way.
"What did the error message say"
"I don't know."
'Segmentation fault. Core dumped.'
'Non-fatal error detected. Contact support.'
'An error occurred.'
'An illegal operation was performed.'
'Error 92: Insufficient marmalade.'
'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?'
'Saving as .docx is not recommended because blah-blah-blah never gonna give you up nor let you down.'
I can't blame any normal user for not understanding or not giving a shit about any of these. If we'd given users actionable information from day 1, we'd be in a very different world. Even just 'Error 852: Couldn't reach the network. Check your connection to the internet.' does help those who haven't turned off their brains entirely yet.
"I don't understand, it says 'System Error Type 11', and no matter how many times I type 11, nothing happens!"
That one's a good example of why these things are hard. The user could have been running 5 different programs, any one of which caused this error, and MacOS can't point the finger at anyone. Not to mention that the problem could be MacOS itself, or the user being a dunce who misconfigured something. I'm not sure if that error can occur without 3rd party software being involved, but if it can, then that error message might need to be even more vague, helping the user even less. Not to mention it could just be faulty hardware.
A paper manual offering troubleshooting steps for each error would be really helpful. Just 'Error 11. Consult your manual.' and the manual actually telling you what the problem could be is also miles better than what we usually get.
It's still an example why it's worth giving your users a fighting chance. MacOS may not know enough to point the finger at anyone, but the user knows what they were doing at that moment, and even if they were not paying attention, they might start now. They'll realize if something is off. Or, after 10th time they get this error, they'll connect the dots and realize it's always happening when application X is running and they try to launch Y.
Or maybe sometimes they won't. Maybe they'll form a story and maybe it'll be all bullshit, or maybe good enough. Either way, the important part is, the user retains agency in the process. Giving people information is how they can become self-sufficient users and trust technology more.
++?????++ Out of Cheese Error. Redo From Start.
+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
+++Whoops! Here comes the cheese! +++
A modal dialog is supposed to be for something damn near irreversible--like being about to blow away your application because of an error. You are supposed to STOP and go get the guru or you are about to lose, badly.
Unfortunately, UI designers throw them up for everything and people get used to simply clicking "OK" to make them go away so that they can get back to doing their task. So, when the user gets an actual error, they've already blown away the dialog box with information.
Your 'Saving this image as a JPG will not preserve the transparency used in the image. Save anyway?' line is a horrifically excellent example. That is a standard "Save As..." response, and it should NEVER have been. That should have always been under "Export..." as saving should never throw away information and it would be perfectly fine to regenerate a JPG as long as you have the full information still available in the original file.
This is the stuff that infuriates me about the UI designers. Your job is about interactions, first, and pixels, second.
If you ever run into a developer who thinks "something went wrong" is an appropriate error message, have them killed. Then kill their entire family and pets, burn their house down, and plough salt into the ground where it stood. Finally, put up a sign that says "The person who used to live here thought 'something went wrong' is an appropriate error message to display when something goes wrong. Take note of their current situation when you next add an error message to your software".
I implored him to try a different cable (after checking cables with the Treedix mentioned in TFA), and the copy went from taking over a minute to about 13s.
It's not just normal people who are confused.
Oh, and pointy jab: these folks are also, in my opinion/experience, the most eager to vibecode shit. Make of that what you will.
Swore blind it couldn't be improved.
By next morning's stand-up, I'd found it was doing something pointless, confirmed with the CTO that the thing it was doing was genuinely pointless and I'd not missed anything surprising, removed the pointless thing, and gotten the 10 minutes down to 200 milliseconds.
I'm not sure if you're right or wrong about the correlation with vibe-coding here, but I will say that co-worker's code was significantly worse than Claude's on the one hand, and that on the other I have managed to convince Codex to recompute an Isochrone map of Berlin at 13 fps in a web browser.
But I know stories like yours from a decade past as well. A tale as old as time, but compounding in recent years, IMHO.
People will notice some things. For example, with USB if they are using it for local backup they might notice, but with a lot of devices they will not. When they do notice, they will feel powerless.
Even if we had a wider choice, they are not well placed to pick products. There is no way they will know about details of things such as USB issues (a cable is slow, the device will not tell you if it is) at the time of purchase.
I'm an electrician.
It’s well worth the hype, I used it to audit all my cables (both for home and work) and it’s amazing how many thick and unwieldy cables are actually terrible for data.
For example, I purchased a pair of B&W Px8 S2 noise-cancelling headphones, which boast a DAC if you connect via USB-C directly. The cable they came with, though thick, was only rated for USB 2.0 speeds. These headphones cost more than AirPods Max, which are themselves considered overpriced, and include comforts like nappa leather; so shipping with a chunky cable that doesn't even carry decent data feels like a bizarre oversight. Apple's own USB-C cables manage the same power delivery at less than half the thickness with a woven shell. You'd assume a premium product would at least match that.
Honourable mention to the USB-C cables that ship with Dell Ultrasharp monitors (both pre-USB4 and post). Those support basically everything except Thunderbolt 4 despite being unmarked.
USB 2.0 can support up to 480 Mbps. It’s more than fast enough for any audio stream you can send to a DAC.
Your headphones don’t need USB 3.0 5 Gbps speeds. USB 3 requires extra wires with different properties that need to be controlled more tightly, which can impact cable flexibility. If your headphones used USB 3 when they didn’t need it that would be one more thing to break and more failure modes for the cable.
A USB 2 cable with fewer conductors was the right choice for this product. The fact that you only got miffed about it when plugging the cable into a tester, not from actually using the product or cable, is good evidence that a USB 3 cable wasn’t needed.
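The arithmetic behind "more than fast enough" is easy to check; a quick sketch comparing raw PCM bitrates against USB 2.0's 480 Mbit/s signaling rate (overhead ignored, so real throughput is lower, but the margin is still enormous):

```python
def audio_bitrate_mbps(sample_rate_hz, bit_depth, channels):
    """Raw PCM bitrate in Mbit/s (no protocol overhead)."""
    return sample_rate_hz * bit_depth * channels / 1e6

cd = audio_bitrate_mbps(44_100, 16, 2)       # CD quality: ~1.4 Mbit/s
hires = audio_bitrate_mbps(192_000, 24, 2)   # "hi-res" stereo: ~9.2 Mbit/s
# both are a tiny fraction of USB 2.0's 480 Mbit/s
```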
Apple’s iPhone cables are not known for their durability. They serve a mostly stationary purpose, unlike headphones you wear on your head.
I keep a few converters for older devices and servers that don't have (m)any C ports, but as far as a consumer "forever cable" goes, TB5 feels close. Certainly the cable's bandwidth is beyond what most people need, unless you're editing 8k video or continually shuffling hundreds of GBs between external disks.
It alleviates the anxiety of having to know which cable does what.
I use Apple's Thunderbolt 4 or USB-C cables exclusively: if it's white it's for charging and low data, if it's black it's for high data.
I've been doing this for a few years, but it's really costly as those Apple Thunderbolt cables are crazy expensive.
LTT or another big YouTuber made a cable and made sure to get it labeled. Also complained how difficult it was to find a supplier willing to make a better cable than usual.
I have one of those. They are thick and unwieldy af. Since I've borked the USB connection on my monitor because of static discharge, I no longer use it and figured I'd repurpose it for my digital camera, for which I used to have a short cable that was sometimes annoying. This cable is so freaking thick and hard that it'll move my (admittedly somewhat light) camera on the table.
It’s rigid and thick, like a Thunderbolt 3 cable, yet only supports USB 2.0 speeds and fast charging for a device that doesn’t need fast charging.
Compare that to Apple’s iPhone USB-C cable which is thin, flexible, and supports the same features.
That matters because someone might grab that cable assuming it’s a “better cable”: it came with a £629 product, it’s thick and feels serious, so surely it’s capable. But it isn’t. And there’s nothing marked on it to tell you otherwise.
The whole system ends up relying on presumption, which is exactly the problem the device in the article is solving.
The purpose of the heavy construction is to make it durable, not to carry 5 Gbps data streams to your headphones.
Unlike most USB peripherals like your printer and keyboard that get plugged in and then don’t move around, headphone cables go to your head and move around constantly. They can get pinched in drawers or snagged on corners.
Hence the more durable construction.
Apple’s USB iPhone cables wearing out prematurely is so common it’s a meme.
Maybe Apple's changed their cables recently, but the fragility is the reason I avoid Apple cables.
Especially in headphones. The number of times those broke during a bike ride or run was way too high for me to keep wasting money on them, knowing full well they weren't going to last more than a few months, just like every other Apple headphone I've ever had.
https://www.techgearlab.com/topics/electronics/best-usb-c-ca...
I don’t know how to fix the market especially when consumers keep rewarding these practices, and I think the effectiveness of TikTok style influencer marketing will make it worse.
The problem is the opposite of what you’re describing, it’s not a cynical design choice, it’s a lazy one. They probably just purchased a cable for capabilities irrelevant to the product and the result is worse ergonomics and misleading physical cues about what the cable can actually do.
I think you are underestimating the importance of perceived premium combined with the pressures of cost accounting, but I do think that is pretty normal for ‘audiophiles’ which is their target market.
If the argument is that B&W deliberately chose a thick cable to seem premium, it doesn’t square with them actively slimming down the headphones. B&W are primarily a speaker company, their USB-C product range is basically just a few headphones and earbuds.
More likely they just sourced a generic cable that happened to support high wattage and didn’t think about the mismatch.
Either way, we’re deep in the weeds on B&W’s cable procurement now. The root point is that USB-C is a mess. You can’t tell what a cable supports by looking at it, and even premium manufacturers are shipping cables that don’t do what you’d reasonably expect.
That’s exactly the problem the Treedix from the article solves.
You are using circular reasoning: you assume the premise is true and from there derive your evidence.
I would contend that someone thought about it and decided to go with the cheaper option because they could get away with it. I would consider my assumption to have more grounding given my experience with manufacturing and cost accounting.
My example of weights is that the steel weights are cheaper than the alternative of using heavier drivers; by adding weight they are signaling premium without delivering it. Similarly with the USB cable: consumers assume such cables are thick because of thicker wires and better shielding, but it's cheaper to make a thick cable without those features, once again signaling premium without actually providing it.
The vast majority of high volume consumer manufacturers use cost accounting practices which would absolutely be tracking and attributing the usb cable costs and the whole point of that accounting practice is to constantly be thinking about minimizing costs of even the smallest inputs, all the way down to the individual screws used. Yes, they’re thinking about how to save 1/100ths of a cent from each screw.
Or, why Apple manages the same in half the footprint?
Or, why someone would expect that a cable that came with a pair of headphones actually charges things at over 65w?
So, no: I wouldn't expect the cable for a pair of headphones (of any price) to support USB 3. That represents extra complexity (literally more wires inside) that is totally irrelevant for the product the cable was sold with. (The cables included with >$1k iPhones don't support USB 3, either.)
Meanwhile: Fast charging. All correctly-made USB C cables support at least 3 amps worth of 20 volts, or 60 Watts. This isn't an added-cost feature; it's just what the bare minimum no-emarker-inside specification requires. A 25-cent USB C-to-C cable from Temu either supports 60W of USB PD, or it is broken and defiant of USB-IF's specifications.
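That baseline can be written down as a tiny rule of thumb. This is a simplification (real PD negotiates the voltage too, and newer EPR cables go beyond 100 W), but it matches the 3 A / 5 A e-marker split the spec defines:

```python
def max_cable_power_w(voltage_v, has_5a_emarker=False):
    """Power ceiling imposed by the cable alone under USB PD.
    Every compliant C-to-C cable must carry at least 3 A;
    5 A (and hence 100 W at 20 V) requires an e-marker chip.
    Simplified: ignores EPR/48 V cables."""
    max_amps = 5.0 if has_5a_emarker else 3.0
    return voltage_v * max_amps

# baseline no-emarker cable at 20 V: 60 W; e-marked cable: 100 W
```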
---
Now, of course: The cable could be thinner and more flexible and do these same things. That'd probably be preferred, even: Traditional analog headphones often used very deliberately thin cables with interesting construction (like using Litz wire to reduce the amount of internal plastic insulation) to improve the user's freedom of movement, and help prevent mechanical noise from the cables dragging across clothes and such from being telegraphed to the user's ears.
Using practical cabling was something that headphone makers strived to be good at doing. I'm a little bit annoyed to learn that a once-prestigious company like B&W is shipping cables with headphones that are the antithesis of what practical headphone cables should be.
---
But yeah, both USB C cables and the ports on devices could be better marked so we know WTF they do, to limit the amount of presumption required in the real world. So that a person can tell -- at a glance! -- what charging modes a device accepts or provides, or whether it supports video, or whether it is USB 2 or USB 3, or [...].
Prior to USB C, someone familiar with the tech could look at a device or a cable and generally succeed at visually discerning its function, but that's broadly gone with USB C. What we have instead is just an oblong hole that looks like all of the other oblong holes do.
After complaining about this occasionally since the appearance of USB C a decade or so ago, I've come to realize that most people just don't care about this -- at all. Not even a little bit. Even though these things get used by common people every day, the details are completely out of the scope of their thought processes.
It doesn't have to be this way, but it's not going to change: Unmarked ports are connected together with unmarked cables and thus unknown common capabilities are just how we roll.
Your last paragraph is depressingly accurate though. I think that's exactly why devices like the Treedix exist: the standards bodies and manufacturers clearly aren't going to fix the marking problem, so now we need test equipment to figure out what our own cables do.
"I heard what you guys are planning and I talked to my financial guy. He said I have enough to put a manufactured home on some land in some desolate place like the Dakotas or central Wisconsin, as long as I keep a bit of supplemental income and live a little lower. So I'm going to do that, and take my chances on growing artisanal rutabaga to sell at farmers markets.
I've already packed up the Prius. I just stopped by to wish you kids luck with your new headphone project and tell you that I won't be back."
After the publicity a few years ago of bad USB-C cables they've been mostly fixed, but what hasn't been fixed is the infinite number of broken downstream USB-C implementations. So your charging problems aren't due to the cable, which is most likely fine by now, but because the downstream device is telling the upstream one that it can't take more than 5V 1A. One sure way to tell the vendor has screwed up is when your USB-C device comes with an A-to-C cable to charge it.
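The "5 V 1 A" limit a broken downstream device advertises travels inside a Power Data Object. Decoding a fixed-supply PDO is mechanical; here's a sketch following the PD spec's field layout (actually capturing the message still requires PD-aware hardware):

```python
def decode_fixed_pdo(pdo: int):
    """Decode a USB PD Fixed Supply Power Data Object (sketch).
    Bits 31:30 = 00 for a fixed supply; bits 19:10 = voltage in
    50 mV units; bits 9:0 = max current in 10 mA units."""
    if (pdo >> 30) & 0b11 != 0b00:
        raise ValueError("not a fixed-supply PDO")
    voltage_mv = ((pdo >> 10) & 0x3FF) * 50
    current_ma = (pdo & 0x3FF) * 10
    return voltage_mv / 1000, current_ma / 1000  # volts, amps

# a source advertising only 5 V / 1 A would encode it like this:
pdo_5v_1a = (100 << 10) | 100   # 100 * 50 mV = 5 V, 100 * 10 mA = 1 A
```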
Of course they are advertising their own new USB cable, but as someone who didn't know much about USB cables I find it quite interesting.
It is difficult to send 10Gbps, 20Gbps, 40Gbps or even 80Gbps, then receive it, then validate it.
[0] - https://www.totalphase.com/products/advanced-cable-tester-v2...
I can't believe I'm still swapping cables and ports, trial and error, trying to determine if the cable, the port (eg iPhone's charging port), or both, are failing.
Not knowing the USB specs, I assume each head of a cable can do a loopback test, determine if it can even see its other head, etc.
One of these devices for approximately $100 would sell all day long.
It seems to be a more comprehensive "Make sure the lines go where they are supposed to" tester. Looks pretty good.
But the devices that test things like transmission speed are a lot more expensive.
I think that many of the issues this device tests for can be mitigated by simply buying cables from reputable sources.
Even in my reputable cables, there are a couple with suspicious continuity issues. I wonder if this could find them.
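The wiring check such a tester performs amounts to comparing a measured end-to-end pin map against the expected one. A simplified sketch (pin names are illustrative; a real USB-C tester walks all 24 pins per plug):

```python
# Hypothetical, simplified pin map for a USB 2-class C-to-C cable.
EXPECTED = {"GND": "GND", "VBUS": "VBUS", "CC": "CC", "D+": "D+", "D-": "D-"}

def check_continuity(measured: dict) -> list:
    """Compare a measured end-to-end pin map against the expected wiring.
    Returns human-readable faults; an empty list means the cable is wired right."""
    faults = []
    for pin, want in EXPECTED.items():
        got = measured.get(pin)
        if got is None:
            faults.append(f"{pin}: open circuit")
        elif got != want:
            faults.append(f"{pin}: miswired to {got}")
    return faults
```

An intermittent fault (the "suspicious continuity" case) would show up as a pin that flips between wired and open across repeated sweeps, which is exactly why testers re-scan while you flex the cable.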
You could probably build a data transfer tester using an FPGA and some signal processing.
Don't forget the speeds at which modern serial interfaces go. Being able to look at the data, at that speed, requires some serious kit.
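The "serious kit" is mostly about analog bandwidth; the test pattern itself is trivial. BERT gear typically drives a pseudo-random bit sequence such as PRBS-7 and counts errored bits at the far end. A software sketch of the generator (polynomial x^7 + x^6 + 1, the usual definition):

```python
def prbs7(n_bits, seed=0x7F):
    """Generate a PRBS-7 bit sequence (LFSR with polynomial x^7 + x^6 + 1).
    The sequence repeats every 127 bits and exercises a serial link with
    a known, dense mix of transitions. Seed must be nonzero."""
    state = seed & 0x7F
    out = []
    for _ in range(n_bits):
        new = ((state >> 6) ^ (state >> 5)) & 1  # tap bits 7 and 6
        state = ((state << 1) | new) & 0x7F
        out.append(new)
    return out
```

On an FPGA the same LFSR runs in the serializer clock domain; the hard part is the multi-gigabit transceiver and connector launch, not the pattern.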
Keep in mind that's just for 480 Mbit/s USB 2.0. If you want to test USB 5Gbps, USB 10Gbps, USB 20Gbps, USB 40Gbps, or USB 80Gbps you'll be spending much more money. The USB4 V2 GEN4 Electrical Compliance Test Specification⁴ requires a 25GHz bandwidth realtime oscilloscope with ≥80Gs/s sample rate or better. Not to mention a ≥25.6Gbps pattern generator/BERT, 20GHz network analyzer, & RF signal generator, all of which are "call for quote" but likely to be in the 6-digit price range. You're easily looking at the better part of half a million dollars to test the 80Gbps cables fully.
¹https://www.keysight.com/us/en/product/E2667B/usb-2-0-hi-spe...
²https://www.keysight.com/us/en/product/N2752A/infiniimode-ac...
³https://www.keysight.com/us/en/products/oscilloscopes/bencht...
⁴https://www.usb.org/document-library/usb4-electrical-complia...
Talking about a near-perfect wishlist:
Some standard hardware with fail-safe power connections and a set of fully wired ports, like a display-less version of the Treedix USB board tester.
On the other side, power and extended diagnostics over USB-C could then be handled by any smartphone or PC, which provides the display and an updatable software layer.
Bonus points for connectivity to Brother label printers.
They can still provide some power, but between a MacBook and official Apple power supply, they’ll keep flicking between charging and not charging.
I didn't know there were cable testers like this, thank you.
> Access Denied
> Sorry, you do not currently have the necessary permissions to access this site, or this site may not be available in your region.
Are they geoblocking the USA from even viewing their site for some reason?
(Must say I'm not a fan of how, increasingly, taking any steps to preserve privacy is seen as deviant, or justified because bots)
¹https://news.ycombinator.com/item?id=47568399
There are cheaper generic VNAs that don't claim "yup, this is HDMI Ultra High Speed" that might work in the right hands.
Quality degradation is not something you will see, as it's a digital protocol.
"Audiophile grade" HDMI cables are likely to just be a Shenzhen bargain-bin special with some fancy looking sheathing and connectors. I would trust them less than an Amazon Basics cable.
For home use that doesn't matter usually, but I for example run events where I need the cable to work also after 10 people stepped on it and then this can become a significant thing.
Not in terms of quality, but reliability.
If you send bits across a line fast enough you're getting into the territory of RF electronics; with the wrong connector or conductor geometry you will get echoes on the line and all kinds of signal loss. A good digital protocol should keep this at bay with error correction and similar mechanisms, but if you want to know what a good cable is on a better-than-binary scale of works/does-not, you need to look at these things.
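For a sense of scale, the fraction of a signal that echoes back from an impedance discontinuity is the reflection coefficient. A one-liner sketch (90 ohms is the differential impedance USB specifies for its data pairs):

```python
def reflection_coefficient(z_load, z0=90.0):
    """Voltage reflection coefficient at an impedance discontinuity:
    (Z_L - Z0) / (Z_L + Z0). 0 means a perfect match (no echo);
    magnitudes near 1 mean most of the signal bounces back."""
    return (z_load - z0) / (z_load + z0)

# a badly made section presenting 110 ohms reflects 10% of the voltage
```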
This is a big part of what makes any pro gear expensive: reliability. If you just connect your home hifi to your speakers in an acoustically untreated space, you could also just use a bunch of steel-wire coathangers and get an indistinguishable result. Even an el-cheapo store-brand music shop cable will do the trick for years if you don't habitually change your setup four times a week (most people don't).
But if you need reliability and predictability in a studio or live context giving a damn about cable quality is mandatory since a broken cable in the wrong place can ruin your day and reputation. But it is an absolute myth that they will affect the sound in any meaningful way.
Exception: guitar cables. The capacitance of guitar cables can shift the resonance frequency of the pickup up or down, leading to audibly different results. But that is no magic either; you could just take a low-capacitance cable and add an arbitrary capacitor for 10 cents as needed.
Gauge is only important if the driving or receiving sides suck or you're using ridiculously small gauges or driving high currents (e.g. speaker signals) for long distances. In my lab tests I drove line signals over a spool of hair thin wire without audible difference. A famous experiment that gave me a good chuckle came to similar results: https://www.diyaudio.com/community/threads/copper-wire-vs-ba...
You could literally use CAT 6 ethernet cables and call it a day. There have been tests running balanced audio signals over a kilometer of ethernet cable without audible loss. And since you got 4 twisted pairs you could run 4 channels over one cable (commercially available here for example: https://store.monkeywrenchpro.com/thomann-5fc-cat-snake-spli...). I use this in combination with an ethernet patchbay to route signals around our event spaces and it works well even for unamplified mic level signals.
What would work better is a flexible 100w+ usb3 cable. You can’t do thunderbolt on it but it’s a tiny fraction of the cost and does everything you’d actually need on the go.
If you actually do want it, this is the do everything cable https://www.apple.com/au/xc/product/MW5H3ZA/A
And in my tiny 'go bike bag' for day trips I need one 2M cable that's thin, coils into a tight ball and weighs nothing yet will charge up to 45w and reliably xfer data at up to 5Gbps (USB 3.1) for quick uploads with optional USB-A and Micro-USB adapters at either end (because I still know people with Micro-USB (though it obviously drops to USB2 speeds)).
https://iaohi.com/products/aohi-the-future-adonis-usb4-2-0-2...
https://news.ycombinator.com/item?id=47561827
(these are what I would buy from a sea of cables, not the cheapest, but far from the most expensive)
I imagine at that length and speed, signal integrity becomes difficult.
I speculate USB B wasn't included because there are only really two types, 2.0 (regular size) and 3.0 (has an obvious extension on the connector). There also don't tend to be power-only A-B cables, because they are usually found on printers, Arduinos, etc., and not used for charging devices.
Fun fact: A Xiaomi fast charge cable (with orange plugs) has an extra contact on the A end to support USB C PD out of a USB A charger.
That cable has one power input (that is only an input), and two outputs (that are only outputs), and a brainbox in the middle to direct the circus.
If we label the connectors as A, B, and C, then it works like this: A charges B and/or C, and other charging directions are no-op.
The less-complex way is to use a USB A to C cable, if that's appropriate. With these, the A side is always the source and the C side is always the sink.
---
And yeah, it's annoying. I got a cheap lithium car jump starter several years ago with some neat power bank features (like 60W USB PD in/out, on one port). So I plugged it into my phone with USB C at my desk, and discovered that they'd charge each other seemingly randomly. While changing nothing, I'd look over and sometimes the jump starter would charge the phone, and sometimes the phone would be charging the jump starter. The conglomeration formed a heater, with more steps.
(Back and forth with the same poop, forever.)
--- (I remember. The poop.)
Music production peripherals, until the recent USB C transition, commonly used USB B for some reason. Anyone know why?