Submitted by HugeLibertarian t3_10ou1oq in explainlikeimfive
[removed]
This logic makes sense on the surface, but it doesn't hold up.
Why not just make all the ports USB 3.1? Even if they're all used for high-bandwidth USB 3.1 devices, the odds that every device will push enough throughput at the same time to saturate its PCIe lane are near zero outside of /r/homelab situations. And most will be low-bandwidth (HIDs, etc.) anyway.
If they were all 3.1, you'd at least have the convenience of always getting 3.1 functionality from any port (barring heavy oversubscription of the PCIe lanes).
This argument reminds me of my electrical panel. I have 200 amp service, which means I should only have about thirteen 15 amp breakers. Fewer if I'm feeding the 30 amp AC compressor and have to subtract two to make up for it. It's not like I'm going to install 5 amp breakers just to keep the total load potential below the main feed rating of 200 amps, and then only be able to use that one 15 amp outlet for appliances in my kitchen.
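To put rough numbers on the panel analogy (illustrative figures only, not electrical guidance), the point is that the sum of the breaker ratings is allowed to exceed the main feed:

```python
# Illustrative oversubscription arithmetic for the breaker analogy.
# All values are example figures, not electrical guidance.
service_amps = 200
breakers = [15] * 13 + [30]     # thirteen 15 A circuits plus a 30 A AC circuit
total_rated = sum(breakers)     # 225 A of combined breaker capacity

# Panels are routinely "oversubscribed": the breaker ratings sum to more
# than the main feed, because circuits rarely all peak at once.
print(total_rated, total_rated / service_amps)  # 225 1.125
```

The same reasoning is what motherboard makers apply to USB ports versus PCIe lanes: provision more port capacity than the upstream link, betting it won't all be used at once.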
IMHO, still including USB 2.0 ports is a dumb legacy design that has few real-world benefits and creates user inconvenience. I'm sure there are niche applications (industrial PCs with bad USB devices that puke on 3.1 ports), but that's like making everyone use a cane because some people have mobility problems.
The issue is that the lane is physically dedicated to the USB controller. It's not like all USB devices share a common pool of lanes: a USB port connects to a USB controller, which is connected to a lane. With USB 2.0 controllers, as mentioned before, you can have about 10 sharing a single lane and be fine, but for a 3.0 or 3.1 port the lane is assigned to it and can't be used for something else like an SSD or GPU, even if you only ever use the port for a 2.0 device. Since many devices (e.g. mice, keyboards, webcams, printers) still only need 2.0, we don't want to waste bandwidth, so we get 2.0 ports, with a few 3.x ones for storage devices, high-speed network adapters, external GPUs, etc.
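The "about 10 controllers per lane" figure can be sanity-checked with rough numbers; this sketch assumes a PCIe 3.0 lane, and the exact generation and usable throughput vary by platform:

```python
# Rough math behind "about 10 USB 2.0 controllers on a single lane".
# Assumes a PCIe 3.0 lane with ~985 MB/s of usable bandwidth; the real
# figure depends on the platform and PCIe generation.
pcie3_lane_MBps = 985
usb2_MBps = 480 / 8             # USB 2.0 high speed: 480 Mb/s = 60 MB/s

# Raw division says 16 ports would fit; protocol and scheduling overhead
# bring the practical number down to roughly 10.
print(pcie3_lane_MBps // usb2_MBps)  # 16.0
```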
And what if you do need it?
This argument reminds me of when MMORPGs started becoming mainstream in the late '90s/early '00s. I have friends whose ISPs oversaturated their connections. Those people ended up with 1 kbps up/down, a big downgrade from their "up to" 256 kbps at the time. And yes, their games were basically slide shows.
Just like in your logic, it's not like everyone will be hogging the line: you ask for a website, you wait for it to load, then you spend more time reading that site than requesting another one. But then things change, and now everyone is sending traffic all the time...
And besides, if you don't care about the bandwidth, USB 3 is backward compatible with USB 2, so just plug the USB 3 device into a USB 2 port and call it a day...
OP's, or rather the motherboard manufacturer's, logic is sound: motherboard manufacturers decide how to split the PCIe lanes, so the number of USB 2/3 ports is defined by the manufacturer (against the available lanes). If you value USB 3 more than having more USB ports, I'm sure there is a mobo model for you, while the people who buy high-end mobos wanting high-end performance get theirs as well...
Yes, it's all about cost. You'll still see USB 2.0 ports on cheaper computers because they're more cost-effective.
It costs more money to support faster interface standards, and the people buying cheaper computers are less likely to need the speed of the newer standard, so they get cheaper hardware.
I mean, let's be honest: I have like 15 USB devices connected to my computer, and the majority of them have no need for USB 3 speeds.
Outside of external storage, VR gear and video hardware, there aren't a lot of devices that need them.
Not every device needs the speed of 3.x. Your mouse and keyboard use only a tiny fraction of the throughput of even USB 2. Why incur costs that buy you no benefit?
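Just how tiny that fraction is can be estimated with assumed but typical figures (a 1 kHz polling rate and an 8-byte HID report are illustrative values, not a measurement of any particular mouse):

```python
# How little of a USB 2.0 link a typical mouse uses (assumed figures).
polls_per_second = 1000              # assumed 1 kHz gaming-mouse polling rate
bytes_per_report = 8                 # assumed HID report size
mouse_bps = polls_per_second * bytes_per_report * 8   # 64,000 bit/s

usb2_bps = 480_000_000               # USB 2.0 high-speed signalling rate
print(f"{mouse_bps / usb2_bps:.4%}") # 0.0133% of the link
```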
Front panel usually just has the headers (the connectors), with wires leading to the USB controllers (the chips that do all the USB functions) on the computer's motherboard.
So ultimately all costs are NOT equal: the USB chip may be able to control only so many USB 3 devices but a lot more USB 2 devices, so a company may put a lot more USB 2 headers on the front panel than it could USB 3 headers.
There is a data bandwidth limitation. And there's a chip shortage / chip cost limitation too.
> My understanding is that 3.0 ports are backwards compatible with 2.0 devices, and automatically default to 2.0 levels of power when paired with them.
Not perfect compatibility though. E.g. some older network cards, installers for older operating systems, specialized hardware etc. might not work properly with the 3.0 port but will work in the 2.0 port:
https://superuser.com/questions/1112714/why-do-modern-computer-cases-still-have-usb-2-0-ports
That is particularly relevant for "work", where you might have specialized, old and very expensive equipment, hence many "business notebooks" did leave one port as usb 2.0.
My friend's flipper zero would only work with the 2.0 port on his computer.
It's really just price. USB has backwards compatibility built right into the standard: if you connect a USB 1.0 device to a USB 3.0 device, the USB 3.0 device will simply operate at USB 1.0 speeds to ensure the 1.0 device works properly.
Yes. I'm glad you asked. The 2.0 emitted about 3 khm per second less than the 1.5. This may not seem small, but lets not forget that the sound of a kJ is a fraction of a second, less than it would take you to blink an eye.
greatvaluemeeseeks t1_j6gyfmm wrote
>Is it to avoid potentially overloading some kind of power or data connection? Or is it something else?
Sort of. The biggest factor, as everyone has alluded to, is price, but a second bottleneck is the number of PCI Express lanes your computer has. Peripherals like USB, networking ports, GPUs and hard drive controllers talk to your processor over PCIe lanes. A modern AMD AM5 motherboard has up to 28 Gen 4 PCIe lanes, and an Intel LGA 1700 board can support up to 20 Gen 4 PCIe lanes. Motherboard manufacturers need to divide these lanes up between things that use the most bandwidth like NVMe SSDs, things people think need lots of bandwidth like GPUs, and things that really don't need much at all like your keyboard or PS5 controller. To utilize the full performance of a USB 3.1 port you need to dedicate an entire PCIe lane; a USB 2.0 port is 10x slower than a USB 3.1 port, so you can theoretically support 10 USB 2.0 ports at full performance on just one lane.