I don’t think you’ll need much more than 8K at 144 Hz. Your eyes just can’t see that much. Maybe the connector will eventually get smaller, like USB-C?
I can see a clear difference between 144 Hz and 240 Hz, so even that part isn’t right.
And I haven’t used an 8K monitor, but I’m confident I could see a difference as well.
Yeah, the refresh rate might go a bit higher. But I doubt many people could tell the difference between 8K and 16K.
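Out of curiosity, some back-of-the-envelope numbers. This is just uncompressed pixel-rate arithmetic (assuming 10-bit color and ignoring blanking intervals and link overhead), so treat it as a rough lower bound rather than exact link-rate math:

    # Rough uncompressed video bandwidth, ignoring blanking intervals,
    # encoding overhead, and Display Stream Compression.
    def raw_gbps(width, height, bits_per_channel, channels, fps):
        return width * height * bits_per_channel * channels * fps / 1e9

    print(f"8K @ 144 Hz: {raw_gbps(7680, 4320, 10, 3, 144):.0f} Gbit/s")  # ~143
    print(f"8K @  60 Hz: {raw_gbps(7680, 4320, 10, 3, 60):.0f} Gbit/s")   # ~60
    print(f"4K @ 240 Hz: {raw_gbps(3840, 2160, 10, 3, 240):.0f} Gbit/s")  # ~60

Uncompressed 8K at 144 Hz comes out around 143 Gbit/s, well beyond the roughly 77 Gbit/s of payload a DisplayPort 2.x link can carry, so as far as I know you’d still need Display Stream Compression to drive that over a single cable today.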
USB-C is already the plug standard for DisplayPort over Thunderbolt. Apple monitors use this to daisy-chain displays together.
But it’s DP implemented within USB, versus an actual DP connection. There’s latency there, which might matter for real-time online gaming.
USB-C is just the connector. The cables are Thunderbolt. Thunderbolt 3 and 4 are several times the speed of USB 3.2. There’s no latency issue.
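For reference, the nominal link rates as I remember the specs (marketing maximums, before protocol overhead):

    # Nominal link rates in Gbit/s (spec maximums, not real-world throughput).
    link_rates_gbps = {
        "USB 3.2 Gen 2": 10,
        "USB 3.2 Gen 2x2": 20,
        "Thunderbolt 3 / 4": 40,
        "DisplayPort 2.0 UHBR20": 80,
    }

    for name, gbps in sorted(link_rates_gbps.items(), key=lambda kv: kv[1]):
        print(f"{name:24s} {gbps:3d} Gbit/s")

So Thunderbolt 3/4 is about four times USB 3.2 Gen 2 and twice Gen 2x2, which lines up with “several times the speed”.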
Yeah, but there are still signal-processing steps required to tell both ends of the connection which pins to use to implement the DisplayPort protocol.
Let’s hope that bus is dedicated, and not overloaded with other USB tasks!
It is. That’s always been one of the key features of Thunderbolt. There’s no negotiation or latency involved after you plug it in. It just works.
Right, the pins just automatically know which to use without any negotiation from the USB bus. Come on.
Edit: Your reaction to this friendly debate is, for lack of a better word, hilarious. The fact that I’ve been upvoting your responses out of mutual respect while you’ve clearly been downvoting mine speaks volumes.
Just because you don’t know how any of this works, don’t lash out at me in anger.
Blocked
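For anyone still following that sub-thread, here’s my rough understanding of what actually happens when you plug in a USB-C display cable. The message names are paraphrased from the USB Power Delivery / DisplayPort Alt Mode flow and this is illustrative pseudocode, not a real driver API, but the gist is that there is a one-time handshake over the CC pin at plug-in, after which the high-speed pairs carry native DisplayPort with no ongoing protocol translation:

    # Conceptual sketch of DisplayPort Alt Mode entry over USB-C.
    # Illustrative only; message names approximate the USB PD structured
    # VDM flow and are not an actual API.
    PD_HANDSHAKE = [
        "Discover Identity",         # who is on the other end of the cable
        "Discover SVIDs",            # which vendors' Alt Modes the sink supports
        "Discover Modes",            # supported pin assignments (e.g. C, D, E)
        "Enter Mode (DisplayPort)",  # commit to DisplayPort Alt Mode
        "DP Status / DP Configure",  # choose 2 lanes DP + USB 3, or 4 lanes DP
    ]

    def plug_in():
        # One-time negotiation, carried over the CC (configuration channel) pin.
        for message in PD_HANDSHAKE:
            print(f"PD handshake: {message}")
        # From here on, the USB-C mux routes the high-speed pairs straight to
        # the DisplayPort source; video is native DP signaling, not tunneled
        # through USB data transfers.
        print("Mux configured: pins assigned, video flows as plain DisplayPort.")

    plug_in()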
640K of RAM should be enough for anybody
You think that’s a clever analogy but it’s not even close.
Well, I don’t know much about the resolving power and maximum refresh rate of human vision, but I’m guessing that the monitor they described is close to the limit.
The analogy refers to someone whose thinking is constrained to the current situation. They didn’t imagine that computers would become resource-intensive multimedia machines, just as this person suggests the cable wouldn’t be asked to carry more data than is necessary for the 8K monitor.
I can imagine a scenario with dual high-resolution screens, cameras, and location-tracking data passing through a single cable for something like a future VR headset. This may end up needing quite a bit more data throughput than the single monitor, and that isn’t even thinking outside the box. That’s still the current use case.
Do you have a crystal ball over there? I still think it’s a clever analogy.
Well, then maybe you should read the comment you replied to again? They did not talk about how much data the cable would need; they even hypothesized that the cable format might change. The meme talks about defining HD, and they commented that 8K would be enough. Human eyes will not magically gain more resolving power, so yeah, your analogy is still bad.
I do disagree on the Hz, though. It would indeed be nice if we got 8K at 360 Hz at some point in the future, but that’s not resolution, so I’ll let it slide.
Fair enough.
Just be sure to keep lubricated while you permit all that sliding.