• areyouevenreal@lemm.ee · 7 months ago

    Not necessarily. The base machines aren’t that expensive, and this chip is also used in iPads. They support high-resolution HDR output. The more monitors you drive, and the higher their resolution, bit depth, and refresh rate, the more bandwidth display output needs and the more complex and expensive the framebuffers get. Another system might support 3 or 4 monitors but not the 5K output the MacBooks do. I’ve seen Intel systems that struggled to drive even a single 4K display at 60 Hz until I added another RAM stick to get dual-channel memory. Apple does 5K output. Sure, other systems might technically support more monitors in theory, but in practice you’ll hit limits if those monitors need too much bandwidth (rough numbers in the sketch below).
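
    As a back-of-envelope sketch: this assumes an uncompressed framebuffer scanned out once per refresh, so real hardware (which can compress the buffer and the link) will differ, but it shows how fast resolution, refresh rate, and bit depth multiply up.

    ```python
    def display_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel):
        """Raw scanout bandwidth in GB/s for one uncompressed framebuffer."""
        bits_per_second = width * height * refresh_hz * bits_per_pixel
        return bits_per_second / 8 / 1e9

    # 4K SDR at 60 Hz, 8 bits per channel (24 bits per pixel): ~1.5 GB/s
    print(f"4K60, 8-bit:  {display_bandwidth_gbps(3840, 2160, 60, 24):.1f} GB/s")

    # 5K HDR at 60 Hz, 10 bits per channel (30 bits per pixel): ~3.3 GB/s
    print(f"5K60, 10-bit: {display_bandwidth_gbps(5120, 2880, 60, 30):.1f} GB/s")
    ```

    So one 5K HDR screen already needs more than double the raw bandwidth of the 4K60 panel that choked that single-channel Intel box.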

    Oh yeah, and these systems also have to share bandwidth between the framebuffers, the CPU, and the GPU. It’s no wonder they didn’t put framebuffers for 3 or more very high resolution displays into the lower-end chips, which have less memory bandwidth than the higher-end ones. Even if it did work, the performance hit probably isn’t worth it for a small number of users.
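
    To put that sharing in perspective, here’s a rough sketch of how much of the shared pool several 5K displays would eat. The 100 GB/s figure is my assumption for a base M-series chip’s unified memory bandwidth, not a quoted spec, and raw scanout is a lower bound since compositing touches each frame again.

    ```python
    # Assumed unified memory bandwidth shared by CPU, GPU, and display engines.
    # 100 GB/s is a ballpark for a base M-series chip, not a quoted spec.
    TOTAL_MEMORY_BW_GBPS = 100

    def scanout_gbps(width, height, refresh_hz, bits_per_pixel):
        return width * height * refresh_hz * bits_per_pixel / 8 / 1e9

    per_5k = scanout_gbps(5120, 2880, 60, 30)  # one 10-bit 5K display at 60 Hz

    for monitors in (1, 2, 3):
        scanout = monitors * per_5k
        # Compositing typically touches each frame again (GPU write + scanout read),
        # so raw scanout understates what the displays actually consume.
        share = scanout / TOTAL_MEMORY_BW_GBPS * 100
        print(f"{monitors} x 5K: ~{scanout:.1f} GB/s scanout, ~{share:.0f}% of {TOTAL_MEMORY_BW_GBPS} GB/s shared")
    ```

    That’s roughly 10% of the assumed pool gone to scanout alone with three 5K screens, before the CPU and GPU get anything.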