

This is demonstrably wrong on a scale where it loops around to becoming hard to explain, so that’s a neat trick.
There are enough people who have never heard of or don’t understand the concept of virtual machines to keep Windows as the biggest mainstream OS several times over. There isn’t a “root layer” in computers as far as normal humans are concerned. They’re computers and then a Windows pops up and that’s how that works.
At the very most, they understand translation layers from having gone from an old Intel MacBook to a new Apple silicon one, and even that is maybe a tenth of the market (still several times bigger than Linux adoption, though).
The idea that a mass of people is waiting on the sidelines, champing at the bit for direct GPU passthrough via an extra layer of software fine-tuning so they can run some brand-name Windows app with no Linux version is absurd. Even games aren't the problem, as evidenced by that being mostly solved via Proton without moving the needle much.
I don’t mind either way, but man, consider what other assumptions you may be making that are wildly off, particularly if they’re on something more important than your hopes for relative OS market share on home computers.

They already made it simple, are you kidding me? You are running a different OS inside an entirely fictitious computer that doesn't exist, and it takes a few clicks to set up with stock software that ships with your OS or is freely downloadable online. The whole thing is magic.
Magic that is still way below the awareness of common users. I'm not claiming "no one" wants to use VMs; I'm telling you that, at scale, this is not key functionality for the vast majority of the userbase. Which is entirely accurate.
And because the vast majority of the userbase is on Windows and doesn’t even know this would be a problem, that’s not WHY they’re on Windows or not on Linux. It’s not even a “tiny brain” thing, it’s just what people use (and don’t use) computers for.