I'd agree with you, but ever since mobile devices took off, aren't we much worse off than before? In the last peak PC years, say before 2011, I was under the impression that hardware vendors were starting to play ball, but now things seem super locked down, and Linux seems to be falling behind in this tug of war between FOSS and binary-only blobs.
I think the problem with mobile devices is not the software but the hardware.
These devices are locked down partly because of business interests (planned obsolescence), but another part is personal identity security.
A run-of-the-mill Android or iOS device carries far more secrets, and far weightier ones (biometric data, TOTP seeds, serial numbers used as secure tokens/unique identifiers, etc.). This makes the device a "trusted security device," and allowing tampering with it opens an unpleasant can of worms. For example, during my short Android stint, I found out that no custom ROM can talk to the secure element in my SIM, so I'm locked out of my e-signature, which is not pleasant.
If manufacturers can find a good way to make these secure elements trustworthy without locking down the platform and welding it shut, I think we can work around graphics and Wi-Fi drivers.
Of course, we also have the "radio security" problem. Still, I think it can be solved by moving the wireless radio into an independent IP block inside the processor, with a dedicated mailbox and firmware system. While I'd love to have completely open radio firmware, the wireless world is much more complex (I'm a newbie ham operator, so I have some slight idea).
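To make the "independent IP block with a dedicated postbox" idea concrete, here is a toy Python sketch (every name here is made up for illustration; this is not any real SoC's interface). The host can only post commands and read responses through the mailbox; the radio's firmware owns its internal state and validates every request, which is how regulatory limits could be enforced without closing the whole platform:

```python
# Toy model of an isolated radio IP block reachable only via a mailbox.
# All class/field names are hypothetical, purely for illustration.
import queue

class RadioBlock:
    """Simulated isolated IP block: owns its state, exposes only a mailbox."""

    def __init__(self):
        self.inbox = queue.Queue()    # host -> radio commands
        self.outbox = queue.Queue()   # radio -> host responses
        self._regs = {"tx_power_dbm": 10}  # internal; host cannot touch this

    def step(self):
        # "Firmware" validates every request before acting on it.
        cmd, value = self.inbox.get()
        if cmd == "set_tx_power" and 0 <= value <= 20:  # regulatory cap
            self._regs["tx_power_dbm"] = value
            self.outbox.put(("ok", value))
        else:
            self.outbox.put(("rejected", value))

radio = RadioBlock()
radio.inbox.put(("set_tx_power", 15))  # legal request -> accepted
radio.step()
radio.inbox.put(("set_tx_power", 99))  # over the cap -> firmware refuses
radio.step()
```

The point of the design is that even a fully open host OS can't push the radio out of spec, because the only channel between them is the message interface.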
So, the reasons for closing down a mobile device are varied, but the list indeed contains the desire for more money. If one of the hardware manufacturers decides to spend the money and pull the trigger (like AMD did with its HDMI/HDCP block), we can have secure systems that do not need locking down. Still, I'm not holding my breath, because while Apple loves to leave doors for tinkerers on their laptops, iPhone is their Fort Knox. On the other hand, I don't have the slightest confidence in the company called Broadcom to do the right thing.
Interesting details, thank you for providing them!
Regarding this:
> Still, I'm not holding my breath, because while Apple loves to leave doors for tinkerers on their laptops, iPhone is their Fort Knox.
People don't realize it, but 99% of what Apple does hinges on the iPhone. The rest of their products would pack a much weaker punch if the iPhone vanished from the face of the Earth completely. It's the product all their customers have, and they have it with them at all times. It's also probably their easiest product to use and the easiest to connect to other things.
So yeah, the iPhone will probably be the last non-military device on the planet to be opened up :-)
What is your reason for believing that the closed-source nature of nVidia's graphics drivers played a role in their success? AMD's and Intel's Windows drivers are also closed source, and so were AMD's Linux drivers when nVidia secured its lead.
nVidia is also finally moving to an open-source kernel module, so a closed-source one doesn't seem important to them for keeping their moat.
Presumably amadeuspagel means nvidia has walked a careful line with CUDA & ML.
CUDA is much more accessible/documented/available than a lot of comparable products. FPGAs were even more closed, more expensive, and had much worse documentation. Things like compute clusters were call-for-pricing affairs; prepare to spend as much as a small house.
On the other hand, CUDA is closed enough that the chips that run it aren't a commodity. If you want to download an existing ML project and run it on your AMD card, someone will have to do the legwork to port it.
That means they've been able to invest quite a lot of $$$ into CUDA, knowing the spending gets them a competitive advantage.
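The lock-in described above can be illustrated with a minimal Python sketch. The function and backend names here are invented for illustration, not a real API; the shape of the problem is what matters — the only code path anyone bothered to write is the vendor-specific one, so every other backend is porting work someone must volunteer to do:

```python
# Hypothetical sketch of single-vendor lock-in in a compute library.
# run_on_backend and the backend names are illustrative, not a real API.

def run_on_backend(data, backend):
    """Double every element, but only the 'cuda' path was ever written."""
    if backend == "cuda":
        # Stand-in for the vendor-specific kernel the original authors wrote.
        return [x * 2 for x in data]
    # Any other hardware needs someone to write (and maintain) this path.
    raise NotImplementedError(f"no kernel implemented for {backend!r}")

print(run_on_backend([1, 2, 3], "cuda"))  # works: [2, 4, 6]

try:
    run_on_backend([1, 2, 3], "rocm")     # AMD user: porting work required
except NotImplementedError as err:
    print(err)
```

Multiply this gap across every kernel in a large ML codebase and the "someone will have to do the legwork" point becomes the moat.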
nVidia built this bubble by playing dirty on many fronts.
Their Windows drivers are a black box that doesn't conform to many standards and behaves however nVidia sees fit, especially around memory management and data transfers. The GameWorks library actively sabotaged AMD cards (not unlike Intel's infamous compiler saga). Many nVidia-optimized games ran on completely unoptimized, or outright AMD/ATI-hostile, code paths. E.g., GTA3 ran silky smooth on an nVidia GeForce MX400 (a bottom-of-the-barrel card) while ATI cards three times as powerful stuttered. Only a handful of studios (kudos to Valve) optimized their engines for both and showed that a "paltry" 9600XT can do HDR at 60 FPS.
On the datacenter front, they actively undermined OpenCL and other engines by artificially performance-capping them (you can use only one DMA controller in Tesla cards, which actually have three DMA engines), slowing down memory transfers and removing the ability to stream data in and out of the card. They "supported" later versions of OpenCL, but made them impossible to compile on their hardware; only OpenCL 1.1 worked.
On the driver front, they have a signed-firmware mechanism, and they provide a seriously limited firmware to nouveau just to enable their hardware. You can't use any advanced features of their cards, because the full-capability firmware refuses to work with nouveau. Also, they're not really opening their kernel module: the secret sauce is moving into firmware, leaving only an open interfacing module behind. CUDA, GL, and everything else remain closed source.
On the other hand, they actively said, "The docs for the cards are in the open. We neither help nor sabotage the nouveau driver project. They're free," all while cooking up a limited firmware for those very developers.
They bought Mellanox, the sole InfiniBand supplier, to vertically integrate. I wonder how they will cripple the IB stack with licenses and such now.
They're the Microsoft of the hardware world. They're greedy and don't hesitate to make dirty moves to dominate the market. Because of what they did, I neither respect nor like them.
Then I guess Rhodium is the largest metal and the Hinkley Point C nuclear power station is the largest building in Britain.
If you conflate value with size, both terms become near-useless. Value can only mean something in relation to something else.
Picture this: a large company with thousands of employees has similar revenue to a competitor that accomplishes the same with one employee and a much smaller operation.
Which one is likely to be more valuable? Obviously the smaller one. If, however, we conflate value with size, as is so often done in popular economics, just pointing out this single fact becomes a complicated exercise in carefully employing language that we've neutered for no good reason at all. Not to speak of all the misunderstandings this will create with people who aren't used to this imprecise use of the English language.
If you mean revenue, say revenue; if you mean value, say value; if you mean size, say size. Don't use "large" to mean "valuable". Why would you do that when there's a perfectly good word already? Imprecise language is often used either to confuse or to leave open an avenue to cover one's ass later... which brings us back to popular economics.
> "Size" is unitless, so I disagree with your rationale.
You're going to have to expand that a little bit.
> valuation is a very common size metric
It's not a size metric.
A world where things grow larger the more people value them might be interesting though.
> and there was no confusion about OP's meaning.
Their comment makes much less sense if you replace "biggest" with "most valuable".
It's trivially obvious that the correlation between valuation and how much a company can invest in software is incredibly weak, if it exists at all. On software-development spending, NVidia is eclipsed by many companies with sometimes only a fraction of its valuation.
So either it's a non sequitur or we are incorrectly assuming that Nvidia became the largest company.
"Size" is not only a metric of physical dimension.
OP said "biggest", and meant "largest valuation". This happens to be incorrect -- nVidia was never the highest-valued public company -- but they were the second-largest, and came very close to first.
If you did not know what OP meant immediately from awareness of business news, you still should have considered "valuation" as one of the obvious possibilities. If you did not, then you might be lacking adequate context for this conversation, and might be better served by asking questions instead of demonstrating your confusion via misplaced pedantry.
Size is not a metric. You can measure size with a metric and you can measure value with another metric. Measuring both the same way only leads to nonsense. I think we're getting to the bottom of the confusion now.
> misplaced pedantry
Pedantry is the easiest way of dismantling comments that try to turn nonsense into an argument by being intentionally vague. If you argue directly against vague statements, the speaker can retroactively make them mean whatever they want, and you'll be chasing moving goalposts. Employ pedantry until they well and truly nail themselves down, and then explain why whatever is left is nonsense. Works like a charm.
Also, to get ahead of any further personal attacks, this pedantry absolutely is fun to me. I wouldn't be here otherwise.
You are simply wrong. Being condescending and wrong is a fatal mix.
Size is a unitless dimension. A category of metrics, if you must. OP's word of "bigger" can be applied to population, area, weight, importance, memorability, and yes, valuation.
Can be, and frequently is, among humans. Zero humans are confused.
> Pedantry is the easiest way of
... demonstrating that you're a jerk. Nothing else.
> this pedantry absolutely is fun to me
Got it. My mistake for assuming good faith.
> I wouldn't be here otherwise
That's the most disappointing thing I've read in a while.
> Size is a unitless dimension. A category of metrics, if you must.
Just make size a category of dimensions and I'd underwrite that. It certainly doesn't refer to a single dimension.
Mathematically, valuation would absolutely be a size/magnitude, but we're clearly not speaking in mathematical terms, given how the terminology is being abused. Mathematically, plenty of things that are a magnitude/size do not constitute a metric space, and the singular would be wrong anyway.
I'm taking metric to mean "standard of measurement", which is why size is still not a metric. Saying "size is a category of metrics" would be getting close enough I suppose, but really we're talking about the actual dimensions.
Now that we've got that out of the way, I'm still firmly grouping valuation as a measurement of value, and not a measurement of size. I'm also standing by the assertion that not having these two be disjoint sets only leads to confusion and nonsense.
> OP's word of "bigger" can be applied to population, area, weight, importance, memorability, and yes, valuation.
Nice try. They applied it to "company". You can do that, but now we're not talking about a company's value. We have the adjective valuable for that.
> A world where things grow larger the more people value them might be interesting though.
Do you personally grow larger when you have more money in the bank? When more of your friends get elected in the senate? When you hire someone? When you buy a new house?
This is not a difference between ChromeOS and Android. ChromeOS is replete with binary vendor blobs, which is why the webcams, wifi, and touchpads work correctly on Chromebooks and poorly or not at all on other laptops running vanilla Linux.