LOL, I forgot about that. Fair point.
So sad for Microsoft that as soon as they decided to copy another one of Apple’s worst ideas, Apple moved up to 11 instead of 10.16.
That’s when Windows 10 stops getting security updates. Expect most software vendors to drop support for Windows 10 this year if they haven’t already. That doesn’t necessarily mean things will stop working, but it won’t be tested, and vendors won’t spend time fixing Win10-specific problems.
In enterprise, you can get an additional three years of “extended security updates”. That’s your grace period to get everyone in your org upgraded.
While I strongly relate to anyone who hates Windows 11, “continue using Windows 10 forever” was never a viable long-term strategy.
Windows 10 was released in 2015. Ten years of support for an OS is industry-leading: on par with Red Hat or Ubuntu’s enterprise offerings, and far ahead of any competing consumer OS. Apple generally offers only three years of security updates, Google provides three to four years, and Debian gets five.
There has never been a time in the history of personal computing when using an OS for over 10 years without a major upgrade was realistic. That would be like using Windows 3.1 after XP was released. Windows 10 is dead, and it’s been a long time coming.
Now go download Fedora.
Half the movies released in 3D during the last wave were poorly done conversions not even shot for 3D.
Only half? -_-
I’ve only seen a few movies that were actually filmed in 3D. Even Gravity was filmed in 2D.
The problem is that actually filming in 3D requires different (and expensive) hardware, and different creative direction across the board. You can’t just upgrade to a 3D camera and call it a day. Not many studios will put in that kind of effort for something that isn’t proven in the market. And not many filmmakers are actually skilled at working in 3D, simply due to lack of direct experience.
I saw the Hobbit movies in high framerate 3D in the theater, and while they were not good movies, they looked absolutely amazing because they were committed 100% to the format from start to finish — not just with the hardware, but with the lighting, makeup, set design, everything. It’s a shame the movies sucked, and it’s a shame that there has never been a way to watch them in HFR 3D outside of select theaters.
They’re like 20 years too late to start copying Apple here. Apple had their shit together with their product line for a good while after Steve Jobs returned and eliminated the absolute insanity of Apple’s mid-90s lineup, which had at least three times more models than any sane person would find useful.
But recently, Apple went off the deep end. Boggles the mind that “Pro Max” ever made it past the brain-mouth barrier in a boardroom, let alone into an official product lineup.
Yep. AGI is still science fiction. Anyone telling you otherwise is probably just trying to fool investors. Ignore anyone who is less than three degrees of separation away from a marketing department.
The low-hanging fruit is quickly getting picked, so we’re bound to see a slowdown in advancement. And that’s a good thing. We don’t really need better language models at this point; we need better applications that use them.
The limiting factor is not so much hardware as our knowledge and competence in software architecture. As a historical example, just ten years ago computers were nowhere near top-level at Go. Then DeepMind developed AlphaGo, a huge leap forward that could beat a top pro; it ran on a supercomputer cluster. Thanks to the research breakthroughs around AlphaGo, within a few years we had similar AI that could run on any smartphone and beat any human player. It’s not because consumer hardware got that much faster; it’s because we learned how to make better software. Modern Go engines are a fraction of the size of AlphaGo and generate similar or better results with a tiny fraction of the operations. And it seems like we’re pretty close to the limit now: a supercomputer can’t play all that much better than my laptop.
Similarly, a few years ago something like GPT-3 needed a supercomputer. Now you can run a model with similar performance on a high-end phone or a low-end laptop. Again, it’s not because hardware has improved that much; the difference is the software. My current laptop (a 2021 model) is older than ChatGPT (publicly launched in late 2022), and it can easily run superior models.
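To make that concrete, here’s roughly what “run it on a laptop” looks like today. This is a minimal sketch assuming you’ve installed the llama-cpp-python package and downloaded some quantized GGUF model; the file name below is just a placeholder, not a specific recommendation.

```python
# Minimal local-inference sketch using llama-cpp-python.
# Assumes: `pip install llama-cpp-python` and a quantized GGUF model
# saved under ./models/ (the filename below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model-q4.gguf",  # placeholder path
    n_ctx=2048,      # context window
    n_threads=8,     # CPU threads; tune for your laptop
)

out = llm(
    "Q: Why do quantized models fit on consumer hardware? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

A 7B model quantized to 4 bits is roughly 4 GB on disk, which is why an ordinary laptop with 16 GB of RAM handles it entirely on CPU.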
But the returns inevitably diminish. There’s a limit somewhere. It’s hard to say exactly where, but entropy’s gonna getcha sooner or later. You simply cannot fit more than 16GB of information in a 16GB model; you can only inch closer to that theoretical limit, and specialize into smaller scopes. At some point the world will realize that trying to encode everything into a model is a dumb idea. We already have better tools for that.
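Back-of-envelope version of that argument, just to put numbers on it. The quantization levels here are illustrative, not tied to any particular model:

```python
# Back-of-envelope: a 16 GB model file at various weight precisions.
# Whatever the parameter count, the information content is capped at 16 GB.
SIZE_GB = 16
size_bits = SIZE_GB * 1e9 * 8        # hard ceiling: 1.28e11 bits

for bits_per_weight in (16, 8, 4):   # common weight precisions
    n_params = size_bits / bits_per_weight
    print(f"{bits_per_weight}-bit weights -> ~{n_params / 1e9:.0f}B parameters, "
          f"still at most {SIZE_GB} GB of information")
```

Quantizing harder buys you more parameters in the same footprint, but never more total information. That’s the entropy wall.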
I know this is from 2015, but even then, it was a bit late to make this argument. This was already mainstream enough in the 90s to be the punchline in syndicated comic strips. By 2015, we already had “customer experience engineers” (i.e. tier-1 helpdesk). The ship has not only sailed, it has sunk.
Anyway, the phrase originated in an era when programming was very different from what it is today, when most programmers came from a background in electrical engineering or something along those lines.
Apple’s monitors have an entire OS in them. They have much of the same internals as an iPad. Honestly, I have no idea why, because they don’t do anything especially fancy.
Samsung makes “smart monitors” with Tizen or some shit like that.
I think it’s just for enterprise contracts, yeah.
Fedora seems like a good general-purpose pick to me: it’s modern, it has a large community, and it’s easy enough to install and use. It offers similar advantages to Ubuntu, namely a large community and broad commercial third-party support, without Ubuntu’s downsides of outdated packages and lagging support for new hardware. I think Fedora is less likely to hit show-stopping limitations than a lot of other distros, even beginner-friendly ones like Mint.
But that’s just one opinion. There’s nothing wrong with Ubuntu or its derivatives. I’ve heard good things about Pop!_OS as well, though I’ve never tried it myself.