Stalvern wrote:
Sonicx9 wrote:It also makes sense why older systems had a different look and feel back then!
This was because the limited power practical at the time highlighted the differences between hardware developers' decisions about what their systems could and couldn't do; with so little for a manufacturing budget to buy, it was obvious whether the money went to a bigger palette or larger sprites or more sound channels or what have you. And this was entirely due to the graphics and sound chips, not the CPU - the 6502 was in everything from the Apple II to the C64 to the NES, but it's impossible to mistake any of those systems for each other because of everything else in their designs. As technology progressed, these differences (again, having almost nothing to do with CPU choice) inherently decreased with hardware's convergence on higher and higher fidelity. The reason you can barely tell the difference between the Xbone's graphics and the PS4's isn't that they have the same CPU architecture but that they're putting more than 16 colors on the screen and playing more than three channels of sound. The immediately obvious technical differences between systems like the Intellivision and ColecoVision are impossible with the power of today's hardware.
Sonicx9 wrote:Not to mention, let's look at an example of the Switch's improvement from launch to a year later: https://www.youtube.com/watch?v=cbRA1mCbrac vs. https://www.youtube.com/watch?v=rMfpXkROIMw. Neither is perfect, but the latter proves that over time companies do get more familiar with the CPU architecture and get games looking and running better as they learn. But when it's plain-Jane x86, they know what they are doing from the get-go. Because look at this example: https://www.youtube.com/watch?v=ZSI701GEWsA and https://www.youtube.com/watch?v=7n86TiqEs-k ran at the same resolution two years later, which shows they cannot optimize much more outside of using the PS4 Pro and Xbox One X, which shows that x86 is not always better, FYI.
What point are you trying to make? The Switch's situation is the very definition of worse. If it takes longer for developers to catch up to the hardware, what possible advantage is that? And even at its best, it still never achieves the graphical detail of its competitors.
Is your entire thesis that the Switch's CPU is "better" because developers have to put in more work and still get far less out of it than they could from the PS4? I'm honestly struggling to understand your thoughts here.
But again, and I can't stress this enough, the CPU architecture is unrelated to this. The Switch is what it is (in a word, weak) because of its GPU, RAM, and storage media. And, yet again, nobody is "digging" into any "metal" (side note: for the love of God, stop typing those words and think of a single other way to express the concept of low-level development) on any modern systems, Switch or otherwise, in the way that you're talking about. Programming the Switch in assembly is a fool's errand.
Sonicx9 wrote:I am not a fan of the Atari Jaguar, but I love the Sega Saturn for its games. But back then, when consoles were not x86-based, companies had no choice but to do metal digging, and it worked even if the hardware was harder to work with.
If you're going to bring up the fifth console generation, you have to acknowledge that, like the PlayStation 4 and Xbox One, the PlayStation and N64 had the same CPU architecture, and they couldn't be more different from each other. It means nothing at all.
If you have one YouTuber to blame, it is ReviewTechUSA, as I was heavily inspired by him, right down to saying "digging in the metal", in this video: https://www.youtube.com/watch?v=ZSI701GEWsA