I'm not arguing that inflation is the justification for why console prices should or would go up. But I do believe they will, and should.
My reasoning for this is what I call "necessary tech", or better yet let's call it an advancement tax.
Prior to the PS360 gen, consoles basically had a fixed set of components: processor, disc drive, controller, etc. Come the PS360 gen, things like HDDs, cooling systems (or at least more elaborate ones), more expensive disc drives, and a slew of wireless apparatus were tacked on. Not much changed with the move into the current gen. With next gen, the tech tax will land on RAM, SSD, and cooling. I believe those three areas will have them spending more than they did this gen, while most everything else will remain about the same. They may also end up spending more on their respective APUs.
So for me, it's not about inflation, but that as time progresses certain kinds of tech become effectively mandatory, and they add up to higher prices.
E.g., a Blu-ray drive will cost no less than $25; a UHD drive will cost no less than $35. A 1TB HDD will cost no less than $25; a 1TB SSD at launch may just not be able to cost any less than $40. Cooling a 160W console may have cost no less than $40; cooling a 200W+ system may require more advanced cooling and cost no less than $60. We are already $45 "more expensive" and we haven't even gotten to things like RAM and the APU.
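To make the arithmetic explicit, here's a quick sketch that sums the per-component increases using the floor prices from the paragraph above (all figures are the rough illustrative estimates from this post, not sourced BOM data):

```python
# Floor-price estimates per component, current gen vs. next gen (illustrative only).
current_gen = {"disc_drive": 25, "storage": 25, "cooling": 40}  # BD drive, 1TB HDD, ~160W cooling
next_gen    = {"disc_drive": 35, "storage": 40, "cooling": 60}  # UHD drive, 1TB SSD, 200W+ cooling

# Per-component increase, then the total "advancement tax" before RAM and the APU.
delta = {part: next_gen[part] - current_gen[part] for part in current_gen}
total_delta = sum(delta.values())

print(delta)        # per-part increases: drive +10, storage +15, cooling +20
print(total_delta)  # 45
```

The point isn't the exact numbers, which are guesses, but that the deltas only ever stack in one direction.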
You're making a slightly different, but equally interesting point, IMO.
What you are cluing into is that we are reaching a point of diminishing returns on Moore's Law, whereby to extract a significant (or worthwhile) advance in performance, you can no longer rely on silicon improvements alone. Across the "modern age" of gaming, from the PS1 gen to the PS4 gen, the silicon advancements alone were astronomical. The PS1 CPU was on a
500nm process; the PS4 Pro is on a 16nm process, and next gen looks to be targeting 7nm. Combine that with an amazing run of improvements in rendering techniques, plus all the "secret sauce" rendering being built natively into the silicon, and other market factors (desktop rendering, the growth of CD/DVD/BD, massive advancements in storage density), and it all combined to create a golden age of graphics computing.
Process improvements between generational shifts were enough that you could mostly count on them not only to bring down costs (reducing price over time) but to deliver a significant performance bump without really increasing die area. The run of console improvements (as a percentage of growth) on $200-$300 boxes from 1995 - 2013 may never be seen again.
Going forward, as Moore's First Law slows, the application of Moore's Second Law (or Rock's Law) becomes more apparent: it's going to get more costly to produce these upcoming nodes. And the market dynamics around storage and media are no longer in play. So the component costs we've come to associate with consoles are not just fixed; in some cases they are stagnating or getting more expensive.
So you are completely right - one of 3 things has to happen.
1. You're going to see smaller and smaller percentage leaps in performance between generations if you want low launch prices and price drops.
2. The price of consoles is going to have to go up in order to provide meaningful differentiation, and you will see fewer price drops over time. This is your "advancement tax".
3. Console manufacturers will have to eat more loss on HW to offset #1 and #2.
So your point is super valid. Inflation has nothing to do with it. If you're paying attention to the technology, the console business is going to come under tremendous pressure as it becomes harder and harder to deliver huge leaps in performance every few years at the price points that were historically successful.
I don't paint this as doom-and-gloom for consoles, only that the traditional boundaries are going to have to be broken, and accepted, at some point.
Fun times ahead!