I’ve stuck with the non-1M-context Opus 4.6 and it works really well for me, even with ongoing context compression. I honestly couldn’t deal with the 1M context change, and then the compounding token-devouring nonsense of 4.7 on top of it.
I sincerely hope Anthropic is seeing all of this and taking note. They have their work cut out for them.