Be careful: exceeding roughly the original 200k-token window leads to progressively worse results. It's important to keep the context clean and tailored to the current task.
Yes, but at the same time having the 1 million-token context enabled is nice, because the model is aware that it has more context left and actually performs better. [0]