Say one video is upscaled from 480p to 1080p and another from 720p to 1080p. Does that mean scaling from 480p would take longer in milliseconds, while scaling from 720p would be much quicker?
One could argue that, since the gap between 480p and 1080p is larger, it might take longer.
But 480-line video is over seventy years old. Chances are the scaler runs a fast, well-worn routine for it, simply because there is so little input data compared to 720p.
So there is no real answer to this. Televisions with fast CPUs probably take about the same time to scale either source to 1080p.
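If you want a rough feel for the numbers, here is a minimal sketch that times upscaling synthetic 480p and 720p frames to 1080p on a PC. It assumes OpenCV and NumPy are installed; the frame sizes, filter, and run count are illustrative choices, not what any particular TV's scaler actually uses.

```python
# Rough benchmark: upscale synthetic 480p and 720p frames to 1080p
# and compare average wall-clock time per frame.
import time
import numpy as np
import cv2

TARGET = (1920, 1080)                      # 1080p output size (width, height)
SOURCES = {"480p": (854, 480), "720p": (1280, 720)}
RUNS = 200

for name, (w, h) in SOURCES.items():
    # Random noise stands in for a real video frame.
    frame = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
    cv2.resize(frame, TARGET, interpolation=cv2.INTER_CUBIC)   # warm-up
    start = time.perf_counter()
    for _ in range(RUNS):
        cv2.resize(frame, TARGET, interpolation=cv2.INTER_CUBIC)
    elapsed_ms = (time.perf_counter() - start) / RUNS * 1000
    print(f"{name} -> 1080p: {elapsed_ms:.2f} ms per frame")
```

On typical hardware the difference per frame is fractions of a millisecond either way, which is why in practice you won't notice one source "lagging" more than the other.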