The answer to the question in the headline is "no". The line of argument is extraordinarily tenuous: the paper first argues that reaction time is a good proxy for intelligence, then collects different reaction-time studies, weights them by sample size, and computes a correlation. Unfortunately, the main graph of the paper looks like a trend-line drawn through noise. But! Because of the sample-size weighting, the data is dominated by a single study from 1889, which had 3410 participants (the largest sample of the 16 studies) and an unusually fast mean reaction time (the fastest of them all). They then use frequentist statistics to sneak in the assumption that reaction time varies only with different subjects and populations, and not with different test apparatus and methodology, arriving at a "significant at p=.003" result in which I have no confidence.
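For what it's worth, here's a toy simulation of that last point (all numbers invented for illustration, not taken from the paper): give one early study a huge sample and a fast mean reaction time, make the other fifteen pure noise, and a sample-size-weighted correlation happily finds a "trend" that vanishes the moment you drop that single study.

    import numpy as np

    rng = np.random.default_rng(0)

    # Entirely made-up numbers -- NOT the paper's data.
    # One huge 1889 study with a fast mean RT, plus 15 smaller later studies
    # whose mean RTs are pure noise around 275 ms.
    years = np.concatenate(([1889], rng.integers(1940, 2010, size=15)))
    n     = np.concatenate(([3410], rng.integers(30, 400, size=15)))
    rt    = np.concatenate(([183.0], rng.normal(275.0, 25.0, size=15)))

    def weighted_corr(x, y, w):
        """Pearson correlation with per-study weights (here, sample sizes)."""
        c = np.cov(x, y, aweights=w)
        return c[0, 1] / np.sqrt(c[0, 0] * c[1, 1])

    print("unweighted r(year, RT):           %+.2f" % np.corrcoef(years, rt)[0, 1])
    print("n-weighted r(year, RT):           %+.2f" % weighted_corr(years, rt, n))
    print("n-weighted r, 1889 study dropped: %+.2f" % weighted_corr(years[1:], rt[1:], n[1:]))

The weighting isn't wrong in principle, but it means the headline correlation is mostly a statement about how one 1889 dataset compares to everything else, not about a steady trend across studies.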
That was my first thought. Alternative hypothesis: narrow measures of g trend differently depending on the environment a cohort grew up in, so Victorians who rote-learned under cane-wielding teachers did better when tested on instant responses, while modern people raised on the visual stimuli of modern media do much better when tested on considered pattern recognition.
Would love to be able to actually read the article though.
I'm not confident that you can really correlate simple reaction times with intelligence in general. But if our reaction times are indeed getting longer, that's interesting for other reasons.
Consider the relatively recent development of video games, many of which depend heavily on reaction time as a basic skill. We spend our childhoods immersed in these twitch-critical tasks: training the Victorians would have envied, if they had been all that concerned about simple reaction times. Yet our reaction times continue to get longer, seemingly unabated. Why would that be the case? Shouldn't the extra practice at least slow the decline?