There are 3 factors that contribute to movement judder:
1. 25FPS plays really poorly on 60Hz screens (the majority of them), because 60 does not divide evenly by 25
2. Sharp image filters, or the absence of softening (real cameras do not capture pixel-perfect imagery, so sharp CG image filters are especially visible as strobing on digital TVs)
3. Lack of motion blur. For some reason, almost every archviz animation misses motion blur, even the high-end ones ;)
Ludvik,
Which frame rate would you use to diminish the judder?
Any that gives you a whole number when you divide your screen's refresh rate by it. For example, 30 FPS:
60Hz/30FPS=2
120Hz/30FPS=4 (meaning it would also look nice on 120Hz TVs)
That's also why 24FPS movies play nicely with high-refresh displays:
120Hz/24FPS=5
144Hz/24FPS=6
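The whole-number rule above is easy to check programmatically. Here is a minimal sketch (the helper name `judder_free` and the list of refresh rates are my own, chosen for illustration): a refresh rate is judder-free for a given frame rate when it divides by it with no remainder.

```python
# Hypothetical helper: list which common refresh rates divide evenly
# by a given frame rate (a whole multiple means a judder-free cadence).
COMMON_REFRESH_RATES = [60, 120, 144]

def judder_free(fps, refresh_rates=COMMON_REFRESH_RATES):
    """Return the refresh rates that are whole multiples of fps."""
    return [hz for hz in refresh_rates if hz % fps == 0]

print(judder_free(30))  # [60, 120]  (60/30=2, 120/30=4)
print(judder_free(24))  # [120, 144] (120/24=5, 144/24=6)
print(judder_free(25))  # []         (no whole multiple among these)
```

Note how 25FPS comes up empty against all the common rates, which is exactly the judder problem described above.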
25FPS was the standard for European TVs, because the PAL standard (old European CRT TVs) ran at 25 frames per second, delivered as 50 interlaced fields per second (a 50Hz field rate). US TVs, however, used the NTSC standard, which ran at 29.97 frames per second (rounding nicely to 30FPS), again delivered interlaced at 59.94 fields per second. And since most computer hardware originated in the US, US standards influenced the mainstream refresh rates of today's computer screens (hence 60Hz being mainstream). Now that we are in the digital age, with most TV content both made and broadcast digitally, even TVs these days mostly adopt US/computer standards.
So in a nutshell:
24FPS will look good on most TVs, and won't look much worse than 25FPS on computer screens
25FPS will look bad on almost any computer screen; it may look good on some older TVs, and possibly on some new TVs too, due to smart adaptive mechanisms.
30FPS will look great on pretty much anything these days.
However, 30FPS means rendering 20% more frames than 25FPS, so render times/rendering resources are a significant consideration.
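The overhead is just the ratio of frame counts for the same clip duration, as this quick check shows:

```python
# Render-cost overhead of 30FPS over 25FPS for the same clip length:
# 30/25 = 1.2, i.e. 20% more frames to render.
extra_pct = (30 / 25 - 1) * 100
print(f"{extra_pct:.0f}% more frames")  # 20% more frames
```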
Also, keep in mind that avoiding sharp CG image filtering for animation and - always, no matter what - using motion blur are equally important for avoiding judder.