As I understand it, the difference between the nominal wattage rating of older (prewar and postwar) Lionel (and others') transformers and their actual, useful output has nothing to do with input vs. output losses (there are not 30 percent losses in a Lionel transformer), nor with warming up, nor with volt-amps vs. watts.
The apparent “discrepancy” is due to the use of power ratings based on peak voltage v. RMS voltage.
AC, as we all know, cycles from a peak voltage, down through zero, and then back up to the peak in the opposite direction. Effective/useful voltage is less than peak voltage; the peaks are roughly 1.4 times (the square root of 2 times, to be exact) the effective (Root Mean Square, or RMS) voltage.
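The peak-to-RMS relationship above can be checked numerically: sample one full cycle of a sine wave, take the square root of the mean of the squared samples, and compare to the peak. The 18 V figure here is purely illustrative (a plausible toy-train track voltage), not from the post.

```python
import math

# Illustrative: a hypothetical 18 V RMS sine-wave transformer output.
v_peak = 18.0 * math.sqrt(2)  # peak is sqrt(2) times the RMS value

# RMS from first principles: sqrt of the mean of v(t)^2 over one cycle.
n = 10000
samples = [v_peak * math.sin(2 * math.pi * i / n) for i in range(n)]
v_rms = math.sqrt(sum(v * v for v in samples) / n)

print(round(v_rms, 2))           # recovers 18.0
print(round(v_peak / v_rms, 3))  # ratio is 1.414, i.e. sqrt(2)
```

The same calculation works for any periodic waveform; the 1.414 factor is specific to sine waves, which is what house current (and so a train transformer's output) is.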
Now, early in the 20th century, in the formative years of the electrical industry and, more particularly, the consumer radio business, radio manufacturers managed to get "peak" power established as the industry standard, in a power race you might liken to the auto horsepower race of the 1950s and 1960s. IOW, power ratings were based on peak voltage rather than RMS voltage. In the Teens and Twenties, when transformers were being brought into the toy train world, they were considered "power transformers" and rated the same way, leading to nominal ratings a third or so greater than the actual power available based on RMS voltage. This practice continued until sometime in the 1970s.
This is why a 270 watt contemporary transformer (MRC, for example) will actually deliver close to 270 watts (allowing for some modest internal losses), while a 275 watt (peak) postwar ZW is hard put to manage 200 watts of useful power (RMS).
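One reading of the arithmetic behind that comparison: if the nominal rating follows peak voltage while useful power follows RMS voltage, the usable figure is the rating divided by the square root of 2, about 0.707 of nominal. This is a sketch of that interpretation, using the 275 W ZW figure from the post.

```python
import math

# If the nameplate rating is based on peak voltage, the useful (RMS-based)
# power is roughly the rating divided by sqrt(2) -- about 71% of nominal.
nominal_peak_rating = 275.0          # postwar ZW nameplate, in watts
useful_rms_watts = nominal_peak_rating / math.sqrt(2)

print(round(useful_rms_watts))  # about 194 -- "hard put to manage 200 watts"
```

That ~194 W result, further reduced by internal losses, matches the post's observation that a "275 watt" ZW struggles to deliver 200 useful watts.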
And just BTW, one reason for all the confusion is that Lionel dissembled over the years, talking about "warming up," etc., in instruction books and other material, probably because they didn't want to get into the technical issues of peak vs. RMS.