I have a question for some of the more technical folk here.

I've been playing with the numbers for stove output and wondering if I have this right. I've calculated that it takes about 317 BTU (from memory, as I don't have the paper in front of me) to raise 1 L of 20°C (68°F) water to boiling. If a stove puts out a typical 10,000 BTU, presumably per hour, it would take about 1.9 minutes to boil that water assuming 100% heat transfer. So if a stove actually takes 4 minutes, it would only be about 47.5% efficient? I've also noticed that most stoves are rated at just about 10,000 BTU; even the large Coleman camp stove/grill I have is rated at 10K BTU for both the burner and the grill assembly. I'm curious about the 10K ceiling/limit: does it have to do with the output properties of the hydrocarbon fuels?
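In case it helps anyone check my arithmetic, here's a quick Python sketch of the calculation. The 2.2046 lb per liter and 1 BTU per lb·°F figures are standard; the 10,000 BTU/hr is just the nameplate rating I'm assuming:

```python
# Sanity check of the boil-time arithmetic above.
# Standard figures: 1 L of water ~ 2.2046 lb; 1 BTU raises 1 lb of water 1 F.

WATER_LB_PER_L = 2.2046   # mass of 1 L of water in pounds
START_F = 68.0            # 20 C starting temperature
BOIL_F = 212.0            # boiling point at sea level

# Heat needed to bring 1 L from 68 F to 212 F
btu_needed = WATER_LB_PER_L * (BOIL_F - START_F)
print(f"BTU to boil 1 L: {btu_needed:.0f}")          # ~317 BTU

# Ideal boil time on a 10,000 BTU/hr burner at 100% heat transfer
rated_btu_per_hr = 10_000                            # assumed nameplate rating
ideal_min = btu_needed / rated_btu_per_hr * 60
print(f"Ideal boil time: {ideal_min:.1f} min")       # ~1.9 min

# Implied efficiency for an observed 4-minute boil
observed_min = 4.0
print(f"Implied efficiency: {ideal_min / observed_min:.1%}")  # ~47.6%
```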

I'm trying to wrap my head around this because of my recent stove purchase; specifically, how an old Coleman 440 rated at (digging deep into the recesses of my brain) 10,000 BTU and a brand-new Brunton Vapor AF, also rated at 10,000 BTU, could vary in boil times by such a great amount: the Coleman 440 boils in around 4 minutes, the Vapor AF in 12+. Something just isn't adding up in my mind as to how stoves could vary so much in efficiency when most are designed very similarly. If someone could go through the technical aspects of this I'd appreciate it. Thanks.
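Running the same math on both stoves puts a number on what's bugging me. This assumes both actually deliver their 10,000 BTU/hr nameplate output at the flame, and the boil times are the ones I observed:

```python
# Same nameplate rating, different observed boil times -> implied efficiency.
# Assumption: both stoves actually deliver 10,000 BTU/hr at the flame.

BTU_TO_BOIL_1L = 317.5                       # from the calculation above
IDEAL_MIN = BTU_TO_BOIL_1L / 10_000 * 60     # ~1.9 min at 100% transfer

observed_min = {"Coleman 440": 4.0, "Brunton Vapor AF": 12.0}
for stove, minutes in observed_min.items():
    print(f"{stove}: {IDEAL_MIN / minutes:.0%} implied efficiency")
# Coleman 440: 48%; Brunton Vapor AF: 16%
```

A threefold difference in heat actually getting into the pot seems like a lot for two burners with the same rating, which is why I suspect I'm missing something.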