I like that Fahrenheit has a narrower range for degrees. 1C is 1.8 degrees F. So, F allows you to have more precision without the use of decimals. Like, 71F feels noticeably different to me than 64F, but that is only about a 3.9 degree difference in C.
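For anyone who wants to check the arithmetic, here's a quick sketch using the standard conversion (an interval of ΔF degrees Fahrenheit spans ΔF / 1.8 degrees Celsius; function names are my own):

```python
def f_to_c(temp_f):
    """Convert an absolute Fahrenheit temperature to Celsius."""
    return (temp_f - 32) / 1.8

def f_interval_to_c(delta_f):
    """Convert a temperature *difference* in F to a difference in C.

    No -32 offset here: offsets cancel when subtracting two temperatures.
    """
    return delta_f / 1.8

print(f_to_c(71))                # about 21.7 C
print(f_interval_to_c(71 - 64))  # about 3.9 C, the gap discussed above
```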
3 degrees Celsius is easily noticeable too, so that's a bit of a moot point. If anything, 1 degree Celsius is much harder to discern, and therefore having an even more granular scale is unnecessary.
Where in the chicken I jam the thermometer makes several degrees difference. If you truly require that level of granularity whilst grilling, I’d wager reading a decimal figure isn’t the end of the world. Us normies can continue to bring chicken to 74 and call it a day
But that also doesn't matter, because the granularity is meaningless if you never make decisions based on the difference between 71F and 70F.
Not at those exact temperatures, but one degree matters in grilling meat, making mash for beer, making candy, etc.
Sure, but you should be using Celsius for those things. That’s the main argument here.
Which was what I was getting at. I sort of assumed 'chicken at 74' was enough of a pointer :)
You win best username. I’m assuming you’re a Linux nerd as well. <3