While we are at it, let's all (as in the entire planet) switch to 24-hour UTC and the YYYY.MM.DD date format.
Some ISO 8601 formats are good, but some are unreadable (like 20240607T054831Z for date and time).
I agree but they're hard to read at a glance when debugging and there's lots of them :)
Having said that, a lot of client-server communications use Unix timestamps though, which are even harder to read at a glance.
At least it’s human readable and not protobuf 😬 (though the transport channel doesn’t really matter; it could be formatted this way anyhow).
That one feels kinda meh to me. It solves a handful of non-issues with our current calendar (I don't care that the month starts on the same day, nor do I care that each day of the year is always the same day of the week). Each month having the same number of days is an improvement. It persists the problem that you still can't use months or years as a real mathematical unit of measure and extends it to weeks, which is the biggest annoyance with calendars, although it reduces how often that becomes significant. Adding two days that have neither a day of the week nor a month would mean significant changes to every computer system that needs to deal with dates, and is just hateful.
The 1st of a month to the 1st of the next will always be one month, but it depends on the month and year how many days that is. So a month as a duration will span either 28 or 29 days. A week is now sometimes 8 days, and a year might still have 365 or 366 days, depending on the year.
How do you even write the date for the days that don't fit? Like, a form with a box for the date needs to be able to handle Y-M-D formatting but also Y-YearDay. Probably people would just say 06-29 and 12-29, or 07-00 and 01-00, although if year day is the last day of the year it kinda gets weird to say the last day of the year is the zeroth day of the first month of the next year.
There's just a lot of momentum behind a 12 month year with every day being part of a month and week. Like, more than 6000 years. You start to run into weird issues where people's religion dictates that every seventh day is special, which we've currently built into our calendar.
Without actually solving significant issues, it's just change for change's sake.
Well, this is a shitpost, and I wasn't serious about this. I responded to someone who wants the whole world to switch to a global time, when for as long as mankind has existed we've used some local time in our daily lives.
Also, UTC is not perfect because of leap seconds, which mean you cannot calculate with a simple formula how many seconds are between two timestamps; you need a leap-seconds table for that. And leap seconds are only announced about 6 months into the future, so for anything farther out you cannot say how much time lies between two stamps.
So with UTC a minute can have more or fewer than 60 seconds.
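A quick sketch of why this matters. The table excerpt below is just illustrative (two real insertion dates, but real code should use the full published IERS list):

```python
from datetime import datetime, timezone

# Illustrative excerpt of the leap-second table: a leap second was
# inserted immediately BEFORE each of these UTC instants.
# (The authoritative list is published by the IERS in Bulletin C.)
LEAP_INSERTIONS = [
    datetime(2015, 7, 1, tzinfo=timezone.utc),
    datetime(2017, 1, 1, tzinfo=timezone.utc),
]

def elapsed_seconds(start: datetime, end: datetime) -> float:
    """Seconds between two UTC instants, counting leap seconds."""
    naive = (end - start).total_seconds()  # what plain subtraction gives you
    leaps = sum(1 for t in LEAP_INSERTIONS if start < t <= end)
    return naive + leaps

a = datetime(2016, 12, 31, 12, 0, tzinfo=timezone.utc)
b = datetime(2017, 1, 1, 12, 0, tzinfo=timezone.utc)
print(elapsed_seconds(a, b))  # 86401.0: one second more than naive subtraction
```

Note that `datetime` itself silently ignores leap seconds, which is exactly the point: without the external table, the naive answer is wrong by one second here.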
YYYY.MM.DD and 24 hour for sure.
Everyone using UTC? Nah. Creates more problems than it solves (which are already solved, because you can just lookup what time it is elsewhere, and use calendars to automatically convert, etc.).
I for one do not want to do mental gymnastics/calculations just to know what solar time it is somewhere else. And if you just look up what solar time it is somewhere, we've already arrived back at what we're already doing.
Much easier just looking up what time (solar) time it is in a timezone. No need to re-learn what time means when you arrive somewhere on holiday, no need for movies to spell out exactly where they are in the world whenever they speak about time just so you know what it means. (Seriously, imagine how dumb it would be watching international films and they say: "meet you at 14 o'clock", and you have no idea what solar time that is, unless they literally tell you their timezone.)
Further, a lot more businesses than currently do would have to start splitting their days somewhere other than 00:00 (I'm aware places like nightclubs do this already).
Getting rid of timezones makes no sense, and I do not understand why people on the internet keep suggesting it like it's a good idea.
I'm pretty sure they don't mean "give up on time zones" but "express your timezone in UTC". For example, central Europe is UTC+1. Makes almost no difference in everyday life, only when you tell someone in another zone your time. The idea is to have one common reference point and do the calculation immediately when someone gives you their UTC zone. For example, if you use pacific time and tell me that, it means nothing to me, but if you say "UTC-8" I know exactly what time it is for you.
Oh right, yeah. We do this at my company which has operations world-wide. If we say timezone we say UTC±. Apologies for the misunderstanding
You mean base-10? My totally unrealistic pipe dream would be to have the world switch to base-12.
I mean something like 1 day = 10 hours = 1 000 minutes = 100 000 seconds (currently 86 400 seconds so a second would only get slightly faster).
https://en.m.wikipedia.org/wiki/Decimal_time
This term is often used specifically to refer to the French Republican calendar time system used in France from 1794 to 1800, during the French Revolution, which divided the day into 10 decimal hours, each decimal hour into 100 decimal minutes and each decimal minute into 100 decimal seconds
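The arithmetic of that Revolutionary scheme is a single scaling: a day has 86,400 standard seconds and 100,000 decimal seconds, so one decimal second is 0.864 standard seconds. A small sketch of converting a 24h clock reading:

```python
def to_decimal_time(hours: int, minutes: int, seconds: int) -> tuple:
    """Convert a standard 24h clock reading to French decimal time."""
    standard_secs = hours * 3600 + minutes * 60 + seconds
    # 86_400 standard seconds per day vs 100_000 decimal seconds per day
    decimal_secs = round(standard_secs * 100_000 / 86_400)
    dh, rem = divmod(decimal_secs, 10_000)  # 10_000 decimal seconds per decimal hour
    dm, ds = divmod(rem, 100)               # 100 decimal seconds per decimal minute
    return dh, dm, ds

print(to_decimal_time(12, 0, 0))  # (5, 0, 0): noon is exactly 5 decimal hours
print(to_decimal_time(18, 0, 0))  # (7, 50, 0)
```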
Oh, nice! It's funny how it's the same as the one I just made up which further proves that it simply makes sense.
Yeah. I think if someone had a sensible method for how we could switch from one to the other with minimal impact, it might work.
What would be very difficult for me would be the recalibration of my internal clock. Knowing a second is slightly shorter, and a minute is longer, and an hour is much longer, would be hell for a while.
Unfortunately, I think something that's pretty hard-coded into society at this point is that a day should divide by three, so we end up with the 8 hr work, 8 hr rest, 8 hr sleep. I'd be interested in a 30 hr day over a 10 hr day. But that one doesn't make much sense either, since it misses the mark on bringing time fully into the base-10 metric system, but still has all the same troubles you'd encounter getting people to switch.
That's good for file/record sorting, so let's just use it for that
For day to day, DD.MM.YY is much more practical.
For day to day, DD.MM.YY is much more practical.
It's not though... It's ambiguous as to if the day or month is first. With the year first, there's no ambiguity.
If you want to use d-m-y then at least use month names (eg. 7-June-2024).
It's ambiguous as to if the day or month is first.
Not if everyone is using it, as they should.
Besides, so is yours. 2024.06.07 could be the 7th of June or (if you're an American and thus used to the months and days being in an illogical order) the 6th of July.
As for writing out the month names, that's no longer shorthand. That's just taking more time and space than necessary.
Au contraire! With a three character month, period separation isn't needed, and the date is shorter. (Admittedly there's likely to be a language translation issue, depending on audience.)
Hard disagree.
Least specific -> most specific is generally better in spoken language as the first part spoken is the part the listener begins interpreting.
Like if I ask if you're free on "the 15th of March" vs "March 15", the first example is slightly jarring for your brain to interpret because at first it hears "15th" and starts processing all the 15ths it's aware of, then "March" to finally clarify which month the 15th is referencing.
The only thing practical about DD.MM.YY is that it is easier for the speaker because they can drop the implied information, or continue to add it as they develop the sentence.
"Are you free on the 15th" [oh shit, that's probably confusing, I meant a few months from now] "of July" [oh shit, I actually mean next summer not this one] "next year (or 2025)".
So the format is really a question of who is more important in spoken language: the speaker or the listener? And I firmly believe the listener is more important, because the entire point of communication is to take the idea you've formulated into your head, and accurately describe that idea in a way that recreates that same idea in the listener's head. Making it easier for the speaker to make a sentence is pointless if the sentence itself is confusing to the listener. That's literally a failure to communicate.
You're confusing your own familiarity and experience with a general human rule.
My mother tongue (Portuguese) has the same order when saying numbers as English (i.e. twenty seven) and indeed when I learned Dutch it was jarring that their number order is the reverse (i.e. seven and twenty) until I got used to it, by which point it stopped being jarring.
The brain doesn't really care beyond "this is not how I'm used to parsing numbers", and once you get used to doing it that way, it works just as well.
As for dates, people using year first is jarring to me, because I grew up hearing day first, then month, then year. There is only one advantage to year first, which applies very specifically in text form: sorting dates written as year-month-day by alphabetical order will correctly sort them by date, which is nice if you're a programmer (and the reason why, when I need a date as part of a filename, I'll use year first). Meanwhile, the advantage of day first is that often you don't need to say the rest, since if you don't it's implied as the present one (i.e. if I tell you now "let's have that meeting on the 10th", June and 2024 are implied), so you can convey the same information with fewer words (however, in written form meant to preserve the date for future reference you have to write the whole thing anyway)
Personally I recognize that it's mainly familiarity that makes me favour one format over the other and logically I don't think one way is overall better than the other one as the advantages of each are situational.
Meanwhile the advantage of day first is that often you don't need to say the rest since if you don't it's implied as the present one (i.e. if I tell you now "let's have that meeting on the 10th" June and 2024 are implied) so you can convey the same information with fewer words (however in written form meant to preserve the date for future reference you have to write the whole thing anyway)
That advantage is not exclusive to the date-first system. You can still leave out implied information with month-first as well.
Personally I recognize that it's mainly familiarity that makes me favour one format over the other and logically I don't think one way is overall better than the other one as the advantages of each are situational.
This is the biggest part of it. No one wants to change what they know. I'm from the US and moved to the UK, and interact with continental Europeans on a daily basis. I've seen and used both systems day to day. But when I approach this question, my answer isn't "this one is better because that's the one I like or I'm most comfortable with", my answer is "if no one knew any system right now, and we all had to choose between one of the two options, which one is the more sensible option?"
dd-mm-yyyy has no benefit over yyyy-mm-dd, while yyyy-mm-dd does have benefits over dd-mm-yyyy. The choice is easy.
The benefits of yyyy-mm-dd are minimal or non-existent for most people in most situations. No, the brain doesn't need the highest dimensional scale value to come first: that's just your own habit from how numbers are spoken in English, and possibly because the kind of situation where you use dates involves many things further than a year forwards or backwards in time, which for most people is unusual. People sorting dates by alphabetical order in computer systems (which is where yyyy-mm-dd is the only one that works well) is just the product of either programmer laziness or people misusing text fields for dates. So those benefits don't add up to enough to justify the "jarring" for other people from changing the date format they're used to, not to mention the costs of everything from changing existing computer systems to redesigning and printing new paper forms with fill-in date fields in a different order.
By similar logic, the benefits of dd-mm-yyyy are mainly the ease of shortening it in spoken language (i.e. just the day, or just the day and month), and they depend on knowing the month and year of when a shortened date was used (which usually doesn't work well for anything but immediate transfer of information, as the month and year would still need to be stored somewhere if they're not coming from the "present date"), so they too do not justify the "jarring" for other people from changing the date format they're used to.
Frankly, even in an imaginary situation where we were starting from scratch and had to pick one, I don't know which would be better, since they both have flawed advantages: year first only really being advantageous for allowing misuse of text data fields or programmer laziness in computer systems, whilst day first is only advantageous in immediate transfer of date information, where it gives the possibility of using a shortened date, something which is but a tiny gain in terms of time or, in a computer system or written form, storage space.
It's really not a hill worth dying on and I only answered your point because you seemed to be confusing how comfortable it felt for you to use one or the other - a comfort which derives from familiarization - with there being some kind of general cognitive advantage for using any order (which, in my experience, there is not).
if I ask if you're free on "the 15th of March" vs "March 15", the first example is slightly jarring for your brain to interpret
Sounds like you're just used to it being said the opposite (read: wrong) way. If you told someone in my country March 15th, it would be just as jarring to the listener.
at first it hears "15th" and starts processing all the 15ths it's aware of, then "March" to finally clarify which month the 15th is referencing.
not in daily use. When you ask someone "what day is it today?", they usually have a handle on what month it is and just need the day. For making plans, it's only if you make them way in advance that you need the month first, which would be sorting and scheduling, not daily use.
When you ask someone "what day is it today?", they usually have a handle on what month it is and just need the day.
You're still allowed to exclude implied information, no matter which method of dating you want to go with. You can just say "the 15th".
For making plans, it's only if you make them way in advance that you need the month first, which would be sorting and scheduling, not daily use.
I can't speak for you, but for me I am making plans, sorting, and scheduling every single day.
I can't speak for you, but for me I am making plans, sorting, and scheduling every single day
Sounds exhausting tbh, I'm sorry..
I get SO frustrated when I see a date like 4/3/2024 and have to spend time trying to figure out if it's the 4th of March, or if some US company wrote the software I'm using and it has defaulted to silly format.
Try working for an American company while not living in America. I have spent years trying to convince my US colleagues to please use unambiguous date formats when sending email to a global audience. But no.... they just can't see why it would be necessary or even helpful to do that.
As a mechanical design engineer in America, having dual systems creates unnecessary complexity, frustration, and cost for me all day, every day. I full-force embrace switching to metric.
As a mechanic working in a hodgepodge US/EU factory line, I have to suffer through always carrying double the tools to service metric and SAE machines. And after so many years in the industry, I still slip up and say 3/16 when I mean 3/8 sometimes, because fractions are a shit system for wrenches.
oh, and some of our linear encoders readout decimal-feet, because fuck it, why not?
My condolences. I'm already annoyed with the times USC units are presented in Australia (our nominal pipe sizes are often talked about in inches, and sometimes valves and such have USC flow coefficients because the manufacturer is American).
So I cannot imagine the pain you must be subjected to.
Metric yes please. Also for fucks sake use the 24 hour clock. Some of us learned it from the military but it’s just earth time and way easier than adding letters to a number
the 24 hour clock
I switched to it in my later teens when I realised how many cases it would be better in.
Conversion during conversation might be an extra step, but I'll be pushing for the next generation to have this by default.
Also, much better when using for file names.
Also, YYYY-MM-DD. There's a reason why it is the ISO
The conversion is pretty much the only hurdle I ever hear about, but that’s easy enough. How many songs/films talk about “if I could rewind the last 12+12 hours”…it’s just a matter of making it fit in context people can understand when they know a day is 24 but are used to 12.
ISO and while we’re at it, the NATO phonetic alphabet for English speakers. “A as in apple B as in boy” means fuck all when you’re grasping for any word that starts with that letter, and if English isn’t your first language fuckin forget about it.
ISO and while we’re at it, the NATO phonetic alphabet for English speakers. “A as in apple B as in boy” means fuck all when you’re grasping for any word that starts with that letter, and if English isn’t your first language fuckin forget about it.
err... didn't get what you're trying to say
We standardized an alphabet among all countries for clear communication.
Here is an example of it going wrong.
I'm pretty sure that's an example of why you should use the chosen ones instead of going "mancy/nancy" all over the place.
Also, didn't they just make a standard for themselves, and others took it because it was probably easier than making one for their own language? (Oh right, NATO... but let's be honest here, NATO is just a forum for America to flaunt its power while PR-ing peaceful, so it makes sense they use English, which is also easier as a second language than most other ones.)
Though I feel like China might have made their own.
The radio words were chosen to be distinct, such that for people who trained in them, it would be easier to distinguish letters being spoken over low quality radio.
Not very relevant in the era of 2G HD audio, and now VoLTE.
But when there's a bad signal and you have to tell someone a callsign, it makes sense.
I like ISO, because in whatever cases I have interacted with it, it has made programming easier for me.
I like YYYY-MM-DD, because when files lose their metadata, if they are named using this, I can still sort by name and get results by date.
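The sorting property is easy to demonstrate with a couple of made-up filenames: plain lexicographic ordering matches chronological order only when the year comes first.

```python
# Hypothetical filenames, same three dates in two formats.
iso = ["2023-12-01_report", "2024-02-15_report", "2024-11-03_report"]
dmy = ["01-12-2023_report", "15-02-2024_report", "03-11-2024_report"]

print(sorted(iso))  # already chronological: year, then month, then day
print(sorted(dmy))  # sorted by day-of-month first, so Nov 2024 lands before Feb 2024
```

With year-first names, `sorted()` (or any file browser's name sort) gives chronological order for free; with day-first names it does not.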
Conversion during conversation might be an extra step
Conversion is always an extra step, but you don't need it if you use the same timezone as the other participant.
Also, YYYY-MM-DD. There's a reason why it is the ISO
Big-endian is big. Alternatively DD.MM.YYYY or DD.MM.YY for little-endian lovers.
It's more along the lines of most significant bit/least significant bit, rather than byte order.
Right, and the most significant bit of the whole date is the first Y in YYYY, which we can't put at the end unless we reverse the year itself. So we can either have pure big-endian, or PDP-endian. I know which one I'm picking.
Your literal statement is also just wrong. The solitary implication of endianness is byte ordering, because individual bits in a byte have no ordering in memory. Every single one has the exact same address; they have significance order, but that's entirely orthogonal to memory. Hex readouts order nybbles on the same axis as memory so as not to require 256 visually distinct digits and because they only have two axes; that's a visual artefact, and reflects nothing about the state of memory itself. ISO 8601 on the other hand is a visual representation, so digit and field ordering are in fact the same axis.
Every single one has the exact same address; they have significance order, but that's entirely orthogonal to memory.
We are talking about transferring data, not storing it. For example, SPI allows both LSb-first and MSb-first. In a date, digit-number-date is like bit-byte-word.
Right, and in data transfer every byte can be placed in an absolute order relative to every other. And the digits within the respective fields are already big-endian (most significant digit first), so making the fields within the whole date little-endian is mixed-endian.
I have iterated this several times, so I worry there's a fundamental miscommunication happening here.
The 12 hour one is just so wildly dumb and inconsistent.
Why does it go from 11 AM to 12 PM to 1 PM?
base12 has the advantage of being divisible by 2, 3, 4 and 6, while base10 is only divisible by 2 and 5.
The Dozenal system does have some advantages over base10. Feel free to poke around https://dozenal.org/drupal/content/brief-introduction-dozenal-counting.html to learn a bit about Dozenal/Duodecimal counting and maths.
And to bring up a point, why did every nation that adopted the metric system require a law(s) to force people to use it? Complete with penalties if you don't. If it was such a good and great idea, people would have naturally gravitated to it don't you think?
Ahh, another connoisseur of the Dozenal system! Everyone should add a little dek and el to their life!
You listed 2 twice (thrice if counting 6) for base12 and once for base10. Generally, when talking about bases it's better to talk only about prime factors. Base12 has 2 and 3 as prime factors, while base10 has 2 and 5.
You don't need to add or multiply time very often. Division is super important tho, and base60 is better than base10 for that.
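A quick check of the divisors makes the comparison concrete (divisors strictly between 1 and the base are the ones that give clean fractions of an hour, a day, etc.):

```python
def divisors(base: int) -> list:
    """Divisors of base strictly between 1 and base."""
    return [d for d in range(2, base) if base % d == 0]

print(divisors(10))  # [2, 5]
print(divisors(12))  # [2, 3, 4, 6]
print(divisors(60))  # [2, 3, 4, 5, 6, 10, 12, 15, 20, 30]
```

Base 60's ten clean divisors are why a half, third, quarter, fifth, or sixth of an hour all come out to whole minutes.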
Too easy. Plus we put in the 3/5 “compromise” so you can’t expect old white racists to learn proper math
The French did try it back when they were in the process of changing to the metric system in the 1700s. Even THEY quickly determined that, much like the creation of the universe, it was a very bad idea. And it was very quietly dropped. The French tried hard to scrub that moment of insanity from the history books. But well, the internet is truly forever in both directions, I guess.
Metric time quickly got out of sync with the periods of light and dark. Mother Nature evidently doesn't like humans dicking around with the time periods of her celestial movements. (Dozenal for the win!)
Cause then we’d be thinking we’re monkeys on a spherical rock in a vacuum instead of calibrating clocks to a radioactive element to make sure everyone tunes in to wheel of fortune on time while this oblate spheroid tumbles around
Also, it’s hard enough getting people to equate Km and C with known quantities, Americans can’t handle base unit shifts like that
Cause then we’d be thinking we’re monkeys on a spherical rock in a vacuum instead of calibrating clocks to a radioactive element to make sure everyone tunes in to wheel of fortune on time while this oblate spheroid tumbles around
Just a little sodium chloride
If America is going to go through the trouble to convert everything to metric, might as well switch to base 10/decimal time as well lol
I love the 24 hour clock and living in London, UK I used it all the time. However, I remember one time I bought movie tickets at lunch for 17:30 and my brain thought it was for 7:30pm and I called my friend at the last moment saying: "you have to leave work early if we're gonna make it!"
There was a beautiful time back when I was young where we tried to change to metric and schools taught us nothing but. Now I'm ~50 years old and don't even know how many pints are in a gallon. Or feet in a mile. Always forget whether it's 12 or 16 that's inches in a foot / ounce to pound. Always have to look that shit up. Because they didn't teach us that garbage. Ever.
Guess what I NEVER have to look up? The measurements that tell you in their fucking prefixes how many X are in Y. What a concept.
Don't worry. You likely wouldn't remember even if you were taught. 5280 feet/mile is just not worth the brain space. Neither is 8 pints/gallon. I don't think you would convert between the two often enough to make it useful information to just know.
And I do have to look up those prefixes for the less-used ones. Is it exa then peta, or peta then exa, and what's bigger than them? What's smaller than nano? I don't remember because it rarely comes up. But I'm in tech, so it's starting to come up more.
I remember 5280 despite being Australian because I saw that stupid mnemonic tweet. I remember the SI prefixes because of xkcd.
Metric has been legally "preferred" in the US since 1975. We just don't use it.
Also while I was looking up that year I came across this wild factoid:
In 1793, Thomas Jefferson requested artifacts from France that could be used to adopt the metric system in the United States, and Joseph Dombey was sent from France with a standard kilogram. Before reaching the United States, Dombey's ship was blown off course by a storm and captured by pirates, and he died in captivity on Montserrat.
We should have gone metric in the 70s. This year will be the 45th anniversary of the Metric Conversion Act, which was signed on December 23, 1975, by President Gerald R. Ford. You may have even seen a map that has been incriminatingly illustrated to show how they are out of step with the rest of the world. It’s a compelling story and often repeated, but you might be surprised to learn that it’s simply untrue!
You could always use the metric system, that was always allowed. Most food (I've seen) has both imperial and metric measurements. Most digital measuring devices and lots of analog ones will have options for both. Speedometers generally have both.
Really, the only one stopping you from using the metric system in your daily life is you. Unless of course you're saying you want other people to use it. Which is a distinctly different proposition.
I'd argue the two greatest barriers to the average, non-STEM individual adopting metric in America are speed limits being in mph and temperature being in °F. Both are convertible easily enough, but when you constantly have to do so to engage with critical infrastructure or safety (cooking temps, etc.), it provides a barrier against adoption for anyone without the drive to make a concerted effort to use metric.
Between the two, I think temperature is the harder one. But strangely, it also brings weight and volume back into it: Cookbooks.
So many recipes are finely tuned balances of measurements that just look plain alien when converted to metric.
In the UK we're mostly using metric with the odd exception (we still love a pint of beer), one of which is that speeds are measured in MPH. It's not really a big deal; there aren't many situations where you convert between miles and kilometres, and anything less than a km is still usually measured in metres.
I think we were the first with metric money? We still pay for things in centidollars.
They're different things. The metric system uses decimal. All metric units are decimals, but not all decimals are metric measurements.
You're right that money is decimal, not metric.
Because that's its name
https://en.m.wikipedia.org/wiki/Metric_system
But if you wanna get all specific about it we can call it SI
https://en.m.wikipedia.org/wiki/International_System_of_Units
It's certainly not the Decimal system
Sorry, I thought you were making a general comment. I didn't realize you were criticizing the "metric money" statement.
But, reading over that person's comment again, they also say "centidollars", which also doesn't exist, so I believe they were trying to make the point that the US was the first to make a currency that adheres to the same principles as the metric system, since 1 centidollar = 1 cent = 1 dollar/100.
(I'm pretty sure it was a joke though. We don't use kilodollars, etc)
Non-metric or non-decimal money refers to systems where the multiple tiers of money, like our cents and dollars, are separated by amounts other than 10 or 100, like the old British system of 12 pence in a shilling and 20 shillings in a pound https://en.wikipedia.org/wiki/%C2%A3sd
You can't have it because of peer pressure from dead people. You gotta take them seriously, motherfuckers will haunt your ass and say shit like "thirty fathoms, gold dubloons and schooners, twenty nickel shillings". We have the metric system in our country and the ghosts suck, they don't even try to come up with sensical nonsense phrases for the sake of the bit, the lazy bastards.
I'm a scientist. I've used the metric system since grade school. In fact, I convert Imperial measurements to metric to do estimates.
Engineer here, I just use whatever's convenient. It's handy to know both.
That said, I did confuse a poor coworker of mine this week when I was using bar for tank pressure and psi for the safety reliefs. That's totally on me though.
To be clear this was a conversation over the phone, not a tech review or something. And I was explicitly naming the units, it was just jumping all around that had him confused.
Official documentation and programs should always be explicitly clear on what units are being used, especially pressure.
I dunno, it'd probably be better but there's nothing stopping people from using metric in places where it makes sense. I write most of my recipes in grams because it makes them easier to multiply or divide.
At the same time, the most common thing people use units for is a point of reference, and it really makes no difference whether your point of reference is metric or traditional units.
That's fine right up to when you're complaining about the temperature to an american.
That's fine right up to when you're complaining about the temperature to an american.
But I am an American. To learn Celsius I came up with a quick heuristic to do “accurate enough” conversions for the months between switching off Fahrenheit and getting to the point where I knew Celsius well enough.
So I can pretty quickly go from Celsius to Fahrenheit for my ignorant compatriots.
Edit: For anyone downvoting me, if it’s because I called people who don’t know Celsius “ignorant,” please understand I’m using this definition: “lacking knowledge, information, or awareness about a particular thing.”
Not this one: “lacking knowledge or awareness in general; uneducated or unsophisticated,” nor this one: “discourteous or rude.”
We are all ignorant about things we don’t know about. No shame in ignorance, it’s the default state of all living beings!
You could drive this one stretch https://www.atlasobscura.com/places/i19-americas-only-metric-interstate
I ended up on this last year as I was exploring the South West. I found it confusing even as a Canadian.
I was then later confused when Google Maps told me to go 80 on Hwy 10 in Texas once I came up from Big Bend NP. I thought the GPS was confused. 80 km/h on the highway in the US? It was then I realized I wasn't in Oregon anymore with their 60 mph highways. Texas goes fast, and even 80 mph isn't enough for most people. Even the single-lane highways with construction workers were 65 mph work zones in Texas.
It was the most road kill I've ever seen in all my travels. I think at one stage a herd of goats must have tried to cross the highway, based on the carnage I came across. I finally understood the reason for the huge bumpers on the front of trucks in Texas.
A few years ago I started using Celsius in my everyday life. It's been pretty easy: just remember that F scales twice as fast as C, and 32F = 0C, and you're set for conversions. It helps to be quick with math; if you find the conversion difficult, it may be easier to just convince the people near you to use it instead of F. To acclimate yourself, you'll want to change the settings on your phone to use C by default.
I haven't switched over to m in everyday use, because all the roadsigns are in Mph and doing that conversion while driving is bad juju.
I'm thinking of rewriting all my recipes in grams and liters. If I can figure out how to get our stupidly-over-designed-yet-entirely-jank oven to use C, that'd be good too. If we had one with a bimetallic strip and a knob, I'd be able to just print one with the new temperature scale.
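The "double it and add 32" heuristic mentioned above can be sketched in a few lines. The exact factor is 9/5, so the shortcut drifts by 0.2 °F per degree Celsius away from zero (the function names here are mine, just for illustration):

```python
def c_to_f_exact(c):
    """Exact conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

def c_to_f_quick(c):
    """The 'double it and add 32' mental shortcut."""
    return c * 2 + 32

# The shortcut is exact at 0 C and overshoots as temperatures climb:
# at 20 C it says 72 F, while the exact answer is 68 F.
for c in (0, 10, 20, 30):
    print(c, c_to_f_exact(c), c_to_f_quick(c))
```

Good enough for small talk about the weather, which is the use case the comment describes.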
Honestly, temperature (in terms of weather preparedness, not cooking) makes WAY more sense with Fahrenheit. Largely the only temperatures you care about are 0 to 100 and generally you feel a good difference in temp every 10 degrees F.
Almost everything else I prefer metric. But that's one where Celsius is just terrible.
The reason you feel that way is because you're used to it.
Similarly, Celsius feels natural to me, as I've lived with it all my life.
That's a poor argument, though, when the justification for utilizing volume, mass, and distance is because it is very "base 10"-y and is easily divisible and understood.
Celsius absolutely is shit for that.
I could use your logic to justify why imperial units are better for length, for example, but we all know it's a bit fucked. Celsius is absolutely fucked for temperature regarding human comfort and is imprecise.
Replying again because you've edited your comment and added another paragraph.
Your edit asserts that Celsius is "absolutely fucked" regarding temperature for human comfort... which is an utterly bizarre argument to make because it only makes sense to people who are used to Fahrenheit and have an intuitive sense of what 72F means to them, but have no intuitive sense of what 22C means
I'm not entirely sure that you're not just trolling now.
No negative numbers needed for most cases, 0-100 scale for the extremes MOST people need to care about with relative "feels like" every 10 degrees (but realistically every 5 is distinguishable, even smaller amounts depending). Ez pz.
IDK why you're so defensive about Celsius lol. It's okay to admit when an SI unit has a poor application. Your ONLY defense for it is "well people can get used to it" which is the exact same reason I could say "well you could just get used to feet, inches, yards, miles, pounds, ounces, fluid ounces, teaspoons, tablespoons, etc" - it's a shit argument for both.
But oh that's right this is Lemmy where "america bad" for everything.
Right, so...once again, your argument is that you feel that Fahrenheit makes more sense, because that's what you're used to
I never said that C is better because people can get used to it, you're just making that up. I said that the system people are used to is inherently going to be the one that makes the most intuitive sense to them, and that applies to both C and F.
The rest of what you said applies equally to any system of measurement.
I don't understand why you're so angry about this?
The entire point of this post under which we are all commenting is insinuating a superior system of measure. Jesus you actually are this stupid.
Ah yes, more insults. Your argument of 'the system I use is better because I abuse people who disagree with me' is very compelling indeed.
Saying that you didn't read my argument because your point ignored it entirely is an insult? It's abuse? LMAO.
Are you fucking stupid? <- that is an insult
So, uh, no? In fact within the "human spectrum" you generally care at least somewhat about every tick of the number. So it's actually more useful for people.
Because I doubt you can feel the difference between 71f and 72f. But it's possible to notice the difference between 21 and 22, although you're pretty picky if you do.
Tread lightly my friend. I already won the Fahrenheit vs. Celsius debate a few months ago, but non-Americans are insanely defensive about the metric system and won't accept the truth.
https://sh.itjust.works/comment/9757434
I'll transcribe my best arguments because that thread was an absolute shitshow and it's hard to find my comment even with the direct link. Almost all of my most downvoted comments on Lemmy came during my defense of the Fahrenheit temperature scale, and I'm weirdly proud of that fact.
::: spoiler Fahrenheit Supremacy Gang
Celsius is adequate because it’s based on water, and all life on earth is also based on water, so it’s not totally out of our wheelhouse. But for humans specifically I think Fahrenheit is the clear answer.
One point that many may overlook is that most of us here are relatively smart and educated. There are a good number of people on this planet who just aren’t very good with numbers. Obviously a genius could easily adapt their mind to Kelvin or whatever.
You have to use negative numbers more frequently with Celsius → Celsius has a less intuitive frame of reference
Each Celsius degree is nearly two Fahrenheit degrees → Celsius is less granular
The reason I argue the more granular Fahrenheit is more intuitive is that a one degree change should intuitively be quite minor. But since you only have like 40 or 50 degrees to describe the entire gamut of human experiences with Celsius, it blends together a bit too much. I know that people will say to use decimals, but it's the same flaw as negative numbers. It's simply unintuitive and cumbersome.
B) 66F is room temperature. Halfway between freezing (32F) and 100F.
the intuition is learned and not natural.
All scales have to be learned, obviously. It's far easier to create intuitive anchor points in a 0-100 system than a -18 to 38 system. Thus, Fahrenheit is more intuitive for the average person.
I should note that if you are a scientist, the argument completely changes. If you are doing experiments and making calculations across a much wider range of temperatures, Celsius and Kelvin are much more intuitive. But we are talking about the average human experience, and for that situation, I maintain Fahrenheit supremacy.
It’s not about the specific numbers, but the range that they cover. It’s about the relation of the scale to our lived experience. Hypothetically, if you wanted to design a temperature scale around our species, you would assign the range of 0-100 to the range that would be the most frequently utilized, because those are the shortest numbers. It’s not an absolute range, but the middle of a bell curve which covers 95% of practical scenarios that people encounter. It doesn’t make any sense to start that range at some arbitrary value like 1000 or -18.
When the temperature starts to go above the human body temperature, most humans cannot survive in those environments. Thus, they would have little reason to describe such a temperature. Celsius wastes many double digit numbers between 40-100 that are rarely used. Instead, it forces you to use more negative numbers.
This winter, many days were in the 10s and 20s where I live. Using Celsius would have been marginally more inconvenient in those scenarios, which happen every winter. This is yet another benefit of Fahrenheit, it has a set of base 10 divisions that can be easily communicated, allowing for a convenient level of uncertainty when describing a temperature.
Generally -40 to 40 are the extremes of livable areas.
Sure, water is a really good system and it works well.
And for F that range is -40 to 104. See how you get 64 extra degrees of precision and nearly all of them are double digit numbers? No downside.
Furthermore F can use its base 10 system to describe useful ranges of temperature such as the 20s, 60s, etc. So you have 144 degrees instead of just 80, and you also have the option to utilize a more broad 16 degree scale that’s also built in.
You might say that Celsius technically also has an 8 degree scale (10s, 30s), but I would argue that the range of 10 degrees Celsius is too broad to be useful in the same way. In order to scale such that 0C is water freezing and 100C boiling, it was necessary for the units to become larger, and thus the 10C shorthand is much less descriptive than the 10F shorthand, at least for most human purposes.
:::
What's funny is the person who brought up arguments FOR Fahrenheit over Celsius to me that I hadn't considered is actually a Brit. They lived in England and the US and your explanation here is very similar to theirs.
You certainly didn't win any arguments with those claims.
0-100F is not anywhere close to the scale people see in the weather anywhere most people live. Taking everywhere I've ever lived as an example:
The most important number with respect to the weather is freezing; it's handy knowing if you're dealing with ice. The standard range for where people live is not -40 degrees; something like 2/3 of the world live between the tropics and will never see freezing or below. The -40 number makes sense if you live in Alaska or Siberia and maybe even somewhere like Minnesota, but certainly not to someone in India or Indonesia....
Neither scale is relative to cooking (which is equally arbitrary for both), though metric is easier for things like brewing 80°C tea, since you just need 4/5 of a cup of boiling water and 1/5 of a cup of cold water, no thermometer required.
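The 4/5-boiling trick above is just a weighted average. A quick sanity check, assuming the cold water is tap water at roughly 20 °C, equal specific heats, and no heat lost to the cup (the helper name is mine):

```python
def mix_temp(t_hot, frac_hot, t_cold):
    """Temperature of a water mix, assuming equal specific heat,
    no heat loss to the cup, and thorough mixing."""
    return frac_hot * t_hot + (1 - frac_hot) * t_cold

# 4/5 boiling water (100 C) plus 1/5 tap water (assumed ~20 C):
print(mix_temp(100, 0.8, 20))
```

Under those assumptions the mix lands around 84 °C, close enough to the 80 °C target; with colder tap water it lands closer still.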
The "feel" of the weather is hugely impacted by humidity, which is why every forecast has a "feels like" measure and why 90°F in Dubai is lovely but 90°F in Houston is miserable. The increments of 10F don't make sense at all, though that seems to be a common perception among people who prefer Fahrenheit.
The comment about Fahrenheit being more granular would be true in an alternate universe where decimals don't exist, but not in this one.
Americans literally like Fahrenheit more because it's familiar; any other rationalisation is nonsense. Both measures make perfect sense after you've taken the time to learn them and use them daily (I know this firsthand).
The increments of 10F don't make sense at all, though that seems to be a common perception among people who prefer Fahrenheit
What doesn't make sense about it? You can tell another person it's in the 30s outside, and you have efficiently communicated more information than is possible when using Celsius. You'd have to say it's between 4 and negative 1, which is just lame. And this remains true across every temperature, because of a variety of factors which I explained above.
In every climate which you mentioned above, it's easier to communicate how hot or cold it is outside using Fahrenheit. This is because all of the numbers being used are non-negative integers (aka natural numbers). Even the triple digit ones are one-ten or one-twenty.
I wonder why mathematicians named them that? Possibly because they come naturally? Unlike negative one point seven.
They will defend Celsius being used for everyday weather reporting with their last breath with their ONLY fallback being "well you're just used to fahrenheit durrrrrrr" as if that logic can't be applied to every unit system on earth.
as if that logic can't be applied to every unit system on earth.
Mate, that's my whole point. I grew up Celsius in Australia and use Fahrenheit day to day now. They are literally interchangeable once you learn. It takes a month or two to get used to using them, and beyond that, the literal only difference in difficulty of use is that it takes about ten seconds longer to calculate a green tea brew in F, which has no bearing on the weather anyway. All of the arguments above are garbage, as they are garbage when the exact same, inverted arguments are made by metric proponents.
All measurement scales are interchangeable once you learn - that's not the point of this particular thread of comments. It's "what's most useful comparatively given the SI penchant for base 10". The answer isn't a temperature scale that, for day-to-day human concerns, runs from -18 to 38 - that's fucking stupid.
Yeah. I've had some time to ruminate and I think part of it stems from the impossibility of them not using Celsius in their lives. Like, they're not going to singlehandedly make their country start using Fahrenheit, so accepting it as better would just create cognitive dissonance.
What doesn't make sense about it? You can tell another person it's in the 30s outside, and you have efficiently communicated more information than is possible when using Celsius. You'd have to say it's between 4 and negative 1, which is just lame. And this remains true across every temperature, because of a variety of factors which I explained above.
It doesn't tell you anything that Celsius can't with a 5 degree swing. This is the absolute peak of arbitrary; both 5s and 10s are easy scales to work with. Your example of "between 4 and negative one" is deranged. I'm in Houston right now, and it's 90°F - if I want to communicate that to my yankee girlfriend I'd say "babe it's 90° outside, might get up to one hundred," and so far, you're right, this is easy to articulate. If I want to communicate that same information to my mum, I'd say "hey it's 30° outside, might get up to 35°". Both cases convey information with the same accuracy, and in both cases I haven't mentioned humidity, which for actual temperature feel has a way higher influence than 5 degrees. The extra information I'd gain by strictly converting 31-37.8°C is junk data; the Fahrenheit measure is approximated to begin with, and because of a humidity swing it carries a huge variability in actual "feel" anyway. I tried to explain this above and clearly failed, as your response doesn't touch on this at all and just insists that people who think in metric don't default to easy-to-work-with numbers.
In every climate which you mentioned above, it's easier to communicate how hot or cold it is outside using Fahrenheit. This is because all of the numbers being used are non-negative integers (aka natural numbers). Even the triple digit ones are one-ten or one-twenty.
The only place with negative integers was Pittsburgh, so that point doesn't make sense for the rest, and even if it did, your argument is insane. Saying negative 5 is no harder than saying 25; plus, having negatives where snow and ice come into play makes it obvious when to be careful outside. I mean, your argument here just makes no sense: if there is some added complexity to saying "negative," then it is surely comparable to having to remember a random number of 32. Literal kindergartners understand negative numbers. Neither this nor remembering the 32 number adds any meaningful complexity, and they certainly have zero impact on anyone's actual use of either scale.
Literal kindergartners understand negative numbers.
Literal adults have trouble with negative numbers. I can't do this all over again, sorry and have a nice day. Hopefully it's somewhere in the 80s wherever you are
Mate, I have to reply to that, because it's such an insane claim - the US, the only country that doesn't use °C, has a huge reliance on a monstrously complex credit system (and the entire concept of credit relies on the concept of debt and negatives). It's flat-out insane to suggest that the same people who live and function with such a credit system struggle with the fundamentals of negative numbers. It's a mind-boggling claim.
Anyway, have a good one.
Actually, we're on the metric system. The foot and inch are defined exactly by their metric conversion values, and so is the pound.
We're actually just using conversion factors
How is this supposed to be considered using the metric system? If you tell someone that you weigh 80kg and he doesn't have a clue what you mean, then you're not really using the metric system, are you?
Also, another issue with what you're suggesting is that people have to memorize several conversion factors as well. In principle, you only have to be able to convert inches -> cm and pounds -> kg, but unless you want to do even more math in your head, you also have to remember feet -> cm, yards -> cm, miles -> cm, square feet -> square meters, cubic feet -> cubic meters (phew, that's just the length conversions), ounces -> grams, pounds -> grams, cups -> grams (for every fluid you might want to measure), litres -> gallons, litres -> pints, etc.
Or you could just go through the one-time effort of actually using the metric system so you don't have to carry this mental burden with you everywhere you go....
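The "defined by metric" point above is literally true: since 1959, US customary length and mass units have been defined by exact metric factors. A quick sketch of what "only two conversions in principle, many in practice" looks like (the constant names are mine):

```python
# US customary units are legally defined via metric, so in principle
# two exact factors are enough:
CM_PER_INCH = 2.54           # exact, by definition (since 1959)
KG_PER_POUND = 0.45359237    # exact, by definition

# ...but in practice you end up deriving (or memorizing) a factor
# for every unit pair you actually use:
CM_PER_FOOT = 12 * CM_PER_INCH        # 30.48
CM_PER_YARD = 3 * CM_PER_FOOT         # 91.44
CM_PER_MILE = 5280 * CM_PER_FOOT      # 160934.4
G_PER_OUNCE = 1000 * KG_PER_POUND / 16

# e.g. miles expressed in km:
print(CM_PER_MILE / 100_000)
```

Which is the commenter's point: the chain of derivations is easy for a program and tedious for a human.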
The problems with that are:
- hardly anyone knows the conversion factor
- other people aren't going to do the math in their head
That's on them
them == everybody in this case. Practically, nobody is going to do what you suggest - instead, non-metric users will ask metric users to do the conversion for them. And why should we be responsible for doing the work when they are the ones who refuse to use the system that 96% of the world has adopted?
Rock it like a Brit. Most things in metric except for your height (feet and inches) and your car speed (miles per hour) and when you measure your manhood (inches... Or fractions thereof).
Also, milk is pints.
Land is acres.
And the ponies run furlongs.