Yes, the distinction between theory and hypothesis can be important, but a distinction should also be made between popular science and rigorous peer-reviewed literature. Popular science is meant to be easy for the general reader to understand. Often that means using simpler words, as long as they are still accurate in a general way.
In this case, I believe theory is an acceptable word to use. The MW dictionary, for the definition of 'theory', says
2a. "a belief, policy, or procedure proposed or followed as the basis of action"
3a. "a hypothesis assumed for the sake of argument or investigation"
Seems to me that the usage in the article fits either of these definitions well enough.
> The Greek theoria (θεωρία) meant "contemplation, speculation, a looking at, things looked at", from theorein (θεωρεῖν) "to consider, speculate, look at", from theoros (θεωρός) "spectator", from thea (θέα) "a view" + horan (ὁρᾶν) "to see".
"Theory" should refer only to the beatific vision or its analogical precursors in temporal existence achieved through mystical union with the Uncreated Light!
The only reason this Ancient Greek definition holds is that Ancient Greek is a dead language. Prescriptivism cannot stop natural language change; any "X should only mean Y" statement is a dead-end conservative approach.
> Popular science is meant to be easy for the general reader to understand. Often that means using simpler words
The point above, and I agree, is that the word 'hypothesis' is taught every year in public schools, from elementary school through high school. Even dropouts should know it, and from what I've seen, they do. At a certain point we've got to stop pandering to morons. Or worse, pandering to what we assume is the level of morons but is actually substantially lower. I don't think the general public is perplexed by that word; anybody who thinks they are is probably underestimating the public. Even if the average Joe on the street isn't disciplined about using the word hypothesis vs theory, they still understand what the word means when they read it.
Even if I know (or can trivially decipher) the definition of a technical word, reading an article saturated with technical words takes more work than reading an article written in more common language. Articles that take more work to read are going to be read by fewer people, and the population of people that do read them is going to be biased to the well-educated.
If you're trying to communicate to the general public something like "Gophers are actually good sometimes", you probably want that message to go beyond the well-educated.
For this is what the Sovereign Lord says: I am about to deliver you into the hands of those you hate, to those you turned away from in disgust. 29 They will deal with you in hatred and take away everything you have worked for. They will leave you stark naked, and the shame of your prostitution will be exposed. Your lewdness and promiscuity 30 have brought this on you, because you lusted after the nations and defiled yourself with their idols. 31 You have gone the way of your sister; so I will put her cup into your hand.
Seems like the key issue here is this: what is the purpose of conducting the authentication? In the case of personal accounts, it's for the benefit of the individual. They get their own account to safely store personal data. Here, the individual management of biometric authentication devices, as you described, is a great thing. A passkey can be generated without exposing biometric data. The individual has the responsibility and incentive to keep their devices secure.
But the above article is an example of the opposite case, where the authentication is for public security. In this situation, the individual cannot be entrusted with their own auth, so if each person were to use their own device, it would need to be quite tamper-proof. Seems far simpler at this point to do face / fingerprint auth, where the security guard ensures that no one is wearing a mask or fake finger. Yes, there is the concern that the bio-data could be stolen / misused, and for that reason I think that bio-auth for public safety should be limited to a single standard type (e.g. face), with the others being reserved only for private auth. That way, a compromise can be reached between public safety and individual privacy.
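To make the passkey point above concrete, here's a minimal sketch of the challenge-response idea in Python using the 'cryptography' package. The names and single-process layout are illustrative only; this isn't any particular passkey or WebAuthn implementation, just the shape of "the biometric unlocks a local key, and the server only ever sees a public key and signatures".

    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # --- on the user's device (enrollment): biometric data never leaves it ---
    device_private_key = ec.generate_private_key(ec.SECP256R1())  # unlocked locally by the biometric
    registered_public_key = device_private_key.public_key()       # the only thing the server stores

    # --- on the server (login attempt): issue a fresh random challenge ---
    challenge = os.urandom(32)

    # --- on the device: after the local biometric check passes, sign the challenge ---
    signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

    # --- back on the server: verify against the stored public key ---
    try:
        registered_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
        print("authenticated")
    except InvalidSignature:
        print("rejected")

The point of the sketch is that nothing biometric ever crosses the wire, which is exactly why it works well for personal accounts and less well when the verifier cannot trust the device.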
What do you find 'loaded' about the phrase 'mixed society'? It is more descriptive than the meaningless phrase in the original headline: "How Does Paris Stay Paris?".
The most sought-after address in all of South Park for only the very privileged few. You can take in the views from the deck spa and enjoy the mixed SoDoSoPa culture. Also featuring a private fitness center, clubhouse and so much more. Welcome home.
Possibly because it presumes Paris is 'mixed', but actually the city of Paris is notably better off than the surrounding suburbs, especially the ones on the north/east. This has some good maps: https://medium.com/perspective-critique/the-geography-of-ine...
Many organizations already go with B, usually with some arbitrary password update period, with more sensitive information requiring shorter periods.
The user response is to choose a new password that is similar to the previous one, to avoid losing access due to forgetting. This means that an attacker's best way to find the user's current password is to know their old password. NIST has recognized this and advises against these policies: “Reset—Required only if the password is compromised or forgotten.” [1].
The best mitigation I see for systems that exclusively take password input is to use a user PIN plus a PKI card or RSA key.
Changing passwords as a cracking mitigation is "bad medicine", always has been, and is now acknowledged as such.
Mathematically, imagine it is raining (stochastically speaking: drops uniformly distributed, with replacement). Are you more or less likely to get hit by a raindrop if you dance around or stand still? Nope, the odds are the same. (Although technically, by moving around a lot you are sweeping through space and thereby increasing the surface area for rain to impact, and the amount of rain, so you are actually increasing the odds.)
Ok, try this instead. Flip a coin and guess whether it's heads or tails. Does it matter whether I guess heads every time, alternate heads/tails, or flip another coin? No, it does not.
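If anyone doubts the coin-flip claim, here's a toy simulation in Python (the names are made up, and it's a sketch rather than a proof): all three guessing strategies converge to the same hit rate.

    import random

    TRIALS = 100_000
    flips = [random.choice("HT") for _ in range(TRIALS)]

    # Strategy 1: always guess heads.
    always_heads = sum(f == "H" for f in flips)
    # Strategy 2: alternate heads, tails, heads, tails, ...
    alternate = sum(f == g for f, g in zip(flips, "HT" * (TRIALS // 2)))
    # Strategy 3: guess by flipping your own coin.
    own_coin = sum(f == random.choice("HT") for f in flips)

    print(always_heads / TRIALS, alternate / TRIALS, own_coin / TRIALS)  # all ~0.5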
Now in the case of people who re-use passwords... in the longer term we'll find out whether the propensity to be one or the other produces an evolutionary signal or whether people are impossibly bad at "random" in any case.
Finally, imagine someone cracking passwords: this is your adversary, and there is only one. Are they going to start with the hardest passwords, the ones most difficult to compute / type / memorize / come up with in the first place? Let's encourage them to do that: start with the passwords you'd never be able to enumerate, starting from null, before the heat death of the universe. Ok, so maybe that won't work; they're going to start with the easy ones first. In that case, the optimal strategy would be to pick a really difficult password and then, at some point, switch to one of the easy ones, since those have already been checked.
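For a sense of scale on "never be able to enumerate", a rough back-of-the-envelope; the guesses-per-second figure is an assumption for illustration, not a benchmark of any real cracker.

    keyspace = 95 ** 16             # 16 chars drawn from the 95 printable ASCII characters, ~4.4e31
    guesses_per_second = 10 ** 12   # assumed attacker budget, purely for illustration
    years = keyspace / guesses_per_second / (60 * 60 * 24 * 365)
    print(f"{years:.1e} years to exhaust the space")  # ~1.4e12 years, ~100x the age of the universe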
It's an exposure mitigation rather than a cracking mitigation, isn't it? The idea is that if it got badly stored somewhere it's only dangerous for 30 days or whatever.
Yes, I suppose it is an exposure mitigation as well. Although if someone is having users change passwords every 30 days (or 30 seconds? whatever) due to exposure I have a lot of WTF questions. If passwords suffer from that much unavoidable exposure I'd be expecting automated systems (hello HOTP / TOTP) and OOB authenticators which are resistant or agnostic to that exposure.
(ssa.gov generates printable one-time pads if you're masochistic enough to request one.)
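Since HOTP/TOTP came up: the core of TOTP is small enough to sketch, which is part of why it has a better exposure story than rotating passwords; each code goes stale on its own after the time window. A minimal RFC 6238-style sketch in Python, standard library only (the secret is a placeholder, and a real deployment should use a vetted library rather than this toy):

    import hashlib, hmac, struct, time

    def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
        counter = int(time.time() // step)                     # 30-second time window
        msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
        digest = hmac.new(secret, msg, hashlib.sha1).digest()  # HMAC-SHA1, as in the RFC
        offset = digest[-1] & 0x0F                             # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Server and authenticator share the secret once; after that, each code is
    # only valid for one window, so a leaked code expires on its own.
    print(totp(b"placeholder-secret-not-for-real-use"))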
Universities, especially the more prestigious ones, have been trending towards aggressively simplistic labeling of complex social issues. While their stated goal is to protect minorities, the oversimplification increases polarization and division. These doctrines in fact harm the minorities that they were intended to help, by forcing people to choose a side. Plus, they focus too much on the negatives, when we need more positive, meaningful, constructive discourse.
Nepotism means hiring friends or family. The advantage there is that you start out with a greater level of trust, depending on how long you've known the person. This must be weighed against the disadvantage that there are likely more skilled individuals to be found. When dealing with delicate matters (i.e. when small mistakes cause big problems), reliability becomes more important.
Most people trust their families more than their work acquaintances. Still, a family member needs more than trust to do a good job in a skilled profession.
Proof? I'm not going to spend the effort to cite data here, since we have no prior relationship.
If you have a good enough model, you can identify anything.
To discuss super-intelligence, we should first define it. Wikipedia says it is intelligence surpassing that of the brightest human minds. Taken as a whole, the internet already has the knowledge of a super-intelligence: it contains more useful information than any individual human. But it is far more limited than a human in its ability to control and use that information.
LLMs rely heavily on inferences from their training data, meaning that they struggle to generalize to new situations. If you had a program that could use abstract reasoning to learn any topic, then it could solve any problem better than a human, given that a supercomputer can process and store more data than a human. This program would be a super-intelligence.
I expect that the development of intelligence (software) superior to humans will happen much faster than the development of superior hardware did, based on the timescales of human evolution (billions of years) compared to the evolution of civilization (thousands of years).
Here's one take: good developers learn fast, bad developers learn slow. Software is one of the fastest moving fields, so it takes a quick mind to keep up. Clearly, everyone needs some basic knowledge to program, and equally clearly, there will be gaps in every developer's knowledge. When needed, we fill the gaps as quickly as possible. If someone can learn 10x as fast, maybe they only need 1/10th the knowledge base.
Rather than everyone voting for a few people at the top, what if each level voted for the one above and the one below? That way, people would be voting for the positions and people that they know the most about.
Like the other response said, when people with only a surface-level understanding get involved, democracy devolves into a surface-level popularity contest.