Because FB doesn't support that. Either you build it in and it runs some code when your app starts, or you can't use it.
I would vote for "not use it", but others want Facebook integration supported and happily share information about all their users with Facebook ...
P.S. I believe part of this is also Apple, who don't like runtime loading of code, as that makes their verification harder, but I'm no iOS dev or user.
Docs are less likely to break if GitHub implements some automated, global way to rename the default branch: when the rename isn't done in an ad-hoc, per-repo way, GitHub can add logic to redirect old URLs.
(Or, for a quick and dirty fix, always redirect master to the main branch)
> I think a key aspect of remote work being successful is for the distributed team to be in the same time zone or have less than 1 hour time difference
I think the exact time difference doesn't matter so much as whether everyone can be at work simultaneously at least 70-80% of the time.
I'm very obviously neither a lawyer nor a lawmaker, but I'm under the impression that anything can be classified as an act of war as long as another country is willing to treat it as such, and other superpowers/blocs are willing to sign off on that assessment and back it militarily and through sanctions/incentives.
(Though this would all probably be swept aside as soon as somebody reverse-engineers the vaccine.)
It's already here... I saw an issue a few weeks ago where a bunch of production systems failed upon seeing a certificate signed by a CA root that expires after 2038.
If you store time as seconds since the Unix epoch (1st Jan 1970), you'll overflow a 32-bit unsigned integer in 2038 (around March iirc) and time will suddenly be back in 1970 again. I believe the Linux kernel did some work to circumvent this on 32-bit systems in a recent release, but if you're running an old 32-bit system you're probably out of luck.
Actually, it's signed 32-bit integers that overflow in 2038. Signed integers have been used because people wanted to store dates earlier than 1970 too.
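To make the rollover concrete, here's a minimal C sketch, assuming a two's-complement platform and simulating a legacy system whose time_t is a signed 32-bit counter:

    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* Incrementing INT32_MAX directly would be undefined behaviour in C,
         * so we just show the value a signed 32-bit counter wraps to on
         * two's-complement hardware. */
        time_t last    = (time_t)INT32_MAX;  /* 2147483647 seconds after the epoch */
        time_t wrapped = (time_t)INT32_MIN;  /* what the counter rolls over to */

        printf("last 32-bit second: %s", ctime(&last));     /* around 19 Jan 2038 (local time) */
        printf("one tick later:     %s", ctime(&wrapped));  /* back in December 1901 */

        return 0;
    }

(With an unsigned counter the wrap would indeed land back at the 1970 epoch, but not until 2106; with the signed type actually used, it lands in December 1901.)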
And probably because signed integers are a default choice in certain languages and/or maybe on certain architectures.
Java, for example, famously doesn't even have an unsigned 32-bit integer primitive type. (But it has library functions you can use to treat signed integers as unsigned.) Ultimately not a good design choice, but the fact that it actually wasn't that limiting and relatively few people care or notice tells you that many people have a mindset where they use signed integers unless there's a great reason to do something different.
Aside from just mindset and inertia, if your language doesn't support it well, it can be error-prone to use unsigned integers. In C, you can freely assign from an int to an unsigned int variable, with no warnings. And you can do a printf() with "%d" instead of "%u" by mistake. And I'm fairly sure that converting an out-of-range unsigned int to an int gives a random-ish (implementation-defined) result, so if you accidentally declare a function parameter as int instead of unsigned int, thereby doing an unsigned-to-signed and back-to-unsigned conversion, you could corrupt certain values without any compiler warning.
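A small C sketch of those pitfalls (the identifiers are made up for illustration; the out-of-range conversions are implementation-defined, and the comments describe what you typically get on a two's-complement platform with 32-bit int):

    #include <stdio.h>

    /* Hypothetical helper: the parameter was meant to be unsigned,
     * but was accidentally declared as a (signed) int. */
    static unsigned int half(int n)
    {
        return n / 2;  /* signed division, then converted back to unsigned */
    }

    int main(void)
    {
        unsigned int u = 4000000000u;  /* fits in a 32-bit unsigned int, not in int */

        /* Unsigned -> signed with an out-of-range value: the result is
         * implementation-defined, and compilers accept it silently unless
         * sign-conversion warnings are switched on. */
        int i = u;
        printf("i = %d\n", i);             /* typically -294967296 */

        /* Wrong format specifier: "%d" for an unsigned argument isn't valid
         * per the standard, but usually just prints the value reinterpreted
         * as negative. */
        printf("with %%d: %d\n", u);
        printf("with %%u: %u\n", u);

        /* The accidental unsigned -> signed -> unsigned round trip: half()
         * divides a negative number, so its result differs from u / 2. */
        printf("u / 2   = %u\n", u / 2u);  /* 2000000000 */
        printf("half(u) = %u\n", half(u)); /* typically 4147483648 */

        return 0;
    }

GCC and Clang can flag most of these with -Wformat and -Wsign-conversion, though the latter isn't enabled even by -Wall -Wextra.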
> Ultimately not a good design choice, but the fact that it actually wasn't that limiting and relatively few people care or notice
It was a horrific design choice, and people do notice and do hate it.
Not so much because of how it applies to ints, but because the design choice Java made was to not support any unsigned integer types. So the byte type is signed, conveniently offering you a range of values from -128 to 127. Try comparing a byte to the literal 0xA0. Try initializing a byte to 0xA0!
In contrast, C# more sensibly offers the types int / uint, short / ushort, long / ulong, and byte / sbyte.
I think he's referring to an unsigned integer value that's out of range for the signed integer type of the same width—usually a bit pattern that would be interpreted as a negative number when cast to a signed integer type, but which negative number is not defined by the C standard because it doesn't mandate two's complement representation.
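A tiny C sketch of that distinction, assuming a 32-bit int: the conversion result is only implementation-defined, while memcpy shows the raw two's-complement reinterpretation that most platforms happen to agree on:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned int u = 0xFFFFFFF0u;   /* 4294967280: too large for a 32-bit int */

        /* Conversion: the standard only promises an implementation-defined
         * result (or signal) when the value doesn't fit the signed type. */
        int converted = (int)u;

        /* Reinterpretation of the same bit pattern, which is what
         * two's-complement hardware effectively gives you. */
        int reinterpreted;
        memcpy(&reinterpreted, &u, sizeof reinterpreted);

        printf("converted     = %d\n", converted);     /* -16 on typical platforms */
        printf("reinterpreted = %d\n", reinterpreted); /* same here, but only by convention */
        return 0;
    }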
One of the biggest mistakes in IT ever, in my opinion.
I'd even go so far as to say that defaulting to signed instead of unsigned was also one of the biggest blunders ever. I would never have defaulted to a type that inherently poses the risk of UB when there's another type that doesn't.
Though it's also possible that precisely that was the reasoning for it.
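The UB in question is signed overflow; here's a minimal illustration of the asymmetry (the overflowing line is left commented out so the program itself stays well defined):

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        unsigned int u = UINT_MAX;
        u = u + 1;       /* well defined: unsigned arithmetic wraps, so u == 0 */
        printf("u = %u\n", u);

        int i = INT_MAX;
        /* i = i + 1; */ /* undefined behaviour: the compiler may assume this
                          * never happens and optimise on that assumption */
        printf("i = %d\n", i);

        return 0;
    }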
Maybe I'm overengineering, but couldn't you store the sanitized version as the normal value, and also store and make publicly available the original unsanitized value in an ominously and obviously named key (say, dangerouslyUnsanitizedValue) that happens to be easily greppable/lintable?
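Roughly this shape, sketched in C purely for illustration (the field names are made up, mirroring the dangerouslyUnsanitizedValue idea above; a sketch, not a recommendation):

    /* One stored record with both representations. */
    struct stored_comment {
        /* The escaped/sanitized text: what the application renders by default. */
        char *sanitized_value;

        /* The raw user input, kept under a deliberately alarming, easily
         * greppable/lintable name so that any use of it stands out in review. */
        char *dangerously_unsanitized_value;
    };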
Plain text can contain anything and should be treated as such; it's that simple.
As for security, don't assume everything in your database came from a trusted source. Maybe there are remnants from an old version of your code that didn't sanitize, or maybe you improperly used admin tools that bypassed checks.
How would you determine which value to display? It seems to suffer from the same issue where if you display the sanitized value then the comment is still missing necessary characters, but if you use the unsanitized value then your application will be vulnerable to XSS.
In most cases, that would be overengineering, but it is an entirely plausible solution if you happen to have a case where you need to allow the user to enter things like angle brackets, and for some reason you cannot escape them.
Hm... features take some time to get to production. The stuff I use and appreciate most launched either before Microsoft acquired them or shortly after the deal, so its development predates the acquisition.
In those places, old Apple hardware isn't significantly cheaper than new hardware. You might get a 25% discount for a degraded experience (security updates may end in ~3 years), or buy much older hardware (e.g. 9-year-old models) that cannot run current macOS and Xcode, etc.
And yes, a local PC can work out to a lower price thanks to specific tax arrangements, sourcing components straight from manufacturers, and assembling them locally.
So do you expect Apple to give away Mac OS if they did allow it to run on a VM, or should they go back to charging $129 for the OS like they used to (and like MS still does)?
The current version of OS X runs on the 2012 Mac Mini. I see one on eBay for $145.
https://en.m.wikipedia.org/wiki/Paradox_of_tolerance