I still expect him to sell the golf course in Scotland for half a billion dollars to a triple-sandwiched private equity fund, which will eventually be discovered to be controlled from either SA or RU.
]]>Actually it's more fundamental: They literally have no idea what "different countries" means and entails, having never been abroad, and never having seen any reason to go abroad, because they have been indoctrinated to know that they live in "the world's greatest country".
]]>Some years back, a friend who happens to know a lot about China explained to me why "China doesn't do something about North Korea", and it really blew my mind that nobody /ever/ mentions that aspect anywhere in the West.
If North Korea falls apart, China will have millions of refugees on their hands, refugees who have been indoctrinated from birth, and therefore have a mental model of the world which has very little to do with reality.
It would take decades to reeducate those refugees before they could be released into and integrated with the rest of China, with a large fraction of them never able to make the transition.
There's simply no way that scenario can end well for China, so they do not want North Korea to fall apart.
QED.
To us in the west, North Korea is very far away, so the "next-door neighbor" aspect is not part of our mental machinery for thinking about North Korea.
I feel a similar mental disconnect exists about USA-Canada relations, but with the opposite sign.
Everybody from the USA tacitly assumes that "they can just move to Canada", but despite Canada having better and cheaper healthcare, comprehensive gun control, and seemingly having solved pretty much all the things people in the USA complain about being "broken", the USAnians still do not cross the border.
A very big reason why is that they have all been indoctrinated from birth that "USA is the best country in the world", so even if everything actually is better, it would still be a step down to go somewhere else.
But they think it is nice to know that "the option is there", and because they are from "USA, the best country in the world", it never occurs to them that Canada might not welcome them with open arms.
Nothing can convince me that Canada's government has not already drawn up a contingency plan for what to do if ten thousand USAnians, with their fundamentally, deeply flawed mental model of the world, come crashing across the border.
Those plans, being Canadian, are undoubtedly very nice and generous, but their primary objective will be to prevent ten thousand from becoming a hundred thousand.
]]>Being somewhat intimately involved in this, I would say that is the least of the problems.
The one word you will see thrown around all the time is "privacy", but they never specify whose privacy.
As a general rule you can assume that it is not so much about the end users' privacy as about certain big companies wanting total privacy from independent researchers and governments, so they can invade the lives of consumers as they deem most profitable.
I still won't be the least bit surprised if HTTP/3 gets banned and blocked on national-security grounds in jurisdictions with a competent government.
]]>The most-photographed-by-tourists road signs in Denmark are the ones pointing to the city of "Middelfart".
(The name's origin is "midpoint of travel", via Plattdeutsch.)
]]>According to this study:
https://www.pnas.org/doi/full/10.1073/pnas.2118631119
The USA is probably becoming smarter year by year, because people born after 1990 did not grow up in a fog of lead nanoparticles from leaded gasoline.
Measured in feet, wearing wellies, that study says that people younger than 35 are about 5% less "effing stupid" than people aged 50-60.
Confirmation bias is a very tricky thing, but based on how well that study matches my personal experience, I would have exclaimed:
»ONLY in effing stupid USA (50 years and over)«
If you just want to vent, that may seem overly pedantic, but if you want to actually understand what's going on in the USA, you have to keep influences like this in mind.
]]>Those are not important parts of accountancy, those are mere implementation details.
The important part is the "double-entry bookkeeping" method, invented in Venice 700 years ago.
As inventions go, it is pretty damn important, enabling amongst other things inter-generational debt and practically eliminating trivial embezzlement of money.
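For illustration, here is a minimal sketch of the invariant at the heart of the method; the account names and amounts are invented for the example:

    # Double-entry sketch: every transaction posts equal and
    # opposite amounts, so all entries always sum to zero and a
    # one-sided entry is immediately visible.
    from collections import defaultdict

    ledger = defaultdict(int)          # account -> balance, in cents

    def post(debit, credit, amount):
        """Record one transaction as two entries that cancel out."""
        ledger[debit] += amount
        ledger[credit] -= amount

    post("cash", "sales", 10_000)      # sold goods for 100.00
    post("inventory", "cash", 4_000)   # restocked for 40.00
    assert sum(ledger.values()) == 0   # the books balance

    ledger["cash"] -= 500              # someone pockets 5.00...
    assert sum(ledger.values()) != 0   # ...and the audit catches it

The point is not the data structure but the invariant: you cannot quietly remove money from one account without the hole showing up somewhere else.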
]]>You can buy ARM servers, but in units of entire racks.
]]>Really?
Multi-socket systems are a horrible band-aid for not being able to cool silicon with many hundreds of I/O pins, and in Intel's case also for being cocksure about how much better their manufacturing process and yield were.
And those I/O pins are themselves a band-aid: Most of them are there to connect to the memory sticks, and just communicating between the CPUs and the RAM accounts for a very large fraction of the total power, and thus heat, because of the need to charge and drain all those long, high-capacitance PCB traces.
Apple did the sensible thing: The very moment IBM's patents expired, they put the RAM chips right next to the CPU chip, with connections measured in mm instead of cm. This is why their laptops use 10-20W less than everyone else's.
(In 1980, the IBM 3081 was the first commercial computer to use multi-chip modules; they had been used in some military projects before that.)
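A back-of-envelope calculation shows why the trace length matters so much; every number below is an assumed, order-of-magnitude value, not a measured figure:

    # Dynamic power of a DDR-style memory bus, per signal line:
    # P = alpha * C * V^2 * f, summed over the whole bus.
    ALPHA    = 0.5       # activity factor: fraction of cycles toggling
    V        = 1.1       # signal swing in volts (DDR5-class)
    F        = 3.2e9     # toggle rate in Hz (6.4 GT/s bus)
    LINES    = 150       # data + address/control signals, roughly
    C_PER_CM = 2e-12     # trace capacitance, ~2 pF per cm

    def bus_power(trace_cm):
        return ALPHA * (C_PER_CM * trace_cm) * V**2 * F * LINES

    print(bus_power(10))     # ~5.8 W  for 10 cm PCB traces
    print(bus_power(0.2))    # ~0.12 W for 2 mm on-package links

Multiply the difference by a couple of memory channels and you land in the same ballpark as the 10-20W figure above.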
Maxing out at 192GB seems pretty sensible to me, given Apple's market segment. I know that the actual CPU chiplets can address at least 1TB, probably more, so they don't even have to rev the expensive silicon, only the cheap interposer which connects the CPU and the RAM chiplets.
But yeah, go ahead and complain that Apple's silicon has no hay-rack...
]]>One of the best examples of Intel's active incompetence is "ACPI", which was supposed to make everything simpler and nicer with respect to machine-dependent stuff, such as how you adjust the screen brightness etc.
Instead of doing something sensible, Intel came up with a nightmare of a language which is neither fish nor fowl, no matter what taxonomy you throw at it.
The implementation of the language is a quarter million lines of code.
Lua, which would have been a good choice, is 30000 lines of code.
And did anything get better ?
Of course it did not, because BIOS writers have grown up squeezed between Intel and Microsoft, so they'll do /anything/ to pass the compliance tests, except read the documentation for how things should work.
My impression is that the Intel culture is something like "We're Intel, we do complicated stuff" and if something looks too simple and clean, it must therefore be wrong.
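To make it concrete: the OS never pokes the hardware directly, it invokes named methods which the firmware ships in a table, written in AML bytecode. Here is a toy model of that dispatch, using the real ACPI method names for brightness (_BCL, _BCM, _BQC) but an invented mini-bytecode standing in for AML:

    # Toy model of ACPI-style control methods, for illustration only.
    # The method names are real ACPI; the bytecode format is not.

    state = {"BRIG": 100}      # firmware-side state: current brightness

    def run(program, arg=None):
        """Interpret a tiny stack program of (op, operand) tuples."""
        stack = []
        for op, *operands in program:
            if op == "push":
                stack.append(operands[0])
            elif op == "arg":          # push the caller's argument
                stack.append(arg)
            elif op == "store":        # pop a value into named state
                state[operands[0]] = stack.pop()
            elif op == "load":         # push a named state value
                stack.append(state[operands[0]])
        return stack.pop() if stack else None

    # The "DSDT": a table of named methods supplied by the firmware.
    methods = {
        "_BCL": [("push", [0, 25, 50, 75, 100])],  # supported levels
        "_BCM": [("arg",), ("store", "BRIG")],     # set brightness
        "_BQC": [("load", "BRIG")],                # query brightness
    }

    # The OS side: adjust brightness by calling firmware methods.
    print(run(methods["_BCL"]))    # -> [0, 25, 50, 75, 100]
    run(methods["_BCM"], arg=50)   # set to 50%
    print(run(methods["_BQC"]))    # -> 50

The dispatch idea itself is fine; the complaint is that the language those methods are written in did not need to be invented from scratch.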
I'll never forgive Busicom.
]]>You do not, and let's just leave it at that.
]]>Intel is due to do something monumentally stupid again right about now, because ARM chips run both faster and cooler than anything Intel has to offer.
]]>As one of the main archivists of digital data in Datamuseum.dk, I just want to say that the situation is not nearly as dire as you paint it.
Acid-free paper tape is the clear and undisputed winner: a 100% read rate.
Over the last two months I have read ~250 hard-sectored 8" floppies from a WANG word-processing system, and only about ten of those were not read perfectly; in most cases only one or two sectors are missing.
The worst cohort of media we have encountered so far are the cheap disks from the 1990s, where about 10% would lack at least one head's worth of data, no matter which drive we tried.
What the situation will be with post-Y2K data media is anyone's guess.
]]>Yes, there is something to be said for that idea, and that something is "rubbish".
If you want an idea about Intel which is fit for this forum, it is this:
Intel is the C.M.O.T. Dibbler of microprocessors.
Intel has always sucked at designing new architectures, which they tend to demonstrate about once per decade, usually running the company nearly into the ground along the way.
The 8086 was a rapidly kludged-together stopgap while their iAPX432 dragged on and eventually turned out to be a turd.
The idea of the iAPX432 was actually very interesting, essentially an object-oriented architecture, and it could possibly have gone places (the Rational R1000 did), but Intel totally botched it, so badly in fact that nearly no silicon has survived.
The 8086 kludge was an 8080 with 16-bit instructions and four extra address lines bolted on. The 8080, in turn, was the 8008 reheated with a gratin sauce on top, and the 8008 was a DataPoint 2200 on a chip.
As far as I can tell, Intel's fundamental problem seems to be what has been called "Marchitecture", or Marketing-driven Architecture.
The iAPX432 was Intel trying to capture "The Next Big Thing", which they thought would be a massively lucrative market for embedded Ada systems (the Z8002 and 68K got that instead), AND at the same time going for IBM's throat in the low-end mainframe market.
The Itanic was also Intel trying to capture "The Next Big Thing", in this case 64-bit computing. The "x86" architecture we use today was designed by AMD and eventually adopted by Intel, when it was either that or go out of business.
Between those two architectures they tried the i960, aka "P7", to capture "The Next Big Thing", in that case the RISC market, and when that got delayed they also started the i860 for good measure. Because of those two, they nearly cancelled the "uninteresting" i386 project, which saved their bacon in the end.
The specifics of why IBM chose the 8088, the discounted 8-bit version of the 8086, for the IBM PC are another interesting story, but if they hadn't, Intel wouldn't have existed by 1990.
]]>Isn't that almost the perfect temperature for Legionella growth?!
Here in Denmark the heater/tank has an "anti-scalding" thermostatic valve on the outlet, so that no matter what temperature the water in the tank might be, the delivered water is never above 50°C.
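The arithmetic of such a valve is simple mixing; the temperatures below are assumed example values:

    # Anti-scalding valve: blend hot tank water with cold supply so
    # the outlet never exceeds the setpoint.  The tank itself can be
    # kept at 60 degC, hot enough to kill Legionella.
    T_TANK = 60.0    # storage temperature
    T_COLD = 10.0    # cold supply
    T_MAX  = 50.0    # maximum delivered temperature

    def hot_fraction(t_tank, t_cold, t_out):
        """Fraction of hot water in the mix for a given outlet temperature."""
        return (t_out - t_cold) / (t_tank - t_cold)

    print(hot_fraction(T_TANK, T_COLD, T_MAX))   # 0.8: 80% hot, 20% cold

The tank stays hot enough to keep Legionella dead, and the valve just re-mixes on the way out of it.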