I know a google engineer who was saying they’re having to update their code bases to handle > 16 exabytes of storage, if you can imagine. But yeah, that’s storage, not RAM.
I suppose it is a kind of survival training? One of my bandmates who’s served came up after. “So here’s the deal. You watch what everyone else is eating. If they’re meticulously avoiding the peach cobbler or whatever it is, you F’ing stay away from that S if you know what’s good for you!”
I’m in a band that performs on occasion at CFBs (Canadian Forces Bases). We typically eat there and spend the night either in barracks or guest housing.
I have noticed that when we play for officers, dinner is like steak and lobster. When we play for enlisted, it’s more like high school cafeteria. The one and only time I had to excuse myself towards the end of a concert and miss the closing number was after eating at the enlisted mess and getting explosive diarrhea.
So the next captcha will be a list of AI-generated statements and you have to decide which are bat shit crazy?
“Recall uses Copilot+ PC advanced processing capabilities to take images of your active screen every few seconds,”
Seems like a lot of extra disk thrashing that would shorten the life expectancy of an SSD? Like it would be considerably more than your usual background chatter of daemons writing to log files and what not. Unless I’m misunderstanding this?
We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
I think I could get very nervous coding for the military, depending on what sort of application I was working on. If it were some sort of administrative database, that doesn’t sound so bad. If it were a missile guidance system, oh man! A single bug and there goes a village full of civilians. Even something without direct human casualties could be nerve-wracking. Like if it were your code which bricked a billion-dollar military satellite.
Speaking of missile guidance systems, I once met someone who worked a stint for a military contractor. He told me a story about a junior dev who discovered an egregious memory leak in a cruise missile’s software. The senior dev then told him “Yeah, I know about that one. But the memory leak would take an hour before it brings the system down and the missile’s maximum flight time is less than that, so no problem!” I think coding like that would just drive me into some OCD hell.
I have only written potentially life-threatening code once in my life. It had to do with voltage/current regulation in the firmware of a high-powered instrument used by field workers at the company where I work. It was a white-knuckled week I spent on just a single page of code, checking and re-checking it countless times and unit testing it in every conceivable way I could imagine.
Fair enough. I’m just looking for some independent confirmation as this is pretty big news.
Is this official though, or wishful thinking on the part of Cameron?
1st reaction: lmao
2nd reaction: hey wait, this is pure genius!
I have some vague recollection of a hacker convention from the 90s where people were challenged to come up with wireless networking in a one night coding marathon. (This was long before wifi.) So some dude used speech synthesis to get a machine to say “one zero one one zero…” and another to assemble the binary data into packets using speech recognition. It was hilarious, and the dev had to keep telling people to shut up and stop laughing so he could complete the demo.
But anyways… what I’m trying to suggest here is you might have the best luck if your notification sounds contain spoken commands and you use speech recognition to trigger scripts? That tech is pretty mature at this point.
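Something like this, maybe. The transcription half would come from whatever speech recognition engine you prefer, so I’ve left that out; this is just a toy sketch of the dispatch half, and the command phrases and scripts are made up for illustration:

```python
# Toy sketch: match known spoken commands (embedded in notification
# sounds) against a transcript and run the associated script.
# The phrases and commands below are purely hypothetical examples.
import subprocess

COMMANDS = {
    "backup complete": ["echo", "rotating backup logs"],
    "build failed":    ["echo", "paging the on-call dev"],
}

def dispatch(transcript):
    """Run the script for the first known command found in the transcript."""
    text = transcript.lower()
    for phrase, argv in COMMANDS.items():
        if phrase in text:
            return subprocess.run(argv, capture_output=True, text=True).stdout
    return None  # no known command heard

print(dispatch("Attention: build failed on main"))
```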
FTA:
By 2030, the country’s goal is to manufacture chips using a 28nm process technology – something TSMC did in 2011.
That’s assuming they really do have no choice but to do all fabrication domestically.
In Putin’s Russia, even the chips defect.
There should be a law that whenever this happens, the changes must be highlighted in bold.
I was astonished to find the other day that LibreOffice has no problem opening ClarisWorks files. That is an ancient Mac format that even Apple’s Pages has long since abandoned.
I remember thinking last summer it was a good thing we still had some N95s in the closet. I wore them when I was riding around on my bike and they did help.
Wouldn’t 3k Euro be essentially the luxury sedan of ebikes? My first was a class 1 aimed at tourist rentals that cost around $1.5k CAD ($1k Euro). I considered that entry level at the time, though there are cheaper ones out there now. My current one is a $2.5k CAD ($1.7k Euro) class 2. It is pretty much everything I could want in a bike. I can’t see myself spending more. Well, maybe a cargo model would cost more?
I get what you’re saying about theft though. I am lucky in that I have indoor parking both at home and work. A coworker of mine lives in a condo where he can’t park a bike indoors. So while it was thankfully never stolen, he was sufficiently nervous about it that he eventually sold his and replaced it with an escooter.
So jealous… I think where I live, a doubling of the cycling population would be like “Oh hey look, there’s another guy!”
You can always combine integer operations in smaller chunks to simulate something that’s too big to fit in a register. Python even does this transparently for you, so your integers can be as big as you want.
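For instance, here’s a toy sketch of the chunked approach: adding two numbers limb by limb in 32-bit pieces, roughly the way a bignum library (and Python’s own int type, internally) does it. The names are mine, purely illustrative:

```python
from itertools import zip_longest

MASK = 0xFFFFFFFF  # one 32-bit limb

def to_limbs(n):
    """Split a non-negative int into little-endian 32-bit limbs."""
    limbs = []
    while n:
        limbs.append(n & MASK)
        n >>= 32
    return limbs or [0]

def add_chunked(a_limbs, b_limbs):
    """Add two limb lists with ripple carry, like a hardware adder chain."""
    result, carry = [], 0
    for a, b in zip_longest(a_limbs, b_limbs, fillvalue=0):
        total = a + b + carry
        result.append(total & MASK)  # low 32 bits stay in this limb
        carry = total >> 32          # overflow carries into the next limb
    if carry:
        result.append(carry)
    return result

def from_limbs(limbs):
    return sum(limb << (32 * i) for i, limb in enumerate(limbs))

x, y = 2**100 + 12345, 2**90 + 67890  # both far too big for a 64-bit register
assert from_limbs(add_chunked(to_limbs(x), to_limbs(y))) == x + y
```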
The fundamental problem that led to requiring 64-bit was when we needed to start addressing more than 4 GB of RAM. It’s kind of similar to the problem of the Internet, where 4 billion unique IP addresses fall rather short of what we need. IPv6 has a host of improvements, but the massively improved address space is what gets talked about the most since that’s what is desperately needed.
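The back-of-envelope numbers behind both of those 32-bit ceilings:

```python
# With 8-bit bytes, a 32-bit pointer reaches exactly 4 GiB of RAM.
four_gib = 2**32
print(four_gib // 2**30)  # GiB reachable -> 4, hence the move to 64-bit

# Same 2**32 ceiling for IPv4, versus IPv6's 128-bit address space.
ipv4 = 2**32    # 4,294,967,296 addresses -- fewer than one per person
ipv6 = 2**128   # astronomically more
print(ipv4)
print(f"{ipv6:.1e}")
```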
Going back to RAM though, it’s sort of interesting that at the lowest levels of accessing memory, it is done in chunks that are larger than 8 bits, and that’s been the case for a long time now. CPUs have to provide the illusion that an 8-bit byte is the smallest addressable unit of memory, since software would break badly were this not the case, but it’s somewhat amusing to me that we still shouldn’t really need more than 32 bits to address RAM at the lowest levels, even with the 16 GB I have in my laptop right now. I’ve worked with 32-bit microcontrollers where the byte size is > 8 bits, and yeah, you can have plenty of addressable memory in there if you wanted.
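To put some hypothetical numbers on that: if the smallest addressable unit is wider than 8 bits, the same 32-bit address reaches proportionally more memory.

```python
# How far a 32-bit address reaches for various (hypothetical) unit widths.
def reachable_gib(address_bits, unit_bits):
    """GiB spanned by 2**address_bits units of unit_bits each."""
    return (2**address_bits * unit_bits // 8) // 2**30

print(reachable_gib(32, 8))   # 4  -- classic byte addressing
print(reachable_gib(32, 16))  # 8
print(reachable_gib(32, 32))  # 16 -- enough to cover that 16 GB laptop
```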