|
Quote: Users of MSAs must agree that Microsoft can collect certain personal information. That information includes "credentials, name and contact data, payment data, device and usage data, your contacts, information about your activities, and your interests and favorites," according to Microsoft's "Privacy Statement" document. F*** Microsoft.
|
|
|
|
|
A classic example of, “if you’re not paying for it, you’re the product.”
TTFN - Kent
|
|
|
|
|
Quote: Possibly, the announcement of an MSA requirement for Windows 11 Pro edition personal users doesn't just apply to Windows Insider Program testers, but also will apply more broadly to future purchasers of the Windows 11 Pro edition product. Maybe even if you pay for it. By the sound of it, that 'maybe' is disappearing.
|
|
|
|
|
Why do I suspect that they'll define personal as "not on a domain" to suck in small businesses and schools too?
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Dan Neely wrote: Why do I suspect that they'll define personal as "not on a domain" to suck in small businesses and schools too? Because you are a realist?
|
|
|
|
|
Because that's how Microsoft has defined personal use since at least Windows 7.
|
|
|
|
|
The hearing comes after the Pentagon report on unidentified aerial phenomena was released last June. Because nothing will solve UFOs like IPPs (Identified Perambulating Politicians)
It's not like there are other issues to solve these days
|
|
|
|
|
UFOs - Unprincipled Financial Operators?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
A masterpiece!
TTFN - Kent
|
|
|
|
|
Daniel Pfeffer wrote: UFOs - Unprincipled Financial Operators? Will be included in the next C++ revision as the $$$ operator.
|
|
|
|
|
This release brings continued improvements to the C# and .NET experiences, new Git performance and experiences, updates for C++ developers, and new Azure tools for local development and deployment. Because you can never have enough 17.2 versions of products
|
|
|
|
|
1 hour later... and still 0 bytes downloaded...
|
|
|
|
|
Intel's 12th-gen CPUs have already made a strong showing against AMD's Ryzen 6000 chips, but now Intel is taking things to another level with its "HX" processors. Not a laptop, but a lap cooker
|
|
|
|
|
Someone should make a video of making an omelet on this CPU.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
My XPS15 laptop has 10 cores, hyperthreaded to 20. Since even today very little software handles multiple cores, I'm not sure of the reason for this other than to one-up AMD.
|
|
|
|
|
obermd wrote: even today very little software handles multiple cores Certainly true, and I must admit that I feel a special joy when I run a simple application that manages to utilize my 12 logical cores (6 physical, hyperthreaded) to 95+% load, doing useful work. It doesn't happen very often! (My most recent example: using Handbrake to recode a Blu-ray image into an H.265 file.)
Many years ago, when I went from a single-core to a two-core CPU, heavy applications didn't speed up that much - but all sluggishness in mouse and keyboard response disappeared completely when interacting with one application while another one was eating CPU in the background. Having one core for each application made a tremendous difference.
Going from one to two is of course a lot more fundamental than going from 12 to 16. You cannot expect very much improved performance from a single application, in most cases. But you can probably start 12 different applications eating CPU at the same time, with each one of them running at almost the same speed as if they were alone.
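That "many independent CPU-bound tasks" case can be sketched with plain std::thread; the busy_work function and the task count of 8 are made-up placeholders, not anything from the thread:

```rust
use std::thread;
use std::time::Instant;

// Placeholder CPU-bound work: a slow sum of squares.
fn busy_work(n: u64) -> u64 {
    (0..n).fold(0u64, |acc, i| acc.wrapping_add(i * i))
}

fn main() {
    let n = 20_000_000;

    // Run 8 independent "applications" one after the other.
    let start = Instant::now();
    let serial: Vec<u64> = (0..8).map(|_| busy_work(n)).collect();
    let serial_time = start.elapsed();

    // Run the same 8 tasks on 8 threads; with enough cores, each runs
    // at almost the same speed as if it were alone on the machine.
    let start = Instant::now();
    let handles: Vec<_> = (0..8)
        .map(|_| thread::spawn(move || busy_work(n)))
        .collect();
    let parallel: Vec<u64> = handles.into_iter().map(|h| h.join().unwrap()).collect();
    let parallel_time = start.elapsed();

    assert_eq!(serial, parallel);
    println!("serial: {:?}, parallel: {:?}", serial_time, parallel_time);
}
```

On a machine with 8+ free cores, the parallel wall-clock time should approach one eighth of the serial time; the per-task speed barely changes.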
|
|
|
|
|
Power efficiency is the main reason for this. It'll have 8 fast hyperthreaded cores for heavy workloads, and 8 optimized for lower power consumption for lighter ones.
I haven't seen anything in depth about how well the power savings work in the real world. We know it works on mobile (ARM's big.LITTLE); desktop OSes are much more complicated in terms of what they have going on, though. On the one hand, that means a lot more background crap that can be offloaded because it's not time-sensitive. On the other, a lot more stuff makes the scheduling a lot more complicated.
E.g. at one point MS stated they intended for W11 to send everything but the foreground app to the efficiency cores. While that's a reasonable baseline, there are cases where it's not what the user wants, i.e. when they're doing time-filling stuff while waiting for a video rendering job to complete.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Rust is infamous for having a steep learning curve. Assuming you never sleep either?
|
|
|
|
|
And the crux of the reason Rust is "hard" is in the first paragraph:
Quote: but with a good mental model of how memory works, how objects move,
Far too many "programmers" today have no clue how memory works or how stuff gets moved around in memory.
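For what it's worth, Rust bakes that mental model into the language itself; a minimal sketch of its move semantics (the variable names are arbitrary):

```rust
fn main() {
    // A heap-allocated String: assigning it *moves* ownership of the
    // pointer/length/capacity triple; the heap data is not copied.
    let s = String::from("hello");
    let t = s; // move: `s` is no longer usable from here on
    // println!("{}", s); // compile error: value used after move
    println!("{}", t);

    // A Copy type like u32 is duplicated instead of moved.
    let a = 42u32;
    let b = a;
    println!("{} {}", a, b); // both remain valid
}
```

The compiler refusing the commented-out line at build time is exactly the "how objects move" model being enforced rather than merely assumed.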
|
|
|
|
|
An OS programmer obviously must know the details of "how memory works or how you move stuff around in memory", but for a plain programmer, I am not so sure.
With veteran cars, you may have to learn double clutching, but at least since 1960 (maybe even earlier), all standard cars have had synchronized gearboxes, and you don't even have to know how that synchronization is achieved. Up until the 1980s, you should have understood the whys and hows of a manual choke (at least if you lived in a country where winter temperatures are far below freezing). For the last 30 years, new cars have all had an automatic choke. Electronic ignition has removed the need to understand how a distributor works. And so on, and so on ...
Even though we old-timers think it a pity that young programmers know nothing about, say, disk scheduling and memory management hardware, we must be willing to accept that some of our knowledge is of secondary importance, at best, to the everyday application programmer today. A lot of it isn't even relevant to OS programmers, such as calculating the track and sector number from a logical disk address.
I think it is time to accept that application programmers need not know how stuff is moved around in memory. Of course they should be aware of the total space requirements, and relate that to the available resources, but then leave it to the OS and MMU hardware to provide it. Just as we leave it to the file system and disk logic to provide a block of data from the disk, without understanding track and sector addressing.
|
|
|
|
|
trønderen wrote: Even though we oldboys think it a pity that young programmers know nothing about, say, disk scheduling and memory management hardware,
Our computers these days are "mini-mainframes", with I/O processors, graphics processors, etc. The only part of the computer that most of us program is the CPU, but each of the other components has non-trivial code running on it (e.g. the firmware built in to modern HDDs & SSDs, etc.).
trønderen wrote: I think it is time to accept that application programmers need not know how you move stuff around in memory.
I must disagree with you here. Storage locality (memory, disk, etc.) plays a large part in code optimization. The essence of optimization is intelligent caching of data; having your code "thrash" to disk or require a main memory read for almost all operations is guaranteed to slow your program down by a large amount.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
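The locality point can be demonstrated with nothing more than a change of traversal order; a rough Rust sketch (the matrix size is arbitrary, and a Vec of Vecs is only contiguous within each row, which is enough to show the effect):

```rust
use std::time::Instant;

fn main() {
    const N: usize = 2048;
    // Row-major 2D matrix of ones; each row is a contiguous allocation.
    let m = vec![vec![1u64; N]; N];

    // Row-major traversal: walks each row's memory sequentially,
    // so the hardware prefetcher and cache lines do their job.
    let start = Instant::now();
    let mut row_sum = 0u64;
    for i in 0..N {
        for j in 0..N {
            row_sum += m[i][j];
        }
    }
    let row_time = start.elapsed();

    // Column-major traversal: jumps between rows on every access,
    // defeating the cache even though it reads the same data.
    let start = Instant::now();
    let mut col_sum = 0u64;
    for j in 0..N {
        for i in 0..N {
            col_sum += m[i][j];
        }
    }
    let col_time = start.elapsed();

    assert_eq!(row_sum, col_sum); // identical result, very different cost
    println!("row-major: {:?}, column-major: {:?}", row_time, col_time);
}
```

Same arithmetic, same answer; only the access pattern differs, and the column-major pass is typically several times slower - which is the caching argument in miniature.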
|
|
|
|
|
Without an understanding of memory management, OS-level task and thread scheduling, and in some cases even disk management, it is impossible to consistently write high-performance code. This is especially true in garbage-collected languages such as any .NET language, Java, and even JavaScript. This lack of understanding of memory allocation and management leads to poor design decisions, producing code that simply cannot scale without adding hardware.
Even business programmers, which I am, need to understand memory management to guide the selection of the correct data structures to ensure their applications are responsive and have low impact on users' PCs. A perfect example was at my last job, where we used OpenVMS running on a dual-processor AlphaServer. We had an incredibly long and somewhat complex overnight processing script that needed to be complete by 6 AM every morning. As our data grew, it was taking until close to 7 AM on some mornings. Using my general knowledge of task scheduling, disk performance, and CPU utilization, I was able to rewrite the longest component of this process to partition that part of the problem into independent units and then submit those units for processing. Then I monitored disk I/O and CPU load to find the optimum number of simultaneous job partitions that could be run, and set the VMS queue to allow that many jobs to run at one time. These changes cut three to four hours off our nightly processing time. Without my knowledge of system resource usage this simply wasn't possible, and my boss, who actually wrote the application software, didn't think we could shorten the overnight processing time and was thinking about adding hardware.
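That partition-and-throttle pattern (minus the VMS batch queue) can be sketched with a shared channel feeding a fixed pool of worker threads; process_unit, the unit count, and WORKERS are placeholders you would tune against observed disk I/O and CPU load:

```rust
use std::sync::mpsc;
use std::sync::{Arc, Mutex};
use std::thread;

// Stand-in for the real per-partition overnight work.
fn process_unit(unit: u64) -> u64 {
    unit * 2
}

fn main() {
    const WORKERS: usize = 4; // the "how many jobs at once" knob
    let units: Vec<u64> = (0..100).collect(); // independent partitions

    // Load every partition into the queue, then close it so workers
    // terminate once it drains.
    let (tx, rx) = mpsc::channel::<u64>();
    for u in &units {
        tx.send(*u).unwrap();
    }
    drop(tx);

    let rx = Arc::new(Mutex::new(rx));
    let (result_tx, result_rx) = mpsc::channel::<u64>();

    // Fixed pool: only WORKERS partitions are ever in flight at once.
    let handles: Vec<_> = (0..WORKERS)
        .map(|_| {
            let rx = Arc::clone(&rx);
            let result_tx = result_tx.clone();
            thread::spawn(move || loop {
                // Hold the lock only long enough to pull the next unit.
                let unit = match rx.lock().unwrap().recv() {
                    Ok(u) => u,
                    Err(_) => break, // queue drained
                };
                result_tx.send(process_unit(unit)).unwrap();
            })
        })
        .collect();
    drop(result_tx);

    let total: u64 = result_rx.iter().sum();
    for h in handles {
        h.join().unwrap();
    }
    assert_eq!(total, units.iter().map(|u| u * 2).sum::<u64>());
}
```

Raising or lowering WORKERS is the moral equivalent of adjusting the VMS queue's job limit: you sweep it until disk I/O or CPU saturates, then back off.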
|
|
|
|
|
"As long as you have a team of experts to do the hard stuff, its complexity isn't a problem for everyone else" is hardly a ringing endorsement.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Thinking a bit more, it's not a hard/long-term block for the language's growth. If the language maintainers/framework authors can successfully encapsulate enough of the hard stuff behind libraries that normal devs don't need to worry about it, it could still go reasonably mainstream.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
What’s new going forward is the addition of a 4,000-qubit quantum computer by the end of 2025. 3-5 years in the future is the new 5-10?
|
|
|
|