|
I would start with Sysinternals' Process Monitor.
Windows Sysinternals | Microsoft Docs[^]
I would also consider putting in another hard drive (SSD) and testing with a new install of Windows. I will avoid the obligatory suggestion of installing that other OS.
>64
If you can keep your head while those about you are losing theirs, perhaps you don't understand the situation.
|
|
|
|
|
I am certainly considering upgrading to an SSD, as at present it is just a 7200 rpm hybrid, but for its main purpose of music streaming it hardly seems worth all those beer tokens.
Thanks for the Sysinternals prompt - I am just now going to give it a whirl.
|
|
|
|
|
Maybe it doesn't like the music you are streaming.
Seriously, how old is the laptop? I hate to say it, but perhaps the CPU is just dying.
I have had that happen to me.
Hope you figure it out.
|
|
|
|
|
Have you checked that its airways are clear? Laptops have a habit of building a lovely dustbunny just downstream of the fan, against the fins it is trying to cool.
In most of them it's a matter of around 10 screws to remove to get access, but in some cases as high as 47! (Lenovo ThinkPad X300, I'm looking at you)
Maintenance manuals are generally available online.
Open it up, work through with a soft paintbrush and vacuum. I'm not a fan of canned air - it just tends to relocate the dust to somewhere worse.
Cheers,
Peter
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Peter_in_2780 wrote: I'm not a fan of canned air - it just tends to relocate the dust to somewhere worse.
I'd agree with that - normally into my eyes at some point.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
All been done and there is good - but very very hot - airflow from the port. One minor curiosity - when the machine locks up, the fan continues to run, but I guess that's to be expected. Also the screen stays showing what was happening when it died.
|
|
|
|
|
Are you sure it's not something like a bad/clogged fan going to your GPU?
Real programmers use butterflies
|
|
|
|
|
The heat sink is a thick copper bar, as explained above, and it handles both CPU and GPU. If it were a blocked/failing fan then both would overheat.
|
|
|
|
|
Still could be the paste bonding the GPU to the sink went dry/bad.
That used to happen to me on old CPUs a lot. If the GPU routinely ran hotter than the CPU then I could see the paste decaying at different rates.
Anyway, just a thought. Most of the time I face catastrophic issues with my PCs it's either a hardware problem or a Windows update.
Real programmers use butterflies
|
|
|
|
|
|
Chris C-B wrote: switch long press needed. It always happened when the GPU was working hard, but the CPU temperature never went above 78°C - well within bounds for a Core i7 - and usually ran at around 65°C.
Just noting, of course, that the CPU and GPU are two different things, and in two different places.
A brief search also suggests that GPUs have a higher failure rate than CPUs. So if the GPU is on the edge of failure, running hotter (though not hot) might push it over the edge.
|
|
|
|
|
I am fully aware of that, but I had no way of measuring the GPU temperature. They are, however, on the same cooling bar, the thick copper bar heat sink that snakes from one to the other.
|
|
|
|
|
I am replying to my own message, having, I hope, resolved the issue. I stopped tvnserver.exe loading at startup, and all now appears to be well - it ran last night quite happily from late afternoon to 23:00. I am not suggesting that tvnserver.exe was the direct cause of the problem, but it may well have been letting something else in that was burning up the GPU.
|
|
|
|
|
OK, this place has a lot of good articles to read, and there was one I wanted to comment on, so I signed in, which triggered an e-mail to me containing a link that logs me in. So I went back to the original article I had wanted to comment on, but that page showed me as not logged in! Even if I copy & paste the URL from the original tab into the tab opened by the e-mail link, it still shows me as not logged in, and when I log in from there, to paraphrase Lou Costello, we're back again at getting an e-mail link (i.e., first base).
|
|
|
|
|
What did you select as your cookie settings? If it's "everything off" then it can't store your login status and that may be something to do with it.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
OriginalGriff wrote: What did you select as your cookie settings? If it's "everything off" then it can't store your login status and that may be something to do with it.
The logins for all other webpages persist fine.
|
|
|
|
|
No, I mean the site-specific pop-up that lets you disable ad cookies, for example. If you said "disable all", some sites take that as "no cookies at all" and forget logins as a result.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
re Medium: [^]
As a writer, I'd never use Medium because:
Quote: Unless otherwise agreed in writing, by submitting, posting, or displaying content on or through the Services, you grant Medium a nonexclusive, royalty-free, worldwide, fully paid, and sublicensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your content and any name, username or likeness provided in connection with your content in all media formats and distribution methods now known or later developed on the Services. Medium needs this license because you own your content and Medium therefore can’t display it across its various surfaces (i.e., mobile, web) without your permission.
For a critique of using Medium to post your own content: [^]
As a reader, I'm too cheap to pay for its premium content or full access. But I do find certain free articles I really enjoy, like: [^]
As a non-subscriber, you should have access to a couple of non-premium articles a month.
I don't think the not-logged-in problem is cookie related. I'm not a member, and I have Privacy Badger running on Chrome actively blocking two Medium trackers, but I have no problem logging in using the link they send me.
I assume you've reloaded the page with Ctrl-F5, and that you are not trying to access premium content.
«One day it will have to be officially admitted that what we have christened reality is an even greater illusion than the world of dreams.» Salvador Dali
modified 17-Jul-21 4:43am.
|
|
|
|
|
Bug in my code: "Oh yeah, small oversight on my part, happens to the best of us!"
Bug in someone else's code: " ing piece of code! Dumb ing programmers are totally clueless and I hope they ing die in a fire!!! "
|
|
|
|
|
Don't hold back. Tell us how you really feel!
"It's better out than in." - Mrs. Cosmopolite
|
|
|
|
|
I'm trying to do in-place decompression of zip files such that you don't need to actually extract the streams in order to get the decompressed contents.
Basically, you can open a zip and read it and it will decompress on demand while you do block reads off of a special implementation of a stream it gives you.
I'm not great at math, or doing compression and cryptography algorithms even if I vaguely understand the concepts, so it was lucky for me that I found some of the relevant code in the public domain.
However, it does callbacks such that you give it a function it uses to flush data as it writes.
I can't use callbacks because I'm presenting a stream interface. You need to be able to request a block to be decompressed, at which point one fragment of the decompression takes place.
So basically, I need to turn this into a coroutine.
It's a bit like turning a SAX style xml reader into an XmlReader style pull parser.
It's mind bending. One big issue is that I don't know how big the minimum buffer for a Huffman block is, and even when I do, I'll have to sift through some incomprehensible code.
What I thought would take me part of a day may take me a few, but it's an adventure.
I keep telling myself that in the end this will be worth all the effort, because it means I can create better EPUB readers on cheaper hardware, and possibly even browse the web from an ESP32 or, soon, an ARM Cortex-M or other IoT gadget. (The code should work; I'm just not clear on the memory requirements of my own code yet, but it's light.)
Why zips? Zips are just part of the mess, because EPUBs are renamed zip files that contain their HTML and image content. And I'll probably be using them as packages to deploy HTML-based UIs as well.
The only reason any of this works is that all of it is streamed on demand and progressively loaded, from the zips, to the images and HTML contained therein, to the TrueType fonts used to render the text. It's all demand streamed, so it never needs to be loaded all at once, keeping the memory requirements tiny.
It's gonna be so darned cool. Imagine a (bare bones, think lynx-ish but with graphics) web browser on a $5-$10 SoC.
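To show the pull-style shape I'm after, here's a minimal sketch in Python, where zlib's decompressobj already supports caller-driven, bounded reads with no callbacks. (The class and names here are just illustrative, not from any real EPUB library; my actual code is C for the micro.)

```python
import io
import zlib

class PullInflater:
    """Pull-style DEFLATE reader: the caller asks for at most
    max_out bytes; compressed input is fetched on demand."""

    def __init__(self, src, chunk=512):
        self.src = src                    # file-like object of raw DEFLATE data
        self.chunk = chunk                # compressed input fetched per refill
        self.d = zlib.decompressobj(-15)  # -15 = raw DEFLATE, as in zip entries
        self.eof = False

    def read(self, max_out):
        out = b""
        while len(out) < max_out and not self.eof:
            # Re-feed input zlib held back last time, else fetch more.
            data = self.d.unconsumed_tail or self.src.read(self.chunk)
            if not data:
                self.eof = True
                out += self.d.flush()
                break
            out += self.d.decompress(data, max_out - len(out))
            if self.d.eof:
                self.eof = True
        return out

# Round-trip demo: compress, then pull back in small caller-chosen blocks.
raw = b"the quick brown fox jumps over the lazy dog " * 100
comp = zlib.compressobj(9, zlib.DEFLATED, -15)
blob = comp.compress(raw) + comp.flush()

r = PullInflater(io.BytesIO(blob))
got = b""
while True:
    block = r.read(64)
    if not block:
        break
    got += block
assert got == raw
```

The callback-to-pull inversion happens in `read()`: instead of the decompressor pushing output at you, the caller bounds each step with `max_length`, and the leftover input parks in `unconsumed_tail` until the next request.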
Real programmers use butterflies
|
|
|
|
|
I don't know if it helps, but:
We use zip compression in the file format for a logging application. Our approach to this problem is to compress incoming data until the size of the currently compressed content reaches or exceeds a given size (in our case, 64K). At that point we write the compressed size to the file followed by the compressed data.
When reading we do the reverse: read the size, use the size to read the compressed data, decompress. We don't need fixed-size buffers on reading, but you could alter our approach: while you're compressing incoming blocks, accumulate both the compressed and the uncompressed totals. When the uncompressed total reaches your decompression buffer size, output the compressed size and the compressed block. That guarantees a limit on the buffer size you'd have to allow for on incoming data.
This approach of course requires that you control both the compression and decompression handling of the data. If you can only control one, then this wouldn't apply.
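Roughly, the scheme in Python for illustration. The 64K-ish threshold and size-then-data record layout are from the description above; the little-endian 4-byte length prefix and zero-length terminator are my own choices, not from our actual format.

```python
import io
import struct
import zlib

CHUNK_RAW = 64 * 1024  # compress whenever this much uncompressed data accumulates

def write_chunked(out, data, chunk_raw=CHUNK_RAW):
    """Write data as a series of [4-byte length][compressed block] records."""
    for i in range(0, len(data), chunk_raw):
        comp = zlib.compress(data[i:i + chunk_raw])
        out.write(struct.pack("<I", len(comp)))   # little-endian size prefix
        out.write(comp)
    out.write(struct.pack("<I", 0))               # zero length = end of stream

def read_chunked(inp):
    """Read records back: size first, then exactly that many compressed bytes."""
    while True:
        (n,) = struct.unpack("<I", inp.read(4))
        if n == 0:
            return
        yield zlib.decompress(inp.read(n))

buf = io.BytesIO()
payload = bytes(range(256)) * 1000               # 256,000 bytes
write_chunked(buf, payload, chunk_raw=4096)
buf.seek(0)
assert b"".join(read_chunked(buf)) == payload
```

Because each record bounds its uncompressed size by `chunk_raw`, the reader can preallocate one fixed decompression buffer and never overflow it.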
Software Zen: delete this;
|
|
|
|
|
That's essentially what I'm going to do, but in order to do that I either need to finish reverse engineering this code or write my own. I'd prefer to reverse engineer this stuff though since it's also used for PNG decompression elsewhere and I'd like to share bits to keep the code footprint down.
Real programmers use butterflies
|
|
|
|
|
Xiaomi overtakes Apple as number two smartphone vendor for first time - The Verge[^]
tl;dr?
Mobile phone sales worldwide:
1st (19%) Samsung
2nd (17%) Xiaomi
3rd (15%) Apple
It was Huawei above Apple until the US sanctions stopped them using Android ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Where's Motorola? Oh, it's Lenovo now? Where's Lenovo?
|
|
|
|
|