|
Sander Rossel wrote: I'm using IMAP, but on the mail server all my mails are unread and uncategorized, so it doesn't sync anything except inbox mailserver to inbox Gmail (one way).
Is this something you have limited? I used other emails on GMail and have not faced any issue.
Alternatively, create a new email. I checked and can confirm that sander_cool_dude on outlook is available.
"It is easy to decipher extraterrestrial signals after deciphering Javascript and VB6 themselves.", ISanti[^]
|
|
|
|
|
lw@zi wrote: Is this something you have limited? I used other emails on GMail and have not faced any issue. Nope, I'm not sure what it is, or if it's supposed to sync labels too, but it doesn't seem to do so by default.
I've checked the Gmail settings and it seems it should sync labels though.
Creating a new email isn't an issue, as I have my own domain name, but that's really not what I want.
|
|
|
|
|
This has clusterfuck written all over it
"There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies. The first method is far more difficult." - C.A.R. Hoare
Home | LinkedIn | Google+ | Twitter
|
|
|
|
|
I have multiple gmail accounts for different purposes and use Thunderbird to transfer emails between accounts without having to forward everything. On occasion I mess up and send from the wrong account.
You can download everything from gmail into Thunderbird, and you should be able to upload into Outlook.
Warning: Thunderbird doesn't understand labels -- it treats them as folders. If an email has more than one label, it will be downloaded once for each label.
|
|
|
|
|
I recently rewrote the hand-rolled Slang parser to make it better.
That was successful. It parses all my test material, faster than the two previous parsers.
However, I then began reintegrating it into projects like Deslang and Parsley.
So far so good, except wait, this is curious.
Deslang is no longer able to resolve my LookAheadEnumerator code! (Not even part of the parsing process!)
Funny thing, it can't determine the runtime type for IEnumerator<T> (duh! it has no runtime type - it's an uninstantiated generic!)
I'm not confused as to why it doesn't work. What blows my mind is that it worked before.
I have no idea how. It shouldn't have, as far as I can tell.
And yet, I know it did, because if it didn't, Parsley couldn't even parse word one. LookAheadEnumerator sits right next to its black little heart.
I still have the old codebase, but it's non-trivial to find out why Deslang worked in this case. It's so much easier to trap error conditions than successes.
Update: I fixed it all and figured out the old magic. I had convinced myself that the magic wouldn't work, but yes, Ramona, you can call
typeof(IEnumerator<>).GetMembers()
and get a meaningful result
Update 2: in what was one of the most hair-raising migrations I've encountered since I left the professional fold, I've migrated all of BuildPack to use the new Slang parser.
19 projects, all building successfully (which requires Slang to work, or these build tools error out!)
So yay. If this had blown up, I don't know what I'd do.
Steve Wozniak is the only thing at Apple that isn't evil.
modified 13-Jan-20 7:34am.
|
|
|
|
|
Obligatory YouTube: The Reason Why[^]
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
It worked
Steve Wozniak is the only thing at Apple that isn't evil.
|
|
|
|
|
Have you tried turning it off and on again?
I wanna be a eunuchs developer! Pass me a bread knife!
|
|
|
|
|
Actually, kind of.
But seriously, I found the problem. It *was* with the parser and it was silly.
ParseNamespaceName()
was returning "SystemCollectionsGeneric" for "System.Collections.Generic" so types weren't getting resolved.
I figured out the magic of how the old codebase worked and it's still working.
Basically, you can do
typeof(IEnumerator<>).GetMembers()
To cheat when you don't have a generic type argument to pass. Like when I find
IEnumerator<T> in my code.
For some reason I got it into my head that this didn't work.
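In case it helps anyone else who has talked themselves out of it, a minimal, self-contained sketch of the trick: `typeof(IEnumerator<>)` hands you the *open* generic type definition, and reflection works on it just fine.

```csharp
using System;
using System.Collections.Generic;
using System.Reflection;

class OpenGenericReflection
{
    static void Main()
    {
        // typeof(IEnumerator<>) yields the open generic type definition --
        // no type argument required, yet reflection still works on it.
        Type open = typeof(IEnumerator<>);
        Console.WriteLine(open.IsGenericTypeDefinition); // True

        // GetMembers() returns what IEnumerator<T> itself declares
        // (the Current property and its getter; MoveNext lives on the
        // non-generic IEnumerator base interface).
        foreach (MemberInfo m in open.GetMembers())
            Console.WriteLine(m.Name);
    }
}
```

Note that `GetMembers()` on an interface does not flatten base interfaces, so only the members declared directly on `IEnumerator<T>` come back.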
Steve Wozniak is the only thing at Apple that isn't evil.
|
|
|
|
|
I used PaintShop Pro 2020 today, and when I went to close it, it announced that Update 2 was available, and did I want to install it?
That's fine - it's the civilized time to ask, when I'm finished with it. So I say "Yes", up comes the UAC box, OK, Setup starts. First it complains that the app is still running ...
Then it starts, gets to about 20% or so, and just sits there saying "Installing" - after half an hour or so, I add an Explorer window so I can see if the progress bar is just moving so slowly I can't see it, and fifteen minutes later it still hasn't moved a pixel.
So I reach for the "Cancel" button ... and it jumps to 75% and starts showing file names as soon as the mouse is over it.
I hate that - just try to give us some kind of indicator that you are alive, all right? Or has Corel employed the guy who wrote the original Windows "File Copy" progress indicator?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Obligatory xkcd: Estimation[^]
"Five fruits and vegetables a day? What a joke!
Personally, after the third watermelon, I'm full."
|
|
|
|
|
It's art, man.
The dramatic build-up of tension is important to the overall artistic experience.
I wanna be a eunuchs developer! Pass me a bread knife!
|
|
|
|
|
OriginalGriff wrote: First it complains that the app is still running If you want to write over a file, it cannot be in use. Not something new, is it?
OriginalGriff wrote: So I reach for the "Cancel" button ... and it jumps to 75% and starts showing file names as soon as the mouse is over it. Sounds like it wasn't updating the UI for a bit.
OriginalGriff wrote: Or has Corel employed the guy who wrote the original Windows "File Copy" progress indicator? If you can calculate the average time it takes to move a MB, you can calculate the estimated time for the entire operation. If the blocks you move are of different sizes, then the speed will differ during the operation, meaning the average changes. I like that dialog; it does not say it will be done at that exact time, it just tries to update its best guess regularly.
So, when I show a progressbar, I include the estimated time remaining. You write the functionality once and embed it in the progressbar.
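The "write it once, embed it in the progressbar" idea can be sketched as a tiny helper that turns elapsed time plus fraction-complete into a remaining-time estimate. All names here are hypothetical, a sketch of the idea rather than anyone's actual code:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Minimal remaining-time estimator: extrapolate total duration from
// elapsed time and the fraction of work completed so far.
class EtaProgress
{
    readonly Stopwatch clock = Stopwatch.StartNew();

    public TimeSpan? Remaining(double fractionDone)
    {
        if (fractionDone <= 0) return null; // nothing to extrapolate from yet
        double totalSeconds = clock.Elapsed.TotalSeconds / fractionDone;
        return TimeSpan.FromSeconds(totalSeconds - clock.Elapsed.TotalSeconds);
    }

    static void Main()
    {
        var eta = new EtaProgress();
        Thread.Sleep(100);                      // pretend 10% of the work took 100 ms
        Console.WriteLine(eta.Remaining(0.10)); // roughly 900 ms should remain
    }
}
```

A progress bar control that owns one of these only needs to be fed the completed fraction; the estimate comes for free.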
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: If you want to write over a file, it cannot be in use. Not something new, is it?
No, and I agree. But ... when I say "close the app" and it pops up "do you want to update?" surely it should continue and actually close the app just to be civilized?
when I show a progressbar, I include the estimated time remaining. You write the functionality once and embed it in the progressbar.
Yes - and particularly when you write an updater, because you know what files you are replacing with what size new ones and should be able to come up with a "smooth" progress bar and a "reasonable estimate" of how much time is left.
This is just lazy coding by Corel ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
OriginalGriff wrote: But ... when I say "close the app" and it pops up "do you want to update?" surely it should continue and actually close the app just to be civilized? Only if the app allows that; it can block being shut down, like they tend to do when there's unsaved work.
OriginalGriff wrote: Yes - and particularly when you write an updater, because you know what files you are replacing with what size new ones and should be able to come up with a "smooth" progress bar and a "reasonable estimate" of how much time is left. That's the "improvement" I noticed in Win8; it tries to give a better estimate by enumerating all files to get their sizes before it does anything else. If you have a lot of files, you can see the difference with the Win2000 version.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I think the point is: if the app offers an update on close then it should just do the damn job properly and make sure the program is shut down first... like most applications that have this feature do.
Eddy Vluggen wrote: Only if the app allows that; it can block being shut down, like in case they tend to do when there's unsaved work.
If the app shows the "update" dialog before the "unsaved changes" dialog then that's some pretty awful design. Also, the app should be capable of checking its own state before it shows the update dialog to make sure it can be shut down without problems.
|
|
|
|
|
musefan wrote: If the app shows the "update" dialog before the "unsaved changes" dialog then that's some pretty awful design. I think it is awful design to have the app try to update itself.
musefan wrote: Also, the app should be capable of checking its own state before it shows the update dialog to make sure it can be shut down without problems. Most installers offer to close the running app for you.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: If you want to write over a file, it cannot be in use. Not something new, is it? 40 years ago, when I was still a student, I learned about how computer-controlled phone switches did updates by managing two parallel versions of a code module, both active. When the updated module is installed, all new connections use it, but already established connections continue to use the old version. As soon as the reference count for the old one drops to zero, it is released from memory.
For years I have been reading about the Windows facilities for marking a file for deletion at reboot (although I haven't actually made use of it myself). The file is no longer visible in the file system directories, but it remains available on disk for processes that have already opened it. This certainly has been available since the arrival of Windows 7, but I would guess that it came with XP (or that it has always been present in the NT family).
So you can certainly replace a file that is still in use. If you insist on using exactly the same physical disk pages, all the users of it must have opened the file in the proper mode, but you don't need that either to delete a program system or to update to a new version.
Eddy Vluggen wrote: Sounds like it wasn't updating the UI for a bit. Isn't that exactly the complaint? If the procedure doesn't care to make any update between 20% and 75% completion, it really needs no progress bar. A message "Operation started" says it all. I sometimes use that myself when the operation involves running software over which I have no control, but for an uninstall procedure, the software manufacturer certainly should know what is to be done!
Even if you activate the uninstall from the running application, this may activate the uninstaller as a separate program, running as a detached process, and then terminate. If the uninstaller is set up to run from a paged image, it may delete both the application files and its own program file; the .exe needs not to be kept open if all code has been read in for paging. There are scores of uninstallers that do a complete uninstall, including deleting the uninstaller itself. If it happens, it must be possible.
Eddy Vluggen wrote: If the blocks you move are of different sizes, then the speed will differ during the operation, meaning the average changes. The uninstaller knows which files to delete, how many and their size. In any case it should start out with a rough idea of the performance you can expect from a typical disk, both in time to create / open / delete a file, read / write a megabyte etc. Maybe, if the uninstaller doesn't care to check whether the disk is an old rotating one or a flash disk, the first estimate may be a little off, but simply looking at how long the first few operations took should quickly correct that first guess.
As the operation progresses, with the number of completed operations continuously increasing, the estimates ought to be more and more stable, not jump all over the place. If you have deleted X files, releasing Y Mbytes of disk space, you should expect the next file delete to require 1/X as much fixed time plus Z/Y (with Z being the size of the file) in variable time. Of course you can add other factors into the calculation as well, but even a primitive fixedTime + volumeDependentTime estimate based on previous operations ought to give us far more stable progress bar progress than we see in most programs.
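That fixedTime + volumeDependentTime idea can be sketched by fitting t = a + b*size over the deletes observed so far (least squares), then extrapolating to what remains. All names are hypothetical; this is a sketch of the estimate described above, not any real uninstaller's code:

```csharp
using System;
using System.Collections.Generic;

// Fits t = a + b*size over observed deletes: a is the fixed per-file cost,
// b the volume-dependent cost per megabyte.
class DeleteTimeModel
{
    readonly List<(double Size, double Seconds)> samples = new();

    public void Record(double sizeMb, double seconds) => samples.Add((sizeMb, seconds));

    public (double A, double B) Fit()
    {
        int n = samples.Count;
        if (n == 0) return (0, 0);
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        foreach (var (x, y) in samples) { sx += x; sy += y; sxx += x * x; sxy += x * y; }
        double denom = n * sxx - sx * sx;
        double b = denom == 0 ? 0 : (n * sxy - sx * sy) / denom; // seconds per MB
        double a = (sy - b * sx) / n;                            // fixed seconds per file
        return (a, b);
    }

    public double EstimateRemaining(int filesLeft, double mbLeft)
    {
        var (a, b) = Fit();
        return filesLeft * a + mbLeft * b;
    }

    static void Main()
    {
        var m = new DeleteTimeModel();
        m.Record(1, 1.1);    // small file: mostly fixed cost
        m.Record(10, 2.0);
        m.Record(100, 11.0); // large file: mostly volume cost
        Console.WriteLine(m.EstimateRemaining(10, 100)); // 20 (a = 1.0, b = 0.1)
    }
}
```

With those three samples the fit is a = 1.0 s per file and b = 0.1 s/MB, so ten remaining files totalling 100 MB estimate to 20 s, and every new sample refines the fit, so the estimate stabilizes rather than jumping around.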
|
|
|
|
|
Member 7989122 wrote: So you can certainly replace a file that is still in use. Not in Windows; replacing requires writing (and locking) to a file that is in use.
Member 7989122 wrote: Isn't that exactly the complaint? That's two different things; the progress not being updated by the installer, or Windows not painting because the main-process is busy.
Member 7989122 wrote: The uninstaller knows which files to delete, how many and their size. Usually, it doesn't.
Member 7989122 wrote: but simply looking at how long the first few operations took should quickly correct that first guess. That's exactly what the Windows progressbar shows when copying files and where all those people complain about it moving from 10s to 30m to 15s.
Member 7989122 wrote: not jump all over the place. Given how a FS works, the file may be "all over the place". Also, some updates may have changed sizes, and there'll be a lot of small files created to save preferences and stuff. Lots of small files take more time than a single file of the same size. So no, it's not as easy as you paint it.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Eddy Vluggen wrote: Not in Windows; replacing requires writing (and locking) to a file that is in use. If the application starts the uninstaller and then terminates, the application executable is not in use. The uninstaller might also check for other instances running, and present a list to the user, offering the user to close down each instance in a controlled way, or optionally killing those processes forcefully. I am regularly handling several uninstallers that do that.
Now that I look it up, I see that with the delete/rename on reboot, the old file name is valid until reboot time. But at that time it certainly disappears. I mixed it up with another OS I worked with, where you could delete the name of a file to make it temporary: on Close(), a file that has no name is deleted. (So you could make a temporary file by creating it with an empty name.)
So, with the Windows delete/rename-at-reboot mechanism, there is meaning to the question of whether you want to reboot now: With some uninstallers, some files may not disappear until reboot. But again: Scores of uninstallers manage to remove all relevant files, including the uninstaller itself and the directory where it was residing, without a reboot. They are using other mechanisms. It happens, so it must be possible.
Eddy Vluggen wrote: That's two different things; the progress not being updated by the installer, or Windows not painting because the main-process is busy. It is obviously possible to set your screen update thread to a very low priority and the worker threads doing the deletion and cleanup to a very high priority. That is a case of "You Asked For It, You Got It". But I wouldn't say that is a recommended way to make your uninstaller, or any program with a progress bar. You give the display update process a high enough priority that it is not significantly delayed - certainly not for minutes, as the original poster experienced - by the background worker process.
Eddy Vluggen wrote: Usually, it doesn't [know the files to delete]. In that case, I would think it could run into problems deleting them. Sure, at entry to "int main(int argc, char** argv)" it doesn't know, but it should create a list of files to be deleted quite early in the process.
Eddy Vluggen wrote: That's exactly what the Windows progressbar shows when copying files and where all those people complain about it moving from 10s to 30m to 15s. First: The jumping up and down continues throughout, it doesn't stabilize as the uninstaller, or whatever, gains experience with how fast the disk and other system components are (although my gut feeling is that it was much worse in earlier versions of Windows). Second: While there may be some fluctuations of the estimate, nothing can justify orders of magnitude jumps up and down! We frequently see estimates jumping up or down by a factor of ten or more, indicating a completely crazy estimate of remaining work.
Eddy Vluggen wrote: Given how a FS works, the file may be "all over the place". The estimate may jump all over the place, from 10 sec to 30 minutes to 15 sec. The files' disc locations don't influence that very much (more so on disks thirty years ago, but very little with modern disks).
Eddy Vluggen wrote: Lots of small files take more time than a single file of the same size. So, as I wrote, there is a time cost per file delete. Lots of small files will each cost the time of a file delete. When the uninstaller digs up the directory entries to create the file lists, it should also be able to see the actual size of each file: Releasing the space for a gigabyte file takes more time than releasing 2 kbytes of file space. This is particularly true on file systems allocating on a page basis (such as DOS and "classical" *nix file systems); on NTFS (and other file systems with extent allocation), it is more a function of the number of extents than of disk pages. On a fully defragmented disk (every file has only a single fragment) there will be far less difference in time between deleting a 2k and a 1G file, although there still is some. But if the last three files of 1 Gbyte each took half a second to delete, the uninstaller may assume that number four will take roughly as much time. If the last twenty kilobyte-sized files were each deleted in 10 ms, it may assume that the next fifty kilobyte-sized files will each require a similar amount of time.
It seems like some programmers handling progress bars are not thinking at all! E.g. if you are given a "pause" option, and use it, the clock is not stopped when the operation is paused: If half of the files/megabytes were processed in 10 seconds, and then you pause everything for 30 seconds before continuing with the next file, the time estimate is calculated from the 30.01 seconds it took to handle the first file after the resume, completely ignoring that 30 of 30.01 seconds were not used for processing the file.
It is not easy to get a perfect pre-calculation that will be correct within a margin of, say, 5%. But getting within a factor of two should be easy, within 50% definitely also realistic - at least when the operation is 10% on its way.
I made myself an incremental, deduplicating backup system, that handles all my disks of various technologies and interfaces. In the backup management data, I keep information for each individual disk how much time it typically takes to open a file (in various modes), how long it typically takes to read 10 kbyte, 100 kbyte, 1 Mbyte, 10 Mbyte, how long a create or delete takes. After each backup is performed, these figures are adjusted. They might be somewhat off from the "historical" value, e.g. if I usually plug in the USB3 disk in an USB3 connection, but then one day plug it into an USB2 connection. It is detected quite rapidly during the backup that one disk is slower than the management data indicates, so the estimated time for completion is adjusted. The managed value is adjusted somewhat, being set to 75% old value + 25% new value (percentage was arbitrarily chosen) so that it remains fairly stable but will slowly adapt to e.g. new interfaces.
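The 75% old + 25% new blending described above is an exponential moving average; a short sketch (the weight is the arbitrarily chosen percentage mentioned, and the names are made up):

```csharp
using System;

class TimingStore
{
    // Blend a stored per-disk timing figure with a new observation.
    // alpha = 0.25 reproduces the 75% old + 25% new rule.
    public static double Blend(double stored, double observed, double alpha = 0.25)
        => (1 - alpha) * stored + alpha * observed;

    static void Main()
    {
        // A disk that historically read 10 MB in 1.0 s, observed at 2.0 s
        // today (say, plugged into a USB2 port), nudges toward the new value:
        Console.WriteLine(Blend(1.0, 2.0)); // 1.25
    }
}
```

Applied after each run, repeated observations converge on the new speed while a single outlier barely moves the stored figure.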
An uninstaller, being run once or maybe a couple of times, with no place to save former experience, cannot easily save historical data the way I do in my backup system. The application could, though: I have been using several applications that can tune themselves by running timing tests on the speed of disks, display cards and other components. These data are saved in a configuration file (or in the Registry) for use by the application itself, but they could also be used by any other function that displays a progress bar, such as an uninstaller.
Getting a good progress indicator, a lot better than what is typical today, is actually not that difficult. Getting a perfect one may be difficult, but we are not asking for that. Only for something that is reasonably close to real world operations.
|
|
|
|
|
Member 7989122 wrote: If the application starts the uninstaller and then terminates, the application executable is not in use. That's what we usually do, since you can't overwrite the executable while it is in use.
Member 7989122 wrote: In that case, I would think it could run into problems deleting them. No, since it usually keeps those files in a folder it knows. Updates aren't downloaded to random places.
Member 7989122 wrote: The estimate may jump all over the place, from 10 sec to 30 minutes to 15 sec. The files' disc locations don't influence that very much (more so on disks thirty years ago, but very little with modern disks). It was one example of things influencing speed. The amount of data a sector holds also plays a role; in my own Dokan-based FileSystem, it is visible that even the number of folders plays a role, since for each "Get Directory contents" command Windows will look for some files it expects there. Also interfering with the correctness of the total-time calculation here is your virus scanner.
So yes, the estimate is going to jump a lot. It is what it is, an estimate, where people act as if it is a promise.
Member 7989122 wrote: Getting a good progress indicator, a lot better than what is typical today, is actually not that difficult Go ahead and write an article on how.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
OriginalGriff wrote: Or has Corel employed the guy who wrote the original Windows "File Copy" progress indicator?
Corel has left such a bad taste in my mouth - ever since the Windows 3.1 days, when their software would take over your system and they'd replace user controls with their own version because they were so arrogant they honestly believed they were doing a better job of it than Microsoft. If you tried to uninstall their software, odds were pretty good you were left with an unusable system and the only solution was to pave over the entire thing...
I have no idea if they're still true to their roots to this day. I don't ever intend to go out of my way to find out.
|
|
|
|
|
Back in the days of dinosaurs, I bought Paint Shop Pro 5. This was before Corel bought it out. I still use it every day. Later I tried one of the Corel versions, but hated it, so I dumped it.
Rarely I want to do some fancy image editing, but then I use an old version of Photoshop (version CS2).
|
|
|
|
|
I stuck with the JASC version up to V9 (the last pre-Corel version) for years, until its problems with Aero on Win7 finally moved me to X6 in 2014. (It was just a hassle to try and do screenshots: you had to take the shot, run PSP which disabled Aero, save / edit the image, close PSP which re-enabled Aero, take the next shot, ...)
I am currently using PSP 2020 due to some excellent special offers over the years! It's pretty good, but not as "focussed" as the JASC version was - Corel is damn good at adding bugs and bloating, damn them.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Hi All,
I was wondering if others who had seen it shared the opinion that there were some ideas that were not developed or fully used. Plus, X-Wings are really well sealed against moisture ingress (mind you, they are spaceships...)
Glenn
|
|
|
|
|