|
It's just past noon and I'm sitting here drinking a cup of coffee; result!
|
|
|
|
|
Nice, my addiction finally paid off!
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
|
|
|
|
|
Immortality, here I come!
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I found this set of Microsoft Visual C++ 6.0 MFC reference library books on eBay. The asking price is $90.
I am only interested in the MFC Library Reference Volumes 1 & 2, so I was not willing to pay that price. Not ready to give up, I searched and found the same set on ThriftBooks for only $8.00. I ordered it last night.
Recently I have become interested in the origins of the MFC library and have been trying to pick up something legacy. I am amazed to see that MFC still works well and continues to evolve today.
diligent hands rule....
|
|
|
|
|
|
Thanks for this thread, I had almost forgotten about this.
I got my answer from a legacy book on MFC, with confirmation from this community.
diligent hands rule....
|
|
|
|
|
I had VC++ 6 and the reference disks, but I'm not sure if I still do.
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
|
|
|
|
|
The chapter on resource files is really enlightening to me: I did not find that topic covered in other books.
Maybe it is too simple a topic...
diligent hands rule....
|
|
|
|
|
I had it all on an XP VM, but when I upgraded to Win10 I lost it. I don't know if I still have the disks or not; if I do, they're buried deep in an old drawer somewhere.
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
|
|
|
|
|
There were a lot of tricks when the entire Windows system ran in 512K of memory. Resources were designed to load quickly, do something for the GUI, and allow for quick memory reclamation. E.g. the resources for a popup menu might only be retained in memory until the user selected an item or cancelled the menu.
Throw the resources into a DLL separate from the UI exe, and that DLL is the only thing that needs to be translated.
The exe loads the right DLL based on the locale.
rc, the resource compiler, was always a separate make step before linking.
The slowest part of any GUI is usually the user.
MFC is just a convenience layer on top of the C-based OS layers.
I am in agreement with another poster (maybe you?) that the backwards compatibility of Windows has been incredible. I still use a utility on Win10 that predates scroll wheels on mice! It barfs on double-byte encodings, but I use UTF-8 for most items, so it is fine.
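For anyone who never saw one, a resource-only DLL starts from a resource script like the sketch below. The string IDs, strings, and DLL names are invented for illustration, not from any particular project:

```rc
// strings.rc -- illustrative resource script.
// Compiled by rc.exe into a .res, then linked into a resource-only
// DLL: one DLL per locale, e.g. lang_en.dll, lang_no.dll, each
// built from a translated copy of this file.
STRINGTABLE
BEGIN
    1001 "File"
    1002 "Open..."
END
```

At run time the exe picks the DLL matching the current locale and pulls strings out of it with LoadString; loading it via LoadLibraryEx with LOAD_LIBRARY_AS_DATAFILE keeps it as pure data, so no code from the DLL ever executes.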
|
|
|
|
|
Good to know this history of resource files.
diligent hands rule....
|
|
|
|
|
Why buy the books? I remember VC6 came with the MSDN documentation, which could be installed on the computer from the DVDs. As for your "MFC evolves nowadays" comment: MFC evolves with newer Visual Studio releases, with bug fixes and new features. Why bother with such an old product?
|
|
|
|
|
Somehow I prefer the feel of paper to staring at a screen.
diligent hands rule....
|
|
|
|
|
|
Thanks for this great link! It made posting this message worthwhile.
diligent hands rule....
|
|
|
|
|
I am not sure what you mean by "evolves" because it has been in maintenance mode for quite some time.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
I can see some newer methods that extend the legacy MFC ones...
diligent hands rule....
|
|
|
|
|
I thought this might stir up a few of the 'traditionalists' on here! Rather than answer the many comments individually, I'll try to respond with this single post.
I had spent nearly 5 years developing COBOL applications before our boss told us about the Pick database that we were going to be moving to. To be honest, I couldn't see how it could possibly work and raised the same concerns that have been posted here today. There are a couple of common observations:
Including 'DB rules' in the code/UI. First of all, we already do this. I'm pretty sure most developers would define the MaxLength for a TextBox and use a Calendar Widget for date input. Secondly, validation logic should be developed as reusable code - which minimises the need for future developers to learn and re-code that logic.
Performance. You are going to have to take my word for it, but I am 100% sure that with less disk, less CPU and less memory, I can deliver better performance than could be achieved using a traditional RDBMS. Also, indexing data does not have a performance hit. And the greater the volume of data, the more confident I would be that performance would be better.
|
|
|
|
|
Your mileage will vary. Particularly regarding indexing. Needless/useless indices definitely waste resources. The application you are working on may be very different from mine. Don't generalize based on one particular type of application.
I'm not really sure what you are on about, though.
|
|
|
|
|
Hmmmm,
Many years ago I was tasked with writing a VDR[^] for the IMO[^] and the Det Norske Veritas[^] and most of the signals were coming in at around 10Hz and I was required to log them in 'real time'. Back then the IEEE defined 'real time' as 1.5 times the signal rate.
For performance reasons I chose SQLite[^] and a single BLOB[^] in the voyage data recorder.
Worked great for a few years. Then we had two incidents, "incidents" in the maritime industry usually means millions of dollars of damage or greater. No problem I thought, let's pull the black box and see what happened!
The BLOB fields did not match the C++ structs we were using for logging the signals; something was off. After a lengthy forensic analysis we realized that the office in Norway was using a different build, and the Norwegian C++ structs did not match the U.S. code. They were somehow compiling the vessel navigation software with different signal headers.
The 'BLOB' field did not give us any evidence of what was different and we wasted an enormous amount of time investigating this. Also... these millions of dollars of damage were disputed between multiple nation states. In the maritime industry these judgements are decided by arbitrators rather than an actual legal system.
[cheers]
modified 28-Aug-21 5:48am.
|
|
|
|
|
Randor wrote: I should probably delete this so... If you do, I will be glad I read it before that happens! Your stories are always interesting.
|
|
|
|
|
I have worked with an organisation that catered to the vagaries of traders (who made them lots of money). The front-end software allowed basically anything to be entered into a value field (and there were many of them); the software sorted out how to deal with the garbage that was entered. The DB had no rules or definitions: everything was a string (except dates; even they were not that stupid). So you would see 5.2m, 300k, 1.12b, 3.3mill, etc. Every time the system broke because some idjit entered new garbage, the software company made $1000s to adjust the UI.
I was tasked with generating reports from such rubbish. Your premise sucks big grey donkey balls! Am I bitter and twisted after dealing with such garbage in? Damned right I am. I designed data structures that worked, were fast and efficient, and anyone could use the data without knowing any arcane rules living in the UI.
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
I can just imagine what was entered by traders who blew up their accounts.
|
|
|
|
|
5teveH wrote: Including 'DB rules' in the code/UI. First of all, we already do this. I'm pretty sure most developers would define the MaxLength for a TextBox and use a Calendar Widget for date input. Secondly, validation logic should be developed as reusable code - which minimises the need for future developers to learn and re-code that logic. That reusable code already exists: it's called SQL, it's a standard, and SQL-92 is still supported.
5teveH wrote: Performance. You are going to have to take my word for it, but I am 100% sure that with less disk, less CPU and less memory, I can deliver better performance than could be achieved using a traditional RDBMS. I'll outpace you with a traditional RDBMS. I will take that bet for a banana (not allowed to bet more, but I owe over a trillion bananas now).
An RDBMS is not optimized for speed but for rationality. It is an abstraction layer (what lots of us pretend to write) that abstracts away physical storage. You no longer have to remember at what location a record begins, because the RDBMS handles that. The RDBMS was for years our Data Abstraction Layer (the DAL that many a company charged money for, while doing one-on-one calls).
5teveH wrote: Also, indexing data does not have a performance hit "Almost" none; but there's a nice gain on retrieving via the index. That's why indexes exist.
5teveH wrote: And the greater the volume of data, the more confident I would be that performance would be better. I'd be looking mostly at the capabilities of the DBA.
Also: we restrict some things at DB level because that's how it is done. Given a system where a desktop app and a phone app both interact with a DB, where do you put the restriction? Come on, we've been doing this for 30 years.
You don't need to use an RDBMS, nor to understand what BNF is. That's your choice.
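The "restrict at DB level" point, in SQL terms. The table and column names below are invented for illustration; the idea is that the rule lives in one place and is enforced no matter which client connects:

```sql
-- One place for each rule: desktop app, phone app, and ad-hoc
-- scripts all hit the same constraints. Names are illustrative.
CREATE TABLE trade (
    id        INTEGER PRIMARY KEY,
    amount    NUMERIC NOT NULL CHECK (amount > 0),
    trade_dt  DATE    NOT NULL,
    trader_id INTEGER NOT NULL REFERENCES trader(id)
);
```

A UI can still pre-validate for friendliness, but the DB is the backstop that no client can bypass.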
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
5teveH wrote: Also, indexing data does not have a performance hit.
This is flat out wrong. Indexing unstructured data takes a lot of CPU time and disk time. Take a look at the amount of CPU and disk time spent by full-text index systems during data insertion and updating. Updating can actually be worse, as it requires you to delete from the index as well as add to it.
Indexing fixed size fields is still expensive, but the disk IO is easily an order of magnitude less, which translates to better performance.
|
|
|
|