I created a C# application on the .NET 1.1 framework and tested it on a system that has the .NET 2.0 framework installed. The application opens and mostly works, but in some places it throws the message "Object reference not set to an instance of an object." and then continues to run. These messages never appeared on my own system, which has only the .NET 1.1 framework. But when I install the .NET 1.1 framework on the test machine, the application picks up the 1.1 runtime and works fine for the previous test case. Why do some .NET 2.0 classes not work correctly with an assembly built against .NET 1.1? Is .NET 2.0 backward compatible with .NET 1.1?
I guess you've found that the answer is 'no'. In fact, you can mark your app to require the exact version that was used to build it. Overall I wouldn't expect a problem, so you should try to work out the issue, just in case it needs resolving either way.
Christian Graus - Microsoft MVP - C++ Metal Musings - Rex and my new metal blog
In general, this is how we work: we build the application on .NET 1.1, and at some later point users may install .NET 2.0, .NET 3.0, or some future release. We cannot go back through every piece of code again and again and add a workaround for each framework version. The .NET framework itself should take care of this issue, but it is not doing so.
Can anybody please comment on the statement ".NET 2.0 is not backward compatible with .NET 1.1"?
Well, you can either track down the issue and understand it, or you can trawl online forums, hoping someone will make a statement that makes you feel better about the issue you are having. I've never had issues moving from 1.1 to 2.0, and I've moved at least one large project over.
Christian Graus - Microsoft MVP - C++ Metal Musings - Rex and my new metal blog
One more thing I observed: the Encoding class (the ASCIIEncoding.ASCII.GetBytes(msg) method) is giving different results. When I create a key using the application running on .NET 2.0, and then create a key with .NET 1.1, the two keys are different. No error message appears here, but the keys do not match.
This makes me feel that .NET 2.0 is not backward compatible with .NET 1.1.
If anybody has faced the same problem, please throw some light on this...
The only reason backward compatibility matters here is that you want to use .NET 2.0 features or improvements. In that case, you will have to correct for any incompatibility. You cannot prevent Microsoft from correcting bugs and potentially unsafe code.
Microsoft designed the .NET framework to avoid compatibility problems as far as possible. You can specify which DLL versions must be used in an application. I recommend reading Jeffrey Richter's book Applied Microsoft .NET Framework Programming (or something like that). In this book you will find four chapters on deployment, assemblies, and versioning. This addresses exactly your topic (which is very complex).
It cannot be 100% unless either no existing class is modified or each framework version also contains all prior versions. Both are not really an option.
I don't remember exactly where this happened, but I had a case where .NET 1.1 had a small bug and I had to build a workaround into my code to get around it. After moving to .NET 2.0 I realized that this bug had been resolved (which is a good thing), but as a result my workaround (logically) began throwing errors.
Sure backwards compatibility is a great thing in theory - in practice we call it "DLL Hell" and it is a blessing Microsoft didn't make that mistake with the .NET Framework. As far as I remember you can set your application to not even attempt to run on the 2.0 Framework (I do wish this would have been the default setting, but we can't have it all).
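To make that concrete: the runtime an application binds to can be pinned in its .config file with a supportedRuntime element. This is a minimal sketch, assuming the app's config file sits next to the executable; "v1.1.4322" is the version string of the .NET 1.1 runtime.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <startup>
    <!-- Ask for the 1.1 runtime; if it is not installed,
         the loader will refuse to fall forward silently. -->
    <supportedRuntime version="v1.1.4322" />
  </startup>
</configuration>
```

With this in place, the behavior the original poster saw (the app silently picking up 2.0 when 1.1 is absent) no longer happens without the user being told.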
Long time since I posted anything here...
Well, I've written a small app that reads a file from the disk into objects and stores them in a list. Then, what I wanted was to edit each of these objects individually and save them back to the disk. Yeah, just a stupid editor.
So, I discover a nice thing about DataGridView: I can use it together with BindingSource to access items in a List<>. Since I already have the list, I just do that:
_list = new Records("C:\\somefile.list");
BindingSource bs = new BindingSource(_list, "");
this.dgEntries.DataSource = bs; // Set the data source for the DataGridView
Records is a class that implements the IList interface. So far, so good.
The thing is, I go ahead and delete a few rows from the grid and expect to see the result in the list. But unfortunately, the BindingSource seems unable to propagate the change correctly. No change inflicted upon the DataGridView is propagated to the list. Actually, I can't even sort the list.
Does anyone have any insight on the subject? Anything that would shed light on how to correctly use the BindingSource would help me a lot.
Have a look at one of my articles[^]. It addresses the issue with the missing sorting and filtering capabilities. For this I implemented a new class named BindingListView (which is contained in the project). It needs a type and an IList to work, and it can be bound to the DataSource property of the DataGridView. I haven't actually tested whether change operations work, but you have a good chance that they do, because I've implemented the complete IBindingListView interface, which is the interface the DataGridView internally works on no matter what you bind to its DataSource property.
Be sure to download the DataGridView version and not the DataGrid version, and please give me some feedback on whether it works or not.
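For a lighter-weight alternative (without the filtering from the article above): a plain List&lt;T&gt; raises no change notifications, which is why edits in the grid and the underlying list can drift apart. The framework's own BindingList&lt;T&gt; (new in .NET 2.0) implements IBindingList and raises ListChanged, so additions and deletions made in the grid reach the list. A minimal sketch, assuming a simple Record class and a form with a DataGridView named dgEntries:

```csharp
using System.ComponentModel;
using System.Windows.Forms;

public class Record
{
    private string _name;
    public string Name
    {
        get { return _name; }
        set { _name = value; }
    }
}

public partial class EditorForm : Form
{
    // BindingList<T> notifies the BindingSource of adds/removes,
    // unlike a plain List<T>.
    private BindingList<Record> _records = new BindingList<Record>();

    private void BindGrid()
    {
        BindingSource bs = new BindingSource();
        bs.DataSource = _records;
        dgEntries.DataSource = bs;
        // Rows added or deleted in the grid now update _records directly.
    }
}
```

Note that BindingList&lt;T&gt; alone still does not support sorting; for that you need something like the BindingListView class from the article.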
Dispose is mainly provided to allow the prompt release of objects available in finite, small quantities, like window handles or DB connections. The GC will eventually free the memory the object used, but it isn't smart enough to realize that some objects need to be disposed ASAP to free shared resources.
Rules of thumb should not be taken for the whole hand.
Is it mandatory to call the Dispose() method of an object that implements it, or will the reserved memory be freed by the GC at a later time anyhow? If so, why should I use IDisposable at all?
It's not really the memory that is the concern, but unmanaged resources. If you don't call Dispose, the object has to be finalized to free the resources, and that is not guaranteed to take place at any specific time, or actually at all. If there are too many objects that need finalizing when the application ends, it won't have time to finalize them all, so it will just kill them off anyway.
Also, calling Dispose makes collecting the object more efficient, as it then can be garbage collected directly. If you don't call Dispose, the object will go through one garbage collection, be placed in the queue to be finalized, be finalized, then go through a second garbage collection before it's finally freed.
If I use GC.SuppressFinalize(this) like in the example, and forget to "manually" dispose a used managed component in the Dispose-method, do I then have a memory leak?
You have a potential resource leak. If this unmanaged resource happens to be unmanaged memory, you have a potential memory leak.
If the component that you did not dispose uses any unmanaged resources, they are not guaranteed to be freed. Sometimes the garbage collector will manage to free it in time anyway before it becomes a problem, sometimes not.
If Dispose() is not called, the Finalizer will do the cleanup.
Multiple calls to Dispose() do no harm.
There is just one method to maintain for cleanup: Dispose(bool disposing).
And this method can be extended by derived classes.
As Guffa already pointed out, the interface IDisposable should only be used if the class uses unmanaged resources.
It isn't "mandatory" that you call Dispose in the sense that nothing will force you to include a call to it in your code. The use of IDisposable gives the person using your class an indication that there are unmanaged resources being used and that Dispose should be called when the object is no longer needed. It also allows the object to be wrapped in a using block to help guarantee that Dispose is called.
The pattern is designed for any unmanaged resource, which may or may not include memory. If you forget to call Dispose, the GC.SuppressFinalize method also won't be called. If you forget to dispose some unmanaged resources in your Dispose method, then you will have a resource leak, which can potentially be different from a memory leak. (Usually they end up being the same, but not always.)
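The points above (one Dispose(bool) to maintain, multiple calls doing no harm, the finalizer as backup, extensibility for derived classes) are exactly the standard Dispose pattern. A minimal sketch; ResourceHolder and the IntPtr field are placeholders standing in for a class that owns some real unmanaged handle:

```csharp
using System;

public class ResourceHolder : IDisposable
{
    private IntPtr _handle;   // stand-in for an unmanaged resource
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        // The finalizer has nothing left to do, so skip it;
        // this also saves the extra GC pass Guffa described.
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed)
            return;           // multiple calls do no harm

        if (disposing)
        {
            // Free managed resources here: call Dispose on owned
            // IDisposable members. Only safe when called from Dispose(),
            // not from the finalizer, where they may already be finalized.
        }

        // Free unmanaged resources here, in both code paths.
        _handle = IntPtr.Zero;
        _disposed = true;
    }

    ~ResourceHolder()
    {
        Dispose(false);       // backup cleanup if Dispose was never called
    }
}
```

A derived class extends cleanup by overriding Dispose(bool) and calling base.Dispose(disposing) at the end of its override.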
In just two days, tomorrow will be yesterday.
I'm writing a VB.NET program that is going to have a huge number of instances of a particular object, and I want to figure out how much memory each instance of this object is going to use. It has no subs or functions, nor any subclasses, just variables and arrays. I know how to calculate the memory usage of the variables and arrays themselves, but I was wondering how to calculate the memory usage of the behind-the-scenes elements.
I'm guessing that each variable or array has a 4-byte pointer behind the scenes pointing to it, and that all these pointers are in an array, which would add another 4 bytes. Could somebody tell me if this is correct, and if not, how should I calculate it?
C# has a sizeof[^] operator that should give you what you want. There is also the Marshal.SizeOf method, but it calculates the size of an object after it has been marshaled to unmanaged code. This size can be different from the managed-code version, which can also differ from a hand-calculated size.
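A small sketch of the difference, in C# (Point3D is a made-up struct for illustration). Note that sizeof works directly only on primitive and unmanaged types, while Marshal.SizeOf reports the marshaled layout, which alignment and packing rules can make larger than a by-hand sum of the fields:

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct Point3D
{
    public double X;
    public double Y;
    public double Z;
}

public class SizeDemo
{
    public static void Main()
    {
        // sizeof on a primitive type: always 4 for int.
        Console.WriteLine(sizeof(int));

        // Marshal.SizeOf: size of the unmanaged (marshaled) layout,
        // typically 24 here (three 8-byte doubles, no padding needed).
        Console.WriteLine(Marshal.SizeOf(typeof(Point3D)));
    }
}
```

Neither number is guaranteed to equal the managed heap footprint of a class instance, which also carries per-object overhead (object header and method-table pointer) that these APIs don't report.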
Hi everybody,
Can I customize the paper size and print a document directly from code? My client has some pre-printed papers, and he needs to print data at specific places only. For example, in "Name : Code Project", "Name :" is already printed on the paper, and the application has to print "Code Project" at exactly the specified place.
How can we do it? Please advise me. Thank you.
How should I do it both in a Web and a Windows application? And in a web application, should it be server side or client side? If you can, please give me the complete idea, or even a code snippet if possible. Thanks in advance.
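For the Windows Forms side, a hedged sketch using PrintDocument: you set a custom PaperSize and draw each value at fixed coordinates so it lands in the pre-printed blank. The paper dimensions, the font, and the coordinates (150, 100) here are assumptions; they must be measured against the client's actual pre-printed form:

```csharp
using System.Drawing;
using System.Drawing.Printing;

public class FormPrinter
{
    public static void PrintForm()
    {
        PrintDocument doc = new PrintDocument();

        // PaperSize takes hundredths of an inch: 850 x 1100 = 8.5" x 11".
        doc.DefaultPageSettings.PaperSize =
            new PaperSize("CustomForm", 850, 1100);

        doc.PrintPage += new PrintPageEventHandler(OnPrintPage);
        doc.Print();   // goes straight to the default printer, no dialog
    }

    private static void OnPrintPage(object sender, PrintPageEventArgs e)
    {
        using (Font font = new Font("Arial", 10))
        {
            // Print only the value; "Name :" is already on the paper.
            // (150, 100) is a hypothetical position in hundredths of an inch.
            e.Graphics.DrawString("Code Project", font, Brushes.Black, 150, 100);
        }
    }
}
```

For the web case the situation is different: code like this running server side prints on the server's printer, not the user's. Client-side printing from a browser normally goes through the browser's own print mechanism (window.print()), where you have much less control over exact positioning.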