|
Something not being in a collection isn't by definition a bug. It might be in your case: if misses should approach zero for your collection, then exceptions would be the way to go.
This has more to do with the data your collection serves and the qualities of that data. If you have a unique set for your domain, then the collection will probably rarely run into an access outside of that domain. If the domain is generic, then accesses outside the registered or active parts of the domain will probably be much more common.
|
|
|
|
|
Well, perhaps "bug" was over-stating my opinion a bit, perhaps "poorly-developed" is a better term, and as I stated in my edit -- searching for a parameter by name (and probably members of other collections as well) is a code smell. Best to avoid it. In many cases, the developer will know what's in a collection. This is true of the program I was working on -- I know the parameter is in there, I even know that it's at index 0, so why search for it by name?
BUT, I write a lot of library/framework code that others might use (yeah, as if), so I need to keep "lesser developers" in mind.
As luck would have it, I was just re-reading a "List of Principles" of Programming Languages and saw this:
"Localized Cost: Users should only pay for what they use; avoid distributed costs." -- MacLennan, 1987
I would apply this to the current discussion as: "Programs which call GetParameter only when the parameter is known to exist should not have to pay for the double-checking that would benefit only other programs."
Is my code still violating that Principle? Yes. Because my GetParameter function tries to be provider-neutral, it has protection against calls that return NULL (e.g. Oracle) -- a program that uses SQL Server will still be paying that cost with no benefit to itself. However, the test for NULL is much cheaper than the double search.
|
|
|
|
|
I see where you're coming from given that example. I tend to enjoy writing more dev-oriented tools as well.
Abstractions such as labels/names don't bother me though. In my mind there's no difference between representing a key as an integer or a string if you're free to make the choice; they're both just abstractions in this context. The only requirement for a key is a unique set of bits, however that ends up being implemented.
I recently watched a great Computerphile[^] video touching on the subject of abstraction using assemblers. Apparently John von Neumann thought assemblers were harmful because they took more processing time: you could simply address the program manually, so the assembler's double pass was a waste of time (double because it must first scan for forward-jumping labels). Got me thinking about the more general debate when you have multiple techniques that accomplish effectively the same thing, and how the advantages and disadvantages can be relative to the developer or the architecture.
|
|
|
|
|
If you are concerned about finding the maximal intersection of readability, performance and reliability, ALWAYS use the bool TryGet( key, out value ) pattern (TryGetValue on the standard dictionaries) and avoid bool Contains( key ) like the plague.
Moreover, after a TryGet returns false, do not use the Add method, since it checks AGAIN for the existence of the item. Instead, use the set indexer to assign the value to the key.
if (!items.TryGetValue( key, out var value ))
{
    value = GetNewValueLogic( key );
    items[key] = value;
}
Extremely readable, and it provides the best possible performance in all cases.
|
|
|
|
|
A good suggestion; but the IDataParameterCollection interface doesn't provide a TryGet method. As far as I can see, none of the concrete implementations do either.
Personally, I think ConcurrentDictionary[^] provides a better API than the standard Dictionary:
T value = items.GetOrAdd(key, GetNewValueLogic);
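For anyone unfamiliar with it, here is a minimal sketch of that call (the key and value types, and the GetNewValueLogic factory, are placeholders for this example, not from the original code):

```csharp
using System;
using System.Collections.Concurrent;

class GetOrAddDemo
{
    // Hypothetical value factory; only invoked when the key is absent.
    static string GetNewValueLogic(int key) => "value-" + key;

    static void Main()
    {
        var items = new ConcurrentDictionary<int, string>();

        // First call: key 1 is absent, so the factory runs and its result is stored.
        string first = items.GetOrAdd(1, GetNewValueLogic);

        // Second call: key 1 is present, so the factory is skipped entirely.
        string second = items.GetOrAdd(1, k => "never produced");

        Console.WriteLine(first);   // value-1
        Console.WriteLine(second);  // value-1
    }
}
```

One caveat worth knowing: with the delegate overload, GetOrAdd may invoke the factory more than once under contention; only one result is stored, but the factory itself is not guaranteed to run exactly once.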
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Apologies. That's not an API that I was familiar with. I'm shocked that a TryGet isn't provided!
|
|
|
|
|
I had an evil idea so I ran this script on my local SQL server database:
CREATE PROCEDURE TESTPROC
AS BEGIN
SELECT 1
EXEC TESTPROC
END
When I executed the stored procedure, it stopped after 32 levels of nesting. My evil plan failed and I learnt something new today.
"It is easy to decipher extraterrestrial signals after deciphering Javascript and VB6 themselves.", ISanti[ ^]
|
|
|
|
|
If you change to a recursive CTE, you can play around with OPTION (MAXRECURSION 100).
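For example (a throwaway counting CTE, purely for illustration): the default cap for a recursive CTE is 100 levels, and the MAXRECURSION hint raises, lowers, or removes it.

```sql
WITH Numbers AS
(
    SELECT 1 AS n
    UNION ALL
    SELECT n + 1 FROM Numbers WHERE n < 500
)
SELECT MAX(n) FROM Numbers
OPTION (MAXRECURSION 1000);  -- default cap is 100; 0 means no limit at all
```

With the hint left at the default, this query would fail with "the maximum recursion 100 has been exhausted" before reaching 500.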
|
|
|
|
|
Each time a stored procedure calls another stored procedure or executes managed code by referencing a common language runtime (CLR) routine, type, or aggregate, the nesting level is incremented. When the maximum of 32 is exceeded, the transaction is terminated.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
I will take over the world before you do.
Just letting you know!
|
|
|
|
|
No it won't, unless it takes over 32 people at a time.
"It is easy to decipher extraterrestrial signals after deciphering Javascript and VB6 themselves.", ISanti[ ^]
|
|
|
|
|
It will take over thousands upon thousands!
And it's only 10 years away around the corner!!!!
|
|
|
|
|
I saw something similar happen with a trigger making a change that was then running the same trigger again.
The updates were extremely slow and I discovered that there was, like you have discovered, a point at which SQL Server gave up - when the recursion was 32 deep.
You would hope that SQL Server would just catch this sort of thing at compile time.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
I resorted to a GOTO in a SQL loop that created a new transaction per batch, to work around a repetitive action on a data set of unknown size. It was dirty, but it worked. Glad I didn't try recursion, as that would definitely have ended in tears by the sound of things!
You can read about the problem here: Developing Automated Data Purge Solution, but it basically looks like this;
LoopStart:
IF (@Count - @RowCountTotal) > @BatchSize
BEGIN
    BEGIN TRANSACTION
    DELETE TOP (@BatchSize) FROM Comment WHERE CommentTime < @StartDate
    SET @RowCountTotal = @RowCountTotal + @@ROWCOUNT
    COMMIT TRANSACTION
    GOTO LoopStart
END
ELSE
IF (@Count - @RowCountTotal) > 0
BEGIN
    BEGIN TRANSACTION
    DELETE TOP (@Count - @RowCountTotal) FROM Comment WHERE CommentTime < @StartDate
    SET @RowCountTotal = @RowCountTotal + @@ROWCOUNT
    COMMIT TRANSACTION
END
|
|
|
|
|
IMO, recursion is inherently dangerous and should be avoided whenever possible. When you must use it, you must also take steps to ensure that your recursive routine converges and unwinds its stack: every recursive routine must incorporate a test that stops the recursion so that the depth is self-limiting.
This is true regardless of the programming language in which the recursion occurs. Whenever I encounter recursion, my first thought is to look for an algorithm that doesn't rely on it. More often than not, I find one.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
I have conclusive proof of something that I have long suspected: file times for files that fall outside the current DST/Standard Time period are reported off by one hour by the Windows command prompt (CMD.exe). I'll be publishing the proof later today.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
|
Thanks for the article citation. Nevertheless, since Windows has the transition date tables mentioned in the article, and NTFS stores UTC times on disk, I see no reason that it shouldn't be fixed, even if doing so required a new switch. After all, said switch could be set in the DIRCMD environment variable.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Like the Unix linefeed mode support in Notepad that will be introduced after 30 years with the next Windows 10 version?
That will be enabled by default but can be disabled with a registry key.
|
|
|
|
|
Keep in mind that MS just finally "discovered" Linux.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Hmmmm,
Looking forward to you discovering File System Tunneling[^]. Report back and let us know what you've discovered.
Start your research here[^].
Best Wishes,
-David Delaune
|
|
|
|
|
Quarks, eh?
I didn't respond in the thread yesterday because I was using my phone to access my email, since the fiber cable to my house was down, due to a break in the last segment of the line. However, I managed to follow the link and read both articles, which were most intriguing.
Eventually, I'll probably delve into it more. Meanwhile, I've run some tests in assorted scripting languages, with mixed results. So far, I've tested the stat() function, as implemented in Perl and PHP. Next come JavaScript, in the Node CLI, and VBScript, and, finally, PowerShell, which I anticipate will mirror the C# program that put the nail in the coffin. Now that I've learned all this, I think the resulting article will be a third installment in the Time Zone Lab series that I started a couple of years ago.
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
David A. Gray wrote: So far, I've tested the stat() function, as implemented in Perl and PHP. Next come JavaScript, in the Node CLI, and VBScript, and, finally, PowerShell, which I anticipate will mirror the C# program that put the nail in the coffin. Now that I've learned all this, I think the resulting article will be a third installment in the Time Zone Lab series that I started a couple of years ago.
I expect that you will get the same results from Perl, PHP, JavaScript and VBScript, because all of those high-level languages call into the GetFileAttributesEx Windows API to get file time stamps.
The C lib function _stat() and all 32-bit variants forward to GetFileAttributes.
The C lib function _stat64() and all 64-bit variants forward to GetFileAttributesEx.
GetFileAttributes simply forwards to GetFileAttributesEx.
Finally... GetFileAttributesEx calls into the NtQueryAttributesFile function[^] via a SYSCALL, and then in kernel mode the request walks its way through the filter drivers down to the filesystem driver.
You are basically testing NtQueryAttributesFile function multiple times from the Thirteenth Floor.
Those high level languages may format the date differently... but they are all getting the same timestamp from the filesystem layer.
Best Wishes,
-David Delaune
|
|
|
|
|
The problem with that theory, which has already borne fruit, is that all of those functions retrieve a UTC time stamp. The issue is what happens on the 12th floor, when the runtime that got the ball rolling converts the returned time stamp to local time. The outcome depends on whether it takes into account the actual date and the DST transition dates on either side of it. Some frameworks take the same dumb approach implemented by CMD.exe, which treats all times as if they were in the current transition (DST vs. Standard Time). So far, I've found two other frameworks (Perl and VBScript) that take that naive approach.
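The two conversion strategies can be sketched in a few lines of C# (the date and the use of the local zone are arbitrary choices for illustration; this is not code from either article): TimeZoneInfo.ConvertTimeFromUtc applies the adjustment rule in effect on the timestamp's own date, while the naive approach applies today's offset to every timestamp.

```csharp
using System;

class DstConversionDemo
{
    static void Main()
    {
        var zone = TimeZoneInfo.Local;

        // A UTC file time from mid-January (arbitrary example date).
        var utc = new DateTime(2018, 1, 15, 12, 0, 0, DateTimeKind.Utc);

        // Correct conversion: uses the DST rule in effect on 2018-01-15.
        DateTime correct = TimeZoneInfo.ConvertTimeFromUtc(utc, zone);

        // Naive conversion: applies *today's* offset to the old timestamp,
        // which is effectively what CMD.exe-style displays do.
        DateTime naive = utc + zone.GetUtcOffset(DateTime.UtcNow);

        // When "today" and the file's date sit on opposite sides of a DST
        // transition, the two results differ by (typically) one hour.
        Console.WriteLine(correct == naive
            ? "same side of the transition"
            : "off by " + (naive - correct).TotalHours + " hour(s)");
    }
}
```

Run in January and again in July in a DST-observing zone, the naive result flips by an hour while the correct one stays put.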
David A. Gray
Delivering Solutions for the Ages, One Problem at a Time
Interpreting the Fundamental Principle of Tabular Reporting
|
|
|
|
|
Hi,
There used to be some amazing articles on the internet back in the 1990s that had charts listing all the differences between programming languages in how they handle date/time conversions. I briefly searched but could not find those old articles. Even Dr. Dobb's[^] has retired.
I think it's a great idea to write an article about those differences. Looking forward to reading what you come up with.
Best Wishes,
-David Delaune
|
|
|
|