PIEBALDconsult wrote: So is this how the responder writes production code when it matters
Rubbish. Fix the problem, fix it now, and save the script; if it needs repeating, then spend the time to make it elegant and efficient. Knowing that it is a kludge is more important than making a one-off elegant.
Never underestimate the power of human stupidity
RAH
PIEBALDconsult wrote: Perfect practice makes perfect. Do it the right way every time.
Nope. Not in the real world.
The real world is about money. Getting it perfect every time takes time and thus costs money. In the real world it isn't possible to get it perfect and most of the time it will not have any significant impact on actual required production functionality. Thus the extra cost is lost revenue which will never be regained.
This of course isn't the same as saying that one can write poor code all the time. Rather, one must learn to recognize that one's time is in fact valuable and thus one must prioritize what one spends time on. And since, again in the real world, one cannot spend an infinite amount of time on everything, one must make compromises.
And one might specifically look at this thread as an example and note that, without knowing about the actual business of which this artifact is a part, one will not have any idea about specifics such as the size of the table, usage statistics, allowed maintenance, etc. And when one looks at all possible tables in all possible businesses, on average it won't matter.
PIEBALDconsult wrote: I see too much bad SQL being written. It should be nipped in its bud.
Myself, I see a lot of average code being written - it isn't specific to any language. Which is quite comforting, since otherwise humans wouldn't be the ones writing it. If you are seeing a lot of poor SQL, then maybe you need to start looking at a broader range of sources.
PIEBALDconsult wrote: Won't that rescan the table/index for each id?
Yes. You could load up a temp table (Emp_id, date) and then join the update to the temp table. It would lower the lookups by a 5th at the cost of increasing the code complexity.
For something that will run once, I think the clarity of the correlated sub-query overrides any benefit on time. Of course, if we are talking gigabyte tables, I take it all back.
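A minimal runnable sketch of the correlated sub-query approach being discussed, using Python with SQLite standing in for SQL Server. The table and column names (BEN, EmpId, Ben_code, Start_date) are assumed from the code later in the thread, and the sample rows are invented:

```python
import sqlite3

# In-memory stand-in for the BEN table from the thread.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE BEN (EmpId INTEGER, Ben_code INTEGER, Start_date TEXT)")
con.executemany(
    "INSERT INTO BEN VALUES (?, ?, ?)",
    [
        (1, 1, "2010-01-01"),  # the Ben_code=1 row carries the date to copy
        (1, 2, "1900-01-01"),
        (1, 3, "1900-01-01"),
        (2, 1, "2011-06-15"),
        (2, 2, "1900-01-01"),
    ],
)

# Correlated sub-query: for each row, look up that employee's Ben_code=1 date.
# This is the form that rescans the table once per updated row.
con.execute(
    """
    UPDATE BEN
    SET Start_date = (SELECT b2.Start_date
                      FROM BEN b2
                      WHERE b2.EmpId = BEN.EmpId AND b2.Ben_code = 1)
    WHERE Ben_code <> 1
    """
)

rows = con.execute(
    "SELECT EmpId, Ben_code, Start_date FROM BEN ORDER BY EmpId, Ben_code"
).fetchall()
for r in rows:
    print(r)
```

All non-code-1 rows end up sharing their employee's Ben_code=1 date, which is the behaviour both solutions in this thread aim for.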
But of more import is that we are talking about a beginner who got this response from an expert* -- that's the big problem in my opinion. The beginner should be shown better code because he won't know any better otherwise. The poster, or other inexperienced developers who happen by, may think that the provided code is good for all situations.
* Anyone posting responses here assumes the role of expert with all the responsibilities thereof.
Michael Potter wrote: increasing the code complexity
I don't think my solution is any more complex than the other solution.
Michael Potter wrote: lower the lookups by a 5th
Correct me if I'm wrong (I could very well be), but my expectation of the first solution is n+1 lookups, whereas mine is 2 lookups. Granting that a modern database like SQL Server should work smarter than I would.
PIEBALDconsult wrote: may think that the provided code is good for all situations.
Very possible. But that is a different discussion, and one that applies to many things in development.
And a code snippet will not teach that.
Yes, but it's very important on a site like this. And in the real world, too, not just development.
If you're going to teach a man to fish, do it right. Sure, in a pinch you can use a shotgun, but you shouldn't be teaching newbies to fish with a shotgun.
PIEBALDconsult wrote: If you're going to teach a man to fish, do it right. Sure, in a pinch you can use a shotgun, but you shouldn't be teaching newbies to fish with a shotgun.
Now that really depends upon how hungry they are!
But you have to make it clear that this is for special cases only, not for every day.
PIEBALDconsult wrote: If you're going to teach a man to fish, do it right
The point here however is not how to fish.
Rather, it is how to use a spinner when trolling for largemouth bass at dawn around slow-moving water amongst rocks. Which is something that someone might need to know. But not without the qualification. And it is not teaching them how to fish.
Here's how I'd do it.
UPDATE A
SET Start_date=B.Start_date
FROM BEN A
INNER JOIN
(
SELECT EmpId
, Start_date
FROM BEN
WHERE Ben_code=1
) B
ON A.EmpId=B.EmpId
WHERE A.Ben_code<>1 -- optional
Also get in the habit of using transactions when doing such things.
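To illustrate the transaction habit, here is a minimal sketch (Python with SQLite standing in for SQL Server; the BEN table and its columns are assumed from the snippet above, and the sanity check is just one possible guard): run the update inside an explicit transaction, check the affected row count, and roll back if it is not what you expected.

```python
import sqlite3

# Autocommit connection so we can manage BEGIN/COMMIT/ROLLBACK ourselves.
con = sqlite3.connect(":memory:", isolation_level=None)
con.execute("CREATE TABLE BEN (EmpId INTEGER, Ben_code INTEGER, Start_date TEXT)")
con.executemany(
    "INSERT INTO BEN VALUES (?, ?, ?)",
    [(1, 1, "2010-01-01"), (1, 2, "1900-01-01"),
     (2, 1, "2011-06-15"), (2, 2, "1900-01-01")],
)

con.execute("BEGIN")
try:
    # SQLite emulates the join-style update with a correlated sub-query;
    # the effect matches the T-SQL above.
    cur = con.execute(
        """
        UPDATE BEN
        SET Start_date = (SELECT b.Start_date FROM BEN b
                          WHERE b.EmpId = BEN.EmpId AND b.Ben_code = 1)
        WHERE Ben_code <> 1
        """
    )
    if cur.rowcount != 2:  # sanity check before committing
        raise RuntimeError(f"expected 2 rows, touched {cur.rowcount}")
    con.execute("COMMIT")
except Exception:
    con.execute("ROLLBACK")  # nothing is left half-applied
    raise

remaining = con.execute(
    "SELECT COUNT(*) FROM BEN WHERE Start_date = '1900-01-01'"
).fetchone()[0]
print(remaining)
```

If the row count check fails, the rollback restores the table exactly as it was, which is the whole point of wrapping one-off fixes like this in a transaction.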
If each employee has five Ben_Codes that each have a date that needs to be the same but can be different, then you should consider normalizing the database a bit.
People say nothing is impossible, but I do nothing every day.
Thank you all for the responses. I was able to create the SQL I needed based on all your replies. I opted to follow the example provided by PIEBALD, as I am new to writing SQL and figure I should learn the more efficient methodology; when I am more versed in SQL I can do the quick and "dirty" when I have to.
Once again the help provided has been great and I really appreciate it!
Michelle
I need to know whether drive free space amount is enough for database backup.
How can I approximately calculate SQL Server database backup size (for example in FULL recovery mode) before doing backup?
Thanks!
Our accounting program runs on a SQL Server 2005 Express database. I do not know why it has become very slow over the last few days when running reports. Is there a way to make it run faster? Can you specify the steps?
Turn it off and turn it back on again. Seriously.
If, after you have used PIEBALD's solution and checked the hard disks for space and your memory usage on the server, it is still a problem, turn on SQL Profiler and run the reports, then analyse the execution plan and determine if you have a problem with the queries.
If, after optimising your queries, you still have an issue, do some research into parameter sniffing; this is a long shot.
If you still have a problem then hire a DBA consultant to look at your database!
Never underestimate the power of human stupidity
RAH
I deleted my last question since I feel like I've made some kind of progress. However, I'm still not sure how to group all of the data together. I have five tables that hold data. What I am doing is selecting:
UserId, ProductNum, Qty
into a temporary table for each master data table. Here's an example of what each temp table looks like:
John PART-NUM-ONE 5
Jane PART-NUM-ONE 8
Jane PART-NUM-TWO 4
Jack PART-NUM-THREE 9
I have five tables structured just like that, except the Part Number and Qty will be different in each table.
How can I build a query that groups by Name, then by Part Number, with a column for each temp table (e.g. Passed, Finished, Scrapped)? I just can't seem to combine them. The Part Numbers come from a Parts table, which stores the Part Number.
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.
I came a bit late to the problem. Can you confirm that in your example there are 4 temporary tables? Is this going to be static? (Always 4 Users)
The solution appears to be using a series of Select statements with UNION.
Well, I forgot to elaborate on the end result I am going for. There is no limit on the number of users. The temp tables store every user's numbers for a particular piece of the process (Passed, Finished, Scrap, etc.). The end result I'm looking for is like this:
Name Part Passed Finished Scrap
-----------------------------------------------
John P-ONE 8 5 7
Jane P-ONE 4 6 10
Jane P-TWO 9 2 3
Jack P-THREE 10 12 6
Bring in the data from ALL temp tables into a group of columns, I guess sort of like a pivot. However, I can't seem to implement any kind of pivot in SQL that solves the problem.
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.
I would use a pivot
SELECT UserId, ProductNum, [passed] AS Passed, [finished] AS Finished, [scrapped] AS Scrapped
FROM
(SELECT UserId, ProductNum, Qty, 'passed' AS status
FROM Passed
UNION ALL
SELECT UserId, ProductNum, Qty, 'finished' AS status
FROM Finished
UNION ALL
SELECT UserId, ProductNum, Qty, 'scrapped' AS status
FROM Scrapped
) AS SourceTable
PIVOT
(
SUM(Qty)
FOR status IN ([passed], [finished], [scrapped])
) AS PivotTable;
I haven't tested this, but the principle should work. Note that the PIVOT ... IN list takes bracketed column names rather than string literals, and UNION ALL is needed so duplicate rows are not collapsed before the SUM.
I also have the opinion that you should normalize the database, several tables with the same content isn't good.
Put them together into one table with a status column.
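Since PIVOT is SQL Server-specific, the same result can be sketched portably with conditional aggregation. This is a runnable stand-in using Python and SQLite; the table and column names (Passed, Finished, Scrapped; UserId, ProductNum, Qty) are assumed from the thread, and the sample rows are invented to match the desired output shape:

```python
import sqlite3

# Three source tables with identical structure, as described in the thread.
con = sqlite3.connect(":memory:")
for t in ("Passed", "Finished", "Scrapped"):
    con.execute(f"CREATE TABLE {t} (UserId TEXT, ProductNum TEXT, Qty INTEGER)")
con.executemany("INSERT INTO Passed VALUES (?,?,?)",
                [("John", "P-ONE", 8), ("Jane", "P-ONE", 4)])
con.executemany("INSERT INTO Finished VALUES (?,?,?)",
                [("John", "P-ONE", 5), ("Jane", "P-ONE", 6)])
con.executemany("INSERT INTO Scrapped VALUES (?,?,?)",
                [("John", "P-ONE", 7), ("Jane", "P-ONE", 10)])

# UNION ALL the tables with a status tag, then pivot with SUM(CASE ...).
rows = con.execute(
    """
    SELECT UserId, ProductNum,
           SUM(CASE WHEN status = 'passed'   THEN Qty ELSE 0 END) AS Passed,
           SUM(CASE WHEN status = 'finished' THEN Qty ELSE 0 END) AS Finished,
           SUM(CASE WHEN status = 'scrapped' THEN Qty ELSE 0 END) AS Scrapped
    FROM (SELECT UserId, ProductNum, Qty, 'passed'   AS status FROM Passed
          UNION ALL
          SELECT UserId, ProductNum, Qty, 'finished' AS status FROM Finished
          UNION ALL
          SELECT UserId, ProductNum, Qty, 'scrapped' AS status FROM Scrapped
    ) AS SourceTable
    GROUP BY UserId, ProductNum
    ORDER BY UserId
    """
).fetchall()
for r in rows:
    print(r)
```

Each output row has one column per source table, exactly the Name / Part / Passed / Finished / Scrap layout asked for earlier in the thread.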
People say nothing is impossible, but I do nothing every day.
Thank you, Jorgen. I will give that a try in the morning and I will provide an update.
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.
I agree with Jorgen, PIVOT is your best option, and the database should be normalised; the design is rubbish!
Alternatively you could do this:
SELECT Name, Part, Qty AS Passed, 0 AS Finished, 0 AS Scrap
FROM PassedTable
UNION ALL
SELECT Name, Part, 0 AS Passed, Qty AS Finished, 0 AS Scrap
FROM FinishedTable
UNION ALL
SELECT Name, Part, 0 AS Passed, 0 AS Finished, Qty AS Scrap
FROM ScrapTable
Group by Name and Part and sum the three columns as required. (UNION ALL, and selecting Qty rather than a literal 1, so the summed quantities match the desired output.)
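This alternative can be sketched as runnable code too (Python with SQLite standing in for SQL Server): each branch pads the other two columns with zero, and an outer query does the group-and-sum step. The table and column names (PassedTable, FinishedTable, ScrapTable; Name, Part, Qty) are assumed, and the sample rows are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
for t in ("PassedTable", "FinishedTable", "ScrapTable"):
    con.execute(f"CREATE TABLE {t} (Name TEXT, Part TEXT, Qty INTEGER)")
con.execute("INSERT INTO PassedTable VALUES ('John', 'P-ONE', 8)")
con.execute("INSERT INTO FinishedTable VALUES ('John', 'P-ONE', 5)")
con.execute("INSERT INTO ScrapTable VALUES ('John', 'P-ONE', 7)")

# Pad each branch with zeros, then group and sum the padded columns.
row = con.execute(
    """
    SELECT Name, Part, SUM(Passed), SUM(Finished), SUM(Scrap)
    FROM (SELECT Name, Part, Qty AS Passed, 0 AS Finished, 0 AS Scrap
          FROM PassedTable
          UNION ALL
          SELECT Name, Part, 0, Qty, 0 FROM FinishedTable
          UNION ALL
          SELECT Name, Part, 0, 0, Qty FROM ScrapTable
    ) AS padded
    GROUP BY Name, Part
    """
).fetchone()
print(row)
```

The zeros in the non-matching columns contribute nothing to the sums, so each Name/Part group collapses to one row with a quantity per source table.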
Never underestimate the power of human stupidity
RAH
How is it not normalized? We must keep up with each movement for historical purposes. The temp tables only contain the same type of data because I cannot figure out how to make it work with the core tables.
djj55: Nice but may have a permission problem
Pete O'Hanlon: He has my permission to run it.