|
Thank you, Sir.
100
|
Ali Al Omairi wrote: 100
What does it mean? Is it virtual points to Mycroft, or a blessing of 100 years to live?
Regards,
Hiren.
Be good (help people on CP), do good (vote when you find something helpful), and all will turn out good. In case something bad happens (you get downvoted for your best try at helping the OP), it will turn out good later (countered with more points by a humble member). - Gita sar in the context of CP.
- So guys, don't care about downvotes; believe in yourself.
|
Sir, in Jordan we often use the expression "A hundred of flowers" to say that everything is OK. We also use it to thank people and to say hi to them.
|
Interesting cultural tidbit, Ali. I never know what I'm going to learn when I stop by CodeProject.
Will Rogers never met me.
|
That's good to know, Ali.
Thanks 100 to you too for that.
Regards,
Hiren.
-"I don't know, I don't care, and it doesn't make any difference".
|
Sir, since you told me to, I was trying to reinstall the software,
but I had a problem with some components:
1- MSXML6
2- SQL Server Database Services
What could be the reason?
Call me "Abu AlHassan", please.
|
I am currently writing a stored proc for a database that will help create a new user. The logic I am using is something I had used before, but back then I used three separate stored procs, and I thought it would be more efficient to call one stored proc to handle the whole job. Here are the tables I am using:
users:
userID (primary key, IsIdentity turned on)
firstName
lastName
sex
authenticationID (foreign key that points back to the authentication table)
contactID (foreign key that points back to the contact table)
authentication:
authenticationID (primary key, IsIdentity turned on)
userName
password
contact:
contactID (primary key, IsIdentity turned on)
street
city
state
emailAddress
In my stored proc, I pass in all of the parameters needed for each table (listed above, except, of course, the identity fields) and start the inserts. Since I need the contact and authentication inserts done first, I naturally want to start with those tables, but how do I get the identity value for each table so I can insert them into the users table?
|
The sproc could return it using an output[^] parameter. Or you could write a function instead of a sproc and have that return the identity. Or you could check the latest identity values for your transaction.
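A minimal sketch of the output-parameter idea, using the Contact table from the question (the proc name and column sizes are my own assumptions):
<code>
-- Hypothetical helper proc that hands the new identity back to the caller
CREATE PROCEDURE InsertContact
(
    @Street VARCHAR(50),
    @City VARCHAR(50),
    @State VARCHAR(50),
    @EmailAddress VARCHAR(50),
    @ContactID INT OUTPUT
)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO [Contact] ([Street], [City], [State], [EmailAddress])
    VALUES (@Street, @City, @State, @EmailAddress)

    -- Return the identity generated by this insert
    SET @ContactID = SCOPE_IDENTITY()
END
GO

-- Caller:
DECLARE @NewContactID INT
EXEC InsertContact 'Main St', 'Amman', 'JO', 'a@b.com', @NewContactID OUTPUT
</code>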
I are Troll
|
Use the @@IDENTITY function within the SP.
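For example, a fragment (parameter and table names taken from the question). Note that @@IDENTITY returns the last identity generated on the connection, including by triggers, so SCOPE_IDENTITY() is often the safer choice:
<code>
-- Capture the new identity right after the insert
INSERT INTO [Contact] ([Street], [City], [State], [EmailAddress])
VALUES (@Street, @City, @State, @EmailAddress)

DECLARE @ContactID INT
SET @ContactID = @@IDENTITY -- or SCOPE_IDENTITY() to ignore triggers
</code>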
|
Never mind. I did a little bit of experimentation and got it to work. Thanks for all the input, though.
|
I think this is what you need.
Just fill in the blanks...
<code>
-- =============================================
-- Author:      Martin Arapovic
-- Create date: 19.01.2011
-- Description: ....
-- =============================================
CREATE PROCEDURE InsertUserData
(
    -- Parameters for the stored procedure
    -- Contact input data
    @Street VARCHAR(50),
    @City VARCHAR(50),
    @State VARCHAR(50),
    @EmailAddress VARCHAR(50),
    -- Authentication input data
    @UserName VARCHAR(25),
    @Password VARCHAR(25),
    -- User input data
    @FirstName VARCHAR(50),
    @LastName VARCHAR(50),
    @Sex VARCHAR(50)
    -- Add other params if you have any
)
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    DECLARE @ContactID INT
    DECLARE @AuthenticationID INT

    -- Note: this is only a skeleton of the proc you want to implement.
    -- Do it all in one transaction.
    BEGIN TRANSACTION

    -- Insert Contact data and get ContactID
    INSERT INTO [Contact]
        ([Street]
        ,[City]
        ,[State]
        ,[EmailAddress])
    VALUES
        (@Street
        ,@City
        ,@State
        ,@EmailAddress)
    -- Get ContactID
    SET @ContactID = SCOPE_IDENTITY()

    -- Insert Authentication data and get AuthenticationID
    INSERT INTO [Authentication]
        ([UserName]
        ,[Password])
    VALUES
        (@UserName
        ,@Password)
    -- Get AuthenticationID
    SET @AuthenticationID = SCOPE_IDENTITY()

    -- Insert user data into the Users table
    INSERT INTO [Users]
        ([FirstName]
        ,[LastName]
        ,[Sex]
        ,[AuthenticationID]
        ,[ContactID])
    VALUES
        (@FirstName
        ,@LastName
        ,@Sex
        ,@AuthenticationID
        ,@ContactID)

    -- Commit
    COMMIT TRANSACTION

    -- This is a general example; you need to add error handling and
    -- other business logic if you have any... :)
END
GO
</code>
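A quick usage example with made-up values:
<code>
-- Example call (values are illustrative only)
EXEC InsertUserData
     @Street       = '123 Main St',
     @City         = 'Springfield',
     @State        = 'IL',
     @EmailAddress = 'jdoe@example.com',
     @UserName     = 'jdoe',
     @Password     = 'secret',
     @FirstName    = 'John',
     @LastName     = 'Doe',
     @Sex          = 'M'
</code>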
|
Hi!
Is there a way to compare data of the same table in two different databases?
Thanks!
|
Export to text files and use a difference tool on the files. Just one suggestion.
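For example, a rough sketch (the server, database, and table names are made up):
<code>
REM Export the same table from both databases, then compare the files
bcp DatabaseA.dbo.MyTable out c:\temp\a.txt -c -T -S MyServer
bcp DatabaseB.dbo.MyTable out c:\temp\b.txt -c -T -S MyServer
fc c:\temp\a.txt c:\temp\b.txt
</code>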
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
You can do it that way but shame on you!
|
Actually it's not the only way I do it, but I have found it a useful way when 'migrating' an application from one physical DB to another. I usually automate it so that I only have to review a log report of any differences that are actually discovered. But considering that there are many reasons for comparing the data, until you know all the requirements, sometimes the simple solutions work best.
Chris Meech
I am Canadian. [heard in a local bar]
In theory there is no difference between theory and practice. In practice there is. [Yogi Berra]
posting about Crystal Reports here is like discussing gay marriage on a catholic church’s website.[Nishant Sivakumar]
|
What database server are you using?
|
Example of finding records in D1 that are not in D2:
<code>
SELECT D1.[DepartmentID]
      ,D1.[Name]
      ,D1.[GroupName]
      ,D1.[ModifiedDate]
FROM [AdventureWorks].[HumanResources].[Department] D1
LEFT JOIN [AdventureWorks2008R2].[HumanResources].[Department] D2
    ON D1.DepartmentID = D2.DepartmentID
WHERE D2.DepartmentID IS NULL
</code>
Notice how each table is referenced as Database.Schema.Table.
Use the WHERE clause to compare individual columns.
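If both tables have identical column lists, EXCEPT (available from SQL Server 2005 on) is a compact variation that compares whole rows:
<code>
-- Rows in D1 that have no identical row in D2 (all columns compared)
SELECT * FROM [AdventureWorks].[HumanResources].[Department]
EXCEPT
SELECT * FROM [AdventureWorks2008R2].[HumanResources].[Department]
</code>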
|
vanikanc wrote: Is there a way to compare data of the same table in two different databases?
Yes! Multiple! What have you Googled?
On a more serious note, RedGate has a nice tool that could help out.
vanikanc wrote: Thanks!
You're welcome.
|
Hello,
When I try the following in SQL 2005, I get an error message:
<code>
declare @rc varchar(8000)
select @rc = 'bcp hqsqlsrvr.web_ref.dbo.anytable_USA out c:\anytable_usa.dat -c -T'
exec xp_cmdshell @rc
</code>
The message is "an error occurred while processing command line".
Any pointers?
Thanks.
|
Try this one:
<code>
declare @rc varchar(8000)
select @rc = 'bcp "select col1, col2 from hqsqlsrvr.web_ref.dbo.anytable_USA" queryout c:\anytable_usa.dat -c -T'
exec xp_cmdshell @rc
</code>
Putting the query in double quotes may solve the problem; note that bcp needs queryout rather than out when you give it a query.
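If it still fails, the four-part name may be the culprit: bcp's out mode expects Database.Schema.Table and takes the server through -S. So, assuming hqsqlsrvr is the server name, this variant might work:
<code>
declare @rc varchar(8000)
-- Assumes hqsqlsrvr is the server and web_ref the database
select @rc = 'bcp web_ref.dbo.anytable_USA out c:\anytable_usa.dat -c -T -S hqsqlsrvr'
exec xp_cmdshell @rc
</code>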
|
Hi all,
A bit of quick info before I come to my question:
My company uses SQL Server 2008, two databases for one application and almost no schemas.
Our biggest customer now has a DB of 40 GB (it took them a few years to get that 'big').
One of the databases is for storing production data and the other is for some data our application needs and user settings.
Now I recently had a discussion with my boss about one particular table in one of our databases.
It stores settings on stock, sales, client servers, user rights, user settings and general settings (no joke... a topic about it would even shame the hall of shame).
I told my boss that normalization might be very necessary for this table, but he told me that normalizing is not always a good thing. In the case of this particular table I know it is, but I am more interested in when we should not normalize.
One of my boss's arguments was that one table to store 'all' settings is rather handy, because it keeps the number of tables low and you can always find a setting (for whatever) in one table.
I know there are some rules for normalizing and denormalizing, but rules are meant to be broken.
So I was wondering: is my boss's argument about having fewer tables valid at all?
And I would like to see some real-life examples of people who chose not to normalize and for what reason.
|
This is a very good question - in fact, there's a lengthy discussion on the subject in my favorite SQL tuning book[^].
To me, denormalization is always an exercise in query optimization. I start off with my "ideal" (i.e. normalized) schema, populate its key parts with random but realistic data, quickly prototype my queries, and then look for particularly ugly execution plans.
What your boss had in mind is probably minimizing the number of joins: if you always look for all (or a large subset of all) settings at once, and you keep doing it over and over again, using a single table for all settings will reduce the number of joins, and therefore improve your speed. However, I would not make a decision one way or the other without running a query analyzer: optimizing without a profiler is like driving in the dark without a map.
|
Your boss is right and you are right; there is no magic rule that can be applied. It all comes down to "depends on requirements".
If you have a single table with an expanding number of columns to hold the settings, then I would change the design. I often use a SettingType/SettingValue table and store the lot in there. However, you cannot use FKs in that scenario, so it is less than ideal.
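A minimal sketch of that SettingType/SettingValue idea (the names are illustrative, not from a real schema):
<code>
-- Hypothetical generic settings table: one row per setting
CREATE TABLE Settings
(
    SettingID    INT IDENTITY(1,1) PRIMARY KEY,
    SettingType  VARCHAR(50)  NOT NULL, -- e.g. 'Stock', 'Sales', 'UserRights'
    SettingName  VARCHAR(100) NOT NULL,
    SettingValue VARCHAR(255) NULL      -- everything stored as text, hence no FKs
)
</code>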
I do have one general rule, and it is based on the database usage: if it is a transactional database supporting applications, I normalise as much as is reasonable. If it is a reporting database (this includes BI support DBs), then normalisation goes out the window in favour of query optimisation. Try servicing Oracle BI; it will drive a relational-database person nuts.
Never underestimate the power of human stupidity
RAH
|
There is no specific rule as to when to normalize or denormalize. The answer is "it depends".
In general it is a trade-off between query response time and the cost of data inconsistency. Normalized data keeps the data consistent at a cost in query time. See Database normalization[^] and Denormalization[^] for some descriptions on Wikipedia.
|