DECLARE @t TABLE ([Name] varchar(20), [Status] varchar(50))
INSERT INTO @t VALUES
('John Smith', 'Pending Approval'), ('John Smith', 'Waiting Email Verification')
--,('User2', 'Pending Approved'), ('User2', 'Sent email'), ('User2', 'Other activity')
--SELECT * FROM @t
SELECT Result = [Name] + ' ' + STUFF((SELECT ',' + CAST([Status] AS varchar(100))
                                      FROM @t t2
                                      WHERE t1.[Name] = t2.[Name]
                                      FOR XML PATH('')), 1, 1, '')
FROM @t t1
GROUP BY t1.[Name]

John Smith Pending Approval,Waiting Email Verification
I was wondering whether SQL Server supports something like that.
By that, I mean having the structure on one database and the data on another.
The purpose is to be able to launch a set of integration tests that would get the data from the test database but would execute the stored procedures from the original database.
I didn't find much on Google, so I suppose this isn't possible.
Edit: I didn't say it before, but I'd like to leave the stored procedures as they are; prefixing table names with a database name would be too much work. Ideally it should be a config parameter or something like that.
The stored procedure is executed on a server, not a database. You can have
multiple databases on one server.
I'm not so sure about that; I may be wrong, though.
We do have multiple databases on a single server, indeed, but some databases contain some procedures and other databases contain others, so I'd say that the database is the procedure's owner.
Anyway, linked servers may indeed be a solution, but it would require prefixing the table names with the server/database name. We have too many stored procedures for that to be an option right now.
If you try running a stored procedure that stores to the c:\ drive, see where the file ends up. Now there is most likely a way to change where a process runs, but I use the defaults. sp_who will also show you what is running on a given server.
You can run a stored procedure from one linked server on another by using the server/database name. Within a stored procedure I like to include the database with the table name, so that if it is run from another database it still works. I think this is called fully qualified, but I don't remember. database.schema.table is what I suggest, and with the new schema importance in 2005/2008, referencing the schema should be the minimum. (We still use dbo for our tables, though.)
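For example (a sketch only; the server, database, and object names are made up):

-- Three-part name: resolves across databases on the same server.
SELECT FirstName, LastName FROM SalesDb.dbo.Customers;

-- Four-part name: resolves across linked servers.
SELECT FirstName, LastName FROM LinkedSrv01.SalesDb.dbo.Customers;

-- The same qualification works when executing procedures.
EXEC SalesDb.dbo.usp_GetCustomers;
EXEC LinkedSrv01.SalesDb.dbo.usp_GetCustomers;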
If you try running a stored procedure that stores to the c:\ drive, see where
the file ends up.
Are you talking about SQL Server or Windows Server? Checking the location of a file written by a stored procedure makes me believe you mean the latter.
If you have a SQL Server instance that has two databases, db1 and db2, and you create a stored procedure only on db2, trying to run that stored procedure from db1 (without qualifying its name) won't work.
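For example (db1, db2 and the procedure name are just placeholders):

USE db1;
EXEC usp_OnlyInDb2;          -- fails: 'Could not find stored procedure'
EXEC db2.dbo.usp_OnlyInDb2;  -- works: the name is database-qualified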
Check the output of EXEC sp_who; you'll see that the dbname is specified.
We reference the schema as well, and we also still use dbo, by the way.
I think the only way would be prefixing all our table/view/etc. calls with the database name, where the name is a parameter (if we're running tests, the data should be retrieved from the test database; otherwise it should come from the context's database). Using a parameter would mean converting all our procedures to dynamic SQL. Considering that we have almost 3,000 stored procedures, this is not an option, as said before.
I was wondering if SQL Server supported some sort of parameter indicating that it should read the data from one database while executing a procedure from another (without having to edit the stored procedures; this is a key factor).
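To illustrate why that's unworkable at our scale, every one of those procedures would have to be rewritten along these lines (a sketch only; @DbName and the table name are made up):

CREATE PROCEDURE dbo.usp_GetOrders
    @DbName sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);
    -- Build the query so the database name can vary per environment.
    SET @sql = N'SELECT OrderId, Total FROM ' + QUOTENAME(@DbName) + N'.dbo.Orders;';
    EXEC sp_executesql @sql;
END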
If the synonym is on the test server and points to the code on the production server, then any change on the prod server should immediately be available to the test server. You run the code on the test server, but it is the code from production.
Or am I not seeing the problem?
USE tempdb;
-- Create a synonym for the Product table in AdventureWorks2008R2.
CREATE SYNONYM MyProduct
FOR AdventureWorks2008R2.Production.Product;

-- Query the Product table by using the synonym.
USE tempdb;
SELECT ProductID, Name
FROM MyProduct
WHERE ProductID < 5;
Maybe I'm not seeing it, but how would you manage to have the same procedure read from different databases depending on who called it?
Let's say we have db1 and db2 and we're doing:
SELECT A, B, C FROM dbo.Table
What I need is to read the data from db1 if the application made the call, but read the data from db2 if the call was made by the build server (the unit/integration tests).
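The only way I can see synonyms covering that is by repointing them per environment at deployment time, something like this (a sketch; I'm reusing dbo.Table from above, bracketed since TABLE is a reserved word):

-- In the database holding the procedures, replace the local table
-- with a synonym aimed at that environment's data database.

-- Test environment: read from db2.
CREATE SYNONYM dbo.[Table] FOR db2.dbo.[Table];

-- Live environment: drop and recreate it to read from db1 instead.
-- CREATE SYNONYM dbo.[Table] FOR db1.dbo.[Table];

But that's per environment, not per caller.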
Still does not make sense to me. Why not take a copy of the procedures and run them on the test server? We use SQL Compare to do this, except we do it the other way around: develop on the dev server, test on UAT, and deploy to production; SQL Compare does the syncing of the code at both steps.
When we require another environment, we take a backup of production onto yet another server. Then the only change is the server name, and we don't use that in the qualification in the code. I admit this may become complicated if you are using distributed databases, but we use Oracle for any really big data (multi-TB).
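In T-SQL terms, spinning up that extra environment is just a backup/restore (paths, database, and logical file names invented for the sketch):

-- On production:
BACKUP DATABASE SalesDb TO DISK = N'\\share\backups\SalesDb.bak';

-- On the new environment's server:
RESTORE DATABASE SalesDb
FROM DISK = N'\\share\backups\SalesDb.bak'
WITH MOVE N'SalesDb_Data' TO N'D:\Data\SalesDb.mdf',
     MOVE N'SalesDb_Log'  TO N'D:\Data\SalesDb_log.ldf';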
We recently sized an Oracle system at 6 TB; we can identify 3 TB as the initial data coming from part of the trading system.
Really big data is only useful if your structure is good and your data clean. Unfortunately our legacy systems were designed by incompetent, moronic, lazy, uneducated idiots who had only a vague idea about the business and absolutely no idea about data structures.
Still does not make sense to me. Why not take a copy of the procedures and run
them on the test server?
Because we're using continuous integration with TFS 2010 and want to launch the test sets while the build server is... building. If we have to copy our stored procedures to the test database, then that requires a manual action each time an SP is updated, which I think is a bad idea. Fewer humans around, fewer problems.
Apart from catching bugs, we'd also like to guarantee that the deployment succeeds regardless of the environment (dev, test, post-test, production).
I know this seems a bit strange since we should only test on one of those, but we have numerous errors due to deployment issues (a large number of files to merge, a lot of manual actions) and we'd like to prevent that, or at least be warned as soon as possible.