"I wanted to use a single SQL statement since this is web-based and the less calls to the database the quicker the website will load. "
You may be trying to optimize a portion that really doesn't need to be optimized. If you want a quick web page, consider using Ajax so that you are not pushing and pulling the entire page around as you do with a traditional postback.
Agree with Blue_Boy: use UNION ALL, and just give a type, description, and ID to each subset you need, like this:
Select 1 as nType, Instructor_Name as sName, Instructor_ID as nID
From Instructors
Where (optional condition)
Union All
Select 2 as nType, Class_Name as sName, Class_ID as nID
From Classes
Where (optional condition)
Order By nType, sName
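A quick way to sanity-check the shape of this query is an in-memory SQLite database from Python. The table names, column names, and data below are made up to match the thread, not taken from the poster's schema:

```python
import sqlite3

# Hypothetical schema and data, standing in for the poster's real tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Instructors (Instructor_ID INTEGER, Instructor_Name TEXT);
    CREATE TABLE Classes     (Class_ID INTEGER, Class_Name TEXT);
    INSERT INTO Instructors VALUES (1, 'Smith'), (2, 'Jones');
    INSERT INTO Classes     VALUES (10, 'Algebra'), (11, 'History');
""")

# One round trip: tag each subset with nType so the page code can tell them apart.
rows = conn.execute("""
    SELECT 1 AS nType, Instructor_Name AS sName, Instructor_ID AS nID FROM Instructors
    UNION ALL
    SELECT 2 AS nType, Class_Name AS sName, Class_ID AS nID FROM Classes
    ORDER BY nType, sName
""").fetchall()

for ntype, name, nid in rows:
    print(ntype, name, nid)
```

Note that both SELECTs must return the same number of columns with compatible types; the ORDER BY applies to the combined result.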
Using UNION ALL as suggested by some folks would not be an optimal solution, because the time taken to access the tables remains the same; you would just save the time to open and close additional connections (which ASP.NET largely saves you anyway through connection pooling).
Since the Instructors and Courses do not change often, you can use the System.Web.Caching.Cache class to load the data from the database and bind the controls from the Cache. There are mechanisms in SQL Server to hook into table change notifications and invalidate the Cache, so anytime the data in the base tables changes, your Cache gets invalidated and fresh data is loaded.
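The cache-aside pattern described here (load once, serve from memory, invalidate on change) can be sketched language-neutrally. This is a minimal Python analogue of what System.Web.Caching.Cache does for you; the loader function and the TTL are hypothetical stand-ins, and real invalidation would be driven by SQL Server change notifications rather than a manual call:

```python
import time

# key -> (value, expires_at); a minimal in-process cache.
_cache = {}

def load_instructors_from_db():
    # Stand-in for the real database query; invented data.
    return ["Jones", "Smith"]

def get_cached(key, loader, ttl_seconds=300):
    entry = _cache.get(key)
    if entry is not None and entry[1] > time.monotonic():
        return entry[0]                      # cache hit: no database call
    value = loader()                         # cache miss: hit the database once
    _cache[key] = (value, time.monotonic() + ttl_seconds)
    return value

def invalidate(key):
    # In SQL Server this would be triggered by a table change notification.
    _cache.pop(key, None)

instructors = get_cached("instructors", load_instructors_from_db)
```

The point is that repeated page loads bind from memory; the database is only touched on a miss or after invalidation.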
I think you need to go back and look at your datadase design. You say there is no connection between Instructors and Courses but I think, in reality, there is a many-to-many relationship. You can achieve the connection with a linked (or intermediate) table which only contains the two primary keys - one from each table. Then create foreign keys on this table to the Instructors and Classes tables. The Primary Key of the linking table needs to be the composite of the two elements InstructorID and CourseID - see below.
Instructor Table: InstructorID, FirstName, LastName, ...
Course Table: CourseID, Subject, DateStart, DateEnd, ...
InstructorCourse Table: InstructorID, CourseID // This is a new table with just two fields
I presume you don't want Lecturer A to be able to teach Geography when really his/her subject is History, while Lecturer B can teach both Mathematics and Statistics - no problem.
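A minimal sketch of this linking-table design, using SQLite from Python (table names follow the layout above; the data is invented), shows the composite primary key and the foreign keys doing their job:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite needs this per connection
conn.executescript("""
    CREATE TABLE Instructor (InstructorID INTEGER PRIMARY KEY, FirstName TEXT, LastName TEXT);
    CREATE TABLE Course     (CourseID INTEGER PRIMARY KEY, Subject TEXT);
    CREATE TABLE InstructorCourse (
        InstructorID INTEGER NOT NULL REFERENCES Instructor(InstructorID),
        CourseID     INTEGER NOT NULL REFERENCES Course(CourseID),
        PRIMARY KEY (InstructorID, CourseID)  -- composite key: one row per pairing
    );
    INSERT INTO Instructor VALUES (1, 'B', 'Lecturer');
    INSERT INTO Course VALUES (10, 'Mathematics'), (11, 'Statistics');
    -- Lecturer B may teach both Mathematics and Statistics:
    INSERT INTO InstructorCourse VALUES (1, 10), (1, 11);
""")

# The foreign key rejects a pairing with a course that does not exist (id 99):
rejected = None
try:
    conn.execute("INSERT INTO InstructorCourse VALUES (1, 99)")
except sqlite3.IntegrityError as e:
    rejected = str(e)
print("rejected:", rejected)
```

The same schema translates directly to SQL Server; the composite primary key also stops the same instructor/course pair from being inserted twice.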
It’s not because things are difficult that we do not dare, it’s because we do not dare that things are difficult. ~Seneca
I think several good points have been made in this thread.
You could, if you are bent on doing this, use the ROW_NUMBER function and join based on that.
select *
from (
      select name
            ,row_number() over (order by name) rowa
      from Instructors
     ) a
full outer join
     (
      select name Classname
            ,row_number() over (order by name) rowb
      from Classes
     ) b
on a.rowa = b.rowb
This should give you a listing of columns from table a and table b as if they'd been put together side by side in a spreadsheet. You just have to be sure you don't have a duplicate column name (which is why I aliased it in the second subquery).
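The positional pairing that the ROW_NUMBER full outer join produces can be illustrated in plain Python with itertools.zip_longest over two sorted lists (the names here are invented):

```python
from itertools import zip_longest

# Pair the Nth instructor with the Nth class, spreadsheet-style, just like
# joining on row_number(). zip_longest keeps going past the shorter list.
instructors = sorted(["Smith", "Jones", "Lee"])
classes = sorted(["Algebra", "History"])

pairs = list(zip_longest(instructors, classes))
for instructor, classname in pairs:
    print(instructor, classname)   # the unmatched side shows None, like the join's NULLs
```

This also makes the caveat visible: the pairing is purely positional, so it only means something if row N of one list genuinely belongs with row N of the other.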
Give a man a mug, he drinks for a day. Teach a man to mug...
My SP uses a temp table since it uses nested SPs.
Where is the temp table being created at?
The moment I close it, the temp table gets dropped.
That is how temp tables work: as soon as the process that created the temp table completes and returns (exits scope), all temporary objects such as temp tables and table variables get cleaned up (i.e. dropped).
What are you trying to accomplish with the temp table? Return results to the SSIS package?
I am guessing here that you are looking to parse the results of the stored procedures in the SSIS package.
-> In your SSIS package
-> Execute SQL Task
-> General tab, Change ResultSet to 'Full Result Set'
-> Result Set tab, 'Result Name' = 0 and 'Variable Name' = User::User_Defined (Variable of type object)
-> Change the stored procedure:
-- Create a temp table to house the run info of the procs
CREATE TABLE #Results (msg VARCHAR(200), LogTime SMALLDATETIME DEFAULT GETDATE())

-- Start time
INSERT INTO #Results (msg) VALUES ('Start first child proc')

-- Execute the stored proc
EXEC sp_Test_First_Child_Proc

-- Completion time
INSERT INTO #Results (msg) VALUES ('Completed first child proc')

-- Start time
INSERT INTO #Results (msg) VALUES ('Start second child proc')

-- Execute the stored proc
EXEC sp_Test_Second_Child_Proc

-- Completion time
INSERT INTO #Results (msg) VALUES ('Completed second child proc')

-- Select the results from the temp table; this returns the result set to the SSIS package.
-- This is the important step: if you don't select the results out as the last step, SSIS will
-- never see them, and when the procedure completes the #Results table is dropped automatically.
SELECT msg, LogTime FROM #Results
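The scoping rule discussed above (temporary objects vanish when the creating session ends, so results must be selected out before the proc returns) can be illustrated with SQLite from Python; SQLite's TEMP tables are per-connection, much like SQL Server's #temp tables are per-scope. The file path and table names are invented for the sketch:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")

# First session: create a permanent table and a TEMP table.
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE Results (msg TEXT)")
conn.execute("CREATE TEMP TABLE Scratch (msg TEXT)")
conn.execute("INSERT INTO Scratch VALUES ('Start first child proc')")
conn.commit()

# Anything the caller needs must be read out before the session ends:
kept = conn.execute("SELECT msg FROM Scratch").fetchall()
conn.close()   # TEMP table dropped here, just like #Results when the proc exits

# Second session: the permanent table survives, the temp table is gone.
conn2 = sqlite3.connect(path)
tables = [r[0] for r in conn2.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)   # ['Results'] -- no Scratch
```

This mirrors the SSIS situation: the final SELECT is the only way the data in the temp table survives the scope that created it.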
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
Hi, I'm doing log shipping for a publisher database (Merge Replication).
I configured my publisher DB for log shipping and it is working well. All transaction logs get restored after a while. The publisher database is in read-only/offline mode; I then bring this DB online.
Since we also need the master, msdb, and distribution databases (if the publisher is also the distributor) to make the secondary server a publisher, I am backing up these system DBs using maintenance plans. I now have the latest backups of all four databases (publisher, distribution, msdb, and master).
My publisher database is online and up to date. I restored the latest backups of msdb and the distribution database taken by the maintenance plans. But when I restore the master database, my publisher database is no longer available: I can see its name and icon in Management Studio, but when I click on it there is nothing to expand or attached to this database.
Both the primary and secondary servers have SQL Server Enterprise with SP1 installed on the same path. I just want to rename the secondary server's SQL instance name and computer name so that it works like my primary server as a publisher. Help will be appreciated.
We have an application that combines data from multiple business areas (POS, Inventory, Accounting, Payroll, etc.). Most of the time, these sources are from different vendors, and most of the time these vendors use SQL Server. For years, we have simply worked with the IT departments of our customers to create SQL logins for these other databases (db_datareader role only!) in order to extract data used by our application. This usually involves the customer sending us a backup that we restore, analyze against a report set, and build queries against. The goal is to solve the business need of the customer, who needs to have these many puzzle pieces in a single place... we are not competing with the other vendors, only making use of the data from their systems.

The reason for this post is that last week I got a call from an angry software developer for one of these vendors, who wanted to know who gave us permission to connect to their SQL database and who blamed us for missing records. I explained the concept of the db_datareader role and expressed my opinion that the data belongs to the customer... the customer gave us permission for the connection in order to fulfill a business need. He said he will be contacting his legal experts on the matter.

So, how would you feel if you found out another company was mining your database?
This all depends on the contract between the other vendor and the client. Almost certainly the data belongs to the client; however, the structure the data is contained in may well be the IP of the vendor, and having another software house rooting around in that structure could be perceived as infringing on their IP.
Most vendors needing to protect their IP in the structure will publish views that can be consumed by the client (and other vendors).
I suspect his legal experts are going to tell him to pull his head in!
Never underestimate the power of human stupidity