No, I said the data was temporarily stored: the tables are a permanent fixture and are used to transition that data before it gets exposed in the production tables. For my purposes it's okay for the tables to reside in the prod database but that might not suit you. There is no reason it can't be a wholly separate database: that's up to you and the requirements.
All of the code is in stored procedures, C#, or both. The usual scenario is that the tables are truncated, (in my case) filtered data is dumped into them, other processes may then work on them, and finally the data is moved to the primary tables.
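For illustration only, here is a minimal sketch of that truncate / dump / move cycle from the C# side; every table and procedure name is invented:

using System.Data.SqlClient;

// A rough sketch of the staging cycle described above; all object names are made up.
string connectionString = "Data Source=.;Initial Catalog=Prod;Integrated Security=True"; // example only
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1. Clear the permanent staging table.
    new SqlCommand("TRUNCATE TABLE staging.Orders", conn).ExecuteNonQuery();

    // 2. Dump the filtered source data into it.
    new SqlCommand(
        "INSERT INTO staging.Orders SELECT * FROM source.Orders WHERE IsValid = 1",
        conn).ExecuteNonQuery();

    // 3. Other processes work on the staged rows, then a procedure
    //    moves them to the primary tables.
    new SqlCommand("EXEC staging.MoveOrdersToProduction", conn).ExecuteNonQuery();
}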
Note that this is just how our solution works for us: it may not be right for you. Use anything like this with extreme caution, especially near production data.
"If you think it's expensive to hire a professional to do the job, wait until you hire an amateur." Red Adair. nils illegitimus carborundum
We do it somewhat differently to Mark: our staging tables are exact replicas of the source data, which allows us to use Bulk Copy when loading. It also ensures there are no errors in the Load part of the ELT process.
We then use stored procedures to do the transforms to the production format (the targets may themselves be staging tables in some solutions). The major benefit is that you are doing the transforms on a limited data set, and it is easy to debug the procedures. It then becomes a simple matter to move the staged data into the production tables.
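For what it's worth, a minimal sketch of that Load step with SqlBulkCopy, followed by a server-side Transform; all table, procedure, and server names here are invented:

using System.Data;
using System.Data.SqlClient;

// Because staging is an exact replica of the source, the rows stream across
// with SqlBulkCopy and no per-column mappings. All names are made up.
using (var src = new SqlConnection("Data Source=SrcServer;Initial Catalog=Source;Integrated Security=True"))
using (var dst = new SqlConnection("Data Source=DwServer;Initial Catalog=Staging;Integrated Security=True"))
{
    src.Open();
    dst.Open();

    using (IDataReader reader = new SqlCommand("SELECT * FROM dbo.Customers", src).ExecuteReader())
    using (var bulk = new SqlBulkCopy(dst) { DestinationTableName = "staging.Customers" })
    {
        bulk.WriteToServer(reader); // the Load of E-L-T
    }

    // The Transform then runs server-side against the small staged set.
    new SqlCommand("EXEC staging.TransformCustomers", dst).ExecuteNonQuery();
}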
I find a well crafted ELT process leaves any ETL application in the dust. There are some ETL apps this may not apply to, but their cost is extraordinary.
Never underestimate the power of human stupidity
I have worked for some CIOs who have come up with far worse atrocities than ETL or Y2K.
Mycroft Holmes wrote:
Extract - Load - Transform simplifies and speeds up the entire process.
I think you can write bad code and bad processes in any language or tool; it's up to the person to find the approach that works best for them. Not sure about you, but I have some amazingly fast SSIS packages that load a gig of data in seconds, and I have seen others that take hours to do a lot less.
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
Hi all ... in my WinForms app I have textboxes with data bindings related to yonet_ogrt, and I have two DataGridViews related to yonet_prg and yonet_nbt. I want them all to navigate together, but while the textboxes and the DataGridView related to yonet_prg work, the DataGridView related to yonet_nbt does not. This is my trouble...
My code looks like this:
public void vt_iliski()
{
    string sorgu_ogretmen = "select * from ogretmen";
    string sorgu_program = "select * from program";
    string sorgu_nobet = "select * from nobet";

    ds_gnd = new DataSet();

    adpt_ogrt = new SqlCeDataAdapter(sorgu_ogretmen, baglan());
    adpt_prg = new SqlCeDataAdapter(sorgu_program, baglan());
    adpt_nbt = new SqlCeDataAdapter(sorgu_nobet, baglan());

    // Fill each result into its own table in the shared DataSet.
    adpt_ogrt.Fill(ds_gnd, "ogretmen");
    adpt_prg.Fill(ds_gnd, "program");
    adpt_nbt.Fill(ds_gnd, "nobet");
}
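For what it's worth, the usual way to make everything navigate together is to hang both grids off one parent BindingSource through DataRelations. A minimal sketch, assuming ogretmen is the parent table, the other two tables carry a matching ogretmen_id key, and yonet_prg / yonet_nbt are the grid controls (the column and control names here are made up):

// Relate the child tables to the parent (ogretmen_id is a hypothetical key column).
ds_gnd.Relations.Add("ogrt_prg",
    ds_gnd.Tables["ogretmen"].Columns["ogretmen_id"],
    ds_gnd.Tables["program"].Columns["ogretmen_id"]);
ds_gnd.Relations.Add("ogrt_nbt",
    ds_gnd.Tables["ogretmen"].Columns["ogretmen_id"],
    ds_gnd.Tables["nobet"].Columns["ogretmen_id"]);

// One parent BindingSource drives the textboxes...
BindingSource bs_ogrt = new BindingSource(ds_gnd, "ogretmen");
textBox1.DataBindings.Add("Text", bs_ogrt, "ad"); // "ad" is a made-up column

// ...and each grid gets a child BindingSource built on a relation, so moving
// bs_ogrt (e.g. via a BindingNavigator) repositions both grids automatically.
yonet_prg.DataSource = new BindingSource(bs_ogrt, "ogrt_prg");
yonet_nbt.DataSource = new BindingSource(bs_ogrt, "ogrt_nbt");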
The following subquery returns 1024 records with a single count column. I am trying to return multiple columns with 1024 records. When I make changes to return multiple columns I get 17,000 records instead of 1024. I have very little experience with subqueries. Does anyone have any suggestions?
SELECT IFNULL(COUNT(DISTINCT A.computerid), 0) AS 'Machine Count',
       A.hotfixID AS 'Last Contact'
FROM (SELECT HF.computerid, HF.hotfixID
      FROM hotfixdata HFD
      INNER JOIN hotfix HF
         ON HF.hotfixid = HFD.hotfixid
        AND HFD.ignore <> 1
      LEFT OUTER JOIN Computers AS C
         ON C.computerid = HF.computerid -- assumed join column; the original had no ON clause
      WHERE HF.installed <> 1
        AND HF.Approved = 1
      GROUP BY HF.hotfixID, HF.ComputerID) A
GROUP BY A.hotfixID; -- without this, the outer query collapses to a single total row
I have a class and the data in it has to be stored in the database. I have done this by extracting every single property and sending each of them as an input parameter to a stored procedure. But the thing is that the class has too many properties. Is there any way to accomplish this in a single go (with less code)? Oh, and I am using C# and SQL Server!
Presumably it throws an exception; including that exception in your post would be helpful.
Presuming that the exception is telling you the statement is too big...
If your SQL statement is too long then SQL/C# will have a problem with it. The solution is to reduce its size. (Note that this could indicate a database design issue as well.)
Using the database's bulk import tools (sqlldr?) might help with the problem, but it might not. If not, or if you don't want to use them, here are some solutions:
1. Do two calls.
2. Pass the data in as a single blob and parse it on the server to create dynamic SQL (a sketch of this approach follows below).
If it were me, I would presume there is a design issue.
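A minimal sketch in the spirit of option 2: serialize the object to XML and send it as one parameter. (Here the procedure would shred the XML with SQL Server's nodes()/value() methods rather than build dynamic SQL; the Widget class and the dbo.SaveWidget procedure are made up.)

using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Xml.Serialization;

// A made-up class standing in for the real one with its many properties.
public class Widget
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Serialize the whole object once instead of mapping each property by hand.
var widget = new Widget { Id = 1, Name = "example" };
var serializer = new XmlSerializer(typeof(Widget));
string xml;
using (var writer = new StringWriter())
{
    serializer.Serialize(writer, widget);
    xml = writer.ToString();
}

using (var conn = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True"))
using (var cmd = new SqlCommand("dbo.SaveWidget", conn) { CommandType = CommandType.StoredProcedure })
{
    cmd.Parameters.Add("@data", SqlDbType.Xml).Value = xml; // one parameter carries everything
    conn.Open();
    cmd.ExecuteNonQuery();
}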
The thing is, I am still new. Is it not good to do more of the grunt work in the early stages? And DAL coding is quite hectic; sometimes I really wonder why we need it at all. Thank you for the eye-opener, at least.
Search for a topic like "Business Intelligence" and you will get an idea of what others are doing.
At a high level, you have a central database which collects data from different systems, and then you have reporting tools with which you or your user community can write reports. Sometimes you can also have these reports run on a schedule and have their output delivered to whoever needs it.
The cost of implementing such a system can be quite high.
You might want to implement some pieces of this.
1) Build a tool which will load your data extracts (XML files) into a central database (a sketch of this step follows at the end of this post).
2) Create views or stored procedures which "normalize" your data.
3) Use a reporting tool to generate the reports you need or show your users how they can connect to the central database via Excel to get their own data. (See topics like Microsoft Query)
Empower the user to access the data and you will spend less time on maintenance.
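As a rough sketch of step 1, assuming the extract is a flat XML file whose element names line up with the columns of a staging table (the file path, server, and table name are all invented):

using System.Data;
using System.Data.SqlClient;

// Load a flat XML extract into a central staging table.
var ds = new DataSet();
ds.ReadXml(@"C:\extracts\orders.xml"); // infers a DataTable from the XML shape

using (var conn = new SqlConnection("Data Source=BiServer;Initial Catalog=Central;Integrated Security=True"))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "staging.Orders" })
    {
        bulk.WriteToServer(ds.Tables[0]);
    }
}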