We do it somewhat differently to Mark: our staging tables are exact replicas of the source data, which allows us to use Bulk Copy when loading. It also ensures there are no errors in the Load part of the ELT process.
We then use stored procedures to do the transforms to the production format (the targets may also be staging tables in some solutions). The major benefit is that you are doing any transforms on a limited data set, and it is easy to debug the procedure. It then becomes a simple matter to dump the staging data into the production tables.
I find a well-crafted ELT process leaves any ETL application in the dust. There are some ETL apps this may not apply to, but their cost is extraordinary.
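To make the two steps above concrete, here is a minimal sketch of that load-then-transform pattern in C#. The table and procedure names (stage_Orders, usp_TransformOrders) and the connection string are placeholders, not anything from the posts above:

```csharp
using System.Data;
using System.Data.SqlClient;

class StagingLoader
{
    // Bulk-copies a source DataTable into a staging table that mirrors the
    // source column-for-column, then calls a stored procedure to transform
    // the staged rows into the production format.
    public static void LoadAndTransform(DataTable sourceData, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Load step: a straight bulk copy, no transforms, so nothing can fail
            // on mapping or conversion here.
            using (var bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "stage_Orders";
                bulk.WriteToServer(sourceData);
            }

            // Transform step: a stored procedure works on the limited staged
            // data set and pushes it into the production tables.
            using (var cmd = new SqlCommand("usp_TransformOrders", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Because the load is a plain replica copy, any debugging effort lands in the one stored procedure, which is the point of the approach.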
Never underestimate the power of human stupidity
I have worked for some CIOs who have come up with far more atrocities than ETL or Y2K.
Mycroft Holmes wrote:
Extract - Load - Transform simplifies and speeds up the entire process.
I think you can write bad code / processes in any language or tool. It's up to the person to find the best way to do it that works for them. Not sure about you but I have some amazingly fast SSIS packages that load a gig of data in seconds, but have seen some others that take hours doing a lot less data.
Common sense is admitting there is cause and effect and that you can exert some control over what you understand.
Hi all. In my Windows Forms app I have textboxes data-bound to yonet_ogrt, and two DataGridViews bound to yonet_prg and yonet_nbt. I want all of them to navigate together. The textboxes and the DataGridView bound to yonet_prg work, but the DataGridView bound to yonet_nbt does not. This is my trouble.
My code looks like this:
public void vt_iliski()
{
    string sorgu_ogretmen = "select * from ogretmen";
    string sorgu_program = "select * from program";
    string sorgu_nobet = "select * from nobet";

    ds_gnd = new DataSet();
    DataTable tablo = new DataTable();

    adpt_ogrt = new SqlCeDataAdapter(sorgu_ogretmen, baglan());
    adpt_prg = new SqlCeDataAdapter(sorgu_program, baglan());
    adpt_nbt = new SqlCeDataAdapter(sorgu_nobet, baglan());
}
The following subquery returns 1024 records with a single count column. I am trying to return multiple columns with 1024 records. When I make changes to return multiple columns I get 17,000 records instead of 1024 records. I have very little experience with subqueries. Does anyone have any suggestions?
SELECT IFNULL(COUNT(DISTINCT A.computerid), 0) AS 'Machine Count',
       A.hotfixID AS 'Last Contact'
FROM (SELECT HF.computerid, HF.hotfixID
      FROM hotfixdata HFD
      INNER JOIN hotfix HF
              ON HF.hotfixid = HFD.hotfixid
             AND HFD.ignore <> 1
      LEFT OUTER JOIN Computers AS C
              ON C.computerid = HF.computerid  -- the ON clause is missing in the original post; this condition is assumed
      WHERE HF.installed <> 1
        AND HF.Approved = 1
      GROUP BY HF.hotfixID, HF.ComputerID) A;
I have a class and the data in it has to be stored in the database. I have done this by extracting every single property and sending each of them as input parameters to a stored procedure in the database. But the thing is that the class has too many properties. Is there any way to accomplish this in a single go (with less code)? Oh, and I am using C# and SQL Server!
Presumably it throws an exception, and posting that exception as part of your post would be helpful.
Presuming that the exception is telling you the statement is too big...
If your SQL statement is too long, then SQL/C# will have a problem with it. The solution is to reduce the size. (Note that this could indicate a database design issue as well.)
Using the database import tools (sqlldr?) might help with the problem, but might not. If not, or you don't want to use them, here are some options:
1. Do two calls.
2. Pass in a blob and parse it to create dynamic SQL.
If it were me, I would presume that there is a design issue.
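As a sketch of option 2 above: instead of one parameter per property, serialize the whole object and send it as a single parameter. Everything here is illustrative; the Customer class, the procedure name usp_SaveCustomer, and the @customerXml parameter are made up, and the procedure would have to shred the XML server-side:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Xml.Serialization;

// Stand-in for the poster's class with many properties.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
    // ...dozens more properties
}

class SaveViaXml
{
    // Serializes the object to XML and sends it as ONE stored-procedure
    // parameter instead of one parameter per property.
    public static void Save(Customer c, string connectionString)
    {
        var serializer = new XmlSerializer(typeof(Customer));
        string xml;
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, c);
            xml = writer.ToString();
        }

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("usp_SaveCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@customerXml", xml);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

The trade-off is that the parameter-parsing work moves into the stored procedure, which is why a design review may still be the better fix.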
The thing is, I am still new. Isn't it good that we do more of the grunt work in the starting stages? And DAL coding is quite hectic; sometimes I really wonder why we need it anyway. Thank you for the eye-opener, at least.
Search for a topic like "Business Intelligence" and you will get an idea of what others are doing.
At a high level, you have a central database which collects data from different systems, and then you have reporting tools with which either you or your user community can write reports. You can also have these reports run on a schedule and have their output delivered to whoever needs it.
The cost of implementing such a system can be quite high.
You might want to implement some pieces of this.
1) Build a tool which will load your data extracts (xml files) into a central database.
2) Create views or stored procedures which "normalize" your data
3) Use a reporting tool to generate the reports you need or show your users how they can connect to the central database via Excel to get their own data. (See topics like Microsoft Query)
Empower the user to access the data and you will spend less time on maintenance.
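Step 1 in the list above can start very small. Here is a minimal sketch, assuming the extracts are well-formed XML files; DataSet.ReadXml infers one DataTable per repeating element, and each table could then be bulk-loaded into the central database:

```csharp
using System.Data;

class ExtractLoader
{
    // Reads an XML data extract into a DataSet. Each repeating element in
    // the file becomes a DataTable whose rows can be pushed to the central
    // database (e.g. with SqlBulkCopy). The file path is supplied by the caller.
    public static DataSet ReadExtract(string path)
    {
        var ds = new DataSet();
        ds.ReadXml(path);   // schema is inferred from the XML shape
        return ds;
    }
}
```

For anything beyond trivial files you would define the schema explicitly rather than rely on inference, but this is enough to prototype the central-database load.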
As best I can tell from your question, you are asking for something that is impossible.
SQL Server normally installs as an application, so it is ALWAYS visible on the client machine.
Now, you could re-architect your application to use the embedded version of SQL Server (I think). Then it would appear that only your application exists. I suspect this would require a bit of work, and there are limitations to that approach.
Note as well that this has nothing to do with security. Security means something completely different.
If any user tries to insert another row in that table, then an "it is not possible" message should be displayed....
Please just reply with the query & CODE....
A database does not display messages; it holds data. It holds data in bulk, not in "single records". What you want does not make much sense from a data viewpoint. If you really only have one record, put it in an object, serialize it, and stuff it in a single-column blob.
Also, this is not the place to "order" code the way you do. The questions on the forums are answered by volunteers, in their free time.
I didn't downvote you, but I think most people here won't want to do your work for you. You can ask questions, and you can show your code and ask for help, but you have to do your work on your own. So my answer is: definitely not!
A super solution, but it can't be replicated in other databases like Oracle.
Identity is a near equivalent to the rowid concept in Oracle, but it's a dynamic value, i.e. not a sequence generator. Hence it may only be possible in SQL Server.
If you change the start value of the identity seed (I'm pretty sure this can be done) and truncate the table, the hard-coded 1 would fail. Mind you, it is a silly request in the first place, and I suspect the OP does not even know what you are talking about.
Never underestimate the power of human stupidity
Not sure if this is the right forum...
I'm using Visual Studio 2010 and trying to use the schema compare tool to deploy changes from my db project to my database (SQL Server 2008).
When trying to deploy the changes I made in my db project, I get an exception with the following message:
"Cannot create index. Object 'v_TQ_IFCR_DATA_COVERAGE' was created with the following SET options off: 'ANSI_NULLS, QUOTED_IDENTIFIER'."
If I go to the index properties, I see that AnsiNulls
and QuotedIdentifiers are set to the project default.
When changing those settings to ON I manage to deploy all updates without any exception.
How can I change project default settings?
I have more than 200 different objects (tables, stored procedures, functions, indexes, etc.) and I don't want to change the settings of each object.