|
You could use SQL Server Integration Services (SSIS) to achieve this. For each table, set up a task that writes the contents of that table out to an individual flat (text) file.
This site[^] has plenty of useful resources for SSIS.
Paul Marfleet
"No, his mind is not for rent
To any God or government"
Tom Sawyer - Rush
|
|
|
|
|
Hi,
Can you tell me exactly how I should proceed?
What do you mean by setting up a task for each table?
Should I use Windows Workflow Foundation, or should I manually select each table and process it?
Would it not be feasible to implement a loop?
Thanks
Raghavendran
|
|
|
|
|
I'm afraid I'm not an expert in SSIS. You can probably set up a package to iterate over the tables in a database, but I don't know how to do that. If there are not many tables in your database and the design is unlikely to change, you could manually set up a task to export each table. I suggest you read the SSIS documentation to find out what it is capable of doing.
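For what it's worth, the general idea of looping over the tables and writing each one to its own flat file can be sketched outside SSIS too. Here is a minimal illustration in Python, using the stdlib sqlite3 module as a stand-in database (the loop-over-tables shape is the point, not the particular engine; in SQL Server you would query sys.tables or INFORMATION_SCHEMA.TABLES instead):

```python
import csv
import os
import sqlite3

def export_tables(conn, out_dir):
    """Write each user table in the database to its own flat (CSV) file."""
    # Enumerate the table names from the catalog rather than hard-coding them.
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        cur = conn.execute(f'SELECT * FROM "{table}"')
        path = os.path.join(out_dir, f"{table}.csv")
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([c[0] for c in cur.description])  # header row
            writer.writerows(cur)                             # data rows
    return tables
```

The same pattern applies whether the loop lives in a script, a Foreach container in SSIS, or a stored procedure driving bcp.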
Paul Marfleet
|
|
|
|
|
I have an MS Access database table with an "ID" AutoNumber column formatted as "AB"0001. It increments automatically to AB0002 and so on, and it works properly in the Access table. But when I bind the ID recordset to a DataGrid, it shows only 1, 2 and so on, not in the format I mentioned before.
Any solution?
|
|
|
|
|
Make sure you are pulling the correct column. "ID" may be a reserved word, and when you query the table you may be pulling the reserved word (like ROWID in Oracle) rather than your own column.
When you SELECT data from the table, try aliasing the column or quoting it; that may address the issue.
-- Alias
SELECT ID AS RowIdentifier FROM table
-- Quote
SELECT [ID] FROM table
If you get stuck, you can always append the "AB0000" to the results before they are displayed to the user.
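If you do go the append route, the formatting itself is trivial in any language. A hedged sketch in Python (the function name is illustrative; in Access SQL you could do something like `"AB" & Format(ID, "0000")`, and in .NET the equivalent would be String.Format):

```python
def display_id(raw_id, prefix="AB", width=4):
    """Recreate Access's formatted AutoNumber display, e.g. 1 -> 'AB0001'."""
    # Zero-pad the numeric part to the given width, then prepend the prefix.
    return f"{prefix}{raw_id:0{width}d}"
```

Apply it to each row before binding the results to the grid.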
|
|
|
|
|
Hello everyone,
Is it possible to read the contents of a .bak (SQL backup) file via .NET?
Thanks and best regards..
.:: Something is Wrong ::.
|
|
|
|
|
You can read any file in .NET by using the classes in the System.IO namespace. Whether you can make sense of the data is another matter. Backup files aren't designed to be read outside of SQL Server; unless you know how the data is structured, it won't make any sense. You can invoke code to back up or restore a SQL Server database from .NET, but trying to read a backup file yourself seems pointless.
Paul Marfleet
|
|
|
|
|
Thanks for your reply; let me explain why I need to read the backup file:
The SQL Server I work with is at a remote location, and I do not have full administrative rights. So:
1. Since the server is on a remote machine, it cannot see the backup file path I give; the file would have to be uploaded to the machine the server is running on.
2. Even if that were done, the server will not give me restore permission.
By the way, someone will send me backup files, and I know the tables, fields etc. I just need to read the field contents into a data table via a wrapper.
Regards..
|
|
|
|
|
As I have already stated, you can't read the backup file directly; you need to restore it. If you don't have permission to restore the backup, then you need to obtain it, otherwise you're stuck!
Paul Marfleet
|
|
|
|
|
What is the best method to pass values from the DAL (data access layer) to the BAL (business abstraction layer)? I used to pass DataSet/DataTable instances, but recently I heard that passing custom entities would be much better. Thinking from a performance perspective, though, won't filling the custom entities after the database operation be time-consuming?
Does anybody have any idea about DTOs? Is it possible to have a generalized DTO?
|
|
|
|
|
N a v a n e e t h wrote: the custom entities after database operation done will be time consuming
You have to fill something with values from the database. A DataSet/DataTable still needs to be filled, so filling one as opposed to the other isn't an argument. Typically you will fill your custom object using a DataReader, which is the fastest method for populating individual rows of data (google "firehose cursor" to find out why).
The key thing to understand is that a DataSet/DataTable is a convenience to make things easier for you as a developer. They do have tradeoffs, though. For a start, because they are so generic, they have to be able to cope with a wide variety of data situations; as a consequence, they tend to be very large. Secondly, they are MS-specific constructs, so you are putting a large burden on external systems if they need to interact with yours - in other words, don't return a DataTable/DataSet over a web service.
|
|
|
|
|
Pete O`Hanlon wrote: in other words, don't return a DataTable/DataSet over a web service.
Thanks Pete. I am not speaking about webservices. I just want to know, "is it a good practice to pass DataSet/Reader/DataTable from Business layer to Data access layer" ? If it is not how this can be done efficiently ? I was doing a search for DTO which looks closer to this. But couldn't get enough information on DTO.
|
|
|
|
|
Well, DTO is basically a design pattern that is intended to remove the need to make multiple calls. Consider a sample where you want to get the name and address of a customer. One way of doing this would be to make two calls, with the first one getting the customer details and then the second one getting the address details. The other approach would be to return the customer and address information using one call.
As with everything else, there are tradeoffs that need to be considered. Is it more efficient to retrieve the two items in one go, or is it better to get them one at a time? The latter could make sense if you only need the address very occasionally, so the load on the server would be less with two calls. If you need the customer and address information all the time, then you would be better off considering a DTO.
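The customer-plus-address example above can be sketched like this. It's illustrative Python rather than .NET (the dict "tables", names, and IDs are all made up); the point is simply that one flat object carries everything the caller needs in a single call:

```python
from dataclasses import dataclass

# Illustrative in-memory stand-ins for database tables behind the DAL.
CUSTOMERS = {34: {"name": "Acme Ltd"}}
ADDRESSES = {34: {"street": "1 High St", "city": "London"}}

@dataclass
class CustomerDTO:
    """A flat transfer object: customer and address fields in one payload."""
    customer_id: int
    name: str
    street: str
    city: str

def get_customer_with_address(customer_id):
    # One logical round trip assembles everything the caller needs,
    # instead of separate get_customer / get_address calls.
    c = CUSTOMERS[customer_id]
    a = ADDRESSES[customer_id]
    return CustomerDTO(customer_id, c["name"], a["street"], a["city"])
```

In the two-call alternative, the caller would fetch the customer and the address separately, which is the tradeoff discussed above.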
|
|
|
|
|
Thanks Pete for your help.
|
|
|
|
|
Your business layer should definitely be operating on your business/domain objects. The general principle here is that your domain objects are abstracted away from the storage mechanism.
If you are using DTOs, then to keep this abstraction going, it's fairly likely that your DTOs will be closer to your domain objects than to the storage mechanism. This brings up the obvious idea that, since your DTOs look so much like your domain objects, why not give the DAL the capability of acting on your domain objects directly?
Now you are entering ORM territory, and if you don't mind me saying so, I think our product Diamond Binding[^] does this quite well.
In regards to performance, a naive DAL implementation is going to be slower than, say, ad-hoc queries to the database. Even a highly optimised commercial OR mapper like Diamond Binding is going to be slightly slower when performing trivial operations (inserting a single record, etc.). Where you gain a lot of performance back is when you are performing more complex operations.
"Invoice.Load(34);" may not appear to be doing much at first glance, but there are huge gains to be made here when your DAL is fully aware of the relationships in your domain model. DB will pull the invoice back, making some intelligent decisions about joins, and you'll have invoice34.Customer and invoice34.lineItems[] as well, in one round trip. There are similar gains with mature caching strategies as well.
Anyway, yes, a generalized DTO is realized by eliminating it. A DAL that's aware of the domain model can act on your business objects intelligently and pick the most optimal strategy for the particular action you are requesting. (And my biased opinion is that it's much easier to use an off-the-shelf one than to try to write one yourself.)
|
|
|
|
|
Thanks, Mark, for the valuable post. Thanks again.
|
|
|
|
|
I have connected an Access front end to a SQL backend.
What sort of cursors are used?
Are they client-side or server-side?
|
|
|
|
|
That depends entirely on your connection and the way you interact with it.
|
|
|
|
|
None, if you can help it. Cursors are sometimes a necessary evil (their performance is not great, and their procedural logic is in conflict with the set-based nature of SQL), but you should first try REAL hard to avoid using them.
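To make the cursor-versus-set-based contrast concrete, here is a small sketch in Python using the stdlib sqlite3 module (table and function names are made up for illustration; the same contrast applies to a T-SQL cursor loop versus a single UPDATE):

```python
import sqlite3

def make_orders_db():
    # Tiny stand-in table; the point is the update style, not the schema.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total INTEGER)")
    conn.executemany("INSERT INTO orders (total) VALUES (?)", [(10,), (20,)])
    return conn

def double_totals_cursor_style(conn):
    # Cursor-style: pull every row to the client, then update one row at a time.
    for row_id, total in conn.execute("SELECT id, total FROM orders").fetchall():
        conn.execute("UPDATE orders SET total = ? WHERE id = ?",
                     (total * 2, row_id))

def double_totals_set_based(conn):
    # Set-based: a single statement; the engine does the iteration internally.
    conn.execute("UPDATE orders SET total = total * 2")
```

Both produce the same result, but the set-based form issues one statement instead of one per row, which is why it generally scales far better.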
Not to steer you away from this wonderful site and its knowledgeable participants, but you may get more specialized assistance with your question at a place like sqlservercentral.com.
|
|
|
|
|
That's not what he's talking about. He's talking about a client-side versus a server-side cursor. This was more of an issue with old ADO code than ADO.NET.
|
|
|
|
|
I am using SSRS 2005. I can fetch reports from a remote reporting server, but I need users to access only their own reports. I mean, a user should only be able to access the reports permitted by their logged-on credentials (role). Help please.
|
|
|
|
|
see the URL: http://msdn2.microsoft.com/en-us/library/ms157198.aspx
Kishore.P
|
|
|
|
|
Hi,
I am using SSRS 2005 and I used charts to show the report. I have to change the chart type at run time based on the user's selection: the user can pick any chart type, e.g. bar chart, pie chart, column chart, etc., and on selection the screen should show that chart type. I am not able to change the chart type at runtime. Help please.
|
|
|
|
|
You can't change the chart type of a chart at runtime. What you can do is have a number of chart controls on your report, each one configured for a different chart type. At runtime, change the visibility property of each chart control so that only the type that the user has selected is displayed.
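For example, assuming a report parameter named ChartType (the parameter name here is illustrative, not something from your report), the bar chart's Hidden property could use an expression along these lines, with a matching expression on each of the other chart controls:

```
=Parameters!ChartType.Value <> "Bar"
```

Each chart then appears only when its own type is the one selected.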
Paul Marfleet
|
|
|
|
|
Is it possible to install MS SQL Server 2000 on Windows XP Professional so that I have a local, independent database on that particular PC? The idea is to have a local database for use on that PC.
PS: Installing MS SQL Server 2000 on Windows Server 2003 gives me the local database.
Or is MSDE the only way (of the two) to have a local database?
|
|
|
|