Building a dynamic query in SQL, even inside a stored procedure, will not prevent SQL injection[^].
Your example could easily be rewritten as a simple UPDATE statement. (NB: Constant values like 100 are never null!) If any of the new values are passed in as parameters, then your query is open to SQL injection.
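If the statement really must be built dynamically, the values can still travel as parameters rather than being concatenated into the SQL text. A minimal T-SQL sketch, using `sp_executesql` and a hypothetical Orders table (the names are mine, not from the post above):

```sql
-- Hypothetical table/columns for illustration.
-- The values are bound as parameters, never spliced into the string,
-- so input like '100; DROP TABLE Orders' cannot change the statement.
DECLARE @sql nvarchar(max) =
    N'UPDATE Orders SET Quantity = @qty WHERE OrderId = @id';

EXEC sp_executesql @sql,
     N'@qty int, @id int',
     @qty = 100, @id = 42;
```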
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
I believe I've found a very unexpected bug in SQLBulkCopy that took me a while to track down. My test program reads in records from a CSV file with exactly 1 column which is defined as a "numeric(19,5)" for statistical uploads.
Condition 1: If the input file has 25 rows with the only value being the digit zero, followed by a number with a decimal (like '1.23456'), the data is imported perfectly.
Condition 2: If the input file has 26 rows with the only value being the digit zero (or any non-decimal number), followed by a number with a decimal (like '1.23456'), the data is imported and the last (and any subsequent) rows have the field imported with truncated decimal digits (like '1.00000').
Condition 3: If the input file has 26 rows and the first row is '0.0', all of the rows are imported perfectly.
I've checked the table/field definition for Condition 2 and it's the same as 1 and 3 so there's no manipulation of the field type (as far as I can tell).
Re: the numbers
The test only had a single column, although I first noticed the behavior with 100 columns. Each row has only one value (zero) until the 27th row, which is the decimal number, i.e. 0 0 0 ... 0 1.23456
Re: other questions
1) The destination was SQL Server 2008 and the SQLBulkCopy came from the .Net 4.5 library. This brings up the question: "Is the SQLBulkCopy the culprit or the destination SQL Server instance?"
2) Example data: see above
3) Relevant code:
using (SqlConnection connection = new SqlConnection(strConn))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    try
    {
        bulkCopy.DestinationTableName = strSQL_table_name;
        bulkCopy.BatchSize = GetBulkCopySize();
        bulkCopy.BulkCopyTimeout = 2000;
        bulkCopy.NotifyAfter = GetNotifyAfter();
        bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);
        // connection.Open() and the bulkCopy.WriteToServer(...) call follow here
    }
    catch (Exception ex)
    {
        MessageBox.Show("Error(CSV_To_SQL-a): " + ex.Message);
    }
}
4) single column in a table, name="BigNumeric", data type="numeric(18,5)"
The column still gets typed as an Int32 by FillSchema(); then I try to change it to typeof(Decimal), followed by Fill(), and the decimal digits are still truncated in the resulting DataTable. I even tried to set the DataType to String, but that didn't work either.
Next, I discovered another practically hidden Microsoft "feature" called TypeGuessRows (and IMEX=1), which supposedly can be modified in the registry or in the OleDbConnection string. I haven't gotten it to work yet, but at least I know that others have seen this behavior too.
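For the Jet/ACE text driver there is also a per-folder schema.ini file that can override the type guessing. I believe the driver scans only the first 25 rows by default (MaxScanRows), which would line up with the 25/26-row boundary observed above. A sketch, with the CSV file name assumed (only the column name "BigNumeric" comes from the posts above):

```ini
; schema.ini placed in the same folder as the CSV file.
; MaxScanRows=0 scans the whole file instead of only the first rows;
; declaring the column type explicitly skips the guessing entirely.
[stats.csv]
Format=CSVDelimited
ColNameHeader=True
MaxScanRows=0
Col1=BigNumeric Double
```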
I then open SSMS on another PC and attempt to connect to my IP, which fails with:
“A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections.”
Do I need to do something at my router page to allow access to the server? If so, what?
Do keep in mind that there are people out there running port scans in their free time. You might find strange things happening in your database if you hook it up to the internet without some precautions;
Set the thing to "Windows Authentication" and turn the SA account off. There's probably a checklist for this on Google; I haven't checked, but you might want to.
Bastard Programmer from Hell if you can't read my code, try converting it here[^]
When your server is behind the router, it has a "private" IP address, doesn't it? Then things get complicated. You could create a VPN, so that home and office are virtually in the same network (same range of IP addresses), or you have to configure port forwarding on your router. I cannot give details on how to do that...
The above is a "small" question about something I'm at a dead end with at the moment. How do I design a database to store all kinds of addresses from all over the world? All countries have different address specifications. Some countries have more address fields than others, and/or a different structure.
This is just a fraction of a larger problem I'm facing. I need to redesign a database that is country specific into a global design. Addresses might be the easiest thing to globalize.
The answer is to normalize properly:
- One table for countries.
- One for address types, such as delivery addresses or box addresses and so on, per country.
- One for address parts, with one row per street, house number, town, zip and so on.
- Another one holding the combination of address parts and address types, with ordering, or rather placement on the letter.
- One table to hold the addresses as entities, with an address type reference, not the actual data. This table could of course be combined with a person or company if that's fitting, but I would split it, as a company might have more than one address.
- And lastly, one holding the actual data, with one row per address type part and address.
Make a model of this and check if I've forgotten anything.
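A minimal DDL sketch of that layout; all table and column names are my own invention, one possible reading of the description above:

```sql
-- Hypothetical names throughout; a sketch, not a finished model.
CREATE TABLE Country (
    CountryId INT PRIMARY KEY,
    Name      NVARCHAR(100));

CREATE TABLE AddressType (                -- delivery address, box address, ...
    AddressTypeId INT PRIMARY KEY,
    CountryId     INT REFERENCES Country(CountryId),
    Name          NVARCHAR(50));

CREATE TABLE AddressPart (                -- street, house number, town, zip, ...
    AddressPartId INT PRIMARY KEY,
    Name          NVARCHAR(50));

CREATE TABLE AddressTypePart (            -- which parts a type uses, and where
    AddressTypeId INT REFERENCES AddressType(AddressTypeId),
    AddressPartId INT REFERENCES AddressPart(AddressPartId),
    Placement     INT,                    -- position on the letter
    PRIMARY KEY (AddressTypeId, AddressPartId));

CREATE TABLE Address (                    -- the address as an entity, no data
    AddressId     INT PRIMARY KEY,
    AddressTypeId INT REFERENCES AddressType(AddressTypeId));

CREATE TABLE AddressData (                -- the actual values
    AddressId     INT REFERENCES Address(AddressId),
    AddressPartId INT REFERENCES AddressPart(AddressPartId),
    Value         NVARCHAR(200),
    PRIMARY KEY (AddressId, AddressPartId));
```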
Well there are of course different ways. One way is to create a dynamic database:
Create an address table with the fields: DataId INT, FieldName VARCHAR(X), FieldValue VARCHAR(Y) or BINARY. The index is (DataId, FieldName).
Instead of storing "New York" in a column "CITY", you store "City" in "FieldName" and "New York" in "FieldValue".
You can also define field groups, like "Address", which may occur more than once, by adding another column Line INT; that allows you to store an Address 1, Address 2, ..., Address n, and of course a Phone 1, Phone 2, ..., Phone n.
If n isn't large enough, you can even store x values.
This way ain't efficient for tables with a lot of complex queries, but I think you don't need complex queries here, so the response time should be good enough in this case.
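The attribute-value idea above can be sketched like this; the names follow the description, and the sample row is illustrative:

```sql
-- One generic table instead of one column per address field.
CREATE TABLE AddressData (
    DataId     INT,             -- which person/company the rows belong to
    FieldName  VARCHAR(50),     -- e.g. 'City', 'Phone'
    Line       INT,             -- 1..n for repeated groups (Address 1, Address 2, ...)
    FieldValue VARCHAR(200));

CREATE INDEX IX_AddressData ON AddressData (DataId, FieldName);

-- 'New York' becomes a row, not a value in a CITY column:
INSERT INTO AddressData (DataId, FieldName, Line, FieldValue)
VALUES (1, 'City', 1, 'New York');
```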