Hello everyone,
I am developing an ASP.NET MVC 4 application with SQL Server 2008. I have written a method in the controller, and I want to send it a parameter from my page (e.g. Index.cshtml) and retrieve the results to display on the same page.
Do you have an idea or approach?
Thank you
---
Nope.
I have absolutely no idea what you are talking about!
Remember that we can't see your screen, access your HDD, or read your mind; we only get exactly what you tell us to work from. Explain in detail what you have tried, where you are stuck, and what you want help with, and perhaps supply relevant code fragments?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
---
Member 12165973 wrote: "do you have an idea or approach?"
Yes, post your question in the ASP.NET forum where people will be able to give you more suggestions.
---
You can post the values in a form and retrieve them in the controller.
Example:
First Name:
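The form markup in the example above appears to have been stripped when the page was captured. A minimal sketch of the idea, with hypothetical names throughout (`HomeController`, `Index` action, `firstName` field), might look like this:

```csharp
// Controller action (hypothetical names): receives the posted value,
// queries the database, and returns the results to the same view.
public class HomeController : Controller
{
    [HttpPost]
    public ActionResult Index(string firstName)
    {
        // Query SQL Server here with the posted parameter
        // (e.g. via ADO.NET or Entity Framework), then pass
        // the results to the view.
        ViewBag.Result = "You searched for: " + firstName;
        return View();
    }
}

// Index.cshtml would post back with something like:
//   @using (Html.BeginForm("Index", "Home", FormMethod.Post))
//   {
//       First Name: @Html.TextBox("firstName")
//       <input type="submit" value="Search" />
//   }
```

MVC's model binding matches the form field name (`firstName`) to the action parameter of the same name, so no manual request parsing is needed.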
---
Hi all,
I know that I write crappy code, but the code snippet below takes 16 seconds on my machine in debug and that is unacceptable.
Please help me to speed it up.
string valueString = string.Empty;
int length = AtronBytes.Count - 39;
for (int i = 0; i < length; i += 8)
{
    int colNum;
    if (i == 0) colNum = 1;
    else colNum = i / 8 + 1;

    string column = machineDataColumnNames[colNum];
    byte[] valueByte = tempAtronBytes.GetRange(0, 8).ToArray();
    byte chkVal = Extentions.CalculateOpdrChecksum(tempAtronBytes.GetRange(0, 7).ToArray());
    if (chkVal == tempAtronBytes[7])
    {
        valueString = Encoding.ASCII.GetString(
            tempAtronBytes.GetRange(0, 7).ToArray()).Trim();
        tempAtronBytes.RemoveRange(0, 8);
        SqlCeExtentions.ZakdataDatabase_Update(column, valueString, opdrString);
        logAppend(column.PadRight(30), valueString);
        if (colNum > 146)
        {
            logResult();
        }
    }
    else
    {
        MessageBox.Show("Checksum error");
        return;
    }
}
Thanks,
Groover
0200 A9 23
0202 8D 01 80
0205 00
---
You need to look closely at your numbers, and at the other code involved.
You say the loop runs for 1107 iterations and that ZakdataDatabase_Update takes 87ms; if your code called the method each time round, it wouldn't be 16 seconds, it would be 1107 * 0.087 seconds: 96 seconds. So the chances are that it isn't calling it every time, but that the slowdown is in that method. At a guess, it might be worth building up a table of updates and carrying them out as a single bulk operation, or at the very least looking at what that method does and how it does it to see if you can bring that time down.
[edit]Gah! I got the numbers wrong (I missed the "i += 8"), which reduces it to 12 seconds. Still the same problem and solution, though![/edit]
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
modified 8-Apr-16 7:32am.
---
for (int i = 0; i < length; i += 8)
-> 1176 / 8 = 147 loop executions
SqlCeExtentions.ZakdataDatabase_Update(column, valueString, opdrString);
-> 147 * 87ms = 12.8 seconds
the code snippet below takes 16 seconds
-> 12.8s * 100% / 16s = 80%
The database call accounts for 80% of the time spent in that code. You don't need to optimize the code that you've shown here; you need to optimize the database call. You should probably build a batch statement and use it to update the database only once, after the loop is done.
If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson
---
I missed the "i += 8" - your numbers are right, mine are wrong. But we both think it's the same thing, which is good.
You get my upvote and I've edited mine!
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
---
Thank you for your fast response.
I should have done the math myself!
Unfortunately there is no bulk copy for SQL Server Compact Edition, so the only way I can think of is to show the data in the richTextBox first, insert it into the database on a different thread, and notify the user when it has finished successfully. Or is there another way?
Groover,
0200 A9 23
0202 8D 01 80
0205 00
---
To be honest, I didn't give much thought to the fact that you're using SqlCe. (If you're planning on using this application for several years, you might want to consider a different database, because SqlCe is deprecated: there won't be any new versions, and support is running out soon-ish.)
I don't really know the restrictions of SqlCe compared to SQL Server, but Google seems to tell me that SqlCe supports neither bulk copy nor batch statements, which is what I was originally thinking about. FYI, in "full" database systems you can execute multiple statements at once by sending them as one command text ("batch"), separated with semicolons; e.g. this could be executed with a single command on SQL Server:
UPDATE myTable SET col1 = @p1 WHERE id = @p2;
UPDATE myTable SET col1 = @p3 WHERE id = @p4;
UPDATE myTable SET col1 = @p5 WHERE id = @p6;
But SqlCe should at least allow you to execute multiple commands (each with one update-statement) within one transaction, which would be the next best solution and should also speed up the process greatly so that you wouldn't have to resort to creating a separate thread for that.
I found two projects which might be helpful (though the former suggestion shouldn't be hard to implement):
1) SQL Compact Bulk Insert Library - Home[^] (Apparently only for inserting, not updating)
2) GitHub - lokiworld/Lyare.SqlServerCe.MultiQuery: An extension to the Microsoft SQL Server Compact Edition client library, which simulates the support of the multiple statements in a single command.[^]
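The single-transaction idea mentioned above might look roughly like this. This is only a sketch under assumptions: the connection string, the `Zakdata` table layout, and the `pendingUpdates` list (collected during the loop instead of writing each row immediately) are all hypothetical names, not from the original code:

```csharp
using (var conn = new SqlCeConnection(connectionString)) // hypothetical connection string
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        using (var cmd = conn.CreateCommand())
        {
            cmd.Transaction = tx;
            // Assumed schema; adapt to whatever ZakdataDatabase_Update actually does.
            cmd.CommandText = "UPDATE Zakdata SET Value = @val WHERE ColumnName = @col";
            cmd.Parameters.Add("@val", SqlDbType.NVarChar);
            cmd.Parameters.Add("@col", SqlDbType.NVarChar);

            // pendingUpdates: the (column, value) pairs gathered in the loop.
            foreach (var update in pendingUpdates)
            {
                cmd.Parameters["@val"].Value = update.Value;
                cmd.Parameters["@col"].Value = update.Column;
                cmd.ExecuteNonQuery();
            }
        }
        tx.Commit(); // one commit for all rows instead of one per statement
    }
}
```

Committing once at the end avoids paying the per-statement commit overhead 147 times, which is where most of the 12.8 seconds goes.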
If the brain were so simple we could understand it, we would be so simple we couldn't. — Lyall Watson
---
I haven't used SQLCE for some time, but I do use the alternative, SQLite. SQLite is slow if you update (or insert) in a loop, unless you wrap the loop in a transaction.
I'd also recommend using a List<object> to hold the data, and not a RichTextBox, which needs to be redrawn on each modification.
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
---
You should "count" the number of database inserts / accesses; I think it makes it easier to contemplate the thing.
For example, using Entity Framework and SQL Server Express, I can sustain over 1000 table inserts per second while loading entire object graphs.
---
I followed your advice and went for an SQLite database with Entity Framework 6, and the total time needed for the same code is now 1400 ms vs 16 seconds before!
It took some time to learn and implement EF (hence the late reply), but it was worth the effort.
Thank You,
Groover
0200 A9 23
0202 8D 01 80
0205 00
---
Glad it worked out for you!
Didn't mean to say you should use EF; just that more "benchmark numbers" would be useful.
But yes; EF lets us think a little less about data access strategies and avoid some mistakes that can cause performance problems.
If you haven't already, these were good starting links for EF:
Entity Framework (EF) Documentation[^]
---
Only a detail: replace
    for (int i = 0; i < length; i += 8)
    {
        int colNum;
        if (i == 0) colNum = 1;
        else colNum = i / 8 + 1;
with
    int colNum = 0;
    for (int i = 0; i < length; i += 8)
    {
        colNum++;
It will not really change the overall timing, but it should improve things a little.
Patrice
“Everything should be made as simple as possible, but no simpler.” Albert Einstein
---
You are absolutely right; I did mention that I write crappy code.
I changed it to:
    int colNum = 0;
    for (int i = 0; i < length; i += 8)
    {
        colNum = i / 8 + 1;
Zero divided by 8 gives zero, so there's no need to check for that case.
Thanks,
Groover
0200 A9 23
0202 8D 01 80
0205 00
---
Your change is correct, but mine is faster.
Patrice
“Everything should be made as simple as possible, but no simpler.” Albert Einstein
---
Hi guys,
So I have an object that I'm exposing via .NET remoting (for historical / political reasons I can't just drop this and go to something more modern), which I use to process a list of items, one call per item in my list. The remoting server is set up as Single call rather than Singleton, and so my understanding is that I should get a new instance of the server class on each invocation - i.e. for each item in my list.
This seems to work just fine for small datasets, but when I push it a little by processing a list containing approx. 200K items, it reaches around 19K processed, then I see a timeout and the call fails with a socket exception: "A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond". The next call then succeeds and processing continues... but then it fails again later.
There seems to be no significance in the number of records processed - this varies from around 19K to 19.1k and fails at different points and with different records.
Additionally, when I run the system in multi-threaded mode (Parallel.ForEach() & MaxDegreeOfParallelism == -1), then I'm seeing the same "profile" of failures, but instead of a single record fail followed by the next "working", I'm seeing about 10 failures before it comes back to life - which leads me to believe it's something like the server actually dying and being re-instantiated.
Anyone have any ideas? I can batch out the data and get a "new" remoting connection if I create a factory to do so (I'm injecting my remoting client), but this seems like avoiding the problem rather than understanding the cause and fixing it. Ultimately this will be used to process > 1.2M records
Cheers guys!
C# has already designed away most of the tedium of C++.
---
I would pass the whole lot of records at one time. Not only is it going fubar after 19K records, but if you process 1.2M records, imagine how much time the network latency between each record adds.
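Sketching that idea: instead of one remoting round-trip per item, the server interface could take a chunk at a time. The interface and type names here are hypothetical, not from the original code:

```csharp
// Before: one round-trip per record; network latency is paid 1.2M times.
public interface IItemProcessor
{
    Result Process(Item item);
}

// After: one round-trip per batch (e.g. a few thousand records at a time),
// so latency is paid once per batch instead of once per item.
public interface IBatchItemProcessor
{
    Result[] ProcessBatch(Item[] items);
}
```

With a SingleCall server, each `ProcessBatch` call still gets a fresh server instance, so the batching change doesn't alter the activation semantics, only the call granularity.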
The difficult we do right away...
...the impossible takes slightly longer.
---
You should open a ticket with your network provider.
---
I am currently working on monitoring a folder on the desktop. When hex data files are dragged into the "Monitored" folder, it should automatically start to analyze the hex files and create the interpreted files in another folder.
I have created the Local Service process "Interpreter Service". The thing is, when I start the service, it abruptly stops running. Checking the Event Viewer, I found this error:
Service cannot be started. System.ArgumentException: The directory name C:\Users\RBWorkstation\Desktop\Monitored Folder\ is invalid.
   at System.IO.FileSystemWatcher.set_Path(String value)
   at InterpreterService.UM_InterpreterService.OnStart(String[] args)
   at System.ServiceProcess.ServiceBase.ServiceQueuedMainCallback(Object state)
The folder is right on my desktop! Does anyone have any suggestions on how to fix this so the service can find the folder?
Thanks!
~ Ron Boucher
---
What account is your service running as? Does it have access to the folders in your profile?
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
---
Can you help me clarify which "account" you are speaking of? I'm on the Administrator account.
---
Yes, but what account is the service running as?
When you created the service installer, you selected an account for the service to run as. That account needs to have sufficient permissions to access the folder it's trying to monitor.
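For reference, the account is set in the service's ProjectInstaller. A sketch, assuming the installer class and service name from the post; whether LocalSystem (or a dedicated user account) is appropriate depends on your security requirements:

```csharp
// ProjectInstaller for the Windows service (names assumed from the post).
// LocalService has very limited rights and typically cannot see folders
// under another user's profile, such as the desktop, which would explain
// the "directory name ... is invalid" ArgumentException from FileSystemWatcher.
public class ProjectInstaller : System.Configuration.Install.Installer
{
    public ProjectInstaller()
    {
        var processInstaller = new System.ServiceProcess.ServiceProcessInstaller
        {
            Account = System.ServiceProcess.ServiceAccount.LocalSystem
        };
        var serviceInstaller = new System.ServiceProcess.ServiceInstaller
        {
            ServiceName = "InterpreterService"
        };
        Installers.Add(processInstaller);
        Installers.Add(serviceInstaller);
    }
}
```

After changing the account, the service must be uninstalled and reinstalled for the new setting to take effect.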
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
---
As I said in the first post: Local Service.