It seems that no-one is willing to answer you, so I'll give it a shot.
1) Should we abandon the use of a local DB? Our customers are required to have a broadband connection, so speed wouldn't be a huge issue, I am concerned though about bandwidth usage.
Depends, mostly on the size of the recordsets you fetch. If it's limited to the data that's going to be displayed, then broadband is broad enough. If you're going to perform lots of manipulations on the data while moving it back and forth, it might be more performant to do it locally.
You might want to consider a hybrid: keep the data 'in the cloud' (on your server), and put the tables that never change in a local database. Or synchronize the most-used data to a local cache - most reports don't require 'realtime' data and are quite happy with a database that's synchronized every now and then.
2) If we should keep the local DB, does WCF support this type of data model? I haven't been able to find any examples of people doing this (which is leading me to believe it's a bad idea)
Any "local" example would still treat the server as if it's "remote"; that's inherent to WCF, which is a communication layer. Without WCF, you'd be talking directly to the local database. That might be an option too; it's faster, but makes it harder to share data. I'd recommend it only for the reports, which could use a local (read-only) copy of the data.
3) Are there other alternatives to the Entity Framework and WCF that would better suit our application's needs?
An alternative to EF would be NHibernate, but EF is a bit more than just an ORM. I think the tools you chose are the best for the job, with the added suggestion to add EntLib to the mix of ingredients.
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
to make using/maintaining our application much easier.
I doubt that will happen.
Since you already have experience with the existing application, you know how it works. To re-write it you would need to learn one or more new technologies and correctly implement them. Even supposing everything else was equal, while you were learning and correctly implementing the new stuff the application would in fact be harder to maintain.
If and only if you do everything correctly, then in time it might be easier. I doubt, however, that it would be "much" easier. Although if the original application did not grow in a disciplined way and the new application does, then it could be. That, however, has nothing to do with technology.
3) We have a client database on every computer that stores a copy of the server data (this is to make reads faster).
Based on the rest of your post the real reason for this concern was probably network latency. Older databases didn't have problems with read speed.
I am concerned though about bandwidth usage.
That depends on application usage. Exactly what do your users do in their normal workflow? You can't answer the question without that knowledge, nor can you really architect a solution without it.
If we should keep the local DB, does WCF support this type of data model?
If your user base is really using different databases (which is different from a marketing claim that someone could if they wanted to), then you would need to craft a solution for this yourself. Your currently architected solution is still sufficient in that regard.
WCF doesn't provide anything specifically helpful for that.
Are there other alternatives to the Entity Framework and WCF that would better suit our application's needs?
I doubt that can be answered without more specific information - at a minimum, something about what exact traffic you anticipate. A medical imaging system is vastly different from a cash register app.
Each company can have location data on a number of different databases. Each location's data is only on 1 database.
Not exactly sure what that means. I also question whether your customers, or at least a significant number of them, actually do that.
If your customers really do operate that way, or if you really want to continue to market it that way (despite no one doing it), then it rules out database replication, which would normally be the way to do this.
I have taken ownership of a SQL 2008 database that has a huge number of rows with HTML content. I need to implement some new functionality that would let users intelligently search within the HTML of these records.
If you have a lot of spare time, you could build something similar yourself: fetch all the varchar fields that contain the text, count how often it occurs, add a bonus if the text occurs in a title, and you have an 'intelligent' search.
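A minimal sketch of that scoring idea (the `Record` fields and the title bonus of 5 are assumptions for illustration; real code would strip the HTML tags before counting):

```cpp
#include <string>

// Hypothetical record: one row's title and body text,
// assumed already stripped of HTML tags.
struct Record {
    std::string title;
    std::string body;
};

// Count non-overlapping occurrences of `term` in `text`.
int countOccurrences(const std::string& text, const std::string& term) {
    if (term.empty()) return 0;
    int count = 0;
    for (std::size_t pos = text.find(term); pos != std::string::npos;
         pos = text.find(term, pos + term.size()))
        ++count;
    return count;
}

// Naive relevance score: each hit in the body counts once,
// each hit in the title gets a bonus weight of 5.
int score(const Record& r, const std::string& term) {
    return countOccurrences(r.body, term) + 5 * countOccurrences(r.title, term);
}
```

Sort the records by that score descending and you have a crude ranked search. For anything serious, SQL Server's built-in full-text indexing will do this far better.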
Bastard Programmer from Hell
if you can't read my code, try converting it here[^]
Where "html" means what, exactly? More specifically, what type of data does the HTML represent? Whether it contains tables, versus pictures, versus blog posts impacts what one might need to focus on when searching.
I am working on a core project, a class library, and it is acting as a DAL for the client. The DAL fetches records from the database. I heard about MemoryCache, but once I've created a cache I want it to update itself if a new record is added to a table. Is there a way to have my cache recreated when a record is inserted into the table?
If it's ADO.NET and SQL Server 2005 or higher, you can define a "cache dependency" (SqlCacheDependency, backed by query notifications) that automatically refreshes cached stuff when it changes in the database. I suspect it relies on the fact that SQL Server 2005 can host the CLR to have a way to swap the ordinary roles (ordinarily we use database servers as... SERVERS, but here the *server* needs to notify a *client*, which simply isn't possible given the definitions of what a client and a server are).
If it's ADO.NET and an older SQL Server version, I believe there's a cache dependency that *might* help; it depends. That one uses a polling model, which means there's a delay involved, and dirty cache reads are still possible. If dirty reads cause serious problems, you can't use this. If seeing old data for a short time after it has really changed is OK, you probably can.
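The polling model itself is easy to sketch. Assuming some hypothetical cheap "version" query against the table (a trigger-maintained counter, say, or a MAX over a rowversion column), the cache only reloads when that marker changes; between polls, stale reads are exactly the dirty-read window described above:

```cpp
#include <functional>
#include <optional>
#include <string>

// Minimal sketch of a poll-based cache: the cached value is considered valid
// only while the observed "version" of the table is unchanged. Between polls,
// dirty reads are possible -- that's the delay mentioned above.
template <typename T>
class PolledCache {
public:
    PolledCache(std::function<long()> getVersion, std::function<T()> load)
        : getVersion_(std::move(getVersion)), load_(std::move(load)) {}

    const T& get() {
        long v = getVersion_();          // cheap change-marker query
        if (!value_ || v != version_) {  // (re)load on first use or on change
            value_ = load_();            // the expensive real query
            version_ = v;
        }
        return *value_;
    }

private:
    std::function<long()> getVersion_;  // e.g. a trigger-maintained counter
    std::function<T()> load_;           // the actual data fetch
    std::optional<T> value_;
    long version_ = -1;
};
```

In a real application you'd run the version check on a timer rather than on every `get()`, trading freshness for fewer round trips.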
Otherwise, you probably have to write the code yourself to keep the cache in sync. If the data is only modified by a single instance of your own application, this isn't really difficult to accomplish. If there are multiple instances it gets more complicated (unless the instances operate on isolated subsets of the data), since you now need some mechanism to keep the caches in sync; that also means a performance hit and may require distributed transactions.
Cache invalidation is one of the hardest problems in all of common practical programming, and in many cases there are better practical approaches. Rather than trying to solve the cache invalidation problem, it may be easier and good enough to combine imperfect caching with optimistic concurrency. Concurrency is an often-ignored aspect of multi-user systems that you should probably address anyway (most apps should, but few do). Depending on how you handle concurrency violations (e.g. you could let the user resolve conflicts), having this in place often makes it acceptable to live with some dirty cache data.
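A minimal sketch of the optimistic-concurrency part, using an in-memory stand-in for the table (the `Row`/`Table` names are made up for illustration; in a real database the stamp would typically be a rowversion/timestamp column checked in the UPDATE's WHERE clause):

```cpp
#include <map>
#include <string>

// Each row carries a version stamp; an update only succeeds if the caller
// still holds the version it originally read. A failed update is a
// concurrency violation the caller (or the user) must resolve.
struct Row {
    std::string data;
    int version = 0;
};

class Table {
public:
    Row read(int id) { return rows_[id]; }  // copy out data + version stamp

    // Returns false on a concurrency violation; the caller can then
    // re-read, merge, or ask the user to resolve the conflict.
    bool update(int id, const std::string& newData, int expectedVersion) {
        Row& row = rows_[id];
        if (row.version != expectedVersion) return false;  // someone else won
        row.data = newData;
        ++row.version;
        return true;
    }

private:
    std::map<int, Row> rows_;
};
```

With this in place, a stale cache read at worst produces a rejected update rather than silently overwriting someone else's change.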
You can use a timeout policy to prevent old data from sticking around for too long. Volatile data should perhaps not be cached at all. The ideal cache item is one that is expensive to get but changes infrequently; and never cache stuff that is inexpensive to get for very long. Saving a millisecond makes sense in something that happens often, but whether you save a millisecond every minute or every hour is not going to make any difference to the performance of your system. So in such cases, use a short timeout.
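A timeout policy is just an expiry timestamp stored next to each entry; a minimal sketch (names are illustrative, not any particular library's API):

```cpp
#include <chrono>
#include <optional>
#include <string>
#include <unordered_map>

using Clock = std::chrono::steady_clock;

// Cache with a per-item time-to-live: expensive, slowly changing items get a
// long TTL; volatile or cheap-to-fetch items a short one (or none at all).
class TtlCache {
public:
    void put(const std::string& key, const std::string& value,
             std::chrono::seconds ttl) {
        items_[key] = {value, Clock::now() + ttl};
    }

    // Returns nothing if the item is missing or has outlived its TTL.
    std::optional<std::string> get(const std::string& key) {
        auto it = items_.find(key);
        if (it == items_.end()) return std::nullopt;
        if (Clock::now() >= it->second.expires) {
            items_.erase(it);  // drop the stale entry
            return std::nullopt;
        }
        return it->second.value;
    }

private:
    struct Entry {
        std::string value;
        Clock::time_point expires;
    };
    std::unordered_map<std::string, Entry> items_;
};
```

A cache miss is then the signal to re-fetch from the database and `put` the fresh value back with an appropriate TTL.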
Also don't forget that there are frequently opportunities to influence how expensive stuff actually is to get. Look for opportunities to fetch (or compute) more than one thing at a time. Think about what information the application needs to get to show a much-accessed screen. Or better yet, run some profiling sessions and find out exactly what information the application gets, how often it gets *the same* information, and so on.
Let's see if this helps. I am reading data back from a device into a rich text box; this is all working fine, and I can save it out as a file fine. The data may contain multiple units, so I need to check the first 8 bits to see if they are the same. I can split them up fine; it is how to store them that I'm stuck on. I don't really want to use separate text boxes, so I was thinking of strings - would this be a dumb idea, as I don't want to cripple performance? Plus I don't really know how many separate readings are likely...
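If the first 8 characters of each line identify the unit (an assumption on my part - you said bits, but I suspect you mean characters), strings are fine: group the lines into a map keyed by that prefix, and you get one collection of readings per unit without any extra text boxes. A sketch:

```cpp
#include <map>
#include <sstream>
#include <string>
#include <vector>

// Split the captured text into lines and group them by their first
// 8 characters (assumed here to be the unit id). Each unit's readings
// end up in its own vector of strings.
std::map<std::string, std::vector<std::string>>
groupByUnit(const std::string& capture) {
    std::map<std::string, std::vector<std::string>> units;
    std::istringstream in(capture);
    std::string line;
    while (std::getline(in, line)) {
        if (line.size() < 8) continue;            // skip malformed lines
        units[line.substr(0, 8)].push_back(line); // key = first 8 chars
    }
    return units;
}
```

Performance-wise this is nothing to worry about at rich-text-box scale, and it copes naturally with not knowing the number of units in advance.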
I want to develop a model for pipe irrigation system optimization using a genetic algorithm. My boss wants me to use the MATLAB software, but I am new to it. I am asking if there is anyone who has an idea of how to use MATLAB for model development. Here is my email; if you have a document, you can mail it to me.
Before I ask a question, I should explain why I'm asking it and what for, to avoid flames, as the question is highly flammable.
I'm in software development for 20+ years. I used to work with numerous (over 18) languages, many OSes, frameworks, platforms, etc. Over time I grew tired of learning some "new exciting tool/language/platform/you name it" just to have it disappear a year or two later, or get superseded by something else. I have only one life and don't want to waste it on learning "fashion" things that come and go before you even notice them. Besides, over time I grew to appreciate only two things - performance and "code beauty" (such as readability and maintainability). So, my by far preferred choice is C++. Yes, it is over 40 years old, but I bet it'll stay here for at least the same amount of time. STL earned its way into the same league of "yes, we are here forever" (although I'm not a big fan of it, it is well worth learning and sometimes even using). And, of course, SQL. The new guy on the block is OpenCL, which has been fighting with CUDA for quite some time now, and has finally started to reach the same status as C++, STL and SQL.
I want to make myself clear - this is my personal choice, and I'm not advocating anybody else to agree with me, I just want to explain the starting point for my question.
I'm undertaking a new project right now with two goals in mind. The first one is obvious - if I succeed, I won't need to work for somebody else ever again. The second goal is - if I fail, then at least I will have learned something new to make me more marketable, as in "able to find a decent and fun job with high pay". And it is this second goal that gives me a headache right now.
The major part of the project is "number crunching", so I'm fine with what I already know and have. But a minor part of the project involves GUI, and that is where I was hoping to learn something new AND have the value of that skill be retainable for quite some time. I used to work with Delphi/VCL (which I deem a very, very good attempt at being a perfect framework), PowerBuilder (well, not quite a framework, but not too bad either, and for displaying large amounts of data it was the fastest thing I ever saw), and the last few years I have worked extensively with MFC/ATL (which I deem the ugliest thing ever invented), plus some other libs/frameworks that are not even worth mentioning.
So, I set myself on a search for a new GUI framework that "is here to stay". I code 99.99% of the time for Windows, so I don't care about other platforms. And, yes, I use MSVS - sorry, I like it the most. Call it personal preference, but I'd like to stick with it.
First, I checked Borland's (now Embarcadero) C++ Builder. Took it off the list - you can't use VCL in MSVS, and they don't even have a 64-bit compiler (which is an absolute must these days).
Then, I checked Qt. Oh, I have wanted to try it for some time now, and believe me, I spent a week reading, reading and reading about Qt 5, trying to convince myself to go this route - after all, they've been around long enough and are very successful. But sorry, it looks like I have to take this framework off the list also. And the reason is - they made a move that indicates the beginning of the end of Qt. You probably heard it: QML. Jeez, I've seen a zillion attempts by big corporations to introduce a language/tool/methodology to "simplify" programming or design, so they can hire low-paid people off the street with no skills and have them produce programs using this new thing. All of those attempts failed, and failed miserably. And the reason is well versed in this quote: "make a tool that even a fool can use, and only a fool would use it". Well, whatever the reasoning behind the choice the Qt devs made, it takes Qt out of future desktop development. Yes, it will stay around for probably another 5-10 years, but starting this summer (with the introduction of QtQuick 2) this is a dying framework (from a C++ point of view).
Then I saw Ultimate++. Looks excellent - nice, neat and fast code - and it has been around for 11 years, stable enough. Two problems: first, I can't use MSVS with it. Second: well, with all due respect, just how many U++ jobs are out there? Yes, zero.
So, am I stuck with MFC/ATL/WTL? Or can someone recommend a framework/library that "is here to stay" and has "hire-able" value? I really want to find something, as MFC causes me a PITA every time I have to use it.
P.S. Yes, I'm aware of WPF and WinForms, but no, with managed C++ they are not worth it.
recommend a framework/library that "is here to stay" and has a "hire-able" value?
You are asking people to foretell the future. How can we know what may be in vogue next month or next year, or what experience employers will be looking for? My best advice would be to move away from C++ and look at WPF, HTML 5, etc.; but then, what do I know?
One of these days I'm going to think of a really clever signature.
Well, you are right to some extent - nobody can foresee the future. For example, if someone had asked me the same question six months ago, I'd have said: go Qt. And then they blew up. On the other hand, it's 2012, and it amazes me that there is still no widespread, good, "industry standard" C++ GUI framework. Or at least I don't know of one.
MFC/ATL is the "industry standard" (on Windows), but it is by no means "good". Qt met these requirements for quite some time, but not anymore. So I'm kind of clueless at the moment.
As for HTML5, for example - I can code it; I also know Java, C# and many other things. Yes, they are in demand, but, as I tried to explain, I can't make myself have fun with them - not after working with C++. Yes, to someone struggling to find any job at all I might look nitpicking and arrogant, but as long as I have a choice, I'd like to stick to something that gives me a sense of inner satisfaction and joy. Language-wise, that happened to be C++, so I'll try to ignore other things for as long as I can.
How do you define "good"? Lots of people use either one of these and are quite happy with them; others use WinForms in C# or VB.NET, still others use WPF.
Well, personally (and others have other views, I'm sure) I deem framework "good" when I:
1) don't need documentation to use it, i.e. it is self-explanatory, and usage is obvious with no hidden surprises. If you need to read a hundred doc pages before writing "HelloWorld", it is not good;
2) spend my time using framework, instead of fighting with it, trying to overcome shortcomings/bad design;
3) can use it in any way I see feasible, not only the "one" way the authors force you to go. For example, MFC is bound to the Doc/View paradigm. As soon as you even think of going another way, you are penalized. I mean, it is still doable, but with much more effort than it could and should take.
As for WinForms or WPF - they are very good if you use C# (or another .NET language). If I went with C#, I'd jump on WinForms, and there would be no question. But the question is about a C++ framework...
For example, I'd say that Win32++ is close to ideal, with two exceptions. The first one is minor - the author uses the same names as MFC, so you can't mix'n'match the two, but this is really minor. The second exception is very big, and is the very same as the U++ problem - who uses it? Very, very few people, so even though I like this one, it is of no use for my goal.
When I programmed in C++, I never felt comfortable with creating a GUI. What about being a little schizophrenic: stay with C++ for most programming purposes, and use Managed C++ (Windows Forms) for the GUI?
Well, I tried to explain my goals. For the needs of the project I can use anything available - Qt, Win32++, or even the straight Win API - it'll all work out fine, as the GUI that I need is very minor. But the goal is not just to use some framework to create a GUI, but to learn it for good and then use it extensively on my next job(s), not jumping on a new framework every (half) year. I'm already at the age when a job should also bring fun, not just money. And Managed C++ is by no means "fun" - whoever designed it was really, really sick, and probably inspired by the Brainf.ck language. Basically, there is no beauty of C++ left in it, only "Managed" horror :(
Kosta Cherry wrote: And Managed C++ is by no means "fun"
Depends on your definition of fun. MC++ is obsolete btw; have you tried C++/CLI? I find C++/CLI very enjoyable, with the added satisfaction that I can mix in native C++ code like no other managed environment would let me do. For a good amount of time I struggled to find a decent standardized C++ framework for web applications, and even though there were some, I didn't like any of them (just like you); finally the idea of using C++/CLI to utilize the powerful .NET framework stuck. It's fun for me because with just a few lines of code I get to expose my native C++ functionality as WCF web services, from the same executable. It is marketable because you get to learn new .NET stuff as well as the intricacies of mixing with native code. Mastering WPF, even with C++/CLI, will be profitable because WPF is going to stay, especially with Metro apps coming out in the upcoming Windows 8 and the huge possibilities they open up for the Windows App Store.
A downside, however, is that Mono still does not support C++/CLI, so it's not portable yet.