Computers have been "online" since before 1980. Subscribers were playing games through billed services, and bulletin-board posters were putting their peers on alert, for years right up to the turn of the century, when the cloud, the latest morph/changeling of that proud parent "the internet", was born. Since then the whole business of "connectivity" has blossomed into a billion-dollar industry, and everybody has welcomed another venue for exploiting this electronic gene pool. But computers, despite what one reads now, are calculating devices at a level just above electronics. And hopefully you'd agree with me, being a coder and a CPian, that apart from the ethernet card or serial port, growth in this technology comes down to speed and volume.
CPU and memory. Faster and more of something. And possibly smaller footprint.
Now look at the cloud. What is it? A storage device. It's an external "hard drive" for the computer. Instead of adding a drive to my stack, I add a drive at some external location? It seems more like an unknown X on a map, where I have little influence over maintenance and security. The average dunderheaded cloud user (the one who signs up to store his stuff: his apps, his information, the things he's going to make available to others) could just as easily plop another hard drive into his cabinet and have all his stuff locally, his way, when he wants it, out from under any eyes but his own, for a decade until it's full.
It's a conservative spin on an alarmingly liberal communicative yackety-yack of nonsense-for-the-sake-of-connected-nonsense, and I will stand behind it. Frankly, no one's going to chuck the hulking 100 lb stack of hard drives, the gigantic 6.0 GHz CPU, and the 256 GB of RAM they call "MyComputer" just because "cloud" storage is free now.
I lie prone on my belly with my thirty-ought-six and a full magazine, waiting for the enemy to come up over the trench. There are many trees in view in every direction along the horizon, most defoliated by previous weapons fire. I can hear the enemy coming my way; his tromping and plodding as he approaches signals that I must steel my nerves. I aim by dropping my eye to the gunsight.
I squeeze the trigger at the first sight of a beret.
That is a meaningless question because "optimal" is a subjective term.
Obviously one difference is that if the hosting provider is responsible for the database, then they ensure that the server is up, that it is backed up, that it has reasonable security restrictions, and so on. The downside is that it will be limited in some way.
Many hosting companies are now offering virtual private servers (VPS).
I am planning to book a VPS for one month to see whether the performance is OK.
To avoid writing a special application, are there tools that can be easily installed and that can benchmark the server?
My main doubts concern two things:
For the CPU, I'd like to verify that the frequency offered is real. So if an offer says that a VPS will be given a 2.2 GHz CPU, I'd like an application that repeats the execution of some code over a long duration and then reports the execution speed and how that speed evolves over time.
I fear that the CPU is shared with applications running on other VPSes.
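For questions like the one above, a dedicated tool such as sysbench is the usual choice on Linux, but a rough consistency check can also be scripted in a few lines of Python. The script below is only an illustrative sketch, not a standard benchmark; the loop body and durations are arbitrary choices of mine:

```python
import time

def cpu_rate(duration=0.5):
    """Count iterations of a fixed arithmetic loop completed in
    `duration` seconds -- a rough proxy for usable CPU speed."""
    end = time.perf_counter() + duration
    iterations = 0
    x = 1.0001
    while time.perf_counter() < end:
        x = x * 1.0000001 % 10.0  # fixed floating-point work per iteration
        iterations += 1
    return iterations

# Sample repeatedly; large dips between samples suggest noisy
# neighbours stealing CPU time on the same physical host.
samples = [cpu_rate(0.2) for _ in range(5)]
print(min(samples), max(samples))
```

Run over hours rather than seconds, a wide spread between the minimum and maximum sample would hint that the advertised frequency is not consistently available to your VPS.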
Concerning memory: RAM is really expensive, so for example 2 GB is the most one can afford. However, hosting companies say they provide SSD storage, so when swapping occurs it goes to the SSD rather than to a spinning disk.
I wonder if any benchmarking tool would also be able to test the performance of this SSD.
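For disk, `fio` is the standard Linux benchmarking tool. Failing that, a crude sequential-write probe can be sketched in Python; the sizes below are arbitrary, and note that zero-filled data can look unrealistically fast on compressing filesystems, so treat the figure as a rough lower-bound check only:

```python
import os
import tempfile
import time

def write_throughput(size_mb=64, block_kb=1024):
    """Write `size_mb` MB of zeroes to a temp file, fsync, and
    return the approximate sequential write rate in MB/s."""
    block = b"\0" * (block_kb * 1024)
    blocks = (size_mb * 1024) // block_kb
    with tempfile.NamedTemporaryFile(delete=False) as f:
        start = time.perf_counter()
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force the data out to the device
        elapsed = time.perf_counter() - start
        path = f.name
    os.remove(path)
    return size_mb / elapsed

print(round(write_throughput(16), 1), "MB/s")
```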
I hope the tools run on Linux.
Push Framework - now released!
I am trying to FTP to our QA server using an FTP tool (FileZilla) set up on my laptop. The QA server is literally located outside the state I live in, outside our DMZ. The QA server runs Windows Server 2008.
I can see that the FTP server and FTP service are "installed" on the QA server. But when I use commands from my DOS prompt, like
ftp> open <qa server name>
it does not come back with a username and password prompt.
The user account I am using is an Admin account on the server.
Is there a way to check if the FTP service is enabled?
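On the server itself, `sc query ftpsvc` from a command prompt will show whether the FTP service is running (the service name varies between IIS versions, so that name is an assumption). From the laptop, the quickest remote test is whether TCP port 21 answers at all. A small Python sketch, where "qa-server" is a placeholder for your actual host name:

```python
import socket

def port_open(host, port=21, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds.
    If it fails, the FTP service is stopped or a firewall
    (e.g. at the DMZ boundary) is blocking the port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# "qa-server" is hypothetical; substitute your server's name or IP.
print(port_open("qa-server", 21))
```

If the port is closed from outside but open locally on the server, suspect a firewall rule rather than the FTP service itself.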
That depends. Are all the files that will ever be stored on your FTP site open to the public? If so, allowing anonymous read access shouldn't be a problem. You only need to restrict read access if your site hosts files which are private or confidential.
Anonymous write access, on the other hand, is a Very Bad Idea™. Unless, of course, you actually want your server's hard-drive filled with viruses, pirated software and illegal pornography.
Why can I see only the c:\inetpub\ftproot folder when I connect remotely?
Because that's the directory your FTP site is pointing to. If you want it to point to another root directory, go to Advanced Settings, and change the Physical Path.
Alternatively, you could add the other folders as virtual directories - for example, you could have ftp.yoursite.com/somefiles pointing to E:\FTP\Public Files.
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
Many ISPs block HTTP ports. If yours doesn't, you're lucky. You'll also need to configure port forwarding in your router so that HTTP requests received on your public IP address are forwarded to your machine.