That is a meaningless question because "optimal" is a subjective term.
Obviously one difference is that if the hosting provider is responsible for the database, then they ensure that the server is up, that it is backed up, that it has reasonable security restrictions, and so on. The downside is that it will be limited in some way.
Many hosting companies now offer virtual private servers (VPS).
I am planning to book a VPS for one month to see if the performance is OK.
To avoid writing a special application, are there any tools that can be easily installed and that can benchmark the server?
My main doubts concern two things:
For the CPU, I'd like to verify that the advertised frequency is real. So if an offer says a VPS gets a 2.2 GHz CPU, I'd like an application that repeats the execution of some code for a long duration and then reports the execution speed and how that speed evolves over time.
I fear that the CPU is shared with applications running on other VPSes.
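On Linux, ready-made tools like sysbench or stress-ng can run exactly this kind of repeated CPU test. As a rough illustration of the idea, here is a minimal Python sketch (the workload and round count are arbitrary choices, not a standard benchmark):

```python
import time

def workload(n=100_000):
    # Fixed CPU-bound task: sum of squares (arbitrary choice).
    return sum(i * i for i in range(n))

def measure(rounds=5):
    """Time the same workload repeatedly. On a contended VPS the
    per-round timings drift; on a dedicated core they stay flat."""
    timings = []
    for _ in range(rounds):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    for i, t in enumerate(measure(), 1):
        print(f"round {i}: {t:.4f}s")
```

Run it for a long period (e.g. rounds=10000) and plot the timings; large spikes suggest noisy neighbours stealing CPU time.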
Concerning memory: it is really very expensive, so 2 GB of RAM is about the most one can afford. However, hosting companies say that they provide SSD storage, so when swapping occurs it goes to the SSD rather than to a spinning disk.
I wonder if any benchmarking tool would also be able to test the performance of this SSD.
I hope such tools exist for Linux.
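For disk testing, fio and dd are the usual Linux tools (fio can also exercise random I/O, which matters for swap). As a very rough sketch of the sequential-write case only, in Python (sizes are arbitrary; make them larger than RAM caches for a realistic number):

```python
import os
import tempfile
import time

def disk_write_throughput(size_mb=64, block_kb=1024):
    """Write size_mb of data in block_kb chunks to a temp file
    and return the throughput in MB/s. fsync forces the data to
    actually hit the device rather than the page cache."""
    block = os.urandom(block_kb * 1024)
    blocks = size_mb * 1024 // block_kb
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
        start = time.perf_counter()
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())
        elapsed = time.perf_counter() - start
    os.remove(path)
    return size_mb / elapsed

if __name__ == "__main__":
    print(f"sequential write: {disk_write_throughput():.1f} MB/s")
```

Note this says nothing about swap latency specifically; swap performance is dominated by small random reads/writes, which fio's randrw mode models better.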
I am trying to FTP to our QA server using an FTP tool (FileZilla) set up on my laptop. The QA server is literally located out of the state I live in, outside our DMZ. The QA server runs Windows Server 2008.
I can see that the FTP server and FTP service are "installed" on the QA server. But when I use commands from my DOS prompt like
ftp> open <qa servername>
it does not come back with a username and password request.
The user account I am using is an Admin account on the server.
Is there a way to check if the FTP service is enabled?
That depends. Are all the files that will ever be stored on your FTP site open to the public? If so, allowing anonymous read access shouldn't be a problem. You only need to restrict read access if your site hosts files which are private or confidential.
Anonymous write access, on the other hand, is a Very Bad Idea™. Unless, of course, you actually want your server's hard-drive filled with viruses, pirated software and illegal pornography.
Why can I see only the c:\inetpub\ftproot folder when I connect remotely?
Because that's the directory your FTP site is pointing to. If you want it to point to another root directory, go to Advanced Settings, and change the Physical Path.
Alternatively, you could add the other folders as virtual directories - for example, you could have ftp.yoursite.com/somefiles pointing to E:\FTP\Public Files.
Many ISPs block HTTP ports.
If yours doesn't, you're lucky. You'll also need to configure port forwarding in your router so that HTTP requests received on your public IP address are forwarded to your machine.
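An easy way to verify the forwarding works is to run a throwaway HTTP server on the forwarded port and then browse to your public IP from outside your network (e.g. from a phone on mobile data). A minimal sketch, assuming you have forwarded port 8080 in the router (the port choice is arbitrary):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class PingHandler(BaseHTTPRequestHandler):
    """Answers every GET with a fixed plain-text body, so you can
    tell at a glance whether the request reached this machine."""
    def do_GET(self):
        body = b"port forwarding works"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run standalone (blocks forever; Ctrl-C to stop):
# HTTPServer(("0.0.0.0", 8080), PingHandler).serve_forever()
```

If it answers from inside your LAN but not from outside, the router forwarding rule (or the ISP block) is the culprit rather than your machine.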
TY. I agree absolutely. I knew that the biggest problem I had to date was not being able to run multiple sessions, but this was just odd. I did not even think to use the CMD prompt.
Once again Eddy, you helped me bunches.
Need an opinion. I have two websites that probably get well over 200 hits a year at least, if you include my visits. I have them hosted at a cheap ISP (reliable, though). It costs me about $90 a year... I think. They provide free Access and MySQL 4.x. The only drawback is that I am a web developer, one of the sites is a portfolio site, and it's a hassle that I don't have even the most basic SQL Server available ($50 a month from the ISP). I was considering hosting my own site. I can get a static IP address for $20 a month, so that would be a minor expense. I could get a cheapish notebook and put a small SSD drive in it so it would always be on with minimal power. With just Windows 7 Pro, I think I could have 10 simultaneous connections (gonna rule the web).
Sew... Can you suggest any pluses or drawbacks that I didn't think of?
PS. Reading the "Hosting and Servers" comment, I'll mention that years back I used to work for XXCal Test Labs. That was a place to work with smoky hardware. One test was to run 7 SCSI devices on a single controller. It produced unimaginable heat. On the other side, I know from experience how long it takes to dry a PC back into working order after your manager throws a cup of water into it. I also remember when we got the first "Yamato" notebook to test. That's "Thinkpad" to you. Those were fun days.
Heck, I was working for Juggy Tandon when he started selling the first 10 Megabyte hard drive. The Winchester! Twice as big as his first 5 Meg one. Ya know, the burn-in room in Simi Valley got hot one weekend and we fried 120 drives. I could go on, but you'd call me a fogey.
As a web developer, you should absolutely host your own website. I can think of a few good reasons for doing so:
0: The experience of configuring and maintaining a web server.
1: You can use third-party or custom libraries.
2: It's much more convenient to work with local files and databases.
On the downside, there are the issues of security and backups. If you host your own site, you should be prepared for attacks from the outside world. Follow best practices for security: rename or disable the Administrator account, use strong passwords, and so on.
You may want to look into IIS 8 Express to replace the 7.5 that comes with Win7. It's a little faster and has some extra security features.
I have one site web hosted and one self hosted. The web hosted site is mostly static pages and links to downloads (the heavy lifting). The self-hosted site serves out around 20 customer reporting applications and demos. Good luck!