Yes, I copy-pasted your code, substituting the domain name, to the live server - but see my other post in reply to this...
While the certificate is installed on this server (of course), the domain's DNS is handled by a third party, and it *seems* as though what is happening is that when the original request comes in, the DNS server is requesting authorisation from the certificate before passing the request on... is that how it works? It would explain it, but leave me in a fix...
The DNS server shouldn't be requesting anything from your site. It's simply an address book which maps an entry like www.mydomain.com to an IP address.
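For what it's worth, you can see everything DNS hands back with one line of Python (www.mydomain.com is a placeholder):

```python
# DNS resolution returns an address and nothing else; certificates never
# enter into it.
import socket
print(socket.gethostbyname("www.mydomain.com"))  # e.g. "203.0.113.10"
```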
From your other post, it sounds like the browser is making the request for https://mydomain.com, and then barfing when it gets a certificate for www.mydomain.com, since they might not be the same site.
Many certificate authorities will issue a certificate which is valid for both www.mydomain.com and mydomain.com, so it might be worth seeing if that's an option. Otherwise, you'll need a spare IP address and a second SSL cert to avoid this error. (You could probably get away with a free cert from StartSSL[^], since you're only securing a redirect to the real site.)
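If you want to confirm which names a given certificate actually covers, here's a minimal sketch using only the Python standard library (mydomain.com is a placeholder for your domain):

```python
import socket
import ssl

def cert_dns_names(host: str, port: int = 443):
    ctx = ssl.create_default_context()
    ctx.check_hostname = False           # inspect the cert even on a name mismatch
    ctx.verify_mode = ssl.CERT_REQUIRED  # but still require a CA-signed cert
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # subjectAltName holds the DNS names the certificate is valid for.
    return [v for k, v in cert.get("subjectAltName", ()) if k == "DNS"]

# If this prints only ['www.mydomain.com'], the bare domain isn't covered.
print(cert_dns_names("mydomain.com"))
```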
"These people looked deep within my soul and assigned me a number based on the order in which I joined." - Homer
Yes, I was afraid that might be the case. Given that it's pretty much a standard that mydomain.com and www.mydomain.com be the same site, you'd think this would be a standard too for SSL certificates. Sigh.
Ah... what's happening is that the browser still shows https://mydomain.com in the address bar and throws up a "This connection is untrusted" warning... but if you click through that, THEN you get redirected to https://www.mydomain.com, and all is well.
However, I need, of course, to bypass this warning...
How can I allow a user (a Domain User?) to log on to a DC and unlock user accounts?
I've created a Group called "SuperUsers".
I've added the SuperUsers group to Remote Desktop Users, added it to the "Allow log on through Terminal Services" option in Local Security Policy, and used Delegate Control... to grant Read/Write on the lockout properties of user accounts.
But when the user tries to start mmc.exe, they are asked for Administrator credentials.
I know that you can do it with the Remote Desktop Administrative Tools, but it's an assignment where it has to be done on the Domain Controller.
Edit: Wrong forum. How do I move it to "System Admin"?
I need to create an extra GPO which overrides the "Default Domain Controllers Policy", and add the SuperUsers group to the "Allow log on locally" setting under Policies -> Windows Settings -> Security Settings -> Local Policies -> User Rights Assignment.
Now a normal Domain User can be delegated the job of unlocking user accounts on the domain controller using Remote Desktop.
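As an aside, if you ever want to script the unlock step itself rather than clicking through the console, here's a sketch using the third-party ldap3 Python package (pip install ldap3); the DC name, credentials and DN are placeholders. Clearing lockoutTime to 0 is how AD unlocks an account:

```python
from ldap3 import Server, Connection, MODIFY_REPLACE

# Placeholder DC, account and DN; the delegated SuperUsers member needs
# write access to lockoutTime, which is what Delegate Control granted.
server = Server("dc01.example.local")
conn = Connection(server, user="EXAMPLE\\superuser", password="...",
                  auto_bind=True)

# Setting lockoutTime to 0 clears the lockout on the target account.
conn.modify("CN=Locked User,OU=Staff,DC=example,DC=local",
            {"lockoutTime": [(MODIFY_REPLACE, [0])]})
print(conn.result)
```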
Hello everyone! I need some guidance about web hosting and its techniques. I want to host a website that will charge visitors per click. Now the question is: how can I host the website so that it has its own domain name? And how can I be paid when a click is performed? Please help! Emmanuel.
I have an idea for a cloud software update service for desktop apps.
Most apps require updates, and I don't know of any flexible update systems (if you know of such solutions, please let me know).
It consists of two parts:
Update client. A software component that checks for updates (scheduled, on app start, etc.), downloads them, and performs the update. The update client is built as a multi-platform SDK that can easily be integrated into projects.
Update server. Holds updates in cloud storage. Has a web-based UI with access control, a products/releases dashboard, and client update logs/reports.
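To make the client side concrete, here is a minimal sketch of the check-for-updates step, assuming a simple JSON endpoint; the URL and field names are made up for illustration, not a real API:

```python
import json
import urllib.request

CURRENT_VERSION = "1.4.2"

def check_for_update(product_id: str):
    # Hypothetical endpoint; updates.example.com is a placeholder.
    url = f"https://updates.example.com/api/{product_id}/latest"
    with urllib.request.urlopen(url, timeout=10) as resp:
        latest = json.load(resp)  # e.g. {"version": "1.5.0", "url": "..."}
    # A real client would compare versions properly; string inequality
    # is enough for the sketch. Caller downloads and applies the update.
    return latest if latest["version"] != CURRENT_VERSION else None
```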
Here are some features of it:
- An SDK that can easily be integrated into .NET/C++/Java apps
- Customisable update client
- Differential updates. An update is a binary diff, so if a 20 MB file has only a few changes, only a few bytes are downloaded instead of the full 20 MB (see the sketch after this list)
- Client update logs and reports
- Multiple products and releases
- Update rollback
- Integrated with MS Visual Studio and Eclipse
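The differential-update item is straightforward to demo. A minimal sketch using the bsdiff4 Python package (pip install bsdiff4); the file names are placeholders:

```python
import bsdiff4

# Publisher side: ship only the delta between two releases.
bsdiff4.file_diff("app-1.0.bin", "app-1.1.bin", "app-1.0-to-1.1.patch")

# Client side: rebuild the new release from the old binary plus the
# (usually tiny) patch file.
bsdiff4.file_patch("app-1.0.bin", "app-1.1.bin", "app-1.0-to-1.1.patch")
```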
What do you think about it?
If you are interested in such service, please send me your comments and suggestions.
Don't take this the wrong way, but since you asked, and I get 10 points for answering (here in Discussions), and can now go on record as having answered a survey about a "trendy topic" of sorts...
This whole cloud paradigm is doomed to fail. There.
And, with any work on my part, it will fail sooner rather than later.
Computers have been "online" since before 1980. General subscribers have been playing games through billed services, and bulletin board posters have been putting their peers on alert, through the years right up to the turn of the century, when the latest morph/changeling of the proud parent "the internet", the cloud, was born. Since then the whole business of "connectivity" has blossomed into a billion-dollar industry, and everybody welcomed another venue to exploit this electronics gene pool. But computers, aside from what one reads now, are calculating devices at a level just above electronics. And hopefully you'd agree with me, being a coder and a CPian, that just to the left of any ethernet card or serial port device, the growth of this technology is about speed and volume.
CPU and memory. Faster and more of something. And possibly smaller footprint.
Now look at the cloud. What is it? A storage device. It's an external "hard drive" for the computer. Instead of adding a drive to my stack, I add a drive at an external location? Seems more like an unknown X on a map where I have little influence over maintenance and security. The average dunderheaded cloud user (the one who signs up to store stuff: his apps, his information, the things he's going to make available to others) could just as easily plop another hard drive into his cabinet and have all his stuff locally, his way, when he wants it, out from under any eyes but his own, for a decade until it's full.
It's a conservative spin on an alarmingly liberal communicative yackety-yack nonsense-for-the-sake-of-connected-nonsense that I will stand behind. Frankly, no one's going to chuck the hulking 100 lb stack of hard drives, the gigantic 6.0 GHz CPU, and the 256 GB of RAM they call "MyComputer" just because "cloud" storage is free now.
I lie prone on my belly with my thirty-ought-six and a full magazine, waiting for the enemy to come up over the trench. There are many trees in view in all directions on the horizon, most defoliated by previous weapons fire. I can hear the enemy coming my way; his tromping and plodding as he approaches signals that I must steel my nerves. I aim by dropping my eye to the gunsight.
I squeeze the trigger at the first sight of a beret.
That is a meaningless question because "optimal" is a subjective term.
Obviously one difference is that if the hosting provider is responsible for the database, then they ensure that the server is up, that it is backed up, that it has reasonable security restrictions, etc. The downside is that it will be limited in some way.
Many hosting companies are now offering virtual private servers (VPS).
I am planning to book a VPS for one month to see if the performance is OK.
To avoid writing a special application, are there tools that can be easily installed and that can benchmark the server?
My principal doubt concerns two things:
For the CPU, I'd like to verify that the advertised frequency is real. So if an offer says a VPS gets a 2.2 GHz CPU, I'd like an application that repeats the execution of some code for a long duration and then reports the execution speed and how that speed evolves over time (see the sketch below).
I fear that the CPU is shared with applications on other VPSes.
Concerning memory, it is really very expensive, so 2 GB of RAM, for example, is the most one can afford. However, hosting companies say they provide SSD storage, so when swapping occurs, it goes to the SSD and not to a spinning disk.
I wonder if any benchmarking tool would also be able to test the performance of this SSD.
I am hoping for tools that run on Linux.
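For the CPU question, even a hand-rolled probe will do. Here's a minimal Python sketch of the idea described above (a fixed workload, logging throughput over time) rather than any particular benchmarking product:

```python
# Run a fixed CPU workload repeatedly and log how throughput drifts,
# which exposes noisy neighbours or frequency throttling on a VPS.
import time

def workload():
    s = 0
    for i in range(1_000_000):
        s += i * i
    return s

start = time.time()
while time.time() - start < 3600:  # sample for an hour; adjust as needed
    t0 = time.perf_counter()
    workload()
    dt = time.perf_counter() - t0
    print(f"{time.time() - start:8.1f}s elapsed, {1.0 / dt:6.2f} runs/sec")
```

A steady runs/sec figure suggests you really have the CPU to yourself; large dips over time suggest contention with other guests.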
Push Framework - now released !
I am trying to FTP to our QA server, using an FTP tool (FileZilla) set up on my laptop. The QA server is literally located out of the state I live in, outside our DMZ. It runs Windows Server 2008.
I can see that the FTP server and FTP service are "installed" on the QA server. But when I use commands from my DOS prompt like
ftp> open <QA servername>
it does not come back with a username and password request.
The user account I am using is an Admin account on the server.
Is there a way to check if the FTP service is enabled?
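One quick way to probe it from the outside, sketched in Python (standard library only; the host name is a placeholder):

```python
import socket
from ftplib import FTP

host = "qa-server.example.com"  # placeholder for the QA server name

# 1. Is TCP port 21 reachable at all? If this times out, suspect a
#    firewall between you and the DMZ, or a stopped service, rather
#    than credentials.
socket.create_connection((host, 21), timeout=10).close()

# 2. Does the service speak FTP and accept a login?
ftp = FTP(host, timeout=10)
print(ftp.getwelcome())           # banner proves the service is running
ftp.login("youruser", "yourpass") # placeholder credentials
print(ftp.nlst())                 # simple directory listing as a sanity check
ftp.quit()
```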