I truly appreciate CodeProject; it has come to my aid too many times to count. Workspaces had some distance to go before becoming a 'GitHub', but I really liked the direction Workspaces was moving in, and that I was going to be able to get a group together working on projects in private spaces, then publish articles and make the repositories publicly available. (Eventually I'd have liked to set up publishing to CodePlex.) Anyway, I understand that sometimes initiatives don't end up being core competencies of an organization, and I look forward to seeing where your efforts go.
We have no active subscriptions in our PayPal account (all cancelled). You can confirm this on your end by checking your PayPal account for any recurring subscriptions: look on the "My Pre-approved Payments" page in your PayPal profile.
CodeProject is great for writing and publishing articles; GitHub/Git is great for hosting code.
Have you ever evaluated some sort of integration?
For example, instead of manually uploading a zip file, you could provide a way to pull it from a Git repository. That way, updating the source code would be really fast and easy.
A similar solution could work for the article text (a Markdown file inside the repository?).
I say this because I hate having to keep my articles up to date on CodeProject (cleaning the project, zipping it, uploading it, modifying the link, ...). Usually I simply put a header that points to the official GitHub repository and never update the article on CodeProject.
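A rough sketch of what such a pull-from-repo integration might rely on. The helper names here are illustrative, not an existing CodeProject API, but the URL shapes are GitHub's standard conventions for downloading a repository archive or a single raw file (such as the article's Markdown):

```python
def archive_url(owner: str, repo: str, ref: str = "master") -> str:
    """Build the URL of GitHub's auto-generated zip archive for a repository."""
    return f"https://github.com/{owner}/{repo}/archive/{ref}.zip"

def raw_file_url(owner: str, repo: str, path: str, ref: str = "master") -> str:
    """Build the raw-content URL for one file, e.g. the article's Markdown source."""
    return f"https://raw.githubusercontent.com/{owner}/{repo}/{ref}/{path}"
```

An article updater could then fetch `archive_url(owner, repo)` instead of asking the author to re-zip and re-upload, and render `raw_file_url(owner, repo, "article.md")` as the article body.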
One of the big issues with having 10M members is names. Everyone has a name, and most want to use their name, or at least something vaguely resembling it. The issue is that Real Names are messy, human things meant for messy human things, and are terrible as a way to label things in a way that makes it easy (for a programmer) to reference that name within text or in a URI. We can't have //www.codeproject.com/members/Chris Maunder because the HTTP spec doesn't allow spaces in URIs, nor can we confidently say Chris Maunder refers to me in text, because it could also refer to someone named Chris who is rambling incoherently[^].
So we have display names as a way to label your content such as posts and articles, and we have usernames as a way to provide a human-readable and programmer-parsable handle to your account: //www.codeproject.com/members/chris-maunder as a link to you, and @chris-maunder as a reference to you in messages.
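The exact rules the site uses to derive a username aren't spelled out here, but the transformation described (spaces become hyphens, lowercase in the URL) might be sketched like this; the function name and the character-stripping rule are assumptions for illustration:

```python
import re

def to_username(display_name: str) -> str:
    """Collapse a display name into a hyphenated handle: 'Chris Maunder' -> 'chris-maunder'."""
    # Drop anything that isn't a letter, digit, space or hyphen (an assumed rule),
    # then replace runs of whitespace with a single hyphen and lowercase the result.
    cleaned = re.sub(r"[^A-Za-z0-9 -]", "", display_name).strip()
    return re.sub(r"\s+", "-", cleaned).lower()
```

So `to_username("Chris Maunder")` gives `"chris-maunder"`, matching the URL form above.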
You don't need to use @username, and you can safely ignore the feature if it bugs you. However, if you like the convenience, a member's username can be found on their profile page or in the popup that appears when you hover over their name in the forums (assuming you have "Profile Popups" enabled in the forums).
We've finished reworking our caching of forums and articles and are happy to see load times for forums go from half a second to 6 milliseconds. That's beyond what we thought we'd get. Start diving deep into messages from the days of yore and load times don't appreciably change. We've essentially opened up the entire corpus of forum postings for instant retrieval, slain a number of bugs, thrown out pages of code in the process, and reduced our database load by a factor of three. It's almost idling now.
On the article side of things we've improved performance even more and have one final push, after which time we'll hunt down and nuke any remaining load issues.
For a decade we've been working against a local cache on each of the webservers. This meant that we either had to keep the time-to-live short, or we had to work out a sensible way of ensuring that when a member changes an article on one server, and is then directed to another server, they see their updated information - even though the local caches didn't talk to one another.
Yes: distributed caching is a solved problem, but there weren't many canned solutions when we started, and we did end up doing some clever things to ensure it all looked sensible, give or take some "expected" caching issues such as a deleted article still occasionally being around for 10 or so minutes. "Expected" really comes down to what is forgivable, and in this day and age even stuff like that stretches the friendship, so we've finally had a chance to bite the bullet, plug up the local cache and add a couple of Redis[^] servers. We're using the ServiceStack Redis client[^] and implemented - fairly easily - a distributed cache that not only solves our cache-sync issues but speeds up application spool-up time, since the cache is off-server and independent of the webservers themselves. No need to recache on startup - the data's already there.
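The cache-aside pattern described above can be sketched as follows. This is a simplified stand-in rather than the actual ServiceStack-based code: a plain in-memory dict plays the role of the shared Redis servers, so every "web server" sees the same entries and the same invalidations, and nothing needs to be re-cached at startup:

```python
import time

class SharedCache:
    """Cache-aside sketch: one shared store standing in for the off-server Redis cache."""

    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def get_or_add(self, key, ttl_seconds, loader):
        """Return the cached value, or call loader() (e.g. a database query) on a miss."""
        entry = self._store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]                       # cache hit: no database round trip
        value = loader()                          # miss or expired: load from the source
        self._store[key] = (value, now + ttl_seconds)
        return value

    def invalidate(self, key):
        # An edit handled by any web server removes the entry for all of them at once,
        # which is the cache-sync problem the local per-server caches couldn't solve.
        self._store.pop(key, None)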
We are, obviously, seeing our cache load times go up, since it's no longer a local cache but requires a network round trip plus serialisation, but the overall database load is nicely down and our code is far cleaner.
I was sick of moving between computers, sick of the power outages in our building knocking me off my machine, and sick of having to Remote Desktop from home to my office machine to be productive. I'd set up various machines that would allow me to do the basics when I had to use them (eg for travel) but it's never the same. Like sleeping in someone's spare room - no matter how comfortable, it's never quite the same.
I figured that laptops these days were pretty damn powerful and after road testing[^] a couple of Ultrabooks I decided that anything Core i7 with 8GB RAM would be more than enough for me. All I needed was something that would let me install Windows 7, something that was light, and something that had a big, fast SSD.
Enter the mid-2013 Macbook Air. Core i7, 8GB RAM, and the fastest 256GB SSD around.
To cut to the chase: it's an excellent dev machine and is faster than my 4 year old quad i7 desktop. I'm seriously impressed. I'm now able to work on a single machine anywhere in the world without having to compromise by switching to a slower machine for travelling, and I have the added bonus that I no longer need a desktop for the office and a laptop for travel. A single unit does the trick.
The annoying bits
It's a Mac. Apple did not go out of their way to make the Bootcamp experience exceptional. The trackpad sucks in Windows, yet it's by far my favourite trackpad when in MacOS. It's brilliant. Trackpad++ sort of fixes this, though.
I tried using Parallels to create a VM from my Bootcamp partition in order to run VS while in the Mac environment. This was great, and you get the proper trackpad experience, but the big glaring issue was that I needed to use a USB DisplayLink adapter to hook up to an external monitor, and installing DisplayLink drivers in Bootcamp and then running it under Parallels causes the Windows VM to bluescreen. Parallels is aware of the issue and had no plans at the time to do anything about it.
So I stick to Bootcamp or MacOS and never the twain shall meet.
Docking stations became a big issue because I need a lot of screen real estate. I hate cable spaghetti, though, and tried a number of options before settling on one that gives me almost everything for the (ironically) cheapest price: a Thunderbolt display.
Thunderbolt displays are expensive. However, they come with a split thunderbolt / power adapter that plugs into the thunderbolt port on one side and provides a power cable to the laptop on the other side. Within the thunderbolt display are a pair of excellent speakers, a webcam, USB 3.0 ports, and gigabit ethernet. It's essentially a fully self-contained docking station built into one of the nicest monitors I've ever used, and with the 27" running 2560 x 1440, it allows me to run VS on one half and SQL MS or Chrome or anything else on the other half in the same manner that I'd previously been using two separate screens.
So factor in the cost of two 19" screens, a docking station ($250 - $300) plus speakers / external webcam + cables and you'll find that a refurbished 27" thunderbolt display is way cheaper, far more convenient and (for me at least) a much nicer experience.
The drawback is that Windows doesn't play well with thunderbolt, and you may have to physically shut down your machine before unplugging the monitor if you have the monitor set as your primary display. Further, you need to plug the monitor in before you boot up a Windows box, because Windows only scans for thunderbolt on bootup. This is really, really annoying.
The only other annoying bit is fan noise. I hammer that poor little laptop, and in a quiet room at 2AM when you're building code and running a zillion unit tests, the thing really winds up and gets a bit rowdy. I'm still waiting to see what Apple does with the 13" Macbook Pro, since a quad core Haswell unit could have a little more headroom before it starts to get hot and bothered - or at the very least it'll be done with its tasks sooner, meaning noise for a shorter time. A retina display would be nice, but totally not needed; the added weight, though, is a real issue. A touchscreen - while something I've grown to love with the Ultrabooks - is a complete waste for me. The laptop sits by my monitor, closed, while I work. I have no desire to put fingerprints all over my big display, and after my experiences with the Perceptual Computing Challenge I know how tired arms get after spending even short periods trying to navigate with your arms up.
Overall a 7/10.
- single machine wherever I am in the world
- excellent setup with the external thunderbolt display
- built in UPS. Love it.
- Totally fast enough.
- Windows issues with thunderbolt connections
- Noisy when hot and bothered
- Did I really say a computer was "fast enough"? I lied. No such thing.
We skipped VS2012 / .NET 4.5 and jumped straight over to VS 2013 / .NET 4.5.1 because, y'know, it's far more exciting running your production servers on beta software rather than on the boring "tested" stuff.
We're not using any of the fun stuff explicitly, yet, but Matthew has already been eyeing off a bunch of code that could do with some async action. The thing that's most immediate to me is the multi-core JIT and startup time; all cores are actually getting used, CPU usage is up where it should be, and the site spools up much, much more nicely than it ever has. Simply getting the advantages of the framework improvements is (almost) enough for me.
The other obvious timesaver is build time: much, much faster than VS2010, even when bogged down with all the other stuff I have open. I'm developing, testing, and running the site on my Macbook Air on Win7. There's the VS IDE, SQL Server Management Studio, IIS running the actual site, Outlook groaning under the weight of a 23GB PST, various Word docs and spreadsheets, 6 remote desktop windows and half a dozen browser windows, and it's all humming along nicely.
Although sometimes (especially during compile time) the humming sounds suspiciously like the Mac's fan is about to attempt takeoff. It gets disturbingly loud.
I'm still not taken with the new VS look - a little harsh, a little lacking in warmth - but it's way faster and, so far, more stable than my old creaking install of VS2010.
As to my experiment with moving my developer life onto a tiny, ultralight laptop: the jury's still out. 7/10 so far.
I got sick of typing URLs for members and so, well, I coded.
To provide a link to another member just use the tried and true @username syntax, where the username is the username generated from their name (or manually modified) in the form first-last. Everyone's profile shows the username just under their profile image.
So if I want to shout out to a ray of sunshine I can just go @Michael-Martin (no link - just type that literally) and when the message is saved the link is generated.
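The transformation on save might look something like the sketch below. The site's actual implementation isn't shown (and presumably also handles things like email addresses and escaping), so treat the regex and function name as illustrative; the profile URL shape is the one quoted earlier in this post:

```python
import re

MENTION = re.compile(r"@([A-Za-z0-9-]+)")

def linkify_mentions(message: str) -> str:
    """Replace @first-last mentions with links to the member's profile page."""
    return MENTION.sub(
        lambda m: f'<a href="//www.codeproject.com/members/{m.group(1).lower()}">{m.group(1)}</a>',
        message,
    )
```

So typing `@Michael-Martin` in a message would, on save, become a link to //www.codeproject.com/members/michael-martin.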
Obviously this is opening a can of worms and I know the next two requests. Yes, soon.