|
This is not the soapbox, so I will not write anything about what happened when the cop said to the driver: "Then you will have to blow."
The language is JavaScript: that of Mordor, which I will not utter here.
This is JavaScript. If you put big wheels and a racing stripe on a golf cart, it's still a f***ing golf cart.
"I don't know, extraterrestrial?"
"You mean like from space?"
"No, from Canada."
If software development were a circus, we would all be the clowns.
|
|
|
|
|
A blonde got pulled over. The blonde police officer walked up to the car and asked for ID.
The driver searched frantically in her purse for a while and finally said to the blonde policewoman, "What does a driver's license look like?"
Irritated, the blonde cop said, "Don't be a smartass! It's got your picture on it!"
The blonde driver frantically searched her purse again and found a small, rectangular mirror down at the bottom. She held it up to her face and said, "Aha! This must be my driver's license," then handed it to the policewoman.
The cop looked in the mirror, handed it back to the driver and said, "You're free to go. And if I had known you were a police officer too, we could have avoided all of this."
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
|
|
|
|
I posted before I saw your reply, but here it is in picture form: Click
|
|
|
|
|
Reminds me of this one
|
|
|
|
|
|
Cop: Got any ID?
Me: 'bout what?
Rules for playing with JavaScript frameworks:
1. You can't win.
2. You can't break even.
3. You can't get out of the game.
|
|
|
|
|
Hello all,
We have some big files that need to be sent from a remote computer to the company server each day.
The work computer has a 10 Mbps DSL connection and the server has a 50 Mbps optical fiber connection.
The remote worker is not tech savvy.
I can imagine three ways of handling that:
1. Store the files on the local computer and send them to the server each day via a scheduled backup that runs automatically at a set hour. The user can shut the computer down in the evening.
2. Store the files on the local computer and synchronize them automatically using the kind of "Google Drive" client that the server natively includes.
3. Keep the files on the remote server and access them remotely via WebDAV or similar.
Knowing that... what would you choose? And why?
Thank you very much!
|
|
|
|
|
A 10 Mbps DSL connection is enough to RDP over. Unless they're doing multimedia work... problem solved.
Jeremy Falcon
|
|
|
|
|
Thank you... it looks like a good suggestion.
|
|
|
|
|
You're totally welcome.
Jeremy Falcon
|
|
|
|
|
And think about security. The company's data remains at the company, it is never stored at the remote worker's home. For that alone, I prefer RDP for all remote workers by FAR.
|
|
|
|
|
Yeah, that too.
Jeremy Falcon
|
|
|
|
|
Agreed. It's how I work from home: a Citrix connection to a remote terminal, which RDPs onto my desktop machine (it could just as easily be a VM). No files ever have to reach my home computer and none need transferring back; everything is kept secure in the company network. Both screens on my office PC are visible on both screens at home. Works very well.
|
|
|
|
|
Also, just how big and how many files are we talking about here?
Jeremy Falcon
|
|
|
|
|
3.19GB
11300 files
Yes, probably the remote connection is the way to go...
I had not thought about that...
|
|
|
|
|
It's not so bad either once you get into the swing of it. It's much like programming in a VM if your connection is solid enough.
If they were technical, it would be a whole different story with available options. But since they aren't...
Jeremy Falcon
|
|
|
|
|
I did something like that with an overseas connection a few years ago, between Houston and France. I did some testing with the connection using various strategies: we tried copying files in native form and also dumped into 'split' zip files, with a scheduled batch file doing the hard work. The zip file strategy got the best throughput if we were copying all files, but our files typically had less than 1% change per 24-hour period... so we used "xcopy /d", which copies only files newer than the destination copy (and does not delete files on the remote side that get deleted on the local side).
There was a considerable difference in elapsed time on whether we used xcopy on the sending end or receiving end. I've forgotten now which was faster but I think we did one of the following:
xcopy /a /d /s /e /c /i /q /h /r /k /o /x /y D:\localfiles \\remote\blahblah
xcopy /a /d /s /e /c /i /q /h /r /k /o /x /y \\remote\blahblah D:\localfiles
I'm retired. There's a nap for that...
- Harvey
|
|
|
|
|
For future reference, you should look into something like the rsync protocol (usually available by default on linux, but there are versions for windows too).
This is designed to synchronise large files over slow connections, and hence trades CPU power for line speed: it hashes the contents of large files and only sends the changed blocks within them. Thus, if you have a 4 GB file where only 10 bytes get added every day, instead of transmitting the whole file each time it is updated, it will only sync the block(s) that contain the changed bytes.
I've used this to keep many GB of data backed up off-site when I only had a 512k upload speed to do it. The initial backup can take a long time, but once it's done...
|
|
|
|
|
We were not interested in continuous updates, but periodic updates upon demand. My understanding of rsync is that it runs continuously.
Also, we weren't dealing with large files, but a mix, mostly small files.
I'm retired. There's a nap for that...
- Harvey
|
|
|
|
|
Fair enough. My experience is that rsync still works well. You can schedule when it runs easily enough - backups were only done at night to avoid loading the slow broadband during the day when it was in use for news feeds etc.
|
|
|
|
|
We work using remote desktop (Germany <-> Asia) and it goes pretty well.
For many small files it's good enough. For big files it's a PITA, though.
M.D.V.
If something has a solution... why worry about it? If it has no solution... why worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Have you considered using Dropbox or one of the other "cloud" drives? They keep the content synchronized pretty continuously.
|
|
|
|
|
That's the second option in the original post.
The server has built-in "Dropbox"-like software that allows that.
Thank you!
|
|
|
|
|
RDP is an excellent tool, BUT if you decide to allow remote working, then have an FTP server (I use FileZilla) at the hosting end and use something like WinSCP to feed the modified files back.
WinSCP runs a sync across the FTP protocol and therefore only transfers modified files, and it can be run multiple times to make sure that transfers aborted by PC restarts get completed.
You can run over SSL/TLS, and provided you have decent usernames and passwords it should be reasonably secure.
For added/alternative security, factor in a VPN.
FileZilla also allows you to set different transfer rates for different times of day, so the DSL wouldn't be maxed out at busy times.
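For what it's worth, the WinSCP side can be scripted so the non-technical worker never has to drive it by hand. A rough sketch of such a script (host, credentials, and paths are invented; `synchronize remote` is WinSCP's one-way local-to-server sync command, and the file would be run from a scheduled task with `winscp.com /script=sync.txt`):

```text
# sync.txt - hypothetical host and paths; use ftps:// (or sftp://) for
# an encrypted connection rather than plain ftp://.
open ftps://worker:secret@files.example.com/
synchronize remote C:\DailyFiles /incoming/dailyfiles
exit
```

Re-running the script after an interrupted transfer just picks up whichever files are still out of date.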
|
|
|
|
|
I'll have to look into the VPN option (which sounds good if it increases security), but RDP (VNC, TeamViewer, the one that comes with Windows...) will help a lot. Then no files need to be stored on the remote computer, as everything will always stay on the server...
Thank you!
|
|
|
|