|
To make a blanket statement that 2 weeks to do a code review is wrong is, well... wrong.
What if it is a nuclear or other critical environment?
Having said that, the code review time frame should be as fast as or faster than development. If the code review is taking longer, then management needs to adjust priorities.
I am working on a project that has multiple event-based calculations; the process was falling behind because the calculations could not be processed as fast as they were received. We had to rework the calculations to get them processed in a more efficient manner... you have the same situation: your 'events' are code waiting to be reviewed; your 'review' is your calculation... something needs to be adjusted.
|
|
|
|
|
Yeah, the tech lead told us code reviews are important and must be done first and quickly.
I must insist on reviews being resolved with the other developer when the code is submitted.
Plus there is a developer who is particularly slow at them... and he is going to another team!
|
|
|
|
|
Tim Carmichael wrote: What if it is a nuclear or other critical environment?
Wow, I can't imagine a code review taking more than a couple of hours, tops.
How long does a nuclear code review take, if you don't mind me asking?
|
|
|
|
|
Think of the complexity of a simple change and everything it touches. Remember DLL hell? Imagine that at a nuclear site; the change must be reviewed from a code perspective, but it must also be reviewed from the perspective of what it can impact.
What happens if the code is put into production with an error? So... a two-week review is not unheard of... there is a reason nuclear sites are not running the latest and greatest versions of operating systems.
Now, from a colleague who worked in nuclear...
So, for example, when I would write an engineering change (EC) package... you are required to get a peer review - they would formally document their review and you had to document how you dispositioned each of their comments. After they were happy with how you addressed their comments, they would sign off on your package. Reviewing the package would take 1/3 to 2/3 of the time it took to write it.
Because your review is your signature on the work, just as if you had done it yourself. That is what I know as a true peer review.
After you get a peer review, you then have to get discipline-specific reviews done. For example, you have to get a 10CFR50.69 review, an I&C Engineering review, an Operations review, a Maintenance review, sometimes even Chemistry and Radiation, depending on what I was putting in and who was impacted by my change to the plant design.
You have to disposition the comments for each person and they are captured in the package.
|
|
|
|
|
Wow! I am so glad I don't work on nuclear projects.
|
|
|
|
|
I've worked in medical devices for the past 10 years. The longest code review we ever did was a day, and that included checking out the code and performing a full regression test with the physical devices.
Small checkins are good, large checkins are bad. Merging can be challenging enough with a small checkin, depending on what's being changed. Then add a two week time frame and that's a huge disaster waiting to happen.
Two weeks? Time to bring this up with the manager or whoever is in charge. That's ridiculous.
"Computer games don't affect kids; I mean if Pac-Man affected us as kids, we'd all be running around in darkened rooms, munching magic pills and listening to repetitive electronic music."
-- Marcus Brigstocke, British Comedian
|
|
|
|
|
A 2-week code review is your impediment; you can ask your boss to remove it. Be agile.
|
|
|
|
|
The flaw in the current process is that the code should really be reviewed "post-integration", not pre-integration. What if the review is fine, but due to the two-week lag the merging/integration introduces a defect? What good did the review do?
"Works on my branch" will become the new mantra.
Maybe all of the merging needs to happen on a "review" branch.
Items are migrated from "review" to "dev" in the order they are committed on "review".
Or else flip it.
Integrate on "dev", migrate via code review to "review" in commit order, and build masters from "review" instead of "dev".
If someone really wants the fifth package in the "queue" to make it into a build, the preceding four packages will have to be reviewed first.
Another approach would be to perform weekly reviews post label/milestone/sprint. Depending on the churn in certain files, some people would end up reviewing impacts from multiple packages at one time, but I do not see why that would be a problem. This might be more efficient vs. making developers switch more frequently between writing/reviewing.
Also, per another comment, it sounds like the team is not blocking off enough time to perform reviews. For 3 days of coding, you probably need 1 day of review.
Make Monday review day... no new dev work until the review backlog is cleared out.
|
|
|
|
|
Change your organization OR Change your organization (Fix it, or leave, LOL).
Okay, 2 weeks is insane. I would BEG at the minimum for a Peer-Review to get things moving.
And a more complete review later (could it be that you are NEW?)
When we get new developers, we do DAILY code reviews with them, and do NOT let them check anything in unless reviewed. But this is done more as mentoring and training to get them up to speed, and to identify their strengths and weaknesses...
Is this temporary?
|
|
|
|
|
You're not preventing the incompatible changes, just putting the burden on your teammate.
|
|
|
|
|
I like to do backups where I am completely in control, so I don't think that the standard model for backups would work. That said, I want to have copies of the latest files that I have updated or downloaded saved somewhere in case disaster strikes by my hard drive crashing or getting hit with ransomware.
My idea of this backup application would be something that runs in the background and keeps track of any new files that are created, or of existing files that have been modified, and keeps a list. When I am ready to run a backup, it gives me a ListControl with checkboxes that lets me select/deselect, and then simply copies the checked files to some backup repository, perhaps with the existing directory structure. Of course, setting up which directories to watch would allow the Windows directory, etc. to be skipped.
So what this would mean is that I'd have an external drive with a bunch of directories that include all the files that had been tagged and copied over from all the backup executions. Whenever I would do a complete backup (i.e., on my own), I would be able to delete this, but if disaster were to strike in between, at least I would have copies somewhere, such that I could piece together that and a standard backup and not lose anything.
It seems to me that with ransomware, the key is to have the backups not connected to the system, as anything that is connected would get encrypted - and so this ideal backup application would allow for backups to be done to a given directory that would not have the other files.
I don't know - maybe I just need to spend a lot of time researching this, but I don't like to just add applications willy-nilly as that tends to screw up the registry or worse, install cr@pware. I figure that I could roll my own in VC#, but I have a hunch that someone has had the same feeling as me about the ideal backup and has already developed this.
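FWIW, here is a minimal C# sketch of the background-tracking half of that idea (the ChangeTracker name is my own invention, the checkbox UI is left out, and real code would need filtering, buffering of rapid-fire events, and error handling):

using System.Collections.Concurrent;
using System.IO;

// Sketch: watch a directory tree and remember which files were created
// or modified, so a later backup pass can copy just those files.
class ChangeTracker
{
    // Dictionary used as a set: a file changed ten times is still listed once.
    private readonly ConcurrentDictionary<string, byte> pending =
        new ConcurrentDictionary<string, byte>();
    private readonly FileSystemWatcher watcher;
    private readonly string root;

    public ChangeTracker(string rootToWatch)
    {
        root = rootToWatch;
        watcher = new FileSystemWatcher(root)
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite
        };
        watcher.Created += (s, e) => pending[e.FullPath] = 0;
        watcher.Changed += (s, e) => pending[e.FullPath] = 0;
        watcher.EnableRaisingEvents = true;
    }

    // Copy every tracked file under backupRoot, preserving the directory
    // structure relative to the watched root, then forget it.
    public void BackupTo(string backupRoot)
    {
        foreach (string file in pending.Keys)
        {
            if (!File.Exists(file)) continue;         // deleted in the meantime
            string relative = file.Substring(root.Length).TrimStart('\\');
            string target = Path.Combine(backupRoot, relative);
            Directory.CreateDirectory(Path.GetDirectoryName(target));
            File.Copy(file, target, true);            // overwrite older copy
            byte ignored;
            pending.TryRemove(file, out ignored);
        }
    }
}

The ListControl with checkboxes would then just bind to the pending list and pass only the checked entries to the copy step, instead of copying everything as this sketch does.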
|
|
|
|
|
|
swampwiz wrote: My idea of this backup application would be something that runs in the background and keeps track of any new files that are created, or of existing files that have been modified, and keeps a list, so that when I am ready to run a backup, it gives me a ListControl with checkboxes that allows the user to select/deselect, and then simply copies the checked files to some backup repository, perhaps with the existing directory structure.
SVN?
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Nelek wrote: SVN?
Now there's an idea -- create a repo at c:\, include everything. I wonder if GitHub would complain about a 100GB repo?
Marc
Latest Article - Create a Dockerized Python Fiddle Web App
Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny
Artificial intelligence is the only remedy for natural stupidity. - CDP1802
|
|
|
|
|
I am not sure if that would be that safe, but I was thinking of:
- A local home server with SVN (or similar) in the intranet (but nowhere mapped as a network drive)
- A repository for the data partition of the laptops
- A local periodic task on the laptops changing the IP to the intranet and doing a commit of all deltas in the data partitions. When finished, back to DHCP for the internet connection.
- Periodic backups of the laptops to a second intranet network drive: change IP, NET USE to connect, do backup, NET USE delete, change IP back
- Periodic backups of the SVN and the laptop images within the server from one drive / partition to another
- Sporadic backups of the second drive of the server to an external USB device (normally unplugged)
Not started trying it yet :S
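The connect-copy-disconnect step could be as simple as this rough C# sketch (the share name \\homeserver\backup and the D:\data path are hypothetical placeholders, and the netsh IP switching is left out):

using System.Diagnostics;

// Sketch of the "NET USE to connect, do backup, NET USE delete" step.
class IntranetBackup
{
    static int Run(string fileName, string arguments)
    {
        var psi = new ProcessStartInfo(fileName, arguments)
        {
            UseShellExecute = false,
            CreateNoWindow = true
        };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
            return p.ExitCode;
        }
    }

    static void Main()
    {
        // Map the intranet share only for the duration of the backup.
        Run("net", @"use X: \\homeserver\backup");
        try
        {
            // Mirror the data partition; robocopy exit codes below 8 mean success.
            Run("robocopy", @"D:\data X:\laptop1 /MIR /R:1 /W:1");
        }
        finally
        {
            // Always unmap again so the share stays unreachable afterwards.
            Run("net", "use X: /delete /y");
        }
    }
}

Keeping the share mapped for only those few seconds is what gives you some of the ransomware protection you are after.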
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Zoolz is offering a free 75 GB cloud account with no credit-card data entry required. But you have to have a "business" e-mail address to register, i.e., not a gmail-type address: [^].
I'm just trying it out, and it appears their Win local client offers the fine-grained control over backups you are after... however, that's a quick initial impression.
cheers, Bill
«Beauty is in the eye of the beholder, and it may be necessary from time to time to give a stupid or misinformed beholder a black eye.» Miss Piggy
|
|
|
|
|
|
I look forward to the article and your code
«Beauty is in the eye of the beholder, and it may be necessary from time to time to give a stupid or misinformed beholder a black eye.» Miss Piggy
|
|
|
|
|
That is pretty much what I did: just a bunch of bat files that only run when a certain drive is attached and do a robocopy of the folders I want backed up. Works for me.
To err is human; to really mess up you need a computer
|
|
|
|
|
Likewise. A combination of svn, robocopy mirror and grandfather/father/son works for me.
We're philosophical about power outages here. A.C. come, A.C. go.
|
|
|
|
|
Microsoft SyncToy?
Robocopy??
I use SyncToy and ExpanDrive to back up to Google or Amazon.
|
|
|
|
|
Take a look at the CopyTo synchroniser. I've used this for a long time to replicate partial directory structures, collections of files, etc. to external devices or across networks. It has umpteen modes of operation based on changes / new / old / size comparisons (including binary compare if you don't mind the time it takes). It allows manual intervention per file and interactive changing of the direction of copy, and it copes with DST changes, etc. It has a somewhat quirky interface, and it does take a bit of experimenting to find the best settings, how to save settings once established and so on, but it really does what it says on the tin.
It hasn't been updated since 2010, but then again it hasn't needed to be, and it works fine on the latest Win10.
There are other tools that are better at keeping directories etc. in sync across devices and so forth (OwnCloud, for example), but if you want full atomic control over the files that get copied/moved/updated, this is hard to beat.
Kish utilities:
[^]
|
|
|
|
|
Hello, the same problem arose for me. MS Backup was no good because it does not do what I wanted - full control. So I wrote my own: it scans (initiated by me) all files (omitting specified dirs/files), compares them with the backed-up (external) copies, and if they differ, saves them. Backed-up files are NOT overwritten; they are saved by generation. The program is NOT perfect - but it has worked for a long time, always modified a little to add new functions. Written in FreePascal.
jordi@ponttrencat.es
|
|
|
|
|
I'm using Beyond Compare at the moment to back up to my NAS. Its directory sync mode works much like you describe, except it works out what is out of date when you are ready to back up rather than tracking this in the background. Once it has finished comparing the two copies (I set it to compare size / modified date-time), you can then mark folders you don't want to sync. Beyond Compare isn't free though ($30 for the standard version), but perhaps a similar folder-comparison tool is out there for free.
|
|
|
|
|
For private use I use the tool PureSync. It has the features I need: simple usage and some batching.
Press F1 for help or google it.
Greetings from Germany
|
|
|
|
|