|
good tool!
|
If you have multiple people editing a map at the same time, as long as it's well structured, you could gain some benefits from merge tools that are already available. With binary files you get no such benefit; it will always be last-write-wins.
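To make that concrete, here's a minimal sketch (plain Python, hypothetical map-file data) of why structured text can merge automatically while an opaque blob can't. Real tools like git's merge or diff3 do something far more sophisticated; this only handles the easy case where each side changed different lines:

```python
# Minimal three-way merge sketch: when two people edit *different* lines
# of a structured text file, both changes can be kept automatically.
def merge3(base, ours, theirs):
    """Merge line lists where at most one side changed each line."""
    merged = []
    for b, o, t in zip(base, ours, theirs):
        if o != b and t != b and o != t:
            raise ValueError(f"conflict on line: {b!r}")
        merged.append(o if o != b else t)
    return merged

base   = ["tile 0,0 = grass", "tile 0,1 = water", "tile 1,0 = rock"]
ours   = ["tile 0,0 = sand",  "tile 0,1 = water", "tile 1,0 = rock"]  # edited line 1
theirs = ["tile 0,0 = grass", "tile 0,1 = water", "tile 1,0 = lava"]  # edited line 3

print(merge3(base, ours, theirs))
# Both edits survive. With an opaque binary blob there are no "lines"
# to compare, so the only safe automatic outcome is last write wins.
```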
|
Maybe... but as I see it, you never merge files when you work with binary; you merge objects in memory.
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
I didn't mean in-app. I was thinking more along the lines of source control and working with contributors in a non-realtime setting.
|
Super Lloyd wrote: I am working, albeit very slowly, on my take over the world project.
How is Pinky[^] anyway?
Mongo: Mongo only pawn... in game of life.
|
I only know second-hand....
Us evil geniuses are a secretive bunch!
|
I'd go binary for large data. If someone cares enough to edit that crap then they'll know enough about how to translate it anyway.
Jeremy Falcon
|
Right, indeed!
|
Nah, do what Microsoft did. Save everything as XML!
Marc
|
Seems like a perfectly logical way to save an image...
<?xml version="1.0"?>
<image>
<pixel offset="0">
<byte offset="0">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="1">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="2">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
<byte offset="3">
<bit offset="0">0</bit>
<bit offset="1">1</bit>
<bit offset="2">1</bit>
<bit offset="3">0</bit>
<bit offset="4">1</bit>
<bit offset="5">1</bit>
<bit offset="6">1</bit>
<bit offset="7">0</bit>
</byte>
</pixel>
...
</image>
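For scale, a back-of-the-envelope estimate (illustrative numbers only, based on the markup above) of how much that bit-per-element encoding inflates an image:

```python
# Rough size estimate for the joke format above: every bit of the image
# becomes a full XML element like <bit offset="7">0</bit>.
bit_element = '<bit offset="7">0</bit>'   # 23 characters per bit
overhead_per_byte = 8 * len(bit_element)  # XML chars per original byte
print(overhead_per_byte)                  # 184 chars to store 1 byte

# A modest 1920x1080 24-bit image (ignoring the <pixel>/<byte> wrappers,
# which only make it worse):
raw_bytes = 1920 * 1080 * 3
xml_bytes = raw_bytes * overhead_per_byte
print(f"{raw_bytes/1e6:.1f} MB raw -> ~{xml_bytes/1e9:.1f} GB as XML")
```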
Jeremy Falcon
|
Use the power of BSON, Luke!
|
I've been researching refresh rate and other display characteristics, thinking another monitor might make a difference.
My take on higher refresh rates (> 60 Hz) is that they make a difference for gamers and people watching videos, but I'm not sure they would be any more readable, or less migraine-inducing, during long sessions.
Just curious if anyone else with older/diminishing eyesight here has considered this issue.
thanks, Bill
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
I think that anything over 100,000 Hz is going a bit far!
veni bibi saltavi
|
It's soooooooooooooooooo refreshing.
|
I'm not going to mention what organ this post suggests may be diminishing in capacity.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
Probably defaulting back to the question: what are you gonna use it for?
For me, I've got pretty bad eyesight buuuut never really had too much of an issue with lower refresh rate screens. They're nice to have to be sure but when I'm working on user input apps which are mostly lots of entry forms or a general intranet which has lots of relatively static content web parts, it's fine.
As you've alluded to though, when I've been working on graphically based bits of work, it's really noticeable what the difference is. And certainly if I'm working on input based on visuals (last one was a point on a map thing) it needed to be quite high for my purposes because I was running through so many different weird scenarios in a short space of time when compared to a regular end user.
Budgetary reasons, though; would that be the thing restricting you? I would personally go with the higher rate, just because then you're covered in both scenarios, whereas a lower one wouldn't.
|
Thanks ! Given the state of my eyes (surgery upcoming), and the fact that I think there'll be bargains galore on this year's state-of-the-art monitors after the holidays ... I will probably pop for a new monitor with an IPS screen.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
Not so sure about that: the initial move to 70+ framerates was (I thought) to get away from beat frequencies with the mains supply causing flicker between the screen refresh and ambient lighting.
Video doesn't need a high refresh rate, as the source is only captured at 24 / 48 / occasionally 60 FPS anyway (Avatar was filmed at 24 FPS, The Hobbit at 48, for example).
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Games achieve higher frame rates, I suppose.
|
Thanks, Griff; the mains here are 220 V, 50 Hz. I hadn't thought about that issue. My nine-year-old 32-inch beast is an LCD at around 720p; I am running HDMI out from the video card to HDMI in on the monitor.
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|
OriginalGriff wrote: Not so sure about that: the initial move to 70+ framerates was (I thought) to get away from beat frequencies with the mains supply causing flicker between the screen refresh and ambient lighting.
I thought the move to 120+ Hz refresh rates was to support 3D rendering? At least when using shutter technology, alternating the images for each eye halves the effective framerate.
|
That makes sense.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
AIUI that flicker-sync issue only mattered with CRTs. LCDs don't normally go dark between refreshes (some high-framerate TV gimmickry notwithstanding).
For a TV, 120 Hz eliminates the judder (frames alternating between 2 and 3 refreshes of display time) from 24 FPS playback, so it has some merit there. 240 lets you do the alternating-blank-frame gimmickry again, but meh. 480 is just one number bigger failsauce.
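The judder arithmetic is easy to check (illustrative numbers only):

```python
# Why 120 Hz removes 24 FPS judder: refreshes-per-frame comes out even.
def refreshes_per_frame(display_hz, content_fps):
    return display_hz / content_fps

print(refreshes_per_frame(60, 24))    # 2.5 -> frames alternate between 3 and
                                      #        2 refreshes (3:2 pulldown judder)
print(refreshes_per_frame(120, 24))   # 5.0 -> every frame shown for exactly
                                      #        5 refreshes, no judder
```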
For twitchy gaming higher FPSes are nice (assuming you can afford the GPU to feed them). I've also seen a number of people with 120 or 144 screens claim that mouse cursor movement, window dragging, and document scrolling are noticeably smoother at higher refresh rates. I've never had a high FPS screen to corroborate those claims though. (I've always spent my bandwidth on higher resolutions instead.)
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
Do you have an LCD or a CRT monitor?
Old CRT monitors flicker because of how the picture is drawn on the screen, and therefore give a better picture at higher rates.
If you have an LCD monitor and you're watching a static picture, there's no flicker, because the pixels don't change state.
The refresh rate for an LCD monitor only tells you how many times per second it might change state, which in theory might have an impact for gamers.
|
Thanks, Jorgen. Oh, indeed, it's been years since I used a CRT monitor, so I is in LCD land, with a 32-inch TV about nine years old, with roughly 720p capability and HDMI inputs (it was high-end back then).
cheers, Bill
«I want to stay as close to the edge as I can without going over. Out on the edge you see all kinds of things you can't see from the center» Kurt Vonnegut.
|