No amount of caffeine is getting me through my morning, and I've found a bug in my MidiSlicer app. It's not a show stopper, but it's annoying. There are two ways to fix it, and one way is a dastardly hack. The other is complicated, but the Proper Way(TM) to do things.
The problem is this: in the MidiSlicer app you can modify a file while it is playing. However, playback merges all of the tracks together, since they all must play at once, so if you "mute" a track by removing it from playback, it changes the number of "events" in the stream. Your current cursor position during playback is now invalid.
Say for example we're playing a file with the following track layout:
Drums: 50 events
Bass: 35 events
Guitar: 60 events
That's 145 events total. So let's say I'm playing back near the end of the loop, and I remove the bass track. Well, I was at position 140, but that position doesn't exist anymore, because there are only 110 events in the stream now. Make sense?
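In code form, the mismatch looks like this (a tiny Python sketch; the track names and counts are just the example above):

```python
# Merged stream length before and after muting, per the example above.
tracks = {"Drums": 50, "Bass": 35, "Guitar": 60}

total_before = sum(tracks.values())   # 145 events in the merged stream
cursor = 140                          # near the end of the loop

del tracks["Bass"]                    # "mute" the bass track
total_after = sum(tracks.values())    # only 110 events left

still_valid = cursor < total_after    # False: the cursor points past the end
```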
The dumb way to fix it: add a "dummy" NOP MIDI message to the API that never gets played or saved to disk, and replace any events I remove with a dummy event.
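A rough sketch of the dummy-event hack (hypothetical names; the real MidiSlicer event type surely carries more fields). Muting replaces events rather than removing them, so the stream length and every index stay valid, and keeping the delta on the NOP preserves the timing of everything after it:

```python
from dataclasses import dataclass

@dataclass
class Event:
    delta: int        # ticks since the previous event
    message: bytes    # raw MIDI message, or b"" for a NOP

NOP = b""  # placeholder message: never played, never saved to disk

def with_muted_as_nops(stream, is_muted):
    """Return a same-length stream with muted events replaced by NOPs."""
    return [Event(e.delta, NOP) if is_muted(e) else e for e in stream]
```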
The right way to fix it: before modification, get the absolute time of the cursor position, expressed basically as system ticks or a timespan. After modifying the events, seek back within the event stream to that same *time*-based position, wrapping if needed, and get the actual position within the event stream from that.
The latter way is preferable for many reasons, but while I've written code to convert from the current position to an absolute time based position, I do not have the code to go the other way around.
It sounds like a simple computation of time - basically the time each quarter note takes in system ticks, with the score measured in quarter notes - which is simplified from what I'm actually doing, but close. However, it won't work as-is, because the tempo can change throughout the track, meaning the duration of a quarter note can change throughout the score.
Measuring it involves starting at 0 and moving through the track, counting up times at the current tempo, and recomputing durations whenever the tempo changes. That's what I do going the first direction. Converting from time to position should be roughly the same thing, but my brain isn't working this morning.
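For what it's worth, the time-to-position walk really is the same loop in reverse: accumulate elapsed time event by event at the current tempo, and stop when you pass the target. A minimal Python sketch under assumed structures - events are (delta_ticks, new_tempo_or_None) pairs, tempo is in microseconds per quarter note as in a standard MIDI Set Tempo event, and the 480 ticks-per-quarter resolution is made up:

```python
TICKS_PER_QUARTER = 480          # assumed file resolution
DEFAULT_TEMPO = 500_000          # microseconds per quarter note (120 BPM)

def position_to_time(events, position):
    """Absolute time in microseconds of the event at index `position`."""
    time_us, tempo = 0, DEFAULT_TEMPO
    for delta, tempo_change in events[:position + 1]:
        time_us += delta * tempo // TICKS_PER_QUARTER
        if tempo_change is not None:   # new tempo applies to later deltas
            tempo = tempo_change
    return time_us

def time_to_position(events, target_us):
    """Index of the last event at or before `target_us` (0 if none)."""
    time_us, tempo = 0, DEFAULT_TEMPO
    pos = 0
    for i, (delta, tempo_change) in enumerate(events):
        time_us += delta * tempo // TICKS_PER_QUARTER
        if time_us > target_us:
            break                      # passed the target: stop walking
        pos = i
        if tempo_change is not None:
            tempo = tempo_change
    return pos
```

For looping playback, wrapping would just mean taking the target time modulo the stream's total duration before the walk.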
No amount of caffeine is getting me through my morning,
Have you tried with bacon?
If something has a solution... why worry about it? If it has no solution... why worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
Why not just find the first previous non-muted object, storing its 'this' as well as the current offset tick, then recreate the vector without shuffling, find that 'this', set the container's location to 'that', and then play the next object in the list at the appropriate time?
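If I read that right, it amounts to relocating by object identity rather than by index. A hypothetical sketch (MidiSlicer's real container types are surely different):

```python
def relocate(stream, cursor, is_muted):
    """Rebuild the stream without muted events, keeping our place."""
    # Remember the identity of the last non-muted event at or before the cursor.
    anchor = next((e for e in reversed(stream[:cursor + 1])
                   if not is_muted(e)), None)
    new_stream = [e for e in stream if not is_muted(e)]
    # Find the anchor again by identity; fall back to the start.
    new_cursor = next((i for i, e in enumerate(new_stream) if e is anchor), 0)
    return new_stream, new_cursor
```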
It's more complicated than it sounds, because I'm not working on the merged tracks. I only see the individual tracks separated at that point.
They get merged only at the last second.
I'd have to show you the code, but putting in dummy messages is the easiest, if not the cleanest, solution. It's far easier than recomputing all the delta times in each of the events in each of the tracks.
If you are creating a pure MIDI stream and sending it to some process that only works in MIDI delta-times, then you will have to do something like you are stating. If not, you can just add a check to the C# object in the container when it reaches play time, and if its track is muted, simply don't play it.
It doesn't work that way, because the driver takes events with deltas attached to them. You queue them up with the deltas attached, and it plays them when the time indicated by the delta comes up. It does this in the background. All the deltas are relative to each other, so if you remove an event, it shifts all the notes that follow it back by whatever its delta was. So I'd have to recompute the deltas any time I remove an event.
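In delta-time form, the shift can at least be repaired locally: when one event is removed, folding its delta into its immediate successor keeps everything after it at the same absolute time. A sketch with assumed (delta, message) pairs:

```python
def remove_event(stream, index):
    """Remove stream[index] without shifting later events in time."""
    removed_delta, _ = stream[index]
    new_stream = stream[:index] + stream[index + 1:]
    if index < len(new_stream):
        next_delta, next_msg = new_stream[index]
        # The successor now waits its own delta plus the removed one's.
        new_stream[index] = (next_delta + removed_delta, next_msg)
    return new_stream
```

Removing many events means doing this once per removal, and of course the real headache here is that it has to happen per track, before the merge.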
I should add that I do actually use the absolute-time technique when I'm doing non-streaming playback, but that doesn't do background playback. It blocks the thread and just chews up CPU. It uses no waits, no timers, nothing except querying the current system ticks and playing the event stream in a tight loop.
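That tight-loop style looks roughly like this (a Python sketch with a hypothetical `send` callback and absolute timestamps in seconds; the real code presumably queries raw system ticks):

```python
import time

def blocking_play(events, send):
    """Play (timestamp_seconds, message) pairs in a busy-wait loop.

    No sleeps, no timers: just spin on the clock. Accurate, but it
    blocks the calling thread and pins a CPU core while it runs.
    """
    start = time.monotonic()
    for timestamp, message in events:
        while time.monotonic() - start < timestamp:
            pass                      # busy-wait until the event is due
        send(message)
```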
If you can't do as Greg says, then it is a matter of going through each non-muted track individually, and finding the last non-muted event played. Recreate the new container without the muted events, find that last event played, and do the corresponding finagling. Unless I'm missing something.