This article focuses on the last part of the DirectMidi wrapper class library, which allows performing advanced playback and audio mixing functions with the DirectMusic - DirectX API. In the first part, we saw how DirectMusic handles basic MIDI I/O operations such as input and output ports, thruing from one port to another and downloading DLS instruments. In this part I will give a basic understanding of the use of DirectMusic through the DirectMidi class library to get full performance when synchronizing several audio wave files at the same time, positioning MIDI and wave sequences in space, sending MIDI messages to the performances and using DLS instruments with them.
2. Audio playback DirectMusic architecture
2.1 Basic concepts
The main problem of a basic system MIDI port like the MIDI Mapper is its limitation to 16 audio channels. This limitation comes from the MIDI standard and extends to modern applications. It is overcome in DirectMusic thanks to Performance channels. Performance channels are similar to MIDI channels except that they are virtually unlimited. Each performance channel contains a single instrument: an instrument coming from a MIDI sequence, a DLS instrument or a waveform. These parts have their own pan, volume and transposition settings. At this point we can state that one performance channel maps to a single MIDI channel within a single channel group of the port. From now on, when working with performance objects we will always use zero-based performance channels. The chart below explains this relation:
Performance channels are handled by Performance objects, which are responsible for sending audio data from the source to the synthesizer. Performance objects also handle timing, the routing of messages, tool management and notifications. Other important objects related to performances and performance channels are Audiopaths. An audiopath can be seen as a chain of objects through which data is streamed. An application can gain access to any of these objects. For example, you might retrieve a buffer object to set 3D properties of a sound source, or a DMO effect to change the parameters of the effect. The last important objects involved in a DirectMusic application are Segments. Segments are objects encapsulating sequenced sound data. The data might be a MIDI sequence, a waveform, a collection of information originating in a segment file from DirectMusic Producer, or a piece of music composed at run time from different components. A segment can be played as a primary or a secondary segment. Only one primary segment can play at a time. Secondary segments are typically short musical sound effects played on their own or synchronized over the primary segment.
2.2 Main COM interfaces in DirectMusic for audio playback
DirectMusic is part of DirectX technology which uses distributed object programming through COM. COM is the fundamental "object model" on which ActiveX Controls and OLE are built. COM allows an object to expose its functionality to other components and to host applications. It defines both how the object exposes itself and how this exposure works across processes and across networks. COM also defines the object's life cycle. Although DirectMusic uses this technology, don't feel overwhelmed, since the class library automatically handles these situations. In the following lines the most important interfaces involved in a DirectMusic audio application are presented:
IDirectMusic8: The IDirectMusic8 interface is the main object in a DirectMusic application. There should be only one instance of this interface per application. Performances need this object to be initialized.
IDirectMusicPerformance8: This interface is responsible for managing playback. It is used for adding and removing ports, mapping performance channels to ports, playing segments, dispatching messages and routing them through tools, requesting and receiving event notifications, and setting and retrieving various parameters. It also has several methods for getting information about timing and for converting time and music values from one system to another.
IDirectMusicSegment8: It represents a segment, which is a playable unit of data made up of multiple tracks.
IDirectMusicAudioPath8: This interface manages the stages of the data flow from the performance to the final mixer. It also provides references to DMO objects and DirectSound buffers, such as the two described next.
IDirectSound3DBuffer8: It is used to retrieve and set parameters that describe the position, orientation, and environment of a sound buffer in 3D space.
IDirectSound3DListener8: It is used to retrieve and set parameters that describe a listener's position, orientation, and listening environment in 3D space.
3. Developing audio applications using DirectMIDI
3.1 Introduction - DirectMIDI audio processing layout
As with the MIDI part of DirectMidi, the kernel of the library for audio handling is based on eleven related classes. They are commented on hereafter:
On the left side of the chart we can see the main objects of a DirectMusic application. The CDirectMusic class is responsible for instancing the DirectMusic COM object for a Win32 application. It initializes DirectMusic and creates or specifies the DirectSound object required for the software synthesizer. The second elementary object is CMasterClock, which performs the enumeration and selection of the desired hardware clock as reference clock. CMasterClock is not used unless the user needs a specific timing configuration.
After creating the CDirectMusic object we can proceed to initialize the two most important objects involved in a DirectMusic audio playback application: the CPortPerformance and the CAPathPerformance, which are encapsulations and operational abstractions of the IDirectMusicPerformance8 interface. The CPortPerformance manages performances which use system ports like the MPU-401 or the software synthesizer, which must be added to this object. CPortPerformance objects allow neither audiopaths nor 3D positioning. On the other hand, the CAPathPerformance manages a performance which uses the default software synthesizer port. CAPathPerformance objects allow audiopaths and 3D positioning, besides the use of DMOs and effects like Reverb and Chorus. In order to attain 3D effects over the audio data stream and obtain DMOs we will need the CAudioPath object, which can be retrieved from the CAPathPerformance object.
Two important objects required for 3D positioning are the C3DBuffer and the C3DListener, which are the abstractions of the IDirectSound3DBuffer8 and IDirectSound3DListener8 interfaces respectively, and can be initialized from the CAudioPath object. Finally, we have two objects required to keep a reference to an audio sequence represented by the IDirectMusicSegment8 interface. These objects are the CSegment and the C3DSegment. There is a subtle difference between these two objects. The CSegment object is a basic unit of an audio file and is intended to be played in both types of performances. The C3DSegment object, on the other hand, is a higher abstraction or encapsulation of the objects required by a sequence intended to be used with 3D features. Therefore, this C3DSegment contains a CAudioPath and inherits from a C3DBuffer and a C3DListener.
3.2 Creating the application
3.2.1 Setting up the development environment
The DirectMidi wrapper can be used in any kind of Win32 application provided by the Visual Studio wizards. In the source code explained in the example below I will use a "Simple application" Win32 console project. Once we have created the basic files for this application we can proceed to include the headers (.h) and source files (.cpp) required by the DirectMusic application. Thus, for example, if we start an application which uses the Audio part of the DirectMidi library, we will need to include the following headers: CDirectMidi.h, CDirectBase.h and CAudioPart.h. For both a MIDI application and an Audio one, we will always need to include the first two headers. Then, depending on the classes we need, we will have to include the respective .cpp file for the implementation of each class. For example, if we use the COutputPort class, we will need to include the CMidiPort.cpp and COutputPort.cpp source files. Likewise, if we use the CAPathPerformance, we will need the CPerformance.cpp and CAPathPerformance.cpp source files.
Besides the DirectMidi classes, we will also need to include the files inside the Dsutil directory of the DirectMidi folder, since they contain DirectX basic source code utilities for wave file handling. After including all the required files, it's important to compile the whole project without precompiled headers (.pch), because the DirectMidi project doesn't support precompiled headers by default. Therefore, in Visual Studio 6 select Project -> Settings and expand the left tree view to see your project files. Then, select the C/C++ tab, choose Precompiled headers in the combo box and mark "Not using precompiled headers" for all the files included in the project. If you use Visual Studio 7, the procedure is similar: click on the Solution Explorer tab, select all the listed .cpp files and then right-click. Under Properties, select C/C++, Precompiled Headers and specify "Not Using Precompiled Headers". Finally, remove all references to the "stdafx.h" header in the project sources.
In order to avoid unresolved external symbols when linking the DirectX functions, we will need to include a path to the DirectX libraries and headers. To do this, go to Tools in the menu bar, select Options and then click on the Directories tab to add the path to the DirectX 8/9 header and library files. If you have Visual Studio 7 (.NET), go to Tools in the menu bar, click on Options and then open the Projects folder. Expand the Show directories for combo list and select the library and include files options in turn. Finally, add the header and library file directories to their respective lists.
3.2.2 The main objects
This section explains how to declare and initialize the fundamental objects involved in a DirectMidi audio playback application. In the source code below you can see the list of instantiated DirectMidi objects:
#include <iostream>
#include "CDirectBase.h"
#include "CDirectMidi.h"
#include "CAudioPart.h"

#pragma comment (lib,"dxguid.lib") // GUID definitions
#pragma comment (lib,"winmm.lib")
#pragma comment (lib,"dsound.lib")
#pragma comment (lib,"dxerr9.lib")

using namespace std;
using namespace directmidi;

int main(int argc, char* argv[])
In the first lines of the code we include the ANSI and DirectMidi headers required by the application. After this, we specify the libraries that will be linked with the application using the #pragma comment precompiler directive. It's very important to know that the DirectMidi library is wrapped in the directmidi namespace. Therefore, you must add a using namespace directmidi directive in order to tell the compiler that we are referencing this logical grouping. The first of the declared objects is CDirectMusic, which is responsible for initializing the DirectMusic application and will be the last one to be released. The second main object is the CDLSLoader that manages the loading of audio sequences and DLS files. The next two important objects are COutPort, a COutputPort object that will be added to the CPortPerformance object, and COutPort2, which will keep a reference to the internal output port of the CAPathPerformance object. Together with these two output ports we have an object of CInputPort type in order to handle thruing from the default MPU-401 port to the performance synthesizer, which will be explained at the end of this article. Then we have the two performance objects used along the application, whose differences were explained previously. This example code will make use of DLS instruments, so we will need a container for the instrument collection extracted from the DLS file and an instance of a CInstrument object in order to download instruments to the performance. In addition, we will need a CAudioPath object to apply effects over the data stream, and a CSegment and a C3DSegment to play stereo-plus-reverb and 3D-positioned MIDI files respectively. Finally, we will attain 3D audio positioning by using a C3DBuffer object in conjunction with the CAPathPerformance. Note that the instantiation order must go from parent to child objects, so that the later destruction of the objects follows a LIFO criterion, thus avoiding a parent object being destroyed before its children.
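As a sketch of the declarations enumerated above, the list might look as follows. The variable names follow the ones used later in the article's text; treat this as an illustration rather than the article's original listing:

```cpp
// Hypothetical declarations matching the objects described above,
// ordered from parents to children as the text requires.
CDirectMusic       CMusic;              // main DirectMusic object, released last
CDLSLoader         CLoader;             // loads segments and DLS files
COutputPort        COutPort;            // port added to the port performance
COutputPort        COutPort2;           // reference to the audiopath's internal port
CInputPort         CInPort;             // handles MIDI thru from the MPU-401 port
CPortPerformance   CPortPerformance1;   // performance for system ports
CAPathPerformance  CAPathPerformance1;  // performance with audiopaths and 3D
CCollection        CCollection1;        // container for the DLS instrument list
CInstrument        CInstrument1;        // instrument downloaded to the performance
CAudioPath         CAudioPath1;         // applies effects over the data stream
CSegment           CSegment1;           // stereo-plus-reverb MIDI sequence
C3DSegment         C3DSegment1;         // 3D-positioned MIDI sequence
C3DBuffer          C3DBuffer1;          // DirectSound 3D buffer abstraction
```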
3.2.3 Starting up the performance
In the previous section, all the objects required by a DirectMusic application were instantiated. What is left now is calling their methods in the proper order to start the music output. Take a look at the following lines:
DWORD dwPortCount = 0;
INFOPORT PortInfo;

// Enumerate the output ports until a software synthesizer is found
do {
    COutPort.GetPortInfo(++dwPortCount, &PortInfo);
} while (!(PortInfo.dwFlags & DMUS_PC_SOFTWARESYNTH));

cout << "Selected output port: " << PortInfo.szPortDescription << endl;

COutPort.SetPortParams(0,0,0,SET_REVERB | SET_CHORUS,44100);
After initializing DirectMusic, the code proceeds to initialize the audiopath performance. To do this, it calls the CAPathPerformance::Initialize method with a reference to the created DirectMusic object, two NULL pointers representing the DirectSound object to be created, and the HWND window handle for the creation of DirectSound. After these first three parameters are established, we can specify what kind of default audiopath we want in our application. In this case we choose the DMUS_APATH_DYNAMIC_3D type, which is the one suitable for a 3D audio application. In the previous line, the CPortPerformance object is initialized by calling the CPortPerformance::Initialize method, passing a reference to the main DirectMusic object and two pointers representing the DirectSound object and the HWND handle parameters. As can be seen in the code above, we select and start a software synthesizer port, handled by the COutputPort object named COutPort. After starting it, we call the CPortPerformance::AddPort method to add the port to the performance, assign it a block of 16 performance channels (first parameter, where 0 represents channels 0-15) and map it to a channel group (second parameter).
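Putting the steps of this subsection together, the startup sequence might be sketched as below. The parameter lists follow the description in the text, but the exact signatures are assumptions about the DirectMidi API:

```cpp
// Initialize the main DirectMusic object (hypothetical call style)
CMusic.Initialize(hWnd);

// Audiopath performance: DirectSound is created internally (two NULLs),
// and a default dynamic 3D audiopath is requested
CAPathPerformance1.Initialize(CMusic, NULL, NULL, DMUS_APATH_DYNAMIC_3D);

// Port performance: no audiopaths, ports are added explicitly
CPortPerformance1.Initialize(CMusic, NULL, NULL);

// Select and start the software synthesizer port, then attach it:
// block 0 -> performance channels 0-15, mapped to channel group 1
COutPort.SetPortParams(0, 0, 0, SET_REVERB | SET_CHORUS, 44100);
COutPort.ActivatePort(PortInfo);   // assumed activation call, as in the MIDI part
CPortPerformance1.AddPort(COutPort, 0, 1);
```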
3.2.4 Loading and playing segments
In the code below it is explained how to load a MIDI file sequence and play it with the two types of performance objects.
cout << "Playing a segment with the port performance."
"Press a key to continue...\n" << endl;
cout << "Playing a segment with the audiopath performance."
" Press a key to continue...\n" << endl;
By means of the CDLSLoader::LoadSegment method we can load either a MIDI or a WAVE (.WAV) file representing an audio sequence. We just need to specify the format of the file in the third parameter, where a TRUE boolean value indicates that we are trying to load a standard MIDI file (without attached DLS data) and a FALSE value indicates that we are going to load a wave, a segment or a MIDI file containing DLS data. Optionally, we can specify how many times we want our sequence to be repeated. If we call CSegment::SetRepeats with the DMUS_SEG_REPEAT_INFINITE flag, the sequence will be repeated ad infinitum. If we are using a software synthesizer, it is mandatory to call CSegment::Download in order to download the file data to the performance synthesizer. This is not necessary when non-software synthesizer ports are added. Once we have downloaded the sequence to the performance, we can proceed to either play or stop it by using the CPortPerformance::PlaySegment and CPortPerformance::Stop methods respectively. Similarly, we can play a CSegment over the CAPathPerformance using the equivalent methods of that class.
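The loading and playback flow just described could be sketched as follows; the file name and argument order are illustrative assumptions, not the article's exact listing:

```cpp
// Load a standard MIDI file (TRUE = MIDI without embedded DLS data)
CLoader.LoadSegment(_T("sequence.mid"), &CSegment1, TRUE);

CSegment1.SetRepeats(DMUS_SEG_REPEAT_INFINITE);  // loop forever
CSegment1.Download(CPortPerformance1);           // mandatory for software synths

CPortPerformance1.PlaySegment(CSegment1);        // start playback
// ... wait for a key press ...
CPortPerformance1.Stop(CSegment1);               // stop playback
```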
3.2.5 3D Segment positioning
The main purpose of this section is to explain the use of a C3DSegment. It is an all-in-one piece for playing sounds in 3D space.
C3DSegment1 = CSegment1;
cout << "Playing a 3D segment on your left."
" Press a key to continue...\n" << endl;
cout << "Playing a 3D segment on your right. "
"Press a key to continue...\n" << endl;
First of all, the C3DSegment object must be initialized with a reference to the CAPathPerformance object where it is to be played. This initializes its inherited C3DBuffer and C3DListener objects and its internal CAudioPath. The C3DSegment1 = CSegment1 expression is not necessary in the example, since you can initialize a C3DSegment object directly by calling CDLSLoader::LoadSegment; it is there just for illustrative purposes. DirectMidi supports "=" operator overloading between the CSegment base class and the C3DSegment derived class. The assignment just clones the audio data sequence and obtains a new IDirectMusicSegment interface for the new object. After initializing the C3DSegment we can proceed to play the segment and position its DirectSound 3D buffer by calling C3DSegment::SetBufferPosition, which applies spatial positioning to the playing sound.
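The calls just described could be combined as in the sketch below. The method names are those the text mentions, while the exact parameter lists are assumptions; note that DirectSound uses a left-handed coordinate system, so a negative x coordinate places the source on the listener's left:

```cpp
// Hypothetical sketch: initialize the 3D segment on the audiopath
// performance, play it, and move the sound from left to right.
C3DSegment1.Initialize(CAPathPerformance1);        // creates the internal audiopath and 3D objects
CAPathPerformance1.PlaySegment(C3DSegment1);

C3DSegment1.SetBufferPosition(-1.0f, 0.0f, 0.0f);  // sound on the listener's left
// ... wait for a key press ...
C3DSegment1.SetBufferPosition(1.0f, 0.0f, 0.0f);   // now on the right
```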
3.2.6 MIDI functions with performances
One of the most important features in DirectMidi is the possibility to use MIDI commands and functions to apply different effects, such as 3D positioning, while having a virtually unlimited number of notes playing at a time. The next few lines show the base case: sending a NOTE_ON MIDI message to the port performance.
cout << "Sending a MIDI note-on to the port "
"performance on PChannel 1. Press a key to continue...\n" << endl;
The fourth parameter in the CPortPerformance::SendMidiMsg method is the PChannel, or performance channel. This is where the pair formed by the MIDI channel and the channel group assigned to the downloaded segment is mapped. This PChannel contains a specific instrument and MIDI data. Therefore, from now on we will never use raw MIDI channels when working with performances.
3.2.6.1 Sending a note-on in the 3D space
The following simple code snippet sends a NOTE_ON to the audiopath performance and positions the note on the left. When sending MIDI messages to the audiopath performance, we must provide an additional fifth parameter to CAPathPerformance::SendMidiMsg that indicates the audiopath on which we are working. In the example below we use the internal audiopath inside the C3DSegment object, which controls 3D positioning.
cout << "Playing a MIDI note-on with the audio path performance"
" on your left. Press a key to continue...\n" << endl;
3.2.6.2 Loading and playing a DLS instrument in 3D
The following code explains how to play a NOTE_ON MIDI command using DLS instruments and position it in the 3D space. It also presents some new DirectMidi methods to work with audiopaths. See below:
cout << "Playing a GM/GS DLS instrument with "
        "the audio path performance on your left. "
        "Press a key to continue...\n" << endl;
We start this part of the example by introducing a new method to get the default audiopath: the CAPathPerformance::GetDefaultAudioPath method, which retrieves the internally created audiopath. Once we have a reference to the object representing the audiopath, we can obtain the C3DBuffer object that will perform 3D positioning in the application, using the CAudioPath::Get3DBuffer method. If we wanted to set the position of the listener in the space, we would have to call the CAudioPath::Get3DListener method. Before downloading a DLS instrument to the performance memory, we must unload the previously stored segment data from it; we call the CSegment::Unload method to achieve this. Now we can load the DLS data, but we must first provide a container for the instrument list in the DLS file: this is the CCollection object. After calling CDLSLoader::LoadDLS and passing a reference to a CCollection object, we extract a specific instrument from it by calling CCollection::GetInstrument. Afterwards, we set a patch number for the new instrument by calling CInstrument::SetPatch, assigning it MIDI program 0. Finally, before proceeding to download the instrument, we should provide a playable note region for the instrument using the CInstrument::SetNoteRange method. Now all is ready to download the instrument to the performance. The method which performs this is CAPathPerformance::DownloadInstrument. It just needs the PChannel to which the instrument will be assigned and two DWORDs for retrieving the channel group and the MIDI channel where it will be stored. After downloading it, we must select the patch of the instrument by sending a PATCH_CHANGE message with the PChannel and the patch number to set it up in the performance. And finally, we can play the note in the 3D space by providing a reference to the default audiopath (CAudioPath1) in the call to the SendMidiMsg method.
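A condensed, hedged sketch of the DLS steps in this subsection follows; the parameter lists and argument order are assumptions based on the description above, not the article's original listing:

```cpp
DWORD dwGroup, dwMChannel;

// Retrieve the default audiopath and its 3D buffer
CAPathPerformance1.GetDefaultAudioPath(CAudioPath1);  // assumed out-parameter style
CAudioPath1.Get3DBuffer(C3DBuffer1);

// Unload the previous segment data, then load the DLS collection
CSegment1.Unload(CAPathPerformance1);
CLoader.LoadDLS(_T("instruments.dls"), CCollection1);

// Extract an instrument, give it program 0 and a full note range
CCollection1.GetInstrument(CInstrument1, 0);
CInstrument1.SetPatch(0);
CInstrument1.SetNoteRange(0, 127);

// Download to PChannel 1; the mapped group and MIDI channel are returned
CAPathPerformance1.DownloadInstrument(CInstrument1, 1, &dwGroup, &dwMChannel);

// Select the patch, then play middle C in 3D through the default audiopath
// (assumed argument order: status/data, PChannel, audiopath)
CAPathPerformance1.SendMidiMsg(PATCH_CHANGE, 0, 0, 1, CAudioPath1);
CAPathPerformance1.SendMidiMsg(NOTE_ON, 60, 127, 1, CAudioPath1);
```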
3.2.6.3 Wave sample instruments and thruing to the 3D performance
In this example we will be able to test one of the most amazing features of DirectMusic. We will load a sample from a wave file (.WAV) and download it to the audiopath performance in order to apply 3D effects. Afterwards, we will activate an input port to attain a thru connection between the port and the performance and play in the space with an external keyboard. The next code explains how to achieve it:
9 IDirectMusicPort8* pPort = NULL;
21 CSampleInstrument CSample1;
39 REGION region;
40 ARTICPARAMS articparams;
50 region.RangeKey.usHigh = 127;
51 region.RangeKey.usLow = 0;
52 region.RangeVelocity.usHigh = 127;
56 articparams.LFO.tcDelay = TimeCents(10.0);
57 articparams.LFO.pcFrequency = PitchCents(5.0);
61 articparams.PitchEG.tcAttack = TimeCents(0.0);
62 articparams.PitchEG.tcDecay = TimeCents(0.0);
63 articparams.PitchEG.ptSustain = PercentUnits(0.0);
64 articparams.PitchEG.tcRelease = TimeCents(0.0);
69 articparams.VolEG.tcAttack = TimeCents(1.275);
70 articparams.VolEG.tcDecay = TimeCents(0.0);
71 articparams.VolEG.ptSustain = PercentUnits(100.0);
72 articparams.VolEG.tcRelease = TimeCents(10.157);
100 cout << "Playing a downloaded WAV file with "
        "the audio path performance on the right. "
        "Press a key to continue... \n" << endl;
122 cout << "Activating MIDI thru from input "
"MIDI channel 0 to PChannel 2.\n" \
123 "Play with an external keyboard. Press a key to "
"continue... \n" << endl;
Firstly, before starting the sample loading, we unload the previously used DLS instrument by calling CAPathPerformance::UnloadInstrument. In order to download the sample to the performance, we need to know which default IDirectMusicPort8 the performance is using to access the wave-table memory. Therefore, we inquire about this port by calling CAPathPerformance::PChannelInfo with a PChannel number as a parameter. This method provides an IDirectMusicPort8 interface pointer, with its reference count increased, related to the specified PChannel. Once we have obtained the IDirectMusicPort8 interface pointer, we can handle it at a higher level by activating a COutputPort object through a call to the COutputPort::ActivatePortFromInterface method (line 19). After this important step, we can continue by instancing a CSampleInstrument object in order to load a wave file using the CDLSLoader::LoadWaveFile method. Afterwards, we apply instrument parameters such as patch number, unity key note, looping, regions and articulations, as shown in lines 29-78. Before downloading the data to the port wave-table, we need to allocate memory for the download interfaces, so we call the COutputPort::AllocateMemory method. Finally, we can download the PCM data with the COutputPort::DownloadInstrument method and play the note in 3D space through PChannel 2, as commented in lines 91-97. The second part of the example above explains how to activate MIDI thru from the default input port to the performance. To do this, we need to activate a CInputPort object (lines 110-112) and obtain the channel group and the MIDI channel assigned to PChannel 2. Once we have got these two parameters, we can activate the thru from MIDI channel 0 of our external MIDI port to the PChannel that contains the MIDI channel and the channel group previously obtained in the call to CAPathPerformance::PChannelInfo, in this case number 2.
3.2.7 Advanced audio features
This last section explains how to apply advanced audio FX by using DMOs supplied by DirectX. DMO stands for DirectX Media Object: a COM object that processes multimedia data streams from buffers allocated by the client. Basically, a DMO is implemented as a COM object in a DLL and registered in the system. If you want to learn more about it, you can read the More information section of this article. The next lines explain how to apply reverberation effects to the downloaded sample instrument. See below:
FX.fInGain = DSFX_WAVESREVERB_INGAIN_MAX;
FX.fReverbMix = DSFX_WAVESREVERB_REVERBMIX_MAX;
FX.fReverbTime = DSFX_WAVESREVERB_REVERBTIME_MAX;
FX.fHighFreqRTRatio = DSFX_WAVESREVERB_HIGHFREQRTRATIO_MAX;
cout << "Playing a sample on PChannel 2 with maximum reverberation.\n" \
"Play with an external keyboard. "
"Press a key to end the application...\n" << endl;
catch (CDMusicException& DMExcp)
cout << "\n" << DMExcp.GetErrorDescription() << "\n" << endl;
Before applying the effect to the performance, it's important to release the old audiopath, which was configured for 3D positioning and doesn't allow retrieving FX interfaces. Therefore, we call the CAudioPath::ReleaseAudioPath method to remove the references to the last audiopath. After releasing it, we can create a new audiopath which shares its FX buffers, so we call the CAPathPerformance::CreateAudioPath method with DMUS_APATH_SHARED_STEREOPLUSREVERB as a parameter. Before applying the effect, it's necessary to declare a pointer to an IDirectSoundFXWavesReverb8 interface handled by the CComPtr ATL template. This template will automatically release the COM interface references. Then, we can get a reference to the DMO object by calling the CAudioPath::GetObjectInPath method with the DMUS_PCHANNEL_ALL parameter to search all PChannels, the DMUS_PATH_BUFFER_DMO parameter to get a DMO object in a buffer, the index 1 representing the buffer in which the DMO resides, the class identifier of the object, the index 0 to find the first matching object, and the FX interface pointer. If the previous method call succeeds, we are ready to apply the effect by filling in the suitable members of the DSFXWavesReverb structure and calling the IDirectSoundFXWavesReverb8::SetAllParameters interface method. The next table shows the different FX interfaces that can be retrieved from the stereo-plus-reverb audiopath:
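The sequence described in this subsection could be sketched as follows. The wrapper signatures are assumptions; the structure, GUIDs and flags are the standard DirectX ones named in the text:

```cpp
// ATL smart pointer: the FX interface is released automatically
CComPtr<IDirectSoundFXWavesReverb8> pReverb;
DSFXWavesReverb FX;

// Replace the 3D audiopath with a shared stereo-plus-reverb one
CAudioPath1.ReleaseAudioPath();
CAPathPerformance1.CreateAudioPath(CAudioPath1, DMUS_APATH_SHARED_STEREOPLUSREVERB);

// All PChannels, DMO stage, buffer index 1, first matching object
CAudioPath1.GetObjectInPath(DMUS_PCHANNEL_ALL, DMUS_PATH_BUFFER_DMO, 1,
                            GUID_DSFX_WAVES_REVERB, 0,
                            IID_IDirectSoundFXWavesReverb8, (void**)&pReverb);

// Maximum reverberation, as in the listing above
FX.fInGain          = DSFX_WAVESREVERB_INGAIN_MAX;
FX.fReverbMix       = DSFX_WAVESREVERB_REVERBMIX_MAX;
FX.fReverbTime      = DSFX_WAVESREVERB_REVERBTIME_MAX;
FX.fHighFreqRTRatio = DSFX_WAVESREVERB_HIGHFREQRTRATIO_MAX;
pReverb->SetAllParameters(&FX);
```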
3.2.8 Exception handling
The implementation of the DirectMidi class library provides assertion and failure detection on each line of the source code where a call to a DirectMusic COM method is made. This prevents error situations when a bad call is made, either by invoking functions in the wrong order or by supplying incorrect parameters such as non-initialized variables. Looking at the scheme of a DirectMidi wrapper function, we can see that the method provides exhaustive information about the class member function, the line and the HRESULT description where the exception was thrown:
HRESULT CMasterClock::ActivateMasterClock(LPCLOCKINFO ClockInfo)
{
    HRESULT hr = DM_FAILED;
    TCHAR strMembrFunc[] = _T("CMasterClock::ActivateMasterClock()");

    if (!m_pMusic8 || !ClockInfo) throw CDMusicException(strMembrFunc, hr, __LINE__);

    if (FAILED(hr = m_pMusic8->SetMasterClock(ClockInfo->guidClock)))
        throw CDMusicException(strMembrFunc, hr, __LINE__);

    if (FAILED(hr = m_pMusic8->GetMasterClock( /* ... */ )))
        throw CDMusicException(strMembrFunc, hr, __LINE__);
    ...
Therefore, when we catch a DirectMidi exception through a reference to a
CDMusicException object, we can obtain information about the error with the following class members:
* HRESULT m_hrCode: HRESULT code with the DirectMusic API call failure.
* TCHAR m_strMethod: UNICODE/MBCS string containing a description of the method where the error was generated.
* INT m_nLine: Number of the line in the module where the error was produced.
* LPCTSTR GetErrorDescription(): Retrieves a complete error description built from the previously commented members.
* LPCTSTR ErrorToString(): Returns a string with a definition of the generated error.
4. More information
You can find more information about DirectMusic and DirectX in the Microsoft MSDN on the net or in the DirectX SDK documentation. If you are looking for more information about DirectMIDI you can visit the DirectMIDI homepage in the sourceforge.net website. More information about the DMO architecture and implementation can be found in this MSDN link.
5. The demo application
The demo application, called Virtual Island, is a simple 3D viewer inspired by a distant lost island in the Pacific Ocean. When you get into it, you can hear all the sounds that surround you, like waves, seagulls and wind. This program hasn't got any practical utility in the real world, but it is useful to check DirectMidi features like 3D positioning and audiopaths. It's completely programmed in C++ using OpenGL functions and DirectX, and the free source code can be found on the DirectMIDI homepage in the contributions section.
Distant seagulls flying into the deep blue sky
Island overview from the top
MidiStation and DirectMidi changes
MidiStation 1.4.4 features:
- All hardware MIDI ports available
- Interactive built-in music keyboard
- Connection to an external MIDI keyboard
- Visualization list of the external and internal keyboard messages
- GM instrument support
- Octave range selection
- Recording and playback
MidiStation 1.8.4 features:
- All hardware and software MIDI ports available
- Internal GM/GS set load
- PC keyboard octave control
- Improved playback system
- Message list guideline
MidiStation 1.9.0 features:
- Built-in and reusable keyboard control
- Implicit multithread synchronization
- Full octave range selection
- Improved PC keyboard control
- Hand cursor
MidiStation 1.9.1 features:
- Fixed running status bug (Andras22 bug)
- Fixed DLS port bug
- Fixed message list bugs
MidiStation 1.9.2 features:
- Loading and saving of .MDS sequenced files
- Unlimited recording
DirectMIDI 2.0b changes:
- Improved class destructors
- Software synthesizers available
- Added flexible conversion functions
- SysEx reception and sending enabled
- Better method to enumerate and select MIDI ports
- Restructured the class system
- Adapted the external thread to a pure virtual function
DirectMIDI 2.1b changes:
- Added exception handling
- Fixed bugs with channel groups
- Redesigned class hierarchy
- DirectMIDI wrapped in the directmidi namespace
- UNICODE/MBCS support for instruments and ports
- Added DELAY effect to software synthesizer ports
- Project released under GNU (General Public License) terms
DirectMIDI 2.2b changes:
- Redesigned class interfaces
- Safer input port termination thread
- New CMasterClock class
- DLS files can be loaded from resources
- DLS instrument note range support
- New CSampleInstrument class added to the library
- Direct downloading of samples to wave-table memory
- WAV file sample format supported
- New methods added to the output port class
- Fixed small bugs
DirectMIDI 2.3b changes:
- Added new DirectMusic classes for audio handling
- Improved CMidiPort class with internal buffer run-time resizing
- 3D audio positioning