Sunday, February 15, 2009

Introduction to Using NAudio

About NAudio
NAudio is an Open Source* audio mixing library written in C# for the Windows platform. It provides P/Invoke wrappers for WaveOut, ASIO, DirectX and some functions only available on Vista (WASAPI, the Windows Vista Core Audio API). It doesn't currently include abstraction support for PortAudio, OpenAL or GStreamer, but here's hoping for some cross-platform compatibility in the future.

The Subversion repository for NAudio is very clean, making it easy to pick out what you need. My development environment is Windows XP x64, so there was one additional step required for the 64-bit environment: setting the projects to build as x86 by default.

You can do this by:

Clicking Build > Configuration Manager and then changing the platform for all of the projects to x86

With the platform set, everything was ready to go. I recompiled and had a working x86 build that runs on the 64-bit OS.

Introduction: Playing a Sample
This post is going to focus on loading an audio sample, playing the file and being able to reset the position of playback for the sample.

Start a new project, copy the NAudio.dll file over and add it as a reference. Then set up the using statements:

using NAudio.Wave;
using NAudio.CoreAudioApi;

In the class, we need to define the items which are going to be visible to all of our methods. We have two declarations we are concerned with in this introduction:
public partial class NAudioTest : Form
{
IWavePlayer waveOutDevice;
WaveStream mainOutputStream;
string fileName = null;

waveOutDevice, an instance of IWavePlayer, defines the interface to the device that we will be using to play our audio.

mainOutputStream stores the audio sample we load and provides a level of abstraction over the underlying Stream, so we don't need to manually move the stream data around when we want to adjust properties such as position. This is what allows us to easily seek within the sample.

fileName will just store the file name of the wave.
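As a quick illustration of that abstraction, here is a hedged sketch of seeking (it assumes mainOutputStream already holds a loaded WaveStream, which we set up further below):

// Hypothetical seek examples, assuming mainOutputStream has been
// loaded via CreateInputStream (shown later in this post).

// High-level: WaveStream converts the TimeSpan to a byte offset for us.
mainOutputStream.CurrentTime = TimeSpan.FromSeconds(10);

// Low-level equivalent: Position is measured in bytes, so we would
// have to calculate the offset from the average byte rate ourselves.
mainOutputStream.Position =
    10 * mainOutputStream.WaveFormat.AverageBytesPerSecond;

Either form jumps ten seconds into the sample; the CurrentTime property is simply the friendlier of the two.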

With these class declarations in place, the next logical step is to set up the device which is going to be used to output the audio. At an appropriate location (under a button click event etc.) we can now initialise the instance. In this tutorial we are going to use the default available ASIO interface. This may not be suitable for all sound cards - it doesn't work on mine by default, but thanks to ASIO4ALL it does. Grab a copy of it; it's free and works wonders, providing a low-latency interface for mixing audio on Windows.

try
{
    waveOutDevice = new AsioOut();
}
catch (Exception driverCreateException)
{
    MessageBox.Show(String.Format("{0}", driverCreateException.Message));
    return;
}
The default ASIO device has hopefully been created (if it didn't work for you, have you downloaded ASIO4ALL yet?). Now we have an audio device declared, but nothing audible has happened yet - for that we need to load an audio file, and first we need to find one. For the purpose of this tutorial you can either assume a fixed location and assign it to the fileName variable, or add something similar to the OpenFileDialog snippet below:
OpenFileDialog openFileDialog = new OpenFileDialog();
openFileDialog.Filter = "Wave Files (*.wav)|*.wav|All Files (*.*)|*.*";
openFileDialog.FilterIndex = 1;
if (openFileDialog.ShowDialog() == DialogResult.OK)
{
    fileName = openFileDialog.FileName;
}
I'll assume that needs no explanation of its own. This sample now needs to be loaded and stored in a wave stream, which we have called mainOutputStream. We will use the CreateInputStream method from the NAudio demo application to complete this task.
mainOutputStream = CreateInputStream(fileName);
This passes in the name of the file and has the input stream returned. It doesn't require a lot of understanding if you're just looking for the wave file to be loaded and the stream returned - let's assume that's all you care about from this tutorial.
private WaveStream CreateInputStream(string fileName)
{
    WaveChannel32 inputStream;
    if (fileName.EndsWith(".wav"))
    {
        WaveStream readerStream = new WaveFileReader(fileName);
        // If the wave file isn't PCM-encoded, convert it to PCM first.
        if (readerStream.WaveFormat.Encoding != WaveFormatEncoding.Pcm)
        {
            readerStream = WaveFormatConversionStream.CreatePcmStream(readerStream);
            readerStream = new BlockAlignReductionStream(readerStream);
        }
        // Convert to 16-bit samples if necessary.
        if (readerStream.WaveFormat.BitsPerSample != 16)
        {
            var format = new WaveFormat(readerStream.WaveFormat.SampleRate,
                16, readerStream.WaveFormat.Channels);
            readerStream = new WaveFormatConversionStream(format, readerStream);
        }
        inputStream = new WaveChannel32(readerStream);
    }
    else
    {
        throw new InvalidOperationException("Unsupported extension");
    }
    return inputStream;
}
Now to assemble all of the pieces. We have a waveOutDevice defined, where we will be sending audio, and a wave file has been loaded into our mainOutputStream. All that's left is to connect the two and hit play:

try
{
    waveOutDevice.Init(mainOutputStream);
}
catch (Exception initException)
{
    MessageBox.Show(String.Format("{0}", initException.Message), "Error Initializing Output");
    return;
}
waveOutDevice.Play();
Assuming everything was set up right, you should now hear the wave file being played. If you have difficulties, check that ASIO4ALL is installed. The finishing touch is to reset the wave file to play back from the beginning. Find a friendly button click event and add the following:

mainOutputStream.CurrentTime = TimeSpan.FromSeconds(0);
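One loose end the snippets above leave out is tidying up: both the output device and the stream should be stopped and disposed when you're finished with them. A minimal sketch, assuming the same waveOutDevice and mainOutputStream fields (where you hook it up - a Stop button, the form's FormClosing event - is up to you):

// Hypothetical clean-up, e.g. in the form's FormClosing handler.
if (waveOutDevice != null)
{
    waveOutDevice.Stop();
}
if (mainOutputStream != null)
{
    mainOutputStream.Dispose();
    mainOutputStream = null;
}
if (waveOutDevice != null)
{
    waveOutDevice.Dispose();
    waveOutDevice = null;
}

Stopping before disposing the stream matters: the device may otherwise still be reading from the stream as it's torn down.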

Ahem. Done.

Thanks to Mark Heath for this great library - check out Mark's blog here. All of this code was taken from the NAudio demo application and is available within the source code package on CodePlex.

Next Time

I'll be looking at loading multiple wave files simultaneously and how to use the included Mixing functions. Stay tuned.

* It is licensed under the Microsoft Public License (Ms-PL), which I personally am not too familiar with. However, according to the FAQ there don't seem to be any issues with distributing it alongside another Open Source application. As I use the GPL I need to look a little further into the licensing compatibility, but based on what I have read, I am hopeful for the moment that it is fine.

6 comments:

Sunny said...

thanks providing a tutorial

PGiZ said...

Great Articles , :)

Sergi said...

Hi!

I like so much your examples. I find them very interesting...

I have a question. I have my laptop with his own sound card device and I have another USB sound card.

How can I play 2 mp3 (one of them by the Internal Sound Card and the other by the USB sound card?

I tried to do it, but I'm unable to do it

Anonymous said...

thx for the tutorial. it was really helpful to get me started using the library with c++/cli. i somewhat adapted the code though for my needs. maybe helpful for those wanting to code within managed c++ evironment.
// create waveOut-device
try {
outDevice = gcnew DirectSoundOut();
}
catch (Object ^error) { // oops! :D
MessageBox::Show(Convert::ToString(error));
return;
}

// get a file name to play back
OpenFileDialog ^openDlg = gcnew OpenFileDialog();
openDlg->Filter = "Wave Files (*.wav)|*.wav|All Files (*.*)|*.*";
openDlg->FilterIndex = 1;
if (openDlg->ShowDialog() != System::Windows::Forms::DialogResult::OK) {
MessageBox::Show("Open Error");
return;
}

// create a wave-stream from file
outStream = CreateInputStream(openDlg->FileName);
try {
outDevice->Init(outStream);
}
catch (Object ^error) {
MessageBox::Show(Convert::ToString(error));
return;
}

// play wave
outDevice->Play();

}
// CREATE A WAVE STREAM FROM FILE NAME
private: WaveStream ^CreateInputStream(String ^ fileName) {
WaveChannel32 ^inputStream;

if ((fileName->ToLower())->EndsWith(".wav")) { // if filename has .wav-extension

// create stream
WaveStream ^readerStream = gcnew WaveFileReader(fileName);

// check if encoding is pcm: if not convert
if (readerStream->WaveFormat->Encoding != WaveFormatEncoding::Pcm) {
readerStream = WaveFormatConversionStream::CreatePcmStream(readerStream);
readerStream = gcnew BlockAlignReductionStream(readerStream);
}
// check if it's 16bit-resolution: if not convert
if (readerStream->WaveFormat->BitsPerSample != 16) {
WaveFormat ^format = gcnew WaveFormat(readerStream->WaveFormat->SampleRate,
16, readerStream->WaveFormat->Channels);
readerStream = gcnew WaveFormatConversionStream(format, readerStream);
}
inputStream = gcnew WaveChannel32(readerStream);
}
if (inputStream != nullptr) return inputStream;
else return nullptr;
}

Anonymous said...

missed two lines at the beginning. insert this in front of the other stuff:
IWavePlayer ^outDevice;
WaveStream ^outStream;

Rescued said...

Thank you so much for creating this blog. I decided to create an audio application for one of my classes, and had no idea where to start. You're a life saver!