In a project I was working on, there was a requirement to introduce a feature that allowed application users to record audio of themselves. Being new to the world of Blazor, I immediately started looking for what the internet had to offer for the requirement at hand, and I stumbled upon a great package called Plugin.Maui.Audio by @jfversluis. Of course, there were notable mentions that used JS interop to leverage the power of the browser, but I did not want to go that route. I did give in after some time, but the interop approach was not working for me.
Looking at this package, the example given was for a .NET MAUI app. I digress for a moment: one of the reasons I fell in love with Blazor Hybrid apps, aside from having one codebase for all platforms, is the ability to use the powers of Blazor as well as .NET, which gives you a lot of flexibility when implementing features. Now, back to the package example. I had to sit down and tailor the example to fit my Blazor application.
In this short article, I will share how I implemented this, as well as how I played back the recorded audio using HTML's audio tag.
Take note that this article assumes you have already created your Blazor Hybrid app and added the Plugin.Maui.Audio NuGet package (for example with dotnet add package Plugin.Maui.Audio). Other than that, we can start from the just-created Blazor Hybrid app without any modifications.
First things first: permissions are very important, since this is a Hybrid App and it will have to interact with the device hardware on every platform. You can find the detailed steps in the Plugin.Maui.Audio documentation.
Android
The AndroidManifest.xml file will need to be modified to include the following uses-permission entries inside the manifest tag.
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
iOS
The Info.plist file will need to be modified to include the following 2 entries inside the dict tag.
<key>NSMicrophoneUsageDescription</key>
<string>The [app name] wants to use your microphone to record audio.</string>
Replace [app name] with your application name.
MacCatalyst
This change is identical to the iOS one, but for explicitness:
The Info.plist file will need to be modified to include the following 2 entries inside the dict tag.
<key>NSMicrophoneUsageDescription</key>
<string>The [app name] wants to use your microphone to record audio.</string>
Replace [app name] with your application name.
Windows
The Package.appxmanifest file will need to be modified to include the following entry inside the Capabilities tag.
<DeviceCapability Name="microphone"/>
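Note that declaring these permissions only covers the platform manifests; at runtime, the microphone permission still has to be requested before recording starts. A minimal check using the .NET MAUI Essentials Permissions API could look like the sketch below; the helper name is my own, and the razor code later in this article performs the same check inline.

// Sketch of a runtime microphone permission check (the Permissions types come from
// .NET MAUI Essentials and are available through the default implicit usings).
private static async Task<bool> EnsureMicrophonePermissionAsync()
{
    var status = await Permissions.CheckStatusAsync<Permissions.Microphone>();
    if (status != PermissionStatus.Granted)
    {
        status = await Permissions.RequestAsync<Permissions.Microphone>();
    }
    return status == PermissionStatus.Granted;
}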
After that, in the root of the project, create a folder called Services. Inside this folder, create a file called AudioService.cs with the following contents.
using System.IO;
using System.Threading.Tasks;
using Plugin.Maui.Audio;

namespace MyBlazorHybridApp.Services
{
    public interface IAudioService
    {
        Task StartRecordingAsync();
        Task<Stream> StopRecordingAsync();
        bool IsRecording { get; }
    }

    public class AudioService : IAudioService
    {
        private readonly IAudioManager _audioManager;
        private readonly IAudioRecorder _audioRecorder;

        // True while a recording is in progress; the UI uses this to toggle its buttons.
        public bool IsRecording => _audioRecorder.IsRecording;

        public AudioService(IAudioManager audioManager)
        {
            _audioManager = audioManager;
            _audioRecorder = audioManager.CreateRecorder();
        }

        public async Task StartRecordingAsync()
        {
            await _audioRecorder.StartAsync();
        }

        public async Task<Stream> StopRecordingAsync()
        {
            // StopAsync returns an IAudioSource holding the recorded data;
            // expose it to callers as a plain Stream.
            var recordedAudio = await _audioRecorder.StopAsync();
            return recordedAudio.GetAudioStream();
        }
    }
}
In the code above, you can see that the class implements 2 methods, namely StartRecordingAsync and StopRecordingAsync. It also exposes a bool IsRecording which will be useful in our UI. Take note that StopRecordingAsync returns a Stream: according to the Plugin.Maui.Audio docs, the StopAsync method returns an IAudioSource instance containing the recording data, and I get the audio stream from that IAudioSource via GetAudioStream.
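If you would rather keep the recording on disk instead of holding it all in memory, the same Stream can be written out to a file. Below is one way to do that; the helper, the file name, and the use of the app's cache directory are my own choices rather than part of the plugin.

// Hypothetical helper: persist the recorded stream to the app's cache directory.
// FileSystem.Current.CacheDirectory comes from .NET MAUI Essentials; the .wav extension
// is a guess, since the actual container format depends on the platform and recorder settings.
public static async Task<string> SaveRecordingAsync(Stream recordedAudio)
{
    var filePath = Path.Combine(FileSystem.Current.CacheDirectory, "recording.wav");
    await using var fileStream = File.Create(filePath);
    await recordedAudio.CopyToAsync(fileStream);
    return filePath;
}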
Move over to MauiProgram.cs, add using Plugin.Maui.Audio; at the top of the file, and register the services in the builder before var app = builder.Build();: add builder.Services.AddSingleton(AudioManager.Current); for the plugin's audio manager, and register the custom service itself, for example with builder.Services.AddSingleton<IAudioService, AudioService>();. This registers your custom service with the dependency injection container so it can be injected anywhere within the application.
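For reference, MauiProgram.cs could end up looking roughly like the sketch below. It is based on the default .NET MAUI Blazor template, so your file may contain extra lines (developer tools, logging), and the MyBlazorHybridApp namespace simply matches the example service above.

using MyBlazorHybridApp.Services;
using Plugin.Maui.Audio;

namespace MyBlazorHybridApp;

public static class MauiProgram
{
    public static MauiApp CreateMauiApp()
    {
        var builder = MauiApp.CreateBuilder();
        builder
            .UseMauiApp<App>()
            .ConfigureFonts(fonts =>
            {
                fonts.AddFont("OpenSans-Regular.ttf", "OpenSansRegular");
            });

        builder.Services.AddMauiBlazorWebView();

        // Register the plugin's audio manager and the custom audio service.
        builder.Services.AddSingleton(AudioManager.Current);
        builder.Services.AddSingleton<IAudioService, AudioService>();

        var app = builder.Build();
        return app;
    }
}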
In your razor file, it will look like the below.

@* other usings and injections above *@
@using MyBlazorHybridApp.Services
@inject IAudioService AudioService

<div class="flex-column mt3">
    <div class="f5 mb2">
        Record an audio
    </div>
    <div class="f5 mb2">
        @if (!AudioService.IsRecording)
        {
            <button @onclick="StartRecording"> Start</button>
        }
        else
        {
            <button @onclick="StopRecording"> Stop</button>
        }
    </div>
    <audio controls src="@AudioSource">
        Your browser does not support the audio element.
    </audio>
</div>

@code {
    private Stream recordedAudioStream;
    private string AudioSource { get; set; }

    private async Task StartRecording()
    {
        if (await Permissions.RequestAsync<Permissions.Microphone>() != PermissionStatus.Granted)
        {
            // Inform the user that microphone permission is required.
        }
        else
        {
            await AudioService.StartRecordingAsync();
        }
    }

    private async Task StopRecording()
    {
        recordedAudioStream = await AudioService.StopRecordingAsync();

        // Copy the recording into memory and expose it to the <audio> tag as a base64 data URL.
        using var memoryStream = new MemoryStream();
        await recordedAudioStream.CopyToAsync(memoryStream);
        var base64String = Convert.ToBase64String(memoryStream.ToArray());

        // Depending on the platform, the recording may be WAV rather than MP3;
        // change the MIME type (e.g. audio/wav) if playback fails.
        AudioSource = $"data:audio/mpeg;base64,{base64String}";
        StateHasChanged();
    }
}
The above code is for the UI. It injects the service into the component and has buttons that start and stop recording accordingly. When stopping, I save the stream returned by the service into a Stream called recordedAudioStream, then convert it into a base64 string so that AudioSource can be populated with a data URL the audio element can play. So, when one presses Start, recording begins; when one presses Stop, recording stops and the just-recorded audio is loaded into the audio player, where one can replay it. This approach means you do not have to build a playback UI yourself but can use what the browser already provides, like the audio tag. I hope this has helped someone.
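If you want to reuse the conversion step elsewhere, the base64 logic from StopRecording can be pulled out into a small helper like the one below. The method name and the default MIME type are my own choices, not something defined by the article's service or the plugin.

// Hypothetical helper: turn a recorded stream into a data URL that the <audio> tag can play.
// Pass audio/wav or audio/mpeg to match whatever format the recorder actually produced.
private static async Task<string> ToDataUrlAsync(Stream audioStream, string mimeType = "audio/wav")
{
    using var memoryStream = new MemoryStream();
    await audioStream.CopyToAsync(memoryStream);
    return $"data:{mimeType};base64,{Convert.ToBase64String(memoryStream.ToArray())}";
}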
Happy Coding 🧑‍💻.
Top comments
I am working on a project where I have to record audio and transcribe it to text,
so I have exactly the same problem as you had (or have?).
I tried your code with a standalone Hybrid app and it works very well.
Although I also found some interop solutions, I was thinking about using the plugin as well.
The problem is that I want to share the razor pages with the web version of the app.
Therefore I created a shared project as proposed in the .NET MAUI news (the new template for .NET 9), but set it up to target .NET 8.
Naturally, the "shared" razor pages cannot use the AudioService from the .NET MAUI project, since the shared project does not know how to handle it.
The only option I see is to create a local version of the razor page inside the MAUI project, or do you have an idea for a better solution?
Hi @whann0205, apologies for the late reply. How did you go about solving it, or are you still stuck?
Yes, I have found a solution.
All pages can live in a shared library and still access the plugin.
The only thing to do is move the IAudioService interface to the shared project and let the AudioService implementation live in the MAUI project.
Register it in MauiProgram.cs like this:
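// A sketch of that registration in MauiProgram.cs (the namespaces are assumptions:
// IAudioService lives in the shared class library, AudioService stays in the MAUI project).
builder.Services.AddSingleton(AudioManager.Current);
builder.Services.AddSingleton<IAudioService, AudioService>();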
For the web version of the solution one has to find something else, but this way it is possible to share the pages.
What helped me most was this example from BethMassi (github.com/BethMassi/HybridSharedUI).