---
title: Audio (expo-audio)
description: A library that provides an API to implement audio playback and recording in apps.
sourceCodeUrl: https://github.com/expo/expo/tree/sdk-52/packages/expo-audio
packageName: expo-audio
iconUrl: /static/images/packages/expo-av.png
platforms: ["android", "ios", "web"]
isNew: true
---
> **warning** This page documents an upcoming version of the Audio library. **Expo Audio is currently in alpha and subject to breaking changes.**

`expo-audio` is a cross-platform audio library for accessing the native audio capabilities of the device.

Note that audio playback stops automatically if headphones or Bluetooth audio devices are disconnected.
## Installation
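Install the library in your project by running the following command:

```sh
npx expo install expo-audio
```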
## Configuration in app config
You can configure `expo-audio` using its built-in [config plugin](/config-plugins/introduction/) if you use config plugins in your project ([EAS Build](/build/introduction) or `npx expo run:[android|ios]`). The plugin allows you to configure various properties that cannot be set at runtime and require building a new app binary to take effect. If your app does **not** use EAS Build, you'll need to configure the package manually (a sketch of the manual setup follows the example below).
```json app.json
{
  "expo": {
    "plugins": [
      [
        "expo-audio",
        {
          "microphonePermission": "Allow $(PRODUCT_NAME) to access your microphone."
        }
      ]
    ]
  }
}
```
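For manual configuration (without the config plugin), the relevant native settings are the iOS microphone usage description and the Android record-audio permission. A minimal sketch, assuming a bare project with native **ios** and **android** directories, is shown below; the exact file paths depend on your project:

```xml
<!-- ios/<YourApp>/Info.plist -->
<key>NSMicrophoneUsageDescription</key>
<string>Allow $(PRODUCT_NAME) to access your microphone.</string>

<!-- android/app/src/main/AndroidManifest.xml -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```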
## Usage
### Playing sounds
```jsx
import { useAudioPlayer } from 'expo-audio';
import { Button, StyleSheet, View } from 'react-native';

// Load the audio source from a local asset.
const audioSource = require('./assets/Hello.mp3');

export default function App() {
  const player = useAudioPlayer(audioSource);

  return (
    <View style={styles.container}>
      <Button title="Play Sound" onPress={() => player.play()} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    backgroundColor: '#ecf0f1',
    padding: 10,
  },
});
```
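The player object returned by `useAudioPlayer` also exposes controls beyond `play()`. A rough sketch, assuming the alpha API's `pause()` and `seekTo()` methods behave as their names suggest:

```jsx
import { useAudioPlayer } from 'expo-audio';
import { Button, View } from 'react-native';

// Hypothetical example: assumes the player exposes pause() and seekTo(),
// as suggested by the expo-audio alpha API.
export default function PlayerControls() {
  const player = useAudioPlayer(require('./assets/Hello.mp3'));

  return (
    <View>
      <Button title="Play" onPress={() => player.play()} />
      <Button title="Pause" onPress={() => player.pause()} />
      {/* seekTo() takes a position in seconds; 0 rewinds to the start. */}
      <Button title="Restart" onPress={() => player.seekTo(0)} />
    </View>
  );
}
```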
### Recording sounds
```jsx
import { useEffect } from 'react';
import { Alert, Button, StyleSheet, View } from 'react-native';
import { AudioModule, RecordingPresets, useAudioRecorder } from 'expo-audio';

export default function App() {
  const audioRecorder = useAudioRecorder(RecordingPresets.HIGH_QUALITY);

  const record = async () => {
    await audioRecorder.prepareToRecordAsync();
    audioRecorder.record();
  };

  const stopRecording = async () => {
    // The recording will be available on `audioRecorder.uri`.
    await audioRecorder.stop();
  };

  useEffect(() => {
    // Ask for microphone access once when the component mounts.
    (async () => {
      const status = await AudioModule.requestRecordingPermissionsAsync();
      if (!status.granted) {
        Alert.alert('Permission to access microphone was denied');
      }
    })();
  }, []);

  return (
    <View style={styles.container}>
      <Button title="Start Recording" onPress={record} />
      <Button title="Stop Recording" onPress={stopRecording} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    backgroundColor: '#ecf0f1',
    padding: 10,
  },
});
```
### Playing or recording audio in background

On iOS, audio playback and recording in the background are only available in standalone apps, and they require some extra configuration. Each background feature requires a special key in the `UIBackgroundModes` array in your **Info.plist** file. In standalone apps this array is empty by default, so to use background features you will need to add the appropriate keys to your **app.json** configuration.

The following **app.json** example enables audio playback in the background:
```json
{
  "expo": {
    ...
    "ios": {
      ...
      "infoPlist": {
        ...
        "UIBackgroundModes": [
          "audio"
        ]
      }
    }
  }
}
```
### Notes on web usage
- A MediaRecorder issue on Chrome produces WebM files missing the duration metadata. [See the open Chromium issue](https://bugs.chromium.org/p/chromium/issues/detail?id=642012).
- MediaRecorder encoding options and other configurations are inconsistent across browsers. Using a polyfill such as [kbumsik/opus-media-recorder](https://github.com/kbumsik/opus-media-recorder) or [ai/audio-recorder-polyfill](https://github.com/ai/audio-recorder-polyfill) in your application will improve your experience. Any options passed to `prepareToRecordAsync` will be passed directly to the MediaRecorder API (and therefore to the polyfill); see the sketch after this list.
- Web browsers require sites to be served securely for them to listen to a mic. See [MediaDevices `getUserMedia()` security](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia#security) for more details.
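As a rough sketch of wiring up one of these polyfills on web, assuming the `audio-recorder-polyfill` package has been added to your project, you could register it before any recording code runs so that the standard MediaRecorder API resolves to the polyfill:

```js
// Hypothetical web-only setup, for example in your app's entry file.
// Assumes the `audio-recorder-polyfill` package is installed.
import AudioRecorderPolyfill from 'audio-recorder-polyfill';

if (typeof window !== 'undefined' && typeof window.MediaRecorder === 'undefined') {
  // Register the polyfill so MediaRecorder-based recording works in browsers
  // without a native implementation.
  window.MediaRecorder = AudioRecorderPolyfill;
}
```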
## API
```js
import { useAudioPlayer, useAudioRecorder } from 'expo-audio';
```