
Soundgen: An open-source tool for synthesizing nonverbal vocalizations

Overview of attention for article published in Behavior Research Methods, July 2018

About this Attention Score

  • In the top 25% of all research outputs scored by Altmetric
  • High Attention Score compared to outputs of the same age (87th percentile)
  • High Attention Score compared to outputs of the same age and source (84th percentile)

Mentioned by

  • 1 blog
  • 19 X users

Citations

  • 62 (Dimensions)

Readers on

  • 88 (Mendeley)
Title
Soundgen: An open-source tool for synthesizing nonverbal vocalizations
Published in
Behavior Research Methods, July 2018
DOI 10.3758/s13428-018-1095-7
Authors

Andrey Anikin

Abstract

Voice synthesis is a useful method for investigating the communicative role of different acoustic features. Although many text-to-speech systems are available, researchers of human nonverbal vocalizations and bioacousticians may profit from a dedicated simple tool for synthesizing and manipulating natural-sounding vocalizations. Soundgen ( https://CRAN.R-project.org/package=soundgen ) is an open-source R package that synthesizes nonverbal vocalizations based on meaningful acoustic parameters, which can be specified from the command line or in an interactive app. This tool was validated by comparing the perceived emotion, valence, arousal, and authenticity of 60 recorded human nonverbal vocalizations (screams, moans, laughs, and so on) with those of their approximate synthetic reproductions. Each synthetic sound was created by manually specifying only a small number of high-level control parameters, such as syllable length and a few anchors for the intonation contour. Nevertheless, the valence and arousal ratings of synthetic sounds were similar to those of the original recordings, and the authenticity ratings were on par with those of the originals for less complex vocalizations. Manipulating the precise acoustic characteristics of synthetic sounds may shed light on the salient predictors of emotion in the human voice. More generally, soundgen may prove useful for any studies that require precise control over the acoustic features of nonspeech sounds, including research on animal vocalizations and auditory perception.
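The abstract describes synthesizing a vocalization from a handful of high-level parameters such as syllable length and intonation anchors. A minimal sketch of what that looks like with the package's main `soundgen()` function is below; the parameter names (`sylLen`, `pitch`, `play`) reflect the package's documented interface, but argument details and defaults should be verified against the current CRAN manual, and the save step via `seewave` is an assumption about the user's preferred export route.

```r
# Minimal sketch, assuming the 'soundgen' package is installed:
# install.packages("soundgen")
library(soundgen)

# Synthesize a short moan-like vocalization from high-level parameters.
# 'sylLen' is the syllable length in ms; 'pitch' gives a few anchors (Hz)
# for the intonation contour, which soundgen interpolates into a smooth curve.
sound <- soundgen(
  sylLen = 400,              # one 400-ms syllable
  pitch  = c(180, 260, 150), # rise-fall intonation contour
  play   = FALSE             # set TRUE to play the result
)

# 'sound' is a numeric waveform; one way to save it (assumed helper):
# seewave::savewav(sound, f = 16000, filename = "moan.wav")

# The interactive app mentioned in the abstract can be launched with:
# soundgen_app()
```

The same parameters can be set interactively in the Shiny app, which is how the paper's 60 synthetic reproductions were matched to the recorded originals.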

X Demographics

The data shown below were collected from the profiles of the 19 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 88 Mendeley readers of this research output.

Geographical breakdown

Country   Count   As %
Unknown   88      100%

Demographic breakdown

Readers by professional status        Count   As %
Student > Ph.D. Student               16      18%
Researcher                            15      17%
Student > Bachelor                    8       9%
Student > Master                      6       7%
Student > Doctoral Student            5       6%
Other                                 12      14%
Unknown                               26      30%

Readers by discipline                 Count   As %
Agricultural and Biological Sciences  16      18%
Computer Science                      10      11%
Psychology                            9       10%
Environmental Science                 5       6%
Linguistics                           3       3%
Other                                 15      17%
Unknown                               30      34%
Attention Score in Context

This research output has an Altmetric Attention Score of 17. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 18 May 2022.
All research outputs: #2,146,111 of 25,653,515 outputs
Outputs from Behavior Research Methods: #215 of 2,564 outputs
Outputs of similar age: #42,408 of 342,320 outputs
Outputs of similar age from Behavior Research Methods: #8 of 52 outputs
Altmetric has tracked 25,653,515 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 91st percentile: it's in the top 10% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 2,564 research outputs from this source. They typically receive more attention than average, with a mean Attention Score of 8.0. This one has done particularly well, scoring higher than 91% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 342,320 tracked outputs that were published within six weeks on either side of this one in any source. This one has done well, scoring higher than 87% of its contemporaries.
We're also able to compare this research output to 52 others from the same source that were published within six weeks on either side of this one. This one has done well, scoring higher than 84% of its contemporaries.