Publication:

Shannon entropy as a robust estimator of Zipf's Law in animal vocal communication repertoires

Files

There are no files associated with this document.

Date

2021

Authors

Kershenbaum, Arik
Demartsev, Vlad
Gammon, David E.
Geffen, Eli
Gustison, Morgan L.
Ilany, Amiyaal
Lameira, Adriano R.

Publication type
Journal article
Publication status
Published

Published in

Methods in Ecology and Evolution. British Ecological Society. 2021, 12(3), pp. 553-564. ISSN 2041-2096. eISSN 2041-210X. Available under: doi: 10.1111/2041-210X.13536

Abstract

  1. Information complexity in animals is an indicator of advanced communication and an intricate socio‐ecology. Zipf's Law of least effort has been used to assess the potential information content of animal repertoires, including whether or not a particular animal communication could be ‘language‐like’. As all human languages follow Zipf's law, with a power law coefficient (PLC) close to −1, animal signals with similar probability distributions are postulated to possess similar information characteristics to language. However, estimation of the PLC from limited empirical datasets (e.g. most animal communication studies) is problematic because of biases from small sample sizes.
    2. The traditional approach to estimating Zipf's law PLC is to find the slope of a log–log rank‐frequency plot. Our alternative option uses the underlying equivalence between Shannon entropy (i.e. whether successive elements of a sequence are unpredictable, or repetitive) and PLC. Here, we test whether an entropy approach yields more robust estimates of Zipf's law PLC than the traditional approach.
    3. We examined the efficacy of the entropy approach in two ways. First, we estimated the PLC from synthetic datasets generated with a priori known power law probability distributions. This revealed that the estimated PLC using the traditional method is particularly inaccurate for highly stereotyped sequences, even at modest repertoire sizes. Estimation via Shannon entropy is accurate with modest sample sizes even for repertoires with thousands of distinct elements. Second, we applied these approaches to empirical data taken from 11 animal species. Shannon entropy produced a more robust estimate of PLC with lower variance than the traditional method, even when the true PLC is unknown. Our approach for the first time reveals Zipf's law operating in the vocal systems of multiple lineages: songbirds, hyraxes and cetaceans.
    4. As different methods of estimating the PLC can lead to misleading results in real data, estimating the balance of a communication system between simplicity and complexity is best performed using the entropy approach. This provides a more robust way to investigate the evolutionary constraints and processes that have acted on animal communication systems, and the parallels between these processes and the evolution of language.

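The two estimation approaches contrasted in points 2 and 3 of the abstract can be illustrated with a short Python sketch. This is not the authors' published code: it assumes element probabilities of the form p_r ∝ r^a over ranks r = 1..R, estimates the power law coefficient (PLC) once as the slope of the log-log rank-frequency plot and once by numerically inverting the theoretical Shannon entropy of such a power-law distribution; the function names, the repertoire_size parameter and the root-finding bracket are illustrative choices.

import numpy as np
from collections import Counter
from scipy.optimize import brentq

def plc_from_rank_frequency(sequence):
    # Traditional estimate: slope of the log-log rank-frequency plot.
    counts = np.array(sorted(Counter(sequence).values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(counts) + 1)
    slope, _intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
    return slope

def zipf_entropy(a, R):
    # Shannon entropy (in bits) of a power-law distribution p_r proportional to r**a over R ranks.
    p = np.arange(1, R + 1, dtype=float) ** a
    p /= p.sum()
    return -np.sum(p * np.log2(p))

def plc_from_entropy(sequence, repertoire_size=None):
    # Entropy-based estimate: find the exponent whose theoretical entropy matches
    # the observed Shannon entropy of the element distribution.
    counts = Counter(sequence)
    R = repertoire_size or len(counts)
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    h_obs = -np.sum(p * np.log2(p))
    # zipf_entropy(a, R) rises monotonically from about 0 (a very negative) to log2(R) (a = 0),
    # so the matching exponent can be recovered by root finding over a bracketing interval.
    return brentq(lambda a: zipf_entropy(a, R) - h_obs, -20.0, 0.0)

Applied to a synthetic sequence drawn with Zipfian probabilities (for example np.random.choice(R, size=500, p=p_true)), both functions return estimates of the exponent; the entropy-based inversion is the estimator that the abstract argues is more robust for small samples and highly stereotyped repertoires.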

Subject (DDC)
570 Life sciences, biology

Keywords

animal communication, information theory, language, Shannon entropy, Zipf's Law

Cite

ISO 690
KERSHENBAUM, Arik, Vlad DEMARTSEV, David E. GAMMON, Eli GEFFEN, Morgan L. GUSTISON, Amiyaal ILANY, Adriano R. LAMEIRA, 2021. Shannon entropy as a robust estimator of Zipf's Law in animal vocal communication repertoires. In: Methods in Ecology and Evolution. British Ecological Society. 2021, 12(3), pp. 553-564. ISSN 2041-2096. eISSN 2041-210X. Available under: doi: 10.1111/2041-210X.13536
BibTeX
@article{Kershenbaum2021Shann-52906,
  year={2021},
  doi={10.1111/2041-210X.13536},
  title={Shannon entropy as a robust estimator of Zipf's Law in animal vocal communication repertoires},
  number={3},
  volume={12},
  issn={2041-2096},
  journal={Methods in Ecology and Evolution},
  pages={553--564},
  author={Kershenbaum, Arik and Demartsev, Vlad and Gammon, David E. and Geffen, Eli and Gustison, Morgan L. and Ilany, Amiyaal and Lameira, Adriano R.}
}
RDF
<rdf:RDF
    xmlns:dcterms="http://purl.org/dc/terms/"
    xmlns:dc="http://purl.org/dc/elements/1.1/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:bibo="http://purl.org/ontology/bibo/"
    xmlns:dspace="http://digital-repositories.org/ontologies/dspace/0.1.0#"
    xmlns:foaf="http://xmlns.com/foaf/0.1/"
    xmlns:void="http://rdfs.org/ns/void#"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema#" > 
  <rdf:Description rdf:about="https://kops.uni-konstanz.de/server/rdf/resource/123456789/52906">
    <dcterms:rights rdf:resource="https://rightsstatements.org/page/InC/1.0/"/>
    <dc:contributor>Geffen, Eli</dc:contributor>
    <dc:creator>Ilany, Amiyaal</dc:creator>
    <dc:contributor>Lameira, Adriano R.</dc:contributor>
    <dc:contributor>Demartsev, Vlad</dc:contributor>
    <void:sparqlEndpoint rdf:resource="http://localhost/fuseki/dspace/sparql"/>
    <dcterms:isPartOf rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <dc:creator>Geffen, Eli</dc:creator>
    <dc:date rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-02-18T13:29:46Z</dc:date>
    <dc:contributor>Gustison, Morgan L.</dc:contributor>
    <dcterms:abstract xml:lang="eng">1. Information complexity in animals is an indicator of advanced communication and an intricate socio‐ecology. Zipf's Law of least effort has been used to assess the potential information content of animal repertoires, including whether or not a particular animal communication could be ‘language‐like’. As all human languages follow Zipf's law, with a power law coefficient (PLC) close to −1, animal signals with similar probability distributions are postulated to possess similar information characteristics to language. However, estimation of the PLC from limited empirical datasets (e.g. most animal communication studies) is problematic because of biases from small sample sizes.&lt;br /&gt;2. The traditional approach to estimating Zipf's law PLC is to find the slope of a log–log rank‐frequency plot. Our alternative option uses the underlying equivalence between Shannon entropy (i.e. whether successive elements of a sequence are unpredictable, or repetitive) and PLC. Here, we test whether an entropy approach yields more robust estimates of Zipf's law PLC than the traditional approach.&lt;br /&gt;3. We examined the efficacy of the entropy approach in two ways. First, we estimated the PLC from synthetic datasets generated with a priori known power law probability distributions. This revealed that the estimated PLC using the traditional method is particularly inaccurate for highly stereotyped sequences, even at modest repertoire sizes. Estimation via Shannon entropy is accurate with modest sample sizes even for repertoires with thousands of distinct elements. Second, we applied these approaches to empirical data taken from 11 animal species. Shannon entropy produced a more robust estimate of PLC with lower variance than the traditional method, even when the true PLC is unknown. Our approach for the first time reveals Zipf's law operating in the vocal systems of multiple lineages: songbirds, hyraxes and cetaceans.&lt;br /&gt;4. As different methods of estimating the PLC can lead to misleading results in real data, estimating the balance of a communication system between simplicity and complexity is best performed using the entropy approach. This provides a more robust way to investigate the evolutionary constraints and processes that have acted on animal communication systems, and the parallels between these processes and the evolution of language.</dcterms:abstract>
    <dcterms:available rdf:datatype="http://www.w3.org/2001/XMLSchema#dateTime">2021-02-18T13:29:46Z</dcterms:available>
    <dcterms:issued>2021</dcterms:issued>
    <dc:rights>terms-of-use</dc:rights>
    <dcterms:title>Shannon entropy as a robust estimator of Zipf's Law in animal vocal communication repertoires</dcterms:title>
    <dc:contributor>Kershenbaum, Arik</dc:contributor>
    <dc:contributor>Ilany, Amiyaal</dc:contributor>
    <dspace:isPartOfCollection rdf:resource="https://kops.uni-konstanz.de/server/rdf/resource/123456789/28"/>
    <dc:creator>Demartsev, Vlad</dc:creator>
    <foaf:homepage rdf:resource="http://localhost:8080/"/>
    <dc:creator>Kershenbaum, Arik</dc:creator>
    <bibo:uri rdf:resource="https://kops.uni-konstanz.de/handle/123456789/52906"/>
    <dc:language>eng</dc:language>
    <dc:creator>Gammon, David E.</dc:creator>
    <dc:creator>Lameira, Adriano R.</dc:creator>
    <dc:contributor>Gammon, David E.</dc:contributor>
    <dc:creator>Gustison, Morgan L.</dc:creator>
  </rdf:Description>
</rdf:RDF>

Alliance licence
Corresponding authors at the University of Konstanz present
International co-authors
University bibliography
Peer-reviewed
Yes