
Domain-Specific Languages


Table of Contents

  Book Overview
  Chapter 1  J Is for JavaScript: A Direct-Style Correspondence between Algol-Like Languages and JavaScript Using First-Class Continuations
  Chapter 2  Model-Driven Engineering from Modular Monadic Semantics: Implementation Techniques Targeting Hardware and Software
  Chapter 3  A MuDDy Experience–ML Bindings to a BDD Library
  Chapter 4  Gel: A Generic Extensible Language
  Chapter 5  A Taxonomy-Driven Approach to Visually Prototyping Pervasive Computing Applications
  Chapter 6  LEESA: Embedding Strategic and XPath-Like Object Structure Traversals in C++
  Chapter 7  Unit Testing for Domain-Specific Languages
  Chapter 8  Combining DSLs and Ontologies Using Metamodel Integration
  Chapter 9  A Domain Specific Language for Composable Memory Transactions in Java
  Chapter 10 CLOPS: A DSL for Command Line Options
  Chapter 11 Nettle: A Language for Configuring Routing Networks
  Chapter 12 Generic Libraries in C++ with Concepts from High-Level Domain Descriptions in Haskell
  Chapter 13 Domain-Specific Language for HW/SW Co-design for FPGAs
  Chapter 14 A Haskell Hosted DSL for Writing Transformation Systems
  Chapter 15 Varying Domain Representations in Hagl
  Chapter 16 A DSL for Explaining Probabilistic Reasoning
  Chapter 17 Embedded Probabilistic Programming
  Chapter 18 Operator Language: A Program Generation Framework for Fast Kernels
Overall attention for this book and its chapters

About this Attention Score

  • Good Attention Score compared to outputs of the same age (70th percentile)
  • Good Attention Score compared to outputs of the same age and source (74th percentile)

Mentioned by

  • 2 X users
  • 3 Wikipedia pages

Citations

  • 78 Dimensions

Readers on

  • 16 Mendeley
Title
Domain-Specific Languages
Published by
ADS, July 2009
DOI 10.1007/978-3-642-03034-5
ISBNs
978-3-642-03033-8, 978-3-642-03034-5
Editors

Taha, Walid Mohamed

X Demographics

The data shown below were collected from the profiles of 2 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for 16 Mendeley readers of this research output.

Geographical breakdown

  Country   Count   As %
  Unknown   16      100%

Demographic breakdown

Readers by professional status:
  Researcher          1    6%
  Unknown            15   94%

Readers by discipline:
  Computer Science    1    6%
  Unknown            15   94%
Attention Score in Context

This research output has an Altmetric Attention Score of 5. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 08 June 2021.
All research outputs: #5,958,657 of 23,041,514 outputs
Outputs from ADS: #7,666 of 37,451 outputs
Outputs of similar age: #31,682 of 110,963 outputs
Outputs of similar age from ADS: #73 of 301 outputs
Altmetric has tracked 23,041,514 research outputs across all sources so far. This one has received more attention than most of these and is in the 73rd percentile.
So far Altmetric has tracked 37,451 research outputs from this source. They receive a mean Attention Score of 4.6. This one has done well, scoring higher than 79% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age we can compare this Altmetric Attention Score to the 110,963 tracked outputs that were published within six weeks on either side of this one in any source. This one has gotten more attention than average, scoring higher than 70% of its contemporaries.
We're also able to compare this research output to 301 others from the same source and published within six weeks on either side of this one. This one has gotten more attention than average, scoring higher than 74% of its contemporaries.
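The four in-context comparisons above reduce to a percentile-rank calculation: given an output's rank among N tracked outputs, its percentile is roughly the share of outputs it outranks. A minimal sketch of that arithmetic, using the ranks reported on this page (Altmetric's exact formula, including its tie handling and rounding, is not public, so the integer floor used here is an assumption):

```python
def percentile_rank(rank: int, total: int) -> int:
    """Percentage of outputs that this output outranks.

    rank is 1-based (1 = most attention). Assumes no ties; Altmetric's
    actual tie handling may shift the result by a point or so.
    """
    return (total - rank) * 100 // total

# The four comparisons reported for this book:
contexts = {
    "All research outputs":            (5_958_657, 23_041_514),
    "Outputs from ADS":                (7_666, 37_451),
    "Outputs of similar age":          (31_682, 110_963),
    "Outputs of similar age from ADS": (73, 301),
}

for name, (rank, total) in contexts.items():
    print(f"{name}: ~{percentile_rank(rank, total)}th percentile")
```

With this naive formula, three of the four values land one point above the percentiles quoted on the page (74 vs. 73, 71 vs. 70, 75 vs. 74), which is consistent with Altmetric applying a slightly stricter, tie-aware computation.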