
Machine Learning and Knowledge Extraction

Overview of attention for this book

Table of Contents

  1. Book Overview
  2. Chapter 1 Current Advances, Trends and Challenges of Machine Learning and Knowledge Extraction: From Machine Learning to Explainable AI
  3. Chapter 2 A Modified Particle Swarm Optimization Algorithm for Community Detection in Complex Networks
  4. Chapter 3 Mouse Tracking Measures and Movement Patterns with Application for Online Surveys
  5. Chapter 4 Knowledge Compilation Techniques for Model-Based Diagnosis of Complex Active Systems
  6. Chapter 5 Recognition of Handwritten Characters Using Google Fonts and Freeman Chain Codes
  7. Chapter 6 An Efficient Approach for Extraction Positive and Negative Association Rules from Big Data
  8. Chapter 7 Field-Reliability Predictions Based on Statistical System Lifecycle Models
  9. Chapter 8 Building a Knowledge Based Summarization System for Text Data Mining
  10. Chapter 9 Spanish Twitter Data Used as a Source of Information About Consumer Food Choice
  11. Chapter 10 Feedback Matters! Predicting the Appreciation of Online Articles: A Data-Driven Approach
  12. Chapter 11 Creative Intelligence – Automating Car Design Studio with Generative Adversarial Networks (GAN)
  13. Chapter 12 A Combined CNN and LSTM Model for Arabic Sentiment Analysis
  14. Chapter 13 Between the Lines: Machine Learning for Prediction of Psychological Traits – A Survey
  15. Chapter 14 LawStats – Large-Scale German Court Decision Evaluation Using Web Service Classifiers
  16. Chapter 15 Clinical Text Mining for Context Sequences Identification
  17. Chapter 16 A Multi-device Assistive System for Industrial Maintenance Operations
  18. Chapter 17 Feedback Presentation for Workers in Industrial Environments – Challenges and Opportunities
  19. Chapter 18 On a New Method to Build Group Equivariant Operators by Means of Permutants
  20. Chapter 19 Topological Characteristics of Digital Models of Geological Core
  21. Chapter 20 Shortened Persistent Homology for a Biomedical Retrieval System with Relevance Feedback
  22. Chapter 21 Explainable AI: The New 42?
  23. Chapter 22 A Rule Extraction Study Based on a Convolutional Neural Network
  24. Chapter 23 Evaluating Explanations by Cognitive Value
  25. Chapter 24 Measures of Model Interpretability for Model Selection
  26. Chapter 25 Regular Inference on Artificial Neural Networks
Attention for Chapter 12: A Combined CNN and LSTM Model for Arabic Sentiment Analysis

About this Attention Score

  • Good Attention Score compared to outputs of the same age (66th percentile)
  • High Attention Score compared to outputs of the same age and source (83rd percentile)

Mentioned by

  • 1 policy source
  • 5 X users

Citations

  • 15 Dimensions

Readers on

  • 220 Mendeley
Chapter title: A Combined CNN and LSTM Model for Arabic Sentiment Analysis
Chapter number: 12
Book title: Machine Learning and Knowledge Extraction
Published in: arXiv, August 2018
DOI: 10.1007/978-3-319-99740-7_12
Book ISBNs: 978-3-319-99739-1, 978-3-319-99740-7

Authors: Abdulaziz M. Alayba, Vasile Palade, Matthew England, Rahat Iqbal

X Demographics

The data shown below were collected from the profiles of the 5 X users who shared this research output.
Mendeley readers

The data shown below were compiled from readership statistics for the 220 Mendeley readers of this research output.

Geographical breakdown

Country    Count    As %
Unknown    220      100%

Demographic breakdown

Readers by professional status    Count    As %
Student > Ph.D. Student           33       15%
Student > Master                  28       13%
Student > Bachelor                20       9%
Researcher                        15       7%
Lecturer                          10       5%
Other                             22       10%
Unknown                           92       42%

Readers by discipline                           Count    As %
Computer Science                                85       39%
Engineering                                     12       5%
Unspecified                                     5        2%
Biochemistry, Genetics and Molecular Biology    3        1%
Chemistry                                       3        1%
Other                                           13       6%
Unknown                                         99       45%
Attention Score in Context

This research output has an Altmetric Attention Score of 5. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 26 January 2022.
  • All research outputs: of 24,002,307 outputs
  • Outputs from arXiv: of 1,011,770 outputs
  • Outputs of similar age: of 338,253 outputs
  • Outputs of similar age from arXiv: of 23,982 outputs
Altmetric has tracked 24,002,307 research outputs across all sources so far. This one has received more attention than most of these and is in the 71st percentile.
So far Altmetric has tracked 1,011,770 research outputs from this source. They receive a mean Attention Score of 4.0. This one has done well, scoring higher than 85% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 338,253 tracked outputs that were published within six weeks on either side of this one in any source. This one has received more attention than average, scoring higher than 66% of its contemporaries.
We're also able to compare this research output to 23,982 others from the same source and published within six weeks on either side of this one. This one has done well, scoring higher than 83% of its contemporaries.
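The age-adjusted comparisons above amount to a percentile rank: the fraction of a cohort of similarly-dated outputs that scored lower than this one. As a minimal sketch of that calculation (the cohort values below are hypothetical, not Altmetric's actual data):

```python
def percentile_rank(score, cohort):
    """Percentage of cohort members scoring strictly lower than `score`."""
    lower = sum(1 for s in cohort if s < score)
    return 100 * lower / len(cohort)

# Hypothetical Attention Scores for outputs of similar age
cohort_scores = [0, 1, 1, 2, 3, 3, 4, 5, 7, 12]
print(percentile_rank(5, cohort_scores))  # 70.0
```

Note that Altmetric's real percentiles are computed over hundreds of thousands of outputs and recalculated when an output is last mentioned, so the figures quoted above are snapshots, not live values.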