Two days ago, scientists, science communicators, journalists, developers, publishers, and educators (to name just a few) descended upon London for the SpotOn London 2012 conference hosted by the Nature Publishing Group. “SpotOn”, which stands for science, policy, outreach, and tools online, brought together people who are passionate about improving scientific communication in the age of the Internet. With all the members of our team in attendance, Altmetric had a big presence at the conference. We helped to organise the Hackday fringe event and sat on the panel of the “Altmetrics beyond the numbers” session.
We at Altmetric, along with our colleagues at Digital Science, enjoyed this opportunity to learn from and mingle with an impressive collection of experts in various fields, including journal publishing, science journalism, social media/marketing, programming, and of course, scientific research. Below are some of our general impressions of the conference.
1. Thought-provoking plenaries
The conference opened with a frenetic, thought-provoking speech by author and medical doctor Ben Goldacre, who expounded on the need to impose new structures on existing data in order to obtain novel insights. Next up was Kamila Markram, co-founder of the journal publishing and social platform Frontiers, who gave an open-access primer, describing how the needs of “cyberscientists” are being met with open-access journals, data publishing repositories, specialised software, and article-level metrics. The conference closed with an excellent plenary session called “How to do smart journalism on complex science”, which featured the perspectives of notable science communicators Ed Yong, James Randerson, Victoria Gill, and Stephen Curry.
2. Engaging live and remote discussions
During the conference, hundreds of conversations took place both in person and online. (Since this past Saturday, Altmetric has tracked over 10,000 tweets across all of the hashtags associated with the conference sessions. Our goal now is to set up a way to navigate intelligently through these data – more on this later.) For us, there were three stand-out sessions, namely:
- “Altmetrics beyond the numbers” (#solo12alt / video) – Panelists fielded excellent questions from the audience about how valuable alternative metrics are for tracking the impact of scholarly articles, as well as whether altmetrics will remain relevant if social media culture changes in the future.
- “The journal is dead, long live the journal” (#solo12journals / video) – The changing landscape of journal publishing – pushed by the open-access movement and changing views of the “brands” of journals – was the subject of a lively discussion by industry expert panelists and members of the audience.
- “Fixing the fraud: How do we safeguard science from misconduct?” (#solo12fraud / video) – A diverse panel broke down the problem of scientific fraud into components, citing the “publish or perish” culture as a main driving factor for dishonesty, and the profound lack of replication experiments as a major issue that needs to be formally addressed by scientists and publishers.
3. The “connectedness” of the conference
All talks, panel sessions, and workshops were live-streamed online. Additionally, each session was live-tweeted by members of the audience (a smartphone, tablet, or laptop was in the hands of nearly every attendee), and responses from external viewers poured in on Twitter. Notably, many session moderators monitored the Twitter conversations and often asked the panelists questions originating from remote viewers. What resulted was a fuller experience for both live and remote audiences: along with all of the great content being discussed in the live sessions, simultaneous conversations online added new layers of insight for all. Moreover, with such a diverse selection of concurrent sessions, having the archived live-stream and tweets enabled attendees to get up to speed with everything they had missed.
The need to be specific
The main weakness of the conference could best be described as a lack of specificity in many of the session discussions. Certainly, with so many people from different disciplines converging in one space, the session organisers could have risked alienating certain groups by launching into highly technical discussions. However, taking a superficial approach to particular issues without providing specific examples or anecdotes caused some of the discussions to stray away from science, leaving us feeling as if many issues were only being addressed hypothetically. For instance, sessions relating to the use of social media and Twitter in science did not seem to convey the benefits or challenges inherent in participating in science- or research-related tweeting. There was also little elaboration on the best practices and tools that could be used to improve scientific engagement through social media. In the future, referring to evidence, concrete examples, and case studies would greatly improve the quality of the discussions.
Post-SpotOn, publishers and the research community at large now have a lot more to think about and discuss. Interestingly, perhaps reflecting the growing unease with the Journal Impact Factor, #altmetrics (“alternative metrics”) were the subject of much discussion during the conference. Alternative metrics, often erroneously portrayed as a usurper of the Impact Factor, are really just different (not replacement) indicators of impact. In the next post, we’ll address some of the questions that came up during the “Altmetrics beyond the numbers” session at SpotOn London.