Facilitator: Linda Galloway
Note-taker: Christy Caldwell
Altmetrics is, essentially, different ways of measuring scholarly output.
Definition of altmetrics, or rather what it is not:
- Not citation metrics
- Not raw usage statistics (like COUNTER)
Buckets of altmetric types:
- Scholarly activity – Mendeley stats
- Social activity – Twitter
- Scholarly commentary – blogs
- Mass media
- Data, software reuse / citation
How do you know there’s consistency? That you have an apples-to-apples comparison?
A NISO white paper is being drafted right now.
This might help with tenure and promotion, though the NISO study has found no tenure/promotion uptake so far. ImpactStory is trying to collect stories of how people are using altmetrics in their own promotion packets.
Still so new – how is this even measurable?
Might be more important later, but need to pay attention now.
The Wellcome Trust is looking at altmetrics.
Harvard has looked at it for tenure and promotion; slides are online (Amy Brand).
Sentiment analysis – what’s the qualitative analysis?
Likes on Facebook: what do lots of likes mean?
Correlations across sources could help with quality signals: e.g., Reddit and Facebook activity but no blog coverage or Mendeley readership might indicate low quality (see the toy sketch below).
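This wasn’t shown in the session, but as a toy sketch of what “correlations across sources” could mean in practice: rank-correlate per-article counts from two sources and see whether they agree. The numbers here are made up.

```python
# Toy illustration: do two altmetric sources rank articles the same way?
from scipy.stats import spearmanr

# Hypothetical per-article counts (same five articles in both lists).
mendeley_readers = [120, 15, 300, 8, 45]
facebook_shares  = [40, 90, 10, 75, 5]

rho, p = spearmanr(mendeley_readers, facebook_shares)
# A low or negative rho means the sources diverge for these articles.
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
```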
Transparency is an issue: e.g., how are influential tweets measured?
Long tail: 2nd- and 3rd-level metrics (Plum Analytics) – not just capturing the NYT article, but then following impact onward from there. Not much out there yet.
Actually finding these mentions of articles is a data-mining challenge; a minimal sketch of querying one aggregator follows.
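The session didn’t cover tooling, but as a minimal sketch of what harvesting mentions can look like, here is a call to the public Altmetric v1 API for a single DOI. The endpoint and the cited_by_*_count field names follow Altmetric’s v1 API; the DOI is hypothetical, and a real harvester would need an API key, rate limiting, and many more sources.

```python
import requests

def fetch_mentions(doi):
    """Return the Altmetric summary record for a DOI, or None if untracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if resp.status_code == 404:  # Altmetric has no mentions for this DOI
        return None
    resp.raise_for_status()
    return resp.json()

record = fetch_mentions("10.1371/journal.pone.0000000")  # hypothetical DOI
if record:
    print("tweets:", record.get("cited_by_tweeters_count", 0))
    print("blog posts:", record.get("cited_by_feeds_count", 0))
    print("news stories:", record.get("cited_by_msm_count", 0))
```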
Cultural differences: in Latin America, Facebook is the primary way of discussing scholarship, but it is private. China has its own social media infrastructure.
One size does not fit all, especially internationally.
Article-level metrics exist at PLOS, but their legacy code limits scaling up.
At least one person in the discussion used altmetrics in their own librarian promotion packet.
ORCID is part of this: perhaps with SciENcv, to bring it all together (see the sketch below).
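A minimal sketch, assuming the ORCID public API (v3.0), of how an ORCID iD could act as the hub that ties a researcher’s outputs together: list the DOIs attached to a public record, which could then be fed to mention-harvesting like the sketch above. The example iD is ORCID’s public demo record (Josiah Carberry).

```python
import requests

def orcid_dois(orcid_id):
    """Return DOIs attached to works on a public ORCID record."""
    resp = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}/works",
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    dois = []
    # Works are grouped; each group carries its external IDs (DOIs among them).
    for group in resp.json().get("group", []):
        for ext_id in group.get("external-ids", {}).get("external-id", []):
            if ext_id.get("external-id-type") == "doi":
                dois.append(ext_id.get("external-id-value"))
    return dois

print(orcid_dois("0000-0002-1825-0097"))  # ORCID's public demo record
```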
NSF has a grant on capturing use and reuse of software to measure the impact of research work.
Outreach: someone had success with workshops through WISE.
UCB: a “smart publishing” workshop; altmetrics was part of it.
Working with the faculty development office has worked well.
It’s very difficult to collect data across the various profiles people use within an institution.
A few hundred thousand authors have submitted ORCID iDs to Scopus.
Google Scholar doesn’t include any altmetrics or ORCID iDs in its profiles.
With NSF’s expanded definition of research products, there may be a need to measure the impact of non-traditional objects.
Making discoveries about research from these “buckets” is something institutions do care about. R&D, HR, development, and intellectual property transfer are all campus offices that are potential stakeholders.