Presentation by Todd Carpenter during the Project Muse Publishers meeting in Baltimore, MD on April 24, 2014. During this talk, Todd discussed standards related to scholarly publishing and the output of several NISO initiatives.
Todd Carpenter Presentation at Project Muse Publishers Meeting - April 24, 2014
1. Around the publishing technology world in 45 minutes
A bit on NISO & standards for digital content
Authorship & Identification
Demand Driven Acquisition
Open Discovery
Annotation
Altmetrics
April 24, 2014
3. About
• Non-profit industry trade association accredited by ANSI
• Mission of developing and maintaining technical standards related to the information, documentation, discovery, and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
4. National Information Standards Organization (NISO)
• 38% Publishers/Publishing Organizations
• 27% Libraries/Library Organizations
• 35% Library Systems Suppliers, Publishing Vendors, Intermediaries
• 123 LSA Members (non-voting)
[Diagram: NISO in relation to ISO, ANSI, and other SDOs]
5. Standards are familiar, even if you don't notice
Images: DanTaylor; Joel Washing
6. Communicating science has changed
Images: Walters Art Museum; Domenico, Caron, Davis, et al.
13. • ISO 27729: Information and Documentation -- International Standard Name Identifier (ISNI)
• Launched in Spring 2012
• Identifier for the public identity of parties in cultural creation across all media
• Main contributor is the Virtual International Authority File (VIAF) – created by 16 national libraries
14. Nearly 7.5 million ISNIs are assigned
Another 6 million "unverified" names
800,000 researchers/scholars
490,000 institutions
Authoritative identity (ISNI) versus individually asserted ID (ORCID)
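A note on the identifier format: both ISNI and ORCID are 16-character identifiers whose final character is a check digit computed with the ISO/IEC 7064 MOD 11-2 algorithm (ORCID iDs are issued from a block of the ISNI range). A minimal sketch of that check-digit computation:

```python
# ISO/IEC 7064 MOD 11-2 check digit, as used by ISNI and ORCID.
# Takes the first 15 digits (hyphens/spaces removed) and returns
# the 16th character: "0"-"9" or "X".
def mod11_2_check(base15: str) -> str:
    total = 0
    for ch in base15:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

# The commonly cited example ORCID 0000-0002-1825-0097 ends in 7:
print(mod11_2_check("000000021825009"))  # 7
```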
15. Potential reference of the future?
ORCID/ISNI, ISSN, Vol/Issue [DOI metadata], Institution ID, Geo-location [based on ISNI], Date [DOI metadata], DOI
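One way the composite reference sketched on this slide might look as structured data. Every field name and value below is an illustrative placeholder, not a published schema:

```python
# Hypothetical structured form of the "reference of the future":
# identifiers for the person, venue, institution, and article, with
# volume/issue/date drawn from DOI metadata. All values are placeholders.
future_reference = {
    "contributor_id": "orcid-or-isni-placeholder",     # ORCID/ISNI
    "issn": "0000-0000",                               # journal ISSN
    "volume_issue": "12(3)",                           # from DOI metadata
    "institution_id": "institution-isni-placeholder",  # institution ID
    "geo_location": "derived-from-isni",               # based on ISNI
    "date": "2014-04-24",                              # from DOI metadata
    "doi": "10.0000/placeholder",
}

def render(ref: dict) -> str:
    """Collapse the structured reference into one machine-parsable line."""
    return "; ".join(f"{key}={value}" for key, value in ref.items())

print(render(future_reference))
```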
17. The context for ODI
• Emergence of library discovery service solutions
  – Based on an index of a wide range of content
  – Commercial and open access
  – Primary journal literature, ebooks, and more
• Adopted by thousands of libraries around the world, impacting millions of users
18. General Goals
• Define ways for libraries to assess the level of content providers' participation in discovery services
• Help streamline the process by which content providers work with discovery service vendors
• Define models for "fair" linking from discovery services to publishers' content
• Determine what usage statistics should be collected for libraries and for content providers
19. Balance of Constituents: Libraries, Publishers, Service Providers
Marshall Breeding, Independent Consultant
Jamene Brooks-Kieffer, Kansas State University
Laura Morse, Harvard University
Ken Varnum, University of Michigan
Sara Brownmiller, University of Oregon
Lucy Harrison, Florida Virtual Campus (D2D
liaison/observer)
Michele Newberry, Independent
Lettie Conrad, SAGE Publications
Jeff Lang, Thomson Reuters
Linda Beebe, American Psychological Assoc
Aaron Wood, Alexander Street Press
Roger Schonfeld, JSTOR, Ithaka
Jenny Walker, Independent Consultant
John Law, Proquest
Michael Gorrell, EBSCO Information Services
David Lindahl, University of Rochester (XC)
Jeff Penka, OCLC (D2D liaison/observer)
20. Subgroups
• Technical recommendations for data format and data transfer
• Communication of the library's rights/descriptors regarding level of indexing
• Definition of fair linking
• Exchange of usage data
21. Deliverables
• Vocabulary
• NISO Recommended Practice
  – Data format and data transfer
  – Library rights to specific content
  – Level of indexing
  – Fair linking
  – Usage statistics
• Mechanisms to evaluate conformance with the recommended practice
22. Current steps
• 30-day public comment period, October 18 – November 18, 2013
• Working Group evaluation of comments, edits to the RP, responses
• Working Group approval (spring)
• Discovery to Delivery Topic Committee approval (summer)
• NISO publication (summer)
24. Barbara Fister's take on the Five Laws of Library Science
http://www.slideshare.net/bfister/erl-slides-fister
25. "If you're not actively involved in getting what you want, you don't really want it."
– Peter McWilliams, from You Can't Afford the Luxury of a Negative Thought
26. Goals of NISO DDA Initiative
• Create a recommended practice to address the complex issues around Demand Driven Acquisition of monographs
• Develop a flexible model for DDA that works for publishers, vendors, aggregators, and libraries
  – Flexible, but addresses budget, consortial buying, aggregation, and data management needs
27. Timeline
• Appointment of working group (Aug 2012)
• Information gathering
  – Main survey completed (Aug 2013)
  – Interviews
  – Additional surveys: public libraries, consortia
  – Information gathering completed (Nov 2013)
• Completion of initial draft (Mar 2014)
• Gathering of public comments (Mar–Apr 2014)
• Completion of final report (May 2014)
28. Committee members
• Lenny Allen, Oxford University Press
• Stephen Bosch, University of Arizona
• Scott Bourns, JSTOR
• Karin Byström, Uppsala University
• Terry Ehling, Project Muse
• Barbara Kawecki, YBP Library Services
• Lorraine Keelan, Palgrave Macmillan
• Michael Levine-Clark, University of Denver
• Rochelle Logan, Douglas County Libraries
• Lisa Mackinder, University of California, Irvine
• Norm Medeiros, Haverford College
• Lisa Nachgall, Wiley
• Kari Paulson, ProQuest
• Cory Polonetsky, Elsevier
• Jason Price, SCELC
• Dana Sharvit, Ex Libris
• David Whitehair, OCLC
30. Outline of DDA Recommendations
• Goals for DDA
• Choosing content to make available
• Choosing a DDA model
• Profiling content to include
• Loading records
• Removing records
• Assessment
• Preservation
• Consortia DDA
• Public Libraries DDA
31. 1. Establishing Goals
• Four broad goals for DDA:
  – Saving money
  – Spending the same amount of money more wisely
  – Providing broader access
  – Building a permanent collection via patron input
32. 2. Choosing Content to Make Available
• Important issues
  – Not all p-books are available as e-books
  – No single supplier provides all e-books
  – Not all e-books are available via DDA, or under the same models
• Therefore
  – More comprehensive coverage requires more suppliers and more models
  – Broadest coverage possible = include print
  – Approval vendors can help manage DDA across multiple suppliers
• Publishers should recognize that libraries may wish to limit the number of suppliers, and plan accordingly
33. 3. Choosing DDA Models
Mix of auto-purchase and short-term loans, based on the goals of the program
• Auto-purchase
  – Purchase triggered on the first use longer than a free browse
  – Purchase triggered after a set number of uses
  – Purchase triggered after a set number of STLs
• Short-term loans (STLs; short-term rental)
  – A set number of STLs prior to auto-purchase
  – Only STLs, with no auto-purchase
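The trigger rules above can be sketched as a small decision function. The thresholds and event names here are hypothetical illustrations, not part of the recommended practice:

```python
# Hypothetical sketch of a mixed STL/auto-purchase DDA trigger.
# The threshold values and the event model are illustrative only.
FREE_BROWSE_MINUTES = 5   # uses shorter than this count as free browses
STL_LIMIT = 3             # STLs allowed before auto-purchase kicks in

def next_action(use_minutes: float, prior_stls: int) -> str:
    """Decide what a use event triggers under a mixed STL/auto-purchase model."""
    if use_minutes <= FREE_BROWSE_MINUTES:
        return "free-browse"       # no charge, no trigger
    if prior_stls >= STL_LIMIT:
        return "auto-purchase"     # STL cap reached: the library buys the title
    return "short-term-loan"       # charge an STL fee and count it

print(next_action(2, 0))   # free-browse
print(next_action(30, 1))  # short-term-loan
print(next_action(30, 3))  # auto-purchase
```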
34. 3. Choosing DDA Models (cont.)
• Evidence-based acquisition
  – Sometimes the only option, based on platform capabilities
  – Library and publisher should develop expectations based on analysis of past usage
• Publishers may wish to participate in some or all models
• Some concern by publishers about the sustainability of STL
35. 4. Profiling
• DDA profiles should be based on the broadest definitions possible within these areas, relative to the goals of the program
  – Subject coverage should provide access to a wide range of content, even in subjects that may not be core
  – Retrospective coverage for critical mass
    • Especially in programs that otherwise limit coverage
    • May or may not overlap with print holdings, depending on library preference
36. 5. Loading Records
• Libraries should
  – Load records regularly and as soon after receipt as possible
  – Load records into as many discovery tools as possible
  – Code records for easy suppression or removal
  – Enrich metadata to increase discoverability
  – Load point-of-purchase records after purchase to ease the acquisitions workflow/payment
37. 6. Removing Content
• Libraries should:
  – Remove records from all discovery tools as soon as feasible, often using the supplier's delete file
  – Establish a regular cycle for removal
  – Maintain a record of titles removed, for assessment
38. 7. Assessment
• There are multiple reasons for assessment, so it should be planned from the start
  – Measuring overall effectiveness of the program
  – Measuring success at cost reduction
  – Measuring usage
  – Predicting future spending
  – Managing the consideration pool
• Data sources might include
  – COUNTER reports
  – Vendor/publisher-supplied reports
  – ILS or other local data
39. 8. Preservation
Libraries and publishers should work together to ensure that un-owned content remains available, perhaps in partnership with third-party solutions such as LOCKSS and Portico.
40. How DDA impacts specific groups
9. Consortia DDA – three basic models
  – Multiplier (a multiple of list price allows shared ownership)
  – Limited Use (shared ownership, but with a cap on use before a second copy is purchased)
  – Buying Club (shared access to the consideration pool, but individual ownership)
10. Public Library DDA
  – Mediated for greater control (fewer resources)
  – Wish lists
  – Often not through the catalog
45. "Books have been held hostage offline for far too long. Taking them digital will unlock their real hidden value: the readers."
– Clive Thompson, "The Future of Reading in a Digital World," Wired Magazine 17.06 (2009)
51. Chapter/verse? Character count? XPath? Pre/post mark hashing?
Some (imperfect) location methods
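One of the methods listed, pre/post mark anchoring, locates an annotated span by its surrounding text rather than by a fixed offset (this is the idea behind the Open Annotation model's TextQuoteSelector). A simplified sketch, matching on raw context rather than hashes of it; the function name and API are illustrative:

```python
# Sketch of pre/post-text anchoring: find a quoted span by matching
# its surrounding context, so the anchor survives small document edits
# that a plain character offset would not. Illustrative names only.
def anchor(document: str, exact: str, prefix: str, suffix: str) -> int:
    """Return the start offset of `exact` disambiguated by its context,
    or -1 if the selector no longer matches the document."""
    start = document.find(prefix + exact + suffix)
    return start + len(prefix) if start != -1 else -1

doc = "the cat sat on the mat; the cat ran away"
# "the cat" occurs twice; the prefix/suffix context picks the second one.
print(anchor(doc, "the cat", "mat; ", " ran"))  # 24
```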
52. Who's doing what?
• NISO hosted a series of thought-leader meetings in 2012
• Recommended focus on location determination (started a group, now disbanded)
• Open Annotations model based on work with the Open Annotation Collaboration
• W3C Annotations meeting last month
  – New W3C working group forming as part of their Digital Publishing initiative
64. Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project COUNTER
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing and Academic Resources Coalition)
65. Alternative Assessment Initiative: Phase 1 Meetings
• October 9, 2013 – San Francisco, CA
• December 11, 2013 – Washington, DC
• January 23–24 – Philadelphia, PA
• Round of 1-on-1 interviews – March/April
• Phase 1 report expected in May 2014
66. Meetings' general format
• Collocated with other industry meetings
• Morning: lightning talks, post-it brainstorming
• Afternoon: discussion groups
  – X
  – Y
  – Z
  – Report back/react
• Live streamed (video recordings are available)
67. Meeting lightning talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for institutional repositories: Are the metadata ready?
• Snowball Metrics: Global standards for institutional benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter inconsistency
Image: "Lightning" by snowpeak
69. SF meeting – general outputs
• The importance of best practices for media coverage of science (using DOIs, etc.)
• More altmetrics research is needed and could be promoted through this group
• Providing a standard set of research outputs that we can use to compare different services
• The importance of use cases for specific stakeholder groups in driving the discussion forward
70. SF meeting discussions
• Business use cases
  – Publishers want to serve authors, make money
  – People don't value a standard, they value something that helps them
  – … Couldn't identify a logical standard need that actors in the space would value, and best practices are of interest
• Quality data science
  – Themes: context, validation, provenance, quality, description metadata
  – We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can
• Definitions
  – Define "ALM" and "Altmetrics"
  – Map the landscape
71. DC meeting discussions
• Business and use cases
• Discovery – metrics only get generated if material is discovered
• Qualitative vs. quantitative
• Identifying stakeholders and their values
  – stakeholders in outcomes / stakeholders in the process of creating metrics
  – shared values, but tensions
  – branding
• Definitions/defining impact
  – metrics and analyses
  – what led to the success of a citation?
  – how to be certain we are measuring the right things
• Future proofing
  – what won't change
  – impact – hard to establish across disciplines
72. Philly meeting discussions
• Definitions
  – Define the life cycle of scholarly output and associated metrics
  – Qualitative versus quantitative aspects – what is possible to define here
  – Consider other aspects of these data collections
• Standards
  – Develop definitions (what is a download? what is a view?)
  – Differentiate between scholarly impact and popular/social use
  – Define sources/characteristics for metrics (social, commercial, scholarly)
• Data integrity
  – Counter biases/gaming
  – Association with credible entities – e.g., an ORCID iD vs. a Gmail account
  – Reproducibility is key
  – Everyone needs to be at the table to establish overall credibility
• Use cases (3X)
73. Alternative Assessment Initiative: Phase 2
• Presentations of report (June 2014)
• Prioritization effort (June–Aug 2014)
• Project approval (Sept 2014)
• Working group formation (Oct 2014)
• Consensus development (Nov 2014 – Dec 2015)
• Trial use period (Dec 2015 – Mar 2016)
• Publication of final recommendations (Jun 2016)
74. Other work underway
• Open Access Metadata Indicators
• Bibliographic data exchange
• SUSHI-lite profile
• Project Transfer formalization
• Book Interchange Tag Suite (BITS) – potential
• Data transformation – potential
• Scholarly data citation – potential
• E-book circulation data exchange – potential