1. What does “Full Life-Cycle” Data Management Mean?
“BIG DATA”
US Office of Personnel Management
March 14, 2013
2. “As required by the National Archives and
Records Administration (NARA) in 36 CFR
Chapter XII, Subchapter B, Records
Management, Federal agencies are
responsible for creating and maintaining
authentic, reliable, and usable records and
ensure that they remain so for the length of
their authorized retention period.”
http://www.archives.gov/records-mgmt/toolkit/pdf/ID373.pdf
3. First, a brief digression concerning graphics…
Edward Tufte’s favorite…
4. DISCRETION…
Exercise care in the selection of graphic formats
– not all graphics enhance understanding;
some may confuse…
When no effective compound graphic is available, simplicity
and the use of multiple separate graphic images may be
more effective.
The New York Times often produces exemplary
graphics that compress complex data and
complex relationships…
5. NYT: “LEADING CAUSES OF CANCER DEATHS”
http://www.nytimes.com/imagepages/2007/07/29/health/29cancer.graph.web.html
6. “Data” ? [technical definition]
“…’data’ are defined as any information that can be stored in
digital form and accessed electronically, including, but not
limited to, numeric data, text, publications, sensor streams,
video, audio, algorithms, software, models and simulations,
images, etc.”-- Program Solicitation 07-601
“Sustainable Digital Data Preservation and Access Network Partners (DataNet)”
Taken in this broadest possible sense, “data” are thus simply
electronic coded forms of information. And virtually anything
can be represented as “data” so long as it is electronically
machine-readable.
7. “Data” [epistemic definition – addressing the meaning of data]
“Measurements, observations or descriptions of
a referent -- such as an individual, an event, a
specimen in a collection or an
excavated/surveyed object -- created or
collected through human interpretation
(whether directly “by hand” or through the use
of technologies)”
-- AnthroDPA Working Group on Metadata (May, 2009)
[funded by Wenner-Gren Foundation and US NSF]
8. “Experiments to determine the density of the earth,” by Henry Cavendish, Esq., F.R.S. and A.S. Read
June 21, 1798 (from the Philosophical Transactions of the Royal Society of London for the year
1798, Part II, pp. 469-526)
From: http://www.archive.org/details/lawsofgravitatio00mackrich
11. DATA SETS: some examples with “native metadata”
2-d_soil_temps.csv
surface, and sub-surface soil temperatures (at 2cm and 8cm depths) measured at one location for a few days in order to
calibrate a model of temperature propagation. Surface temperature was measured with an infrared thermometer,
subsurface temperatures with a thermocouple.
----------------------------
5-minute_light_data_for_4_continuous_days_plus_reference.xls
PPF (photosynthetic photon flux = photosynthetically active radiation 400-700nm) measured with an array of photodiodes
calibrated to a Licor sensor, along a linear transect for a few days. used to get an idea of how much light plants along
the transect are receiving.
----------------------------
CO2_of_air_at_different_heights_July_9.xls
concentration of CO2 in the air during the evening for one day, measured with a Licor infrared gas analyzer and a series of
relays and tubes with a pump. used to examine the gradient of CO2 coming from the soil when the air is still during the
evening.
----------------------------
Fern_light_response.xls
Light response curves for bracken ferns, measured with a Licor photosynthesis system. Fronds are exposed to different light
levels and their instantaneous photosynthesis and conductance is measured. used in conjunction with the induction
data (below) for physiological characterization of the ferns.
----------------------------
La_Selva_species_photosyntheis_table.xls
incomplete data set on instantaneous photosynthesis rates for various tropical understory and epiphytic species grown in a
shade house in Costa Rica.
----------------------------
manzanita_sapflow_12-5-07_to_7-7-08.xls
instantaneous sap flow data (as temperature differences on a constant temperature heat dissipation probe) for multiple
branches of Manzanita, collected with a datalogger. used to correlate physiological activity with below-ground
measures of root growth and CO2 production.
----------------------------
moisture_release_curves.xls
percentage of water content, water potential (in MegaPascals) and temperature of soil samples, measured in the laboratory
for calibration of water content with water potential. soil is from the James Reserve in California.
----------------------------
Photosynthetic_induction.xls
a time-course of photosynthetic induction for a leaf over 35 minutes. instantaneous photosynthesis measured as µmol CO2
m-2 s-1 and light level is probably 1000 micromoles. used to determine physiological characteristics of bracken ferns.
----------------------------
run_2_24-h_data_for_mesh.xls
measurements of micrometeorological parameters on a moving shuttle, going from a clearing across a forest edge and into
the forest for about 30 meters. Pyranometers facing up and down, pyrgeometer facing up and down, PAR, air
temperature, relative humidity. Also data from a station fixed in the clearing and some derived variables calculated.
used for examining edge effects in forests.
----------------------------
Segment_of_wallflower_compare_colorspaces_blur.xls
pixel counts from images of wallflowers that were segmented into flower/not-flower under different color spaces.
segmentation was made using a probability matrix of hand-segmented images. used to automatically count flowers in
images collected after this training data was collected (and used to determine the best color space for this task).
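The free-text “native metadata” above travels with each file informally. A minimal sketch of promoting such notes into a structured, searchable record; the field names here are illustrative (not from any metadata standard), and only the first data set above is transcribed:

```python
# Sketch: promote free-text "native metadata" into a structured record.
# Field names are illustrative, not from any standard.
from dataclasses import dataclass, asdict

@dataclass
class DatasetRecord:
    filename: str      # the data file as named by its creator
    description: str   # the original free-text "native metadata"
    instrument: str    # measurement apparatus, if stated
    purpose: str       # stated intended use

record = DatasetRecord(
    filename="2-d_soil_temps.csv",
    description=("Surface and sub-surface soil temperatures (2 cm and 8 cm "
                 "depths) at one location over a few days."),
    instrument="infrared thermometer (surface); thermocouple (sub-surface)",
    purpose="calibrate a model of temperature propagation",
)

# Once structured, fields can be indexed and searched uniformly.
print(sorted(asdict(record)))
```

The point of the sketch: each dataset above already states instrument and purpose in prose; structuring those statements is the first, cheapest step of life-cycle management.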
12. Data Development:
“Data Reduction - Processing Level Definitions” (an example)
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19860021622_1986021622.pdf
Report of the EOS Data Panel Vol IIA, NASA, 1986 (Tech Memorandum 87777)
Tom Moritz, OPM “Big Data” July, 2012
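The EOS convention the report above popularized is commonly summarized as six processing levels. A sketch, paraphrasing the widely used wording (consult the report itself for the authoritative definitions):

```python
# Paraphrase of the EOS-style data processing levels (abbreviated wording;
# the cited NASA report is the authoritative source).
PROCESSING_LEVELS = {
    "Level 0":  "raw instrument data at full resolution",
    "Level 1A": "time-referenced, annotated raw data",
    "Level 1B": "radiometrically calibrated, geolocated data",
    "Level 2":  "derived geophysical variables at sensor resolution",
    "Level 3":  "variables mapped onto uniform space-time grids",
    "Level 4":  "model output or results derived from lower-level data",
}

def describe(level):
    """Return the short description for a processing level."""
    return PROCESSING_LEVELS.get(level, "unknown level")
```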
13. Data in Public Service
The Federal government manages data in
satisfaction of three primary requirements:
1) To account transparently for government
operations
2) To provide citizen access to the products of
government activities
3) To fulfill mandated tasks for which the
government has no original data (this
requires data acquisition)
15. The basic goal is to make all data held by the US
government fully reliable and “audit-worthy”.
All data and all derived data products should be able
to withstand exacting examination and testing.
All descriptive information required for auditing
should be fully disclosed, readily available and
easily accessible in standard reporting formats.
17. • AGS Alto Golfo Sustentable
• ASM American Society of Mammalogists
• CEC Commission for Environmental Cooperation
• CEDO Intercultural Center for the Study of
Deserts and Oceans
• CI Conservation International
• CIRVA International Committee for the Recovery
of the Vaquita
• CICESE Centro de Investigación Científica y de
Educación Superior de Ensenada
• CILA International Boundary and Water
Commission
• CITES Convention on International Trade in
Endangered Species of Wild Fauna and Flora
• Conagua National Water Commission
• Conanp National Commission for Protected
Natural Areas, Semarnat (Comisión Nacional de Áreas
Naturales Protegidas, Semarnat)
• Conapesca National Fisheries and Aquaculture
Commission, Sagarpa (Comisión Nacional de Pesca y
Acuacultura, Sagarpa)
• Profepa Federal Attorney for Environmental
Protection
• Secretariat of Agriculture, Livestock, Rural
Development, Fisheries, and Food (Mexico)
• Salud Secretariat of Health (Mexico)
• COSEWIC Committee on the Status of
Endangered Wildlife in Canada
• Department of Fisheries and Oceans (Canada)
• United States Department of the Interior
• European Cetacean Society
• US Environmental Protection Agency
• US Food and Drug Administration
• GEF Global Environment Facility
• IBWC International Boundary and Water
Commission
• National Institute of Ecology, Semarnat
• Inapesca National Fisheries Institute, Sagarpa
• IUCN World Conservation Union
• International Whaling Commission
• Local Economic and Employment Development
program
• United States Marine Mammal Commission
VAQUITA STAKEHOLDERS
18. • Marine Stewardship Council
• NAMPAN North American Marine Protected
Areas Network (CEC)
• US National Academy of Sciences
• North American Wildlife Enforcement Group
(CEC)
• US National Marine Fisheries Service, NOAA,
Department of Commerce
• US National Oceanic and Atmospheric
Administration, Department of Commerce
• United States National Ocean Service (NOAA)
• PACE Species Conservation Action Programs,
Conanp
• PGR Attorney General Office (Mexico)
• POEMGC Marine Ecological Planning of the Gulf
of California Program, Semarnat
• Procer Conservation Program for Species at Risk
• Secretariat of Economy (Mexico)
• Sectur Secretariat of Tourism (Mexico)
• Sedesol Secretariat for Social Development
(Mexico)
• Semar Secretariat of the Navy
• Semarnat Secretariat of the Environment and
Natural Resources
• Society for Marine Mammalogy
• Solamac Latin American Society for Aquatic
Mammals
• Somemma Mexican Society for Marine
Mammalogy
• SWFSC Southwest Fisheries Science Center (US
NMFS, NOAA)
• The Nature Conservancy
• Universidad Autónoma de Baja California Sur
• University of California
• United Nations
• United States Coast Guard
• United States Fish and Wildlife Service
• World Wildlife Fund
19. Values: “Data Quality” ???
In the most general colloquial terms, “Data Quality” is the fundamental issue
of concern to scientists, policy makers, managers/decision makers and
the general public.
“Quality” can be considered in terms of three primary values:
• Validity: logical in terms of the intended hypothesis to be tested (all potential
types of data that could be chosen should be weighed for probative
value…)
• Competence (Reliability): consideration of the proper choice of expert
staff, methods, apparatus/gear, calibration, deployment and operation
• Integrity: the maintenance of original integrity of data as well as tracking
and documenting of all transformations and sequences of transformation
of data
20. Auditing – A Case History
“InterAcademy Council Names IPCC Review Committee”
“AMSTERDAM, Netherlands – The InterAcademy Council (IAC), an
organization of the world’s science academies, announced today that
Harold T. Shapiro, an economist and former president of Princeton
University and the University of Michigan, will chair a 12-member
committee to conduct an independent review of the procedures and
processes of the Intergovernmental Panel on Climate Change (IPCC). The
review was requested in March by U.N. Secretary-General Ban Ki-moon
and IPCC Chair Rajendra K. Pachauri.
“The committee will review IPCC procedures for preparing its assessment
reports. Among the issues to be reviewed are data quality assurance and
control; the type of literature that may be cited in IPCC reports; expert and
government review of IPCC materials; handling of the full range of
scientific views; and the correction of errors that are identified after a
report has been completed. The committee also will review overall IPCC
processes, including management functions and communication strategies
(the full statement of task is available at
www.interacademycouncil.net/ipccreview).”
http://reviewipcc.interacademycouncil.net/IACNamesIPCCReviewCommittee.html
21. Climate Change Assessments:
Review of the Processes and Procedures of the IPCC
(InterAcademy Council)
U.N. Press Conference Aug. 30, 2010
“Opening Statement”
by Harold T. Shapiro
President Emeritus and Professor of Economics
and Public Affairs, Princeton University and
Chair, InterAcademy Council Committee to
Review the IPCC
http://reviewipcc.interacademycouncil.net/OpeningStatement.html
22. US BLM Manual 1283
“Data Administration and Management”
“Every employee is responsible for the quality, integrity,
relevancy, accuracy, and currency of the data that is
created, collected, or maintained, whether the data are
in manual (paper copy) or electronic format. Managers
will employ good data management practices to
manage the data collected and maintained by their
program specialists. The program specialist who uses,
manages, and distributes the data must ensure that
data are collected according to established standards
and maintained to ensure accuracy and integrity. This
section identifies specific responsibilities in support of
the data management program.”
Rel. No. 1-1742 Supersedes Rel. No. 1-1678 Date: 7/10/2012
http://www.blm.gov/pgdata/etc/medialib/blm/wo/Information_Resources_Management/pol
icy/blm_manual.Par.77674.File.dat/BLM_1283_manual_final.pdf
23. A Gallery of Efforts to Depict
Full Life Cycle Data Management
24. Source: DDI Structural Reform Group. “DDI Version 3.0 Conceptual Model." DDI
Alliance. 2004. Accessed on 11 August 2008.
http://www.icpsr.umich.edu/DDI/committee-info/Concept-Model-WD.pdf
25. US NSF “DataNet” Program
“the full data preservation and access lifecycle”
• “acquisition”
• “documentation”
• “protection”
• “access”
• “analysis and dissemination”
• “migration”
• “disposition”
“Sustainable Digital Data Preservation and Access Network Partners (DataNet) Program Solicitation” NSF 07-
601 US National Science Foundation Office of Cyberinfrastructure Directorate for Computer & Information
Science & Engineering
29. IWGDD “DIGITAL DATA LIFE CYCLE”
Exhibit B-2. Life Cycle Functions for Digital Data*
• Plan
−− Determine what data need to be created or collected to support a research agenda or a mission function
-- Identify and evaluate existing sources of needed data
−− Identify standards for data and metadata format and quality
−− Specify actions and responsibilities for managing the data over their life cycle
• Create
−− Produce or acquire data for intended purposes
−− Deposit data where they will be kept, managed and accessed for as long as needed to support their intended
purpose
−− Produce derived products in support of intended purposes; e.g., data summaries, data aggregations, reports,
publications
• Keep
−− Organize and store data to support intended purposes
-- Integrate updates and additions into existing collections
-- Ensure the data survive intact for as long as needed
• Acquire and implement technology
−− Refresh technology to overcome obsolescence and to improve performance
−− Expand storage and processing capacity as needed
−− Implement new technologies to support evolving needs for ingesting, processing, analysis, searching and accessing
data
• Disposition
−− Exit Strategy: plan for transferring data to another entity should the current repository no longer be able to keep it
−− Once intended purposes are satisfied, determine whether to destroy data or transfer to another organization
suited to addressing other needs or opportunities
http://www.nitrd.gov/about/harnessing_power_web.pdf
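The IWGDD functions above read naturally as an ordered checklist. A sketch encoding them that way; the stage names follow the exhibit, while the sub-task wording is abbreviated here:

```python
# Sketch: the IWGDD life-cycle functions as an ordered checklist.
# Stage names follow Exhibit B-2; sub-tasks are abbreviated paraphrases.
from collections import OrderedDict

LIFE_CYCLE = OrderedDict([
    ("Plan",   ["determine data needs", "evaluate existing sources",
                "identify metadata standards", "assign responsibilities"]),
    ("Create", ["produce or acquire data", "deposit data",
                "produce derived products"]),
    ("Keep",   ["organize and store", "integrate updates",
                "ensure survival intact"]),
    ("Acquire and implement technology",
               ["refresh technology", "expand capacity",
                "implement new technologies"]),
    ("Disposition", ["plan exit strategy", "destroy or transfer"]),
])

def pending_tasks(completed):
    """Return (stage, task) pairs not yet marked complete."""
    return [(s, t) for s, tasks in LIFE_CYCLE.items()
            for t in tasks if (s, t) not in completed]
```

Treating the functions as data (rather than prose) lets an agency report, per collection, which life-cycle obligations remain open.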
31. DataOne:
The Data Life Cycle: An Overview
The data life cycle has eight components:
Plan: description of the data that will be compiled, and how the data will be
managed and made accessible throughout its lifetime
Collect: observations are made either by hand or with sensors or other
instruments and the data are placed into a digital form
Assure: the quality of the data are assured through checks and inspections
Describe: data are accurately and thoroughly described using the appropriate
metadata standards
Preserve: data are submitted to an appropriate long-term archive (i.e. data
center)
Discover: potentially useful data are located and obtained, along with the
relevant information about the data (metadata)
Integrate: data from disparate sources are combined to form one
homogeneous set of data that can be readily analyzed
Analyze: data are analyzed
DataOne Best Practices Primer:
http://www.dataone.org/sites/all/documents/DataONE_BP_Primer_020212.pdf
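DataOne presents these eight components as a cycle: Analyze feeds back into Plan. A small sketch of stepping through it, with the wrap-around made explicit:

```python
# Sketch: the DataOne life cycle as a repeating sequence of stages.
STAGES = ["Plan", "Collect", "Assure", "Describe",
          "Preserve", "Discover", "Integrate", "Analyze"]

def next_stage(stage):
    """Advance one step; after Analyze the cycle returns to Plan."""
    i = STAGES.index(stage)
    return STAGES[(i + 1) % len(STAGES)]
```

The modular step is the design point: unlike a linear pipeline, the cycle has no terminal stage, which is exactly the “full life-cycle” claim of the title.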
32. W. K. Michener “Meta-information concepts for ecological data management”
Ecological Informatics 1 (2006) 3-7
http://tinyurl.com/d49f3vm
33. Federal Geographic
Data Committee
“Stages of the Geospatial
Data Lifecycle pursuant to
OMB Circular A–16, sections
8(e)(d), 8(e)(f), and 8(e)(g)”
http://www.fgdc.gov/policyandplanning/a-16/stages-of-geospatial-data-lifecycle-a16.pdf
34. “The Geospatial Data Lifecycle is not intended to
be rigidly sequential or linear. The quality
assurance and (or) quality control (QA/QC)
functions for the data should be included at
every stage of the Geospatial Data Lifecycle.”
[emphasis added]
--“Stages of the Geospatial Data Lifecycle pursuant to OMB Circular A–16, sections
8(e)(d), 8(e)(f), and 8(e)(g)”
http://www.fgdc.gov/policyandplanning/a-16/stages-of-geospatial-data-lifecycle-a16.pdf
35. Interagency Science Working Group
National Archives and Records Administration
http://www.archives.gov/records-mgmt/toolkit/pdf/ID373.pdf
“Establishing Trustworthy Digital Repositories: A Discussion Guide Based on the ISO Open
Archival Information System (OAIS) Standard Reference Model January 19, 2011”
36. “Sustainable data curation”
“There are several main elements necessary to sustain data curation:
“Robust data storage facilities (hardware and software) that are capable of
accurately handling data migration across generations of media.
“Backup plans, that are tested, so irreplaceable data are not at risk.
Unintended data loss can occur for many reasons: some major causes are:
poor stewardship leading to the loss of metadata to understand where the
data is located and documentation to understand the content, physical
facility and equipment failure (fire, flood, irrecoverable hardware crashes),
accidental data overwrite or deletion.
“Science-educated staff with knowledge to match the data discipline is
important for checking data integrity, choosing archive organization, creating
adequate metadata, consulting with users, and designing access systems that
meet user expectations. Staff responsible for stewardship and curation must
understand the digital data content and potential scientific uses. “
C.A. Jacobs, S. J. Worley, “Data Curation in Climate and Weather: Transforming our ability to improve predictions through global knowledge
sharing,” from the 4th International Digital Curation Conference, December 2008, page 10. www.dcc.ac.uk/events/dcc-
2008/programme/papers/Data%20Curation%20in%20Climate%20and%20Weather.pdf [03 02 09]
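The “unintended data loss” risks listed above (overwrite, deletion, hardware failure) are conventionally caught by fixity checks: record a checksum at ingest, recompute it on a schedule. A minimal sketch using SHA-256; the paths and checking schedule are the archive’s choice, not prescribed by the quoted source:

```python
# Sketch: fixity checking to detect silent data corruption or overwrite.
# SHA-256 is one common choice; the quoted source does not prescribe one.
import hashlib

def fixity(path, chunk_size=1 << 20):
    """SHA-256 of a file, streamed so large data files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, recorded_digest):
    """True if the file still matches the digest recorded at ingest."""
    return fixity(path) == recorded_digest
```

In practice the recorded digests live apart from the data (and in the backups), so that a corrupted file and its checksum cannot be lost together.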
37. Sustainable data curation (cont.)
“Non-proprietary data formats that will ensure data access capability for
many decades and will help avoid data losses resulting from software
incompatibilities…
“Consistent staffing levels and people dedicated to best practices in
archiving, access, and stewardship…
“National and International partnerships and interactions greatly aids in
shared achievements for broad scale user benefits, e.g. reanalyses,
TIGGE…
“Stable funding not focused on specific projects, but data management in
general…”
C.A. Jacobs, S. J. Worley, “Data Curation in Climate and Weather: Transforming our ability to improve predictions through global knowledge
sharing,” from the 4th International Digital Curation Conference, December 2008, pages 10-11. www.dcc.ac.uk/events/dcc-
2008/programme/papers/Data%20Curation%20in%20Climate%20and%20Weather.pdf [03 02 09]
38. Database Lifecycle Management
“The Database Lifecycle Management covers the entire
lifecycle of the databases, including:
• Discovery and Inventory tracking: the ability to discover
your assets, and track them
• Initial provisioning, the ability to rollout databases in
minutes
• Ongoing Change Management, End-to-end management of
patches, upgrades, schema and data changes
• Configuration Management, track inventory, configuration
drift and detailed configuration search
• Compliance Management, reporting and management of
industry and regulatory compliance standards
• Site level Disaster Protection Automation”
http://www.oracle.com/technetwork/oem/pdf/511949.pdf
40. “Data Quality” ???
“In the most general colloquial terms, ‘Data Quality’ is the fundamental issue
of concern to scientists, policy makers, managers/decision makers and the
general public.
‘Data Quality’ can be considered in terms of three primary values:
• Validity: logical in terms of the intended hypothesis to be tested (all potential
types of data that could be chosen should be weighed for probative
value…)
• Competence (Reliability): consideration of the proper choice of expert
staff, methods, apparatus/gear, calibration, deployment and operation
• Integrity: the maintenance of original integrity of data as well as tracking
and documenting of all recording, migration, transformations and
sequences of transformation of data”
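The Integrity value above asks that every transformation of the data, and the sequence of transformations, be tracked. One common way to make such a log tamper-evident is to chain each entry to the previous one by digest; the scheme below is an illustrative sketch, not any particular standard:

```python
# Sketch: a digest-chained provenance log, so the recorded sequence of
# transformations is tamper-evident. Illustrative, not a standard format.
import hashlib
import json

def log_step(log, action, detail):
    """Append one transformation record, chained to the previous entry."""
    prev = log[-1]["digest"] if log else ""
    entry = {"action": action, "detail": detail, "prev": prev}
    entry["digest"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)
    return log

provenance = []
log_step(provenance, "record", "raw sensor readings ingested")
log_step(provenance, "transform", "unit conversion degF -> degC")
# Any later edit to an earlier entry breaks the digest chain downstream.
```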
41. “…the “validation” of any scientific hypotheses rests
upon the sum integrity of all original data and
of all sequences of data transformation
to which original data have been subject. “
– Tom Moritz
“The Burden of Proof”
http://imsgbif.gbif.org/CMS_NEW/get_file.php?FILE=2b032cf8212d19a720f21465df0686
All data go through processes of development. This 1986 NASA publication is still an excellent guide to basics of scientific data management…
The text accompanying the DCC model is very helpful in differentiating “full life cycle” actions / “sequential actions” and “occasional actions” -- the graphic is much less effective…
The accompanying text is more helpful but still not comprehensive…
Michener’s chart from 2006 makes a better effort at suggesting constant elements and feedback loops…
This Oracle “model” focuses on “databases” – not on “data” per se…