All Learning Resources

  • Session 3: Metadata Maintenance in the IODE Ocean Data Portal V2, part of the ODINAFRICA Ocean Data Portal Training-of-Trainers Course.

    This video presentation is Session 3 of 3 from the ODINAFRICA Ocean Data Portal (ODP) training-of-trainers course. The ODP course demonstrates the International Oceanographic Data and Information Exchange (IODE) Ocean Data Portal V2, focusing on the ODP Data Provider for national nodes (NODCs). It also covers specific topics, such as an introduction to the Linux operating system and to an application server, which together serve as the operating environment. The course can also be used to gain a better understanding of the IODE Ocean Data Portal and its capabilities.

    The course comprises three sessions, each composed of several video presentations. Session 3, Metadata Maintenance in the IODE Ocean Data Portal V2, contains the following videos.

    • Metadata Templates and Patterns
    • Metadata Quality Procedures
    • Monitoring Procedure
    • Using Virtual Data Provider Capabilities 

    PowerPoint slides for each presentation are available for download from the main course page.

    About the Ocean Data Portal:

    • Formally established in 2007 as a program under the IODE and supported by the Partnership Centre for IODE Ocean Data Portal, Russian Federation
    • Seeks to provide open and seamless access to marine data collections in an enabling and globally distributed environment
    • Facilitates discovery, evaluation, and access to data
    • Provides benefits to both data providers and data users
    • Focuses on standards, technology, people, and capacity planning 
  • Introduction to Designing Successful Open Access and Open Data Policies Course

    This is an introductory course for policymakers and funders and comprises slide presentations and a video. The course is organized into the following modules.

    • Introduction to Open Science for Funders
    • The Impact of Open Science
    • Definitions and Terms used in Open Science
    • Requirements of Horizon 2020 Regarding Open Access and Open Data 
    • Creating Effective Open Access Policies
    • Developing Open Data Policies

    Suggested reading is also provided.

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, on the slide preview pages for each module.

  • Introduction to Open Science for Funders, a part of the Designing Successful Open Access and Open Data Policies course.

    Provides a brief introduction to the main concepts of Open Science, discusses the rationale for Open Science and highlights the implications for the research lifecycle.

    This slide presentation is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It introduces the concept of and rationale for Open Science and some of the practicalities of Open Science.

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, at the provided URL. Associated readings for this topic may be found at: https://www.fosteropenscience.eu/learning/designing-successful-open-acce....

  • Open Access Explained! - a part of the Designing Successful Open Access and Open Data Policies course.


    This video module is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It puts Open Access in the historical context of scientific publishing and explains the benefits and rationale.

    Associated readings may be found for this topic at:  https://www.fosteropenscience.eu/learning/designing-successful-open-acce....

  • Impact of Open Science, part of the Designing Successful Open Access and Open Data Policies course.

    This slide presentation module is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It discusses and provides evidence for the impact of Open Data and Open Access. 

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, at the provided URL. Associated readings for this topic may be found at: https://www.fosteropenscience.eu/learning/designing-successful-open-acce...

  • Definitions and Terms used in Open Science and Open Access, part of the Designing Successful Open Access and Open Data Policies course.

    This slide presentation is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It provides definitions of the main terms used in Open Science, including Research Data, and Gold, Green, Gratis and Libre Open Access.

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, at the provided URL.

    Associated readings may be found for this topic at:  https://www.fosteropenscience.eu/learning/designing-successful-open-acce...

  • The Requirements of Horizon 2020 Regarding Open Access and Open Data, part of the Designing Successful Open Access and Open Data Policies course.

    This slide presentation module is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It outlines the Horizon 2020 policy and recommendations on Open Access and Open Data, and the underlying rationale.

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, at the provided URL.

    Associated readings may be found for this topic at:  https://www.fosteropenscience.eu/learning/designing-successful-open-acce....

  • Creating Effective Open Access Policies, part of the Designing Successful Open Access and Open Data Policies course.

    This slide presentation module is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It describes what an Open Access policy should cover and what makes a policy effective; provides a model Open Access policy; explains why policies of this type work and gives examples.

    Available to view and download in PowerPoint format, and to download in text, PDF, and EPUB formats, at the provided URL.

    Associated readings may be found for this topic at:  https://www.fosteropenscience.eu/learning/designing-successful-open-acce...

  • Developing Open Data Policies, part of the Designing Successful Open Access and Open Data Policies course.

    This slide presentation module is part of the Introduction to Designing Successful Open Access and Open Data Policies Course for policymakers and funders. It describes what an Open Data policy covers; discusses the content of a model Open Data policy; gives a practical checklist for developing an Open Data policy; discusses what makes an Open Data policy effective; and analyses existing policies of funders and links to examples.

    Associated readings may be found for this topic at:  https://www.fosteropenscience.eu/learning/designing-successful-open-acce....

  • Open Science at the Core of Libraries

    This introductory course is aimed at librarians at all levels and in all positions who are committed to supporting researchers and their research processes at their institutions, and who would like to understand the implications of Open Science for them, the potential opportunities and challenges, and existing best practices for addressing them.

    The course includes five sections:

    • What is Open Science?
    • What are the Benefits of Open Science?
    • What are the Challenges for Open Science?
    • What is in Open Science for Librarians?
    • What to Do Next?

    Each section may include videos, associated readings, and Open Science lifecycle diagrams.  

    Learning outcomes:

    • Understand the relevance of Open Science in relation to research integrity, reproducibility and impact
    • Identify the implications and opportunities for libraries in the development and support of Open Science
    • Know existing initiatives and best practices on Open Science
    • Identify suitable resources and tools to further develop library services on Open Science

    Greater insight on how to implement Open Data and Research Data Management, Open Access, copyright and e-infrastructures into the scholarly lifecycle and grant proposal preparation, can be found in the other FOSTER courses and training resources.

  • An Introduction to the Basics of Research Data

    This animated video explains the basics of research data in a simple, entertaining, and engaging manner. It is recommended background material for students planning to attend a workshop on research data management, and is intended to prepare research data novices for a future research data management course or workshop.

  • Long-lived Data: Tools to Preserve Data: Theoretical Overview

    This slide presentation covers the research data management life-cycle, data preservation concepts and processes, metadata for preservation, levels of digital preservation, what should be preserved, and institutional readiness.

    Slides are available for download at the provided URL.

  • Long-Lived Data: Tools to Preserve Research Data

    This slide presentation provides an introduction to BagIt and Bagger. BagIt is a hierarchical file packaging format designed to support storage and network transfer of digital content. Bagger enables creators and recipients of BagIt packages to confirm that the files are complete and valid. The presentation covers:

    • What are BagIt and Bagger?
    • Installing and running BagIt and Bagger
    • Types of metadata
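
    The integrity mechanism at the heart of BagIt can be sketched in a few lines: a bag is a directory with a data/ payload, a bagit.txt declaration, and checksum manifests, and validation simply recomputes each checksum. The helpers below are an illustrative toy, not the Bagger tool or the Library of Congress bagit library, which should be used in practice.

```python
import hashlib
import os
import tempfile

def make_bag(payload_files, bag_dir):
    """Write a minimal bag: a data/ payload plus bagit.txt and a sha256 manifest."""
    data_dir = os.path.join(bag_dir, "data")
    os.makedirs(data_dir, exist_ok=True)
    manifest_lines = []
    for name, content in payload_files.items():
        with open(os.path.join(data_dir, name), "wb") as f:
            f.write(content)
        digest = hashlib.sha256(content).hexdigest()
        manifest_lines.append(f"{digest}  data/{name}")
    with open(os.path.join(bag_dir, "bagit.txt"), "w") as f:
        f.write("BagIt-Version: 0.97\nTag-File-Character-Encoding: UTF-8\n")
    with open(os.path.join(bag_dir, "manifest-sha256.txt"), "w") as f:
        f.write("\n".join(manifest_lines) + "\n")

def bag_is_valid(bag_dir):
    """Recompute each payload file's checksum and compare it to the manifest."""
    with open(os.path.join(bag_dir, "manifest-sha256.txt")) as f:
        for line in f:
            digest, rel_path = line.strip().split("  ", 1)
            with open(os.path.join(bag_dir, rel_path), "rb") as payload:
                if hashlib.sha256(payload.read()).hexdigest() != digest:
                    return False
    return True

# Invented example payload: one small CSV file.
bag = tempfile.mkdtemp()
make_bag({"readings.csv": b"station,temp\nA1,11.5\n"}, bag)
```

    This is what Bagger automates for creators and recipients: any bit-level change to a payload file makes the recomputed checksum disagree with the manifest.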
  • Write a Data Management Plan

    Video tutorial describing how to write a data management plan, for example for a research grant application to the Economic and Social Research Council (ESRC). A data management and sharing plan can help you consider, when you design and plan your research, how data will be managed during the research process and shared afterward with the wider research community.

    Research benefits:

    • Establishes how to collect and manage research data
    • Helps keep track of data assets (e.g., when staff leave your institution)
    • Identifies data support, resources, and services
    • Plans for data security and ethical measures
    • Prepares you for data requests

    For access to more UK Data Service video tutorials go to https://www.ukdataservice.ac.uk/use-data/tutorials. 

  • Demonstration of the Open Science Framework (OSF) Electronic Lab Notebook (ELN)

    A demonstration of Open Science Framework's (OSF) Electronic Lab Notebook (ELN). OSF's application is a free, open-source web tool designed to help researchers track, manage, store, and if they choose, share their entire research workflow. 

    A wide range of resources for prospective electronic lab notebook (ELN) users can be found on the Gurdon Institute website at https://www.gurdon.cam.ac.uk/institute-life/computing/elnguidance. The site includes:

    • What is an Electronic Lab Notebook, and why should I use one?
    • A note about DIY documentation systems
    • Electronic Lab Notebook (ELN) vs Laboratory Information Management System (LIMS)
    • Disengagement - what if I want to change systems?
    • Which ELN would be best for me/my group?
    • Evaluating the product
    • Some current ELN products (suitability, platform, storage, and other details)
  • Data Sharing and Management Snafu in Three Short Acts

    A data management horror story by Karen Hanson, Alisa Surkis, and Karen Yacobucci. This is what shouldn't happen when a researcher makes a data sharing request! Topics include storage, documentation, and file formats.

  • DCC Curation Webinar: Customising DMPonline

    Demonstration of DMPonline functionality, followed by a demonstration of how to customize DMPonline for your institution. DMPonline helps you to create, review, and share data management plans that meet institutional and funder requirements.

  • DMPonline: Recent Updates and New Functionality

    This presentation covers updates and enhancements to the DMPonline tool. DMPonline helps you to create, review, and share data management plans that meet institutional and funder requirements. Six areas are discussed:

    1. Usability improvements
    2. Lifecycle and review
    3. API for systems integration
    4. Institutional enhancements 
    5. Locale-aware support
    6. Maintenance
  • Organizing and Modeling Data

    This presentation is part of the World Geodetic System (WGS) Data Management Planning Course. It provides a brief overview of data management, databases, and data modeling. Presentation sections include:

    • Why manage data?
    • What is a database?
    • Information systems cycle and models
    • Database design in research
    • A database modeling exercise
    • Structured Query Language (SQL)
    • Tips and tricks/good practice

    The downloadable PowerPoint slides are listed on the "Data Management for Library Ph.D." course agenda in the 13.15-14.15 time slot.
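
    The database-design and SQL topics above can be given a concrete flavor with a minimal relational sketch. The schema and values are invented for illustration; SQLite is used only because it ships with Python and needs no server.

```python
import sqlite3

# A toy research-data model: sampling stations, and measurements keyed
# to a station via a foreign key (one-to-many relationship).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE station (
        station_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL,
        latitude   REAL,
        longitude  REAL
    );
    CREATE TABLE measurement (
        measurement_id INTEGER PRIMARY KEY,
        station_id     INTEGER REFERENCES station(station_id),
        variable       TEXT,
        value          REAL
    );
""")
con.execute("INSERT INTO station VALUES (1, 'A1', 52.1, 4.3)")
con.executemany(
    "INSERT INTO measurement (station_id, variable, value) VALUES (?, ?, ?)",
    [(1, "temperature", 11.5), (1, "salinity", 35.1)],
)

# A join answers "which variables were measured at station A1?"
rows = con.execute(
    "SELECT m.variable, m.value FROM measurement m "
    "JOIN station s ON s.station_id = m.station_id WHERE s.name = 'A1'"
).fetchall()
```

    Separating stations from measurements, rather than repeating station coordinates on every measurement row, is the kind of normalization decision the modeling exercise in the presentation works through.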

  • Providing and Using (Open) Biodiversity Data through the Infrastructure of the Global Biodiversity Information Facility (GBIF)

    Introduction to the Global Biodiversity Information Facility (GBIF), including lessons and achievements from the first ten years of using GBIF. This presentation was part of the Conference Connecting Data for Research held at VU University in Amsterdam. Topics include:

    • What is biodiversity?
    • Biodiversity data
    • History of GBIF
    • GBIF usage trends
    • Primary GBIF data
    • Using GBIF
    • Darwin Core
    • Integrated Publishing Toolkit (IPT)
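
    Darwin Core, listed above, is a standard vocabulary of term names for sharing biodiversity data, and GBIF occurrence data are commonly exchanged as rows keyed by those terms. A minimal occurrence record might be serialized as below; the term names are standard Darwin Core, but the values are invented for illustration.

```python
import csv
import io

# One occurrence record using a handful of standard Darwin Core terms.
record = {
    "occurrenceID": "urn:example:occ:1",   # hypothetical identifier
    "scientificName": "Delphinus delphis",
    "decimalLatitude": 43.07,
    "decimalLongitude": -9.39,
    "eventDate": "2015-06-12",
    "basisOfRecord": "HumanObservation",
}

# Serialize it as a CSV with Darwin Core terms as column headers.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(record))
writer.writeheader()
writer.writerow(record)
dwc_csv = buf.getvalue()
```

    Because every publisher uses the same term names, tools such as the Integrated Publishing Toolkit (IPT) can aggregate records from many sources without per-source mapping.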
  • OpenEarth: A Flood of Dutch Coastal Data in Your Browser

    OpenEarth is a free and open source initiative to deal with data, models, and tools in earth science and engineering projects, currently mainly marine and coastal. This presentation was part of the Conference Connecting Data for Research held at VU University in Amsterdam. Presentation topics include:

    • What is OpenEarth?
    • Community
    • Philosophy
    • Features
  • Open Access Publishing: A User Perspective

    Part of the Embracing Data Management, Bridging the Gap Between Theory and Practice workshop, held in Brussels, this talk provides an introduction to open access publishing. Topics include:

    • Benefits
    • Philosophy
    • Requirements
    • Preprints and postprints
    • Tips
  • Modern Research Management Workshop

    Introduction to research data management (RDM). Topics include:

    • Data organization, storage, backup, security, and deposit 
    • Benefits of managing and sharing data
    • File formats
    • File naming
    • Data publishing
    • Data reuse
    • Open data
    • Sensitive data
    • Data management plans and the DMPonline tool
  • Data Management Tools

    An overview of research data management tools:

    • Re3data to find repositories
    • FAIRsharing and RDA Data Standards Catalogue
    • DMPonline for writing data management plans
    • OpenAIRE for managing outputs and reporting results
  • How NOT to Share Your Data: Avoiding Data Horror Stories

    This presentation is designed to encourage best practice from researchers when sharing their data. It covers basic issues such as repositories, file formats, and cleaning spreadsheets. It was designed for researchers in the sciences who already have some basic awareness that data sharing has many benefits and is expected by many UK research funders. Topics include:

    • Where you should and should not share your data
    • What data should you include?
    • Choosing a file format
    • Spreadsheet use
    • How you should and should not describe your data
  • Research Data Lifecycle

    Data often have a longer lifespan than the research project that creates them. Researchers may continue to work on data after funding has ceased, follow-up projects may analyse or add to the data, and data may be re-used by other researchers.

    Adhering to a well-defined research data lifecycle results in organised, well documented, preserved and shared data that advance scientific inquiry and increase opportunities for learning and innovation.

  • Bringing Research Data into the Library: Expanding the Horizons of Institutional Repositories

    The focus of library-managed institutional repositories has so far been on document-like items (published articles, preprints, theses, reports, working papers, etc.) but there is growing demand to expand their use into new genres such as scientific research datasets (sensor readings, genomics data, neuroimages, etc.). This webcast explains how institutional repositories are including this type of collection, what librarians need to know in order to manage such collections, and a few case studies from the MIT Libraries.

  • Marine GIS Applications (using QGIS)

    This course provides an in-depth overview of the application of Geographic Information Systems (GIS) to the marine environment using QGIS. All resources can be accessed from the provided URL. 

    Topic 1:
    Aims and Objectives:

    • Provide an introduction to GIS for marine applications
    • Focus on some publicly available marine datasets
    • Show the potential applications of GIS for the marine environment

    Learning Outcomes:

    • Knowledge and understanding of GIS, spatial data, raster, and vector models
    • Core tasks involved in the GIS analysis process including data acquisition, management, manipulation and analysis, and presentation and output
    • Core functionality of QGIS Desktop and QGIS Browser
    • Creating and editing spatial data
    • Appreciation of coastal and marine GIS data applications

    Topic 2: Introduction to Marine GIS

    • A brief explanation of GIS
    • GIS components
    • GIS data models
    • Marine GIS applications
    • Spatial analysis

    Topic 3: Introduction to QGIS

    • Why QGIS?
    • Exercise: Introduction To QGIS

    Topic 4: View data in QGIS Desktop

    • QGIS provides several windows that make viewing data convenient
    • Exercise

    Topic 5: Map Projections and Coordinate Systems

    • Geographic Coordinate Systems
    • Coordinate Systems And Map Projections
    • Video: Map projections 
    • Exercise

    Topic 6: Create Base Map in QGIS

    • Define the Area of Interest (AOI)
    • Exercise

    Topic 7: Creating Data Collection from the World Ocean Database

    • Exercise: Obtaining Marine Data from the World Ocean Database

    Topic 8: Introduction to Ocean Data View

    • Video: Ocean Data View (ODV) by Reiner Schlitzer
    • Exercise: Creating Data Collection from the World Ocean Database
    • Exercise: Export Marine Data from the Ocean Data View

    Topic 9: Working with Spreadsheet Data

    • Exercise: Adding Spreadsheet Data

    Topic 10: Edit Data in QGIS

    • Exercise

    Topic 11: Edit Data: Area of Interest and Analysis Mask

    • Exercise

    Topic 12: Interpolating surfaces

    • Map Interpolation
    • Interpolating Surfaces: Summary 
    • Exercise: Interpolate to Raster

    Topic 13: Rendering Raster Data

    • Exercise

    Topic 14: Raster Calculator

    • Exercise: Using the Raster Calculation

    Topic 15: Working with NetCDF

    • Exercise

    Topic 16: Plotting Vector Arrows from U and V Component Grids

    • Marine Geospatial Ecology Tools (MGET)

    Topic 17: Downloading species observations from the Ocean Biogeographic Information System (OBIS)

    • Exercise: Downloading OBIS Data in CSV Format

    Topic 18: Creating KML files for Google Earth

    • Example KML document
    • Exercise

    Topic 19: Publication Quality Maps

    • Exercise: Create Publication Quality Maps
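
    Topic 18's "Example KML document" can be sketched briefly: KML is an XML dialect in the http://www.opengis.net/kml/2.2 namespace, and a placemark needs little more than a name and a coordinates string, with longitude first. The station name and coordinates below are invented for illustration.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def placemark_kml(name, lon, lat):
    """Build a minimal KML document containing a single Placemark."""
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = name
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinates are "lon,lat[,alt]" -- longitude comes first.
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

doc = placemark_kml("Station A1", 4.3, 52.1)
```

    Saving the returned string to a .kml file is enough for Google Earth to display the point; QGIS can also export layers to KML directly.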
  • Making Research Data Available

    There is a growing awareness of the importance of research data. Elsevier is committed to encouraging and supporting researchers who want to store, share, discover and reuse data. To this end, Elsevier has set up several initiatives that allow authors to make their data available when they publish with Elsevier. The webinars in the collection (located on the bottom half of the web page) cover:

    • Ways for researchers to store, share, discover, and use data
    • How to create a good research data management plan  
    • Data Citation: How can you as a researcher benefit from citing data? 
  • Metadata Recommendations, Dialects, Evaluation & Improvement

    This webinar describes a flexible, multifaceted approach to evaluating and improving metadata collections (in multiple dialects).

    The initial goal of many metadata efforts was discoverable data, but, like many other elements of data management, the metadata landscape has evolved considerably over the last decade to include new use cases and requirements. The approach that has been developed includes web-based tools for understanding and comparing recommendations and dialects; flexible comparisons of the completeness of metadata collections (in multiple dialects) with respect to particular recommendations; evaluation of the completeness of single metadata records; identification of specific metadata improvement needs; and an open forum for sharing information, experiences, and examples. Recommendations for metadata requirements and metadata improvement needs are discussed and shared.
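
    The idea of "completeness of a metadata record with respect to a recommendation" can be made concrete with a small sketch. The scoring function below illustrates the general principle only, not the webinar's actual tooling, and the record and recommendation are invented.

```python
def completeness(record, recommendation):
    """Fraction of recommended elements that are present and non-empty."""
    present = [name for name in recommendation if record.get(name)]
    return len(present) / len(recommendation)

# A hypothetical recommendation: five required element names.
recommendation = ["title", "abstract", "temporal_extent",
                  "spatial_extent", "contact"]

# A hypothetical record that supplies three of the five.
record = {
    "title": "Sea surface temperature, 2010-2015",
    "abstract": "Gridded SST fields derived from satellite observations.",
    "contact": "data@example.org",
}

score = completeness(record, recommendation)
```

    Running the same scoring over every record in a collection, and against several recommendations at once, yields the kind of collection-level completeness comparison the webinar describes, and points directly at which elements need improvement.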

  • FAIR Data in Trustworthy Data Repositories

    Everybody wants to play FAIR, but how do we put the principles into practice?

    There is a growing demand for quality criteria for research datasets. The presenters argue that the DSA (Data Seal of Approval for data repositories) and FAIR principles get as close as possible to giving quality criteria for research data. They do not do this by trying to make value judgments about the content of datasets, but rather by qualifying the fitness for data reuse in an impartial and measurable way. By bringing the ideas of the DSA and FAIR together, we will be able to offer an operationalization that can be implemented in any certified Trustworthy Digital Repository. In 2014 the FAIR Guiding Principles (Findable, Accessible, Interoperable and Reusable) were formulated. The well-chosen FAIR acronym is highly attractive: it is one of these ideas that almost automatically get stuck in your mind once you have heard it. In a relatively short term, the FAIR data principles have been adopted by many stakeholder groups, including research funders. The FAIR principles are remarkably similar to the underlying principles of DSA (2005): the data can be found on the Internet, are accessible (clear rights and licenses), in a usable format, reliable and are identified in a unique and persistent way so that they can be referred to. Essentially, the DSA presents quality criteria for digital repositories, whereas the FAIR principles target individual datasets. In this
    webinar the two sets of principles will be discussed and compared and a tangible operationalization will be presented.

  • How to Import Works into Your ORCID Record Using a Search & Link Wizard

    Several ORCID (Open Researcher and Contributor ID) member organizations have built search and link tools that allow you to import information about publications and other works into your ORCID record from other databases. The linking process can begin on the ORCID site or at the organization's website. Note that ORCID does not store preprints or other content; instead, works data added to your record must include a link that allows users to easily navigate to the source document.


  • ORCID Registry: How to Group Works on Your ORCID Record

    Learn how and why to group works that have been added to your ORCID (Open Researcher and Contributor ID) record from different sources, so that they are displayed together on your record.

  • Getting Started with ORCID & ORCID API

    This presentation introduces ORCID (Open Researcher and Contributor ID), ORCID features, and ORCID's approach to API development.
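
    One concrete detail worth knowing before working with the ORCID API: the final character of an ORCID iD is a check digit computed with ISO 7064 MOD 11-2 over the preceding digits, so malformed iDs can be rejected before any API call is made. A minimal validator is sketched below; the first sample iD in the test is ORCID's own documented example.

```python
def orcid_checksum_ok(orcid_id):
    """Validate the final character of an ORCID iD (ISO 7064 MOD 11-2)."""
    digits = orcid_id.replace("-", "")
    base, check = digits[:-1], digits[-1]
    total = 0
    for ch in base:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    # A result of 10 is written as the letter X.
    expected = "X" if result == 10 else str(result)
    return check == expected
```

    This is the same scheme ORCID describes in its documentation on the structure of the identifier; production code should of course also check the overall format (four groups of four characters).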

  • Data Management Support for Researchers

    Tips and advice from a variety of researchers, data managers, and service providers, to help with data management. Titles include:

    • Sharing data: good for science, good for you
    • What support needs to be provided to assist researchers with data management?
    • How can choices about data capture open up, or limit, opportunities for researchers?
    • What should researchers do to help their data survive?
    • Why should researchers share their data?
    • How can repositories and data centres help researchers?
  • DMP Assistant: bilingual tool for preparing data management plans (DMPs)

    The DMP Assistant is a bilingual tool to assist in the preparation of a Data Management Plan (DMP). Based on international standards and best practices in data management, it guides the researcher step by step through the key questions needed to develop their plan. DMP Assistant is powered by DMPonline, an open-source application developed by the Digital Curation Centre (DCC). Site registration is required; data management planning templates become available after registration and sign-in.

  • Data Management for Clinical Research MOOC

    This course presents critical concepts and practical methods to support planning, collection, storage, and dissemination of data in clinical research.
     
    Understanding and implementing solid data management principles is critical for any scientific domain. Regardless of your current (or anticipated) role in the research enterprise, a strong working knowledge and skill set in data management principles and practice will increase your productivity and improve your science. The instructors' goal is to use these modules to help you learn and practice this skill set.

    This course assumes very little current knowledge of technology other than how to operate a web browser. The course will focus on practical lessons, short quizzes, and hands-on exercises as we explore together best practices for data management.

    The six modules cover these topics:

    • Research Data Collection Strategy
    • Electronic Data Capture Fundamentals
    • Planning a Data Strategy for a Prospective Study
    • Practicing What We've Learned: Implementation
    • Post-Study Activities and Other Considerations
    • Data Collection with Surveys
  • National Network of Libraries of Medicine (NNLM) Research Data Management Webinar Series

    The National Network of Libraries of Medicine (NNLM) Research Data Management (RDM) webinar series is a collaborative, bimonthly series intended to increase awareness of research data management topics and resources.  The series aims to support RDM within the library to better serve librarians and their institutional communities. Topics include, but are not limited to, understanding a library’s role in RDM, getting started, data management planning, and different RDM tools.

    Several NNLM Regional Medical Libraries will collaborate and combine efforts to feature experts from the field for this national webinar series. Each session will include separate objectives based on the featured webinar presenter. Attendee participation will be possible through the WebEx platform chat features and other electronic methods designed by the guest presenter. Sessions are recorded, closed-captioned, and posted for later viewing.

    Each session will last approximately 1 hour and 1 MLA CE contact hour will be offered per session. CE contact hours will only be available during the live presentations of the webinar.

    Watch the webpage for upcoming webinars.

  • Overview of Interdisciplinary Earth Data Alliance (IEDA) Data Management Resources

    In the digital era, documenting and sharing our scientific data is growing increasingly important as an integral part of the scientific process. Data Management not only makes our data resources available for others to build upon, but it also enables data syntheses and new analyses that hold the potential for significant scientific advancement. Effective data management begins during the planning stages of a project and continues throughout the research process from field and/or laboratory work, through analysis, and culminating with scientific literature and data publication. By planning ahead, and following some best practices along the way, the process of data management can be simple and relatively low-effort, enabling rapid contribution and publication of data in the appropriate data systems at the conclusion of a project.

    IEDA offers a variety of tools to support investigators along the full continuum of their data management efforts:  Links to these tools and resources are available from the landing page for this resource.

    Pre-Award

    • IEDA Data Discovery Tools
    • IEDA Data Management Plan (DMP) Tool

    Research & Analysis

    • Register sample-based data sets and samples 
      • Register sample metadata and get a unique sample identifier (IGSN)
      • Download Templates for Analytical Data
      • Learn about contributing Analytical Data to the EarthChem Library
    • Register sensor-based data sets and samples 
      • Contribute sensor data files (e.g. geophysical data) and supporting metadata to MGDS
    • IEDA Analysis Tools
      • GeoMapApp earth science exploration and visualization application 
        • Analyze your own geospatial data within the context of other available datasets

    Synthesis & Publication

    • Register final data sets with IEDA 
    • Publish your data 
      • Publishing your data with a DOI ensures that it can be directly referenced in your paper and cited by others.
    • IEDA Data Compliance Reporting (DCR) Tool 
      • Rapidly generate a Data Compliance Report (DCR) based on your NSF award number to demonstrate that your data are registered with IEDA systems and you are compliant with NSF Data Policies.
  • Simplifying the Reuse and Interoperability of Hydrologic Data Sets and Models with Semantic Metadata that is Human-Readable & Machine-Actionable

    This slide set discusses the big, generic problem facing geoscientists today, which stems from a lack of interoperability across a huge number of heterogeneous resources, and how to solve it. Practical solutions to tame this inherent heterogeneity involve collecting standardized, "deep-description" metadata for resources, which are then wrapped with standardized APIs that provide callers with access to both the data and the metadata.
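
    The "wrap data and metadata behind a standardized API" idea can be sketched minimally: every resource, whatever its native format, exposes the same calls for its values and for its deep-description metadata. The class and the discharge example below are invented for illustration and are not the slide set's actual software.

```python
class WrappedResource:
    """A resource exposing one uniform interface for data and metadata."""

    def __init__(self, values, metadata):
        self._values = values        # the data themselves
        self._metadata = metadata    # standardized deep-description metadata

    def get_data(self):
        return self._values

    def get_metadata(self, key):
        return self._metadata.get(key)

# Hypothetical example: a discharge time series whose units and time step
# are recorded as metadata, so a caller can check compatibility with
# another dataset or model input without guessing.
discharge = WrappedResource(
    [12.1, 13.4, 11.8],
    {"variable": "channel water discharge",
     "units": "m3 s-1",
     "time_step": "1 day"},
)
```

    Because callers read units and semantics from the metadata rather than assuming them, two independently produced datasets or models can be composed machine-actionably, which is the interoperability goal the slides describe.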